Recognition of Customers’ Impulsivity from Behavioral Patterns in Virtual Reality

2021, Vol 11 (10), pp. 4399
Author(s): Masoud Moghaddasi, Javier Marín-Morales, Jaikishan Khatri, Jaime Guixeres, Irene Alice Chicchi Giglioli, ...

Virtual reality (VR) in retailing (V-commerce) has been shown to enhance the consumer experience. The technology is therefore well suited to studying behavioral patterns, since it offers the opportunity to infer customers’ personality traits from their behavior. This study aims to recognize impulsivity from behavioral patterns. To this end, 60 subjects performed three tasks (one exploration task and two planned tasks) in a virtual market. Four noninvasive signals (eye-tracking, navigation, posture, and interactions), all available in commercial VR devices, were recorded, and a set of features was extracted and categorized into zonal, general, kinematic, temporal, and spatial types. These features were fed into a support vector machine classifier to recognize the subjects’ impulsivity as measured by the I-8 questionnaire, achieving an accuracy of 87%. The results suggest that, while the exploration task can reveal general impulsivity, subscales such as perseverance and sensation-seeking are more closely related to the planned tasks. The results also show that posture and interaction are the most informative signals. Our findings validate the recognition of customer impulsivity using sensors incorporated into commercial VR devices. Such information could enable a personalized shopping experience in future virtual shops.
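The classification stage described above can be sketched with scikit-learn's `SVC`. The feature matrix below is a synthetic stand-in for the zonal/kinematic/temporal features, and the labeling rule is hypothetical; the study's real features came from VR eye-tracking, navigation, posture, and interaction signals.

```python
# Minimal sketch of the SVM classification stage. The 60x20 feature matrix
# and the impulsivity labels are synthetic stand-ins, not the study's data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

n_subjects, n_features = 60, 20               # 60 subjects, as in the study
X = rng.normal(size=(n_subjects, n_features))
# Hypothetical binary label: impulsive vs. non-impulsive (I-8-based in the paper)
y = (X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=n_subjects) > 0).astype(int)

# Standardize features, then fit an RBF-kernel SVM with 5-fold cross-validation
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Cross-validation is the natural evaluation choice here, since 60 subjects is too few for a fixed held-out test set.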

2020, Vol 5 (2), pp. 504
Author(s): Matthias Omotayo Oladele, Temilola Morufat Adepoju, Olaide Abiodun Olatoke, Oluwaseun Adewale Ojo

Yorùbá is one of the three main languages spoken in Nigeria. It is a tonal language that carries accents on its vowels. The Yorùbá alphabet has twenty-five (25) letters, one of which is a digraph (GB). Because typing handwritten Yorùbá documents is difficult, there is a need for a handwriting recognition system that can convert handwritten text to digital format. This study presents an offline Yorùbá handwritten word recognition system (OYHWR) that recognizes Yorùbá uppercase alphabets. Handwritten characters and words were obtained from different writers using the Paint application and M708 graphics tablets; the characters were used for training and the words for testing. The images were pre-processed, and geometric features were extracted using zoning and gradient-based feature extraction. Geometric features are the line types that make up a character, such as vertical, horizontal, and diagonal lines. The features used were the number of horizontal, vertical, right-diagonal, and left-diagonal lines; the total lengths of the horizontal, vertical, right-slanting, and left-slanting lines; and the area of the skeleton. Each character was divided into 9 zones, and gradient-based extraction was used to obtain the horizontal and vertical components and the geometric features in each zone. The words were fed into a support vector machine classifier, and performance was evaluated by recognition accuracy. Since the standard support vector machine is a two-class classifier, a multiclass variant, least squares support vector machine (LSSVM), was used for word recognition with a one-vs-one strategy and an RBF kernel. The recognition accuracies obtained on the tested words were 66.7%, 83.3%, 85.7%, 87.5%, and 100%. The low recognition rate for some of the words may result from similarity in the extracted features.
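The zoning step described above can be sketched as follows: a binary character image is split into a 3×3 grid, and simple counts are taken per zone. The grid size matches the paper's 9 zones, but the specific per-zone quantities (pixel count plus horizontal/vertical gradient energy) are an assumption, not the paper's exact feature set.

```python
# Sketch of zoning-based feature extraction for a binary character skeleton.
# The image is split into a 3x3 grid of zones; per zone we take the foreground
# pixel count and horizontal/vertical gradient energies (assumed features).
import numpy as np

def zoning_features(img, grid=(3, 3)):
    """Return per-zone pixel counts and horizontal/vertical gradient sums."""
    img = np.asarray(img, dtype=float)
    gy, gx = np.gradient(img)               # derivatives along rows, columns
    rows = np.array_split(np.arange(img.shape[0]), grid[0])
    cols = np.array_split(np.arange(img.shape[1]), grid[1])
    feats = []
    for r in rows:
        for c in cols:
            zone = np.ix_(r, c)
            feats.extend([
                img[zone].sum(),            # foreground pixel count
                np.abs(gx[zone]).sum(),     # horizontal gradient energy
                np.abs(gy[zone]).sum(),     # vertical gradient energy
            ])
    return np.array(feats)

# Toy 9x9 "character" consisting of a single vertical stroke
img = np.zeros((9, 9))
img[:, 4] = 1
print(zoning_features(img).shape)   # 9 zones x 3 features per zone
```

The resulting fixed-length vector is what would then be fed to the one-vs-one LSSVM classifier.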


2021, Vol 8 (1)
Author(s): Aaron Frederick Bulagang, James Mountstephens, Jason Teo

Abstract

Background: Emotion prediction recognizes human emotion from a subject's physiological data. The problem addressed here is the limited use of heart rate (HR) as the prediction feature with common classifiers such as support vector machine (SVM), k-nearest neighbor (KNN), and random forest (RF). This paper investigates whether HR signals can be used to classify four emotion classes, based on Russell's emotion model, in a virtual reality (VR) environment using machine learning.

Method: An experiment was conducted using an Empatica E4 wristband to acquire each participant's HR, a VR headset as the display on which participants viewed the 360° emotional videos, and the Empatica E4 real-time application to extract and process the recorded heart rate during the experiment.

Findings: For intra-subject classification, all three classifiers (SVM, KNN, and RF) reached 100% as the highest accuracy, while inter-subject classification achieved 46.7% for SVM, 42.9% for KNN, and 43.3% for RF.

Conclusion: The results demonstrate the potential of SVM, KNN, and RF classifiers to use HR as a feature for predicting four distinct emotion classes in a virtual reality environment. Potential applications include interactive gaming, affective entertainment, and VR health rehabilitation.
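The three-classifier comparison can be sketched as below. The HR feature windows and the quadrant labeling rule are synthetic stand-ins for the Empatica E4 recordings; only the classifier lineup (SVM, KNN, RF) and the four-class setup come from the abstract.

```python
# Sketch of comparing SVM, KNN, and RF on heart-rate-derived features.
# Windows and labels are synthetic; labels mimic Russell's four quadrants
# by thresholding two hypothetical valence/arousal-like features.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_windows = 200
X = rng.normal(size=(n_windows, 6))   # e.g. mean HR, std, slope per window
# Hypothetical quadrant label from the first two features
y = (X[:, 0] > 0).astype(int) * 2 + (X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
results = {}
for name, clf in [("SVM", SVC()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("RF", RandomForestClassifier(random_state=0))]:
    results[name] = clf.fit(X_tr, y_tr).score(X_te, y_te)
print(results)
```

All three classifiers accept multiclass labels directly in scikit-learn (SVC handles it via a built-in one-vs-one scheme), which keeps the comparison loop uniform.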


2021, Vol 11 (1)
Author(s): Vikram Jakkamsetti, William Scudder, Gauri Kathote, Qian Ma, Gustavo Angulo, ...

Abstract

Time-to-fall off an accelerating rotating rod (rotarod) is widely utilized to evaluate rodent motor performance. We reasoned that this simple outcome could be refined with additional measures explicit in the task (however inconspicuously) to examine what we call movement sub-structure. Our goal was to characterize normal variation or motor impairment more robustly than by using time-to-fall. We also hypothesized that measures (or features) early in the sub-structure could anticipate the learning expected of a mouse undergoing serial trials. Using normal untreated and baclofen-treated movement-impaired mice, we defined these features and automated their analysis using paw video-tracking in three consecutive trials, including paw location, speed, acceleration, variance and approximate entropy. Spectral arc length yielded speed and acceleration uniformity. We found that, in normal mice, paw movement smoothness inversely correlated with rotarod time-to-fall for the three trials. Greater approximate entropy in vertical movements, and opposite changes in horizontal movements, correlated with greater first-trial time-to-fall. First-trial horizontal approximate entropy in the first few seconds predicted subsequent time-to-fall. This allowed for the separation, after only one rotarod trial, of different-weight, untreated mouse groups, and for the detection of mice otherwise unimpaired after baclofen, which displayed a time-to-fall similar to control. A machine-learning support vector machine classifier corroborated these findings. In conclusion, time-to-fall off a rotarod correlated well with several measures, including some obtained during the first few seconds of a trial, and some responsive to learning over the first two trials, allowing for predictions or preemptive experimental manipulations before learning completion.
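Approximate entropy, one of the sub-structure features named above, can be computed directly from a 1-D paw-position trace. The sketch below uses the standard ApEn(m, r) formulation; the parameters m = 2 and r = 0.2 × std are conventional defaults, not necessarily the study's settings.

```python
# Sketch of approximate entropy (ApEn) for a 1-D movement trace.
# Lower ApEn = more regular/predictable signal; higher = more erratic.
import numpy as np

def approx_entropy(x, m=2, r=None):
    """ApEn(m, r) of a 1-D time series via template matching."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()      # conventional tolerance, not the paper's value

    def phi(m):
        # All length-m templates, compared pairwise by Chebyshev distance
        templ = np.array([x[i:i + m] for i in range(n - m + 1)])
        dists = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        counts = (dists <= r).mean(axis=1)   # fraction of matching templates
        return np.log(counts).mean()

    return phi(m) - phi(m + 1)

t = np.arange(200)
regular = np.sin(0.3 * t)                          # smooth, predictable trace
noisy = np.random.default_rng(2).normal(size=200)  # erratic trace
print(approx_entropy(regular) < approx_entropy(noisy))  # regular scores lower
```

Because each template trivially matches itself, the match fraction never hits zero, so the logarithm is always defined.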

