Measurement of Driver's Consciousness by Image Processing 2 -Detection of Concentration on Cellular Phone Call from Facial Expression Change coping with Individual Differences-

Author(s):  
Mami Yamakita ◽  
Kenichi Takahashi ◽  
Keiichi Yamada ◽  
Osami Yamamoto ◽  
Shin Yamamoto
2005 ◽  
Vol 125 (12) ◽  
pp. 1812-1817
Author(s):  
Kenichi Takahashi ◽  
Osami Yamamoto ◽  
Tomoaki Nakano ◽  
Shin Yamamoto

2016 ◽  
Author(s):  
Olímpio Murilo Capeli ◽  
Euvaldo Ferreira Cabral Junior ◽  
Sadao Isotani ◽  
Antonio Roberto Pereira Leite de Albuquerque

Author(s):  
Alejandro A. Arca ◽  
Kaitlin M. Stanford ◽  
Mustapha Mouloua

The current study was designed to empirically examine the effects of individual differences in attention and memory deficits on driver distraction. Forty-eight participants, 37 non-ADHD and 11 ADHD drivers, were tested in a medium-fidelity GE-ISIM driving simulator. All participants took part in a series of simulated driving scenarios involving both high and low traffic conditions while completing a 20-Questions task by either text message or phone call. Measures of UFOV, simulated driving, heart rate variability, and subjective workload (NASA TLX) were recorded for each of the experimental tasks. It was hypothesized that ADHD diagnosis, type of cellular distraction, and traffic density would affect driving performance as measured by driving metrics, workload assessment, and physiological measures. Preliminary results indicated that ADHD diagnosis, type of cellular distraction, and traffic density affected performance on the secondary task. These results provide further evidence for the deleterious effects of cellphone use on driver distraction, especially for drivers diagnosed with attention and memory-capacity deficits. Theoretical and practical implications are discussed, and directions for future research are also presented.


2020 ◽  
Vol 7 (9) ◽  
pp. 190699
Author(s):  
Sarah A. H. Alharbi ◽  
Katherine Button ◽  
Lingshan Zhang ◽  
Kieran J. O'Shea ◽  
Vanessa Fasolt ◽  
...  

Evidence that affective factors (e.g. anxiety, depression, affect) are significantly related to individual differences in emotion recognition is mixed. Palermo et al. (2018, J. Exp. Psychol. Hum. Percept. Perform. 44, 503–517) reported that individuals who scored lower in anxiety performed significantly better on two measures of facial-expression recognition (emotion-matching and emotion-labelling tasks), but not on a third measure (the multimodal emotion recognition test). By contrast, facial-expression recognition was not significantly correlated with measures of depression, positive or negative affect, empathy, or autistic-like traits. Because its range of affective factors and its use of multiple expression-recognition tasks make that study a relatively comprehensive investigation of the role of affective factors in facial-expression recognition, we carried out a direct replication. In common with Palermo et al. (2018), scores on the DASS anxiety subscale negatively predicted performance on the emotion recognition tasks across multiple analyses, although these correlations were only consistently significant for performance on the emotion-labelling task. However, and by contrast with Palermo et al. (2018), other affective factors (e.g. those related to empathy) often also significantly predicted emotion-recognition performance. Collectively, these results support the proposal that affective factors predict individual differences in emotion recognition, but that these correlations are not necessarily specific to measures of general anxiety, such as the DASS anxiety subscale.


2008 ◽  
Vol 381-382 ◽  
pp. 375-378
Author(s):  
K.T. Song ◽  
M.J. Han ◽  
F.Y. Chang ◽  
S.H. Chang

The capability of recognizing human facial expressions plays an important role in advanced human-robot interaction. By recognizing facial expressions, a robot can interact with a user in a more natural and friendly manner. In this paper, we propose a facial expression recognition system based on an embedded image processing platform that classifies different facial expressions on-line in real time. A low-cost embedded vision system was designed and realized for robotic applications using a CMOS image sensor and a digital signal processor (DSP). The current design acquires thirty 640×480 image frames per second (30 fps). The proposed emotion recognition algorithm has been successfully implemented on this real-time vision system. Experimental results on a pet robot show that the robot can interact with a person in a responsive manner. The developed image processing platform accelerates recognition to 25 recognitions per second with an average on-line recognition rate of 74.4% for five facial expressions.
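The abstract does not describe the classifier itself, but real-time pipelines like the one above (30 fps acquisition, 25 recognitions per second) commonly stabilize noisy per-frame labels with majority-vote smoothing over a short sliding window. The sketch below is purely illustrative: the `classify_frame` stub, the five expression labels, and the window size are assumptions, not details taken from the paper.

```python
from collections import Counter, deque

# Five expression classes, matching the count in the abstract
# (the actual label set is an assumption).
EXPRESSIONS = ["neutral", "happy", "angry", "sad", "surprised"]

def classify_frame(frame):
    # Stand-in for the DSP-side per-frame classifier; a real system
    # would extract facial features from pixel data here. (Hypothetical.)
    return frame["label"]

def smooth_predictions(frames, window=5):
    """Majority-vote smoothing over a sliding window of per-frame labels.

    Single-frame misclassifications are outvoted by their neighbours,
    giving a steadier expression stream at the output rate.
    """
    recent = deque(maxlen=window)
    smoothed = []
    for frame in frames:
        recent.append(classify_frame(frame))
        # Most common label in the current window wins.
        smoothed.append(Counter(recent).most_common(1)[0][0])
    return smoothed

# One misclassified frame in a run of "happy" frames:
frames = [{"label": "happy"}] * 4 + [{"label": "angry"}] + [{"label": "happy"}] * 4
print(smooth_predictions(frames))  # the lone "angry" frame is outvoted
```

The window size trades responsiveness for stability: a 5-frame window at 30 fps adds at most about 170 ms of lag, which is compatible with the 25-recognitions-per-second throughput the abstract reports.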

