BlinkListener

Author(s):  
Jialin Liu ◽  
Dong Li ◽  
Lei Wang ◽  
Jie Xiong

Eye blink detection plays a key role in many real-life applications such as Human-Computer Interaction (HCI), drowsy driving prevention and eye disease detection. Although traditional camera-based techniques are promising, multiple issues hinder their wide adoption, including privacy concerns, strict lighting conditions and line-of-sight (LoS) requirements. On the other hand, wireless sensing, which needs no dedicated sensors, has gained tremendous attention in recent years. Among the wireless signals utilized for sensing, acoustic signals show unique potential for fine-grained sensing owing to their low propagation speed in air. Another trend favoring acoustic sensing is the wide availability of speakers and microphones in commodity devices. Promising progress has been achieved in fine-grained human motion sensing, such as breathing, using acoustic signals. However, it is still very challenging to employ acoustic signals for eye blink detection due to the unique characteristics of eye blinks (i.e., subtle, sparse and aperiodic) and severe interference (i.e., from the human target themselves and from surrounding objects). We find that even the very subtle involuntary head movement induced by breathing can severely interfere with eye blink detection. In this work, for the first time, we propose a system called BlinkListener to sense the subtle eye blink motion using acoustic signals in a contact-free manner. We first quantitatively model the relationship between signal variation and the subtle movements caused by eye blinks and interference. Then, we propose a novel method that exploits the "harmful" interference to maximize the subtle signal variation induced by eye blinks. We implement BlinkListener on both a research-purpose platform (Bela) and a commodity smartphone (iPhone 5c). Experimental results show that BlinkListener achieves robust performance with a median detection accuracy of 95%. It maintains high accuracy when the smartphone is held in hand, when the target wears glasses or sunglasses, and in the presence of strong interference from people moving around.
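At the heart of acoustic sensing systems like this one is the fact that a reflector displacement changes the round-trip phase of an inaudible carrier by 4πd/λ, so subtle motion can be read out of the demodulated echo phase. The sketch below is a minimal illustration of that primitive, not the paper's pipeline; the 18 kHz carrier, 48 kHz sampling rate, simulated echo, and 4-sample smoothing filter are assumptions chosen so the example runs self-contained.

```python
import numpy as np

def iq_demodulate(rx, fc, fs):
    """Coherently demodulate a received tone at carrier fc (Hz) to baseband I/Q."""
    t = np.arange(len(rx)) / fs
    i = rx * np.cos(2 * np.pi * fc * t)
    q = -rx * np.sin(2 * np.pi * fc * t)
    # A 4-sample moving average exactly nulls the unwanted 2*fc image,
    # which aliases to 12 kHz for fc = 18 kHz at fs = 48 kHz.
    kernel = np.ones(4) / 4
    return np.convolve(i, kernel, "same") + 1j * np.convolve(q, kernel, "same")

fs, fc = 48_000, 18_000               # near-inaudible carrier on audio hardware
t = np.arange(0, 1.0, 1 / fs)
# Simulated echo whose phase steps by 0.6 rad halfway through, mimicking a
# subtle change in reflector position (roughly 1 mm at this wavelength).
phase = np.where(t < 0.5, 0.0, 0.6)
rx = np.cos(2 * np.pi * fc * t + phase)

est = np.unwrap(np.angle(iq_demodulate(rx, fc, fs)))
print(round(est[int(0.75 * fs)] - est[int(0.25 * fs)], 2))  # recovers the 0.6 rad step
```

In a real system the received signal would be the superposition of echoes from the eyelid, the head, and static objects, which is precisely why the paper must model and exploit the interference rather than filter it away.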

Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4895
Author(s):  
Thanh-Vinh Nguyen ◽  
Masaaki Ichiki

This paper reports on a mask-type sensor that performs simultaneous pulse wave and respiration measurements and eye blink detection using only one sensing element. In the proposed sensor, the sensing element is a flexible, airbag-shaped chamber whose inner pressure changes are measured by a microelectromechanical-system-based piezoresistive cantilever. The chamber is fabricated by wrapping a sponge pad with plastic film and polyimide tape. The polyimide tape has a hole to which the substrate carrying the piezoresistive cantilever adheres. When the sensor device is attached to a mask where it contacts the subject's nose, it can detect the subject's pulses and eye blinks from the vibration and displacement of the nose skin caused by these physiological activities. Moreover, the subject's respiration causes pressure changes in the space between the mask and the face as well as slight vibrations of the mask. Therefore, information about the respiration can be extracted from the sensor signal using either its low-frequency component (<1 Hz) or its high-frequency component (>100 Hz). This paper describes the sensor fabrication and demonstrates pulse wave and respiration measurements as well as eye blink detection using the fabricated sensor.
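The abstract's frequency split suggests how the respiration channel can be separated in software: low-pass the raw cantilever signal below 1 Hz and count breaths. The sketch below illustrates this on a synthetic trace; the sampling rate, signal amplitudes, and the brick-wall FFT mask (a simple stand-in for a proper low-pass filter) are assumptions.

```python
import numpy as np

fs = 1000  # Hz, assumed sampling rate of the cantilever readout
t = np.arange(0, 30, 1 / fs)
# Synthetic trace: respiration at 0.2 Hz (12 breaths/min) plus pulse at 1.2 Hz (72 bpm).
sig = 1.0 * np.sin(2 * np.pi * 0.2 * t) + 0.3 * np.sin(2 * np.pi * 1.2 * t)

# Keep only the <1 Hz band described in the abstract (brick-wall spectral
# mask as a simple stand-in for a proper low-pass filter).
spec = np.fft.rfft(sig)
freqs = np.fft.rfftfreq(len(sig), 1 / fs)
spec[freqs >= 1.0] = 0
resp = np.fft.irfft(spec, n=len(sig))

# Count breaths as local maxima above half the respiration amplitude.
mid = resp[1:-1]
breaths = np.count_nonzero((mid > resp[:-2]) & (mid > resp[2:]) & (mid > 0.5))
print(breaths)  # 0.2 Hz over 30 s -> 6 breaths
```

The same trace high-passed above 100 Hz would isolate the mask vibrations the authors use as the alternative respiration channel.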


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 4141
Author(s):  
Wouter Houtman ◽  
Gosse Bijlenga ◽  
Elena Torta ◽  
René van de Molengraft

For robots to execute their navigation tasks both fast and safely in the presence of humans, it is necessary to predict the route those humans intend to follow. In this work, a model-based method is proposed that relates human motion behavior perceived from RGBD input to the constraints imposed by the environment by considering typical human routing alternatives. Multiple hypotheses about a human's routing options towards local semantic goal locations are created and validated, including explicit collision-avoidance routes. It is demonstrated, with real-time, real-life experiments, that a coarse discretization based on the semantics of the environment suffices to properly distinguish between a person going, for example, to the left or to the right at an intersection. As such, a scalable and explainable solution is presented, which is suitable for incorporation within navigation algorithms.
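The paper's validation of routing hypotheses is model-based; as a toy stand-in, one can already rank semantic goal hypotheses by how well the person's instantaneous velocity points at each candidate goal. Everything below (the goal names, coordinates, and cosine-similarity score) is an illustrative assumption, not the authors' method.

```python
import numpy as np

def rank_goal_hypotheses(position, velocity, goals):
    """Score each semantic goal hypothesis by the cosine similarity between
    the person's velocity and the direction from the person to that goal."""
    v = velocity / np.linalg.norm(velocity)
    scores = {}
    for name, goal in goals.items():
        d = np.asarray(goal) - position
        scores[name] = float(v @ (d / np.linalg.norm(d)))
    return max(scores, key=scores.get), scores

# Hypothetical intersection: two corridor exits as semantic goal locations.
goals = {"left_corridor": (-4.0, 5.0), "right_corridor": (4.0, 5.0)}
best, scores = rank_goal_hypotheses(np.array([0.0, 0.0]),
                                    np.array([0.5, 1.0]),  # heading up and right
                                    goals)
print(best)  # the velocity points towards the right corridor
```

A full implementation would validate each hypothesis against a motion model over time rather than a single velocity sample, and would include the explicit collision-avoidance routes the abstract mentions.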


Author(s):  
Marlene Arangú ◽  
Miguel Salido

A fine-grained arc-consistency algorithm for non-normalized constraint satisfaction problems

Constraint programming is a powerful software technology for solving numerous real-life problems. Many of these problems can be modeled as Constraint Satisfaction Problems (CSPs) and solved using constraint programming techniques. However, solving a CSP is NP-complete, so filtering techniques that reduce the search space are still necessary. Arc-consistency algorithms are widely used to prune the search space. The concept of arc-consistency is bidirectional, i.e., it must be ensured in both directions of the constraint (direct and inverse constraints). Two of the most well-known and frequently used arc-consistency algorithms for filtering CSPs are AC3 and AC4. These algorithms repeatedly carry out revisions and require support checks to identify and delete all unsupported values from the domains. Nevertheless, many revisions are ineffective, i.e., they cannot delete any value yet consume many constraint checks and much time. In this paper, we present AC4-OP, an optimized version of AC4 that manages binary, non-normalized constraints in only one direction, storing the inverse supports it finds for later evaluation. Thus, it shortens the propagation phase by avoiding unnecessary or ineffective checks. AC4-OP reduces the number of constraint checks by 50% while pruning the same search space as AC4. The evaluation section shows the improvement of AC4-OP over AC4, AC6 and AC7 on random and non-normalized instances.
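AC4-OP itself is not reproduced here, but the idea all of these algorithms optimize, repeatedly revising arcs to delete unsupported values until a fixed point, is easy to sketch. Below is the classic coarse-grained AC-3 on a toy CSP; note how the constraint x < y must be encoded in both directions, which is exactly the bidirectional redundancy AC4-OP avoids by storing inverse supports.

```python
from collections import deque

def revise(domains, constraint, x, y):
    """Remove values of x that have no supporting value in y under `constraint`."""
    removed = False
    for vx in list(domains[x]):
        if not any(constraint(vx, vy) for vy in domains[y]):
            domains[x].discard(vx)
            removed = True
    return removed

def ac3(domains, arcs, constraints):
    """Classic AC-3: revise arcs until no domain changes (a fixed point)."""
    queue = deque(arcs)
    while queue:
        x, y = queue.popleft()
        if revise(domains, constraints[(x, y)], x, y):
            if not domains[x]:
                return False  # domain wipe-out: the CSP is inconsistent
            # x's domain shrank, so arcs pointing at x must be re-examined.
            queue.extend((z, x) for (z, w) in arcs if w == x and z != y)
    return True

# Tiny CSP: x < y with x, y in {1, 2, 3}, stated in both directions.
domains = {"x": {1, 2, 3}, "y": {1, 2, 3}}
constraints = {("x", "y"): lambda a, b: a < b,
               ("y", "x"): lambda a, b: b < a}
arcs = list(constraints)
ac3(domains, arcs, constraints)
print(domains)  # x loses 3 (no larger y), y loses 1 (no smaller x)
```

AC4, by contrast, is fine-grained: it counts supports per value-pair instead of re-scanning whole domains, which is the structure AC4-OP optimizes further.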


Author(s):  
Yang Gao ◽  
Yincheng Jin ◽  
Seokmin Choi ◽  
Jiyang Li ◽  
Junjie Pan ◽  
...  

Accurate recognition of facial expressions and emotional gestures is promising for understanding an audience's feedback on and engagement with entertainment content. Existing methods are primarily based on cameras or wearable sensors, which either raise privacy concerns or demand extra devices. To this end, we propose SonicFace, a novel ubiquitous sensing system based on a commodity microphone array, which provides an accessible, unobtrusive, contact-free, and privacy-preserving solution for continuously monitoring a user's emotional expressions without emitting audible sound. SonicFace uses a speaker and a microphone array to recognize various fine-grained facial expressions and emotional hand gestures via emitted ultrasound and the received echoes. In experimental evaluations, the accuracy of recognizing 6 common facial expressions and 4 emotional gestures reaches around 80%. In addition, extensive system evaluations with distinct configurations and an extended real-life case study demonstrate the robustness and generalizability of the proposed SonicFace system.
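Echo-based acoustic systems of this kind typically locate reflections by correlating the received signal with the emitted probe. The sketch below shows that generic building block with a near-ultrasonic chirp; the band, duration, and the simulated 120-sample echo delay are assumptions for illustration, not SonicFace's actual parameters.

```python
import numpy as np

def echo_delay_samples(tx, rx):
    """Estimate the echo delay (in samples) by cross-correlating the
    received signal with the emitted probe; the correlation peak marks the lag."""
    corr = np.correlate(rx, tx, mode="full")
    return int(np.argmax(np.abs(corr))) - (len(tx) - 1)

fs = 48_000
f0, f1, T = 17_000, 20_000, 0.01      # 10 ms chirp sweeping 17-20 kHz
t = np.arange(0, T, 1 / fs)
tx = np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * T) * t ** 2))

# Simulated echo: the probe returns attenuated after 120 samples, i.e.
# 120 / fs * 343 m/s / 2, which is about 0.43 m from the reflector.
rx = np.concatenate([np.zeros(120), 0.6 * tx])

delay = echo_delay_samples(tx, rx)
print(delay)  # -> 120
```

Fine-grained recognition then hinges on how echo profiles like this change over time as the face and hands move, which is what a classifier would consume.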


Author(s):  
Dang-Khoa Tran ◽  
Thanh-Hai Nguyen ◽  
Thanh-Nghia Nguyen

In electroencephalography (EEG) studies, eye blinks are a commonly known type of ocular artifact that appears in virtually any EEG measurement. The artifact appears as spiking electrical potentials whose time-frequency properties vary across individuals. Its presence can negatively impact medical or scientific research, or be helpful in brain-computer interface applications. Hence, detecting eye-blink signals is beneficial for determining the correlation between brain activity and eye movement. This paper presents a simple, fast, and automated eye-blink detection algorithm that requires no user training before execution. EEG signals are smoothed and filtered before eye-blink detection. We conducted experiments with ten volunteers and collected three different eye-blink datasets over three trials using the Emotiv EPOC+ headset. The proposed method performed consistently and successfully detected the spiking activity of eye blinks with a mean accuracy of over 96%.
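Because blinks show up as large, brief spikes against low-amplitude background EEG, a smooth-then-threshold detector of the kind the abstract describes can be illustrated on synthetic data. The 128 Hz rate, spike shape, amplitudes, and threshold below are assumptions for the sake of a runnable example, not the paper's parameters.

```python
import numpy as np

fs = 128  # Hz, the EPOC+ headset's nominal sampling rate (assumed here)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = 5 * rng.standard_normal(len(t))            # synthetic background EEG (uV)
for blink_t in (2.0, 5.0, 8.0):                  # inject three blink-like spikes
    eeg += 80 * np.exp(-((t - blink_t) ** 2) / (2 * 0.05 ** 2))

# Smooth with a 100 ms moving average, then threshold the spiking potentials.
win = int(0.1 * fs)
smooth = np.convolve(eeg, np.ones(win) / win, mode="same")
onsets = np.flatnonzero(np.diff((smooth > 30).astype(int)) == 1)
print(len(onsets), t[onsets].round(1))           # three detections, near 2, 5, 8 s
```

A practical detector would adapt the threshold per subject, since the abstract notes that blink time-frequency properties vary across individuals.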


Author(s):  
Lina Kluy ◽  
Eileen Roesler

Industrial human-robot collaboration (HRC) is not yet widespread, but it is on the rise. This development raises the question of which properties collaborative robots (cobots) need to enable pleasant and smooth interaction. This study therefore investigated the influence of transparency and reliability on the perception of, and trust towards, cobots. A video-enhanced online study with 124 participants was conducted. Transparency was provided through the presentation of differing information, and reliability was manipulated through differing error rates. The results showed a positive effect of transparency on perceived safety and intelligence. Reliability had a positive effect on perceived intelligence, likeability and trust. The effect of reliability on trust was more pronounced for robots with low transparency. The results indicate the relevance of carefully selected information to counteract the negative effects of failures. Future research should transfer the study design into a real-life experiment with more fine-grained levels of transparency and reliability.

