Room Volume Estimation Based on Ambiguity of Short-Term Interaural Phase Differences Using Humanoid Robot Head

Robotics ◽  
2016 ◽  
Vol 5 (3) ◽  
pp. 16 ◽  
Author(s):  
Ryuichi Shimoyama ◽  
Reo Fukuda

2007 ◽  
Vol 121 (2) ◽  
pp. 1017-1027 ◽  
Author(s):  
Bernhard Ross ◽  
Kelly L. Tremblay ◽  
Terence W. Picton

1972 ◽  
Vol 15 (4) ◽  
pp. 771-780 ◽  
Author(s):  
Courtney Stromsta

Stutterers and nonstutterers cancelled the auditory sensation evoked by bone-conducted sinusoidal signals. They accomplished this by appropriate phase and amplitude adjustments of simultaneously presented bilateral air-conducted signals of the same frequency. Criterion measures of interaural phase difference at the point of cancellation were obtained for seven frequencies. The mean interaural phase differences obtained by stutterers were consistently greater than those of the nonstutterers. Based on time-equivalent values of the mean interaural phase differences, the values for stutterers were approximately twice as great as those for nonstutterers at 150, 300, and 1200 Hz. The mean interaural phase difference found for stutterers at 150 Hz approximates the magnitude of phase shift of normally delayed air-conducted auditory feedback of speech sounds that serves to induce experimental blockage of phonation. This relationship, in view of other findings, lends credence to the idea that a disturbance of laryngeal function, effected by an anomalous audition-phonation control system, could be a causative agent in stuttering.
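The "time-equivalent values" mentioned above follow directly from a phase difference and its frequency: a phase difference of φ degrees at frequency f corresponds to a delay of (φ/360)/f seconds. A minimal sketch of this conversion (the function name and the example values are illustrative, not figures from the study):

```python
def phase_to_time_delay_ms(phase_deg: float, freq_hz: float) -> float:
    """Convert an interaural phase difference (degrees) at a given
    frequency to its time-equivalent delay in milliseconds."""
    return (phase_deg / 360.0) / freq_hz * 1000.0

# A full cycle (360 degrees) at 1000 Hz is one period, i.e. 1 ms.
print(phase_to_time_delay_ms(360.0, 1000.0))  # 1.0

# At 150 Hz, the same phase angle corresponds to a much longer delay,
# which is why low-frequency phase differences translate into large
# time-equivalent values.
print(phase_to_time_delay_ms(180.0, 150.0))
```

Note that the same time delay maps to different phase angles at different frequencies, so comparisons across the seven test frequencies require exactly this conversion.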


Author(s):  
Xiai Chen ◽  
Hong Xu ◽  
Ling Wang ◽  
Bingrui Wang ◽  
Chenna Yang

2014 ◽  
Vol 2014 ◽  
pp. 1-5 ◽  
Author(s):  
Jizheng Yan ◽  
Zhiliang Wang ◽  
Yan Yan

Emotional robots are a longstanding focus of artificial intelligence (AI), and intelligent control of robot facial expression is an active research topic. This paper describes the design of a humanoid robot head, carried out in three steps. The first step addresses the uncanny valley: characterizing the relationship between human likeness and viewer response so that the design can avoid the effect. The second step establishes the correspondence between the human face and the robot head; by analyzing the similarities and differences between humans and robots through the Facial Action Coding System (FACS), we identify the shared basis and mechanisms that guide the robot toward humanoid expressions. Building on the first two steps, the third step is the construction of the robot head itself. In a series of experiments the head produced several humanoid expressions, and in human-robot interaction trials participants reported being surprised by the robot's expressions and feeling pleased.
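FACS decomposes facial expressions into numbered action units (AUs), which a robot head can translate into actuator commands. A minimal sketch of such a mapping, assuming hypothetical servo names and angles (the AU combinations for happiness and surprise follow standard FACS conventions; the servo table and values are illustrative, not the paper's actual design):

```python
# FACS action-unit combinations for two prototypical expressions.
EXPRESSION_AUS = {
    "happiness": [6, 12],        # AU6 cheek raiser, AU12 lip corner puller
    "surprise": [1, 2, 5, 26],   # brow raisers, upper lid raiser, jaw drop
}

# Hypothetical mapping from each AU to a (servo name, target angle) pair.
AU_TO_SERVO = {
    1: ("inner_brow", 30),
    2: ("outer_brow", 25),
    5: ("upper_lid", 20),
    6: ("cheek", 15),
    12: ("lip_corner", 35),
    26: ("jaw", 40),
}

def servo_targets(expression: str) -> dict:
    """Return servo-name -> angle commands for a named expression."""
    return {AU_TO_SERVO[au][0]: AU_TO_SERVO[au][1]
            for au in EXPRESSION_AUS[expression]}

print(servo_targets("happiness"))  # {'cheek': 15, 'lip_corner': 35}
```

Driving expressions through AU combinations rather than hand-tuned poses is what lets FACS analysis of the human face carry over to the robot head: each AU corresponds to a muscle group, and the robot only needs an actuator analogue for each one.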
