Lightning: A Fast and Lightweight Acoustic Localization Protocol Using Low-End Wireless Micro-Sensors

Author(s):  
Qixin Wang ◽  
Rong Zheng ◽  
A. Tirumala ◽  
Xue Liu ◽  
Lui Sha

Author(s):  
Murali Manohar S Kumar ◽  
Himanshu Yadav ◽  
Dawnee Soman ◽  
Ashish Kumar ◽  
Arun Kumar Reddy N

Author(s):  
Atsushi Wada ◽  
Shingo Yoshizawa ◽  
Satoshi Yuasa ◽  
Hideki Sugimoto

Author(s):  
Sergej Neumann ◽  
David Oertel ◽  
Heinz Wörn ◽  
Martin Kurowski ◽  
Detlef Dewitz ◽  
...  

2000 ◽  
Vol 33 (21) ◽  
pp. 111-116 ◽  
Author(s):  
Lorenzo Mozzone ◽  
Pierfrancesco Lorenzelli ◽  
Andrea Caiti ◽  
Silvio Bongi

2018 ◽  
Vol 8 (10) ◽  
pp. 1862 ◽  
Author(s):  
Shuopeng Wang ◽  
Peng Yang ◽  
Hao Sun

Fingerprinting-based acoustic localization usually requires tremendous time and effort for database construction in the sampling phase and for reference-point (RP) matching in the positioning phase. To improve the efficiency of this localization process, an iterative interpolation method is proposed that reduces the number of initial RPs needed for the required positioning accuracy by generating virtual RPs in the positioning phase. Meanwhile, a two-stage matching method based on cluster analysis is proposed to reduce the computation of RP matching. The reported results show that, while maintaining positioning accuracy, the two-stage matching method based on feature-clustering partition reduces the average RP-matching workload to 30.14% of that of the global linear matching method, and the iterative interpolation method guarantees the positioning accuracy with only 27.77% of the initial RPs needed by the traditional method.
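The two-stage matching idea above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes Euclidean distance between acoustic fingerprints and uses plain k-means as a stand-in for the paper's feature-clustering partition; the function names (`build_clusters`, `two_stage_match`) are illustrative.

```python
import numpy as np

def build_clusters(rp_fingerprints, k, iters=20):
    """Partition reference-point (RP) fingerprints into k clusters with
    plain k-means (an illustrative stand-in for the paper's
    feature-clustering partition)."""
    rng = np.random.default_rng(0)
    centroids = rp_fingerprints[rng.choice(len(rp_fingerprints), k, replace=False)].copy()
    for _ in range(iters):
        # Assign each RP to its nearest centroid.
        d = np.linalg.norm(rp_fingerprints[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # Move each centroid to the mean of its members.
        for c in range(k):
            members = rp_fingerprints[labels == c]
            if len(members):
                centroids[c] = members.mean(axis=0)
    return centroids, labels

def two_stage_match(query, rp_fingerprints, centroids, labels):
    """Stage 1: pick the cluster whose centroid is nearest to the query.
    Stage 2: linear search only inside that cluster, instead of over
    the whole RP database."""
    c = np.linalg.norm(centroids - query, axis=1).argmin()
    idx = np.flatnonzero(labels == c)
    best = idx[np.linalg.norm(rp_fingerprints[idx] - query, axis=1).argmin()]
    return best  # index of the best-matching RP
```

Because stage 2 only scans one cluster's members, the average number of RP comparisons drops roughly by the number of clusters, which is the mechanism behind the reduction reported in the abstract.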


2021 ◽  
Vol 2 ◽  
Author(s):  
Thirsa Huisman ◽  
Axel Ahrens ◽  
Ewen MacDonald

To reproduce realistic audio-visual scenarios in the laboratory, Ambisonics is often used to reproduce a sound field over loudspeakers, and virtual reality (VR) glasses are used to present visual information. Both technologies have been shown to be suitable for research. However, combining Ambisonics with VR glasses might affect the spatial cues for auditory localization and thus the localization percept. Here, we investigated how VR glasses affect the localization of virtual sound sources on the horizontal plane produced using either 1st-, 3rd-, 5th- or 11th-order Ambisonics, with and without visual information. Results showed that the localization error is larger with 1st-order Ambisonics than with the higher orders, while the differences across the higher orders were small. The physical presence of the VR glasses without visual information increased the perceived lateralization of the auditory stimuli by about 2° on average, especially in the right hemisphere. Presenting visual information about the environment and potential sound sources reduced this shift induced by the head-mounted display (HMD); however, it could not fully compensate for it. While localization performance itself was affected by the Ambisonics order, there was no interaction between the Ambisonics order and the effect of the HMD. Thus, the presence of VR glasses can alter acoustic localization when Ambisonics sound reproduction is used, but visual information can compensate for most of the effect. As such, most use cases for VR will be unaffected by these shifts in the perceived location of the auditory stimuli.
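Why the Ambisonics order matters for localization can be seen from the encoder itself. The sketch below shows horizontal-only (circular-harmonic) Ambisonics encoding gains; it is not the authors' reproduction setup, and the function name `encode_horizontal` is illustrative.

```python
import math

def encode_horizontal(azimuth_rad, order):
    """Encode a plane-wave source at the given azimuth into
    horizontal-only Ambisonics gains up to the given order:
    [1, cos(phi), sin(phi), ..., cos(N*phi), sin(N*phi)].
    A higher order adds more channels, which yields sharper spatial
    resolution and is consistent with the smaller localization error
    reported for the higher orders."""
    gains = [1.0]  # 0th-order (omnidirectional) component
    for m in range(1, order + 1):
        gains += [math.cos(m * azimuth_rad), math.sin(m * azimuth_rad)]
    return gains
```

For example, 1st-order encoding produces 3 horizontal channels, whereas 11th-order produces 23, so an 11th-order system can represent a much narrower directional pattern over the loudspeaker array.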


Author(s):  
Chunping Wu ◽  
Honggao Deng ◽  
Suqing Yan ◽  
Xiyan Sun ◽  
Yuanfa Ji ◽  
...  
