A Walking-in-Place Method for Virtual Reality Using Position and Orientation Tracking

Sensors ◽  
2018 ◽  
Vol 18 (9) ◽  
pp. 2832 ◽  
Author(s):  
Juyoung Lee ◽  
Sang Chul Ahn ◽  
Jae-In Hwang

People are interested in traveling through infinite virtual environments, but no standard navigation method yet exists in Virtual Reality (VR). The Walking-In-Place (WIP) technique is a navigation method that simulates locomotion, enabling immersive travel with less simulator sickness in VR. However, attaching sensors to the body is troublesome. A previously introduced method that performed WIP using an Inertial Measurement Unit (IMU) helped address this problem, since it does not require placing additional sensors on the body, and its evaluation demonstrated acceptable WIP performance. However, that method has limitations, including a high rate of falsely recognized steps when the user performs various body motions within the tracking area, and previous works did not evaluate WIP step-recognition accuracy. In this paper, we propose a novel WIP method using the position and orientation tracking provided by most PC-based VR HMDs. Our method likewise requires no additional sensors on the body and is more stable than the IMU-based method for non-WIP motions. We evaluated our method with nine subjects and found that the WIP step accuracy was 99.32% regardless of head tilt, and the error rate was 0% for the squat, a motion prone to misrecognition. We distinguish jog-in-place as “intentional motion” and other motions as “unintentional motion”; our method correctly recognizes only jog-in-place. We also apply a saw-tooth virtual-velocity function to our method mathematically; natural navigation becomes possible when this virtual-velocity approach is applied to WIP. Our method is useful for various applications that require jogging.
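The saw-tooth virtual-velocity idea can be sketched as follows. The profile shape, step period, and peak speed below are illustrative assumptions for the sketch, not the paper's exact formulation:

```python
import numpy as np

def sawtooth_velocity(t, step_period=0.5, v_max=1.5):
    """Hypothetical saw-tooth virtual-velocity profile: speed peaks when
    a jog-in-place step is detected, then decays linearly until the next
    step, so continuous stepping sustains forward motion."""
    phase = (t % step_period) / step_period  # 0..1 within a step cycle
    return v_max * (1.0 - phase)             # decays toward the next step

# Just after a step the avatar moves at v_max; halfway through the
# cycle the speed has decayed to half:
print(sawtooth_velocity(0.0))   # 1.5
print(sawtooth_velocity(0.25))  # 0.75
```

A periodic profile like this avoids the abrupt stop-start motion of a binary step trigger, which is what makes the navigation feel natural.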

Sensors ◽  
2019 ◽  
Vol 19 (2) ◽  
pp. 222 ◽  
Author(s):  
Lin Zhang ◽  
Wei Gao ◽  
Qian Li ◽  
Runbing Li ◽  
Zhanwei Yao ◽  
...  

The implementation principle of a typical three-pulse cold atom interference gyroscope is introduced in this paper. Based on its configuration and current research status, two problems of the cold atom interference gyroscope (CAIG) are pointed out: the data rate is insufficient, and high-dynamic measurement is difficult to achieve. Addressing these two limitations, a novel design of a monitoring navigation system combining a CAIG with an intermediate-grade inertial measurement unit (IMU) is proposed to obtain long-term position results without GPS signals, as required by the Inertial Navigation System (INS) of underwater vehicles. With the CAIG used as the external gyroscope, the IMU bias and the misalignment angle between the CAIG frame and the IMU frame are estimated through a filtering technique. Simulation and field tests demonstrated the improvement in the long-term positioning accuracy of the INS.
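In miniature, using a high-accuracy external gyroscope to calibrate an IMU can look like the scalar Kalman filter below. This is an assumption-laden sketch (random-walk bias model, single axis, made-up noise parameters); the paper's actual filter also estimates the inter-frame misalignment angle:

```python
import numpy as np

def estimate_imu_bias(imu_rates, caig_rates, q=1e-8, r=1e-4):
    """Illustrative scalar Kalman filter: treat the IMU gyro bias as a
    slowly varying random-walk state and use the drift-free CAIG rate
    as the measurement reference on the same axis."""
    bias, p = 0.0, 1.0
    for imu, caig in zip(imu_rates, caig_rates):
        p += q                      # predict: bias drifts as a random walk
        z = imu - caig              # innovation: one noisy bias sample
        k = p / (p + r)             # Kalman gain
        bias += k * (z - bias)      # measurement update
        p *= (1.0 - k)
    return bias

# Synthetic check: an IMU with 0.02 rad/s bias against a clean reference.
rng = np.random.default_rng(0)
true_bias = 0.02
imu = 1.0 + true_bias + rng.normal(0.0, 0.01, 1000)
caig = np.full(1000, 1.0)
print(round(estimate_imu_bias(imu, caig), 3))  # converges near 0.02
```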


Sensors ◽  
2021 ◽  
Vol 21 (17) ◽  
pp. 5743
Author(s):  
Fadwa El Aswad ◽  
Gilde Vanel Tchane Djogdom ◽  
Martin J.-D. Otis ◽  
Johannes C. Ayena ◽  
Ramy Meziane

Advances in robotics help reduce the burden that manufacturing tasks place on workers. For example, a cobot could be used as a “third arm” during assembly tasks. Thus, the necessity of designing new intuitive control modalities arises. This paper presents a foot-gesture approach centered on robot control constraints to switch between four operating modalities. The control scheme is based on raw data acquired by an instrumented insole worn on the user’s foot, composed of an inertial measurement unit (IMU) and four force sensors. First, a gesture dictionary was proposed and, from the acquired data, a set of 78 features was computed with a statistical approach and later reduced to three via analysis of variance (ANOVA). Then, the collected time-series data were converted into a 2D image and provided as input to a 2D convolutional neural network (CNN) for the recognition of foot gestures. Each gesture was mapped to a predefined cobot operating mode. The offline recognition rate appears to be highly dependent on the features considered and on their spatial representation in the 2D image. A higher recognition rate was achieved for a specific representation of features as sets of triangular and rectangular forms. These results are encouraging for the use of CNNs to recognize foot gestures, which will then be associated with commands to control an industrial robot.
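The ANOVA-based feature reduction can be sketched as ranking features by their one-way F-statistic across gesture classes and keeping the top three. The data below is synthetic and the function names are ours; this is only an illustration of the selection step, not the paper's pipeline:

```python
import numpy as np

def anova_f(feature, labels):
    """One-way ANOVA F-statistic for a single feature: ratio of
    between-class to within-class variance across gesture classes."""
    classes = np.unique(labels)
    grand = feature.mean()
    ssb = sum(len(feature[labels == c]) * (feature[labels == c].mean() - grand) ** 2
              for c in classes)
    ssw = sum(((feature[labels == c] - feature[labels == c].mean()) ** 2).sum()
              for c in classes)
    dfb, dfw = len(classes) - 1, len(feature) - len(classes)
    return (ssb / dfb) / (ssw / dfw)

def select_top_features(X, labels, k=3):
    """Rank columns of X (samples x features) by F-value and keep the k
    most discriminative, mirroring a 78 -> 3 ANOVA reduction."""
    scores = np.array([anova_f(X[:, j], labels) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

# Synthetic demo: only column 2 actually separates the two classes.
rng = np.random.default_rng(1)
labels = np.array([0] * 20 + [1] * 20)
X = rng.normal(0.0, 1.0, (40, 5))
X[:, 2] += labels * 5.0
print(select_top_features(X, labels, k=1))  # column 2 ranks first
```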


2011 ◽  
Vol 133 (07) ◽  
pp. 40-45
Author(s):  
Noel C. Perkins ◽  
Kevin King ◽  
Ryan McGinnis ◽  
Jessandra Hough

This article discusses using wireless sensors to improve sports training. One example of wireless sensors is inertial sensors, first developed for automotive and military applications. These tiny accelerometers and angular-rate gyros can be combined to form a complete inertial measurement unit (IMU). An IMU detects the three-dimensional motion of a body in space by sensing the acceleration of one point on the body as well as the angular velocity of the body. When this small but rugged device is mounted on or embedded within sports gear, such as the shaft of a golf club, the IMU provides the essential data needed to resolve the motion of that equipment. This technology, together with sound use of the theory of rigid-body dynamics, is now being developed and commercialized as the ingredients of new sports training systems. It won’t be too long before hardware based on microelectromechanical systems (MEMS) and sophisticated software combine to enable athletes at any level to get world-class training.
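Resolving equipment motion from IMU data amounts to strapdown integration: integrate angular rate to orientation, rotate body-frame acceleration into the world frame, then integrate twice to position. A minimal planar sketch (illustrative only, not a commercial algorithm, and ignoring gravity compensation and drift correction):

```python
import numpy as np

def integrate_imu(accels, gyro_rates, dt):
    """Planar strapdown dead reckoning: `accels` are body-frame (ax, ay)
    samples, `gyro_rates` are yaw rates (rad/s). Euler integration of
    heading, then of velocity and position in the world frame."""
    heading, vel, pos = 0.0, np.zeros(2), np.zeros(2)
    for a_body, omega in zip(accels, gyro_rates):
        heading += omega * dt
        c, s = np.cos(heading), np.sin(heading)
        a_world = np.array([c * a_body[0] - s * a_body[1],
                            s * a_body[0] + c * a_body[1]])
        vel += a_world * dt
        pos += vel * dt
    return heading, pos

# One second of constant 1 m/s^2 forward acceleration, no rotation:
heading, pos = integrate_imu([(1.0, 0.0)] * 100, [0.0] * 100, dt=0.01)
print(heading, pos)  # heading stays 0; pos[0] is close to 0.5 m
```

Real systems add gravity subtraction and drift correction on top of this loop, which is where the rigid-body-dynamics theory mentioned above comes in.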


Sensors ◽  
2019 ◽  
Vol 19 (13) ◽  
pp. 2845 ◽  
Author(s):  
Michael B. Del Rosario ◽  
Nigel H. Lovell ◽  
Stephen J. Redmond

Features were developed which account for the changing orientation of the inertial measurement unit (IMU) relative to the body, and demonstrably improved the performance of models for human activity recognition (HAR). The method is proficient at separating periods of standing from sedentary activity (i.e., sitting and/or lying) using only one IMU, even if it is arbitrarily oriented or subsequently re-oriented relative to the body; since the body is upright during walking, learning the IMU orientation during walking provides a reference orientation against which sitting and/or lying can be inferred. Thus, the two activities can be identified (irrespective of the cohort) by analyzing the magnitude of the angle of the shortest rotation that would bring the upright direction into coincidence with the average orientation from the most recent 2.5 s of IMU data. Models for HAR were trained using data obtained from a cohort of 37 older adults (83.9 ± 3.4 years) or 20 younger adults (21.9 ± 1.7 years). Test data were generated from the training data by virtually re-orienting the IMU so that they were representative of carrying the phone in five different orientations (relative to the thigh). The overall performance of the model for HAR was consistent whether the model was trained with the data from the younger cohort and tested with the virtually re-oriented data from the older cohort (Cohen’s Kappa 95% confidence interval [0.782, 0.793]; total class sensitivity 95% confidence interval [84.9%, 85.6%]), or, in the reciprocal scenario, trained with the data from the older cohort and tested with the virtually re-oriented data from the younger cohort (Cohen’s Kappa 95% confidence interval [0.765, 0.784]; total class sensitivity 95% confidence interval [82.3%, 83.7%]).
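The shortest-rotation angle described above can be computed directly from the dot product between the averaged sensed "up" direction and the learned upright reference. In this sketch the upright reference is assumed to be +z and the input is a mean accelerometer vector; the paper derives the reference from the walking orientation:

```python
import numpy as np

def tilt_angle(avg_gravity_body):
    """Angle (degrees) of the shortest rotation bringing the sensed up
    direction (e.g., the mean accelerometer vector over a 2.5 s window)
    into coincidence with an assumed upright reference of +z."""
    upright = np.array([0.0, 0.0, 1.0])
    v = np.asarray(avg_gravity_body, dtype=float)
    v = v / np.linalg.norm(v)
    cos_theta = np.clip(np.dot(v, upright), -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

# An IMU whose sensed "up" lies along x is tilted 90 degrees from
# upright, suggesting the wearer is sitting or lying:
print(tilt_angle([1.0, 0.0, 0.0]))  # 90.0
print(tilt_angle([0.0, 0.0, 1.0]))  # 0.0
```

Thresholding this angle is what lets one arbitrarily oriented IMU separate standing from sedentary postures.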


Sensors ◽  
2019 ◽  
Vol 19 (24) ◽  
pp. 5408 ◽  
Author(s):  
Tomasz Hachaj ◽  
Marcin Piekarczyk

The motivation of this paper is to examine the effectiveness of state-of-the-art and newly proposed motion-capture pattern-recognition methods in the task of head-gesture classification. The head gestures are designed for a user interface that utilizes a virtual reality helmet equipped with an inertial measurement unit (IMU) sensor with a 6-axis accelerometer and gyroscope. We validate a classifier that uses Principal Component Analysis (PCA)-based features with various numbers of dimensions, a two-stage PCA-based method, a feedforward artificial neural network, and a random forest. Moreover, we also propose a Dynamic Time Warping (DTW) classifier trained with an extension of the DTW Barycenter Averaging (DBA) algorithm that utilizes quaternion averaging, and a bagged variation of the previous method (DTWb) in which many DTW classifiers vote. The evaluation has been performed on 975 head-gesture recordings in seven classes acquired from 12 persons. The highest recognition rate in a leave-one-out test was obtained for DTWb and equals 0.975 (0.026 better than the best of the state-of-the-art methods to which we compared our approach). Among the most important applications of the proposed method is improving quality of life for people paralyzed below the neck, for example by supporting an assistive autonomous power chair with a head-gesture interface, or remote-controlled interfaces in robotics.
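The DTW classification step can be sketched as nearest-neighbor matching against per-class averaged templates. This simplification works on scalar sequences with toy templates of our own invention; the paper's method operates on quaternion orientation streams with DBA-averaged templates:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences,
    filled in by the standard O(n*m) dynamic program."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(sample, templates):
    """1-NN gesture classification against one template per class."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))

# Toy templates standing in for DBA-averaged gesture prototypes:
templates = {"nod": [0, 1, 0, -1, 0], "shake": [0, 2, 0, 2, 0]}
print(classify([0, 0.9, 0.1, -1.1, 0], templates))  # "nod"
```

The bagged DTWb variant would train several such classifiers on resampled training sets and let them vote on the label.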


2019 ◽  
Author(s):  
Jake J. Son ◽  
Jon C. Clucas ◽  
Curt White ◽  
Anirudh Krishnakumar ◽  
Joshua T. Vogelstein ◽  
...  

Wearable devices provide a means of tracking hand position in relation to the head, but have mostly relied on wrist-worn inertial measurement unit (IMU) sensors and proximity sensors, which are inadequate for identifying specific locations. This limits their utility for accurate and precise monitoring of behaviors or for providing feedback to guide behaviors. A potential clinical application is monitoring body-focused repetitive behaviors (BFRBs): recurrent, injurious behaviors directed toward the body, such as nail biting and hair pulling, that are often misdiagnosed and undertreated. Here, we demonstrate that including thermal sensors achieves higher accuracy in position tracking compared against IMU and proximity-sensor data alone. Our Tingle device distinguished between behaviors at six locations on the head across 39 adult participants, with high AUROC values (best: back of the head, median 1.0, median absolute deviation 0.0; worst: cheek, median 0.93, median absolute deviation 0.09). This study presents preliminary evidence of the advantage of including thermal sensors for position tracking and of the Tingle wearable device’s potential use in a wide variety of settings, including BFRB diagnosis and management.


2018 ◽  
Vol 30 (1) ◽  
pp. 76-85 ◽  
Author(s):  
Hiroaki Nakanishi ◽  
Hiroyuki Hashimoto

Electrically powered unmanned aerial vehicles (UAVs) are useful for inspecting infrastructure and plants. A power supply through a tether cable is effective in extending flight time. During inspection, some or all GPS satellites may be occluded; UAVs for inspection must operate even in GPS-denied areas, so a navigation system for such areas is required. Depth information cannot be obtained correctly by a monocular camera, and the ARToolkit, which is widely applied in augmented reality (AR), is not sufficient on its own as a UAV navigation system. We have proposed a hybrid navigation method that integrates the ARToolkit and an inertial measurement unit (IMU). An analytic solution for both the worst and best estimates of the yaw angle can be obtained by simple computation and helps remove outliers in the measurements. Experimental results showed that position estimation using the proposed method corresponded reasonably well; however, the offset between the camera origin and the body’s center of gravity had to be corrected.
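One plausible shape for such a hybrid scheme is an interval gate built from the analytic best/worst yaw estimates, followed by a complementary blend of marker and IMU yaw. Both function names, the margin, and the blend weight are assumptions for illustration, not the paper's equations:

```python
def accept_measurement(yaw_meas, yaw_best, yaw_worst, margin=0.05):
    """Outlier gate sketch: the analytic best/worst yaw estimates bound
    the plausible interval; marker measurements outside it (plus a small
    margin, in radians) are rejected."""
    lo, hi = sorted((yaw_best, yaw_worst))
    return lo - margin <= yaw_meas <= hi + margin

def fuse_yaw(yaw_marker, yaw_imu, alpha=0.9):
    """Complementary blend: trust the smooth, high-rate IMU yaw mostly,
    pulled toward the drift-free marker yaw."""
    return alpha * yaw_imu + (1.0 - alpha) * yaw_marker

# A marker reading inside the analytic bounds is fused; one far
# outside them is discarded as an outlier:
print(accept_measurement(0.10, 0.0, 0.2))  # True
print(accept_measurement(1.00, 0.0, 0.2))  # False
print(fuse_yaw(1.0, 0.0))                  # 0.1
```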
