A Sensor Fusion Framework for Indoor Localization Using Smartphone Sensors and Wi-Fi RSSI Measurements

2019 ◽  
Vol 9 (20) ◽  
pp. 4379 ◽  
Author(s):  
Alwin Poulose ◽  
Jihun Kim ◽  
Dong Seog Han

Sensor fusion frameworks for indoor localization are developed with the specific goal of reducing positioning errors. Although many conventional localization frameworks without fusion have been improved to reduce positioning error, sensor fusion frameworks generally provide a further improvement in positioning accuracy. In this paper, we propose a sensor fusion framework for indoor localization using the smartphone inertial measurement unit (IMU) sensor data and Wi-Fi received signal strength indication (RSSI) measurements. The proposed sensor fusion framework uses location fingerprinting and trilateration for Wi-Fi positioning. Additionally, a pedestrian dead reckoning (PDR) algorithm is used for position estimation in indoor scenarios. The proposed framework achieves a maximum of 1.17 m localization error for the rectangular motion of a pedestrian and a maximum of 0.44 m localization error for linear motion.
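As a rough illustration of the Wi-Fi positioning side of such a framework, the sketch below converts RSSI to distance with a log-distance path-loss model and then trilaterates a 2D position by linearized least squares. The reference power, path-loss exponent, and anchor layout are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rssi_to_distance(rssi, rssi_at_1m=-40.0, path_loss_exp=2.5):
    """Log-distance path-loss model: d = 10 ** ((P0 - RSSI) / (10 n))."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exp))

def trilaterate(anchors, distances):
    """Linearized least-squares trilateration from >= 3 known anchors."""
    anchors = np.asarray(anchors, float)
    d = np.asarray(distances, float)
    # Subtract the first anchor's range equation from the rest to linearize
    A = 2 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    # Solve A @ p = b in the least-squares sense
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

With more than three access points the overdetermined system averages out individual ranging errors, which is why fingerprinting and trilateration are often combined as in the paper.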

2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Guangbing Zhou ◽  
Jing Luo ◽  
Shugong Xu ◽  
Shunqing Zhang ◽  
Shige Meng ◽  
...  

Purpose Indoor localization is a key tool for robot navigation in indoor environments. Traditionally, robot navigation depends on a single sensor for autonomous localization. To enhance the navigation performance of mobile robots, this paper proposes a multiple data fusion (MDF) method for indoor environments. Design/methodology/approach Here, multiple sensor data, i.e. the collected information of an inertial measurement unit, odometer and laser radar, are used. An extended Kalman filter (EKF) then incorporates these multiple data, and the mobile robot can perform autonomous localization according to the proposed EKF-based MDF method in complex indoor environments. Findings The proposed method has been experimentally verified in different indoor environments, i.e. an office, a passageway and an exhibition hall. Experimental results show that the EKF-based MDF method achieves the best localization performance and robustness during navigation. Originality/value Indoor localization precision depends largely on the data collected from multiple sensors. The proposed method incorporates these collected data reasonably and can guide the mobile robot in autonomous navigation (AN) in indoor environments. Therefore, the output of this paper can be used for AN in complex and unknown indoor environments.
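The EKF prediction and correction steps underlying such a fusion can be sketched generically. The unicycle motion model, noise covariances, and position-fix measurement below are illustrative assumptions; the paper's actual models for the IMU, odometer and laser radar are not specified here.

```python
import numpy as np

def ekf_predict(x, P, u, dt, Q):
    """Unicycle motion model: state [px, py, theta], input [v, omega]."""
    v, w = u
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0,  1]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, H, R):
    """Linear measurement update, e.g. H = [[1,0,0],[0,1,0]] for a 2D position fix."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P
```

In an MDF setting, each sensor (odometry in the predict step, laser-derived position fixes in the update step) would contribute through its own measurement model and covariance.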


Sensors ◽  
2020 ◽  
Vol 20 (3) ◽  
pp. 919 ◽  
Author(s):  
Hao Du ◽  
Wei Wang ◽  
Chaowen Xu ◽  
Ran Xiao ◽  
Changyin Sun

The question of how to estimate the state of an unmanned aerial vehicle (UAV) in real time across multiple environments remains a challenge. Although the global navigation satellite system (GNSS) has been widely applied, drones cannot perform position estimation when a GNSS signal is unavailable or disturbed. In this paper, the problem of state estimation in multiple environments is solved by employing an Extended Kalman Filter (EKF) algorithm to fuse the data from multiple heterogeneous sensors (MHS), including an inertial measurement unit (IMU), a magnetometer, a barometer, a GNSS receiver, an optical flow sensor (OFS), Light Detection and Ranging (LiDAR), and an RGB-D camera. Finally, the robustness and effectiveness of the EKF-based multi-sensor data fusion system are verified by field flights in unstructured, indoor, outdoor, and indoor-outdoor transition scenarios.


Sensors ◽  
2018 ◽  
Vol 19 (1) ◽  
pp. 46 ◽  
Author(s):  
N. Koksal ◽  
M. Jalalmaab ◽  
B. Fidan

In this paper, an infinite-horizon adaptive linear quadratic tracking (ALQT) control scheme is designed for optimal attitude tracking of a quadrotor unmanned aerial vehicle (UAV). The proposed control scheme is experimentally validated in the presence of real-world uncertainties in quadrotor system parameters and sensor measurements. The designed control scheme guarantees asymptotic stability of the closed-loop system, relying on the complete controllability of the attitude dynamics in applying optimal control signals. To achieve robustness against parametric uncertainties, the optimal tracking solution is combined with an online least-squares-based parameter identification scheme to estimate the instantaneous inertia of the quadrotor. Measurement noises of the on-board Inertial Measurement Unit (IMU) sensors are also taken into account. To improve controller performance in the presence of these noises, two sensor fusion techniques are employed, one based on Kalman filtering and the other on complementary filtering. The ALQT controller performance is compared for the two sensor fusion techniques, and it is concluded that the Kalman filter based approach provides a smaller mean-square estimation error, better attitude estimation, and better attitude control performance.
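Of the two fusion techniques compared, the complementary filter is the simpler to sketch: it blends the integrated gyro rate (accurate at high frequency) with the accelerometer-derived tilt (drift-free at low frequency). The blending constant and the roll-only formulation below are illustrative assumptions, not the paper's implementation.

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend high-frequency gyro integration with low-frequency accel tilt.

    alpha close to 1 trusts the integrated gyro; (1 - alpha) slowly corrects
    gyro drift toward the accelerometer-derived angle.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle

def accel_to_roll(ay, az):
    """Roll angle (rad) from the gravity components seen by the accelerometer."""
    return math.atan2(ay, az)
```

A Kalman filter replaces the fixed constant alpha with a gain computed from the modeled noise covariances, which is one reason it achieved the lower mean-square estimation error reported above.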


Sensors ◽  
2021 ◽  
Vol 21 (21) ◽  
pp. 7051
Author(s):  
José Manuel Villadangos ◽  
Jesús Ureña ◽  
Juan Jesús García-Domínguez ◽  
Ana Jiménez-Martín ◽  
Álvaro Hernández ◽  
...  

Ultrasonic local positioning systems (ULPS) have attracted researchers' attention as one option for indoor localization. Acoustic systems offer a suitable trade-off between precision, ease of development, and cost. This work proposes a method for measuring the time of arrival of encoded emissions from a set of ultrasonic beacons, which are used to implement an accurate ULPS. The method uses the generalized cross-correlation technique with a PHAT filter and weighting factor β (GCC-PHAT-β). To improve the performance of GCC-PHAT-β in encoded emission detection, mixed-medium multiple-access techniques are proposed, based on code-division and time-division multiplexing of the beacon emissions (CDMA and TDMA, respectively), together with dynamic adjustment of the PHAT filter weighting factor. The receiver position is obtained by hyperbolic multilateration from the time differences of arrival (TDoA) between a reference beacon and the rest, thus avoiding the need for receiver synchronization. The results show that the dynamic adaptation of the weighting factor significantly reduces positioning errors, from 20 cm to 2 cm in 80% of measurements. Simulated and real experiments prove that the proposed algorithms improve the performance of the ULPS in situations with signal-to-noise ratios (SNR) below 0 dB and in environments where the multipath effect makes it difficult to correctly detect the encoded ultrasonic emissions.
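A minimal sketch of the GCC with a weighted PHAT transform is shown below: β = 1 gives the classic PHAT whitening, and β = 0 falls back to plain cross-correlation. The beacon encoding, multiple-access scheme, and dynamic β adjustment from the paper are not reproduced; only the core time-delay estimation is illustrated.

```python
import numpy as np

def gcc_phat_beta(sig, ref, beta=1.0, fs=1):
    """Estimate the delay (in samples / fs) of `sig` relative to `ref`."""
    n = len(sig) + len(ref)                   # zero-pad for linear correlation
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    G = SIG * np.conj(REF)                    # cross-power spectrum
    G /= np.maximum(np.abs(G), 1e-12) ** beta # weighted PHAT whitening
    cc = np.fft.irfft(G, n=n)
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (np.argmax(np.abs(cc)) - max_shift) / fs
```

The PHAT weighting flattens the magnitude spectrum so that the correlation peak is sharpened against multipath echoes, at the cost of amplifying low-SNR bands, which is what the adaptive β is meant to balance.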


Sensors ◽  
2019 ◽  
Vol 19 (23) ◽  
pp. 5084 ◽  
Author(s):  
Alwin Poulose ◽  
Dong Seog Han

Smartphone camera or inertial measurement unit (IMU) sensor-based systems can independently provide accurate indoor positioning results. However, the accuracy of an IMU-based localization system depends on the magnitude of sensor errors caused by external electromagnetic noise or sensor drift. Smartphone camera-based positioning systems depend on the experimental floor map and the camera poses, and their accuracy suffers when the user's direction changes rapidly. To minimize the positioning errors of both approaches, we propose hybrid systems that combine camera-based and IMU sensor-based approaches for indoor localization. In this paper, an indoor experiment scenario is designed to analyse the performance of the IMU-based, smartphone camera-based and proposed hybrid indoor localization systems. The performance of the proposed hybrid systems is analysed in terms of average localization error and the probability distributions of localization errors. The experiment results demonstrate the effectiveness of the hybrid approach: the proposed oriented FAST and rotated binary robust independent elementary features (BRIEF) simultaneous localization and mapping (ORB-SLAM) with IMU sensor hybrid system shows a mean localization error of 0.1398 m, and the proposed simultaneous localization and mapping by fusion of keypoints and squared planar markers (UcoSLAM) with IMU sensor hybrid system shows a mean localization error of 0.0690 m. Both are compared with the individual localization systems in terms of mean, maximum and minimum error and the standard deviation of error, and exhibit significantly higher position accuracy.


Sensors ◽  
2020 ◽  
Vol 20 (16) ◽  
pp. 4410 ◽  
Author(s):  
Faisal Jamil ◽  
Naeem Iqbal ◽  
Shabir Ahmad ◽  
Do-Hyeun Kim

The Internet of Things is advancing, and the augmented role of smart navigation in automating processes is at its vanguard. Smart navigation and location tracking systems are finding increasing use in mission-critical indoor scenarios, logistics, medicine, and security. Indoor localization is a demanding emerging area, owing to the increased interest in location-based services. Numerous inertial measurement unit (IMU)-based indoor localization mechanisms have been suggested in this regard. However, these methods have many shortcomings in terms of accuracy and consistency. In this study, we propose a novel position estimation system based on a learning-to-prediction model to address the above challenges. The designed system consists of two modules: a learning-to-prediction module and position estimation using sensor fusion in an indoor environment. The prediction algorithm is attached to the learning module, which continuously controls, observes, and enhances the efficiency of the prediction algorithm by evaluating its output and taking into account the exogenous factors that may affect its outcome. In the designed system, we consider a scenario where a learning module based on an artificial neural network and a Kalman filter are used as the prediction algorithm to predict the actual accelerometer and gyroscope readings from the noisy sensor readings. To acquire data, we use a next-generation inertial measurement unit, which provides 3-axis accelerometer and gyroscope data. Finally, to evaluate the performance and accuracy of the proposed system, we carried out a number of experiments and observed that the proposed Kalman filter with the learning module outperformed the traditional Kalman filter algorithm in terms of the root-mean-square error metric.


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 7
Author(s):  
Vicent Rodrigo Marco ◽  
Jens Kalkkuhl ◽  
Jörg Raisch ◽  
Thomas Seel

Multi-modal sensor fusion has become ubiquitous in the field of vehicle motion estimation. Achieving a consistent sensor fusion in such a set-up demands the precise knowledge of the misalignments between the coordinate systems in which the different information sources are expressed. In ego-motion estimation, even sub-degree misalignment errors lead to serious performance degradation. The present work addresses the extrinsic calibration of a land vehicle equipped with standard production car sensors and an automotive-grade inertial measurement unit (IMU). Specifically, the article presents a method for the estimation of the misalignment between the IMU and vehicle coordinate systems, while considering the IMU biases. The estimation problem is treated as a joint state and parameter estimation problem, and solved using an adaptive estimator that relies on the IMU measurements, a dynamic single-track model as well as the suspension and odometry systems. Additionally, we show that the validity of the misalignment estimates can be assessed by identifying the misalignment between a high-precision INS/GNSS and the IMU and vehicle coordinate systems. The effectiveness of the proposed calibration procedure is demonstrated using real sensor data. The results show that estimation accuracies below 0.1 degrees can be achieved in spite of moderate variations in the manoeuvre execution.
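The misalignment itself is a fixed rotation between two sensor frames. As a much-simplified sketch (not the paper's adaptive joint state-and-parameter estimator), the Kabsch/Wahba solution below recovers such a rotation from paired vector observations expressed in both frames; the noiseless paired-vector setup is an illustrative assumption.

```python
import numpy as np

def estimate_rotation(v_imu, v_vehicle):
    """Kabsch/Wahba solution: the rotation R minimizing ||R v_imu - v_vehicle||
    over paired rows of the two (N x 3) arrays."""
    B = np.asarray(v_vehicle).T @ np.asarray(v_imu)   # correlation matrix
    U, _, Vt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U @ Vt))                # enforce a proper rotation
    return U @ np.diag([1.0, 1.0, d]) @ Vt
```

In practice the IMU biases corrupt the observed vectors, which is why the paper estimates misalignment and biases jointly rather than with a one-shot fit like this.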


Electronics ◽  
2021 ◽  
Vol 10 (5) ◽  
pp. 618
Author(s):  
Jan Grottke ◽  
Jörg Blankenbach

Due to their distinctive presence in everyday life and the variety of available built-in sensors, smartphones have become the focus of recent indoor localization research. Hence, this paper describes a novel smartphone-based sensor fusion algorithm. It combines the relative inertial measurement unit (IMU) based movements of pedestrian dead reckoning with the absolute fingerprinting-based position estimations of Wireless Local Area Network (WLAN), Bluetooth Low Energy (BLE), and magnetic field anomalies, as well as a building model, in real time. Thus, a step-based position estimation without knowledge of any start position was achieved. For this, a grid-based particle filter and a Bayesian filter approach were combined. Furthermore, various optimization methods were compared to weight the different information sources within the sensor fusion algorithm, thus achieving high position accuracy. Although a particle filter was used, no particles move, owing to a novel grid-based particle interpretation: the particles' probability values change with every new information source and every stepwise iteration via a probability-map-based approach. By adjusting the weights of the individual measurement methods against a knowledge-based reference, the mean and maximum position errors were reduced by 31%, the RMSE by 34%, and the 95th-percentile positioning error by 52%.
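The grid-based particle interpretation can be sketched as a probability map with a Bayesian measurement update and a diffusion-style step prediction, so no particle positions ever move. The spread fraction and four-neighbour stencil below are illustrative assumptions.

```python
import numpy as np

def grid_update(prob, likelihood):
    """Bayes measurement update on a probability grid: multiply each cell's
    prior by the likelihood of the observation there, then renormalize."""
    post = prob * likelihood
    s = post.sum()
    return post / s if s > 0 else np.full(prob.shape, 1.0 / prob.size)

def step_predict(prob, spread=0.2):
    """Motion update for one pedestrian step: keep most mass in place and
    diffuse a fraction to the four neighbours (numpy-only stencil)."""
    post = (1 - spread) * prob.copy()
    post[1:, :]  += spread / 4 * prob[:-1, :]
    post[:-1, :] += spread / 4 * prob[1:, :]
    post[:, 1:]  += spread / 4 * prob[:, :-1]
    post[:, :-1] += spread / 4 * prob[:, 1:]
    return post / post.sum()
```

Each information source (WLAN, BLE, magnetic field, building model) would contribute its own per-cell likelihood grid to `grid_update`, weighted as described above.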


2012 ◽  
Vol 19 (2) ◽  
pp. 31-40
Author(s):  
Lukas Köping ◽  
Thomas Mühsam ◽  
Christian Ofenberg ◽  
Bernhard Czech ◽  
Michael Bernard ◽  
...  

Abstract In this paper we present an indoor localization system based on a particle filter and multiple sensor data such as acceleration, angular velocity and compass readings. With this approach we tackle the problem of documentation on large building yards during the construction phase. Due to the circumstances of such an environment, we cannot rely on any data from GPS, Wi-Fi or RFID. Moreover, this work serves as a first step towards an all-in-one navigation system for mobile devices. Our experimental results show that we can achieve high accuracy in position estimation.
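A conventional bootstrap particle filter over step and compass data can be sketched as one predict-weight-resample cycle; the step length, noise levels, and Gaussian compass likelihood below are illustrative assumptions, not the paper's parameters.

```python
import math
import random

def pf_step(particles, step_len, z_heading, motion_noise=0.1, sigma=0.2):
    """One cycle of a bootstrap particle filter for dead-reckoning tracking:
    draw a noisy heading per particle, move it one step, weight it by the
    Gaussian compass likelihood, then resample."""
    moved, weights = [], []
    for x, y, _ in particles:
        th = z_heading + random.gauss(0, motion_noise)   # noisy heading draw
        moved.append((x + step_len * math.cos(th),
                      y + step_len * math.sin(th), th))
        weights.append(math.exp(-0.5 * ((th - z_heading) / sigma) ** 2))
    total = sum(weights)
    weights = [w / total for w in weights]
    # Systematic (low-variance) resampling
    n, u = len(moved), random.random() / len(moved)
    cum, i, out = weights[0], 0, []
    for _ in range(n):
        while u > cum and i < n - 1:
            i += 1
            cum += weights[i]
        out.append(moved[i])
        u += 1.0 / n
    return out
```

Acceleration data would drive step detection (deciding when to call `pf_step`), while gyroscope and compass data constrain the heading, matching the sensor set listed in the abstract.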


Sensors ◽  
2019 ◽  
Vol 19 (20) ◽  
pp. 4357 ◽  
Author(s):  
Babak Shahian Jahromi ◽  
Theja Tulabandhula ◽  
Sabri Cetin

Many sensor fusion frameworks have been proposed in the literature, using different combinations and configurations of sensors and fusion methods. Most of the focus has been on improving accuracy; the feasibility of implementing these frameworks in an autonomous vehicle is less explored. Some fusion architectures perform very well in lab conditions using powerful computational resources; in real-world applications, however, they cannot be implemented on an embedded edge computer due to their high cost and computational needs. We propose a new hybrid multi-sensor fusion pipeline configuration that performs environment perception for autonomous vehicles, such as road segmentation, obstacle detection, and tracking. This fusion framework uses a proposed encoder-decoder based Fully Convolutional Neural Network (FCNx) and a traditional Extended Kalman Filter (EKF) nonlinear state estimator, together with a configuration of camera, LiDAR, and radar sensors best suited to each fusion method. The goal of this hybrid framework is to provide a cost-effective, lightweight, modular, and robust (in case of a sensor failure) fusion system. The FCNx algorithm improves road detection accuracy compared to benchmark models while maintaining the real-time efficiency required for an autonomous vehicle's embedded computer. Tested on over 3K road scenes, our fusion algorithm shows better performance in various environment scenarios compared to baseline benchmark networks. Moreover, the algorithm is implemented in a vehicle and tested using actual sensor data collected from it, performing real-time environment perception.

