Benefits of Multi-Constellation/Multi-Frequency GNSS in a Tightly Coupled GNSS/IMU/Odometry Integration Algorithm

Sensors ◽  
2018 ◽  
Vol 18 (9) ◽  
pp. 3052 ◽  
Author(s):  
Björn Reuper ◽  
Matthias Becker ◽  
Stefan Leinen

Localization algorithms based on global navigation satellite systems (GNSS) play an important role in automotive positioning. Due to the advent of autonomously driving cars, their importance is expected to grow even further in the coming years. Simultaneously, the performance requirements for these localization algorithms will increase because they are no longer used exclusively for navigation, but also for control of the vehicle’s movement. These requirements cannot be met with GNSS alone. Instead, algorithms for sensor data fusion are needed. While the combination of GNSS receivers with inertial measurement units (IMUs) is a common approach, it is traditionally executed in a single-frequency/single-constellation architecture, usually with the Global Positioning System’s (GPS) L1 C/A signal. With the advent of new GNSS constellations and civil signals on multiple frequencies, GNSS/IMU integration algorithm performance can be improved by utilizing these new data sources. To achieve this, we upgraded a tightly coupled GNSS/IMU integration algorithm to process measurements from GPS (L1 C/A, L2C, L5) and Galileo (E1, E5a, E5b). After investigating various combination strategies, we chose to work preferentially with ionosphere-free combinations of L5-L1 C/A and E5a-E1 pseudo-ranges. L2C-L1 C/A and E5b-E1 combinations as well as single-frequency pseudo-ranges on L1 and E1 serve as backups when no L5/E5a measurements are available. To be able to process these six types of pseudo-range observations simultaneously, the differential code biases (DCBs) of the employed receiver need to be calibrated. Time-differenced carrier-phase measurements on L1 and E1 provide the algorithm with pseudo-range-rate observations. For additional aiding, information about the vehicle’s velocity, obtained by an odometry model fed with angular velocities from all four wheels as well as the steering wheel angle, is incorporated into the algorithm.
To evaluate the performance improvement provided by these new data sources, two sets of measurement data are collected and the resulting navigation solutions are compared to a higher-grade reference system, consisting of a geodetic GNSS receiver for real-time kinematic positioning (RTK) and a navigation grade IMU. The multi-frequency/multi-constellation algorithm with odometry aiding achieves a 3-D root mean square (RMS) position error of 3.6 m / 2.1 m in these data sets, compared to 5.2 m / 2.9 m for the single-frequency GPS algorithm without odometry aiding. Odometry is most beneficial to positioning accuracy when GNSS measurement quality is poor. This is demonstrated in data set 1, resulting in a reduction of the horizontal position error’s 95% quantile from 6.2 m without odometry aiding to 4.2 m with odometry aiding.
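The ionosphere-free combination the abstract refers to is a standard dual-frequency construction: because the first-order ionospheric delay scales with the inverse square of the carrier frequency, a frequency-weighted difference of two pseudo-ranges cancels it. A minimal sketch for the GPS L1 C/A + L5 pair (the carrier frequencies are the published GPS values; the range and delay values are illustrative):

```python
F_L1 = 1575.42e6  # GPS L1 carrier frequency [Hz]
F_L5 = 1176.45e6  # GPS L5 carrier frequency [Hz]

def iono_free(p1: float, p2: float, f1: float = F_L1, f2: float = F_L5) -> float:
    """Ionosphere-free pseudo-range from measurements p1 (at f1) and p2 (at f2)."""
    return (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)

# With a simulated ionospheric delay of 3.0 m on L1, scaled by (f1/f2)^2 on L5,
# the combination recovers the geometric range:
rho = 22_000_000.0                 # true geometric range [m]
i1 = 3.0                           # ionospheric delay on L1 [m]
i5 = i1 * (F_L1 / F_L5) ** 2       # corresponding delay on L5 [m]
p_if = iono_free(rho + i1, rho + i5)
```

The same construction applies to the Galileo E5a-E1 pair, since E1 and E5a share the L1 and L5 center frequencies.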

2011 ◽  
Vol 467-469 ◽  
pp. 108-113
Author(s):  
Xin Yu Li ◽  
Dong Yi Chen

Accurate tracking for Augmented Reality applications is a challenging task. Multi-sensor hybrid tracking generally provides more stable results than visual tracking alone. This paper presents a new tightly-coupled hybrid tracking approach combining a vision-based system with an inertial sensor. Measurement data are synchronized based on multi-frequency sampling theory, and a strong tracking filter (STF) is used to smooth sensor data and estimate position and orientation. By adding a time-varying fading factor that adaptively adjusts the prediction error covariance of the filter, this method improves tracking performance for fast-moving targets. Experimental results show the efficiency and robustness of the proposed approach.
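The fading-factor idea behind a strong tracking filter can be sketched in scalar form: the predicted error covariance is inflated by a time-varying factor lam >= 1, which keeps the Kalman gain large when the target moves faster than the motion model predicts. The values of f, q, r, and lam below are generic placeholders, not the paper's tuning:

```python
def stf_step(x, p, z, f=1.0, q=0.01, r=0.1, lam=1.0):
    """One predict/update step of a scalar Kalman filter with fading factor lam."""
    x_pred = f * x
    p_pred = lam * (f * p * f) + q   # lam > 1 inflates the prediction covariance
    k = p_pred / (p_pred + r)        # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# With lam > 1 the gain is larger, so the estimate tracks an abrupt jump faster:
x_a, p_a = 0.0, 0.1
x_b, p_b = 0.0, 0.1
for z in [5.0, 5.0, 5.0]:                        # target suddenly at 5.0
    x_a, p_a = stf_step(x_a, p_a, z, lam=1.0)    # plain Kalman filter
    x_b, p_b = stf_step(x_b, p_b, z, lam=5.0)    # strong tracking variant
```

After three measurements the faded filter sits much closer to the jump target than the plain one, at the cost of a noisier steady state.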


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2169
Author(s):  
Viktor Tihanyi ◽  
Tamás Tettamanti ◽  
Mihály Csonthó ◽  
Arno Eichberger ◽  
Dániel Ficzere ◽  
...  

A large-scale measurement campaign was carried out on a real-world motorway stretch in Hungary with the participation of international industrial and academic partners. The measurement resulted in vehicle-based and infrastructure-based sensor data that will be extremely useful for future automotive R&D activities due to the available ground truth for static and dynamic content. The aim of the measurement campaign was twofold. On the one hand, the road geometry was mapped with high precision in order to build an Ultra High Definition (UHD) map of the test road. On the other hand, the vehicles, equipped with differential Global Navigation Satellite System (GNSS) receivers for ground-truth localization, carried out special test scenarios while collecting detailed data using different sensors. All of the test runs were recorded by both the vehicles and the infrastructure. The paper also showcases application examples to demonstrate the viability of the collected data given access to the ground-truth labeling. This data set may support a large variety of solutions for the test and validation of different kinds of approaches and techniques. As a complementary task, the available 5G network was monitored and tested under different radio conditions to investigate latency in different measurement scenarios. A part of the measured data has been shared openly, so that interested automotive and academic parties may use it for their own purposes.


Sensors ◽  
2020 ◽  
Vol 20 (15) ◽  
pp. 4073 ◽  
Author(s):  
Wenhao Yang ◽  
Yue Liu ◽  
Fanming Liu

Global Navigation Satellite Systems (GNSS) have become the primary choice for device localization in outdoor situations. At the same time, many applications do not require precise absolute Earth coordinates; instead, the geometric configuration of the constituent nodes in the system can be inferred by relative positioning. The Real-Time Kinematic (RTK) technique is efficient and accurate in calculating the relative position. However, when cycle slips occur, the RTK method may take a long time to obtain a fixed ambiguity value, and the positioning result will be a “float” solution with meter-level accuracy. The novel method presented in this paper is based on the Relative GNSS Tracking Algorithm (Regtrack). It calculates the changes in the relative baseline between two receivers without an ambiguity estimation. A dead reckoning method provides the relative baseline solution, while a parallel-running Extended Kalman Filter (EKF) method re-initiates the relative baseline when too many validation failures occur. We conducted both static and kinematic tests to assess the performance of the new methodology. The experimental results show that the proposed strategy can give accurate millimeter-scale solutions for the relative motion vectors between two adjacent epochs. The relative baseline solution can reach sub-decimeter level whether or not the base station is static. Moreover, when the initial tracking point and base station coordinates are precisely known, the tracking error can be as small as 40 cm from the ground truth after a 25 min drive test in an urban environment. The efficiency test shows that the proposed method can run in real time: processing one epoch of measurement data takes no more than 80 ms, and less than 10 ms in the best case. The novel method can serve as a more robust and accurate ambiguity-free tracking approach for outdoor applications.
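The dead-reckoning backbone of such a tracker can be sketched as follows. This is a toy illustration, not the paper's implementation: per-epoch baseline-change vectors (in Regtrack these come from epoch-differenced GNSS measurements) are summed to track the relative baseline, and the running estimate is reset after repeated validation failures (in the paper this reset comes from the parallel EKF):

```python
def track_baseline(deltas, valid, reinit_value=(0.0, 0.0, 0.0), max_failures=3):
    """Accumulate per-epoch 3-D displacement vectors; reset after repeated failures."""
    baseline = list(reinit_value)
    failures = 0
    for d, ok in zip(deltas, valid):
        if ok:
            failures = 0
            baseline = [b + dd for b, dd in zip(baseline, d)]
        else:
            failures += 1
            if failures >= max_failures:
                baseline = list(reinit_value)   # re-initialisation step
                failures = 0
    return baseline

# Two clean epochs of motion accumulate into the relative baseline:
b = track_baseline([(1.0, 0.0, 0.0), (0.5, 0.5, 0.0)], [True, True])
```

The key property, as in the abstract, is that no integer ambiguity ever has to be fixed; only the per-epoch changes are needed.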


Author(s):  
Kyungkoo Jun

Background & Objective: This paper proposes a Fourier-transform-inspired method to classify human activities from time series sensor data. Methods: Our method begins by decomposing the 1D input signal into 2D patterns, motivated by the Fourier conversion. The decomposition is aided by a Long Short-Term Memory (LSTM) network, which captures the temporal dependency of the signal and produces encoded sequences. The sequences, once arranged into a 2D array, can represent the fingerprints of the signals. The benefit of this transformation is that we can exploit recent advances in deep learning models for image classification, such as the Convolutional Neural Network (CNN). Results: The proposed model is therefore a combination of LSTM and CNN. We evaluate the model on two data sets. On the first data set, which is more standardized than the other, our model outperforms, or at least equals, previous works. For the second data set, we devise schemes to generate training and testing data by varying the window size, the sliding size, and the labeling scheme. Conclusion: The evaluation results show that the accuracy is over 95% in some cases. We also analyze the effect of these parameters on the performance.
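The central 1D-to-2D rearrangement can be illustrated in isolation. In the paper the rows are LSTM-encoded sequences; in this sketch the "encoding" is skipped and the raw samples are folded directly, just to show how a windowed 1D trace becomes a 2D array that a CNN-style image classifier can consume:

```python
def to_2d(signal, width):
    """Fold a 1D sequence into rows of `width` samples (truncating leftovers)."""
    rows = len(signal) // width
    return [signal[r * width:(r + 1) * width] for r in range(rows)]

# An 8-sample window becomes a 2x4 "image":
img = to_2d([0, 1, 2, 3, 4, 5, 6, 7], width=4)   # -> [[0, 1, 2, 3], [4, 5, 6, 7]]
```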


2021 ◽  
pp. 158-166
Author(s):  
Noah Balestra ◽  
Gaurav Sharma ◽  
Linda M. Riek ◽  
Ania Busza

Background: Prior studies suggest that participation in rehabilitation exercises improves motor function poststroke; however, studies on optimal exercise dose and timing have been limited by the technical challenge of quantifying exercise activities over multiple days. Objectives: The objectives of this study were to assess the feasibility of using body-worn sensors to track rehabilitation exercises in the inpatient setting and to investigate which recording parameters and data analysis strategies are sufficient for accurately identifying and counting exercise repetitions. Methods: MC10 BioStampRC® sensors were used to measure accelerometer and gyroscope data from the upper extremities of healthy controls (n = 13) and individuals with upper extremity weakness due to recent stroke (n = 13) while the subjects performed 3 preselected arm exercises. Sensor data were then labeled by exercise type, and this labeled data set was used to train a machine learning classification algorithm for identifying exercise type. The machine learning algorithm and a peak-finding algorithm were used to count exercise repetitions in non-labeled data sets. Results: We achieved a repetition counting accuracy of 95.6% overall, and 95.0% in patients with upper extremity weakness due to stroke, when using both accelerometer and gyroscope data. Accuracy decreased when using fewer sensors or accelerometer data alone. Conclusions: Our exploratory study suggests that body-worn sensor systems are technically feasible, well tolerated in subjects with recent stroke, and may ultimately be useful for developing a system to measure total exercise “dose” in poststroke patients during clinical rehabilitation or clinical trials.
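Repetition counting by peak finding can be sketched with a simple threshold-based local-maximum counter. The paper's actual peak-finding algorithm is not specified in the abstract, so the rule and the trace below are purely illustrative:

```python
def count_repetitions(signal, threshold=0.5):
    """Count local maxima above `threshold` in a 1D sensor trace."""
    count = 0
    for i in range(1, len(signal) - 1):
        # a peak: sample exceeds the threshold and both neighbours
        if signal[i] > threshold and signal[i - 1] < signal[i] >= signal[i + 1]:
            count += 1
    return count

# Three exercise repetitions show up as three above-threshold peaks:
trace = [0.0, 0.9, 0.1, 0.8, 0.0, 1.1, 0.2]
reps = count_repetitions(trace)
```

In practice such a counter would run on the magnitude of filtered accelerometer or gyroscope data, with the threshold tuned per exercise type.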


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2532
Author(s):  
Encarna Quesada ◽  
Juan J. Cuadrado-Gallego ◽  
Miguel Ángel Patricio ◽  
Luis Usero

Anomaly detection research focuses on the development and application of methods that identify data that differ enough from the rest of the data set being analyzed to be considered anomalies (or, as they are more commonly called, outliers). These values mainly originate from two sources: they may be errors introduced during the collection or handling of the data, or they may be correct but very different from the rest of the values. It is essential to identify each type correctly: in the first case the values must be removed from the data set, while in the second case they must be carefully analyzed and taken into account. The correct selection and use of the model applied to a specific problem is fundamental for the success of an anomaly detection study, and in many cases a single model cannot provide sufficient results, which can only be reached with a mixture model that integrates existing and/or ad hoc-developed models. This is the kind of model that is developed and applied to the problem presented in this paper. The study defines and applies an anomaly detection model that combines statistical models with a new method defined by the authors, the Local Transilience Outlier Identification Method, in order to improve the identification of outliers in sensor-obtained values of variables that affect the operation of wind tunnels. The correct detection of outliers for the variables involved in wind tunnel operations is very important for the industrial ventilation systems industry, especially for vertical wind tunnels, which are used as training facilities for indoor skydiving, where incorrect performance of such devices may put human lives at risk. In consequence, the use of the presented model for outlier detection may have a high impact in this industrial sector.
As a proof of concept, data from a real installation are used to test the proposed anomaly analysis method and its application to monitoring the correct performance of wind tunnels.
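The Local Transilience method itself is the authors' contribution and is not reproduced here; the sketch below shows only a classic statistical baseline of the kind such mixture models build on: the z-score rule, which flags a value when it lies more than k standard deviations from the mean. The readings and threshold are illustrative:

```python
import statistics

def zscore_outliers(values, k=3.0):
    """Indices of values more than k standard deviations from the mean."""
    mu = statistics.fmean(values)
    sigma = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) > k * sigma]

# One sensor glitch among otherwise stable readings:
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 25.0]
outliers = zscore_outliers(readings, k=2.0)
```

A global rule like this is exactly what local methods aim to improve on: it misses outliers that are anomalous only with respect to their neighbourhood in the data.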


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 11
Author(s):  
Domonkos Haffner ◽  
Ferenc Izsák

The localization of multiple scattering objects is performed using scattered waves. As an up-to-date approach, neural networks are used to estimate the corresponding locations. In the scattering phenomenon under investigation, we assume known incident plane waves, fully reflecting balls with known diameters, and measurement data of the scattered wave on one fixed segment. The training data are constructed using the simulation package μ-diff in Matlab. The structure of the neural networks, which are widely used for similar purposes, is further developed: a complex locally connected layer is the main component of the proposed setup. With this, and with appropriate preprocessing of the training data set, the number of parameters can be kept at a relatively low level. As a result, using a relatively large training data set, the unknown locations of the objects can be estimated effectively.


AI ◽  
2021 ◽  
Vol 2 (1) ◽  
pp. 48-70
Author(s):  
Wei Ming Tan ◽  
T. Hui Teo

Prognostic techniques attempt to predict the Remaining Useful Life (RUL) of a subsystem or a component. Such techniques often use sensor data that are periodically measured and recorded into a time series data set. These multivariate data sets form complex and non-linear inter-dependencies across the recorded time steps and between sensors. Many existing prognostic algorithms have started to explore Deep Neural Networks (DNNs) and their effectiveness in the field. Although Deep Learning (DL) techniques outperform traditional prognostic algorithms, the networks are generally complex to deploy or train. This paper proposes a Multi-variable Time Series (MTS) focused approach to prognostics that implements a lightweight Convolutional Neural Network (CNN) with an attention mechanism. The convolution filters extract abstract temporal patterns from the multiple time series, while the attention mechanism reviews the information across the time axis and selects the relevant information. The results suggest that the proposed method not only produces superior accuracy of RUL estimation but also trains many times faster than reported works. The advantage of deploying the network on a lightweight hardware platform is also demonstrated: it is not only more compact but also more efficient in resource-restricted environments.
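Attention over the time axis, as described above, reduces to a softmax-weighted sum of per-time-step feature vectors. A framework-free sketch (the scores would normally be learned; here they are hand-set to show the selection effect):

```python
import math

def attention_pool(features, scores):
    """Softmax-weighted sum of per-time-step feature vectors."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]        # numerically stable softmax
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(features[0])
    return [sum(w * f[d] for w, f in zip(weights, features)) for d in range(dim)]

# Three time steps with 2-D features; the last step's high score means it
# dominates the pooled representation:
feats = [[1.0, 0.0], [0.0, 1.0], [4.0, 4.0]]
pooled = attention_pool(feats, scores=[0.0, 0.0, 10.0])
```

In a CNN-with-attention model, `features` would be the convolutional feature maps at each time step and `scores` the output of a small learned scoring layer.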


2021 ◽  
pp. 1-11
Author(s):  
Yanan Huang ◽  
Yuji Miao ◽  
Zhenjing Da

The methods of multi-modal English event detection under a single data source, and of isomorphic event detection across different English data sources based on transfer learning, still need to be improved. In order to improve the efficiency of English event detection across data sources, this paper proposes, based on a transfer learning algorithm, multi-modal event detection under a single data source and isomorphic event detection across different data sources. By stacking multiple classification models, the paper merges the features with each other, and conducts adversarial training through the difference between two classifiers to further make the distributions of data from different sources similar. In addition, in order to verify the proposed algorithm, a multi-source English event detection data set is collected. Finally, this data set is used to validate the proposed method and compare it with the current mainstream transfer learning methods. Experimental analysis, convergence analysis, visual analysis, and parameter evaluation demonstrate the effectiveness of the proposed algorithm.


2005 ◽  
Vol 65 (1) ◽  
pp. 129-139 ◽  
Author(s):  
M. A. H Penna ◽  
M. A Villacorta-Corrêa ◽  
T. Walter ◽  
M. Petrere-JR

In order to decide which is the best growth model for the tambaqui Colossoma macropomum Cuvier, 1818, we utilized 249 and 256 length-at-age ring readings in otoliths and scales, respectively, for the same sample of individuals. The Schnute model was utilized, and it is concluded that the von Bertalanffy model is the most adequate for these data, because it proved highly stable for the data set and only slightly sensitive to the initial values of the estimated parameters. The phi' values estimated from five different data sources presented a CV = 4.78%. The numerical discrepancies between these values are of little concern due to the high negative correlation between k and L∞: when one of them increases, the other decreases, and the final result in phi' remains nearly unchanged.
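The von Bertalanffy growth curve and the growth-performance index phi' mentioned above can be written out directly. The parameter values below are illustrative, not the paper's estimates for Colossoma macropomum:

```python
import math

def von_bertalanffy(t, l_inf, k, t0=0.0):
    """Length at age t: L(t) = L_inf * (1 - exp(-k * (t - t0)))."""
    return l_inf * (1.0 - math.exp(-k * (t - t0)))

def phi_prime(k, l_inf):
    """Growth performance index: phi' = log10(k) + 2 * log10(L_inf)."""
    return math.log10(k) + 2.0 * math.log10(l_inf)

# The k / L_inf trade-off behind the abstract's remark: doubling k while
# shrinking L_inf by sqrt(2) leaves phi' unchanged.
p1 = phi_prime(0.2, 100.0)
p2 = phi_prime(0.4, 100.0 / math.sqrt(2.0))
```

This compensation is why fits with quite different (k, L∞) pairs can still agree closely in phi', as reflected in the reported CV of 4.78%.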

