A monocular vision-based attitude estimation approach for small Unmanned Aerial Vehicles and its experimental verification

Author(s):  
Yunyan Wu ◽  
Guangwen Li ◽  
Weinan Li ◽  
Xianglun Zhang ◽  
Jun Che ◽  
...  


2018 ◽  
Vol 10 (4) ◽  
pp. 352-361 ◽  
Author(s):  
Adrian Carrio ◽  
Hriday Bavle ◽  
Pascual Campoy

The lack of redundant attitude sensors is a considerable yet common vulnerability in many low-cost unmanned aerial vehicles. Beyond relying on attitude sensors, human pilots are trained to exploit the horizon as a visual reference for attitude control. For this reason, and given the desirable properties of image sensors, considerable research has proposed vision-based horizon detection for redundant attitude estimation onboard unmanned aerial vehicles. However, atmospheric and illumination conditions may hinder the operability of visible-light image sensors, or even make their use impractical, for example at night. Thermal infrared image sensors operate under a much wider range of conditions, and their price has decreased substantially in recent years, making them an alternative to visible-spectrum sensors in certain operation scenarios. In this paper, two attitude estimation methods are proposed. The first is a novel approach that estimates the line best fitting the horizon in a thermal image; the resulting line is then used to estimate the pitch and roll angles via an infinite horizon line model. The second uses deep learning to predict attitude angles directly from the raw pixel intensities of a thermal image. For this, a novel convolutional neural network architecture has been trained using measurements from an inertial navigation system. Both methods are shown to be valid for redundant attitude estimation, providing RMS errors below 1.7° and running at up to 48 Hz, depending on the chosen method, the input image resolution and the available computational capabilities.
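An infinite horizon line model of the kind mentioned in this abstract maps the detected horizon line to pitch and roll through the pinhole camera geometry: the line's orientation gives roll, and its perpendicular offset from the principal point, divided by the focal length, gives pitch. A minimal sketch, assuming a Hough-style normal parameterisation (rho, theta) of the line; the function name, parameter names and sign conventions are illustrative, not the paper's notation:

```python
import math

def attitude_from_horizon(rho, theta, cx, cy, f):
    """Estimate roll and pitch (degrees) from a horizon line given in
    Hough normal form: x*cos(theta) + y*sin(theta) = rho.

    cx, cy : principal point (image centre), in pixels
    f      : focal length, in pixels
    Assumes zero roll/pitch corresponds to a horizontal line through
    the principal point; signs depend on the chosen camera convention.
    """
    # Roll: rotation of the horizon line relative to the image x-axis.
    roll = theta - math.pi / 2.0
    # Signed perpendicular distance from the principal point to the line.
    d = rho - (cx * math.cos(theta) + cy * math.sin(theta))
    # Pitch: under the infinite-horizon assumption, the line's offset
    # maps to pitch through the pinhole model.
    pitch = math.atan2(d, f)
    return math.degrees(roll), math.degrees(pitch)
```

For a level horizon passing through the principal point (theta = pi/2, rho = cy), both angles come out as zero, as expected.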


2021 ◽  
Vol 17 (4) ◽  
pp. 155014772110098
Author(s):  
Xiaoqin Liu ◽  
Xiang Li ◽  
Qi Shi ◽  
Chuanpei Xu ◽  
Yanmei Tang

Three-dimensional attitude estimation for unmanned aerial vehicles is usually based on the combination of a magnetometer, accelerometer, and gyroscope (MARG). However, MARG sensors are easily affected by various disturbances, for example vibration, external magnetic interference, and gyro drift. An optical flow sensor can extract motion information from an image sequence and therefore has the potential to augment three-dimensional attitude estimation for unmanned aerial vehicles. The major problem is that optical flow is caused by both translational and rotational movements, which are difficult to distinguish from each other. To address these problems, this article uses a gated recurrent unit neural network to fuse data from MARG and optical flow sensors, enhancing the accuracy of three-dimensional attitude estimation for unmanned aerial vehicles. The proposed algorithm effectively exploits the attitude information contained in the optical flow measurements and achieves multi-sensor fusion for attitude estimation without an explicit mathematical model. Compared with the commonly used extended Kalman filter for attitude estimation, the proposed algorithm shows higher accuracy in flight tests of a quad-rotor unmanned aerial vehicle.
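The core of such a learned fusion scheme is the gated recurrent unit itself, whose hidden state can carry attitude information across time steps while gating how much each new MARG-plus-optical-flow measurement updates it. A minimal NumPy sketch of one GRU step and a linear readout; the dimensions, weight initialisation and output head are illustrative assumptions, not the architecture reported in the article:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h, W, U):
    """One GRU update. x: concatenated MARG + optical-flow measurement
    vector at time t; h: hidden state; W, U: dicts of gate weights."""
    z = sigmoid(W["z"] @ x + U["z"] @ h)            # update gate
    r = sigmoid(W["r"] @ x + U["r"] @ h)            # reset gate
    h_cand = np.tanh(W["h"] @ x + U["h"] @ (r * h)) # candidate state
    return (1.0 - z) * h + z * h_cand

rng = np.random.default_rng(0)
n_in, n_hid = 11, 16      # e.g. 9 MARG channels + 2 optical-flow components
W = {k: 0.1 * rng.standard_normal((n_hid, n_in)) for k in "zrh"}
U = {k: 0.1 * rng.standard_normal((n_hid, n_hid)) for k in "zrh"}
V = 0.1 * rng.standard_normal((3, n_hid))           # readout: roll/pitch/yaw

h = np.zeros(n_hid)
for t in range(50):                                 # a synthetic sequence
    x = rng.standard_normal(n_in)
    h = gru_step(x, h, W, U)
attitude = V @ h                                    # estimated Euler angles
```

In practice the weights would be trained end-to-end against ground-truth attitude, which is what lets the network implicitly separate the rotational and translational components of the optical flow.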


Author(s):  
Mohammed Boulekchour ◽  
Nabil Aouf ◽  
Mark Richardson

In this paper, a system for real-time cooperative monocular visual motion estimation with multiple unmanned aerial vehicles is proposed. Distributing the system across a network of vehicles allows for efficient processing in terms of both computational time and estimation accuracy. The resulting global cooperative motion estimation employs state-of-the-art approaches for optimisation, individual motion estimation and registration. Three-view geometry algorithms are developed within a convex optimisation framework on board the monocular vision system of each vehicle. In the presented novel distributed cooperative strategy, a visual loop-closure module is deployed to detect simultaneously overlapping fields of view between two or more of the vehicles. A positive detection from this module triggers the collaborative motion estimation algorithm between the vehicles involved in the loop closure. This scenario creates a flexible stereo set-up which jointly optimises the motion estimates of all vehicles in the cooperative scheme. Prior to that, vehicle-to-vehicle relative pose estimates are recovered with a novel robust registration solution in a global optimisation framework. Furthermore, as a complementary solution, a robust non-linear H∞ filter is designed to fuse measurements from the vehicles' on-board inertial sensors with the visual estimates. The proposed cooperative navigation solution has been validated on real-world data, using two unmanned aerial vehicles equipped with monocular vision systems.
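The vehicle-to-vehicle relative pose recovery at the heart of such a scheme can be illustrated in its simplest non-robust form: given the same landmarks expressed in two vehicles' frames, a least-squares rigid registration (the Kabsch/Procrustes solution) recovers the relative rotation and translation. This is only a simplified stand-in for the paper's robust global-optimisation registration; the function name and conventions are illustrative:

```python
import numpy as np

def register_frames(P, Q):
    """Least-squares rigid registration: find R, t with Q ≈ R @ P + t.

    P, Q : 3 x N arrays of the same N landmarks expressed in vehicle A's
    and vehicle B's frames respectively. Classic Kabsch solution via SVD
    of the cross-covariance; no outlier handling, unlike a robust scheme.
    """
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T                   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

With noise-free correspondences this recovers the relative pose exactly; with outliers, a robust cost or a global optimisation framework, as in the paper, is needed.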


2013 ◽  
Vol 23 (3) ◽  
pp. 701-711 ◽  
Author(s):  
Kefei Liu ◽  
João Paulo C.L. da Costa ◽  
Hing Cheung So ◽  
Florian Römer ◽  
Martin Haardt ◽  
...  
