Asteroids Dynamic Estimation in a Solely Vision-Based Navigation System

Author(s):  
Neus Monge ◽  
Isabel Gortazar ◽  
Emmanuel Zenou
ACTA IMEKO ◽  
2018 ◽  
Vol 7 (2) ◽  
pp. 102 ◽  
Author(s):  
Silvio Del Pizzo ◽  
Umberto Papa ◽  
Salvatore Gaglione ◽  
Salvatore Troisi ◽  
Giuseppe Del Core

An autonomous vision-based landing system was designed, and its performance was analysed and measured on a UAS. The system relies on a single camera to determine the vehicle's position and attitude with respect to a well-defined landing pattern. The developed procedure is based on the photogrammetric Space Resection Solution, which provides position and camera attitude reckoning starting from at least three non-aligned reference control points whose image coordinates can be measured in the camera image frame. Five circular coloured targets were placed on a specific landing pattern, and their 2D image-frame coordinates were extracted through a dedicated algorithm. The aim of this work is to compute a precise UAS position and attitude from a single image, in order to achieve a good approach to the landing field. This procedure can be used in addition to, or as a replacement for, GPS tracking and can be applied when the landing field is movable or located on a moving platform: the UAS will follow the landing pattern until the landing phase is complete.
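As a hedged illustration (not the authors' implementation), the first step described above, extracting the 2D image coordinates of a coloured circular target, can be sketched as a simple colour-threshold-and-centroid operation; the colour bounds, array layout, and synthetic image here are all assumptions:

```python
import numpy as np

def target_centroid(image, lower, upper):
    """Return the (row, col) centroid of pixels whose RGB values fall
    inside [lower, upper], or None if no pixel matches.
    image: H x W x 3 uint8 array; lower/upper: length-3 sequences."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    mask = np.all((image >= lower) & (image <= upper), axis=-1)
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None
    return rows.mean(), cols.mean()

# Synthetic 100x100 frame with a red disc centred at (40, 60)
img = np.zeros((100, 100, 3), dtype=np.uint8)
yy, xx = np.mgrid[0:100, 0:100]
img[(yy - 40) ** 2 + (xx - 60) ** 2 <= 10 ** 2] = (200, 20, 20)

centre = target_centroid(img, lower=(150, 0, 0), upper=(255, 60, 60))
```

With at least three such centroids and the known geometry of the landing pattern, the camera pose would then follow from a space resection (PnP-type) solver, which this sketch does not include.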


2006 ◽  
Author(s):  
Hanene Chettaoui ◽  
Guillaume Thomann ◽  
Chokri Ben amar ◽  
Tanneguy Redarce

In this paper, we propose a vision-based navigation system to guide an endoscope inside the human colon. The target to pursue is a dark spot in colonoscopic images, called the "pattern". A novel methodology for "pattern" extraction and tracking was designed; observations by surgeons led to the basic idea of this technique. Information about the target position is then continuous, making it possible to predict the "pattern" position. A set of endoscopic images was tested to demonstrate the effectiveness of the vision technique. An experimental tool simulating endoscope navigation was employed to achieve real-time performance. An interpretation of the results and possible improvements is presented.
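One plausible reading of the extraction-and-prediction pipeline above can be sketched as follows; the darkest-percentile threshold and the constant-velocity predictor are assumptions for illustration, not the paper's actual method:

```python
import numpy as np

def dark_pattern_centroid(gray, percentile=1):
    """Centroid (row, col) of the darkest pixels of a grayscale image,
    taken as the region at or below the given intensity percentile."""
    thresh = np.percentile(gray, percentile)
    rows, cols = np.nonzero(gray <= thresh)
    return rows.mean(), cols.mean()

def predict_next(prev, curr):
    """Constant-velocity extrapolation of the next pattern position
    from the two most recent centroids."""
    return (2 * curr[0] - prev[0], 2 * curr[1] - prev[1])

# Synthetic frame: bright field with a dark spot around (30, 70)
gray = np.full((100, 100), 200, dtype=np.uint8)
gray[25:36, 65:76] = 10

c = dark_pattern_centroid(gray)
pred = predict_next((28.0, 72.0), c)
```

In a real tracker the prediction would seed the search window for the next frame, keeping per-frame processing cheap enough for real-time use.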


Sensors ◽  
2020 ◽  
Vol 20 (18) ◽  
pp. 5409
Author(s):  
Gonzalo Farias ◽  
Ernesto Fabregas ◽  
Enrique Torres ◽  
Gaëtan Bricas ◽  
Sebastián Dormido-Canto ◽  
...  

This work presents the development and implementation of a distributed navigation system based on object recognition algorithms. The main goal is to introduce advanced image-processing algorithms and artificial intelligence techniques for teaching mobile robot control. The autonomous system consists of a wheeled mobile robot with an integrated colour camera. The robot navigates through a laboratory scenario where the track and several traffic signals must be detected and recognized using the images acquired with its on-board camera. The images are sent to a computer server that runs a computer vision algorithm to recognize the objects. The computer calculates the corresponding robot speeds according to the object detected. The speeds are sent back to the robot, which carries out the corresponding manoeuvre. Three different algorithms have been tested in simulation and in a practical mobile robot laboratory. The results show an average 84% success rate for object recognition in experiments with the real mobile robot platform.
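The server-side step that maps a recognized object to wheel-speed commands could be as simple as a lookup table; the labels and speed values below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical mapping from a recognized traffic signal to wheel-speed
# commands (left, right), e.g. in rad/s. Labels and values are
# illustrative only.
SPEED_TABLE = {
    "stop":       (0.0, 0.0),
    "turn_left":  (0.5, 1.5),
    "turn_right": (1.5, 0.5),
    "straight":   (1.0, 1.0),
}

def command_for(detected_object, default=(0.5, 0.5)):
    """Return the wheel-speed command for a detected object;
    unrecognized objects fall back to a slow default speed."""
    return SPEED_TABLE.get(detected_object, default)

cmd = command_for("turn_left")
```

In the distributed setup described above, this function would run on the server after recognition, and the resulting pair would be sent back over the network to the robot's motor controller.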


2012 ◽  
Vol 19 (2) ◽  
pp. 71-98 ◽  
Author(s):  
Roberto Sabatini ◽  
Celia Bartel ◽  
Anish Kaharkar ◽  
Tesheen Shaid ◽  
Leopoldo Rodriguez ◽  
...  

Abstract In this paper we present a new low-cost navigation system designed for small size Unmanned Aerial Vehicles (UAVs) based on Vision-Based Navigation (VBN) and other avionics sensors. The main objective of our research was to design a compact, light and relatively inexpensive system capable of providing the Required Navigation Performance (RNP) in all phases of flight of a small UAV, with a special focus on precision approach and landing, where Vision Based Navigation (VBN) techniques can be fully exploited in a multisensor integrated architecture. Various existing techniques for VBN were compared and the Appearance-Based Approach (ABA) was selected for implementation. Feature extraction and optical flow techniques were employed to estimate flight parameters such as roll angle, pitch angle, deviation from the runway and body rates. Additionally, we addressed the possible synergies between VBN, Global Navigation Satellite System (GNSS) and MEMS-IMU (Micro-Electromechanical System Inertial Measurement Unit) sensors, as well as the aiding from Aircraft Dynamics Models (ADMs). In particular, by employing these sensors/models, we aimed to compensate for the shortcomings of VBN and MEMS-IMU sensors in high-dynamics attitude determination tasks. An Extended Kalman Filter (EKF) was developed to fuse the information provided by the different sensors and to provide estimates of position, velocity and attitude of the UAV platform in real-time. Two different integrated navigation system architectures were implemented. The first used VBN at 20 Hz and GPS at 1 Hz to augment the MEMS-IMU running at 100 Hz. The second mode also included the ADM (computations performed at 100 Hz) to provide augmentation of the attitude channel. 
Simulation of these two modes was accomplished over a significant portion of the AEROSONDE UAV operational flight envelope, performing a variety of representative manoeuvres (i.e., straight climb, level turning, turning descent and climb, straight descent, etc.). Simulation of the first integrated navigation system architecture (VBN/IMU/GPS) showed that the integrated system can reach position, velocity and attitude accuracies compatible with CAT-II precision approach requirements. Simulation of the second system architecture (VBN/IMU/GPS/ADM) also showed promising results, since the achieved attitude accuracy was higher using VBN/IMU/ADM than using VBN/IMU only. However, due to rapid divergence of the ADM virtual sensor, frequent re-initialisation of the ADM data module was required, at intervals strongly dependent on the UAV flight dynamics and the specific manoeuvring transitions performed.
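The fusion principle described above, a high-rate inertial prediction corrected by lower-rate vision-based measurements, can be illustrated with a minimal one-dimensional linear Kalman filter; the paper's EKF covers the full position/velocity/attitude state, and all rates and noise levels below are assumptions:

```python
import numpy as np

# Minimal 1-D Kalman filter fusing a 100 Hz inertial prediction with a
# lower-rate vision-based angle measurement. State: [angle, rate].
dt = 0.01                       # 100 Hz prediction step
F = np.array([[1.0, dt], [0.0, 1.0]])
Q = np.diag([1e-6, 1e-4])       # process noise (assumed)
H = np.array([[1.0, 0.0]])      # VBN observes the angle only
R = np.array([[1e-2]])          # VBN measurement noise (assumed)

x = np.array([0.0, 0.0])        # initial state
P = np.eye(2)                   # initial covariance

def predict(x, P):
    """Propagate state and covariance one IMU step."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Correct the state with a VBN measurement z."""
    y = z - H @ x                     # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

# Five 100 Hz prediction steps, then one VBN update (e.g. 20 Hz)
for _ in range(5):
    x, P = predict(x, P)
x, P = update(x, P, np.array([0.1]))
```

After the update, the angle estimate moves most of the way toward the vision measurement and its variance drops well below the prior, which is the behaviour the integrated VBN/IMU architectures above exploit.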


2011 ◽  
Vol 301-303 ◽  
pp. 201-207
Author(s):  
Sheng Bei Wang ◽  
Jian Ming Wang ◽  
Xi Wang ◽  
Ling Ma ◽  
Ru Zhen Dou ◽  
...  

Vision for navigation has been an active area of research for more than three decades, and a vision-based navigation system always needs real-time image collection and processing to acquire navigation information. In indoor scenarios, illuminant reflection is often found in navigation images because of smooth surfaces in the environment, such as marble floors and the flat surfaces of furniture. The negative effect of illuminant reflection on navigation images is obvious and can degrade the performance of the navigation system, so the question of how to detect illuminant reflection must be addressed. This paper proposes an automatic detection algorithm that segments illuminant reflection regions in a colour image using saturation and brightness characteristics, as well as the brightness distribution of the illuminant reflective regions. To verify the robustness and accuracy of this algorithm, experiments were carried out in different indoor environments where illuminant reflection appears in navigation images. The experimental results indicate that the algorithm handles the problem well, providing good detection results as expected.
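The core saturation-and-brightness cue described above can be sketched as flagging pixels that are both very bright (high HSV value) and nearly colourless (low saturation); the thresholds and synthetic scene below are illustrative assumptions, not the paper's values:

```python
import numpy as np

def reflection_mask(rgb, v_min=0.9, s_max=0.15):
    """Flag pixels likely belonging to illuminant (specular) reflections:
    high HSV value and low saturation. rgb: H x W x 3 uint8 array."""
    rgb = rgb.astype(np.float64) / 255.0
    v = rgb.max(axis=-1)                                   # HSV value
    s = np.where(v > 0,
                 1.0 - rgb.min(axis=-1) / np.maximum(v, 1e-12),
                 0.0)                                      # HSV saturation
    return (v >= v_min) & (s <= s_max)

# Synthetic scene: mid-grey floor with a near-white specular patch
img = np.full((50, 50, 3), 120, dtype=np.uint8)
img[10:20, 10:20] = 250

mask = reflection_mask(img)
```

The paper additionally exploits the brightness distribution inside candidate regions; a practical version would post-process this mask (e.g. by region growing or connected-component filtering) rather than thresholding alone.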

