Autonomous navigation in cluttered environments

2016 ◽  
pp. 57-84
Author(s):  
Leobardo Campos‐Macías ◽  
Rodrigo Aldana‐López ◽  
Rafael Guardia ◽  
José I. Parra‐Vilchis ◽  
David Gómez‐Gutiérrez

Author(s):  
Mahdi Haghshenas-Jaryani ◽  
Hakki Erhan Sevil ◽  
Liang Sun

Abstract This paper presents the concept of teaming up snake robots, as unmanned ground vehicles (UGVs), with unmanned aerial vehicles (UAVs) for autonomous navigation and obstacle avoidance. The snake robots navigate in cluttered environments based on visual servoing by a co-robot UAV. It is assumed that the snake robots have no means to map the surrounding environment, detect obstacles, or self-localize; these tasks are allocated to the UAV, which uses visual sensors to track the UGVs. The obtained images are used for geo-localization and mapping of the environment. Computer vision methods are utilized to detect obstacles, find obstacle clusters, and then build a map via Probabilistic Threat Exposure Map (PTEM) construction. A path planner module determines the heading direction and velocity of the snake robot, and a combined heading-velocity controller makes the snake robot follow the desired trajectories using the lateral undulatory gait. A series of simulations was carried out to analyze the snake robot's maneuverability and, as a proof of concept, to navigate the snake robot through an environment with two obstacles based on UAV visual servoing. The results show the feasibility of the concept and the effectiveness of the integrated system for navigation.
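The abstract does not give the controller equations, but the lateral undulatory gait it mentions is commonly generated with a serpenoid joint-angle pattern, in which a heading controller biases a constant offset term and a velocity controller scales the temporal frequency. A minimal sketch under that standard formulation (the parameter names `alpha`, `omega`, `beta`, and `gamma` are generic, not taken from the paper):

```python
import math

def serpenoid_joint_angles(n_joints, t, alpha=0.6, omega=2.0, beta=0.5, gamma=0.0):
    """Joint angles for a lateral undulatory (serpenoid) gait at time t.

    alpha: undulation amplitude
    omega: temporal frequency (a velocity controller would adjust this)
    beta:  phase lag between adjacent joints
    gamma: constant offset biasing the body curvature
           (a heading controller would adjust this)
    """
    return [alpha * math.sin(omega * t + i * beta) + gamma
            for i in range(n_joints)]
```

With `gamma = 0` the robot undulates along a straight line; a nonzero `gamma` curves the overall motion, which is how a combined heading-velocity controller of this kind can steer the robot along a planned trajectory.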


Drones ◽  
2021 ◽  
Vol 5 (4) ◽  
pp. 107
Author(s):  
Xishuang Zhao ◽  
Jingzheng Chong ◽  
Xiaohan Qi ◽  
Zhihua Yang

Autonomous navigation of micro aerial vehicles in unknown environments requires not only exploring their time-varying surroundings but also ensuring the complete safety of flights at all times. Current research focuses on estimating the potential exploration value while neglecting safety issues, especially in cluttered environments with no prior knowledge. To address this issue, we propose a vision object-oriented autonomous navigation method for environment exploration, which develops a B-spline-based local trajectory re-planning algorithm by extracting spatial-structure information and selecting temporary target points. The proposed method is evaluated in a variety of cluttered environments, such as forests, building areas, and mines. The experimental results show that the proposed autonomous navigation system can effectively complete the global trajectory while always maintaining an appropriate safe distance from the multiple obstacles in the environment.
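The local support of B-splines is what makes them convenient for local trajectory re-planning: moving one control point deforms only the nearby segments, so a detour around a newly detected obstacle does not disturb the rest of the global trajectory. A minimal sketch of evaluating one segment of a uniform cubic B-spline (a generic illustration, not the authors' algorithm, whose details are not given in the abstract):

```python
def cubic_bspline_point(p0, p1, p2, p3, u):
    """Evaluate one uniform cubic B-spline segment at parameter u in [0, 1].

    p0..p3 are the four control points (tuples of coordinates) that
    influence this segment. Shifting any one of them to dodge an
    obstacle only affects the segments that use it (local support).
    """
    b0 = (1 - u) ** 3 / 6.0
    b1 = (3 * u**3 - 6 * u**2 + 4) / 6.0
    b2 = (-3 * u**3 + 3 * u**2 + 3 * u + 1) / 6.0
    b3 = u**3 / 6.0
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))
```

At `u = 0` the point is `(p0 + 4*p1 + p2) / 6`, i.e. the curve does not pass through its control points but stays inside their convex hull, which is also useful for bounding the trajectory away from obstacles.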


2017 ◽  
Vol 2017 ◽  
pp. 1-14 ◽  
Author(s):  
Rodrigo Munguía ◽  
Carlos López-Franco ◽  
Emmanuel Nuño ◽  
Adriana López-Franco

This work presents a method for implementing a visual simultaneous localization and mapping (SLAM) system using omnidirectional vision data, with application to autonomous mobile robots. In SLAM, a mobile robot operates in an unknown environment, using only on-board sensors to simultaneously build a map of its surroundings and track its position within it. SLAM is perhaps one of the most fundamental problems to solve in robotics in order to build truly autonomous mobile robots. The visual sensor used in this work is an omnidirectional camera, which provides a wide field of view, an advantage for a mobile robot performing an autonomous navigation task. Since the sensor is monocular, a method to recover the depth of the features is required; to estimate the unknown depth, we propose a novel stochastic triangulation technique. The proposed system can be applied to indoor or cluttered environments to perform visual-based navigation when a GPS signal is not available. Experiments with synthetic and real data are presented to validate the proposal.
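The abstract does not describe the stochastic triangulation technique itself. As a generic illustration of the underlying problem, recovering feature depth from bearing-only observations taken at two poses, here is a standard least-squares (midpoint) two-ray triangulation; this is a hypothetical sketch of the classical approach, not the authors' method:

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Least-squares intersection of two bearing rays (midpoint method).

    c1, c2: camera centres (arrays); d1, d2: unit bearing directions
    to the same feature. Solves for depths t1, t2 minimising
    ||(c1 + t1*d1) - (c2 + t2*d2)|| and returns the midpoint of the
    two closest points, an estimate of the feature position.
    """
    A = np.column_stack([d1, -d2])
    t = np.linalg.lstsq(A, c2 - c1, rcond=None)[0]
    p1 = c1 + t[0] * d1
    p2 = c2 + t[1] * d2
    return (p1 + p2) / 2.0
```

A monocular SLAM system cannot observe depth from a single frame, so it must combine bearings from at least two poses like this; a stochastic formulation would additionally propagate the uncertainty of the poses and bearings into the depth estimate.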

