Small unmanned aerial vehicle visual system for ground moving target positioning

Author(s):  
Zhang Zhifei ◽  
Xu Weijie ◽  
Fang Zhou ◽  
Li Ping ◽  
Han Bo
IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 106931-106940 ◽  
Author(s):  
Yueqi Hou ◽  
Xiaolong Liang ◽  
Lyulong He ◽  
Jiaqiang Zhang

Electronics ◽  
2019 ◽  
Vol 8 (2) ◽  
pp. 184 ◽  
Author(s):  
Saul Armendariz ◽  
Victor Becerra ◽  
Nils Bausch

Near-ground manoeuvres, such as landing, are key elements of unmanned aerial vehicle navigation. Traditionally, these manoeuvres have relied on external reference frames to measure or estimate the vehicle's velocity and height. Flying animals, by contrast, perform complex near-ground manoeuvres with ease, using only information from their visual and vestibular systems. In this paper, we apply Tau theory, a visual strategy believed to be used by many animals when approaching objects, to relative ground-distance control for unmanned vehicles. We show how this approach can be used to perform vertical and horizontal near-ground manoeuvres onto a moving target without knowledge of the height or velocity of either the vehicle or the target. The proposed system is tested in simulation; the results show that the vehicle can land on a moving target and that the proposed methods let the user choose the dynamic characteristics of the approach.
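The tau-based strategy the abstract describes can be illustrated with a 1-D closing-gap simulation. In Lee's tau theory, τ = x/ẋ is the time-to-contact of a gap x; holding τ̇ constant at k with 0 < k < 0.5 yields a soft, decelerating approach, and differentiating τ gives the required closing acceleration a = (1 − k)v²/x. A minimal sketch under those assumptions (function and parameter names are illustrative, not from the paper):

```python
# Constant tau-dot landing sketch (Lee's tau theory).
# Gap x > 0, closing velocity v < 0; tau = x / v.
# Holding d(tau)/dt = k with 0 < k < 0.5 implies the deceleration
# a = (1 - k) * v**2 / x, which brings the speed to ~0 at contact.

def simulate_tau_landing(x0=10.0, v0=-2.0, k=0.4, dt=1e-3, x_stop=0.01):
    x, v = x0, v0
    while x > x_stop:
        a = (1.0 - k) * v * v / x   # acceleration that keeps tau-dot = k
        v += a * dt                 # v < 0, a > 0: the descent slows
        x += v * dt
    return x, v

x_final, v_final = simulate_tau_landing()
# touchdown speed is a small fraction of the initial closing speed
```

Note that the control needs only τ (obtainable from vision alone, e.g. from image dilation), not the absolute height or velocity, which is the point of the approach.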


2011 ◽  
Vol 383-390 ◽  
pp. 7556-7562
Author(s):  
Tian Qin ◽  
Wan Chun Chen ◽  
Xiao Lan Xing

This paper presents a real-time optical flow algorithm for vision-based guidance of an unmanned aerial vehicle (UAV). The algorithm detects a moving target and obtains the target's image position and optical flow vectors from the image sequence. A vision-based guidance law is then designed so that the UAV follows the moving target, and the control law of the imaging seeker uses this visual information for target tracking. The method was tested on a 3-degree-of-freedom (3DOF) dual-rotor UAV with a video camera, and the results demonstrate its effectiveness.
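A guidance law of the kind described, which uses the target's image position and optical-flow vector, can be sketched as a proportional pursuit command with a flow feed-forward term for leading a moving target. All gains, the focal length, and the image centre below are illustrative assumptions, not values from the paper:

```python
# Vision-based seeker-rate sketch: steer the camera boresight toward a
# moving target using its pixel position and optical-flow vector.

def seeker_rate_command(px, py, flow_x, flow_y, kp=0.8, kf=0.5,
                        cx=320.0, cy=240.0, focal=500.0):
    """Return (pitch_rate, yaw_rate) commands in rad/s.

    px, py         : target centroid in pixels
    flow_x, flow_y : target optical flow in pixels/frame (feed-forward)
    """
    # line-of-sight angular error, small-angle approximation
    err_x = (px - cx) / focal
    err_y = (py - cy) / focal
    # proportional pursuit plus flow feed-forward to lead a moving target
    yaw_rate = kp * err_x + kf * flow_x / focal
    pitch_rate = kp * err_y + kf * flow_y / focal
    return pitch_rate, yaw_rate
```

For example, a target 100 px right of the image centre with no measured flow produces a positive yaw-rate command and zero pitch-rate command.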


2021 ◽  
Vol 102 (4) ◽  
Author(s):  
Lingjie Yang ◽  
Zhihong Liu ◽  
Xiangke Wang ◽  
Xianguo Yu ◽  
Guanzheng Wang ◽  
...  

IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 121650-121664
Author(s):  
Xi Wang ◽  
Guanzheng Tan ◽  
Yusi Dai ◽  
Fanlei Lu ◽  
Jian Zhao

2020 ◽  
Vol 53 (3-4) ◽  
pp. 427-440 ◽  
Author(s):  
Xiao Liang ◽  
Guodong Chen ◽  
Shirou Zhao ◽  
Yiwei Xiu

By exploiting the complementary characteristics of unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs), heterogeneous systems can accomplish many complex tasks cooperatively. Moving target tracking is an important basis for relative positioning and formation maintenance in such systems. This paper first introduces the UAV/UGV collaborative tracking task and the heterogeneous system. To preserve the UAV's original stability, a control method that simulates remote control over the SBUS protocol is proposed, and a control method for a Mecanum-wheeled UGV is described in detail. To address real-time performance and occlusion, a tracking scheme based on AprilTag identification is studied: the scheme tracks the tag directly when it is unoccluded and falls back to colour features around the tag when occlusion occurs, which greatly improves tracking accuracy under occlusion. Finally, the scheme is applied to the heterogeneous system. Simulation and experimental results show that the proposed method is suitable for UAV/UGV heterogeneous systems performing the collaborative tracking task.
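The occlusion-handling logic described above (track the AprilTag when visible, fall back to colour features around it when occluded) can be sketched as a small source-switching tracker. Detector outputs are passed in as arguments, so the sketch needs no vision library; the class and method names are hypothetical, not from the paper:

```python
# Occlusion-tolerant tracking sketch: prefer the AprilTag detection when
# available; fall back to a colour-feature estimate when it is occluded;
# hold the last estimate if both cues are lost.

class HybridTracker:
    def __init__(self):
        self.last_pos = None
        self.source = "none"

    def update(self, tag_pos=None, color_pos=None):
        """tag_pos / color_pos: (x, y) detections, or None if not found."""
        if tag_pos is not None:            # tag visible: trust it
            self.last_pos, self.source = tag_pos, "tag"
        elif color_pos is not None:        # tag occluded: use colour cue
            self.last_pos, self.source = color_pos, "color"
        # both missing: coast on the last estimate
        return self.last_pos, self.source
```

In use, a frame with a tag detection updates the estimate from the tag, a frame with only a colour detection switches the source to "color", and a frame with neither simply repeats the previous estimate.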


2016 ◽  
Vol 2016 ◽  
pp. 1-16 ◽  
Author(s):  
Jia Wei Tang ◽  
Nasir Shaikh-Husin ◽  
Usman Ullah Sheikh ◽  
M. N. Marsono

Moving target detection is a common task for an Unmanned Aerial Vehicle (UAV): finding and tracking objects of interest from a bird's-eye view in mobile aerial surveillance, for civilian applications such as search-and-rescue operations. The complex detection algorithm can be implemented as a real-time embedded system on a Field Programmable Gate Array (FPGA). This paper presents a real-time moving target detection System-on-Chip (SoC) implemented on an FPGA for deployment on a UAV. The detection algorithm uses an area-based image registration technique comprising motion estimation and object segmentation. The system has been prototyped on a low-cost Terasic DE2-115 board fitted with a TRDB-D5M camera. It consists of a Nios II processor and stream-oriented dedicated hardware accelerators running at a 100 MHz clock rate, achieving 30 frames per second on 640 × 480 greyscale video.
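The stated figures imply a tight per-pixel budget for the streaming accelerators: 640 × 480 greyscale frames at 30 fps through a 100 MHz pipeline leave roughly 10.85 clock cycles per pixel. A quick arithmetic check:

```python
# Cycle budget implied by the reported resolution, frame rate, and clock.
width, height, fps = 640, 480, 30
clock_hz = 100_000_000

pixels_per_second = width * height * fps          # 9,216,000 px/s
cycles_per_pixel = clock_hz / pixels_per_second   # ≈ 10.85 cycles/px
```

A budget of about 11 cycles per pixel is why stream-oriented hardware accelerators, rather than the Nios II soft processor alone, carry the registration and segmentation workload.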

