Tracking and Detecting Moving Objects Using Stereo Camera on Moving Platform

2012, Vol. 14 (1), pp. 1-9
Author(s): Hung-Yin Tsai, Yen-Po Lin, Cheng-Che Chen
2010
Author(s): Jonah C. McBride, Andrey Ostapchenko, Howard Schultz, Magnus S. Snorrason

Author(s): Qiang Zhou, Danping Zou, Peilin Liu

Purpose – This paper aims to develop an obstacle avoidance system for a multi-rotor micro aerial vehicle (MAV) flying in indoor environments, which often contain transparent, texture-less or moving objects.

Design/methodology/approach – The system combines a stereo camera and an ultrasonic sensor to detect obstacles and extract three-dimensional (3D) point clouds. The obstacle map is built on a coarse global map and updated by local maps generated from the most recent 3D point clouds. An efficient layered A* path planning algorithm is also proposed to address path planning in 3D space for MAVs.

Findings – The authors conducted extensive experiments in both static and dynamic scenes. The results show that the obstacle avoidance system works reliably even when transparent or texture-less obstacles are present. The layered A* algorithm is much faster than the traditional 3D algorithm and lets the system respond quickly when the obstacle map changes because of moving objects.

Research limitations/implications – The limited field of view of both the stereo camera and the ultrasonic sensor requires the vehicle to change heading before moving sideways or backward. This limitation could be addressed by mounting multiple sensing systems facing different directions on the MAV.

Practical implications – The developed approach could be valuable for indoor applications.

Originality/value – This paper presents a robust obstacle avoidance system and a fast layered path planning algorithm that are easy to implement in practical systems.
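The abstract does not spell out the layered A* implementation. As a point of reference only, below is a minimal textbook A* on a single 2D occupancy-grid layer, the building block a layered planner would run per altitude slice instead of searching the full 3D volume. All names and the grid encoding are illustrative assumptions, not the paper's code.

```python
import heapq

def astar(grid, start, goal):
    """Generic A* on a 2D occupancy grid (0 = free, 1 = obstacle).

    Shown per altitude layer: a layered planner searches 2D slices of
    the 3D map rather than the full 3D volume, which is where the
    speed-up over a conventional 3D A* search comes from.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance, admissible for 4-connected moves
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), start)]   # heap of (f = g + h, cell)
    came_from = {start: None}
    g_cost = {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:              # walk parents back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            ng = g_cost[cur] + 1
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and ng < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = ng
                came_from[nxt] = cur
                heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None  # goal unreachable on this layer
```

A layered variant would re-run a search of this shape per altitude slice and connect slices with vertical moves; the paper's exact layering scheme is not described in the abstract.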


2003, Vol. 15 (3), pp. 304-313
Author(s): Atsushi Yamashita, Toru Kaneko, Shinya Matsushita, Kenjiro T. Miura, et al.

In this paper, we propose a fast, easy camera calibration and 3-D measurement method using an active stereo vision system for handling moving objects whose geometric models are known. We use stereo cameras that change direction independently to follow moving objects. To obtain the extrinsic camera parameters in real time, a baseline (parallel) stereo camera model and a projective transformation of the stereo images are used, taking epipolar constraints into account. To make use of the 3-D measurement results for a moving object, the manipulator hand approaches the object. When the manipulator hand and the object are close enough to appear in a single image, a very accurate camera calibration is executed to calculate the manipulator size in the image. Our calibration is simple and practical because it does not need to calibrate all camera parameters. The computation time for real-time calibration is small because only one parameter must be searched for in real time, the relationships between the remaining parameters having been determined in advance. Our method requires neither complicated image processing nor matrix calculation. Experimental results show that the accuracy of 3-D reconstruction of a cubic box with a 60 mm edge is within 1.8 mm when the distance between the camera and the box is 500 mm. The total computation time for object tracking, camera calibration, and manipulation control is within 0.5 seconds.
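For a baseline (parallel) stereo camera model like the one described above, depth follows directly from image disparity via Z = f·B/d. A minimal sketch of that relationship (function and parameter names are illustrative, not from the paper):

```python
def stereo_depth(x_left, x_right, focal_px, baseline_mm):
    """Depth from a rectified (parallel) stereo pair: Z = f * B / d.

    x_left, x_right: horizontal pixel coordinates of the same point
                     in the left and right images
    focal_px:        focal length in pixels
    baseline_mm:     distance between the two optical centres in mm
    """
    disparity = x_left - x_right  # pixels; larger disparity = closer point
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_mm / disparity  # depth in mm
```

For example, with an (assumed) 800 px focal length and 120 mm baseline, a 192 px disparity corresponds to a depth of 500 mm, matching the working distance reported in the abstract.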


2010
Author(s): Frank B. ter Haar, Richard J. M. den Hollander, Judith Dijk

Author(s): H. Kauhanen, P. Rönnholm

Synchronous triggering allows simultaneous data capture from multiple cameras; accurate synchronization enables 3D measurements of moving objects or from a moving platform. In this paper, we describe one wired and four wireless variations of Arduino-based, low-cost remote trigger systems designed to provide a synchronous trigger signal for industrial cameras. Our wireless systems operate at 315 MHz or 434 MHz and use noise-filtering capacitors. To validate the synchronization accuracy, we developed a prototype of a rotating trigger detection system (named RoTriDeS), which is suitable for measuring the triggering accuracy of global-shutter cameras. The wired system showed an 8.91 μs mean triggering time difference between two cameras; the corresponding mean values for the four wireless triggering systems varied between 7.92 and 9.42 μs. These values include both camera-based and trigger-based desynchronization. Arduino-based triggering systems proved feasible, and they have the potential to be extended into more complex triggering setups.
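The reported mean triggering time differences are statistics over per-frame trigger timings. One way such a figure could be computed from two cameras' trigger timestamps is sketched below; the function name and the frame-alignment assumption are ours, not the paper's.

```python
def mean_trigger_difference_us(timestamps_a, timestamps_b):
    """Mean absolute triggering time difference, in microseconds,
    between two cameras' per-frame trigger timestamps (in seconds).

    Assumes both lists are the same length and frame-aligned, i.e.
    element i of each list refers to the same trigger event.
    """
    if len(timestamps_a) != len(timestamps_b):
        raise ValueError("timestamp lists must be frame-aligned")
    diffs = [abs(a - b) * 1e6 for a, b in zip(timestamps_a, timestamps_b)]
    return sum(diffs) / len(diffs)
```

Note this captures the combined camera-based and trigger-based desynchronization, as the abstract's figures do.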

