On-board real-time optic-flow for miniature event-based vision sensors
Author(s): J. Conradt

2018 · Vol 30 (9) · pp. 2384-2417
Author(s): M. B. Milde, O. J. N. Bertrand, H. Ramachandran, M. Egelhaaf, E. Chicca

Apparent motion of the surroundings on an agent's retina can be used to navigate through cluttered environments, avoid collisions with obstacles, or track targets of interest. The pattern of apparent motion of objects (i.e., the optic flow) contains spatial information about the surrounding environment. For a small, fast-moving agent, such as those used in search and rescue missions, it is crucial to quickly estimate the distance to nearby objects in order to avoid collisions. This estimation cannot be done with conventional methods, such as frame-based optic-flow estimation, given the size, power, and latency constraints of the necessary hardware. A practical alternative makes use of event-based vision sensors. Contrary to the frame-based approach, they produce so-called events only when there are changes in the visual scene. We propose a novel asynchronous circuit, the spiking elementary motion detector (sEMD), composed of a single silicon neuron and synapse, to detect elementary motion from an event-based vision sensor. The sEMD encodes the time an object's image needs to travel across the retina into a burst of spikes. The number of spikes within the burst is proportional to the speed of the events across the retina. A fast but imprecise estimate of the time-to-travel can already be obtained from the first two spikes of a burst and refined by subsequent interspike intervals. The latter encoding scheme is possible due to an adaptive nonlinear synaptic efficacy scaling. We show that the sEMD can be used to compute a collision-avoidance direction in the context of robotic navigation in a cluttered outdoor environment, and we compare this direction with that of a frame-based algorithm. The proposed computational principle constitutes a generic spiking temporal correlation detector that can be applied to other sensory modalities (e.g., sound localization), and it provides a novel perspective on gating information in spiking neural networks.
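
The burst-based time-to-travel encoding can be made concrete in a few lines of code. The following is a minimal Python sketch of a facilitate-and-trigger detector, not the paper's silicon circuit: it assumes an exponentially decaying facilitation trace and a burst length proportional to the trace value at the time of the second event; the parameters tau and max_spikes are illustrative, not values from the paper.

```python
import math

def semd_burst(t_a, t_b, tau=0.02, max_spikes=20):
    """Facilitate-and-trigger sketch of a spiking elementary motion detector.

    An event at pixel A (time t_a, seconds) charges a facilitation trace
    that decays exponentially; the event at the neighbouring pixel B
    (time t_b) samples the remaining trace and triggers a burst whose
    length grows with the trace value, i.e., shrinks with the
    time-to-travel t_b - t_a.
    """
    dt = t_b - t_a
    if dt < 0:
        return 0                               # wrong direction: no burst
    facilitation = math.exp(-dt / tau)         # decayed synaptic efficacy
    return round(max_spikes * facilitation)    # burst length encodes speed

# A fast edge (5 ms time-to-travel) elicits a longer burst than a slow one (50 ms).
print(semd_burst(0.0, 0.005), semd_burst(0.0, 0.050))
```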


2017 · Vol 11
Author(s): Abhishek Mishra, Rohan Ghosh, Jose C. Principe, Nitish V. Thakor, Sunil L. Kukreja

IEEE Access · 2019 · Vol 7 · pp. 134926-134942
Author(s): Alejandro Linares-Barranco, Fernando Perez-Pena, Diederik Paul Moeys, Francisco Gomez-Rodriguez, Gabriel Jimenez-Moreno, ...

Author(s): Andrés Bruhn, Joachim Weickert, Christian Feddern, Timo Kohlberger, Christoph Schnörr

Sensors · 2019 · Vol 19 (7) · pp. 1584
Author(s): Yushan Li, Wenbo Zhang, Xuewu Ji, Chuanxiang Ren, Jian Wu

Shadows, changes in lighting, and broken lane markings can cause the lane curvature output by a vision sensor to jump over short periods of time, which leads to serious problems for autonomous driving control. It is therefore particularly important to predict or compensate for the real lane in real time while the sensor output jumps. This paper presents a lane compensation method based on multi-sensor fusion of a global positioning system (GPS), an inertial measurement unit (IMU), and vision sensors. A cubic polynomial function of the longitudinal distance is selected as the lane model. In this method, a Kalman filter is used to estimate vehicle velocity and yaw angle from the GPS and IMU measurements, and a vehicle kinematics model is established to describe the vehicle motion. The geometric relationship between the vehicle and the relative lane motion at the current moment is then used to solve for the coefficients of the lane polynomial at the next moment. Simulation and vehicle test results show that the predicted information can compensate for failures of the vision sensor with good real-time performance, robustness, and accuracy.
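
The coefficient-propagation step admits a compact illustration. Below is a minimal Python sketch, not the paper's implementation: it samples the current lane polynomial, moves the sampled points into the vehicle frame predicted by a simple planar kinematic model, and refits the cubic by least squares. The velocity v and yaw rate are assumed to come from the GPS/IMU Kalman filter, and the sampling range is illustrative.

```python
import numpy as np

def propagate_lane(coeffs, v, yaw_rate, dt, x_range=(0.0, 50.0), n=25):
    """Predict the next-frame lane polynomial from the vehicle's own motion.

    coeffs: [c0, c1, c2, c3] of y = c0 + c1*x + c2*x^2 + c3*x^3 in the
    current vehicle frame (x forward, y left, metres). The vehicle advances
    v*dt along its heading and turns by yaw_rate*dt; the sampled lane
    points are expressed in the new vehicle frame and a cubic is refitted.
    """
    x = np.linspace(*x_range, n)
    y = np.polyval(coeffs[::-1], x)            # sample the current lane
    dpsi = yaw_rate * dt                       # heading change over dt
    # Vehicle displacement in the current frame (midpoint arc model).
    dx, dy = v * dt * np.cos(dpsi / 2), v * dt * np.sin(dpsi / 2)
    # Move points into the next frame: translate by (-dx, -dy), rotate by -dpsi.
    xs, ys = x - dx, y - dy
    c, s = np.cos(-dpsi), np.sin(-dpsi)
    xn, yn = c * xs - s * ys, s * xs + c * ys
    return np.polyfit(xn, yn, 3)[::-1]         # back to [c0, c1, c2, c3]

# Example: gently right-curving lane, 20 m/s, slight left yaw, 50 ms step.
print(propagate_lane([1.5, 0.01, -2e-4, 1e-6], v=20.0, yaw_rate=0.05, dt=0.05))
```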


2020 · Vol 10 (5) · pp. 1611
Author(s): Michael H. Spiegel, Edmund Widl, Bernhard Heinzl, Wolfgang Kastner, Nabil Akroud

Various development and validation methods for cyber-physical systems, such as Controller-Hardware-in-the-Loop (C-HIL) testing, strongly benefit from a seamless integration of (hardware) prototypes and simulation models. It has often been demonstrated that linking discrete event-based control systems with hybrid plant models can advance the quality of control implementations. Nevertheless, high manual coupling effort and spurious simulation artifacts such as glitches and deviations are frequently observed. This work addresses these two issues by presenting a generic, standards-based infrastructure referred to as a virtual component, which enables the efficient coupling of simulation models and automation systems. A novel soft real-time coupling algorithm featuring event-accurate synchronization by extrapolating future model states is outlined. Based on the considered standards for model exchange (FMI) and control (IEC 61499), important properties such as real-time capabilities are derived and experimentally validated. The evaluation demonstrates that virtual components support engineers in efficiently creating C-HIL setups and that the novel algorithm achieves accurate synchronization where conventional approaches fail.
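
The extrapolation idea behind such a coupling algorithm can be sketched in a few lines. The following Python toy, under strong simplifying assumptions, uses an analytic first-order plant as a stand-in for an FMI model and serves the controller a linear extrapolation of the two most recent plant states at each communication point, so the control side never waits for the (possibly slow) model computation; the exact state replaces the prediction once it is available. It does not reproduce the paper's virtual-component infrastructure or its event-accurate synchronization.

```python
import math

def coupling_demo(dt=0.01, n=10):
    """Soft real-time coupling sketch with first-order state extrapolation.

    A toy plant y' = -y (solution exp(-t)) is stepped in fixed macro steps
    of size dt. At each communication point the controller is served a
    value extrapolated from the two most recent plant states; the exact
    state is computed afterwards and feeds the next prediction.
    """
    plant = lambda t: math.exp(-t)            # analytic stand-in for an FMU
    t, y0, y1 = dt, plant(0.0), plant(dt)     # two states to extrapolate from
    for _ in range(n):
        t_next = t + dt
        y_pred = y1 + (y1 - y0)               # linear extrapolation to t_next
        print(f"t={t_next:.2f}  served={y_pred:.4f}  exact={plant(t_next):.4f}")
        y0, y1 = y1, plant(t_next)            # replace prediction with the
        t = t_next                            # real state once available

coupling_demo()
```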

