Millimeter-wave insect vision sensors for collision avoidance in space

1999 ◽  
Author(s):  
David C. Goodfellow ◽  
Derek Abbott

2021 ◽  
Vol 13 (13) ◽  
pp. 2643
Author(s):  
Dário Pedro ◽  
João P. Matos-Carvalho ◽  
José M. Fonseca ◽  
André Mora

Unmanned Autonomous Vehicles (UAVs), while not a recent invention, have recently acquired a prominent position in many industries: they are increasingly used not only by enthusiasts but also in demanding technical use-cases, and they will have a significant societal effect in the coming years. However, the use of UAVs is fraught with significant safety threats, such as collisions with dynamic obstacles (other UAVs, birds, or randomly thrown objects). This research focuses on a safety problem that is often overlooked due to a lack of technology and solutions to address it: collisions with non-stationary objects. A novel approach is described that employs deep learning techniques to solve the computationally intensive problem of real-time collision avoidance with dynamic objects using off-the-shelf commercial vision sensors. The viability of the suggested approach was corroborated by multiple experiments, first in simulation and afterward in a concrete real-world case that consists of dodging a thrown ball. A novel video dataset was created and made available for this purpose, and transfer learning was also tested, with positive results.
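The abstract's deep-learning pipeline is not reproduced here, but the underlying vision-based collision-timing problem it addresses can be illustrated with the classical "looming" cue: for an object approaching at constant speed, the time-to-contact is roughly the object's image area divided by half its rate of expansion. The function name and all numbers below are illustrative assumptions, not the authors' method.

```python
def time_to_contact(area_prev, area_curr, dt):
    """Estimate seconds until impact from image-area expansion.

    Uses tau ~ 2 * A / (dA/dt): image size scales as 1/distance, so
    area scales as 1/distance^2, hence the factor 2 relative to the
    size-based looming cue.
    """
    dA = (area_curr - area_prev) / dt   # area expansion rate (px^2 / s)
    if dA <= 0:                         # object not approaching
        return float("inf")
    return 2.0 * area_curr / dA

# A thrown ball whose image area doubles in 0.1 s is ~0.4 s from impact.
tau = time_to_contact(100.0, 200.0, 0.1)
```

A real pipeline would obtain the object's image area from a detector running on the vision-sensor stream; this sketch only shows why expansion rate alone, without metric depth, suffices for collision timing.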


2005 ◽  
Author(s):  
R. Guzinski ◽  
K. Nguyen ◽  
Z. H. Yong ◽  
S. Rajesh ◽  
D. C. O'Carroll ◽  
...  

2008 ◽  
Vol 2008.17 (0) ◽  
pp. 273-276
Author(s):  
Yuta Takimoto ◽  
Hirotomo Muroi ◽  
Ikuko Shimizu ◽  
Masao Nagai ◽  
Michael Darms ◽  
...  

1999 ◽  
Author(s):  
David C. Goodfellow ◽  
Gregory P. Harmer ◽  
Derek Abbott

2018 ◽  
Vol 30 (9) ◽  
pp. 2384-2417 ◽  
Author(s):  
M. B. Milde ◽  
O. J. N. Bertrand ◽  
H. Ramachandran ◽  
M. Egelhaaf ◽  
E. Chicca

Apparent motion of the surroundings on an agent's retina can be used to navigate through cluttered environments, avoid collisions with obstacles, or track targets of interest. The pattern of apparent motion of objects (i.e., the optic flow) contains spatial information about the surrounding environment. For a small, fast-moving agent, as used in search and rescue missions, it is crucial to estimate the distance to close-by objects quickly in order to avoid collisions. This estimation cannot be done by conventional methods, such as frame-based optic flow estimation, given the size, power, and latency constraints of the necessary hardware. A practical alternative makes use of event-based vision sensors. Contrary to the frame-based approach, they produce so-called events only when there are changes in the visual scene. We propose a novel asynchronous circuit, the spiking elementary motion detector (sEMD), composed of a single silicon neuron and synapse, to detect elementary motion from an event-based vision sensor. The sEMD encodes the time an object's image needs to travel across the retina into a burst of spikes. The number of spikes within the burst is proportional to the speed of the events across the retina. A fast but imprecise estimate of the time-to-travel can already be obtained from the first two spikes of a burst and refined by subsequent interspike intervals. The latter encoding scheme is possible due to an adaptive nonlinear synaptic efficacy scaling. We show that the sEMD can be used to compute a collision avoidance direction in the context of robotic navigation in a cluttered outdoor environment, and we compare this direction to that of a frame-based algorithm. The proposed computational principle constitutes a generic spiking temporal correlation detector that can be applied to other sensory modalities (e.g., sound localization), and it provides a novel perspective on gating information in spiking neural networks.
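The burst-coding idea described above can be sketched in a few lines. This is a behavioral toy, not the silicon neuron-and-synapse circuit: the exponential scaling and the `max_spikes` and `tau` constants are assumptions chosen only to reproduce the qualitative relation that a shorter time-to-travel yields a longer burst.

```python
import math

def semd_burst(t1, t2, max_spikes=10, tau=0.01):
    """Encode the time-to-travel between two neighboring events as a burst.

    t1, t2: timestamps (s) at which adjacent pixels saw the moving edge.
    Returns the number of spikes in the burst: faster motion (smaller
    dt = t2 - t1) yields more spikes, mimicking the nonlinear synaptic
    efficacy scaling described in the abstract.
    """
    dt = t2 - t1
    if dt <= 0:  # wrong (null) direction or coincident events: no burst
        return 0
    # Illustrative nonlinearity: spike count decays exponentially with dt.
    return max(1, round(max_spikes * math.exp(-dt / tau)))

# A fast edge (1 ms travel time) produces a longer burst than a slow one (20 ms).
fast = semd_burst(0.000, 0.001)
slow = semd_burst(0.000, 0.020)
```

In the actual circuit the first interspike interval already gives a coarse speed estimate that later intervals refine; this sketch collapses that temporal detail into a single spike count.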


1995 ◽  
Author(s):  
Derek Abbott ◽  
Alireza Moini ◽  
Andre Yakovleff ◽  
X. Thong Nguyen ◽  
Andrew Blanksby ◽  
...  

1997 ◽  
Author(s):  
Derek Abbott ◽  
Alireza Moini ◽  
Andre Yakovleff ◽  
X. Thong Nguyen ◽  
R. Beare ◽  
...  
