Analysis of statistical properties of atmospheric turbulence-induced image dancing based on Hilbert transform and dense optical flow

Author(s):  
Jingyuan Liu ◽  
Bindang Xue ◽  
Linyan Cui

Author(s):  
A. V. Bratulin ◽  
M. B. Nikiforov ◽  
A. I. Efimov ◽  
...  

2021 ◽  
Author(s):  
Tian Shen ◽  
Cui Long ◽  
Liu Zhaoming ◽  
Wang Hongwei ◽  
Zhang Feng ◽  
...  

2016 ◽  
Vol 36 (10) ◽  
pp. 1026016
Author(s):  
Xiang Ningjing ◽  
Wu Zhensen ◽  
Guo Qiufen

Sensors ◽  
2019 ◽  
Vol 19 (4) ◽  
pp. 807
Author(s):  
Cong Shi ◽  
Zhuoran Dong ◽  
Shrinivas Pundlik ◽  
Gang Luo

This work proposes a hardware-friendly, dense optical flow-based Time-to-Collision (TTC) estimation algorithm intended to be deployed on smart video sensors for collision avoidance. The algorithm, optimized for hardware, first extracts biological visual motion features (motion energies), and then utilizes a Random Forests regressor to predict robust and dense optical flow. Finally, TTC is reliably estimated from the divergence of the optical flow field. This algorithm involves only feed-forward data flows with simple pixel-level operations, and hence has inherent parallelism for hardware acceleration. The algorithm offers good scalability, allowing for flexible tradeoffs among estimation accuracy, processing speed, and hardware resources. Experimental evaluation shows that the Random Forests regressor improves optical flow estimation accuracy over existing voting-based approaches. Furthermore, results show that TTC values estimated by the algorithm closely follow the ground truth. The specifics of the hardware design to implement the algorithm on a real-time embedded system are laid out.
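The final step of the pipeline, estimating TTC from the divergence of the flow field, can be illustrated with a minimal standalone sketch. This is not the paper's hardware implementation; it assumes a synthetic expanding flow field for a camera approaching a frontal surface, where the flow at pixel (x, y) relative to the focus of expansion is (x, y)/TTC and the divergence is therefore 2/TTC everywhere:

```python
import numpy as np

# Hypothetical sketch: TTC from optical-flow divergence.
# For pure approach toward a frontal surface, flow = (x, y) / TTC,
# so div(flow) = du/dx + dv/dy = 2 / TTC.

true_ttc = 30.0  # assumed ground truth: frames until collision

# Pixel coordinates centered on the focus of expansion
ys, xs = np.mgrid[-50:51, -50:51].astype(float)
u = xs / true_ttc  # horizontal flow component
v = ys / true_ttc  # vertical flow component

# Numerical divergence of the flow field
du_dx = np.gradient(u, axis=1)
dv_dy = np.gradient(v, axis=0)
div = du_dx + dv_dy

# TTC from the mean divergence; a robust variant might use a
# median or trimmed mean over a region of interest instead
ttc_est = 2.0 / div.mean()
print(round(ttc_est, 2))  # → 30.0
```

On real flow fields the divergence is noisy, which is why the paper emphasizes a robust dense flow estimate (via the Random Forests regressor) before this step.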

