3D Hough Transform
Recently Published Documents


TOTAL DOCUMENTS: 23 (five years: 1)

H-INDEX: 7 (five years: 0)

2021 ◽  
Vol 13 (19) ◽  
pp. 3817
Author(s):  
Yimeng Zou ◽  
Jiahao Tian ◽  
Guanghu Jin ◽  
Yongsheng Zhang

Distributed radar arrays bring several new advantages to aerospace target detection and imaging. A two-dimensional distributed array avoids the imperfect motion compensation required by coherent processing along slow time and can achieve single-snapshot 3D imaging. However, several difficulties arise in the 3D imaging processing. The first is that the distributed array may contain only a small number of elements, so the sampling does not satisfy the Nyquist sampling theorem. The second is that echoes of objects in the same beam are mixed together, which makes the sparse-optimization dictionary so long that it imposes a huge computational burden on the imaging process. In this paper, we propose an innovative method for 3D imaging of aerospace targets in wide airspace with a sparse radar array. Firstly, because the case of multiple targets cannot be processed uniformly in the imaging process, a 3D Hough transform based on range-profile plane differences is proposed, which can detect and separate the echoes of different targets. Secondly, in the subsequent imaging process, considering the non-uniform sparse sampling of the distributed array in space, a migration-through-range-cell (MTRC)-tolerant imaging method is proposed to process the signal of the two-dimensional sparse array. A uniformized method combining compressed sensing (CS) imaging in the azimuth direction with matched filtering in the range direction realizes the 3D imaging effectively; before imaging in the azimuth direction, interpolation is carried out in the range direction. The main contributions of the proposed method are: (1) echo separation based on the 3D Hough transform avoids the huge computational cost of direct sparse-optimization imaging of three-dimensional data and ensures the realizability of the algorithm; and (2) a uniformized sparse-solving imaging scheme is proposed, which removes the difficulty caused by MTRC. Simulation experiments verified the effectiveness and feasibility of the proposed method.
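The azimuth CS step described above amounts to a sparse-recovery problem. The sketch below is a minimal orthogonal matching pursuit (OMP) solver, not the authors' implementation; the dictionary `A`, measurement `y`, and sparsity `k` are generic placeholders standing in for the azimuth steering dictionary, the sparse-array samples, and an assumed number of scatterers.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x with y ~= A @ x.

    A : (m, n) dictionary (placeholder for an azimuth steering dictionary)
    y : (m,) measurement vector (placeholder for sparse-array samples)
    k : assumed number of nonzero entries (scatterers)
    """
    residual = y.astype(complex).copy()
    support = []
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(A.conj().T @ residual)))
        support.append(j)
        # Re-fit all selected atoms jointly by least squares.
        As = A[:, support]
        coef, *_ = np.linalg.lstsq(As, y, rcond=None)
        residual = y - As @ coef
    x[support] = coef
    return x
```

With an orthonormal dictionary the recovery is exact; in the radar setting the dictionary columns are only approximately orthogonal, which is where the restricted-isometry conditions of CS come in.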


Sensors ◽  
2020 ◽  
Vol 20 (11) ◽  
pp. 3146
Author(s):  
Xin Chen ◽  
Houjin Chen ◽  
Yahui Peng ◽  
Dan Tao

A 3D ultrasound image reconstruction technique, named probe sector matching (PSM), is proposed in this paper for a freehand linear-array ultrasound probe equipped with multiple sensors that provide the position and attitude of the transducer and the pressure between the transducer and the target surface. The proposed PSM method includes three main steps. First, the imaging target and the working range of the probe are set as the center and the radius of the imaging field of view, respectively. To reconstruct a 3D volume, the positions of all necessary probe sectors are pre-calculated inversely to form a sector database. Second, 2D cross-sectional probe sectors with the corresponding optical positioning, attitude, and pressure information are collected while the ultrasound probe moves around the imaging target. Last, an improved 3D Hough transform is used to match the plane of the current probe sector to the existing sector images in the sector database. After all pre-calculated probe sectors are acquired and matched into the 3D space defined by the sector database, the 3D ultrasound reconstruction is complete. The PSM is validated through two experiments: a virtual simulation using a numerical model and a lab experiment using a real physical model. The experimental results show that the PSM effectively reduces the errors caused by changes in the target position due to uneven surface pressure or inhomogeneity of the transmission media. We conclude that the PSM proposed in this study may help to design a lightweight, inexpensive, and flexible ultrasound device with accurate 3D imaging capability.
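The third step, matching the current probe sector's plane against the pre-computed sector database, can be pictured as a nearest-plane search. The abstract does not describe the improved 3D Hough transform itself, so the sketch below is only a plausible stand-in: each sector is reduced to its plane in Hesse normal form (unit normal plus signed offset), and the database entry with the smallest combined normal-misalignment and offset gap wins. All names and the cost function are assumptions.

```python
import numpy as np

def plane_params(position, direction):
    """Hesse normal form (unit normal, signed offset) of a sector's plane."""
    n = np.asarray(direction, dtype=float)
    n = n / np.linalg.norm(n)
    return n, float(n @ np.asarray(position, dtype=float))

def match_sector(query_pose, database):
    """Index of the database sector whose plane is closest to the query.

    query_pose : (position, plane_normal) of the current probe sector
    database   : list of (position, plane_normal) pre-calculated sectors
    Cost mixes normal misalignment (1 - cos angle) and offset difference.
    """
    qn, qr = plane_params(*query_pose)
    best, best_cost = -1, np.inf
    for idx, (pos, dirn) in enumerate(database):
        n, r = plane_params(pos, dirn)
        if qn @ n < 0:          # planes are unoriented: align the normals
            n, r = -n, -r
        cost = (1.0 - qn @ n) + abs(qr - r)
        if cost < best_cost:
            best, best_cost = idx, cost
    return best
```

A Hough-style formulation would instead vote each sector into a discretized (normal, offset) accumulator, which tolerates the pose noise that a direct nearest-neighbor search does not.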


2020 ◽  
Vol 10 (5) ◽  
pp. 1744 ◽  
Author(s):  
Yifei Tian ◽  
Wei Song ◽  
Long Chen ◽  
Yunsick Sung ◽  
Jeonghoon Kwak ◽  
...  

Plane extraction is regarded as a necessary function supporting judgment in many applications, including semantic digital map reconstruction and path planning for unmanned ground vehicles. Owing to the heterogeneous density and unstructured spatial distribution of three-dimensional (3D) point clouds collected by light detection and ranging (LiDAR), plane extraction from them remains a significant challenge. This paper proposes a parallel 3D Hough transform algorithm to realize rapid and precise plane detection from 3D LiDAR point clouds. After transforming all the 3D points from a Cartesian coordinate system to a pre-defined 3D Hough space, the generated Hough space is rasterized into a series of cells, each storing the count of points that fall within it. A 3D connected-component labeling algorithm is developed to cluster the high-valued cells in Hough space into several clusters. The peaks of these clusters are then extracted so that the target planar surfaces are obtained in polar coordinates. Because the laser beams emitted by a LiDAR sensor hold several fixed angles, the collected 3D point clouds are distributed as several horizontal, parallel circles on plane surfaces. These horizontal, parallel circles can mislead plane detection, shifting results from horizontal wall surfaces to spurious parallel planes. To obtain accurate plane parameters, this paper adopts a fraction-to-fraction method that gradually transforms the raw point clouds into a series of sub-Hough-space buffers. In the proposed planar detection algorithm, graphics processing unit (GPU) programming is applied to speed up the updating of the 3D Hough space and the searching for peaks.
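The voting-and-peak-search core described above can be sketched as follows. This is a minimal single-threaded version, assuming the usual (theta, phi, rho) plane parameterisation; the GPU parallelisation, sub-space buffering, and connected-component clustering of the paper are omitted.

```python
import numpy as np

def hough_plane_detect(points, n_theta=30, n_phi=30, n_rho=40):
    """Vote 3D points into a (theta, phi, rho) accumulator and return the peak.

    A plane in Hesse normal (polar) form satisfies
        rho = x*cos(theta)*sin(phi) + y*sin(theta)*sin(phi) + z*cos(phi).
    """
    points = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    phis = np.linspace(0.0, np.pi, n_phi, endpoint=False)
    rho_max = float(np.linalg.norm(points, axis=1).max())
    acc = np.zeros((n_theta, n_phi, n_rho), dtype=np.int32)
    for i, th in enumerate(thetas):
        for j, ph in enumerate(phis):
            # Unit normal of this (theta, phi) cell.
            n = np.array([np.cos(th) * np.sin(ph),
                          np.sin(th) * np.sin(ph),
                          np.cos(ph)])
            rho = points @ n                       # signed point-plane offsets
            k = np.floor((rho + rho_max) / (2.0 * rho_max) * n_rho).astype(int)
            k = np.clip(k, 0, n_rho - 1)
            np.add.at(acc[i, j], k, 1)             # vote into the rho bins
    # Peak cell -> dominant plane parameters (cluster-based peak search
    # would go here in the full algorithm).
    i, j, k = np.unravel_index(int(acc.argmax()), acc.shape)
    rho = (k + 0.5) / n_rho * 2.0 * rho_max - rho_max
    return thetas[i], phis[j], rho, acc
```

The two nested parameter loops are exactly what the paper maps onto GPU threads: each (theta, phi) cell votes independently, so the accumulator update is embarrassingly parallel.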


Author(s):  
Egor. I. Ershov ◽  
Arseniy P. Terekhin ◽  
Simon M. Karpenko ◽  
Dmitry P. Nikolaev ◽  
Vassili V. Postnikov

Author(s):  
Yegor V. Goshin ◽  
Galina E. Loshkareva ◽  
...  

2014 ◽  
Vol 25 (7) ◽  
pp. 1877-1891 ◽  
Author(s):  
Marco Camurri ◽  
Roberto Vezzani ◽  
Rita Cucchiara
