Recognition of free-form objects in dense range data using local features

Author(s): Richard J. Campbell, Patrick J. Flynn

Model-based 3D object recognition systems have a variety of potential applications, but their widespread use has not occurred due to a number of factors, including the representational limitations of object models. One historical limitation has been the discriminative representation of free-form objects. The system described in this paper recognizes free-form objects in dense range data acquired by a structured-light rangefinder. Images and object models are represented as networks of salient segments, which are then brought into correspondence until a reliable pose estimate is available. Experiments with a database of images and object models highlight the contributions of this system.
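The abstract does not spell out the pose estimation step, so as a hedged illustration of the general primitive behind "segments ... brought into correspondence until a reliable pose estimate is available", the sketch below shows the standard Kabsch/SVD least-squares rigid alignment of matched 3D points, a common building block in correspondence-driven range-image recognition. It is not the authors' specific method, and the function name is ours.

```python
import numpy as np

def rigid_pose_from_correspondences(src, dst):
    """Least-squares rigid transform (R, t) with dst ~ R @ src + t,
    given matched (N, 3) point sets (Kabsch algorithm). Illustrative
    only; the paper's own pose estimation procedure is not given here."""
    src_c = src - src.mean(axis=0)           # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

In practice an estimate like this sits inside a hypothesize-and-verify loop: candidate segment correspondences propose a pose, and the pose is kept only if it aligns the remaining segments within tolerance.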


A method for 6D pose estimation of free-form rigid objects using point pair features on range data

Sensors, 2018, Vol. 18 (8), pp. 2678
Author(s): Joel Vidal, Chyi-Yeu Lin, Xavier Lladó, Robert Martí

Pose estimation of free-form objects is a crucial task for flexible, reliable, and highly complex autonomous systems. Recently, methods based on range and RGB-D data have shown promising results, with relatively high recognition rates and fast running times. Along these lines, this paper presents a feature-based method for 6D pose estimation of rigid objects built on the Point Pair Features voting approach. The presented solution combines a novel preprocessing step, which takes into account the discriminative value of surface information, with an improved matching method for Point Pair Features. In addition, an improved clustering step and a novel view-dependent re-scoring process are proposed, alongside two scene-consistency verification steps. The performance of the proposed method is evaluated against 15 state-of-the-art solutions on an extensive and varied set of publicly available datasets with real-world scenarios under clutter and occlusion. The presented results show that the proposed method outperforms all tested state-of-the-art methods on all datasets, with an overall 6.6% relative improvement over the second-best method.
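The Point Pair Features named in the abstract are the four-dimensional descriptors of Drost et al. that this family of voting methods is built on. Below is a minimal sketch of the feature and its discretization; the quantization step sizes are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def point_pair_feature(p1, n1, p2, n2):
    """Four-dimensional Point Pair Feature for two oriented points
    (position p, unit normal n):
    F = (||d||, angle(n1, d), angle(n2, d), angle(n1, n2)), d = p2 - p1."""
    d = p2 - p1
    dist = np.linalg.norm(d)
    if dist == 0.0:
        return None                          # degenerate pair
    d_hat = d / dist

    def angle(a, b):
        # Numerically safe angle between unit vectors, in [0, pi].
        return np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))

    return np.array([dist, angle(n1, d_hat), angle(n2, d_hat), angle(n1, n2)])

def quantized_key(f, dist_step=0.05, angle_step=np.deg2rad(12)):
    """Discretize a feature so that similar pairs share a hash-table key.
    Step sizes here are hypothetical, chosen only for illustration."""
    return (int(f[0] / dist_step), int(f[1] / angle_step),
            int(f[2] / angle_step), int(f[3] / angle_step))
```

In the voting approach, every model point pair is stored in a hash table under its quantized key; at runtime, scene pairs look up matching model pairs and vote in an accumulator over (model reference point, planar rotation angle). Peaks in the accumulator yield pose hypotheses that are then clustered, re-scored, and verified, the stages this paper refines.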

