Point-Cloud Fast Filter for People Detection with Indoor Service Robots

Author(s):  
Carlos Medina Sanchez ◽  
Jesus Capitan ◽  
Matteo Zella ◽  
Pedro J. Marron

Robotics ◽  
2019 ◽  
Vol 8 (3) ◽  
pp. 75 ◽  
Author(s):  
Claudia Álvarez-Aparicio ◽  
Ángel Manuel Guerrero-Higueras ◽  
Francisco Javier Rodríguez-Lera ◽  
Jonatan Ginés Clavero ◽  
Francisco Martín Rico ◽  
...  

The tracking of people is an indispensable capacity in almost any robotic application. A relevant case is the @home robotic competitions, where service robots have to demonstrate certain skills that allow them to interact with the environment and the people who occupy it; for example, receiving the people who knock at the door and attending to them as appropriate. Many of these skills are based on the ability to detect and track a person. It is a challenging problem, particularly when implemented using low-definition sensors, such as Laser Imaging Detection and Ranging (LIDAR) sensors, in environments where several people are interacting. This work describes a solution based on a single LIDAR sensor to maintain a continuous identification of a person in time and space. The system described is based on the People Tracker package, aka PeTra, which uses a convolutional neural network to identify people's legs in complex environments. A new feature has been included in the system to correlate people's location estimates over time by using a Kalman filter. To validate the solution, a set of experiments has been carried out in a test environment certified by the European Robotic League.
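The idea of correlating successive location estimates over time can be sketched with a constant-velocity Kalman filter over a 2-D position. This is a minimal illustration of the general technique, not the PeTra implementation; the state layout, noise covariances, and class name are assumptions.

```python
import numpy as np

class PersonTracker:
    """Illustrative constant-velocity Kalman filter for one tracked person."""

    def __init__(self, x, y, dt=0.1):
        # State: [x, y, vx, vy]; start at the first detection, zero velocity.
        self.s = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4)                   # state covariance
        self.F = np.eye(4)                   # constant-velocity transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))            # we observe position only
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * 0.01            # process noise (assumed value)
        self.R = np.eye(2) * 0.05            # measurement noise (assumed value)

    def predict(self):
        """Propagate the state to the next scan; returns predicted position."""
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def update(self, zx, zy):
        """Fuse a new leg-detection position into the track."""
        y = np.array([zx, zy]) - self.H @ self.s       # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.s[:2]

# A detection near the predicted position can be associated with the same
# track, which is what keeps a person's identity consistent between scans.
tracker = PersonTracker(1.0, 2.0)
tracker.predict()
pos = tracker.update(1.05, 2.1)
```

The filtered position lands between the prediction and the new measurement, weighted by the relative uncertainty of each.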


2019 ◽  
Vol 16 (1) ◽  
pp. 172988141983184 ◽  
Author(s):  
Brayan S Zapata-Impata ◽  
Pablo Gil ◽  
Jorge Pomares ◽  
Fernando Torres

Industrial and service robots deal with the complex task of grasping objects that have different shapes and are seen from diverse points of view. In order to perform grasps autonomously, the robot must calculate where to place its robotic hand to ensure that the grasp is stable. We propose a method to find the best pair of grasping points given a three-dimensional point cloud with a partial view of an unknown object. We use a set of straightforward geometric rules to explore the cloud and propose grasping points on the surface of the object. We then adapt the pair of contacts to the multi-fingered hand used in our experiments. We show that, after performing 500 grasps of different objects, our approach is fast, taking an average of 17.5 ms to propose contacts, while attaining a grasp success rate of 85.5%. Moreover, the method is sufficiently flexible and stable to work with objects in changing environments, such as those confronted by industrial or service robots.
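One simple geometric rule of the kind described above is the antipodal condition: a contact pair is good when each surface normal opposes the axis connecting the two contacts. The sketch below scores candidate pairs on a toy cloud under that condition; the scoring function, thresholds, and exhaustive search are illustrative assumptions, not the authors' exact rules.

```python
import numpy as np

def antipodal_score(p1, n1, p2, n2):
    """Score a contact pair in [0, 1]; 1 means perfectly antipodal contacts."""
    axis = p2 - p1
    d = np.linalg.norm(axis)
    if d < 1e-9:
        return 0.0
    axis /= d
    # Outward normals should point along opposite directions of the grasp axis.
    a1 = -np.dot(n1, axis)   # n1 should oppose the axis
    a2 = np.dot(n2, axis)    # n2 should align with it
    return max(0.0, a1) * max(0.0, a2)

def best_grasp_pair(points, normals):
    """Exhaustively score all pairs; return the best pair's indices and score."""
    best, best_pair = -1.0, None
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            s = antipodal_score(points[i], normals[i], points[j], normals[j])
            if s > best:
                best, best_pair = s, (i, j)
    return best_pair, best

# Toy cloud: two points on opposite faces of a box, plus one on a side face.
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.05, 0.05, 0.0]])
nrm = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
pair, score = best_grasp_pair(pts, nrm)
```

On the toy cloud the two opposite-face points win with a perfect score, since their normals are exactly anti-parallel to the connecting axis. A practical system would estimate normals from local neighborhoods and prune candidate pairs by gripper aperture before scoring.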


Author(s):  
Yanzhu Hu ◽  
Yingjian Wang ◽  
Song Wang ◽  
Xu Zhao

To meet the precise-service and emergency-rescue needs of medical service robots in irregular scenes, and to achieve better navigation and path planning in service scenarios, this article proposes a framework for whole-scene three-dimensional (3D) point-cloud reconstruction at absolute scale, based on the fusion of scene depth estimation, confidence assessment, and pose tracking with a monocular camera. The algorithm first collects focus-stack images of the scene from an initial viewing angle through the camera on the robot's mobile terminal. On the server side, the absolute depth of the scene is estimated, the confidence of the reconstructed point-cloud image is evaluated, and non-uniform sampling is performed to reduce the influence of erroneous estimates. Based on the sparse keyframe poses provided by monocular SLAM, the 3D reconstruction of the whole scene at absolute scale is achieved through multi-view point-cloud pose matching. The reconstructed scene cloud provides information for the target recognition and navigation of a medical service robot.
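The confidence-weighted non-uniform sampling step can be sketched as follows: points whose depth estimate has low confidence are drawn less often before being fused into the scene cloud. The confidence values, sample budget, and function name here are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def confidence_sample(points, confidence, n_samples):
    """Draw n_samples distinct points with probability proportional to confidence."""
    w = np.asarray(confidence, dtype=float)
    w = w / w.sum()                          # normalize to a distribution
    idx = rng.choice(len(points), size=n_samples, replace=False, p=w)
    return points[idx]

# Toy depth map flattened to 100 points: the first half has high-confidence
# depth estimates, the second half low-confidence ones.
pts = np.arange(100, dtype=float).reshape(100, 1)
conf = np.concatenate([np.full(50, 0.9), np.full(50, 0.1)])
sampled = confidence_sample(pts, conf, 20)
```

Most of the retained points come from the high-confidence half, so unreliable depth estimates contribute less to the fused point cloud.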


Author(s):  
Kai O. Arras ◽  
Boris Lau ◽  
Slawomir Grzonka ◽  
Matthias Luber ◽  
Oscar Martinez Mozos ◽  
...  

2013 ◽  
Vol 133 (1) ◽  
pp. 18-27 ◽  
Author(s):  
Hisato Fukuda ◽  
Satoshi Mori ◽  
Katsutoshi Sakata ◽  
Yoshinori Kobayashi ◽  
Yoshinori Kuno

2016 ◽  
Vol 136 (8) ◽  
pp. 1078-1084
Author(s):  
Shoichi Takei ◽  
Shuichi Akizuki ◽  
Manabu Hashimoto

2004 ◽  
Vol 61 (7-12) ◽  
pp. 875-893 ◽  
Author(s):  
I. A. Vyazmitinov ◽  
Ye. I. Myroshnychenko ◽  
O. V. Sytnik