A Task-Learning Strategy for Robotic Assembly Tasks from Human Demonstrations

Sensors ◽  
2020 ◽  
Vol 20 (19) ◽  
pp. 5505
Author(s):  
Guanwen Ding ◽  
Yubin Liu ◽  
Xizhe Zang ◽  
Xuehe Zhang ◽  
Gangfeng Liu ◽  
...  

In manufacturing, traditional task pre-programming methods limit the efficiency of human–robot skill transfer. This paper proposes a novel task-learning strategy that enables robots to learn skills flexibly from human demonstrations and to generalize those skills to new task situations. Specifically, we establish a markerless vision capture system to acquire continuous human hand movements and develop a threshold-based heuristic segmentation algorithm that segments the complete movements into movement primitives (MPs), which encode human hand movements with task-oriented models. For movement primitive learning, we adopt a Gaussian mixture model with Gaussian mixture regression (GMM-GMR) to extract an optimal trajectory that preserves the salient human features, and we use dynamical movement primitives (DMPs) to learn and generalize the trajectory. In addition, we propose an improved visuo-spatial skill learning (VSL) algorithm to learn goal configurations, i.e., the spatial relationships between task-relevant objects. Only one multi-operation demonstration is required for learning, and the robot can generalize goal configurations to new task situations while following the task execution order from the demonstration. A series of peg-in-hole experiments shows that the proposed strategy obtains exact pick-and-place points and generates smooth, human-like trajectories, verifying its effectiveness.
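The GMM-GMR step described in the abstract can be sketched as follows: fit a joint Gaussian mixture over (time, position) samples pooled from several noisy demonstrations, then condition on time to recover a smooth mean trajectory. This is a minimal illustrative sketch with synthetic sine-wave demonstrations and a hand-rolled 1-D GMR conditioning step, not the authors' implementation; the DMP learning stage is omitted.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Five noisy synthetic demonstrations of a 1-D trajectory over normalized time
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
demos = [np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size) for _ in range(5)]
data = np.column_stack([np.tile(t, 5), np.concatenate(demos)])  # (500, 2) samples of (t, x)

# GMM: learn the joint density p(t, x) over all demonstration samples
gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0).fit(data)

def gmr(gmm, t_query):
    """GMR: condition the joint GMM p(t, x) on t to get E[x | t]."""
    means, covs, w = gmm.means_, gmm.covariances_, gmm.weights_
    out = np.zeros_like(t_query)
    for i, tq in enumerate(t_query):
        # Responsibility of each component for the input t (constants cancel)
        h = np.array([wk * np.exp(-0.5 * (tq - mk[0]) ** 2 / ck[0, 0]) / np.sqrt(ck[0, 0])
                      for wk, mk, ck in zip(w, means, covs)])
        h /= h.sum()
        # Conditional mean of x given t for each component
        mu = np.array([mk[1] + ck[1, 0] / ck[0, 0] * (tq - mk[0])
                       for mk, ck in zip(means, covs)])
        out[i] = h @ mu
    return out

mean_traj = gmr(gmm, t)  # smooth reference trajectory through the demonstrations
```

The resulting reference trajectory would then serve as the input that a DMP learns, so it can be replayed with new start and goal positions.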

2021 ◽  
Vol 15 ◽  
Author(s):  
Bingchen Liu ◽  
Li Jiang ◽  
Shaowei Fan ◽  
Jinghui Dai

The proposal of postural synergy theory has provided a new approach to controlling anthropomorphic hands with many degrees of freedom. However, generating grasp configurations for new tasks in this context remains challenging. This study proposes a method for learning a grasp configuration from the shape of the object by using postural synergy theory. Drawing on past research, an experimental paradigm is first designed that covers the grasping of 50 typical objects in grasping and operational tasks. The finger-joint angles of 10 subjects were recorded while they performed these tasks. Four hand primitives were then extracted by principal component analysis, establishing a low-dimensional synergy subspace; the problem of planning joint trajectories is thereby transformed into determining the synergy inputs for trajectory planning in this low-dimensional space. The average synergy-input trajectories for each task were obtained through Gaussian mixture regression, and several Gaussian processes were trained to infer the input trajectories for a given shape descriptor in similar tasks. Finally, the feasibility of the proposed method was verified by simulations generating grasp configurations for prosthetic hand control. The error of the reconstructed posture was compared with those obtained using postural synergies in past work. The results show that the proposed method can realize movements similar to those of the human hand during grasping, and that its range of use extends from simple grasping tasks to complex operational tasks.
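The synergy-extraction step can be sketched with PCA: joint-angle recordings are projected onto a few principal components (the "hand primitives"), and postures are reconstructed from that low-dimensional synergy subspace. The data here are synthetic (a hypothetical 15-joint hand driven by 4 latent coordination patterns), so the numbers are illustrative, not the study's results.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Hypothetical joint-angle recordings: 500 hand postures x 15 finger joints,
# generated from 4 underlying coordination patterns plus small sensor noise
latent = rng.standard_normal((500, 4))
mixing = rng.standard_normal((4, 15))
postures = latent @ mixing + 0.01 * rng.standard_normal((500, 15))

# Extract 4 "hand primitives" and the low-dimensional synergy subspace
pca = PCA(n_components=4).fit(postures)
synergy_inputs = pca.transform(postures)          # trajectory planning happens here
reconstructed = pca.inverse_transform(synergy_inputs)

err = np.mean((postures - reconstructed) ** 2)    # posture reconstruction error
var = pca.explained_variance_ratio_.sum()         # variance captured by 4 primitives
```

Planning in `synergy_inputs` space reduces a 15-dimensional joint-trajectory problem to a 4-dimensional one, which is the dimensionality reduction the abstract describes.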


Author(s):  
Zhou Ma ◽  
Pinhas Ben-Tzvi ◽  
Jerome Danoff

This paper presents the design and application of the SAFER glove to hand rehabilitation. The authors present preliminary results on a new hand-grasping rehabilitation learning system designed to gather kinematic and force information from the human hand and to play the motion back, assisting a user in common grasping movements such as grasping a bottle of water. Fingertip contact forces during grasping were measured by the SAFER glove on 12 subjects and modeled with a Gaussian mixture model (GMM), a machine-learning approach. The learned force distributions were then used to generate fingertip force trajectories with Gaussian mixture regression (GMR). To demonstrate the glove's potential to manipulate the hand, experiments were performed with the glove fitted on a wooden hand grasping various objects. Instead of defining a fixed grasping force, the contact-force trajectories were used to control the SAFER glove to actuate and assist this hand while carrying out a learned grasping task. To show that the hand can be driven safely by the haptic mechanism, readings from force sensors placed between each finger and the mechanism were plotted. The experimental results show the potential of the proposed system for future hand rehabilitation therapy.
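The force-modeling idea can be illustrated by fitting a GMM to (time, fingertip-force) samples pooled across trials: the mixture captures the distribution of measured forces, new plausible force profiles can be sampled from it, and conditioning on time (GMR) would yield the reference trajectory used to drive the glove. The ramp-and-hold profiles below are synthetic stand-ins for the 12 subjects' recordings.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Hypothetical fingertip-force recordings: force ramps up to a plateau while
# grasping an object (time normalized to [0, 1], force in newtons)
t = np.linspace(0, 1, 80)
profile = 4.0 / (1.0 + np.exp(-12 * (t - 0.3)))   # ramp-and-hold shape
trials = [profile + 0.15 * rng.standard_normal(t.size) for _ in range(12)]
data = np.column_stack([np.tile(t, 12), np.concatenate(trials)])

# Learn the joint density p(time, force) over all trials
gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0).fit(data)

# Sampling the learned density yields new plausible (time, force) pairs;
# conditioning on time instead (GMR) would give the mean force trajectory
samples, _ = gmm.sample(200)
```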


2016 ◽  
Vol 8 (5) ◽  
Author(s):  
Pinhas Ben-Tzvi ◽  
Jerome Danoff ◽  
Zhou Ma

This paper presents the design evolution of the sensing and force-feedback exoskeleton robotic (SAFER) glove with application to hand rehabilitation. The hand-grasping rehabilitation system is designed to gather kinematic and force information from the human hand and then play the motion back to assist a user in common grasping movements, such as grasping a bottle of water. Grasping experiments were conducted in which fingertip contact forces were measured by the SAFER glove. These forces were then modeled with a machine-learning approach to obtain learned contact-force distributions, from which fingertip force trajectories were generated with a Gaussian mixture regression (GMR) method. To demonstrate the glove's effectiveness in manipulating the hand, grasping experiments were performed on several objects. Instead of defining a fixed grasping force, the contact-force trajectories were used to control the SAFER glove to actuate a user's hand while carrying out a learned grasping task.


IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Giovanni Buroni ◽  
Bertrand Lebichot ◽  
Gianluca Bontempi

2021 ◽  
Vol 101 (3) ◽  
Author(s):  
Korbinian Nottensteiner ◽  
Arne Sachtler ◽  
Alin Albu-Schäffer

Robotic assembly tasks are typically implemented in static settings in which parts are kept at fixed locations by making use of part holders. Very few works deal with the problem of moving parts in industrial assembly applications. However, having autonomous robots that are able to execute assembly tasks in dynamic environments could lead to more flexible facilities with reduced implementation efforts for individual products. In this paper, we present a general approach towards autonomous robotic assembly that combines visual and intrinsic tactile sensing to continuously track parts within a single Bayesian framework. Based on this, it is possible to implement object-centric assembly skills that are guided by the estimated poses of the parts, including cases where occlusions block the vision system. In particular, we investigate the application of this approach for peg-in-hole assembly. A tilt-and-align strategy is implemented using a Cartesian impedance controller and combined with an adaptive path executor. Experimental results with multiple part combinations are provided and analyzed in detail.
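The core idea of fusing visual and tactile measurements in one Bayesian tracker can be illustrated with a deliberately simplified 1-D Kalman filter: both sensors update the same belief over a part's position, and when the camera is occluded the tactile cue alone keeps the estimate from drifting. The sensor noise values and the occlusion window are illustrative assumptions, not the paper's full 6-DoF framework.

```python
import numpy as np

def kf_update(mu, var, z, r):
    """Standard scalar Kalman measurement update."""
    k = var / (var + r)               # Kalman gain
    return mu + k * (z - mu), (1 - k) * var

mu, var = 0.0, 1.0                    # prior belief over the part's position
q = 1e-4                              # process noise: the part may be moved
r_vision, r_tactile = 0.01, 0.04      # assumed measurement noise variances

true_pos = 0.5
rng = np.random.default_rng(2)
for step in range(50):
    var += q                          # predict: uncertainty grows each step
    occluded = 20 <= step < 30        # e.g. the gripper blocks the camera
    if not occluded:
        z = true_pos + rng.normal(0, np.sqrt(r_vision))
        mu, var = kf_update(mu, var, z, r_vision)
    # intrinsic tactile/contact sensing remains available during occlusion
    z = true_pos + rng.normal(0, np.sqrt(r_tactile))
    mu, var = kf_update(mu, var, z, r_tactile)
```

Because both sensors feed the same belief, the pose estimate degrades gracefully rather than failing outright when one modality drops out, which is what lets object-centric skills run through occlusions.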


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 137
Author(s):  
Larisa Dunai ◽  
Martin Novak ◽  
Carmen García Espert

The present paper describes the development of a prosthetic hand based on human hand anatomy. The phalanges are 3D-printed in polylactic acid (PLA). One of the main contributions is the investigation of the prosthetic hand joints: the proposed design makes it possible to create personalized joints that give the prosthetic hand a high level of mobility by increasing the degrees of freedom of the fingers. Moreover, the wire-driven tendons produce a progressive grasping movement, with very low friction between the tendons and the phalanges. Another important point is the use of force-sensitive resistors (FSRs) to emulate the hand's touch pressure; these stop the grasp once the fingers' contact pressure is reached. Surface electromyogram (EMG) sensors allow the user to trigger the start of the grasp, and their use may also enable classification of hand movements. The practical results included in the paper demonstrate the importance of the soft joints for manipulating objects and adapting to the object surface. Finally, the force-sensitive sensors allow the prosthesis to actuate more naturally by adding conditions and classifications to the EMG signal.
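The EMG-start/FSR-stop control described above amounts to a small state machine: an EMG activation above a threshold starts finger closure, and the FSR contact force above a threshold stops it. The thresholds and signal values below are hypothetical, chosen only to make the logic concrete.

```python
# Hypothetical thresholds (normalized EMG activation; FSR contact force in N)
EMG_ON = 0.6      # EMG level that triggers the start of the grasp
FSR_STOP = 2.0    # contact force at which the fingers stop closing

def grasp_step(state, emg, fsr_force):
    """One control tick: open -> closing on EMG, closing -> holding on FSR."""
    if state == "open" and emg > EMG_ON:
        return "closing"
    if state == "closing" and fsr_force > FSR_STOP:
        return "holding"
    return state

# Replay a short sequence of (emg, fsr_force) sensor readings
state = "open"
for emg, fsr in [(0.1, 0.0), (0.8, 0.0), (0.8, 0.5), (0.8, 2.5), (0.2, 2.5)]:
    state = grasp_step(state, emg, fsr)
# closure is started by the EMG spike and stopped by the FSR contact force
```

Classifying richer EMG patterns, as the abstract suggests, would replace the single `EMG_ON` threshold with a classifier selecting among several grasp types.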

