A Fuzzy Deep Neural Network with Sparse Autoencoder for Emotional Intention Understanding in Human-Robot Interaction

Author(s):  
Luefeng Chen ◽  
Wanjuan Su ◽  
Min Wu ◽  
Witold Pedrycz ◽  
Kaoru Hirota
2021 ◽  
Vol 8 ◽  
Author(s):  
Lei Shi ◽  
Cosmin Copot ◽  
Steve Vanlanduit

Safety is an important issue in human–robot interaction (HRI) applications. Various research works have focused on different levels of safety in HRI. If a human/obstacle is detected, a repulsive action can be taken to avoid the collision. Common repulsive actions include distance methods, potential field methods, and safety field methods. Machine-learning-based approaches to selecting the repulsive action are less explored. Few research works address the uncertainty of data-based approaches or consider the efficiency of the executing task during collision avoidance. In this study, we describe a system that can avoid collision with human hands while the robot is executing an image-based visual servoing (IBVS) task. We use Monte Carlo dropout (MC dropout) to transform a deep neural network (DNN) into a Bayesian DNN and learn the repulsive position for hand avoidance. The Bayesian DNN allows IBVS to converge faster than with the opposite repulsive pose. Furthermore, it allows the robot to avoid undesired poses that a plain DNN cannot. The experimental results show that the Bayesian DNN has adequate accuracy and generalizes well to unseen data. The prediction interval coverage probabilities (PICP) of the predictions along the x, y, and z directions are 0.84, 0.94, and 0.95, respectively. In regions of the workspace unseen in the training data, the Bayesian DNN is also more robust than a plain DNN. We further implement the system on a UR10 robot and test the robustness of the Bayesian DNN and the IBVS convergence speed. Results show that the Bayesian DNN can avoid poses outside the robot's reach range and lets the IBVS task converge faster than with the opposite repulsive pose.
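As a rough illustration of the MC-dropout idea behind the Bayesian DNN (a toy numpy sketch with untrained random weights, not the paper's network), keeping dropout active at test time and averaging several stochastic forward passes yields a predictive mean plus a per-output uncertainty estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in weights for a trained 2-layer regression network
# (shapes and values are illustrative).
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 3))  # 3 outputs: repulsive position x, y, z

def forward(x, drop_p=0.2):
    """One stochastic forward pass; dropout stays active at test time."""
    h = np.maximum(x @ W1, 0.0)               # ReLU hidden layer
    mask = rng.random(h.shape) > drop_p       # Bernoulli dropout mask
    h = h * mask / (1.0 - drop_p)             # inverted-dropout scaling
    return h @ W2

def mc_dropout_predict(x, n_samples=100):
    """MC dropout: mean of T stochastic passes is the prediction,
    per-output standard deviation is the uncertainty estimate."""
    samples = np.stack([forward(x) for _ in range(n_samples)])
    return samples.mean(axis=0), samples.std(axis=0)

x = rng.normal(size=(1, 4))
mean, std = mc_dropout_predict(x)
```

In the paper's setting, the per-output standard deviation is what makes interval measures such as PICP computable for the x, y, and z predictions.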


Sensors ◽  
2021 ◽  
Vol 21 (19) ◽  
pp. 6674
Author(s):  
Wookyong Kwon ◽  
Yongsik Jin ◽  
Sang Jun Lee

Human-robot interaction has received a lot of attention as collaborative robots have become widely utilized in many industrial fields. Among techniques for human-robot interaction, collision identification is an indispensable element of collaborative robots for preventing fatal accidents. This paper proposes a deep learning method for identifying external collisions in 6-DoF articulated robots. The proposed method extends the idea of CollisionNet, previously proposed for collision detection, to identify the locations of external forces. The key contribution of this paper is uncertainty-aware knowledge distillation for improving the accuracy of a deep neural network: sample-level uncertainties are estimated from a teacher network, and larger penalties are imposed on uncertain samples during the training of a student network. Experiments demonstrate that the proposed method is effective in improving the performance of collision identification.
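A minimal sketch of how sample-level teacher uncertainty could impose larger penalties on uncertain samples in a distillation loss (the linear weighting here is a hypothetical choice; the paper's exact formulation is not given in the abstract):

```python
import numpy as np

def distillation_loss(student_logits, teacher_logits, teacher_std, alpha=1.0):
    """Uncertainty-weighted distillation sketch: each sample's squared
    error against the teacher is scaled up by the teacher's per-sample
    uncertainty (e.g. the std over MC passes of the teacher)."""
    per_sample = np.mean((student_logits - teacher_logits) ** 2, axis=1)
    weights = 1.0 + alpha * teacher_std   # larger penalty when uncertain
    return np.mean(weights * per_sample)
```

With `alpha=0` this reduces to a plain mean-squared distillation loss; increasing `alpha` makes the student fit the teacher's uncertain samples more aggressively.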


2019 ◽  
Vol 5 (4) ◽  
pp. 1279-1293 ◽  
Author(s):  
Jiawei Liu ◽  
Qi Li ◽  
Ying Han ◽  
Guorui Zhang ◽  
Xiang Meng ◽  
...  

Author(s):  
Akimul Prince ◽  
Biswanath Samanta

The paper presents a control approach based on vertebrate neuromodulation and its implementation on an autonomous robot platform. A simple neural network is used to model the neuromodulatory function of generating context-based behavioral responses to sensory signals. The network incorporates three types of neurons: cholinergic and noradrenergic (ACh/NE) neurons for attention focusing and action selection, dopaminergic (DA) neurons for curiosity-seeking, and serotonergic (5-HT) neurons for risk-averse behavior. The implementation of the neuronal model on a relatively simple autonomous robot illustrates its interesting behavior in adapting to changes in the environment. The integration of neuromodulation-based robots into the study of human-robot interaction would be worth considering in the future.
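The three neuron types could be caricatured as competing drives in an action-selection rule; this toy sketch (signals, weights, and the linear combination are invented for illustration, not the paper's model) picks the stimulus with the highest net neuromodulatory drive:

```python
import numpy as np

def select_behavior(salience, novelty, threat,
                    w_ach=1.0, w_da=0.8, w_5ht=1.2):
    """Toy neuromodulatory action selection: ACh/NE focuses on salient
    stimuli, DA adds a curiosity drive toward novel ones, and 5-HT
    suppresses risky ones. Returns the index of the winning stimulus."""
    drive = w_ach * salience + w_da * novelty - w_5ht * threat
    return int(np.argmax(drive))
```

For example, a salient, safe stimulus wins over a novel but threatening one under these illustrative weights.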


Author(s):  
Adhau P ◽  
Kadwane S. G ◽  
Shital Telrandhe ◽  
Rajguru V. S ◽  
...  

Human-robot interaction has long been a topic of research owing to its potential to help humanity. Robust human-interacting robots commanded by electromyogram (EMG) signals have recently been investigated. This article presents a system in which signals recorded directly from the human body are used to control a small robotic arm. Gestures are captured by electrodes or sensors placed on the human hand and identified by a neural network trained on the recorded signals. Offline control of the arm is achieved by driving the motors of the robotic arm.
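A pipeline of this kind typically windows the EMG signal, extracts features, and classifies the gesture; the sketch below uses RMS features and a nearest-centroid classifier as a simple stand-in for the paper's neural network (all names and design choices are illustrative):

```python
import numpy as np

def rms_features(emg_window):
    """Root-mean-square per channel, a common EMG feature (illustrative)."""
    return np.sqrt(np.mean(emg_window ** 2, axis=0))

class CentroidGestureClassifier:
    """Minimal stand-in for a trained gesture classifier: assign each
    feature vector to the nearest class centroid."""

    def fit(self, X, y):
        self.labels = np.unique(y)
        self.centroids = np.stack([X[y == c].mean(axis=0)
                                   for c in self.labels])
        return self

    def predict(self, X):
        # Euclidean distance of each sample to each class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids[None], axis=2)
        return self.labels[np.argmin(d, axis=1)]
```

In an offline setting, each predicted gesture label would then be mapped to a motor command for the robotic arm.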


2018 ◽  
Vol 311 ◽  
pp. 1-10 ◽  
Author(s):  
Lin Xu ◽  
Maoyong Cao ◽  
Baoye Song ◽  
Jiansheng Zhang ◽  
Yurong Liu ◽  
...  
