Smart Assistive Architecture for the Integration of IoT Devices, Robotic Systems, and Multimodal Interfaces in Healthcare Environments

Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2212
Author(s):  
Alberto Brunete ◽  
Ernesto Gambao ◽  
Miguel Hernando ◽  
Raquel Cedazo

This paper presents a new architecture that integrates Internet of Things (IoT) devices, service robots, and users in a smart assistive environment. A new intuitive and multimodal interaction system supporting people with disabilities and bedbound patients is presented. The interaction system allows the user to control service robots and devices inside the room in five different ways: touch control, eye control, gesture control, voice control, and augmented reality control. It comprises an assistive robotic arm holding a tablet PC, which the arm can place in front of the user. A demonstration of the developed technology, a prototype of a smart room equipped with home automation devices, and the robotic assistive arm are presented. The article reports the results obtained from use of the various interfaces and technologies, including user preferences regarding eye-based control (performing clicks with winks or gaze) and the preference for mobile phones over augmented reality glasses, among others.

Author(s):  
Dharshan Y. ◽  
Vivek S. ◽  
Saranya S. ◽  
Aarthi V.R. ◽  
Madhumathi T.

<div><p><em>Robots have become a key technology in many fields. Robotic arms are mostly remote-controlled through buttons or panels, and in batch processes they are sometimes autonomous. Panel boards and control sticks require extensive hardwiring, are subject to malfunction, and place stress on the operators. Major chemical industries such as cosmetics manufacturing, paint manufacturing, and biosynthesis laboratories deal with hazardous environments due to chemicals and other bio-substances, yet still involve humans in the processing. The aim is to reduce the bulk of wiring in robotic arms and to reduce the effort and the number of operators needed to control robotic-arm operations. Implementing gesture control in these processes would be a major breakthrough. The system can also be used as a pick &amp; place robot, or as a cleaning robot in chemical industries so that a human need not be directly involved in cleaning chemicals or coating underground tanks.</em></p></div>


Author(s):  
Shriya A. Hande ◽  
Nitin R. Chopde

<p>In today’s world, in almost all sectors, much of the work is done by robots or robotic arms with different numbers of degrees of freedom (DOFs) as per the requirement. This project deals with the design and implementation of a “Wireless Gesture Controlled Robotic Arm with Vision”. The system design is divided into three parts: the accelerometer unit, the robotic arm, and the mobile platform. It is fundamentally an accelerometer-based system that controls a robotic arm wirelessly over RF signals using a small, low-cost, 3-axis accelerometer. The robotic arm is mounted on a mobile platform that is likewise controlled wirelessly by a second accelerometer. One accelerometer is mounted on the operator’s hand, capturing its gestures and postures so that the robotic arm moves accordingly; the other is mounted on one of the operator’s legs, capturing its gestures and postures so that the platform moves accordingly. In a nutshell, the robotic arm and the platform are synchronised with the gestures and postures of the user’s hand and leg, respectively. The motions performed by the robotic arm are PICK and PLACE/DROP and RAISING and LOWERING objects; the motions performed by the platform are FORWARD, BACKWARD, RIGHT, and LEFT.</p>
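The accelerometer-to-command mapping described above can be sketched as a simple thresholding of tilt readings. This is a minimal illustration, not the authors' implementation; the function names, axis assignments, and threshold value are assumptions.

```python
# Hypothetical sketch: threshold 3-axis accelerometer tilt (in g) into the
# discrete arm and platform commands listed in the abstract.
def arm_command(ax, ay, az, threshold=0.5):
    """Map the hand-mounted accelerometer to an arm motion command."""
    if ay > threshold:
        return "RAISE"
    if ay < -threshold:
        return "LOWER"
    if ax > threshold:
        return "PICK"
    if ax < -threshold:
        return "PLACE"
    return "HOLD"

def platform_command(ax, ay, threshold=0.5):
    """Map the leg-mounted accelerometer to a platform motion command."""
    if ay > threshold:
        return "FORWARD"
    if ay < -threshold:
        return "BACKWARD"
    if ax > threshold:
        return "RIGHT"
    if ax < -threshold:
        return "LEFT"
    return "STOP"
```

In a real system the raw readings would first be low-pass filtered and the resulting command transmitted over the RF link.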


2020 ◽  
Vol 305 ◽  
pp. 00022
Author(s):  
Marius Leonard Olar ◽  
Marius Risteiu ◽  
Arun Fabian Panaite ◽  
Mihai Rebrisoreanu ◽  
Oliviu Musetoiu

When a patient with an upper-limb disability is aided by a robotic arm with faulty controls, remote assistance is needed, with augmented reality as the auxiliary channel. Our system uses a headset and an internet connection, with an augmented reality device worn on the assistant’s head, to ensure communication between the two, for both remote supervision and control. The assistant can take over control of the robotic arm while a head-up display on the augmented reality glasses shows what the patient sees. Communication is established through a PC or mobile devices connected to the internet. Having the patient’s view and enhanced control over the robotic arm, the assistant can also interact with nearby smart objects.


Author(s):  
Khairul Salleh Mohamed Sahari ◽  
Yew Cheong Hou

This paper presents a mass-spring model applied to the manipulation of an elastic deformable object for home service robot applications. A system is also proposed for folding a piece of rectangular cloth from a specific initial condition using a robot. The cloth is modeled as a three-dimensional object on a two-dimensional quadrangular mesh based on a mass-spring system, and its state is estimated using an explicit integration scheme that computes each particle position as a function of the internal and external forces acting on the elastic deformable object. The current state of the object under robot manipulation is tracked from the trajectories of the mass points in the mass-spring model in a self-developed simulator, which integrates the mass-spring model with a five-degree-of-freedom articulated robotic arm. To test the reliability of the model, the simulator is used to predict the best possible paths for the robotic arm to fold a rectangular cloth in two. In the test, the state of the object derived from the model is compared with the results of a practical experiment, and the error is found to be generally acceptable. Thus, this model can be used as an estimator for vision-based tracking of the state of an elastic deformable object manipulated by home service robots.
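The explicit integration scheme mentioned above can be sketched as follows: spring forces (Hooke's law along each edge of the mesh) plus gravity drive a semi-implicit Euler update of particle velocities and positions. This is a minimal sketch, not the authors' simulator; the masses, stiffness, damping factor, and time step are illustrative assumptions.

```python
import numpy as np

# Minimal mass-spring integration step: each particle feels gravity plus
# Hooke's-law forces from its incident springs, then velocities and
# positions are updated with semi-implicit Euler.
def step(pos, vel, springs, rest_len, mass=0.01, k=50.0, damping=0.99,
         gravity=np.array([0.0, 0.0, -9.81]), dt=1e-3):
    """pos, vel: (N, 3) arrays; springs: list of (i, j) particle-index pairs;
    rest_len: rest length of each spring. Returns updated (pos, vel)."""
    force = np.tile(mass * gravity, (len(pos), 1))   # external force: gravity
    for (i, j), l0 in zip(springs, rest_len):
        d = pos[j] - pos[i]
        length = np.linalg.norm(d)
        f = k * (length - l0) * d / length           # spring force along edge
        force[i] += f
        force[j] -= f
    vel = damping * (vel + dt * force / mass)        # damped velocity update
    return pos + dt * vel, vel
```

Cloth behavior then emerges from iterating this step over the quadrangular mesh, with grasped particles pinned to the robot gripper's trajectory instead of being integrated.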


Author(s):  
Yassine Bouteraa ◽  
Ismail Ben Abdallah

Purpose – The idea is to exploit the natural stability and performance of the human arm during movement, execution and manipulation. The purpose of this paper is to remotely control a handling robot with a low-cost but effective solution.

Design/methodology/approach – The developed approach is based on three different techniques to ensure movement and pattern recognition of the operator’s arm as well as effective control of the object-manipulation task. First, the methodology relies on Kinect-based gesture recognition of the operator’s arm. However, a vision-based approach alone is not suitable for hand-posture recognition, mainly when the hand is occluded. The proposed approach therefore supports the vision-based system with an electromyography (EMG)-based biofeedback system for posture recognition. Moreover, the approach adds force feedback to the vision-based gesture control and the EMG-based posture recognition to inform the operator of the real grasping state.

Findings – The main finding is a robust method for gesture-based control of a robot manipulator during movement, manipulation and grasping. The proposed approach uses a real-time gesture-control technique based on a Kinect camera that provides the exact position of each joint of the operator’s arm. The developed solution also integrates EMG biofeedback and force feedback in its control loop. In addition, the authors propose a highly user-friendly human–machine interface (HMI) that allows the user to control a robotic arm in real time. The robust trajectory-tracking challenge has been solved by implementing a sliding-mode controller, and a fuzzy logic controller manages the grasping task based on the EMG signal. Experimental results have shown the high efficiency of the proposed approach.

Research limitations/implications – There are some constraints when applying the proposed method, such as the sensitivity of the desired trajectory generated by the human arm to random and unwanted movements, which can damage the manipulated object during teleoperation. In such cases, operator skill is highly required.

Practical implications – The developed control approach can be used in all applications that require real-time human-robot cooperation.

Originality/value – The main advantage of the developed approach is that it combines three techniques: EMG biofeedback, a vision-based system and haptic feedback. Vision-based approaches alone are not effective for hand-posture recognition, so recognition should also draw on the biofeedback naturally generated by the muscles responsible for each posture. Moreover, using a force sensor in a closed-loop control scheme without operator intervention is ineffective when the manipulated objects vary over a wide range with different metallic characteristics; the human-in-the-loop technique therefore imitates natural human postures in the grasping task.
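The sliding-mode trajectory-tracking idea mentioned in the findings can be sketched for a single joint: a sliding surface combines position and velocity error, and a saturated switching term drives that surface to zero. This is an illustrative one-DOF sketch under assumed gains, not the paper's controller.

```python
# Illustrative 1-DOF sliding-mode tracking law. The sliding surface
# s = e_dot + lam * e combines tracking error and its derivative; a
# saturated switching term (boundary layer of width phi) limits chattering.
def smc_torque(q, q_dot, q_des, q_des_dot, lam=5.0, K=2.0, phi=0.05):
    """Return a control torque driving joint angle q toward q_des."""
    e = q - q_des                            # position error
    e_dot = q_dot - q_des_dot                # velocity error
    s = e_dot + lam * e                      # sliding surface
    sat = max(-1.0, min(1.0, s / phi))       # saturated sign(s)
    return -lam * e_dot - K * sat            # equivalent + switching terms
```

The gains `lam`, `K`, and the boundary-layer width `phi` would in practice be tuned to the arm's dynamics; in the paper's full scheme, a separate fuzzy controller handles the EMG-driven grasping.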

