Design and Analysis of New Haptic Joysticks for Enhancing Operational Skills in Excavator Control

2020 ◽  
Vol 142 (12) ◽  
Author(s):  
Meera C S ◽  
Pinisetti Swami Sairam ◽  
Vineeth Veeramalla ◽  
Adarsh Kumar ◽  
Mukul Kumar Gupta

Abstract The design perspective of interfaces has strong implications for operator intuition and safety. Haptics-enabled user interfaces can enhance operator skills and improve interactivity. In this paper, an innovative method of haptic feedback in joysticks is presented for excavator control. Haptic illusion in the device is generated using a variable stiffness actuation mechanism. The force feedback (FFB) is rendered through “haptic links,” based on the effect of the digging force at each joint. The stiffness in the device varies dynamically with the load and restricts the operator’s motion with a resistive torque in the range of 0–0.9 Nm. The haptic joystick aims to render high-fidelity kinesthetic feedback that can help mitigate operator error in loading operations. A user evaluation with the joystick showed a 40% improvement in the volume of material removed and a significant drop in error rates related to force patterns and collisions.
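The load-dependent resistive torque described in the abstract can be sketched as a simple stiffness mapping. Only the 0–0.9 Nm output range comes from the abstract; the linear mapping law and the normalized-load input are illustrative assumptions, since the paper's actual stiffness law is not given here.

```python
def resistive_torque(load_fraction: float, tau_max: float = 0.9) -> float:
    """Map a normalized digging load (0..1) to a resistive torque in Nm.

    The linear mapping is an assumption; the abstract specifies only
    the 0-0.9 Nm output range, not the stiffness law itself.
    """
    load_fraction = min(max(load_fraction, 0.0), 1.0)  # clamp to valid range
    return tau_max * load_fraction
```

With this clamp, any overload commanded by the dynamics saturates at the device's 0.9 Nm limit rather than exceeding it.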

Author(s):  
Ronak R. Mohanty ◽  
Vinayak R. Krishnamurthy

Abstract In this article, we report on our investigation of kinesthetic feedback as a means to provide precision, accuracy, and mitigation of arm fatigue in spatial manipulation tasks. Most works on spatial manipulation discuss the use of haptics (kinesthetic/force and tactile) primarily as a means to offer physical realism in spatial user interfaces (SUIs). Our work offers a new perspective in terms of how force-feedback can promote precise manipulations in spatial interactions to aid manual labor, controllability, and precision. To demonstrate this, we develop, implement, and evaluate three new haptics-enabled interaction techniques (kinesthetic metaphors) for precise rotation of 3D objects. The quantitative and qualitative analyses of experiments reveal that the addition of force-feedback improves precision for each of the rotation techniques. Self-reported user feedback further exposes a novel aspect of kinesthetic manipulation in its ability to mitigate arm fatigue for close-range spatial manipulation tasks.


2000 ◽  
Author(s):  
Michael L. Turner ◽  
Ryan P. Findley ◽  
Weston B. Griffin ◽  
Mark R. Cutkosky ◽  
Daniel H. Gomez

Abstract This paper describes the development of a system for dexterous telemanipulation and presents the results of tests involving simple manipulation tasks. The user wears an instrumented glove augmented with an arm-grounded haptic feedback apparatus. A linkage attached to the user’s wrist measures gross motions of the arm. The movements of the user are transferred to a two-fingered dexterous robot hand mounted on the end of a 4-DOF industrial robot arm. Forces measured at the robot fingers can be transmitted back to the user via the haptic feedback apparatus. The results obtained in block-stacking and object-rolling experiments indicate that the addition of force feedback did not improve the speed of task execution. In fact, in some cases the presence of incomplete force information was detrimental to performance speed compared with no force information. There are indications, however, that the presence of force feedback did aid task learning.


2018 ◽  
Vol 35 (2) ◽  
pp. 149-160 ◽  
Author(s):  
Mustufa H. Abidi ◽  
Abdulrahman M. Al-Ahmari ◽  
Ali Ahmad ◽  
Saber Darmoul ◽  
Wadea Ameen

Abstract The design and verification of assembly operations is essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress and has reached a stage where current environments enable rich, multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. This paper discusses the benefits of building and using Virtual Reality (VR) models for assembly process verification and presents the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system that enables stereoscopic visuals, surround sound, and rich, intuitive interaction with the developed models. A dedicated software architecture is proposed to describe the assembly parts and assembly sequence in VR, and a collision detection mechanism provides visual feedback for checking interference between components. The system is tested on virtual prototyping and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, including visual, auditory, tactile, and force feedback, and that it is effective and efficient for validating assembly design, part design, and operations planning.


2005 ◽  
Vol 128 (2) ◽  
pp. 216-226 ◽  
Author(s):  
M. A. Vitrani ◽  
J. Nikitczuk ◽  
G. Morel ◽  
C. Mavroidis ◽  
B. Weinberg

Force-feedback mechanisms have been designed to simplify and enhance the human-vehicle interface. The increase in secondary controls within vehicle cockpits has created a desire for a simpler, more efficient human-vehicle interface. By consolidating various controls into a single haptic feedback control device, information can be transmitted to the operator without requiring the driver’s visual attention. In this paper, experimental closed-loop torque control of electro-rheological fluid (ERF)-based resistive actuators for haptic applications is performed. ERFs are liquids that respond mechanically to electric fields by electroactively changing their properties, such as viscosity and shear stress. Using the electrically controlled rheological properties of ERFs, we developed resistive actuators for haptic devices that can resist human operator forces in a controlled and tunable fashion. In this study, the ERF resistive-actuator analytical model is derived and experimentally verified, and accurate closed-loop torque control is achieved experimentally using a non-linear proportional-integral controller with a feedforward loop.
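The closed-loop control scheme named in the abstract (a proportional-integral controller with a feedforward loop) can be sketched in discrete time as follows. The gains, the linear feedforward term, and the time step are all illustrative assumptions; the paper's controller is non-linear and identified from the ERF actuator model, which is not reproduced here.

```python
def pi_ff_step(tau_ref: float, tau_meas: float, state: dict,
               kp: float = 1.0, ki: float = 10.0,
               kff: float = 0.8, dt: float = 0.001) -> float:
    """One discrete step of a PI torque controller with feedforward.

    tau_ref:  desired resistive torque
    tau_meas: measured actuator torque
    state:    holds the running integral of the torque error

    Gains and the linear feedforward are placeholder assumptions,
    not the paper's identified non-linear controller.
    """
    err = tau_ref - tau_meas
    state["integral"] += err * dt          # accumulate integral term
    # feedforward anticipates the command; PI corrects the residual error
    return kff * tau_ref + kp * err + ki * state["integral"]
```

In a haptic loop this step would run at the servo rate, with the output mapped to the ERF excitation voltage through the (non-linear) actuator model.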


Author(s):  
Ronak R. Mohanty ◽  
Umema H. Bohari ◽  
Vinayak ◽  
Eric Ragan

We present haptics-enabled mid-air interactions for sketching collections of three-dimensional planar curves (3D curve-soups) as a means for 3D design conceptualization. Haptics-based mid-air interactions have been extensively studied for the modeling of surfaces and solids. The same is not true for modeling curves: there is little work that explores spatiality, tangibility, and kinesthetics for curve modeling from the perspective of 3D sketching for conceptualization. We study pen-based mid-air interactions for free-form curve input from the perspective of manual labor, controllability, and kinesthetic feedback. For this, we implemented a simple haptics-enabled workflow for users to draw and compose collections of planar curves on a force-enabled virtual canvas. We introduce a novel force-feedback metaphor for curve drawing and investigate three novel rotation techniques within our workflow for both controlled and free-form sketching tasks.


2019 ◽  
Vol 121 (4) ◽  
pp. 1398-1409 ◽  
Author(s):  
Vonne van Polanen ◽  
Robert Tibold ◽  
Atsuo Nuruki ◽  
Marco Davare

Lifting an object requires precise scaling of fingertip forces based on a prediction of object weight. At object contact, a series of tactile and visual events arise that must be rapidly processed online to fine-tune the planned motor commands for lifting the object. The brain mechanisms underlying serial multisensory integration at transient sensorimotor events, a general feature of actions requiring hand-object interactions, are not yet understood. In this study we tested the relative weighting between haptic and visual signals when they are integrated online into the motor command. We used a new virtual reality setup to desynchronize visual feedback from haptics, which allowed us to probe the relative contribution of haptics and vision in driving participants’ movements as they grasped virtual objects simulated by two force-feedback robots. We found that visual delay changed the profile of fingertip force generation and led participants to perceive objects as heavier than when lifts were performed without visual delay. We further modeled the effect of vision on motor output by manipulating the extent to which delayed visual events could bias the force profile, which allowed us to determine the specific weighting the brain assigns to haptics and vision. Our results show for the first time how visuo-haptic integration is processed at discrete sensorimotor events for controlling object-lifting dynamics and further highlight how multisensory signals are organized online for controlling action and perception. NEW & NOTEWORTHY Dexterous hand movements require rapid integration of information from different senses, in particular touch and vision, at key time points as movement unfolds. The relative weighting between vision and haptics for object manipulation is unknown. We used object lifting in virtual reality to desynchronize visual and haptic feedback and determine their relative weightings. Our findings shed light on how rapid multisensory integration is processed over a series of discrete sensorimotor control points.
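The relative weighting of haptic and visual cues that the study estimates is often formalized as reliability-weighted (inverse-variance) cue combination. The sketch below shows that standard model only; the study's actual time-resolved, event-level weighting model is more elaborate, and the variances here are hypothetical inputs.

```python
def fuse_estimates(haptic: float, var_h: float,
                   visual: float, var_v: float) -> float:
    """Reliability-weighted fusion of a haptic and a visual estimate.

    Each cue is weighted by its inverse variance (the standard
    maximum-likelihood cue-combination model), so the more reliable
    cue dominates the fused estimate.
    """
    w_h = (1.0 / var_h) / (1.0 / var_h + 1.0 / var_v)  # haptic weight
    return w_h * haptic + (1.0 - w_h) * visual
```

With equal variances the fused estimate is the simple mean; shrinking the haptic variance pulls the estimate toward the haptic cue, which is the kind of shift the desynchronization paradigm is designed to measure.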

