Haptic Stylus and Empirical Studies on Braille, Button, and Texture Display

2008 ◽  
Vol 2008 ◽  
pp. 1-11 ◽  
Author(s):  
Ki-Uk Kyung ◽  
Jun-Young Lee ◽  
Junseok Park

This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module, together with empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance, indicating that it can satisfactorily represent Braille numbers for both sighted and blind users. To verify the haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button was conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of a pen-held haptic interface was replaced with the pen-like interface to add tactile feedback capability to the device. Since the system provides a combination of force, tactile, and impact feedback, three haptic representation methods for texture display were compared on surfaces with three texture groups differing in direction, groove width, and shape. In addition, we evaluate its capacity to support touch screen operations by providing tactile sensations as a user rubs the stylus against an image displayed on a monitor.
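As a minimal sketch of what a Braille-number display must encode, the mapping below gives the standard six-dot cell patterns for digits (which reuse the patterns of letters a-j) as the 3 × 2 pin matrix a tactile display module would raise. The function name and matrix layout are illustrative, not the authors' implementation.

```python
# Dot numbering in a Braille cell:  1 4
#                                   2 5
#                                   3 6
# Digits 1-9, 0 reuse the dot patterns of letters a-j.
DIGIT_DOTS = {
    "1": {1}, "2": {1, 2}, "3": {1, 4}, "4": {1, 4, 5}, "5": {1, 5},
    "6": {1, 2, 4}, "7": {1, 2, 4, 5}, "8": {1, 2, 5}, "9": {2, 4},
    "0": {2, 4, 5},
}

def braille_cell(digit: str) -> list[list[int]]:
    """Return a 3x2 pin matrix (1 = raised) for a single digit."""
    dots = DIGIT_DOTS[digit]
    return [[int(1 in dots), int(4 in dots)],
            [int(2 in dots), int(5 in dots)],
            [int(3 in dots), int(6 in dots)]]

print(braille_cell("4"))  # → [[1, 1], [0, 1], [0, 0]]
```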

2012 ◽  
Vol 21 (4) ◽  
pp. 435-451 ◽  
Author(s):  
Laura Santos-Carreras ◽  
Kaspar Leuenberger ◽  
Evren Samur ◽  
Roger Gassert ◽  
Hannes Bleuler

Robotic surgery provides many benefits, such as reduced invasiveness and increased dexterity, but at the cost of no direct contact between surgeon and patient. This physical separation prevents surgeons from performing direct haptic exploration of tissues and organs, imposing exclusive reliance on visual cues. Current technology is not yet able to both measure and reproduce a realistic and complete sense of touch (interaction force, temperature, roughness, etc.). In this paper, we put forward a multimodal feedback concept that integrates different kinds of visual and tactile cues with force feedback, which can potentially improve both the surgeon's performance and the patient's safety. We present a cost-effective tactile display simulating a pulsating artery, integrated into a haptic workstation to combine tactile and force-feedback information. Furthermore, we investigate the effect of different feedback types, including tactile and/or visual cues, on the performance of subjects carrying out two typical palpation tasks: (1) exploring a tissue to find a hidden artery and (2) identifying the orientation of a hidden artery. The results show that adding tactile feedback significantly reduces task completion time. Moreover, at high difficulty levels, subjects perform better with the feedback condition combining tactile and visual cues. Indeed, the majority of the subjects in the study preferred this combined feedback, because redundant feedback reassures subjects in their actions. Based on this work, we infer that multimodal haptic feedback improves subjects' performance and confidence during exploratory procedures.
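A pulsating-artery display needs a periodic drive signal. The sketch below generates one plausible waveform (a short raised-cosine pulse per heartbeat); the rate, duty cycle, and pulse shape are illustrative assumptions, not the authors' actual actuation signal.

```python
import math

def artery_signal(t: float, bpm: float = 72.0) -> float:
    """Normalized tactor drive amplitude in [0, 1] at time t (seconds).

    Emits a raised-cosine pulse occupying the first 30% of each beat,
    loosely mimicking the pressure pulse felt over an artery.
    """
    period = 60.0 / bpm
    phase = (t % period) / period          # position within the beat, 0..1
    if phase < 0.3:
        return 0.5 * (1 - math.cos(2 * math.pi * phase / 0.3))
    return 0.0

# One second of drive samples at 100 Hz
samples = [artery_signal(i * 0.01) for i in range(100)]
```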


2021 ◽  
Vol 33 (5) ◽  
pp. 1104-1116 ◽ 
Author(s):  
Yoshihiro Tanaka ◽  
Shogo Shiraki ◽  
Kazuki Katayama ◽  
Kouta Minamizawa ◽  
Domenico Prattichizzo ◽  
...  

Tactile sensations are crucial for achieving precise operations. A haptic connection between a human operator and a robot has the potential to promote smooth human-robot collaboration (HRC). In this study, we assemble a bilaterally shared haptic system for grasping operations, analogous to a human using both hands, demonstrated through a bottle cap-opening task. A robot arm controls its grasping force according to tactile information from the human, who opens the cap while wearing a finger-attached acceleration sensor. The grasping force of the robot arm is then fed back to the human through a wearable squeezing display. Three experiments are conducted: (1) measurement of the just noticeable difference of the tactile display; (2) a collaborative task with different bottles under two conditions, with and without tactile feedback, including psychological evaluation using a questionnaire; and (3) a collaborative task under an explicit strategy. The results showed that the tactile feedback gave operators confidence that the cooperative robot was adjusting its actions, and that it improved the stability of the task under the explicit strategy. These results indicate the effectiveness of tactile feedback and the need for an explicit operator strategy, providing insight into the design of HRC with bilaterally shared haptic perception.
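Just noticeable differences are commonly estimated with an adaptive staircase; the abstract does not specify the psychophysical procedure used, so the 1-up/1-down rule below is only a hedged illustration of how such a measurement is typically tracked.

```python
# Illustrative 1-up/1-down staircase: the stimulus increment shrinks
# while the participant keeps detecting it and grows when detection
# fails; the JND is estimated from the levels at response reversals.

def staircase(responses, start=1.0, step=0.1):
    """responses: iterable of bools, True = difference detected.
    Returns a JND estimate (mean level at reversal points)."""
    level, reversals, last = start, [], None
    for detected in responses:
        direction = -1 if detected else +1   # detected -> make it harder
        if last is not None and direction != last:
            reversals.append(level)          # a reversal occurred here
        level = max(0.0, level + direction * step)
        last = direction
    return sum(reversals) / len(reversals) if reversals else level
```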


Sensors ◽  
2020 ◽  
Vol 20 (17) ◽  
pp. 4780 ◽ 
Author(s):  
Oliver Ozioko ◽  
William Navaraj ◽  
Marion Hersh ◽  
Ravinder Dahiya

This paper presents a dual-function wearable device (Tacsac) with capacitive tactile sensing and integrated tactile feedback capability to enable communication among deafblind people. Tacsac has a skin contactor which enhances localized vibrotactile stimulation of the skin as a means of feedback to the user. It comprises two main modules, the touch-sensing module and the vibrotactile module, both stacked and integrated as a single device. The vibrotactile module is an electromagnetic actuator that employs a flexible coil and a permanent magnet assembled in soft poly(dimethylsiloxane) (PDMS), while the touch-sensing module is a planar capacitive metal-insulator-metal (MIM) structure. The flexible coil was fabricated on a 50 µm polyimide (PI) sheet using the Lithographie, Galvanoformung, Abformung (LIGA) micromoulding technique. The Tacsac device has been tested for independent sensing and actuation as well as a dual sensing-actuation mode. The measured vibration profiles of the actuator showed a synchronous response to external stimuli over a wide range of frequencies (10 Hz to 200 Hz), within the perceivable tactile frequency thresholds of the human hand. The resonance vibration frequency of the actuator is in the range of 60–70 Hz, with an observed maximum off-plane displacement of 0.377 mm at a coil current of 180 mA. The capacitive touch-sensitive layer responded to touch with minimal noise both when the actuator vibration was ON and when it was OFF. A mobile application was also developed to demonstrate the use of Tacsac for communication between a deafblind person wearing the device and a mobile phone user who is not deafblind. This advances existing tactile displays by providing efficient two-way communication through a single device for both localized haptic feedback and touch sensing.
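The sensing principle of a planar MIM structure can be summarized by the parallel-plate relation: pressing the stack compresses the dielectric gap d, so capacitance C = ε₀ε_r·A/d rises. The electrode area, gap, and permittivity below are back-of-the-envelope assumptions, not Tacsac's actual geometry.

```python
# Parallel-plate model of a capacitive MIM touch sensor. Touch reduces
# the dielectric gap, increasing capacitance; read-out electronics then
# detect that increase. All dimensions here are illustrative.

EPS0 = 8.854e-12  # F/m, vacuum permittivity

def mim_capacitance(area_m2: float, gap_m: float, eps_r: float) -> float:
    """Ideal parallel-plate capacitance in farads (fringing ignored)."""
    return EPS0 * eps_r * area_m2 / gap_m

# Hypothetical 5 mm x 5 mm electrode, 100 um PDMS-like dielectric (eps_r ~ 2.7)
c_rest = mim_capacitance(25e-6, 100e-6, 2.7)   # untouched
c_touch = mim_capacitance(25e-6, 90e-6, 2.7)   # gap compressed by 10%
```

With these numbers the rest capacitance lands in the low-picofarad range, which is the regime typical capacitance-to-digital front ends are built for.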


Author(s):  
Francesca Sorgini ◽  
Giuseppe Airò Farulla ◽  
Nikola Lukic ◽  
Ivan Danilov ◽  
Bozica Bojovic ◽  
...  

Research on bidirectional human-machine interfaces will enable smooth interaction with robotic platforms in contexts ranging from industry to tele-medicine and rescue. This paper introduces a bidirectional communication system to achieve multisensory telepresence during the gestural control of an industrial robotic arm. We complement the gesture-based control with a tactile-feedback strategy grounded in a spiking artificial neuron model. Force and motion from the robot are converted into neuromorphic haptic stimuli delivered to the user's hand through a vibro-tactile glove. Untrained personnel participated in an experimental task benchmarking a pick-and-place operation. The robot end-effector was used to sequentially press six buttons, illuminated according to a random sequence, and the task was executed with and without tactile feedback for comparison. The results demonstrated the reliability of the hand tracking strategy developed for controlling the robotic arm, and the effectiveness of a neuronal spiking model for encoding hand displacement and exerted forces so as to promote a fluid embodiment of the haptic interface and control strategy. The main contribution of this paper is in presenting a robotic arm under gesture-based remote control with multisensory telepresence, demonstrating for the first time that a spiking haptic interface can effectively deliver on the skin surface a sequence of stimuli emulating the neural code of the underlying mechanoreceptors.
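The general idea of a spiking encoder can be sketched with a leaky integrate-and-fire neuron: a larger force injects a larger input current, which drives the membrane to threshold more often and yields a higher spike rate. The abstract does not give the authors' neuron model or parameters, so everything below is an illustrative stand-in.

```python
# Leaky integrate-and-fire (LIF) sketch of force-to-spike encoding.
# Forward-Euler integration; gain, threshold, and time constant are
# assumed values chosen only so the example spikes at all.

def lif_spike_times(force, dt=0.001, t_end=0.5, tau=0.02,
                    v_thresh=1.0, gain=100.0):
    """Return spike times (s) for a constant force input (arbitrary units)."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_end:
        v += dt * (-v / tau + gain * force)   # leaky integration step
        if v >= v_thresh:
            spikes.append(t)
            v = 0.0                           # reset after a spike
        t += dt
    return spikes

# A stronger force should produce more spikes in the same window
weak, strong = lif_spike_times(2.0), lif_spike_times(8.0)
```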


2019 ◽  
Vol 16 (5) ◽  
pp. 172988141986318 ◽  
Author(s):  
Zhen Zhang ◽  
Xin Lu ◽  
Yoshihiro Hagihara ◽  
Adiljan Yimit

This article presents a virtual tactile display that uses a shape-displaying method with flexible tendon-driven transmission to enhance performance. Sixteen tactors move perpendicularly in a 4 × 4 module to render the local shape of a virtual object to the skin of the user's fingertip. We detail the display structure design and the transmission system, and we combine the compact design of the drive unit and tactor module with a flexible tendon-driven transmission to address the ergonomic constraints of previous devices and make the display more suitable for tactile feedback. In this work, we integrate the display with a Leap Motion controller and a ray detection rendering method to generate tactile feedback. To evaluate performance, we perform a virtual touch experiment that assesses how well the display renders the surfaces of three-dimensional objects, helping participants match the tactile sensation with visual stimuli in the virtual scene. Results show that the display improves the user experience and has good feasibility and effectiveness. In addition, the portable structure allows the user's hand to move freely without redundant restrictions, and the larger tactor amplitude provides more shape patterns than previous models.
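The rendering step can be pictured as sampling the virtual surface under the fingertip at a 4 × 4 grid and clipping each local height to the tactor's travel range. The grid size matches the paper's module, but the depth-to-stroke mapping, travel range, and example surface are illustrative assumptions.

```python
# Sketch of shape rendering for a 4x4 tactor module: sample the local
# surface height at each tactor position (e.g. from ray casts into the
# virtual scene) and clip to the mechanically available stroke.

def render_tactors(height_at, max_stroke_mm=2.0):
    """height_at(row, col) -> local surface height in mm.
    Returns a 4x4 matrix of tactor strokes clipped to [0, max_stroke_mm]."""
    return [[min(max(height_at(r, c), 0.0), max_stroke_mm)
             for c in range(4)] for r in range(4)]

# Example: a ridge running along column 2 of the contact patch
ridge = render_tactors(lambda r, c: 1.5 if c == 2 else 0.0)
```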


Author(s):  
Andreas M. Kunz ◽  
Adrian Burri

Abstract Virtual Reality is becoming increasingly important in the product development process. It enables the engineer to spot constraints or mistakes in a product design at a very early stage by viewing the digital geometric prototype. Besides viewing the design of a product, additional functionalities such as assembly simulation, the physically correct behavior of a machine, or machine control are coming into focus. The interaction modality of haptic feedback is therefore gaining importance for simulation tasks in virtual environments. However, there are only a few portable haptic interfaces with which the user can experience the sensation of force feedback in a natural way. The scope of this paper is to present a new passive haptic interface that is lightweight and easy to use. Furthermore, it imposes no workspace constraints and applies high forces to the user's fingertips by blocking the natural grasping motion.


2010 ◽  
Vol 19 (5) ◽  
pp. 400-414 ◽  
Author(s):  
Andreas Tobergte

This paper presents MiroSurge, a telepresence system for minimally invasive surgery developed at the German Aerospace Center (DLR), and introduces MiroSurge's new user interaction modalities: (1) haptic feedback with software-based preservation of the fulcrum point, (2) an ultrasound-based approach to the quasi-tactile detection of pulsating vessels, and (3) a contact-free interface between surgeon and telesurgery system, where stereo vision is augmented with force vectors at the tool tip. All interaction modalities aim to extend the user's perception beyond stereo imaging, either by augmenting the images or by using haptic interfaces. MiroSurge currently provides surgeons with two different interfaces. The first option, bimanual haptic interaction with force and partial tactile feedback, allows direct perception of the remote environment. Alternatively, users can choose to control the surgical instruments with optically tracked forceps held in their hands. Force feedback is then provided in augmented stereo images by constantly updated force vectors displayed at the centers of the teleoperated instruments, regardless of the instruments' position within the video image. To determine the center points of the instruments, artificial markers are attached and optically tracked. A new approach to detecting pulsating vessels beneath covering tissue with an omnidirectional ultrasound Doppler sensor is presented. The measurement results can be presented acoustically (by playing the typical Doppler sound), optically (by augmenting the endoscopic video stream), or kinesthetically (by a gentle twitching of the haptic input devices). The control structure preserves the fulcrum point required in minimally invasive surgery while the surgical instrument follows the user's commands. Haptic feedback allows the user to distinguish between interaction with soft and hard environments.
The paper includes technical evaluations of the features presented, as well as an overview of the system integration of MiroSurge.
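The physics behind ultrasound detection of pulsating vessels is the classic Doppler relation f_d = 2·f₀·v·cos(θ)/c. The probe frequency, flow velocity, and tissue sound speed below are textbook-style values, not MiroSurge's actual sensor parameters.

```python
import math

# Doppler shift produced by blood moving at velocity v, insonated at
# angle theta by a probe of frequency f0; c is the speed of sound in
# tissue. The pulsatile variation of this shift is what reveals an
# artery beneath covering tissue.

def doppler_shift_hz(f0_hz, v_mps, theta_rad, c_mps=1540.0):
    """Frequency shift (Hz) for flow velocity v at insonation angle theta."""
    return 2.0 * f0_hz * v_mps * math.cos(theta_rad) / c_mps

# Assumed example: 5 MHz probe, 0.3 m/s arterial flow, 60 degree angle
shift = doppler_shift_hz(5e6, 0.3, math.radians(60.0))
```

Conveniently, shifts for typical arterial flow land in the audible range, which is why the raw Doppler signal can simply be played back as sound.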


2000 ◽  
Author(s):  
Michael L. Turner ◽  
Ryan P. Findley ◽  
Weston B. Griffin ◽  
Mark R. Cutkosky ◽  
Daniel H. Gomez

Abstract This paper describes the development of a system for dexterous telemanipulation and presents the results of tests involving simple manipulation tasks. The user wears an instrumented glove augmented with an arm-grounded haptic feedback apparatus. A linkage attached to the user's wrist measures gross motions of the arm. The movements of the user are transferred to a two-fingered dexterous robot hand mounted on the end of a 4-DOF industrial robot arm. Forces measured at the robot fingers can be transmitted back to the user via the haptic feedback apparatus. The results obtained in block-stacking and object-rolling experiments indicate that the addition of force feedback did not improve the speed of task execution. In fact, in some cases the presence of incomplete force information was detrimental to performance speed compared with no force information. There are indications, however, that the presence of force feedback did aid task learning.


2018 ◽  
Vol 35 (2) ◽  
pp. 149-160 ◽  
Author(s):  
Mustufa H. Abidi ◽  
Abdulrahman M. Al-Ahmari ◽  
Ali Ahmad ◽  
Saber Darmoul ◽  
Wadea Ameen

Abstract The design and verification of assembly operations is essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress, reaching a stage where current environments enable rich, multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. This paper discusses the benefits of building and using Virtual Reality (VR) models in assembly process verification and presents the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system that enables stereoscopic visuals, surround sound, and rich, intuitive interaction with the developed models. A special software architecture is suggested to describe the assembly parts and assembly sequence in VR. A collision detection mechanism provides visual feedback to check the interference between components. The system is tested on the virtual prototyping and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, and tactile as well as force feedback, and that it is effective and efficient for validating assembly design, part design, and operations planning.
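The kind of interference check such a collision detection mechanism runs can be sketched with an axis-aligned bounding box (AABB) test, the usual broad-phase step before finer mesh-level checks. The boxes and part names below are hypothetical; the paper does not describe its collision algorithm in this abstract.

```python
# Broad-phase interference test between two assembly parts using
# axis-aligned bounding boxes: the boxes intersect iff their extents
# overlap on all three axes. A hit would trigger the visual feedback
# described in the paper; real systems then refine with mesh-level tests.

def aabb_overlap(a_min, a_max, b_min, b_max):
    """Each argument is an (x, y, z) tuple; True if the boxes intersect."""
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i]
               for i in range(3))

# Hypothetical example: a blade root overlapping its disc slot vs. clear of it
collides = aabb_overlap((0, 0, 0), (2, 2, 2), (1, 1, 1), (3, 3, 3))
clear = aabb_overlap((0, 0, 0), (1, 1, 1), (2, 2, 2), (3, 3, 3))
```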

