PHANToM haptic device implemented in a projection screen virtual environment

Author(s):  
A. Fischer ◽  
J. M. Vance


Author(s):  
Ryan A. Pavlik ◽  
Judy M. Vance ◽  
Greg R. Luecke

Ground-based haptic devices provide the capability of adding force feedback to virtual environments; however, the physical workspace of such devices is very limited due to the fixed base. By mounting a haptic device on a mobile robot, rather than a fixed stand, the reachable volume can be extended to function in full-scale virtual environments. This work presents the hardware, software, and integration developed to use such a mobile base with a Haption Virtuose™ 6D35-45. A mobile robot with a Mecanum-style omni-directional drive base and an Arduino-compatible microcontroller development board communicates with software on a host computer to provide a VRPN-based control and data acquisition interface. The position of the mobile robot in the physical space is tracked using an optical tracking system. The SPARTA virtual assembly software was extended to 1) apply transformations to the haptic device data based on the tracked base position, and 2) capture the error between the haptic device’s end effector and the center of its workspace and command the robot over VRPN to minimize this error. The completed system allows use of the haptic device in a wide area projection screen or head-mounted display virtual environment, providing smooth free-space motion and stiff display of forces to the user throughout the entire space. The availability of haptics in large immersive environments can contribute to future advances in virtual assembly planning, factory simulation, and other operations where haptics is an essential part of the simulation experience.
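The workspace-centering behavior described above, capturing the error between the end effector and the center of the device workspace and commanding the mobile base to minimize it, can be sketched as a simple proportional loop. This is a minimal illustration with hypothetical names, gains, and a deadband; the actual VRPN interface and SPARTA integration are not reproduced here.

```python
import numpy as np

def base_velocity_command(effector_pos_dev, workspace_center_dev,
                          base_yaw, gain=1.0, deadband=0.05):
    """Sketch of the workspace-centering loop: drive the mobile base so the
    haptic end effector stays near the center of the device workspace.
    Names, gains, and the deadband are illustrative, not from the paper."""
    # Error of the end effector from the workspace center, device frame.
    error = np.asarray(effector_pos_dev) - np.asarray(workspace_center_dev)
    # Ignore small errors so the base does not chatter during fine motion.
    if np.linalg.norm(error[:2]) < deadband:
        return np.zeros(2)
    # Rotate the planar error into the world frame using the tracked base
    # yaw, then command a proportional omni-directional (Mecanum) velocity.
    c, s = np.cos(base_yaw), np.sin(base_yaw)
    world_err = np.array([c * error[0] - s * error[1],
                          s * error[0] + c * error[1]])
    return gain * world_err
```

With the base velocity proportional to the centering error, the user perceives smooth free-space motion while the device is quietly recentered underneath the hand.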


2013 ◽  
Vol 2013 ◽  
pp. 1-15 ◽  
Author(s):  
Faraz Shah ◽  
Ilia G. Polushin

The paper deals with the design of control algorithms for a virtual-reality-based telerobotic system with haptic feedback that allows for the remote control of a vertical drilling operation. The human operator controls the vertical penetration velocity using a haptic device while simultaneously receiving haptic feedback from a locally implemented virtual environment. The virtual environment is rendered as a virtual spring whose stiffness is updated based on an estimate of the stiffness of the rock currently being cut. Based on existing mathematical models of drill string/drive systems and the rock cutting/penetration process, a robust servo controller is designed that guarantees tracking of the reference vertical penetration velocity of the drill bit. A scheme for on-line estimation of the rock's intrinsic specific energy is implemented. Simulations of the proposed control and parameter estimation algorithms have been conducted; subsequently, the overall telerobotic drilling system, with a human operator controlling the process using a PHANTOM Omni haptic device, is tested experimentally, with the drilling process simulated in real time in the virtual environment.
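The spring rendering described above can be sketched in a few lines: the locally simulated environment pushes back on the operator in proportion to penetration, with the stiffness tracking the on-line rock-stiffness estimate. Function names, the saturation limit, and gains are illustrative assumptions, not values from the paper.

```python
def virtual_spring_force(x_master, x_wall, k_hat, k_max=2000.0):
    """Render the local virtual environment as a one-DOF virtual spring whose
    stiffness follows the on-line rock-stiffness estimate k_hat.
    The saturation k_max (a made-up value) guards device stability."""
    k = min(max(k_hat, 0.0), k_max)      # clamp the estimate to a safe range
    penetration = x_master - x_wall      # positive once past the wall surface
    # Force opposes penetration; zero in free space.
    return -k * penetration if penetration > 0.0 else 0.0
```

Clamping the estimated stiffness before rendering is a common safeguard: a transiently large estimate would otherwise push the haptic loop toward instability.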


Author(s):  
Xiaowei Dai ◽  
Yuru Zhang ◽  
Dangxiao Wang

Maximum virtual stiffness is a critical performance measure for haptic devices. Stable haptic interaction is necessary for a realistic feeling of the virtual environment. The virtual environment is determined by the application and the device; to ensure stable haptic interaction, it must be suited to the device. Therefore, the virtual stiffness should not exceed the minimum, over the workspace, of the maximum virtual stiffness that the haptic device can stably render. This paper proposes a method, using the eigenvalues and eigenvectors of the stiffness matrix in joint space, to analyze and measure the distribution of maximum virtual stiffness over the workspace of a haptic device. For a given haptic device, the maximum virtual stiffness at each position and orientation can thus be predicted. A new sufficient condition for haptic stability, derived from the viewpoint of the drive motor, is also presented. A series of experiments validates the effectiveness of the method.
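The joint-space eigen-analysis mentioned above can be sketched as follows: an isotropic task-space stiffness maps into joint space through the Jacobian, and the eigen-decomposition of the resulting matrix exposes the stiffest and softest rendering directions at a given pose. This is a sketch of the general idea only; the authors' specific stability bound and measurement procedure are not reproduced.

```python
import numpy as np

def joint_space_stiffness_extrema(J, k_task):
    """Map an isotropic task-space stiffness k_task into joint space via the
    manipulator Jacobian J, then eigen-decompose the joint-space stiffness
    matrix. The smallest eigenvalue corresponds to the softest direction at
    this pose, which bounds what the device can stably render there."""
    K_joint = k_task * (J.T @ J)          # K_q = J^T K_x J with K_x = k_task * I
    eigvals, eigvecs = np.linalg.eigh(K_joint)
    return eigvals, eigvecs               # eigenvalues in ascending order
```

Evaluating this at sampled poses across the workspace yields the stiffness distribution the abstract refers to, with the workspace-wide minimum setting a safe upper bound for the virtual stiffness.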


Author(s):  
Norali Pernalete ◽  
Amar Raheja ◽  
Stephanie Carey

In this paper, we discuss the possibility of determining assessment metrics for eye-hand coordination and upper-limb disability therapy, using a mapping between a robotic haptic device and a virtual environment together with a training algorithm based on complex-valued neural networks that calculates how closely a given movement pattern matches that traced by a healthy individual. Most current robotic therapy systems rely on the patient's performance on standardized clinical tests such as the Functional Independence Measure (FIM) and the upper-limb subsection of the Fugl-Meyer (FM) scale; they lack other standardized assessment metrics. There is a need for a more intelligent, tailored therapy that patients could use at home between therapy sessions, or over the long term. Such therapy should be based on performance data gathered by the robotic/computer system, providing an assessment procedure with improved objectivity and precision. A set of complex, movement-demanding labyrinths representing various difficulty levels was developed in a virtual environment. Participants were instructed to use a haptic device (Omni) to follow the trajectories, while video data were collected using a Vicon motion capture system. Traced trajectories, completion times, and upper-limb motions were recorded for further analysis.
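To make the assessment idea concrete, a much simpler baseline than the complex-valued-neural-network approach described above is the RMS deviation between a patient's traced labyrinth path and a healthy reference path, after resampling both to a common length. This is a stand-in metric for illustration only, not the paper's method.

```python
import numpy as np

def rms_trajectory_deviation(patient_xy, reference_xy, n=100):
    """Baseline similarity score between a patient's traced path and a
    healthy reference path: RMS point-wise deviation after resampling both
    trajectories to n points. Illustrative stand-in, not the CVNN metric."""
    def resample(path):
        path = np.asarray(path, dtype=float)
        t = np.linspace(0.0, 1.0, len(path))       # original parameterization
        tn = np.linspace(0.0, 1.0, n)              # common parameterization
        return np.column_stack([np.interp(tn, t, path[:, k]) for k in range(2)])
    p, r = resample(patient_xy), resample(reference_xy)
    return float(np.sqrt(np.mean(np.sum((p - r) ** 2, axis=1))))
```

A score of zero means the patient reproduced the reference exactly; larger values quantify the deviation, giving the kind of objective, per-session number the abstract argues for.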


2021 ◽  
Author(s):  
Linda Feenstra ◽  
Umberto Scarcia ◽  
Riccardo Zanella ◽  
Roberto Meattini ◽  
Davide Chiaravalli ◽  
...  

2007 ◽  
Vol 4 (4) ◽  
pp. 157-168
Author(s):  
Juan Manuel Ibarra-Zannatha ◽  
Claudia Marmolejo-Rivas ◽  
Manuel Ferre-Pérez ◽  
Rafael Aracil-Santonja ◽  
Salvador Cobos-Guzmán

The aim of this work is the integration of a virtual environment containing a deformable object, manipulated by an open-kinematic-chain virtual slave robot, into a bilateral teleoperation scheme based on a real haptic device. The virtual environment of this hybrid bilateral teleoperation system combines collision detection algorithms and dynamic, kinematic, and geometric models with a position–position and/or force–position bilateral control algorithm to produce, on the operator side and through the haptic device, the reflected forces corresponding to the virtual mechanical interactions. A contact teleoperation task in the virtual environment with a flexible object is implemented and analysed.
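The position–position coupling mentioned above can be sketched as a virtual spring-damper between master and slave: each side sends its position to the other, the coupling force drives the virtual slave, and its reaction is reflected to the operator through the haptic device. Gains and the one-DOF simplification are illustrative assumptions, not values from the paper.

```python
def bilateral_pp_forces(x_master, x_slave, v_master=0.0, v_slave=0.0,
                        kp=200.0, kd=5.0):
    """One-DOF position-position bilateral coupling: a PD virtual
    spring-damper between the exchanged master and slave positions yields
    the slave drive force and the force reflected to the operator.
    Gains kp, kd are illustrative."""
    err = x_master - x_slave
    derr = v_master - v_slave
    f_slave = kp * err + kd * derr    # command driving the virtual slave
    f_master = -f_slave               # reaction reflected to the operator
    return f_master, f_slave
```

When the virtual slave contacts the deformable object and lags the master, the coupling error grows and the operator feels the corresponding reflected force.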


Author(s):  
Hugo I. Medellín-Castillo ◽  
Germánico González-Badillo ◽  
Eder Govea ◽  
Raquel Espinosa-Castañeda ◽  
Enrique Gallegos

Technological growth in recent years has led to the development of virtual reality (VR) systems able to immerse the user in a three-dimensional (3D) virtual environment where the user can interact in real time with virtual objects. This interaction is based mainly on visualizing the virtual environment and objects. However, with the recent advent of haptic systems, interaction with the virtual world has been extended to feeling, touching, and manipulating virtual objects. Virtual reality has been successfully used to develop applications in scientific areas ranging from the basic and social sciences to education and entertainment. Likewise, the use of haptics has increased in the last decade in domains from science and engineering to art and entertainment. Despite many developments, there is still relatively little knowledge about the confluence of software, enabling hardware, and visual and haptic representations needed to create an immersive sensory environment that conveys information about a particular subject domain. In this paper, the authors' recent research on virtual reality and haptic technologies is surveyed. The aim is to demonstrate the potential of these technologies for building usable analysis and simulation systems in different areas of knowledge. The development of three systems, in the areas of engineering, medicine, and art, is presented. In engineering, a system for the planning, evaluation, and training of assembly and manufacturing tasks has been developed. The system, named HAMS (Haptic Assembly and Manufacturing System), can simulate assembly tasks of complex components with force feedback provided by the haptic device. In medicine, a surgical simulator for planning and training orthognathic surgeries has been developed. The system, named VOSS (Virtual Osteotomy Simulator System), allows the realization of virtual osteotomies with force feedback. Finally, in art, an interactive cinema system for blind people has been developed; it plays a 3D virtual movie that the blind user can listen to and touch by means of the haptic device. The development of these applications and the results obtained are presented and discussed.


2007 ◽  
Author(s):  
Young-Min Han ◽  
Pil-Soon Kang ◽  
Min-Sang Seong ◽  
Seung-Bok Choi
