Creating Freeform Model by Carving Virtual Workpiece With Haptic Interface

Author(s):  
Ming C. Leu ◽  
Aditya Velivelli ◽  
Xiaobo Peng

This paper presents the development of a virtual sculpting system whose goal is to enable the user to create a freeform model by carving a virtual workpiece with a virtual tool while providing a haptic interface during the sculpting process. A virtual reality approach is taken to provide stereoscopic viewing and force feedback, making model creation in the virtual environment easier and more intuitive. The development of this system involves integrating techniques and algorithms from geometric modeling, computer graphics, and haptic rendering. Multithreading is used to handle the different update rates required by the graphic and haptic displays.
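The multithreaded split between a fast haptic loop and a slower graphic loop can be sketched as follows. The update rates, the unit-sphere penalty-force stub, and the class structure are illustrative assumptions, not the authors' implementation.

```python
import math
import threading
import time

def contact_force(tool_pos, stiffness=0.5):
    """Illustrative penalty force: push the tool out of a unit-sphere
    workpiece in proportion to penetration depth (not the paper's model)."""
    r = math.sqrt(sum(c * c for c in tool_pos))
    if r == 0.0 or r >= 1.0:
        return (0.0, 0.0, 0.0)
    depth = 1.0 - r
    return tuple(stiffness * depth * c / r for c in tool_pos)

class SculptingLoops:
    """Two loops at very different rates: haptics near 1 kHz, graphics near 60 Hz."""
    def __init__(self):
        self.tool_pos = (0.5, 0.0, 0.0)
        self.lock = threading.Lock()   # guards shared tool state
        self.running = True

    def haptic_loop(self, rate_hz=1000):
        period = 1.0 / rate_hz
        while self.running:
            with self.lock:
                force = contact_force(self.tool_pos)
            # send `force` to the haptic device here
            time.sleep(period)

    def graphic_loop(self, rate_hz=60):
        period = 1.0 / rate_hz
        while self.running:
            with self.lock:
                snapshot = self.tool_pos
            # redraw the carved workpiece from `snapshot` here
            time.sleep(period)
```

Keeping the shared tool state behind a lock lets each loop run at its own rate without the graphic redraw stalling the force computation.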

2001 ◽  
Vol 10 (5) ◽  
pp. 465-476 ◽  
Author(s):  
Simon P. DiMaio ◽  
Septimiu E. Salcudean ◽  
Claude Reboulet

An excavator simulator has been developed to facilitate the training of human operators and to evaluate control strategies for heavy-duty hydraulic machines. The operator controls a virtual excavator by means of a joystick while experiencing visual and force feedback generated by environment and machine models. The simulator comprises an impedance model of the excavator arm, a model for the bucket-ground interaction forces, a graphically rendered visual environment, and a haptic interface. This paper describes the simulator components and their integration.
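The impedance model of the excavator arm and the bucket-ground interaction force can be illustrated with a simple mass-damper-spring relation and a depth-proportional soil resistance. The coefficient values below are placeholders, not those identified for the simulator.

```python
def impedance_force(x, v, a, m=50.0, b=200.0, k=0.0):
    """Impedance model F = M*a + B*v + K*x: the reaction force generated
    from the arm's motion (placeholder coefficients)."""
    return m * a + b * v + k * x

def bucket_ground_force(depth, soil_stiffness=5000.0):
    """Crude bucket-ground interaction: digging resistance grows with
    penetration depth; no force above the ground surface."""
    return soil_stiffness * max(depth, 0.0)
```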


Author(s):  
Hugo I. Medellín-Castillo ◽  
Germánico González-Badillo ◽  
Eder Govea ◽  
Raquel Espinosa-Castañeda ◽  
Enrique Gallegos

Technological growth in recent years has led to the development of virtual reality (VR) systems able to immerse the user in a three-dimensional (3D) virtual environment where the user can interact in real time with virtual objects. This interaction is mainly based on visualizing the virtual environment and objects. However, with the recent advent of haptic systems, interaction with the virtual world has been extended to feeling, touching, and manipulating virtual objects. Virtual reality has been successfully used to develop applications in areas ranging from basic and social sciences to education and entertainment. Likewise, the use of haptics has increased in the last decade in domains from science and engineering to art and entertainment. Despite many developments, there is still relatively little knowledge about the confluence of software, enabling hardware, and visual and haptic representations needed to create an immersive sensory environment that best conveys information about a particular subject domain. In this paper, the state of the art of the authors' recent research on virtual reality and haptic technologies is presented. The aim is to demonstrate the potential of these technologies for developing usable analysis and simulation systems in different areas of knowledge. The development of three systems, in the areas of engineering, medicine, and art, is presented. In engineering, a system for the planning, evaluation, and training of assembly and manufacturing tasks has been developed. The system, named HAMS (Haptic Assembly and Manufacturing System), can simulate assembly tasks of complex components with force feedback provided by the haptic device. In medicine, a surgical simulator for planning and training orthognathic surgeries has been developed. The system, named VOSS (Virtual Osteotomy Simulator System), allows the realization of virtual osteotomies with force feedback. Finally, in art, an interactive cinema system for blind people has been developed. The system plays a 3D virtual movie that the blind user can listen to and touch by means of the haptic device. The development of these applications and the results obtained are presented and discussed in this paper.


2007 ◽  
Vol 07 (01) ◽  
pp. 37-53 ◽  
Author(s):  
OLGA SOURINA ◽  
ALEXEI SOURIN ◽  
HOWE TET SEN

Surgical training is one of the most promising areas of medicine where 3D computer graphics and virtual reality techniques are emerging. Orthopedic surgery is a discipline requiring appreciation and understanding of complex three-dimensional bony structures and their relationships to nerves, blood vessels, and other vital structures. Learning these spatial skills requires a lengthy period and much practice. In this paper, we present a software simulator developed to aid in understanding the complex three-dimensional relationships between bones and implants. The software shortens the learning curve and allows for better and more precise surgery by letting the surgeon practice the procedure in a virtual environment before undertaking the actual operation.


2001 ◽  
Vol 1 (2) ◽  
pp. 123-128 ◽  
Author(s):  
Sergei Volkov ◽  
Judy M. Vance

Virtual reality techniques provide a unique new way to interact with three-dimensional digital objects. Virtual prototyping refers to the use of virtual reality to obtain evaluations of designs while they are still in digital form before physical prototypes are built. While the state-of-the-art in virtual reality relies mainly on the use of stereo viewing and auditory feedback, commercial haptic devices have recently become available that can be integrated into the virtual environment to provide force feedback to the user. This paper outlines a study that was performed to determine whether the addition of force feedback to the virtual prototyping task improved the ability of the participants to make design decisions. Seventy-six people participated in the study. The specific task involved comparing the location and movement of two virtual parking brakes located in the virtual cockpit of an automobile. The results indicate that the addition of force feedback to the virtual environment did not increase the accuracy of the participants’ answers, but it did allow them to complete the task in a shorter time. This paper describes the purpose, methods, and results of the study.


Author(s):  
Weihang Zhu

This paper presents an infrastructure that integrates a haptic interface into a mainstream computer-aided design (CAD) system. A haptic interface, by providing force feedback in human-computer interaction, can improve the working efficiency of CAD/computer-aided manufacturing (CAM) systems in a unique way. The full potential of the haptic technology is best realized when it is integrated effectively into the product development environment and process. For large manufacturing companies this means integration into a commercial CAD system (Stewart, et al., 1997, “Direct Integration of Haptic User Interface in CAD Systems,” ASME Dyn. Syst. Control Div., 61, pp. 93–99). Mainstream CAD systems typically use constructive solid geometry (CSG) and boundary representation (B-Rep) as their native format, while internally they automatically maintain triangulated meshes for graphics display and for numerical evaluation tasks such as surface-surface intersection. In this paper, we propose to render point-based haptic force feedback by leveraging built-in functions of the CAD system. The burden of collision detection and haptic rendering computation is alleviated by using bounding spheres and an OpenGL feedback buffer. The major contribution of this paper is a sound structure and methodology for haptic interaction with native CAD models inside mainstream CAD systems, developed by analyzing CAD application models and by examining haptic rendering algorithms. The technique enables the user to directly touch and manipulate native 3D CAD models in mainstream CAD systems with force/touch feedback. It lays the foundation for future tasks such as direct CAD model modification, dynamic simulation, and virtual assembly with the aid of a haptic interface. Hence, by integrating a haptic interface directly with mainstream CAD systems, the powerful built-in functions of CAD systems can be leveraged and enhanced to realize more agile 3D CAD design and evaluation.
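The bounding-sphere broad phase and point-based penalty force mentioned above can be sketched as follows; the stiffness value and the proxy handling are simplified assumptions, and the OpenGL feedback-buffer step is not shown.

```python
import math

def spheres_overlap(c1, r1, c2, r2):
    """Broad-phase collision test: two bounding spheres intersect when the
    distance between centers is at most the sum of their radii."""
    d2 = sum((a - b) ** 2 for a, b in zip(c1, c2))
    return d2 <= (r1 + r2) ** 2

def point_force(hip, proxy, k=0.8):
    """Point-based rendering: a spring force pulls the haptic interface
    point (HIP) toward its surface proxy, F = k * (proxy - HIP)."""
    return tuple(k * (p - h) for p, h in zip(proxy, hip))
```

Running the cheap sphere test first means the exact surface query (and force computation) only executes for candidate pairs, which is what keeps the haptic loop fast.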


2011 ◽  
Vol 20 (4) ◽  
pp. 371-392 ◽  
Author(s):  
Zheng Wang ◽  
Elias Giannopoulos ◽  
Mel Slater ◽  
Angelika Peer

This paper focuses on the development and evaluation of a haptic enhanced virtual reality system which allows a human user to make physical handshakes with a virtual partner through a haptic interface. Multimodal feedback signals are designed to generate the illusion that a handshake with a robotic arm is a handshake with another human. Advanced controllers of the haptic interface are developed to respond to user behaviors online. Techniques to achieve online behavior generation are presented, such as a hidden-Markov-model approach to estimating the human interaction strategy. Human-robot handshake experiments were carried out to evaluate the performance of the system. Two approaches to haptic rendering were compared in experiments: a controller in basic mode, which plays back a curve embedded in the robot and disregards the human partner, and an interactive robot controller for online behavior generation. The two approaches were compared with the ground truth of another human driving the robot via teleoperation instead of the controller implementing a virtual partner. In the evaluation results, the human approach was rated most human-like, with the interactive controller following closely behind and the controller in basic mode last. This paper mainly concentrates on the development of the haptic rendering algorithm for the handshaking system and its integration with visual and haptic cues, and reports the results of the subjective evaluation experiments that were carried out.
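The "basic mode" controller, which plays back an embedded curve regardless of the partner, can be sketched as a PD tracking law. The gains are illustrative assumptions, and the interactive HMM-based mode is not shown.

```python
def basic_mode_force(x, v, x_ref, v_ref, kp=100.0, kd=5.0):
    """Track the embedded reference trajectory (x_ref, v_ref), ignoring the
    human partner's behavior: F = Kp*(x_ref - x) + Kd*(v_ref - v)."""
    return kp * (x_ref - x) + kd * (v_ref - v)
```

An interactive controller would instead adapt x_ref and the gains online from an estimate of the human's strategy, which is what the hidden-Markov-model component provides.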


2019 ◽  
Vol 9 (18) ◽  
pp. 3692 ◽  
Author(s):  
Seonghoon Ban ◽  
Kyung Hoon Hyun

In recent years, consumer-level virtual reality (VR) devices and content have become widely available. Establishing a sense of presence is a key objective of VR, and immersive interfaces with haptic feedback for VR applications have long been in development. Despite state-of-the-art force feedback research, no study on directional feedback based on force concentration has yet been reported. Therefore, we developed directional force feedback (DFF), a device that generates directional sensations for VR applications via mechanical force concentration. DFF uses the rotation of motors to concentrate force and deliver directional sensations to the user. To achieve this, we developed a novel method of force concentration for directional sensation; by considering both rotational rebound and gravity, the optimum motor speeds and rotation angles were identified. Additionally, we validated the impact of DFF in a virtual environment, showing that users' presence and immersion within VR were higher with DFF than without it. The results of the user studies demonstrated that the device significantly improves the immersiveness of virtual applications.
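The idea of concentrating force through motor rotation can be illustrated with the reaction torque of a braking spinning mass; the inertia and timing values below are made up for illustration and are not the DFF device's parameters.

```python
def reaction_torque(inertia, omega_start, omega_end, dt):
    """Reaction torque on the housing when a spinning mass changes speed:
    tau = -I * (delta omega / delta t). Braking a fast spin over a short
    interval concentrates a brief, strong, directional jolt."""
    return -inertia * (omega_end - omega_start) / dt
```

Reversing the spin direction reverses the sign of the jolt, which is what makes the sensation directional.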


Author(s):  
Semin Ryu ◽  
Jeong-Hoi Koo ◽  
Tae-Heon Yang ◽  
Dongbum Pyo ◽  
Ki-Uk Kyung ◽  
...  

This paper presents the design and testing of a haptic keypad system using an array of haptic actuators. The research goals are to construct a prototype haptic keypad system using haptic actuators and to evaluate the performance of the prototype keypad for haptic rendering. To this end, a magnetorheological (MR) haptic actuator was designed and fabricated to convey realistic force feedback to users. To demonstrate haptic applications of the MR actuator, a haptic keypad system was constructed, consisting of the following components: (1) a 3 × 3 array of haptic actuators, (2) a 3 × 3 array of force sensing resistors (FSRs), (3) a controller including a microprocessor, a current amplifier, and a wireless communication module, and (4) a graphic display unit with a PC. After constructing the prototype keypad system, haptic rendering was employed to interface the hardware keypad with test software (a virtual environment). The prototype system enabled human operators to interact with target content in a virtual environment more intuitively. The evaluation results show the feasibility of MR fluid-based haptic actuators in real-world mobile applications.
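The FSR-array scanning and actuator drive described above might look like the following sketch; the press threshold and the linear force-to-current mapping are assumptions, not the paper's calibration.

```python
def keypress_events(fsr_readings, threshold=0.3):
    """Scan a 3x3 grid of normalized FSR readings and return the (row, col)
    indices of keys pressed harder than the threshold."""
    pressed = []
    for r, row in enumerate(fsr_readings):
        for c, val in enumerate(row):
            if val >= threshold:
                pressed.append((r, c))
    return pressed

def actuator_current(force, max_force=5.0, max_current=1.0):
    """Map a desired feedback force (N) to an MR-actuator coil current (A),
    assuming a linear relationship and clamping at the actuator limit."""
    return max_current * min(force, max_force) / max_force
```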


2012 ◽  
Vol 588-589 ◽  
pp. 1242-1245 ◽  
Author(s):  
Chun Lin Zhao ◽  
Hao Yuan ◽  
Jian Gong Wang ◽  
Liang Wang

By combining three-dimensional panorama technology with geometric modeling, models of the primary and secondary equipment of substations are built to obtain a three-dimensional (3D) virtual environment. This method makes it easy to view the equipment and improves the integration of substation monitoring information. This paper analyzes and discusses two solutions, one based on the Virtual Reality Modeling Language (VRML) and the other on Open Scene Graph (OSG) together with Qt.
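A scene-graph organization like that of the OSG-based solution can be sketched minimally as follows; the node names and the translation-only transform are illustrative, not taken from the paper.

```python
class SceneNode:
    """Minimal scene-graph node, in the spirit of OSG's Group/Transform
    nodes: a named transform with child nodes."""
    def __init__(self, name, translation=(0.0, 0.0, 0.0)):
        self.name = name
        self.translation = translation  # translation only, for brevity
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def find(self, name):
        """Depth-first search for a node by name."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit is not None:
                return hit
        return None
```

Grouping equipment models under named nodes is what lets a monitoring view locate and highlight a specific device in the 3D environment.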


Author(s):  
Xiaobo Peng ◽  
Blesson Isaac

This paper presents research on developing a virtual sculpting system with a haptic interface integrated with a PowerWall system for complex product design. The PowerWall is a large-scale (10 ft by 7.5 ft) immersive virtual environment (VE). The approach is to apply a virtual sculpting method by interactively carving a workpiece with a virtual tool. With stereoscopic visual feedback and haptic force feedback implemented in the PowerWall, the designer gains a much better understanding of the 3D shape geometry and can explore the 3D scene as in the real world. A "hybrid interaction technique" is presented as a solution to the mismatch between the small workspace of the haptic device and the large size of the PowerWall system.
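One common way to bridge a small haptic workspace and a large display is uniform scaling plus a "clutch" that re-anchors the mapping; the sketch below illustrates that idea under assumed parameters and is not necessarily the paper's hybrid interaction technique.

```python
class WorkspaceMapper:
    """Map a small haptic-device workspace onto a large virtual scene by
    uniform scaling with a clutchable offset (illustrative sketch)."""
    def __init__(self, scale=10.0):
        self.scale = scale
        self.offset = [0.0, 0.0, 0.0]

    def clutch(self, device_pos, virtual_pos):
        """Re-anchor so the current device position maps to virtual_pos,
        letting the user reach a distant region without a larger device."""
        self.offset = [vp - self.scale * dp
                       for vp, dp in zip(virtual_pos, device_pos)]

    def to_virtual(self, device_pos):
        """Device coordinates -> virtual-scene coordinates."""
        return [self.scale * dp + o
                for dp, o in zip(device_pos, self.offset)]
```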

