
    Proceedings of International Workshop on Virtual prototyping, Laval, France, pp. 87-96, May 1999.

    Keynote Address: Haptic Feedback for Virtual Reality

    Grigore C. Burdea

    Rutgers, The State University of New Jersey,

     CAIP Center, 96 Frelinghuysen Rd.,

     Piscataway, NJ 08854, USA.

     phone: 1-732-445-5309

     fax: 1-732-445-4775


    Haptic feedback is a crucial sensorial modality in virtual reality interactions. Haptics means both force feedback (simulating object hardness, weight, and inertia) and tactile feedback (simulating surface contact geometry, smoothness, slippage, and temperature). Providing such sensorial data requires desk-top or portable special-purpose hardware called haptic interfaces. Modeling physical interactions involves precise collision detection, real-time force computation, and high control-loop bandwidth. This results in a large computation load which requires multi-processor parallel processing on networked computers. Applications for haptics-intensive VR simulations include CAD model design and assembly. Improved technology (wearable computers, novel actuators, haptic toolkits) will increase the use of force/tactile feedback in future VR simulations.

    1. Introduction

    Virtual Reality has been defined as I3 for “Immersion-Interaction-Imagination” [Burdea and Coiffet, 1994]. The interaction component of this high-end user interface involves multiple sensorial channels, such as the visual, auditory, haptic, smell, and taste ones. The majority of today's VR simulations use the visual (3-D stereo displays) and auditory (interactive or 3-D sound) modalities. Haptic feedback is now starting to get recognition and use in manipulation-intensive applications, while smell and taste feedback are at the stage of early research. Haptic feedback groups the modalities of force feedback, tactile feedback, and proprioceptive feedback [Burdea, 1996]. Force feedback integrated in a VR simulation provides data on a virtual object's hardness, weight, and inertia. Tactile feedback is used to give the user a feel of the virtual object's surface contact geometry, smoothness, slippage, and temperature. Finally, proprioceptive feedback is the sensing of the user's body position, or posture.

    Of the feedback modalities mentioned above, force feedback was the first to be used. It was integrated in a robotic tele-operation system for nuclear environments developed by Goertz at Argonne National Laboratories [Goertz and Thompson, 1954]. Subsequently the group led by Brooks at the University of North Carolina at Chapel Hill adapted the same electromechanical arm to provide force feedback during virtual molecular docking [Brooks et al., 1990]. Later Burdea and colleagues at Rutgers University developed a light and portable force feedback glove called the “Rutgers Master” [Burdea et al., 1992]. Commercial force feedback interfaces have subsequently appeared, such as the PHANToM arm in 1994 [Massie and Salisbury, 1994], the Impulse Engine in 1995 [Jackson and Rosenberg, 1995] and the CyberGrasp glove in 1998 [Virtual Technologies, 1998]. Furthermore, inexpensive haptic joysticks costing about $100 became available for computer game use in the late 90s.

    Tactile feedback, as a component of VR simulations, was pioneered at MIT. Patrick used voice coils to provide vibrations at the fingertips of a user wearing a Dextrous Hand Master Exoskeleton [Patrick, 1990]. Minsky and her colleagues developed the “Sandpaper” tactile joystick that mapped image texels to vibrations [Minsky et al., 1990]. Commercial tactile feedback interfaces followed, namely the “Touch Master” in 1993 [Exos, 1993], the CyberTouch glove in 1995 [Virtex, 1998], and more recently, the “FEELit Mouse” in 1997 [Immersion, 1997]. Figure 1 summarizes this abbreviated history of VR force/tactile feedback development in the USA [Burdea, 1996].

     Figure 1: Abbreviated history of virtual tactile/force feedback in the USA.

    (adapted from [Burdea, 1996]). ©John Wiley & Sons. Reprinted by permission.

    As can be clearly seen, there has been a resurgence of research interest and haptics interface products in the late 90s. Outside of the United States, research on haptic feedback has been pursued in several countries, notably in Japan [Iwata, 1990], UK [Stone, 1991], France [Bouzit et al., 1993], and Italy [Bergamasco et al., 1994]. Section 2 of this article describes several general-purpose haptic interfaces, with emphasis on commercial products. Section 3 discusses the physical modeling required, and its associated computation/control load. Section 4 presents some application examples of haptics for CAD/CAM VR simulations. Conclusions and future directions for haptic feedback are given in Section 5.

    2. Haptic Feedback Interfaces

    Haptic feedback interfaces comprise force feedback and tactile feedback devices. Force feedback interfaces can be viewed as computer “extensions” that apply physical forces and torques on the user. The interfaces most used today are desk-top devices that are easy to install, and clean and safe for the user. When comparing force feedback hardware for a given simulation application, such as CAD/CAM, the designer has to consider several key hardware characteristics. These are the number of degrees of freedom, the interface work envelope, its maximum vs. sustained force levels, its structural friction and stiffness, its dynamic range, control bandwidth, etc. Sustained force levels are usually much smaller than the maximum output force produced by haptic interfaces. This is especially true for forces produced with electrical actuators, which overheat. Friction needs to be small, so that the forces commanded by the computer are not “filtered out” by the interface before they are felt by the user. Dynamic range is the maximum force divided by the interface friction. This dimensionless number is a good measure of the quality of the force feedback produced by a given interface. Finally, high bandwidths are important for short time delays and overall system stability.
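    To make the dynamic-range figure of merit concrete, the short Python sketch below computes it from a device's peak force and structural friction; the numeric values in the usage line are illustrative only, not the specifications of any particular interface.

```python
def dynamic_range(max_force_n: float, friction_n: float) -> float:
    """Dimensionless quality measure of a haptic interface:
    peak output force divided by structural friction (both in newtons)."""
    if friction_n <= 0:
        raise ValueError("friction must be positive")
    return max_force_n / friction_n

# Illustrative numbers: a device with a 20 N peak force and 0.5 N friction
print(dynamic_range(20.0, 0.5))  # 40.0
```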

    The most popular haptic feedback interface at present is the PHANToM family of arms, manufactured by SensAble Technologies (Boston, MA). As illustrated in Figure 2 [Massie and Salisbury, 1994], the PHANToM has six degrees of freedom and three electrical actuators; each model has different dimensions. Depending on the model, its work envelope progresses from wrist motion up to shoulder motion. The maximum force level is up to 22 N, while sustained forces are only 3 N. The increase in work envelope between the “Standard” and “Super Extended” versions of the PHANToM results in increased friction and diminished structural stiffness. Another drawback of the PHANToM standard configuration is that it applies forces to one finger only, with no torques. A recent 3-D version of the interface, called the PHANToM “Premium 3.0,” provides both forces and torques [Chen, 1999]. Maximum torques are 670 mNm, produced by actuators placed in the handle. Continuous torques are only 104 mNm. However, it still allows only one-finger or closed-fist interactions.

    The characteristics of the PHANToM make it well suited for point interaction mediated by a single virtual finger, stylus, or pencil. More dextrous manipulation of virtual objects requires at least two PHANToM arms (one each for the thumb and index finger), or the use of a haptic glove. A haptic glove allows the designer to pick up and manipulate CAD models while feeling their hardness. Haptic gloves are useful for manipulation over large volumes, including simulating hard objects with no weight. Simulating object weight would require adding a wrist force/torque interface, with reduced work envelope, increased system complexity and cost.

    Figure 2: The PHANToM Arm (adapted from [Massie and Salisbury, 1994]).

     ©ASME. Reprinted by permission.

    The only haptic glove commercially available today is the CyberGrasp, which is a retrofit of the position-only CyberGlove manufactured by Virtual Technologies (Palo Alto, CA). As illustrated in Figure 3-a, the CyberGrasp consists of a cable-driven exoskeleton structure on the back of the hand. The interface is powered by electrical actuators capable of applying 12 N resistive forces to each finger. The exoskeleton attachment to the back of the palm allows full fist closure, but requires the remote placement of actuators in a control box. This results in high backlash and friction, and reduces the dynamic range of the device. Even with the remote placement of its actuators, the weight of the glove is quite high (450 grams), which may lead to user fatigue during prolonged use.

    The Rutgers Master II, illustrated in Figure 3-b, is a research prototype developed at Rutgers University. This second haptic glove is lighter than the CyberGrasp (130 grams vs. 450 grams), due to the use of pneumatic actuators with a high power/weight ratio. The low friction of the actuators, and their placement in the hand, provide for a high interface dynamic range (300). Since the actuators do not overheat, they can produce a maximum force equal to the sustained one, namely 16 N per fingertip. This force is higher than the peak force of the CyberGrasp. Unfortunately, the Rutgers Master does not allow complete fist closure, due to the placement of the actuators in the palm.

    a) b)

    Figure 3: a) The CyberGrasp [Virtex, 1998]. Reprinted by permission;

     b) The Rutgers Master II.

    Some of the variables used to characterize force feedback hardware, such as work envelope, degrees of freedom, weight, and control bandwidth, are also used in the selection process of tactile feedback interfaces. In fact, some force feedback interfaces, such as the PHANToM, can also replicate surface mechanical texture, or slippage, which means they can also provide tactile feedback. Conversely, some tactile feedback interfaces have some (limited) force feedback capability. An example is the FEELit Mouse produced by Immersion Corporation (San Jose, CA), illustrated in Figure 4-a. This desk-top two-DOF interface enables the user to feel simulated objects, such as hard surfaces, rough textures, smooth contours, even rubbery materials. Its workspace is 2.5×1.9 cm, and its maximum output force is 1 N in the x and y directions. The drawbacks are the limited work envelope and the point/arrow interaction modality.

    a) b)

    Figure 4: Tactile feedback interfaces: a) the FEELit Mouse [Immersion, 1997];

     b) the CyberTouch glove [Virtex, 1998]. Reprinted by permission.

    Tactile gloves are more appropriate when the VR simulation requires dexterity (multiple contact points), freedom of motion, and information on object grasping state and mechanical texture (but not weight). These gloves are lighter than force feedback ones, and typically use electromechanical vibrators to convey texture data. These actuators are small and can be placed directly on the glove. The co-location of actuators on the glove, in the places where tactile feedback is needed, results in a simpler design, reduced weight and system cost. An example of a commercial tactile glove is the CyberTouch produced by Virtual Technologies Co., illustrated in Figure 4-b. Its weight is only 144 grams, compared with the 450 grams of the CyberGrasp. The CyberTouch uses six electromechanical vibrators placed on the back of the fingers and in the palm. These actuators produce vibrations of 0–125 Hz, with a force amplitude of 1.2 N at 125 Hz.

    3. Physical Modeling

    Selecting the best haptic interface for a given application is only part of the developer's task. Just as important is the physical simulation component of the VR software. Realistic physical modeling software can significantly enhance the user's sense of immersion and interactivity, especially in manipulation-intensive applications such as CAD/CAM. Conversely, poor modeling of haptic interactions diminishes the simulation's usefulness, and may even be detrimental (through system instabilities).

    Among the key aspects of physical modeling are collision detection, force and tactile feedback computation, surface deformation, hard contact modeling, and others. Collision detection determines when two or more virtual objects interact, through touching or interpenetration. Such “objects” could be several CAD parts, or they could be virtual fingers grasping a virtual ball. The need to perform collision detection in real time, while simulating complex models, has led to a two-step approach. The first step is an approximate bounding box collision detection [Burdea, 1994], which eliminates many unnecessary computations. Once the bounding boxes of two virtual objects interpenetrate, exact collision detection is performed. Such algorithms use Voronoi volumes [Lin, 1993], [Cohen, 1995], or implicit functions [Schlaroff, 1991].
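    The two-step pipeline just described can be sketched in Python as follows. The axis-aligned bounding-box (AABB) broad phase is standard; `exact_collision` is a hypothetical stand-in for whichever exact narrow-phase algorithm (Voronoi volumes, implicit functions) the application uses.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

def aabb_overlap(a: AABB, b: AABB) -> bool:
    """Broad phase: two boxes overlap iff they overlap on every axis."""
    return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(3))

def check_pair(a, b, exact_collision) -> bool:
    """Run the expensive exact test only when the bounding boxes touch."""
    if not aabb_overlap(a.box, b.box):
        return False          # pair culled by the cheap broad phase
    return exact_collision(a, b)
```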

    Of special interest to the CAD community is the “Voxmap PointShell” (VPS) collision-detection algorithm developed by Boeing Co. [McNeely et al., 1999]. This algorithm is especially suited to complex virtual models with rigid surfaces and a small number of moving objects. Figure 5 shows a model of a teapot with the associated voxmap. The teapot is mapped to a PHANToM Premium, such that the user feels contact forces and torques when navigating in a myriad of virtual pipes. In such complex environments VPS can detect collisions and calculate the response up to 1,000 times/second!
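    In the spirit of the VPS approach, the sketch below illustrates only the core idea, with simplified names and a naive penalty-force rule rather than the published algorithm: the static environment is voxelized once into a set of occupied cells, the moving part is reduced to a shell of surface points, and each frame every shell point is hashed into the voxel grid, making per-point contact queries constant-time.

```python
def voxel_key(p, voxel_size):
    """Map a world-space point to its integer voxel coordinates."""
    return tuple(int(c // voxel_size) for c in p)

def contact_force(shell_points, occupied_voxels, voxel_size, stiffness):
    """Sum a simple penalty force over shell points lying in occupied voxels.

    shell_points: iterable of (point, outward surface normal) pairs.
    occupied_voxels: set of integer voxel coordinates of the static scene.
    """
    fx = fy = fz = 0.0
    for p, normal in shell_points:
        if voxel_key(p, voxel_size) in occupied_voxels:
            # push the moving object out along this point's surface normal
            fx += stiffness * normal[0]
            fy += stiffness * normal[1]
            fz += stiffness * normal[2]
    return (fx, fy, fz)
```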

    Interaction forces need to be calculated at high rates in order to satisfy the control requirements of haptic interface hardware. For a single contact point this usually means simple linear equations such as Hooke's law, with different slopes for “hard” and “soft” virtual objects [Burdea, 1993]. When multiple contact points exist between objects, the Hooke's law forces are added according to Newtonian physics. More complex force patterns are associated with objects that are non-isotropic, with different hardness in different regions of their geometry. An example is a virtual push button with a haptic click [SensAble, 1994]. At the start of compression the user feels the virtual spring inside the button, and the force increases linearly. At a certain compression threshold the spring is disengaged and the force drops to a very small value, due to static friction. If the user continues to press, the force grows very quickly to the maximum output force of the haptic interface, resulting in a haptic click.
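    The push-button force profile described above can be written as a piecewise function of compression depth. The sketch below uses made-up constants for the spring stiffness, click threshold, residual friction, and device force limit; it is an illustration of the idea, not code from any haptic library.

```python
def button_force(x, k=200.0, x_click=0.01, f_friction=0.2,
                 k_stop=5000.0, x_stop=0.015, f_max=10.0):
    """Force (N) felt at compression depth x (m) for a haptic push button."""
    if x < x_click:
        return k * x                     # spring phase: plain Hooke's law
    if x < x_stop:
        return f_friction                # spring disengaged: only static friction
    # bottomed out: force rises steeply, clamped at the device maximum
    return min(f_friction + k_stop * (x - x_stop), f_max)
```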


    Figure 5: Voxmap and point shell collision detection: a) the point shell model; b) the complex virtual environment [McNeely et al., 1999]. ©ACM. Reprinted by permission.

    In CAD applications, physical modeling has to account for objects (such as assembly parts) that are neither elastic nor plastic. Real steel parts do not deform under normal manipulation forces, so haptic interfaces need to produce an instantaneous and very large contact force. If simple Hooke's law is used, the virtual steel part will not feel real because of the limited stiffness of today's interfaces. Furthermore, instabilities may result due to the digital control of the interface, which produces sample-and-hold artifacts [Colgate, 1993]. The solution is then to add a dissipative term, such as a directional damper, to Hooke's law.
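    A minimal version of such a spring-damper contact law is sketched below, assuming penetration depth and penetration velocity are available from the collision step; the damper is directional, acting only while the contact is loading, and the gains are illustrative values.

```python
def stiff_contact_force(penetration, velocity, k=2000.0, b=5.0):
    """Force (N) for a stiff contact: Hooke term plus a directional damper.

    penetration: depth (m) of interpenetration, <= 0 means no contact.
    velocity: penetration velocity (m/s), positive while pressing in.
    """
    if penetration <= 0.0:
        return 0.0                       # no contact, no force
    damping = b * velocity if velocity > 0.0 else 0.0  # dissipate only on approach
    return k * penetration + damping
```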

    Surface mechanical texture, or smoothness, is another important component of physical modeling in VR. The tactile interface can then let the user feel whether the surface of the manipulated object is smooth, rough, bumpy, etc. One approach is to use local surface gradients in the direction of the surface normal [Minsky, 1990]. Small feedback forces are then proportional to the height of the surface “hills.” Another approach, taken by the GHOST haptic library used by the PHANToM, is to use sinusoidal functions [SensAble, 1994]. Thus vibrations proportional to the surface roughness are superimposed on the force-feedback signal. This approach can be used by any high-bandwidth haptic interface.
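    The sinusoidal-texture idea can be sketched as follows; the function and parameter names are illustrative and do not correspond to the actual GHOST API.

```python
import math

def textured_force(base_force, position, roughness, spatial_freq=500.0):
    """Base feedback force plus a roughness-scaled sinusoidal ripple.

    position: contact position along the surface (m).
    roughness: vibration amplitude (N) encoding surface roughness.
    spatial_freq: ripples per meter of surface travel.
    """
    ripple = roughness * math.sin(2.0 * math.pi * spatial_freq * position)
    return base_force + ripple
```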

    4. Application Examples

    A comprehensive survey of haptic feedback applications in VR can be found in [Burdea, 1996]. Space constraints limit our discussion in this paper to CAD/CAM applications. The CAD concept-stage design process focuses on overall functionality, and is typically done today with pencil and paper [Kraftcheck, 1997]. Chu and his colleagues at the University of Wisconsin-Madison developed a multi-modal VR interface for the generation, modification and review of part and assembly designs [Chu, 1997]. Input to the system was through hand gestures (measured by sensing gloves), voice commands and eye tracking. Output from the simulation was through visual feedback (graphics), auditory feedback (voice synthesis) and tactile feedback (allowing the user to feel the parts he was designing). Subjective human-factors studies were conducted to evaluate the usefulness of these interaction modalities in various combinations. Results showed voice and gesture input to be superior to eye tracking, while visual output was the most important output modality for shape design. The researchers noted a lack of reliable force feedback technology that may be used in CAD design.

    Gupta at Schlumberger, in collaboration with Sheridan and Whitney at MIT, developed a CAD assembly simulation [Gupta, 1997]. A pair of PHANToM force feedback interfaces were used by designers to grasp the part being designed with the thumb and index finger and feel resistance due to contact forces. The multi-modal simulation incorporated speech and finger position inputs with visual, auditory and force feedback. Experiments comparing handling and insertion assembly times in real and virtual worlds showed that force feedback was beneficial in terms of task efficiency.

    Jayaram and his colleagues at Washington State University developed the Virtual Assembly Design Environment (VADE). The system allowed engineers to design using a parametric CAD system, and automatically export data to an immersive virtual environment. In the virtual environment, the designer was presented with an assembly scene, as illustrated in Figure 6 [Jayaram et al., 1997,1999]. The user performed the assembly in VR and generated design information, which was then automatically fed back to the parametric CAD system. A CyberGrasp haptic interface, modified for portability, was later integrated in order to provide grasping forces to the user (trainee).

    More recently, Frölich and colleagues at the German National Research Center, in collaboration with Stanford University and Carnegie Mellon University, have developed a physically-based assembly training simulation [Frölich et al., 2000]. The system uses a Responsive Workbench large-volume display for graphics and allows several users to interact with the complex assembly model.

    5. Conclusion and Future Directions

    It is hoped that this brief discussion gave the reader a feel for the complexities and benefits of haptic interfaces in VR simulations, especially as they relate to CAD/CAM applications. While the need for haptics has become clear in recent years [Brooks, 1999], the technology is not fully developed. We need interfaces that are powerful, yet light and non-obstructive. This in turn requires novel actuators of a type that does not exist today. Distributed computation on multiple processors and multiple computers will become widespread, especially as higher-bandwidth networks become common. This in turn will allow haptics to be added to today's web modalities of sight and sound. Once the hardware problems are solved, more and more work will be dedicated to making simulations more realistic. This will require significant human-factors studies for iterative design and validation. Haptics will also require more of the third I in VR, imagination.

    ACKNOWLEDGMENTS

    The author's research reported here has been supported in part by grants from the National Science Foundation (BES 9708020), from the New Jersey Commission on Science and Technology (R & Excellence Grant) and from Rutgers, The State University of New Jersey (CAIP and SROA Grants).

    Figure 6: Virtual assembly simulation with constrained motion and force feedback

     [Jayaram et al., 1999]. ©IEEE 1999. Reprinted by permission.

    References

    Bergamasco, M., B. Allotta, L. Bosio, L. Ferretti, G. Parrini, G. Prisco, F. Salsedo and Sartini, “An Arm Exoskeleton System for Teleoperation and Virtual Environments Applications,” Proceedings of the IEEE International Conference on Robotics and Automation, San Diego, CA, pp. 1449–1454, May 1994.

    Boeing Co., Haptics, Company brochure, Seattle WA, 2 pp., 1999.

    Bouzit, M., P. Richard and P. Coiffet, “LRP Dextrous Hand Master Control System,” Technical Report, Laboratoire de Robotique de Paris, 21 pp., January 1993.

    Brooks, F., M. Ouh-Young, J. Batter and A. Jerome, “Project GROPE - Haptic Displays for Scientific Visualization,” Computer Graphics, Vol. 24, No. 4, pp. 177–185, 1990.

    Brooks, F., “What's Real about Virtual Reality?” Keynote address, Proceedings of IEEE Virtual Reality'99, Houston, TX, pp. 2–3, March 1999.

    Burdea, G., J. Zhuang, E. Roskos, D. Silver and N. Langrana, “A Portable Dextrous Master with Force Feedback,” Presence, Vol. 1, No. 1, pp. 18–27, March 1992.

    Burdea, G., “Virtual Reality Systems and Applications,” Electro'93 International Conference, Short Course, Edison, NJ, April 28, 1993.

    Burdea, G., and P. Coiffet, Virtual Reality Technology, John Wiley & Sons, New York, USA, 1994.

    Burdea, G., Force and Touch Feedback for Virtual Reality, John Wiley & Sons, New York, USA, 1996.

    Chen, E., “Six Degree-of-Freedom Haptic System for Desktop Virtual Prototyping Applications,” Proceedings of International Workshop on Virtual Reality and Prototyping, Laval, France, pp. 97–106, June 1999.

    Chu, C-C., T. Dani and R. Gadh, “Multimodal Interface for a Virtual Reality Based Computer Aided Design System,” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 1329–1334, April 1997.

    Cohen, J., M. Lin, D. Manocha and M. Ponamgi, “I-COLLIDE: An Interactive and Exact Collision Detection System for Large-scale Environments,” Proceedings of ACM Interactive 3D Graphics Conference, Monterey, CA, pp. 189–196, 1995.

    Colgate, E., P. Grafing, M. Stanley and G. Schenkel, “Implementation of Stiff Virtual Walls in Force-Reflecting Interfaces,” Proceedings of VRAIS, Seattle, WA, pp. 202–208, September 1993.

    EXOS Co., “The Touch Master Specifications,” Company brochure, 1 pp., Woburn, MA, 1993.

    Frölich, B., H. Tramberend, A. Beers, M. Agrawala and D. Baraff, “Physically-Based Manipulation on the Responsive Workbench,” Proceedings of IEEE Virtual Reality 2000, New Brunswick, USA, pp. 5–11, March 2000.

    Goertz, R. and R. Thompson, “Electronically controlled manipulator,” Nucleonics, pp. 46–47, 1954.

    Gupta, R., T. Sheridan and D. Whitney, “Experiments Using Multimodal Virtual Environments in Design for Assembly Analysis,” Presence, Vol. 6, No. 3, pp. 318–338, 1997.

    Immersion Corporation, “FEELit Mouse,” Technical Document, San Jose, CA, 12 pp., October 1, 1997.

    Iwata, H., “Artificial Reality with Force-feedback: Development of Desktop Virtual Space with Compact Master Manipulator,” Computer Graphics, Vol. 24, No. 4, pp. 165–170, 1990.

    Jackson, B., and L. Rosenberg, “Force Feedback and Medical Simulation,” Interactive Technology and the New Paradigm for Healthcare, K. Morgan, R. Satava, H. Sieburg, R. Mattheus and J. Christensen (Eds.), pp. 147–151, January 1995.
