WO1995020788A1 - Intelligent remote multimode sense and display system utilizing haptic information compression - Google Patents


Info

Publication number
WO1995020788A1
WO1995020788A1; PCT/US1995/001227; also published as WO9520788A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensory
information
sensory information
display
haptic
Application number
PCT/US1995/001227
Other languages
French (fr)
Inventor
Timothy Osborne
Beth A. Marcus
Original Assignee
Exos, Inc.
Application filed by Exos, Inc. filed Critical Exos, Inc.
Publication of WO1995020788A1 publication Critical patent/WO1995020788A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1689 Teleoperation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/33 Director till display
    • G05B2219/33249 Compress, pack data before transmission
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/37 Measurements
    • G05B2219/37511 Select and process only those detected signals needed for a certain purpose
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40142 Temperature sensation, thermal feedback to operator fingers
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40143 Slip, texture sensation feedback, by vibration stimulation of fingers
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40159 Between operator and sensor a world modeler, local intelligence
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40162 Sound display of machining operation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40567 Purpose, workpiece slip sensing
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40606 Force, torque sensor in finger
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40619 Haptic, combination of tactile and proprioceptive sensing

Definitions

  • Friction can be measured by applying varying tangential forces and then detecting the onset of slip (slip detection is a separate basic manipulation cue; see next Section). This task also requires the ability to measure tangential forces.
  • Slip detection is used to determine the necessary grasp forces at each contact during all states of manipulation.
  • the weight, center of mass and moments of inertia of the object can be computed.
  • This information can be used for object identification and pose determination; Siegal, D.M., Pose Determination of a Grasped Object Using Limited Sensing, Ph.D. Thesis, Department of Electrical Engineering, MIT, Cambridge, MA (1991); as well as for computing the necessary torques for throwing or manipulating the object.
  • Assembly is the process of bringing a moving part into a constrained relationship with a fixed part.
  • The directions in which movement is constrained need to be estimated.
  • Contact constraints are estimated using measurements of the reaction forces as a function of position and by measuring the direction of impact forces.
  • the detents in switches, the termination in screws, the impacts from mating two parts are all examples.
  • the onset of the change can be detected by looking for unexplained impact forces and the direction of the impact force.
  • Current tactile feedback devices present a flat contact image using either arrays of tactors or raster scan techniques.
  • Appendix A includes a list of prior art devices whose teachings are herein incorporated by reference. This type of display can only be used for very local shape display or textures, and is limited to a flat type of display.
  • the object of the current invention is to provide a modular structure under which individual haptic cues or multiple cues can be effectively and efficiently presented to the human operator.
  • Haptic compression is based upon human perception.
  • The compression algorithms ask the question: what information must be presented to perform this manipulation or recognition task?
  • Haptic compression is based on sending a set of physical parameters and a number indicating the model type to the haptic display computer. Locally, the model number is used to pull up a stored model of a component of the task physics from the memory of the local haptic processor. The parameters then control the specific instantiation of that model. For example, perceiving a fine surface texture is done primarily through perception of spatial frequencies through vibration. By measuring relative velocity and having the spatial frequency stored as a two-direction Fourier spectrum, the texture display is able to simulate a complex texture. Compression occurs because (1) the spatial representation does not resolve all surface features, and (2) vibration is substituted for an array display.
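By way of illustration, a minimal sketch in C of this model-plus-parameters protocol follows. The message layout, field names, and single-component texture model are hypothetical, not taken from the patent; they show how a model number selects locally stored task physics while a few parameters instantiate it, and how a stored spatial frequency becomes a vibration frequency once multiplied by the measured sliding velocity.

    #include <math.h>

    /* Hypothetical compressed haptic message: a model ID plus a few
       parameters, in place of a full tactile image stream. */
    enum model_type { MODEL_TEXTURE = 1, MODEL_STIFFNESS, MODEL_SLIP };

    struct haptic_msg {
        int    model;      /* selects a stored model of task physics */
        double params[4];  /* model-specific parameters */
    };

    /* Texture model: params[0] is a dominant spatial frequency of the
       surface (cycles per meter), params[1] its amplitude. The local
       processor converts it to a vibration: f_temporal = f_spatial * |v|. */
    double texture_drive(const struct haptic_msg *m, double velocity_ms,
                         double t_s)
    {
        const double PI = 3.14159265358979;
        double f_hz = m->params[0] * fabs(velocity_ms);
        return m->params[1] * sin(2.0 * PI * f_hz * t_s);
    }

A full implementation would store the two-direction Fourier spectrum described above and sum several such components; the point is that only the spectrum parameters ever cross the communications link.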
  • Compression of curvature and surface normal information is based on an analysis of the task which shows that only this information is necessary for performing the required manipulation task.
  • The computer sends 2D curvature properties and one contact point to represent the surface to the haptic display computer.
  • The display then takes care of bending up the surface. This is a first-order representation of the more complex shape. Since not all the information is shown, there is information compression.
  • For local stiffness, the haptic processor receives a single number K which represents the contact stiffness. Based on this number it simulates a surface of stiffness K by applying the correct force for a given displacement relative to the display.
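A minimal sketch of such a stiffness display loop in C, assuming hypothetical device I/O routines (read_display_position, command_force, wait_for_next_tick) that stand in for whatever hardware interface is actually used: only the scalar K and the virtual surface position ever need to be transmitted, while the force law runs locally at servo rate.

    /* Hypothetical hardware interface; assumed, not from the patent. */
    extern double read_display_position(void);  /* meters */
    extern void   command_force(double newtons);
    extern void   wait_for_next_tick(void);     /* e.g. 1 ms period */

    /* Local stiffness display: F = -K * penetration runs locally. */
    void stiffness_servo(double K, double surface_pos)
    {
        for (;;) {
            double x = read_display_position();
            double penetration = x - surface_pos;
            command_force(penetration > 0.0 ? -K * penetration : 0.0);
            wait_for_next_tick();
        }
    }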
  • Like local stiffness, a slip haptic processor only needs the coefficient of friction mu. The slip processor can then measure the normal force and the tangential force to determine the current state of the slip feedback. In order to implement general contact constraints, a set of constraint equations C_i(g,x) are stored on the local processor. The constraint parameters g and the specific model type i are sent to the local processor in order to instantiate a specific constraint model. The local processor applies corrective forces to try to force the coordinates of the display, x, to obey this equation in the display.
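The slip check can be sketched the same way; the sensor calls below are hypothetical placeholders, and the test is the standard friction-cone comparison of tangential force against mu times normal force.

    #include <math.h>
    #include <stdbool.h>

    extern double read_normal_force(void);      /* hypothetical sensors */
    extern double read_tangential_force(void);

    /* Only mu is transmitted; the check F_t > mu * F_n runs on the
       local slip processor. */
    bool slip_onset(double mu)
    {
        double fn = read_normal_force();
        double ft = fabs(read_tangential_force());
        return ft > mu * fn;
    }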
  • Information may also be compressed through the removal of extraneous information.
  • One method is to include only those senses appropriate to the task.
  • A second method is to use psychophysical compression, i.e. only including the sense cues necessary to give the operator the ability to "feel" the characteristic being displayed.
  • A third method is to use data compression prior to transmission across the communication channel.
  • A fourth method is data thinning, e.g. communicating only the changes since the last transmission or transmitting only when changes are large enough to warrant; a sketch of this appears after this list.
  • A fifth method is to partition the system such that the computation and display subsystems share a lower bandwidth communication channel.
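The fourth method, data thinning, reduces to a send-on-change filter; in the sketch below the link call and the threshold value are illustrative assumptions rather than anything specified by the patent.

    #include <math.h>

    extern void transmit_sample(double value);  /* hypothetical link call */

    /* Transmit a sensor value only when it differs from the last
       transmitted value by more than a threshold. */
    void thin_and_send(double value, double *last_sent, double threshold)
    {
        if (fabs(value - *last_sent) >= threshold) {
            transmit_sample(value);
            *last_sent = value;
        }
    }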
  • Sense Selection: There may be no need to include temperature display if the task is one of simply grasping and moving an object of known temperature. There may also be no need to transmit with absolute fidelity all hand joint torques if the application requires only the input of the thumb and forefinger for acceptable results. Selection of the minimal number of senses required for convincing haptic feedback is the primary method of reducing communications traffic.
  • Sense-mapping, or replacing one sense with another, can be used for feedback of senses or other data sources otherwise difficult to display.
  • An example is using a vibrotactile display to display surface temperature through variations in vibration amplitude.
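That temperature-to-vibration mapping might look like the following in C; the temperature range and the linear map are illustrative assumptions.

    /* Sense-mapping sketch: display surface temperature as vibration
       amplitude. The 0-100 C range and linear map are assumptions. */
    double temp_to_vibe_amplitude(double temp_c)
    {
        const double t_min = 0.0, t_max = 100.0;
        double a = (temp_c - t_min) / (t_max - t_min);
        if (a < 0.0) a = 0.0;   /* clamp to the drive range */
        if (a > 1.0) a = 1.0;
        return a;               /* scales the tactor waveform */
    }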
  • More efficient methods, such as fractal compression, reproduce the data with reasonable fidelity and require relatively small transmission bandwidth, but require some knowledge of the nature of the data being compressed.
  • The system may be partitioned so that a reasonable amount of computation is performed by the display driving electronics.
  • Each sense can have its own processing, sharing and communicating a common knowledge of the "world" but reacting in ways unique to the sense being displayed.
  • This approach is readily scalable due to its modularity.
  • The visual display hardware may be suited to fast graphics rendering but does not need the ability to compute other sense data; it only needs to modify and display its world view given input data from other transducers and senses.
  • The prototypical haptic module would consist of a processor, memory, power and drive electronics, transducer interfaces, and a communications interface and controller.
  • The software on board the haptic processor would include the kinematics mapping the sensor data to the display. If, for example, finger force feedback were desired, the module would query the finger's position sensors, map this data to joint angles and finger geometry in its world view, sense collisions with objects in its world view, and calculate resultant forces. It would communicate the updated world view to the other modules and apply the appropriate torques to the finger, knowing the configuration of the force feedback device.
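That per-module cycle can be sketched as a loop. Every routine below is a hypothetical placeholder for the module's local kinematics, collision, and drive code, included only to show the order of operations described above.

    /* Hypothetical one-cycle update for a finger force-feedback module. */
    extern void read_finger_sensors(double *raw, int n);
    extern void sensors_to_joint_angles(const double *raw, double *q, int n);
    extern int  collide_with_world(const double *q, double *contact_force);
    extern void apply_finger_torques(const double *force);
    extern void broadcast_world_update(const double *q);

    void module_cycle(void)
    {
        double raw[4], q[4], force[3];
        read_finger_sensors(raw, 4);        /* query position sensors   */
        sensors_to_joint_angles(raw, q, 4); /* map to joint angles      */
        if (collide_with_world(q, force))   /* test against world view  */
            apply_finger_torques(force);    /* local force feedback     */
        broadcast_world_update(q);          /* share with other modules */
    }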
  • the system is tolerant of system communication latencies as the feedback loop is local to the haptic processor.
  • Scaling and Mapping Sensory Information: With the sensors and transducers available, the option of mapping one sense to another becomes feasible, and perhaps desirable in some cases.
  • The ability to map physical parameters not normally sensed (for example, electromagnetic field strengths) to human senses is compelling and may even lead to a more intuitive understanding of complicated phenomena.
  • Sense data may be freely scaled, such that forces on the order of tons at a remote robot manipulator could scale to ounces on the haptic display.
  • Conversely, microforces can be scaled to pounds at the haptic display, allowing micromanipulation with extremely fine resolution of control.
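Force scaling itself is a single gain between the remote sensor and the haptic display; a sketch follows, with an added clamp at the display actuator's limit. The gain and limit values are application choices, not patent values.

    /* Scale a remote force into the display's range, clamping at the
       display actuator's limit. */
    double scale_to_display(double remote_force_n, double gain,
                            double display_limit_n)
    {
        double f = remote_force_n * gain;
        if (f >  display_limit_n) f =  display_limit_n;
        if (f < -display_limit_n) f = -display_limit_n;
        return f;
    }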
  • A system for remote multimode sense and display is shown in Fig. 1.
  • the haptic system 14 interacts with a user 16. Sensory input 19 from the user 16 is collected by the haptic system 14. This information is compressed by the haptic system 14 and transmitted onto the communications link 12 to the world modeler 10.
  • the world modeler 10 uses the information to manipulate either a real or virtual world.
  • the world modeler monitors and measures the state of the world, transmitting display information onto communications link 12 to haptic system 14, where the display information is converted to sensory output 18 for the user 16.
  • the world modeler 10 of Fig. 1 may take two forms 10a or 10b.
  • the first form 10a utilizes a computer system 20 to create a mathematical model 22 of a virtual world.
  • the second form of world modeler 10b utilizes a computer system 24 to sense information 25 and send control information 26 to remote input-output system 28.
  • This remote input-output system 28 exists in a real environment.
  • the haptic system 14 of Fig. 1 is further composed of at least one sensory module 30 as shown in Fig. 3.
  • Each sensory module 30 may take a variety of formats.
  • Fig. 4 shows examples of sensory modules 30a, 30b and 30c.
  • A sensory module is composed of a controller 40, a communications unit 44, and at least one sense unit 42 or display unit 43.
  • Sensory module 30a contains only one sense unit 42.
  • Sensory module 30b contains only one display unit 43.
  • Sensory module 30c contains two sense units 42 and one display unit 43.
  • Fig. 5 shows the system implementing teleoperation.
  • a user 56 interacts with the remote world 50.
  • The haptic system 14 gathers sensory input 52. This information is compressed by the haptic system 14 and transmitted on communications link 12 to the world modeler 10.
  • the world modeler generates control information 26 from the compressed sensory input received from the communications link 12.
  • The remote input-output unit 28, which may be a robot or mechanical arm, responds to the control information 26, thereby acting in the remote environment 50.
  • the world modeler also takes sensory input 25 sensed by the remote input-output unit 28 from the remote environment 50.
  • This sensory input 25 is compressed by the world modeler 10 and transmitted onto communications link 12 to the haptic system 14. There the compressed sensory information is displayed 54 to the user 56.
  • the result is that user 56 controls a remote input-output device 28 in a remote world 50.
  • The hand haptic system is composed of touch sensory modules, each consisting of a local processing element connected to display and sense hardware via a wire or fiber optic connection.
  • The sensory module is in turn connected to the world model via a wire or fiber optic link, or a communications link such as an Ethernet port, serial port, bus connector (such as a BiT3), parallel port, game port, or local area network, transmitting either direct information for simple or low-speed applications or information which has undergone haptic information compression.
  • the human operator controls the world model through interactions with the visual model information (displayed on a computer monitor) and the touch display hardware.
  • The touch sensory module, because it has a local processor system, reacts directly to the human stimulus and thus can interact at the high speeds required for the feedback to feel realistic, even in the face of rapidly changing stimuli from the human operator. This reaction can be completely independent of the world model and visual display, and thus is not slowed by a slow communications link; it can therefore run at speeds upwards of 1 kHz if necessary. This is made possible by the architecture of the local processor unit.
  • This processor can be a single chip such as a DSP or Microcontroller, a board with multiple chips or a computer, depending upon the number of sensor and display elements contained in the touch display hardware and the complexity of the model.
  • the haptic compression is achieved by utilizing the basic research into human perception of touch described above to determine which touch cues are being utilized for a given type of manipulation. For example, when an object is grasped the joint torque cues as well as slip information are used to ensure that the object is stably grasped.
  • The algorithms in the local processor determine the type of cues required and their magnitude, thus reducing the amount of information which needs to be transmitted to the other system components.
  • The world model only needs to know the resulting object status (i.e. position and orientation, or forces acting on the object, depending upon how the world model is structured); the information content of the transmitted signals is therefore greatly reduced, improving the overall performance of the system.
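The reduced traffic can be pictured as a small fixed-size status packet. The struct below is an illustrative assumption about what "object status" might contain, not a format defined by the patent.

    /* Hypothetical object-status packet sent to the world model in
       place of raw multi-channel sensor streams. */
    struct object_status {
        int    object_id;
        double position[3];     /* x, y, z */
        double orientation[4];  /* quaternion */
        double net_force[3];    /* forces acting on the object, if used */
    };

At, say, 30 updates per second this packet amounts to a few kilobytes per second, whereas a raw tactile-array stream sampled at servo rate would need orders of magnitude more bandwidth.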
  • A simple slip display, shown in Fig. 7, was developed as disclosed in U.S.S.N. 08/187,646, entitled "Multimode Feedback Display Technology," filed January 27, 1994 (herein incorporated by reference).
  • The slip display consists of a Delrin cylinder 71 with a textured surface, directly mounted on the shaft of a miniature gear motor 72.
  • the cylinder is supported on the other side with a miniature bearing 73.
  • the motor is held by a set screw in the mounting bracket 70.
  • The slip display 61 was attached to the tip of each of the index finger force feedback apparatus 62 and the thumb force feedback apparatus 63.
  • Force reflection to the index finger is provided by a modified two-degree-of-freedom SAFIRE (Sensing And Force Reflecting Exoskeleton, patent application serial #07/961,259, submitted 10/15/92) prototype, which consists of two cable-driven linkages 65, with the two motors 66 mounted on the forearm. The axes of rotation of the linkages are co-located with the PIP and MCP joints of the index finger.
  • Force reflection to the thumb is provided by a one-degree-of-freedom device 67, also driven by cables 69 from a remote motor mounted on the forearm 68.
  • Vertical resistance, and position and orientation in space, are provided by a boom 64.
  • the vertical force reflection axis provided up to 2 lbs of virtual weight at the hand. It also supported the weight of the SAFIRE through counterbalancing.
  • the stiffness of the vertical boom up to the SAFIRE attachment bracket exceeded 140 lb/in, the minimum stiffness required for a cantilevered beam to feel rigid to a human operator.
  • the resolution of the vertical and horizontal position sensors for the hand and index finger was 0.005 inch. Two linear ways were mounted on opposite sides of the vertical support structure, which was rigidly attached to a heavy base.
  • the drive mechanism 201 for the vertical degree-of-freedom was located on the back of the support structure 200.
  • the motor providing vertical force reflection was horizontally mounted on the slider riding in the rear linear way.
  • a threaded capstan was directly mounted on the shaft of the motor.
  • a 7x7, 0.032 inch diameter steel cable was wound 3-4 turns around the capstan.
  • One end of the cable was terminated on a cable tensioner at the top of the structure.
  • the rear slider was connected through a second cable and a set of pulleys to a plate mounted on two sliders riding in a second linear way, mounted on the front side of the vertical support structure.
  • Using two sliders instead of one greatly improved the twisting and bending load capacity of the front linear way.
  • the two sliders were mounted on the plate through two pivots to minimize bending moments applied to the front linear way.
  • The linkage consisted of a distal block attached to the front slider plate, a proximal block, and a 12 inch long rod.
  • the distal block housed the bearings and the encoder for the first passive pivot, also called the distal pivot.
  • One end of the rod was pivoted about this block so that the linkage rotated freely in the horizontal plane.
  • the other end of the rod was pivoted about a second bearing/encoder housing, also called the proximal pivot.
  • the proximal mounting block was attached to the SAFIRE base with a rigid bracket. The length of this bracket was designed to position the hand so that the proximal pivot was between the tip of the index finger and the thumb.
  • The motor for vertical force reflection was selected based on the power requirements. Ideally the motor should produce a force of 2 lbs at the hand, and it is estimated the hand could move at a maximum vertical speed of 8 in per sec. A 90W motor without a gearhead was selected to provide smooth operation with minimal friction. The slip display motor is designed to produce a force that exceeds the corresponding frictional force: if the frictional coefficient between the finger and the display is estimated at 0.5, the tactile display must be able to produce 1 lb of force. If the display moved past the finger at 1 in/s, the required power would be around 0.112W. A motor producing 0.25-0.5 W would therefore be a suitable actuator for this display. The MicroMo Series 1016 motor fits these requirements; it is 0.384 inch in diameter, 0.63 inch long and weighs 0.23 oz. It is mounted in the space beyond the fingertip.
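The quoted power figure can be checked with P = F x v and standard unit conversions (1 lbf = 4.448 N, 1 in = 0.0254 m); the short program below reproduces the arithmetic, giving about 0.113 W for 1 lbf of force at 1 in/s.

    #include <stdio.h>

    /* Check of the slip-display power estimate: P = F * v. */
    int main(void)
    {
        double force_n = 1.0 * 4.448;       /* 1 lbf of display force */
        double vel_ms  = 1.0 * 0.0254;      /* 1 in/s past the finger */
        double power_w = force_n * vel_ms;  /* about 0.113 W          */
        printf("required power: %.3f W\n", power_w);
        return 0;
    }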
  • The hand haptic system is interfaced to the world modeler, in this case implemented on a graphics rendering PC, through a local processor 80, in this case a VME card cage which houses a TI C30 DSP board 81, I/O boards 82, 83, and 84, and a BiT3 VME-ISA bus adapter 85.
  • the physical simulation and device drivers reside on a TI C30 floating point DSP board 81.
  • the DSP board is installed in a multislot VME cage which also houses the I/O electronics.
  • The simulation is interrupt driven at an update rate of 300 Hz, the minimum performance level necessary for a stable and realistic simulation. Higher performance could be gained by parsing out the simulation tasks among several DSP processors and executing them in parallel. For example, communication tasks with the graphics PC and boom position sensing could be handled by a different processor than the main simulation loop.
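A skeleton of one interrupt-driven simulation step in C; the handler name and the I/O routines are hypothetical stand-ins for the DSP vendor's actual interfaces, and the six-output, eight-input sizing follows the minimum configuration described below.

    /* Hypothetical 300 Hz interrupt-driven simulation step. All work
       must finish within the 1/300 s tick to keep the time step uniform. */
    extern void read_encoders(double *q, int n);
    extern void simulate_physics(const double *q, double *torque, int n);
    extern void write_dacs(const double *torque, int n);

    void timer_isr(void)
    {
        double q[8], torque[6];
        read_encoders(q, 8);            /* eight input DOF */
        simulate_physics(q, torque, 6); /* fixed time step */
        write_dacs(torque, 6);          /* six output DOF  */
    }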
  • the DSP simulation communicates with the graphics PC via a BiT3 VME-ISA bus adapter.
  • This high speed communications channel handles graphics information generated by the simulation and simulation parameters generated by the user interface.
  • the user interface starts the standard simulation demonstration and provides program status information. Since the user interface and graphics rendering can be somewhat asynchronous to the simulation, graphics and PC system latencies will not disturb the simulation which depends on a uniform time step. Also the bandwidth required for this information is low enough to not require a dedicated processor.
This software approach requires the simulation and graphics to communicate object characteristics upon startup of the simulation, so that a minimum of data is exchanged during run time. Data from the simulation need not be transmitted to the graphics PC any faster than the graphics refresh rate.
  • the joint positions are sensed with digital position encoders.
  • The joint torques and the slip applied to the touch display are voltage controlled by the simulation via six 12-bit digital-to-analog converters and external current-mode motor amplifiers.
  • the encoder counters and digital to analog converters reside on the high-speed VME bus with the DSP board and are directly accessed by the DSP.
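Converting a commanded torque to a 12-bit code is a small scaling step; the bipolar full-scale convention assumed here is illustrative, not a specification from the patent.

    /* Map a commanded torque to a 12-bit DAC code (0..4095), assuming
       full scale corresponds to +/- max_torque. */
    unsigned int torque_to_dac(double torque, double max_torque)
    {
        double norm = torque / max_torque;   /* -1 .. +1 */
        if (norm >  1.0) norm =  1.0;
        if (norm < -1.0) norm = -1.0;
        return (unsigned int)((norm + 1.0) * 0.5 * 4095.0 + 0.5);
    }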
  • the generality of the control electronics and overall architecture makes the system adaptable to other force reflecting masters.
  • the minimum configuration can handle six output and eight input degrees of freedom, and can be easily expanded to any reasonable number by adding I/O modules and, if necessary, DSP processors.
  • the simulation can be developed on the graphics PC and debugged with tools available for the PC/BiT3 development environment. Porting the simulation to the DSP is then relatively straightforward. In fact, the existing simulation can be debugged on the PC and then recompiled for the DSP with only minor changes. Hardware based debugging tools are available for the DSP, although the DSP/PC interface provides a relatively powerful stethoscope for all but the most low level debugging or obscure compiler/hardware bugs.
  • the DSP executable and source code resides on the PC and is downloaded to the DSP at run time.
  • This system and method can be applied to a wide variety of physical embodiments including simpler systems and more complex and capable systems with more feedback modes utilized.
  • An alternate embodiment utilizes the hand haptic system in conjunction with an exoskeletal arm mechanism combined with a vision display.
  • the exoskeleton arm consists of two interconnected mechanisms, one for the shoulder and the other for the elbow and forearm as shown in Figure 10.
  • The shoulder mechanism 90 consists of a three DOF motor driven gimbal, a passive prismatic joint (spline), a 2 DOF passive gimbal and an upper arm cuff.
  • the elbow mechanism 92 is connected to the shoulder mechanism 90 at the distal end of the spline 91. This allows the system to accommodate changes in the distance between the shoulder 90 and elbow mechanisms 92.
  • the spline also eliminates the need for adjustment for different upper arm lengths.
  • the center of the distal gimbal 93 is located on the axis of the upper arm E/I rotation.
  • the elbow F/E motor 94 is pivotally jointed to both the upper 95 and lower 96 cuffs.
  • the final step was to use a brake 97 at the distal gimbal of the shoulder mechanism to ground the elbow/forearm mechanism.
  • the elbow mechanism has two DOF. The first DOF is aligned with the axis of elbow F/E. The second DOF is placed along the forearm with a remote center drive mechanism.
  • The three motors 100, 101, and 102 at the shoulder gimbal can be coordinated to apply a force at the upper arm which is approximately perpendicular to and intersects with the longitudinal axis of the upper arm. This force creates a torque about the axes of shoulder F/E and Ab/Ad.
  • the motor at the elbow controls the Flexion/Extension torque about the elbow.
  • The rotational torques about the longitudinal axes of the upper and lower arms are controlled in two different ways depending on the position of the elbow joint. When the elbow is extended, the motor at the wrist applies rotational torque to the longitudinal axes of both the upper and lower arms, since their motions coincide.
  • Otherwise, the motors at the shoulder can be coordinated to apply a force at the forearm cuff to produce a torque about the upper arm E/I, and the RCD (remote center drive) applies a torque to the forearm.
  • This mechanism met all the force display requirements for a force reflective arm master and conformed to the anatomic motion of the arm.
  • A higher performance version can be achieved by using magnetic brakes in conjunction with two miniature motors or actuators.
  • the two magnetic particle brakes may be used to replace the first two motors at the shoulder gimbal.
  • In order to use brakes, the mechanism must be fully balanced about these two axes, because brakes are passive and cannot lift.
  • the miniature actuators or motors must be put in series with, and parallel to, the two brakes for active force control. One way to achieve this is to activate the 2 DOF passive gimbal at the upper arm cuff with such miniature motors.
  • the two miniature actuators or motors can display small forces to the operator at very fine resolutions.
  • The video sense is accommodated via a head mounted display or remote video screen, with accompanying processing. These can communicate with each haptic processor, which takes in data from the physical parameters of the sense being displayed, e.g. the position of the fingers for force feedback. Each haptic processor applies the appropriate feedback, and communicates its part of the world view to the other processors.
  • The communication channels can be fiber optic, infrared, wired, or whatever medium is appropriate to the twin demands of bandwidth and portability. Adding another sense is as straightforward as adding another sense module, up to the limitations of inter-module bandwidth. System autoconfiguration is a straightforward function of the software. Controllers for the modules could be general purpose digital signal processors for maximum flexibility, or specialized processors, e.g. application specific integrated circuits, for minimum cost.
  • Vision sense information is provided to the operator through a set of goggles containing vision displays.
  • One such device is the Vision Immersion Module manufactured by Kaiser Electro-Optics in Carlsbad, CA 92008. Sound may also be provided to the operator through a headset.
  • the complete system is worn by the human operator through a backpack mechanism which also contains the electronic and power systems.
  • the resulting system is a self-supporting exoskeleton, requiring no precise alignment to conform to the anatomical movement of the arm, and is thus comfortable to wear.
  • The device is comfortable to use because it applies only normal forces to the arm: no twist, no shear to skin or bones. It is also easy to don and doff because the entire mechanism is supported through the backpack.
  • the complete system communicates to the world modeler through a cable which also provides power to the system.
  • the system can also be self contained by providing power through batteries and communications through an RF or IR link.

Abstract

A system for sensing and displaying sensory information to a human operator comprising a world modeler adapted to receive sensory input from the operator over a communications medium. The world modeler modifies either a virtual world which is a mathematical model of an environment, or modifies a real world which is a remote real environment. The world modeler then sends sensory information from the real or virtual environments over the communication medium to the haptic system. The haptic system displays this sensory information to the human operator. The result is that the operator is able to manipulate a real or virtual environment remotely through the information sensed and displayed by the haptic system. The method provides for compression of the haptic information, reducing the bandwidth required to transmit sensory information between the haptic system and the world modeler.

Description

INTELLIGENT REMOTE MULTIMODE SENSE AND DISPLAY SYSTEM UTILIZING HAPTIC INFORMATION COMPRESSION
Background of the Invention
The present invention relates generally to a system and method of providing haptic (position, orientation, force, & tactile) feedback to the human operator of a remote or virtual environment.
Devices and systems are well known for measuring or monitoring one or more characteristics of an anatomical part (see for example U.S. Patent Nos. 4,986,280, 3,258,007 and 3,364,929). A human controlled position sensing device for use in the field of robotics is also well known (see for example, U.S. Patent Nos. 4,328,621, 4,534,694, 4,608,525, and 4,674,048).
Force Feedback Systems
Systems are also well known for providing force feedback to human operators. Force-reflective devices exist in a variety of forms, from joysticks to arm and hand controllers. These devices serve as input devices and haptic interfaces for virtual environment manipulation, or teleoperation. A significant amount of research has been conducted on robotic hands and force-reflecting controllers for robotic arm manipulation. While a few controllers were anthropomorphic, most were typically tailored to imitate the slave robot's kinematics. The advantage of anthropomorphic controllers is that they imitate the kinematics of the human hand and arm, thus possessing the largest range of human operator (user) motion. Yet these past anthropomorphic designs were unable to achieve high quality force feedback while retaining minimal actuator size, a requirement for system portability. So while many forms of force-feedback devices have been developed, few completely anthropomorphic systems exist. The systems tended to be massive and bulky, and required large centralized computational resources tightly coupled to the sensory system, further preventing an efficient integration of the system to the human operator. Finally, the systems were non-modular, requiring extensive work to add additional haptic sense and display devices.
Tactile Systems
Tactile displays have been the subject of much research activity in the past thirty years. While neuroscientists studied the physiological roots of human tactile perception, engineers were interested in tactile display devices that substituted or augmented other senses such as sight and hearing. Tactile displays also demonstrated their usefulness in virtual reality simulations and telerobotic operations.
A system utilizing tactile display devices provides the teleoperator with additional information, such as contact locations, that could not be adequately displayed by force alone. Tactile displays also simplify the requirements placed upon force reflection. A combination of tactile display and force reflection is haptically more effective, technically less challenging and economically more feasible than force reflection alone. Therefore, a high performance force reflective exoskeleton master should incorporate tactile displays.
Tactile display technologies have slowly evolved in the past thirty years. Many different tactile display technologies are available today. For instance, air or water jet displays are used to trace outlines on the skin and are attractive because they eliminate the need for moving mechanical parts and because they exhibit a consistent frequency response. Air or water bladder displays are used to apply a continuous pressure on an area of skin but suffer from extremely low bandwidth.
Mechanical tactor displays include most of the vibro-tactile devices described in the literature, in addition to a few displays operated at a very low frequency that would not normally be considered vibrotactile. A few single-element display devices exist, but the vast majority of mechanical tactile displays involve tactor arrays. The number of elements in these arrays runs from two to 400. Almost all of the devices in this group involve indenting or impacting the skin perpendicular to the skin surface. Two methods of actuation dominate these displays: solenoid activated pins and cantilevered piezoelectric bimorph reeds with a pin mounted to the tip. Another method of actuation that has great potential is the use of shape memory alloys. As a current is passed through the alloy, it changes shape and deflects a spring that holds a tactor pin in place. Voice coil displays generate vibratory sensations by mounting a voice coil against the skin. Electrotactile displays elicit touch sensations by indiscriminately stimulating all the touch and pain sensors in the skin with a current passing through the skin. These devices usually involve a remote stimulator (which supplies the power and generates the stimulation waveform) and a number of electrodes mounted on the skin using a sticky dielectric. Each electrode pair constitutes a stimulation site. While some systems involve single or a small number of isolated electrode pairs, arrays of electrodes were also popular, with the number of elements running up to 1024 in one instance.
In all of these devices, the sensing or feedback element contains no intelligence or ability to interact with human stimuli itself. They are hooked to a computation engine which contains a world model that interprets the raw sensory information and then sends low level commands for behaviors to be achieved. There are many limitations to this approach, which have been apparent and much discussed in the research literature (e.g., papers by Sheridan and others on force feedback in the presence of time delays), along with elaborate methods of modeling system performance to overcome these problems. The more complex the feedback, the more extreme the problem becomes. In fact this has led the virtual reality research community in the opposite direction from this invention: to use larger and more powerful computational engines and very simple feedback modes, with limited performance success. Most of the systems used run on non real-time computation engines, which makes the problem of providing stable, realistic and predictable feedback even more difficult.
The current invention relates to a method and system for overcoming these limitations through development of an architecture which provides local intelligence and computation, coupled with a method of compressing haptic information through various protocols which reduce the amount of information that is shared between the world modeler and the haptic system. System speed is improved and performance is enhanced, making it feasible to employ multiple modes of haptic feedback simultaneously without overloading the communications link or basic computation engines. Since multiple modes of sensory feedback are utilized to reflect the real interactions between a human body part and the outside world, a realistic user interaction with a virtual or remote environment is achieved. Therefore, an object of the invention is to provide local processing to each sense and display device so as to provide real-time feedback to the human operator.
Another object of the invention is to provide a modular framework in which to extend the sensory feedback capabilities as required by the human operator.
Another object of the invention is to minimize the communications bandwidth required between the haptic system and the world modeler.
Another object of the invention is to provide an anthropomorphic system while retaining portability. Another object of the invention is to provide a system which can sense and/or display force, slip, vibration, shape, stiffness, friction, texture, pressure, mass, temperature, sound and electromagnetic radiation to a human operator. Additional objects and features of the invention will appear from the following description, in which the preferred embodiments are set forth in detail in conjunction with the accompanying drawings.
Summary of the Invention
A system for sensing and displaying sensory information to a human operator comprising a world modeler adapted to receive sensory input from the operator over a communications medium. The world modeler modifies either a virtual world which is a mathematical model of an environment, or modifies a real world which is a remote real environment. The world modeler then sends sensory information from the real or virtual environments over the communication medium to the haptic system. The haptic system displays this sensory information to the human operator. The result is that the operator is able to manipulate a real or virtual environment remotely through the information sensed and displayed by the haptic system. The haptic system is composed of sensory modules. The sensory modules are further composed of a controller (i.e. CPU, DSP or microcontroller unit), communications unit, and sensory and/or display devices. The display devices can include heating elements to convey temperature, voice coils to convey vibration, goggles to display visual information, motors to display force, brakes to display friction, speakers to convey sound and tactile displays to convey various senses of touch.
The sense devices can include encoders to convey position, strain gages for force, display device feedback to measure torque, thermocouples for temperature and cameras for vision.
Sensory information is compressed through various methods. One method is to transmit only those sensory cues required to convey the intended sense. Another method is to select only the sensory information required for a task. Further, the selected data may be compressed through conventional data compression techniques (i.e. JPEG for video).
The complete system may be scaled as needed. Minimal systems may only convey tactile information through a haptic hand. Larger systems may utilize exoskeletons to convey joint information and force to the system.
Brief Description of the Drawings
The invention is described and explained in more detail below using the embodiments shown in the drawings. The described and drawn features, in other embodiments of the invention, can be used individually or in preferred combinations. The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of a preferred embodiment of the invention, as illustrated in the accompanying drawings. FIGURE 1 is a block diagram of the present invention.
FIGURE 2 is a block diagram of the world modeler of the invention. FIGURE 3 is a block diagram of the haptic system of the invention. FIGURE 4 is a block diagram of variation of sensory modules of the invention. FIGURE 5 is a block diagram of the invention as implementing teleoperation.
FIGURE 6 is a front view of the hand embodiment of the invention. FIGURE 7 is a side view of an example tactile display used to convey slip. FIGURE 8 is a block diagram of the haptic system for the hand embodiment of the invention.
FIGURE 9 is a flow chart of the control software for the hand embodiment of the invention.
FIGURE 10 is a front view of the hand, arm and vision embodiment of the invention. Description of the Preferred Embodiment
The current invention relates to a method of overcoming past design limitations through development of an architecture which provides local intelligence and computation, in conjunction with compression of the haptic information through the use of protocols which reduce the amount of information needed to be shared between computation engines at a given instant. By utilizing local processing within the sensory units, the system speed and performance are enhanced, making it feasible to employ multiple modes of haptic feedback simultaneously without overloading the communications channel or basic computation engines. Multiple modes of feedback are necessary in order to have a realistic interaction between a human body part and a virtual or remote environment.
The present invention generally relates to a system architecture for providing sensory feedback to the human operator. In particular, touch feedback is necessary when the operator controls either a computer-generated virtual environment or a real, remote robotic environment.
Sensory Display and Sense
The sense of touch, or haptic sense, is complex. Unlike vision, touch involves a direct physical interaction, so that human actions cause and change what is perceived. It is the fundamental role of touch to sense the results of these contact interactions in order to guide manipulation and body motion. The current art contains only devices for presenting a crude tactile image to a location on the body. This type of information represents only a fraction of the total information sensed by the human sense of touch. People extract many different types of basic information about contacts through a combination of four different tactile sensory systems, measurement of joint motion, and measurement of joint torques.
This basic information can be organized broadly into passively acquired information and actively acquired information. Passively acquired information can be gathered without relative motion between the body and an object and without varying the forces between the body and the object. Actively acquired information requires either relative motion or varying contact forces. A summary of some of the basic types of cues or sensed information is given below.
Passively Acquired Identification Cues
Passively acquired identification cues can be obtained by the act of holding the object in the hand. Three major cues have been identified in this category:
1. Determining the local contact normal and curvature of the surface at each contact.
Determining the local contact curvature and labeling it as a point, edge, or planar contact is important for grasp acquisition and object identification. Different contact types have different grasp properties which affect stability; Salisbury, J.K. and Mason, M.T., Robot Hands and the Mechanics of Manipulation, MIT Press, Cambridge, MA (1985). In addition, the local contact normal and curvature provide a strong pruning heuristic rule for identifying objects and object pose; Grimson, W.E.L., Object Recognition by Computer: The Role of Geometric Constraint, MIT Press, Cambridge, MA (1990). Local object curvature can be measured passively by examining the normal contact force distribution.
2. Determining the surface texture at each contact.
Surface texture affects grasp stability. Rough textures are generally easier to grasp than smooth textures. In addition, surface texture can also be used as a pruning heuristic rule in identifying objects and object pose. Texture cues are produced both from the spatial distribution of the contact force and the dynamic vibration effects produced at the skin during motion.
3. Gross object shape.
By using the finger geometry and the location of the contact points in the hand, the shape of the object can be estimated. For example, an initial grasp might be a power grasp in order to gather a large amount of contact information about an object. Once the object and its shape are identified, a dexterous grasp might be used for manipulation.
Actively Acquired Identification Cues
With some local active exploration, properties relating the reaction forces to the applied force can also be determined:
1. Determining the local stiffness at each contact.
The local surface stiffness can be estimated by applying varying normal forces and measuring the change in contact area. Softer surfaces will produce a larger contact area for a given applied force. This is not a significant cue for manipulation of control surfaces and tools since almost all of the objects involved are rigid.
2. Determining local frictional properties.
The local friction, along with the local contact type, controls the local grasp stability. Friction can be measured by applying varying tangential forces and then detecting the onset of slip (slip detection is a separate basic manipulation cue; see item 4 below). This task also requires the ability to measure tangential forces.
3. Overall object stiffness.
By integrating the contact force over an area with the joint torques, and correlating this with joint deflection, the overall stiffness can be computed.
4. Detecting slip.
Detecting the onset of slip between an object and the hand is essential for grasp maintenance. Slip detection is used to determine the necessary grasp forces at each contact during all states of manipulation.
5. Determining object mass (weight), center of mass, and moments of inertia.
By manipulating a grasped object and measuring the resulting joint torques and net contact forces at the hands for different configurations, the weight, center of mass and moments of inertia of the object can be computed. This information can be used for object identification and pose determination; Siegal, D.M., Pose Determination of a Grasped Object Using Limited Sensing, Ph.D. Thesis, Department of Electrical Engineering, MIT, Cambridge, MA (1991). It can also be used for computing the necessary torques for throwing or manipulating the object.
6. Estimating directions of contact constraint.
Assembly is the process of bringing a moving part into a constrained relationship with a fixed part. In order to control the assembly process, the directions in which movement is constrained must be estimated. Contact constraints are estimated using measurements of the reaction forces as a function of position and by measuring the direction of impact forces.
7. Detecting changes in contact constraints. This is one of the most common tasks during manipulation.
The detents in switches, the termination of screw travel, and the impacts from mating two parts are all examples. The onset of the change can be detected by looking for unexplained impact forces and the direction of the impact force. Current tactile feedback devices present a flat contact image using either arrays of tactors or raster scan techniques. Appendix A includes a list of prior art devices whose teachings are herein incorporated by reference. This type of display can only be used for very local shape or texture display, and is limited to a flat presentation. A need exists for devices that display, in a realistic and natural fashion, the remaining cues described above (slip, contact curvature, gross texture) as well as cues such as vibration, for which devices already exist. Thus the object of the current invention is to provide a modular structure under which individual haptic cues or multiple cues can be effectively and efficiently presented to the human operator.
Haptic Compression
The bandwidth required to completely characterize and display haptic feedback of all senses would normally be prohibitive. Various forms of haptic compression are used to limit this bandwidth.
Haptic compression is based upon human perception. The compression algorithms ask: what information must be presented to perform this manipulation or recognition task? Haptic compression is based on sending a set of physical parameters and a number indicating the model type to the haptic display computer. Locally, the model number is used to pull up a stored model of a component of the task physics from the memory of the local haptic processor. The parameters then control the specific instantiation of that model. For example, perceiving a fine surface texture is done primarily through perception of spatial frequencies through vibration. By measuring relative velocity and having the spatial frequency stored as a two-direction Fourier spectrum, the texture display is able to simulate a complex texture. Compression occurs because (1) the spatial representation does not resolve all surface features, and (2) vibration is substituted for an array display. Display of curvature and surface normal is based on an analysis of the task which shows that only this information is necessary for performing the required manipulation task. The computer sends 2D curvature properties and one contact point to represent the surface to the haptic display computer. The display then takes care of rendering the curved surface. This is a first-order representation of the more complex shape. Since not all the information is transmitted, there is information compression.
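As a hedged illustration only, such a protocol message could be as small as a model identifier plus a short parameter vector. The following C sketch is hypothetical; the struct layout, field names and model identifiers are assumptions, not part of the disclosure:

```c
/* Hypothetical haptic-compression message: a stored-model identifier
 * plus a few parameters replaces the raw sensory data stream. */
#include <stdint.h>

enum model_type {
    MODEL_TEXTURE    = 1, /* params: two-direction spatial-frequency spectrum */
    MODEL_CURVATURE  = 2, /* params: 2D curvature and one contact point       */
    MODEL_STIFFNESS  = 3, /* params: contact stiffness K                      */
    MODEL_FRICTION   = 4, /* params: coefficient of friction mu               */
    MODEL_CONSTRAINT = 5  /* params: constraint parameters g for model C_i    */
};

struct haptic_msg {
    uint8_t model;     /* selects the stored task-physics model */
    uint8_t nparams;   /* number of valid entries in params[]   */
    float   params[8]; /* instantiates the selected model       */
};
```

A few bytes per contact thus stand in for kilohertz-rate streams of raw force and position samples; the local haptic processor expands the message against its stored models.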
For local stiffness, the haptic processor gets a number K which represents the contact stiffness. Based on this number it simulates a surface of stiffness K by applying the correct forces given a displacement relative to the display.
Like local stiffness, a slip haptic processor only needs the coefficient of friction mu. The slip processor can then measure the normal force and the tangential force to determine the current state of the slip feedback. In order to implement general contact constraints, a set of constraint equations C_i(g, x) is stored on the local processor. The constraint parameters g and the specific model type i are sent to the local processor in order to instantiate a specific constraint model. The local processor applies corrective forces to try to force the coordinates of the display, x, to obey this equation in the display.
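A minimal single-contact sketch of these local models, assuming the haptic processor receives only K and mu as described (function and variable names are illustrative):

```c
/* Local stiffness model: simulate a surface of stiffness K by applying
 * a restoring force proportional to penetration into the surface. */
float stiffness_force(float k, float penetration)
{
    return (penetration > 0.0f) ? -k * penetration : 0.0f;
}

/* Local slip model: with only mu known, compare the measured tangential
 * force against the friction cone to detect the onset of slip. */
int slip_detected(float mu, float f_normal, float f_tangential)
{
    return f_tangential > mu * f_normal;
}
```

Everything above runs in the local servo loop; the world modeler never sees the underlying force samples, only K and mu.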
Information may also be compressed through the removal of extraneous information. One method is to include only those senses appropriate to the task. A second method is psychophysical compression, i.e. including only the sense cues necessary to give the operator the ability to "feel" the characteristic being displayed. A third method is data compression prior to transmission across the communication channel. A fourth method is data thinning, e.g. communicating only the changes since the last transmission, or transmitting only when changes are large enough to warrant it. A fifth method is to partition the system so that the computation and display subsystems share a lower bandwidth communication channel.
Sense Selection
There may be no need to include temperature display if the task is one of simply grasping and moving an object of known temperature. There may also be no need to transmit with absolute fidelity all hand joint torques if the application requires only the input of the thumb and forefinger for acceptable results. Selection of the minimal number of senses required for convincing haptic feedback is the primary method of reducing communications traffic.
Psychophysical Considerations
Studies have shown that the human may only need a subset of sense cues to synthesize sensory feedback. Sense-mapping, or substituting one sense for another, can be used to feed back senses or other data sources otherwise difficult to display. An example is using a vibrotactile display to convey surface temperature through variations in vibration amplitude.
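For instance, a temperature-to-vibration sense mapping could reduce to the following sketch; the temperature range and the unit amplitude scale are assumed values:

```c
/* Map surface temperature onto vibrotactile amplitude. */
#define T_MIN  0.0f /* deg C, assumed lower bound */
#define T_MAX 60.0f /* deg C, assumed upper bound */

float temp_to_vibration(float temp_c)
{
    float a = (temp_c - T_MIN) / (T_MAX - T_MIN);
    if (a < 0.0f) a = 0.0f; /* clamp to the displayable range */
    if (a > 1.0f) a = 1.0f;
    return a;               /* normalized vibration amplitude */
}
```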
Data Compression
Various methods exist for brute-force data compression. The least efficient algorithms, on a percentage compression basis, are lossless: they completely and accurately reproduce the input data after decompression.
More efficient, lossy methods, such as fractal compression, reproduce the data with reasonable fidelity and require relatively small transmission bandwidth, but require some knowledge of the nature of the data being compressed.
The best protocol will depend on the psychophysics of the senses being displayed.
Data Thinning
Sending a continuous stream of absolute sensory data across the communications channel may be inefficient. Once the minimum bandwidth of the sense displayed is established, only state changes may need to be communicated. The channel can become event driven by transmitting only perceptible changes.
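In the simplest case such an event-driven channel reduces to a threshold test against the last transmitted value; the sketch below assumes a scalar sense channel and a known just-noticeable difference:

```c
#include <math.h>

/* Transmit a sample only when it differs from the last transmitted
 * value by more than a perceptual threshold (jnd). */
static float last_sent;

int should_transmit(float sample, float jnd)
{
    if (fabsf(sample - last_sent) > jnd) {
        last_sent = sample; /* record what the far end now knows */
        return 1;           /* perceptible change: send          */
    }
    return 0;               /* imperceptible: suppress           */
}
```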
System Configuration
The system may be partitioned so that a reasonable amount of computation is performed by the display driving electronics. Each sense can have its own processing, sharing and communicating a common knowledge of the "world" but reacting in ways unique to the sense being displayed. This approach is readily scalable due to its modularity. For example, the visual display hardware may be suited to fast graphics rendering but does not need the ability to compute other sense data. It only needs to be able to modify and display its world view given input data from other transducers and senses.
The prototypical haptic module would consist of a processor, memory, power and drive electronics, transducer interfaces, and a communications interface and controller. The software on board the haptic processor would include the kinematics mapping the sensor data to the display. If, for example, finger force feedback were desired, the module would query the finger's position sensors, map this data to joint angles and finger geometry in its world view, sense collisions with objects in its world view, and calculate resultant forces. It would communicate the updated world view to the other modules and apply the appropriate torques to the finger, knowing the configuration of the force feedback device. The system is tolerant of communication latencies because the feedback loop is local to the haptic processor.
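A skeletal version of that module loop might read as follows; every function here is a hypothetical placeholder for the module's sensor, kinematics, collision and drive layers:

```c
#define N_JOINTS 3 /* assumed joint count for one finger */

/* Hypothetical hardware/driver layer supplied by the module. */
void read_position_sensors(float pos[N_JOINTS]);
void update_world_view(const float pos[N_JOINTS]);
int  detect_collisions(void);
void compute_contact_forces(void);
void apply_joint_torques(void);
void broadcast_world_view(void);

/* The force-feedback loop runs entirely on the local haptic processor,
 * so world-model communication latency cannot destabilize it. */
void haptic_module_loop(void)
{
    float pos[N_JOINTS];
    for (;;) {
        read_position_sensors(pos);   /* query finger position sensors  */
        update_world_view(pos);       /* joint angles + finger geometry */
        if (detect_collisions())      /* contact with world objects?    */
            compute_contact_forces(); /* resultant forces at contacts   */
        apply_joint_torques();        /* drive the feedback device      */
        broadcast_world_view();       /* share state with other modules */
    }
}
```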
Scaling and Mapping Sensory Information
With the sensors and transducers available, the option of mapping one sense to another becomes feasible, and perhaps desirable in some cases. The ability to map physical parameters not normally sensed (for example, electromagnetic field strengths) to human senses is compelling and may even lead to more intuitive understanding of complicated phenomena. Sense data may be freely scaled, such that a remote robot manipulator could experience forces on the order of tons that scale to ounces on the haptic display. At the other end of the spectrum, microforces can be scaled to pounds at the haptic display, allowing micromanipulation with extremely fine resolution of control.
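Such scaling amounts to a simple gain between the remote and display frames, as in this sketch (the gain values in the comment are assumed examples):

```c
/* Linear force scaling between remote manipulator and haptic display.
 * A gain of 1.0f/32000 maps one ton (32,000 oz) down to one ounce;
 * a large gain maps micro-forces up to displayable magnitudes. */
float scale_force(float remote_force, float gain)
{
    return remote_force * gain;
}
```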
General System Architecture
Referring now to the drawings, wherein like numerals are used throughout the several views to identify like elements of the invention, there is disclosed a system for remote multimode sense and display as shown in Fig. 1. The haptic system 14 interacts with a user 16. Sensory input 19 from the user 16 is collected by the haptic system 14. This information is compressed by the haptic system 14 and transmitted over the communications link 12 to the world modeler 10. The world modeler 10 uses the information to manipulate either a real or virtual world. The world modeler monitors and measures the state of the world, transmitting display information over communications link 12 to haptic system 14, where the display information is converted to sensory output 18 for the user 16.
With reference to Fig. 2, the world modeler 10 of Fig. 1 may take two forms, 10a or 10b. The first form 10a utilizes a computer system 20 to create a mathematical model 22 of a virtual world. The second form of world modeler 10b utilizes a computer system 24 to sense information 25 and send control information 26 to remote input-output system 28. This remote input-output system 28 exists in a real environment. The haptic system 14 of Fig. 1 is further composed of at least one sensory module 30 as shown in Fig. 3. Each sensory module 30 may take a variety of forms. Fig. 4 shows examples of sensory modules 30a, 30b and 30c. A sensory module is composed of a controller 40, a communications unit 44, and at least one sense unit 42 or display unit 43. Sensory module 30a contains only one sense unit 42, sensory module 30b contains only one display unit 43, while sensory module 30c contains two sense units 42 and one display unit 43.
Sense unit 42 contains sensors to monitor sensory input from the user. For example, this sensory input may be force, position, pressure, or temperature information as experienced by the user. Display unit 43 contains displays to convey information, such as force, temperature, vibration, texture, friction, shape and stiffness, to the user. The result is that the user interacts with either a remote or virtual world through sensory input controlled by the haptic system.
Fig. 5 shows the system implementing teleoperation. A user 56 interacts with the remote world 50. The haptic system 14 gathers sensory input 52. This information is compressed by the haptic system 14 and transmitted over communications link 12 to the world modeler 10. The world modeler generates control information 26 from the compressed sensory input received from the communications link 12. The remote input-output unit 28, which may be a robot or mechanical arm, responds to the control information 26, thereby acting in the remote environment 50. The world modeler also takes sensory input 25 sensed by the remote input-output unit 28 from the remote environment 50. This sensory input 25 is compressed by the world modeler 10 and transmitted over communications link 12 to the haptic system 14. There the compressed sensory information is displayed 54 to the user 56. The result is that user 56 controls a remote input-output device 28 in a remote world 50.
Hand Haptic System
The hand haptic system is comprised of touch sensory modules, which are composed of a local processing element connected to display and sense hardware via wire or fiber optic connection. The sensory module is in turn connected to the world model via a wire or fiber optic link, or a communications link such as an ethernet port, serial port, bus connector such as a BiT3, parallel port, game port, or local area network, transmitting either direct information for simple or low speed applications or information which has undergone haptic information compression. The human operator controls the world model through interactions with the visual model information (displayed on a computer monitor) and the touch display hardware.
This results in the touch display hardware reacting and displaying touch information to the operator. The touch sensory module, because it has a local processor system, reacts directly to the human stimulus and thus can interact at the high speeds required for the feedback to feel realistic, even in the face of rapidly changing stimuli from the human operator. This reaction can be completely independent of the world model and visual display, and thus is not slowed by a slow communications link; it can therefore run at speeds upwards of 1 kHz if necessary. This is made possible by the architecture of the local processor unit. This processor can be a single chip such as a DSP or microcontroller, a board with multiple chips, or a computer, depending upon the number of sensor and display elements contained in the touch display hardware and the complexity of the model.
The haptic compression is achieved by utilizing the basic research into human perception of touch described above to determine which touch cues are being utilized for a given type of manipulation. For example, when an object is grasped, the joint torque cues as well as slip information are used to ensure that the object is stably grasped. The algorithms in the local processor determine the type of cues required and their magnitude, reducing the amount of information which needs to be transmitted to the other system components. The world model only needs to know the resulting object status (i.e., position and orientation, or forces acting on the object, depending upon how the world model is structured), so the information content of the transmitted signals is greatly reduced, improving the overall performance of the system.
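Concretely, the transmitted payload might shrink to a compact object-status record such as this hypothetical sketch; the field choice depends on how the world model is structured:

```c
/* Resulting object status reported to the world model in place of raw
 * joint-torque and slip-sensor streams. */
struct object_status {
    float position[3];    /* object position in the world frame */
    float orientation[4]; /* object orientation as a quaternion */
    float net_force[3];   /* net force acting on the object     */
    int   grasp_stable;   /* 1 if torque and slip cues indicate
                             a stable grasp                     */
};
```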
In the embodiment shown in Fig. 6, three sense and display elements were used: slip on the finger tips, force feedback to the index finger and thumb, and vertical resistance. In addition, the device provides position and orientation of each of the system elements. In this embodiment a simple slip display, shown in Fig. 7, was developed as disclosed in U.S.S.N. 08/187,646, entitled "Multimode Feedback Display Technology," filed January 27, 1994 (herein incorporated by reference). The slip display consists of a delrin cylinder 71 with a textured surface, directly mounted on the shaft of a miniature gear motor 72. The cylinder is supported on the other side by a miniature bearing 73. The motor is held by a set screw in the mounting bracket 70.
In Fig. 6, the slip display 61 was attached to the tip of each of the index finger force feedback apparatus 62 and the thumb force feedback apparatus 63. Force reflection to the index finger is provided by a modified two-degree-of-freedom SAFIRE (Sensing and Force Reflecting Exoskeleton, patent application serial no. 07/961,259, filed 10/15/92) prototype, which consists of two cable-driven linkages 65, with the two motors 66 mounted on the forearm. The axes of rotation of the linkages are co-located with the PIP and MCP joints of the index finger. Force reflection to the thumb is provided by a one-degree-of-freedom device 67, also driven by cables 69 from a remote motor mounted on the forearm 68.
The vertical resistance and the position and orientation in space are provided by a boom 64. The vertical force reflection axis provided up to 2 lbs of virtual weight at the hand. It also supported the weight of the SAFIRE through counterbalancing. The stiffness of the vertical boom up to the SAFIRE attachment bracket exceeded 140 lb/in, the minimum stiffness required for a cantilevered beam to feel rigid to a human operator. The resolution of the vertical and horizontal position sensors for the hand and index finger was 0.005 inch. Two linear ways were mounted on opposite sides of the vertical support structure, which was rigidly attached to a heavy base.
The drive mechanism 201 for the vertical degree-of-freedom was located on the back of the support structure 200. The motor providing vertical force reflection was horizontally mounted on the slider riding in the rear linear way. A threaded capstan was directly mounted on the shaft of the motor. A 7x7, 0.032 inch diameter steel cable was wound 3-4 turns around the capstan. One end of the cable was terminated on a cable tensioner at the top of the structure. As the motor rotated, the cable hoisted the slider up and down the rear linear way. The rear slider was connected through a second cable and a set of pulleys to a plate mounted on two sliders riding in a second linear way, mounted on the front side of the vertical support structure. Using two sliders instead of one greatly improved the twisting and bending load capacity of the front linear way. The two sliders were mounted on the plate through two pivots to minimize bending moments applied to the front linear way.
Vertical forces were transmitted to the SAFIRE by a horizontally mounted linkage. The linkage consisted of a distal block attached to the front slider plate, a proximal block, and a 12 inch long rod. The distal block housed the bearings and the encoder for the first passive pivot, also called the distal pivot. One end of the rod was pivoted about this block so that the linkage rotated freely in the horizontal plane. The other end of the rod was pivoted about a second bearing/encoder housing, also called the proximal pivot. The proximal mounting block was attached to the SAFIRE base with a rigid bracket. The length of this bracket was designed to position the hand so that the proximal pivot was between the tip of the index finger and the thumb.
The motor for vertical force reflection was selected based on the power requirements. Ideally the motor should produce a force of 2 lbs at the hand, and it is estimated the hand could move at a maximum vertical speed of 8 in per sec. We selected a 90W motor without a gearhead to provide smooth operation with minimal friction. The slip display motor must produce a force that exceeds the corresponding frictional force: if the frictional coefficient between the finger and the display is estimated at 0.5, the tactile display must be able to produce 1 lb of force. If the display moved past the finger at 1 in/s, the required power would be around 0.112 W. A motor producing 0.25-0.5 W would therefore be a suitable actuator for this display. The MicroMo Series 1016 motor fits these requirements; it is 0.384 inch in diameter, 0.63 inch long and weighs 0.23 oz. It is mounted in the space beyond the fingertip.
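These power figures follow directly from P = Fv with standard unit conversions:

```latex
P = F\,v:\qquad
P_{\mathrm{slip}} = (1\ \mathrm{lbf})(1\ \mathrm{in/s})
                  = (4.448\ \mathrm{N})(0.0254\ \mathrm{m/s})
                  \approx 0.113\ \mathrm{W},
\qquad
P_{\mathrm{vert}} = (2\ \mathrm{lbf})(8\ \mathrm{in/s})
                  = (8.896\ \mathrm{N})(0.2032\ \mathrm{m/s})
                  \approx 1.81\ \mathrm{W}.
```

The first result matches the 0.112 W quoted above for the slip display; the 90 W vertical motor far exceeds the roughly 1.8 W of mechanical output required because it is sized for smooth, gearless operation rather than for output power.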
The hand haptic system is interfaced to the world modeler, in this case implemented on a graphics rendering PC, through a local processor 80, in this case a VME card cage which houses a TI C30 DSP board 81, I/O boards 82, 83, and 84, and a BiT3 VME-ISA bus adapter 85. The physical simulation and device drivers reside on the TI C30 floating point DSP board 81. The DSP board is installed in a multislot VME cage which also houses the I/O electronics. The simulation is interrupt driven at a rate of 300 Hz, the minimum performance level necessary for a stable and realistic simulation. Higher performance could be gained by parsing out the simulation tasks among several DSP processors and executing them in parallel. For example, communication tasks with the graphics PC and boom position sensing could be handled by a different processor than the main simulation loop.
The DSP simulation, as shown in Fig. 9, communicates with the graphics PC via the BiT3 VME-ISA bus adapter. This high speed communications channel handles graphics information generated by the simulation and simulation parameters generated by the user interface. The user interface starts the standard simulation demonstration and provides program status information. Since the user interface and graphics rendering can be somewhat asynchronous to the simulation, graphics and PC system latencies will not disturb the simulation, which depends on a uniform time step. Also, the bandwidth required for this information is low enough to not require a dedicated processor. This software approach requires the simulation and graphics to communicate a significant amount of object characteristics only upon startup of the simulation, so that a minimum of data is exchanged during run time. Data from the simulation need not be transmitted to the graphics PC any faster than the graphics refresh rate; for example, at a 60 Hz refresh rate and a 300 Hz simulation rate, new data would be needed at most every fifth integration time step. Sensing the joint positions and orientations and applying appropriate forces and slip requires additional I/O electronics. The joint positions are sensed with digital position encoders. The joint torques and slip applied to the touch display are voltage controlled by the simulation via six 12-bit digital to analog converters and external current mode motor amplifiers. The encoder counters and digital to analog converters reside on the high-speed VME bus with the DSP board and are directly accessed by the DSP. The generality of the control electronics and overall architecture makes the system adaptable to other force reflecting masters. The minimum configuration can handle six output and eight input degrees of freedom, and can be easily expanded to any reasonable number by adding I/O modules and, if necessary, DSP processors.
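The torque-to-voltage path described here amounts to scaling a commanded torque into 12-bit DAC counts; a hedged sketch, with assumed scale factors:

```c
#include <stdint.h>

#define DAC_MAX  4095   /* 12-bit full scale                     */
#define V_FS     10.0f  /* assumed full-scale output voltage     */
#define V_PER_NM  2.0f  /* assumed amplifier gain, volts per N-m */

/* Convert a commanded joint torque to an offset-binary DAC code for
 * the external current-mode motor amplifier. */
uint16_t torque_to_dac(float torque_nm)
{
    float v = torque_nm * V_PER_NM;
    if (v >  V_FS) v =  V_FS; /* saturate at the converter limits */
    if (v < -V_FS) v = -V_FS;
    return (uint16_t)((v + V_FS) * DAC_MAX / (2.0f * V_FS));
}
```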
Almost all of the application graphics and DSP code is written in C. The simulation can be developed on the graphics PC and debugged with tools available for the PC/BiT3 development environment. Porting the simulation to the DSP is then relatively straightforward. In fact, the existing simulation can be debugged on the PC and then recompiled for the DSP with only minor changes. Hardware based debugging tools are available for the DSP, although the DSP/PC interface provides a relatively powerful stethoscope for all but the most low level debugging or obscure compiler/hardware bugs. The DSP executable and source code resides on the PC and is downloaded to the DSP at run time.
This system and method can be applied to a wide variety of physical embodiments including simpler systems and more complex and capable systems with more feedback modes utilized.
Hand/Arm/Vision Haptic System
An alternate embodiment utilizes the hand haptic system in conjunction with an exoskeletal arm mechanism combined with a vision display. The exoskeleton arm consists of two interconnected mechanisms, one for the shoulder and the other for the elbow and forearm as shown in Figure 10.
The shoulder mechanism 90 consists of a three-DOF motor-driven gimbal, a passive prismatic joint (spline), a two-DOF passive gimbal and an upper arm cuff. The elbow mechanism 92 is connected to the shoulder mechanism 90 at the distal end of the spline 91. This allows the system to accommodate changes in the distance between the shoulder mechanism 90 and the elbow mechanism 92. The spline also eliminates the need for adjustment for different upper arm lengths.
The center of the distal gimbal 93 is located on the axis of the upper arm E/I rotation. The elbow F/E motor 94 is pivotally jointed to both the upper 95 and lower 96 cuffs. A brake 97 at the distal gimbal of the shoulder mechanism grounds the elbow/forearm mechanism. The elbow mechanism has two DOF: the first is aligned with the axis of elbow F/E, and the second is placed along the forearm with a remote center drive (RCD) mechanism.
The three motors 100, 101, and 102 at the shoulder gimbal can be coordinated to apply a force at the upper arm which is approximately perpendicular to, and intersects with, the longitudinal axis of the upper arm. This force creates a torque about the axes of the shoulder F/E and Ab/Ad. The motor at the elbow controls the flexion/extension torque about the elbow. The rotational torques about the longitudinal axes of the upper and lower arms are controlled in two different ways depending on the position of the elbow joint. When the elbow is extended, the motor at the wrist applies rotational torque to the longitudinal axes of both the upper and lower arms, since their motions coincide. When the elbow is flexed, the motors at the shoulder can be coordinated to apply a force at the forearm cuff to produce a torque about the upper arm E/I, and the RCD applies a torque to the forearm.
The biggest advantage of this design is that it conforms to the anatomic motions of the entire arm. It imposes no kinematic constraints on the arm, because the shoulder mechanism has full 6 DOF and the elbow F/E mechanism forms a 4-bar linkage with the elbow F/E joint. Therefore, it is very comfortable to wear. It also offers a large range of motion (ROM). It not only applies forces to the entire arm grounded to the back, but also torques to individual joints. The former allows simulation of picking up an object by bracing between the forearm and the upper arm or between the entire arm and the chest. The latter can simulate contacts with objects in a virtual environment by any portion of the arm, such as the hand, forearm and upper arm. In summary, this mechanism offers all the force display requirements for a force reflective arm master and conforms to the anatomic motion of the arm. A higher performance version can be achieved by using magnetic brakes in conjunction with two miniature motors or actuators. The two magnetic particle brakes may be used to replace the first two motors at the shoulder gimbal. In order to use brakes, the mechanism must be fully balanced about these two axes, because brakes are passive and cannot lift. In addition, the miniature actuators or motors must be put in series with, and parallel to, the two brakes for active force control. One way to achieve this is to actuate the two-DOF passive gimbal at the upper arm cuff with such miniature motors. With the brakes locked, the two miniature actuators or motors can display small forces to the operator at very fine resolutions. To simulate a hard contact, simply lock the brakes and drive the two miniature motors to their respective hard stops so that the braking forces can be felt by the upper arm. For a force in between these two extremes, a combination of pulse width modulation (PWM) on the brakes and force control on the miniature motors can be used simultaneously.
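One hedged reading of that blended strategy, for a single brake/motor axis (the thresholds and the duty-cycle law are assumptions, not disclosed values):

```c
#define F_FINE 1.0f /* assumed max force from the miniature motors */
#define F_HARD 8.0f /* assumed force treated as hard contact       */

/* Split a desired contact force between brake PWM duty cycle and
 * miniature-motor force, following the three regimes described above. */
void command_force(float f_desired, float *brake_duty, float *motor_force)
{
    if (f_desired >= F_HARD) {        /* hard contact               */
        *brake_duty  = 1.0f;          /* lock the brakes            */
        *motor_force = F_FINE;        /* motors to their hard stops */
    } else if (f_desired <= F_FINE) { /* fine force display         */
        *brake_duty  = 0.0f;
        *motor_force = f_desired;     /* motors alone suffice       */
    } else {                          /* intermediate: blend        */
        *brake_duty  = (f_desired - F_FINE) / (F_HARD - F_FINE);
        *motor_force = F_FINE;
    }
}
```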
The video sense is accommodated via a head mounted display or remote video screen, with accompanying processing. These can communicate with each haptic processor, which takes in data from the physical parameters of the sense being displayed, e.g. the position of the fingers related to force feedback. Each haptic processor applies the appropriate feedback and communicates its part of the world view to the other processors. The communication channels can be fiber optic, infrared, wired, or whatever medium is appropriate to the twin demands of bandwidth and portability. Adding another sense is as straightforward as adding another sense module, up to the limitations of inter-module bandwidth. System autoconfiguration is a straightforward function of the software. Controllers for the modules could be general purpose digital signal processors for maximum flexibility, or specialized processors, e.g. application specific integrated circuits, for minimum cost. Vision sense information is provided to the operator through a set of goggles containing vision displays; such a device is the Vision Immersion Module manufactured by Kaiser Electro-Optics in Carlsbad, CA 92008. Sound may also be provided to the operator through headsets.
Portable System
The complete system is worn by the human operator through a backpack mechanism which also contains the electronic and power systems. The resulting system is a self-supporting exoskeleton, requiring no precise alignment to conform to the anatomical movement of the arm, and is thus comfortable to wear. The device is comfortable to use because it applies only normal forces to the arm: no twist, no shear to skin or bones. It is also easy to don and doff because the entire mechanism is supported through the backpack. The complete system communicates with the world modeler through a cable which also provides power to the system. The system can also be self contained by providing power through batteries and communications through an RF or IR link.
The techniques of haptic compression, combined with modern power electronics and digital signal processing, make it possible to construct a totally portable haptic display system. Such a system can be used to explore virtual worlds in architectural applications, control remote mobile robots in space or nuclear applications, or traverse any number of databases to explore them through haptic feedback. While the invention has been particularly shown and described with reference to the preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention. What is claimed is:

Claims

1. A system for sensing and displaying sensory information to a user comprising: a world modeler adapted to receive first sensory information and transmit second sensory information, a communications medium adapted to carry the first sensory information and the second sensory information, a haptic system adapted to sense the first sensory information and transmit the first sensory information on the communications medium to the world modeler, adapted to receive the second sensory information transmitted on the communications medium by the world modeler and further adapted to display the second sensory information, whereby the system senses sensory information from the user and displays sensory information to the user.
2. The system of claim 1 wherein the world modeler is a computer system, the computer system adapted to create a mathematical model of an environment, the computer system further adapted to manipulate the mathematical model of an environment based upon the received first sensory information, the computer system further adapted to create the second sensory information based upon the mathematical model of an environment and the first sensory information, whereby the computer system creates an artificial environment with which the user interacts.
3. The system of claim 1 wherein the world modeler is a computer system and a remote input-output device, the remote input-output device adapted to sense third sensory information, the computer system adapted to manipulate the third sensory information to create the second sensory information, the computer system further adapted to manipulate the first sensory information to create fourth sensory information, the remote input-output device adapted to respond to the fourth sensory information, whereby the remote input-output device is interactively controlled through the computer system by the user.
4. The system of claim 1, 2 or 3 wherein the haptic system comprises: at least one sensory module adapted to: sense direct sensory information, process the direct sensory information into the first sensory information, and transmit the first sensory information, whereby the sensory module communicates the direct sensory information processed by the sensory module to the world modeler.
5. The system of claim 4 wherein the sensory module is further adapted to: receive the second sensory information, process the second sensory information into direct display information, and display the direct display information, whereby the sensory module displays the direct display information processed by the sensory module to the user.
PCT/US1995/001227 1994-01-27 1995-01-27 Intelligent remote multimode sense and display system utilizing haptic information compression WO1995020788A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US18788094A 1994-01-27 1994-01-27
US08/187,880 1994-01-27

Publications (1)

Publication Number Publication Date
WO1995020788A1 true WO1995020788A1 (en) 1995-08-03

Family

ID=22690879

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1995/001227 WO1995020788A1 (en) 1994-01-27 1995-01-27 Intelligent remote multimode sense and display system utilizing haptic information compression

Country Status (1)

Country Link
WO (1) WO1995020788A1 (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1992005519A1 (en) * 1990-09-25 1992-04-02 Advanced Robotics Research Limited Manipulator interaction simulation system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SATO M. ET AL: "Space Interface Device for Artificial Reality - SPIDAR", SYSTEMS & COMPUTERS IN JAPAN, vol. 23, no. 112, NEW YORK US, pages 44 - 54, XP000380344 *
SHIMOGA K. B.: "A Survey of Perceptual Feedback Issues in Dexterous Telemanipulation : Part I. Finger Force Feedback", IEEE VIRTUAL REALITY ANNUAL SYMPOSIUM, SEATTLE, WA, pages 263 - 270, XP000457696 *
SHIMOGA K. B.: "A Survey of Perceptual Feedback Issues in Dexterous Telemanipulation : Part II. Finger Touch Feedback", IEEE VIRTUAL REALITY ANNUAL SYMPOSIUM, pages 271 - 279, XP000457697 *

Cited By (115)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE40891E1 (en) 1991-11-26 2009-09-01 Sandio Technology Corp. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US5729249A (en) * 1991-11-26 1998-03-17 Itu Research, Inc. Touch sensitive input control device
US5805137A (en) * 1991-11-26 1998-09-08 Itu Research, Inc. Touch sensitive input control device
US6597347B1 (en) 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US6131097A (en) * 1992-12-02 2000-10-10 Immersion Corporation Haptic authoring
US6433771B1 (en) 1992-12-02 2002-08-13 Cybernet Haptic Systems Corporation Haptic device attribute control
US6366273B1 (en) 1993-07-16 2002-04-02 Immersion Corp. Force feedback cursor control interface
US6057828A (en) * 1993-07-16 2000-05-02 Immersion Corporation Method and apparatus for providing force sensations in virtual environments in accordance with host software
US6219033B1 (en) 1993-07-16 2001-04-17 Immersion Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
USRE42183E1 (en) 1994-11-22 2011-03-01 Immersion Corporation Interface control
US6400352B1 (en) 1995-01-18 2002-06-04 Immersion Corporation Mechanical and force transmission for force feedback devices
US6246390B1 (en) 1995-01-18 2001-06-12 Immersion Corporation Multiple degree-of-freedom mechanical interface to a computer system
US6342880B2 (en) 1995-09-27 2002-01-29 Immersion Corporation Force feedback system including multiple force processors
US5999168A (en) * 1995-09-27 1999-12-07 Immersion Corporation Haptic accelerator for force feedback computer peripherals
US7439951B2 (en) 1995-09-27 2008-10-21 Immersion Corporation Power management for interface devices applying forces
US6100874A (en) * 1995-11-17 2000-08-08 Immersion Corporation Force feedback mouse interface
US6061004A (en) * 1995-11-26 2000-05-09 Immersion Corporation Providing force feedback using an interface device including an indexing function
US7755602B2 (en) 1995-11-30 2010-07-13 Immersion Corporation Tactile feedback man-machine interface device
US9690379B2 (en) 1995-11-30 2017-06-27 Immersion Corporation Tactile feedback interface device
US6278439B1 (en) 1995-12-01 2001-08-21 Immersion Corporation Method and apparatus for shaping force signals for a force feedback device
US6147674A (en) * 1995-12-01 2000-11-14 Immersion Corporation Method and apparatus for designing force sensations in force feedback computer applications
US5959613A (en) * 1995-12-01 1999-09-28 Immersion Corporation Method and apparatus for shaping force signals for a force feedback device
US6169540B1 (en) 1995-12-01 2001-01-02 Immersion Corporation Method and apparatus for designing force sensations in force feedback applications
US6078308A (en) * 1995-12-13 2000-06-20 Immersion Corporation Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object
US6101530A (en) * 1995-12-13 2000-08-08 Immersion Corporation Force feedback provided over a computer network
US5956484A (en) * 1995-12-13 1999-09-21 Immersion Corporation Method and apparatus for providing force feedback over a computer network
EP0790584A2 (en) * 1996-02-13 1997-08-20 Sanyo Electric Co. Ltd A system and method for equipment design and production
EP0790584A3 (en) * 1996-02-13 1999-07-28 Sanyo Electric Co. Ltd A system and method for equipment design and production
US6050718A (en) * 1996-03-28 2000-04-18 Immersion Corporation Method and apparatus for providing high bandwidth force feedback with improved actuator feel
US6374255B1 (en) 1996-05-21 2002-04-16 Immersion Corporation Haptic authoring
US5828197A (en) * 1996-10-25 1998-10-27 Immersion Human Interface Corporation Mechanical interface having multiple grounded actuators
US6259382B1 (en) 1996-11-26 2001-07-10 Immersion Corporation Isotonic-isometric force feedback interface
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US6310605B1 (en) 1997-04-14 2001-10-30 Immersion Corporation Force feedback interface with selective disturbance filter
US6020876A (en) * 1997-04-14 2000-02-01 Immersion Corporation Force feedback interface with selective disturbance filter
US6292170B1 (en) 1997-04-25 2001-09-18 Immersion Corporation Designing compound force sensations for computer applications
US6285351B1 (en) 1997-04-25 2001-09-04 Immersion Corporation Designing force sensations for computer applications including sounds
US6252579B1 (en) 1997-08-23 2001-06-26 Immersion Corporation Interface device and method for providing enhanced cursor control with force feedback
US6288705B1 (en) 1997-08-23 2001-09-11 Immersion Corporation Interface device and method for providing indexed cursor control with force feedback
US6396232B2 (en) 1997-11-03 2002-05-28 Cybernet Haptic Systems Corporation Haptic pointing devices
US9740287B2 (en) 1997-11-14 2017-08-22 Immersion Corporation Force feedback system including multi-tasking graphical host environment and interface device
US6252583B1 (en) 1997-11-14 2001-06-26 Immersion Corporation Memory and force output management for a force feedback system
US6300936B1 (en) 1997-11-14 2001-10-09 Immersion Corporation Force feedback system including multi-tasking graphical host environment and interface device
US6448977B1 (en) 1997-11-14 2002-09-10 Immersion Corporation Textures and other spatial sensations for a relative haptic interface device
US9778745B2 (en) 1997-11-14 2017-10-03 Immersion Corporation Force feedback system including multi-tasking graphical host environment and interface device
AU770385B2 (en) * 1998-02-03 2004-02-19 Immersion Corporation Implementing force feedback over the world wide web and other computer networks
US7768504B2 (en) 1998-06-23 2010-08-03 Immersion Corporation Haptic feedback for touchpads and other touch controls
US8031181B2 (en) 1998-06-23 2011-10-04 Immersion Corporation Haptic feedback for touchpads and other touch controls
US7944435B2 (en) 1998-06-23 2011-05-17 Immersion Corporation Haptic feedback for touchpads and other touch controls
US7777716B2 (en) 1998-06-23 2010-08-17 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6211861B1 (en) 1998-06-23 2001-04-03 Immersion Corporation Tactile mouse device
US7592999B2 (en) 1998-06-23 2009-09-22 Immersion Corporation Haptic feedback for touchpads and other touch controls
US7602384B2 (en) 1998-06-23 2009-10-13 Immersion Corporation Haptic feedback touchpad
US6184868B1 (en) 1998-09-17 2001-02-06 Immersion Corp. Haptic feedback control devices
US6424356B2 (en) 1999-05-05 2002-07-23 Immersion Corporation Command of force sensations in a forceback system using force effect suites
US6525711B1 (en) * 1999-06-24 2003-02-25 Interval Research Corp. Haptic interface including clutch control
US7119789B1 (en) 1999-06-24 2006-10-10 Vulcan Patents Llc Haptic interface including clutch control
US8169402B2 (en) 1999-07-01 2012-05-01 Immersion Corporation Vibrotactile haptic feedback devices
US9492847B2 (en) 1999-09-28 2016-11-15 Immersion Corporation Controlling haptic sensations for vibrotactile feedback interface devices
US9280205B2 (en) 1999-12-17 2016-03-08 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6920373B2 (en) 2001-04-13 2005-07-19 Board Of Trusstees Operating Michigan State University Synchronization and task control of real-time internet based super-media
US8007282B2 (en) 2001-07-16 2011-08-30 Immersion Corporation Medical simulation interface apparatus and method
US7082570B1 (en) 2002-08-29 2006-07-25 Massachusetts Institute Of Technology Distributed haptic interface system and method
US7779166B2 (en) 2002-12-08 2010-08-17 Immersion Corporation Using haptic effects to enhance information content in communications
US8803795B2 (en) 2002-12-08 2014-08-12 Immersion Corporation Haptic communication devices
US8059088B2 (en) 2002-12-08 2011-11-15 Immersion Corporation Methods and systems for providing haptic messaging to handheld communication devices
US8830161B2 (en) 2002-12-08 2014-09-09 Immersion Corporation Methods and systems for providing a virtual touch haptic effect to handheld communication devices
US8316166B2 (en) 2002-12-08 2012-11-20 Immersion Corporation Haptic messaging in handheld communication devices
KR20040051264A (en) * 2002-12-12 2004-06-18 한국전자통신연구원 Virtual reality rehabilitation system based on haptic interface device and the method
US9495803B2 (en) 2003-11-20 2016-11-15 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US10216278B2 (en) 2003-11-20 2019-02-26 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US11287888B2 (en) 2003-11-20 2022-03-29 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US20130265148A1 (en) * 2003-11-20 2013-10-10 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US10936072B2 (en) 2003-11-20 2021-03-02 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US9041520B2 (en) 2003-11-20 2015-05-26 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US10936074B2 (en) 2003-11-20 2021-03-02 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US9135791B2 (en) 2003-11-20 2015-09-15 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US9142104B2 (en) 2003-11-20 2015-09-22 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US9171437B2 (en) 2003-11-20 2015-10-27 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US10416767B2 (en) 2003-11-20 2019-09-17 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US20130265149A1 (en) * 2003-11-20 2013-10-10 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US20170228028A1 (en) * 2003-11-20 2017-08-10 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US11385723B2 (en) 2003-11-20 2022-07-12 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US20170031443A1 (en) * 2003-11-20 2017-02-02 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US9495804B2 (en) 2003-11-20 2016-11-15 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
EP1629949A3 (en) * 2004-08-31 2006-04-12 Scuola Superiore Di Studi Universitari E Di Perfezionamento S. Anna Haptic interface device
EP1629949A2 (en) * 2004-08-31 2006-03-01 Scuola Superiore Di Studi Universitari E Di Perfezionamento S. Anna Haptic interface device
US8232969B2 (en) 2004-10-08 2012-07-31 Immersion Corporation Haptic feedback for button and scrolling action simulation in touch input devices
US8264465B2 (en) 2004-10-08 2012-09-11 Immersion Corporation Haptic feedback for button and scrolling action simulation in touch input devices
EP2099588A4 (en) * 2006-12-19 2010-07-21 Univ Deakin Method and apparatus for haptic control
EP2099588A1 (en) * 2006-12-19 2009-09-16 Deakin University Method and apparatus for haptic control
EP2422939A1 (en) * 2006-12-19 2012-02-29 Deakin University Method and apparatus for haptic control
AU2007335256B2 (en) * 2006-12-19 2013-11-07 Deakin University Method and apparatus for haptic control
US9374087B2 (en) 2010-04-05 2016-06-21 Samsung Electronics Co., Ltd. Apparatus and method for processing virtual world
EP2557481A4 (en) * 2010-04-05 2016-03-09 Samsung Electronics Co., Ltd. Apparatus and method for processing virtual world
EP2559462A4 (en) * 2010-04-14 2017-05-31 Samsung Electronics Co., Ltd. Device and method for processing virtual worlds
US9471142B2 (en) 2011-06-15 2016-10-18 The University Of Washington Methods and systems for haptic rendering and creating virtual fixtures from point clouds
US9582178B2 (en) 2011-11-07 2017-02-28 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US10775895B2 (en) 2011-11-07 2020-09-15 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US10152131B2 (en) 2011-11-07 2018-12-11 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US9050527B2 (en) 2012-08-23 2015-06-09 Wms Gaming Inc. Interactive tether using tension and feedback
US9477307B2 (en) 2013-01-24 2016-10-25 The University Of Washington Methods and systems for six degree-of-freedom haptic interaction with streaming point data
US9753542B2 (en) 2013-01-24 2017-09-05 University Of Washington Through Its Center For Commercialization Methods and systems for six-degree-of-freedom haptic interaction with streaming point data
US10226869B2 (en) 2014-03-03 2019-03-12 University Of Washington Haptic virtual fixture tools
WO2016040862A3 (en) * 2014-09-12 2016-09-29 Chizeck Howard Jay Integration of auxiliary sensors with point cloud-based haptic rendering and virtual fixtures
US10394327B2 (en) 2014-09-12 2019-08-27 University Of Washington Integration of auxiliary sensors with point cloud-based haptic rendering and virtual fixtures
EP3191264A4 (en) * 2014-09-12 2018-04-25 University of Washington Integration of auxiliary sensors with point cloud-based haptic rendering and virtual fixtures
CN105182792A (en) * 2015-08-10 2015-12-23 Southwest University of Science and Technology Robot operation simulation system and method for a nuclear radiation environment
CN105182792B (en) * 2015-08-10 2017-10-03 Southwest University of Science and Technology Robot operation simulation system and method for a nuclear radiation environment
CN105280080A (en) * 2015-11-26 2016-01-27 Institute of Automation, Chinese Academy of Sciences Three-degree-of-freedom haptic interaction system and haptic interaction apparatus
CN105280080B (en) * 2015-11-26 2018-05-04 Institute of Automation, Chinese Academy of Sciences Three-degree-of-freedom haptic interaction system and haptic interaction apparatus
US10726686B2 (en) 2016-09-19 2020-07-28 Telefonaktiebolaget Lm Ericsson (Publ) Encoding and decoding multichannel haptic data by determining an order of a plurality of channels
WO2018050249A1 (en) * 2016-09-19 2018-03-22 Telefonaktiebolaget Lm Ericsson (Publ) Encoding and decoding multichannel haptic data by determining an order of a plurality of channels
GB2575820A (en) * 2018-07-24 2020-01-29 Sony Interactive Entertainment Inc Robot interaction system and method
GB2575820B (en) * 2018-07-24 2022-11-30 Sony Interactive Entertainment Inc Robot interaction system and method

Similar Documents

Publication Publication Date Title
WO1995020788A1 (en) Intelligent remote multimode sense and display system utilizing haptic information compression
US5709219A (en) Method and apparatus to create a complex tactile sensation
US7480600B2 (en) Force reflecting haptic interface
Massie et al. The PHANToM haptic interface: A device for probing virtual objects
US6042555A (en) Force-feedback interface device for the hand
Prattichizzo et al. Towards wearability in fingertip haptics: A 3-DoF wearable device for cutaneous force feedback
EP0981423B1 (en) Force-feedback interface device for the hand
US6435794B1 (en) Force display master interface device for teleoperation
KR101548156B1 (en) Wireless exoskeleton haptic interface device for simultaneously delivering tactile sensation and joint resistance, and method for providing the same
US20060115348A1 (en) Force feedback and texture simulating interface device
JP3624374B2 (en) Force display device
O’Malley et al. Haptic interfaces
Hurmuzlu et al. Effect of a pneumatically driven haptic interface on the perceptional capabilities of human operators
Nitzsche et al. Design issues of mobile haptic interfaces
Iwata History of haptic interface
Pabon et al. A data-glove with vibro-tactile stimulators for virtual social interaction and rehabilitation
KR20010028461A (en) Semi-direct drive hand exoskeleton
Ziegler Haptic displays: how can we feel virtual environments?
Fontana et al. Integrating force and tactile rendering into a single VR system
Sawada et al. A hapto-tactile display for presenting virtual objects in human-scale tactual search
Burdea et al. A distributed virtual environment with dextrous force feedback
Dazkir Active control of a distributed force feedback glove for virtual reality environments
Ohka et al. Presentation capability of compound displays for pressure and force
Shimojo et al. Development of a compact and fast‐response haptics display system
Month et al. flexible polymer actuator display

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA JP MX

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
122 EP: PCT application non-entry in European phase
NENP Non-entry into the national phase

Ref country code: CA

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (PCT application filed from 20040101)