WO1995002233A1 - Method and system for simulating medical procedures including virtual reality and control method and system for use therein - Google Patents


Info

Publication number
WO1995002233A1
Authority
WO
WIPO (PCT)
Prior art keywords
medical
medical instrument
generating
measuring
actions
Prior art date
Application number
PCT/US1994/007218
Other languages
French (fr)
Inventor
Charles J. Jacobus
Jennifer Lynn Griffin
Original Assignee
Cybernet Systems Corporation
Priority date
Filing date
Publication date
Application filed by Cybernet Systems Corporation filed Critical Cybernet Systems Corporation
Priority to EP94922027A (patent EP0658265A1)
Priority to AU72516/94A (patent AU7251694A)
Publication of WO1995002233A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/313 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine

Definitions

  • The apparatus can be used to provide realistic simulations of patients on which to practice medical procedures for training and qualification purposes, to provide a means for trainees to see and feel actual (or simulated) surgeries, to display information from previously obtained medical diagnostics or imaging modalities (computed tomography (CT) data, positron emission tomography (PET) data, magnetic resonance imaging (MRI) data, etc.) to the surgeon overlaid on current image information during the surgery, and to provide the man-machine interface for robotic surgery.
  • The apparatus consists of a two- or three-dimensional display device, a two- or three-dimensional sound device, a graphics/image processing engine and storage module capable of real-time medical image generation, and programmable tactile/force reflecting mechanisms which can generate the "feel" of medical instruments and the interaction of these instruments with the anatomical simulation.
  • The preferred embodiment of the invention is designed for simulating endoscopic medical procedures due to the minimally invasive nature of these techniques.
  • The preferred embodiment of this invention addresses the problems presented by the prior art by providing means for controlled, preprogrammable simulation of a medical procedure (using both active tactile feel and imagery) so that standardized practice and qualification sessions can be performed by surgeons.
  • The same embodiment can also be used to augment the data normally delivered to the surgeon in actual endoscopic surgery, and can provide surrogate feels and imagery to surgeon trainees from actual surgeries being performed by expert surgeons.
  • The basic medical procedure simulation system consists of two basic functions: recording of an actual procedure (Figure 2) and virtual reality replay or simulation (Figure 4).
  • Figure 2 shows the recording apparatus 20, which allows the critical events in a medical procedure to be measured and recorded during actual surgeries performed by expert surgeons.
  • Medical instruments 16 having pressure and position sensors 22 are connected to a data digitizer 24.
  • The pressure and position data from the digitizer 24 is recorded by a recording system 26 consisting of a computer and a storage subsystem.
  • A viewing scope 14 having pressure and position sensors 28 is connected to an image digitizer/recorder 30.
  • The image digitizer/recorder 30 is also connected to the recording system 26.
  • The recording system 26 can either compress and store digital image data or store analog video on a computer-controlled video cassette recorder (VCR).
  • The non-image data is stored on digital disk or tape storage.
  • An accurate time base 32 is also connected to the recording system 26 to provide time stamps for the instrument locations, forces (or pressures), and actions such as trigger actuations, as well as for the scope locations, forces (or pressures), and image data captured during the operation.
  • Each data item (force, position, trigger action item or image captured) is time stamped so that it is placed accurately into the time sequence of events during surgery.
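The time-stamped, multi-channel recording described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the names `RecordedEvent` and `ProcedureLog` are invented for the example.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass(order=True)
class RecordedEvent:
    """One data item from the shared time base; sorting compares timestamps only."""
    timestamp: float                       # seconds on the accurate time base 32
    channel: str = field(compare=False)    # e.g. "instrument_pose", "force", "trigger", "image"
    payload: tuple = field(compare=False)  # the measured values or a captured frame

@dataclass
class ProcedureLog:
    events: List[RecordedEvent] = field(default_factory=list)

    def record(self, timestamp, channel, payload):
        self.events.append(RecordedEvent(timestamp, channel, payload))

    def timeline(self):
        # Merge every channel into one time-ordered sequence for replay,
        # placing each item accurately into the sequence of surgical events.
        return sorted(self.events)
```

At replay time, walking the merged `timeline()` reproduces forces, trigger actions, and imagery in their original order regardless of which sensor produced them.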
  • The recording apparatus 20 in Figure 2 can be directly connected to a playback apparatus (see Figure 5) to provide immediate feedback to a trainee.
  • Figure 3 depicts the systematic scanning of anatomical features using the scope 14 to generate a complete imagery record of organ and tissue orientation surrounding the site of medical simulation. This supports generation of in-between data and interpolated views based on recorded data.
  • The virtual reality training/replay apparatus 40 needed to play back the recorded data is shown in Figure 4.
  • This apparatus 40 generates the virtual space which displays the time-stamped and interpolated data stored in the image and data storage system 42.
  • A virtual reality control computer 44 receives image and non-image data from the image and data storage system 42. Position, orientation, and force data are processed by the computer 44 and sent to the force/tactile programmed controller and hand/instrument tracking subsystem 46.
  • The force/tactile programmed controller and hand/instrument tracking subsystem 46 is connected to a tactile/force reflecting subject simulation mechanism 48 in order to provide force feedback.
  • The virtual reality control computer 44 obtains image data from the image and data storage system 42 based on the position and orientation of the instruments. The control computer 44 processes this data and relays it to an image warp processor 50.
  • The image warping function 50 uses morphing, defined as mapping two images onto one another through corresponding polygon patches, as the means for providing interpolated images from the data collected during recording sessions.
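The interpolation step can be roughly sketched as follows. The patent only names morphing via corresponding polygon patches, so the linear-blend scheme below is an assumption about the mechanism: the vertices of matched patches are blended to place each patch for an in-between view, and a full implementation would also warp and cross-dissolve the pixels inside each patch.

```python
def interpolate_patch(verts_a, verts_b, t):
    """Linearly blend corresponding polygon vertices: t=0 gives view A, t=1 view B."""
    if len(verts_a) != len(verts_b):
        raise ValueError("patches must have one-to-one vertex correspondence")
    return [((1.0 - t) * xa + t * xb, (1.0 - t) * ya + t * yb)
            for (xa, ya), (xb, yb) in zip(verts_a, verts_b)]
```

Applying this to every corresponding patch pair yields the geometry of an interpolated frame between two recorded scope positions.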
  • A 3D graphics generator 52 is also connected to the virtual reality control computer 44 in order to generate accurate three-dimensional images of endoscopic instruments.
  • A graphics overlay unit 54 is used to overlay the three-dimensional images from the 3D graphics generator 52 on the morphed video-derived imagery from the image warp processor 50. The output of the graphics overlay unit 54 is fed into a three-dimensional display system 56.
  • An optional (denoted by the dashed lines) trainee head tracking system 58 can be connected to the virtual reality control computer 44 to provide information for the three-dimensional display generation function, allowing generation of view-dependent imagery when the display system 56 is a head-mounted system.
  • Future systems are expected to replace CRT viewing of endoscope-derived image data with head-mounted three-dimensional viewing. This viewing arrangement will require the head position sensing subsystem 58.
  • A head position sensing subsystem 58 also allows the incorporation of a three-dimensional sound generation system 60.
  • The sound generation system receives audio information from the virtual reality control computer 44 and sends this information to a sound system 62.
  • The use of sound generation is also optional, since it is not necessary in simulating endoscopic procedures.
  • Figure 5 shows how the recording apparatus 20 in Figure 2 can be used to acquire image and manipulation data 70, which is then used to compose simulation data sets 72 for replay on the virtual reality replay apparatus 40 from Figure 4.
  • An experienced surgeon 74 performs an actual endoscopic medical procedure using endoscopic instruments 76.
  • The recording system 20 is used to measure and record the sights, sounds, feels, and actions occurring during the actual medical procedure.
  • The sights, sounds, feels, and actions data are stored in the measured data and image storage system 70.
  • Simulation data sets 72 are formed off-line from the measured data 70 by a data composition system 78.
  • The virtual reality training system 40 uses images and data in the simulation database 72 to allow a trainee 80 to explore this data.
  • The trainee 80 can view, hear, and feel operations through manipulation and positioning of the simulated instruments, or based on his head location.
  • Images or data not in the simulation database 72 are generated through interpolation between images/data items which are in the database 72 (interpolation between images is accomplished through morphing, i.e. mapping predetermined image polygons between images).
  • Figure 5 also shows how the recording apparatus 20 can be used to pick up data and imagery from an in-progress medical procedure for immediate replay on the virtual reality replay/training apparatus 40.
  • In this mode, the virtual reality replay apparatus 40 immediately receives the output of the recording apparatus 20 via a communications channel 82. Therefore, the trainee 80 does not explore the input data and imagery through manipulation, but rather experiences the data stream.
  • This data includes the virtual forces generated by the force/tactile subsystem and applied to the trainee's hands through the simulated instruments.
  • The trainee 80 can gain intuitive understanding of the medical procedure, including the feelings, sights, and sounds of the operation, without the risks and possible complications associated with medical intervention.
  • Figure 6 shows how elements of the virtual reality replay system 40 can be used to provide the surgeon 90 with views of preoperative information and imagery 92, such as CT scans and MRI scans, to augment what is available now for endoscopic procedures (for instance, video imagery from the laparoscope) .
  • A surgeon 90 performs an actual medical procedure using endoscopic instruments 94.
  • The recording apparatus 20 is used to measure and record the actions of, and images from, the instruments 94 during the procedure.
  • The virtual reality training system 40 incorporates preoperative test data and images 92 with the output of the recording apparatus 20.
  • Figure 7 shows how the virtual reality systems described already can also be used to provide feedback from and control information to robotic endoscopic medical equipment 100.
  • The instruments, viewing scope, and head position of a remote surgeon 102 are measured and recorded by the virtual reality training system 104. This data is transmitted to the robotic endoscopic surgery equipment 100 via a communications channel 106.
  • The instruments, scope, and head position of a local surgeon 108 are also measured and recorded by a second virtual reality training system 110.
  • The robotic instruments are controlled to perform the actions of either the remote surgeon 102 or the local surgeon 108 on the patient 112.
  • The resulting sights, sounds, actions, and feels are experienced by both the remote surgeon 102 and the local surgeon 108.
  • The virtual reality concept described in this document provides the means for sharing control of such robotic equipment 100 between several surgeons or trainees.
  • The most important subsystems in the virtual reality replay apparatus are the force/tactile controller 46 and the tactile/force reflecting mechanism 48. These systems use force reflection information generated from the simulation database (or model-base) to create the feel of the medical instruments.
  • Figure 8 shows a control concept for creating the feel of medical instruments during a simulation.
  • The hands 120 of a surgeon or trainee grasp and move a member representative of an instrument.
  • The position of the instrument 122 is determined and fed into a geometric model of an organ 124.
  • The geometric model 124 includes data representing the organ size, position, and surface characteristics (in terms of elasticity, resistance to searing, slipperiness, etc.).
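A minimal sketch of how such a geometric model could turn instrument position into a reaction force, under two stated assumptions not in the patent: the organ surface is stood in for by a sphere, and elasticity is rendered as a linear spring on tip penetration. `OrganModel` and its fields are illustrative, not the patent's data layout.

```python
from dataclasses import dataclass

@dataclass
class OrganModel:
    center: tuple     # organ position in instrument coordinates (mm) -- assumed frame
    radius: float     # crude spherical stand-in for the organ surface (mm)
    stiffness: float  # N/mm, derived from the modeled elasticity

def contact_force(model, tip):
    """Return a force vector pushing the instrument tip back out of the organ."""
    d = [t - c for t, c in zip(tip, model.center)]
    dist = sum(x * x for x in d) ** 0.5
    penetration = model.radius - dist
    if penetration <= 0 or dist == 0:
        return (0.0, 0.0, 0.0)            # tip outside the surface: no force
    scale = model.stiffness * penetration / dist
    return tuple(scale * x for x in d)    # directed along the outward normal
```

A fuller model would add friction and damping terms to render slipperiness and tearing resistance, but the spring term alone already gives the servo loop of Figure 8 something to reflect.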
  • An apparatus for simulating laparoscopic medical procedures, shown in Figure 9, uses the servo systems concept described in Figure 8 to provide force-feedback control.
  • The apparatus consists of a box-like enclosure 130 which provides three openings 132 for insertion of instruments 134.
  • When the instruments 134 are inserted, they are clamped to four-axis devices 136 such as the force reflection mechanism 140 shown in Figure 10.
  • This mechanism 140 includes, in effect, a small backdrivable robot which moves in four degrees of freedom: one linear position (extension) and three attitudes (roll, pitch, and yaw). This is accomplished by the extension actuator 142, roll actuator 144, pitch actuator 146, and yaw actuator 148.
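The geometry of these four degrees of freedom can be sketched as forward kinematics: the instrument pivots about the entry port, so pitch and yaw angles plus the extension distance place the tip in space, while roll only spins the tool about its own axis and does not move the tip. The angle conventions below are assumptions for illustration, not taken from the patent.

```python
import math

def tip_position(extension, pitch, yaw):
    """Tip position (x, y, z) for a tool pivoting at the origin (the port).

    extension: distance from the port to the tip along the tool axis.
    pitch, yaw: attitude of the tool axis in radians (assumed conventions).
    """
    x = extension * math.cos(pitch) * math.cos(yaw)
    y = extension * math.cos(pitch) * math.sin(yaw)
    z = -extension * math.sin(pitch)  # pitching down drives the tip deeper
    return (x, y, z)
```

This pivot constraint is why four axes suffice for a laparoscopic simulator: the port removes two translational freedoms that a general six-axis device would need.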
  • This mechanism 140 can translate and provide forces/torques under programmed control (generated by the modeling subsystem in Figure 8) which feel to the trainee like the interactions characteristic of the actual medical procedure.
  • These four-axis actuation systems are reduced implementations of the universal force reflection mechanisms described in co-pending application Serial No. 984,324, filed December 2, 1992.
  • In operation, each of the four-axis devices 136 communicates with a controller 138, sending measured positions and receiving force commands 137.
  • The controller 138 may be a virtual reality simulation of the environment of a medical procedure, formed either artificially or based upon data collected during an actual medical procedure.
  • The controller 138 could also represent one or more slave devices which respond to follow the actions of the instruments 134, measure the forces acting on the slave device, and transmit force commands 137 back to the four-axis devices 136.
  • There are other ways in which the four-axis device could be implemented. Such a device could read the position of a user's hand by means of a laser tracking system and feed back forces by urging the user's hand with jets of air. In the preferred embodiment, however, the user's hand grips a member 134 representative of a medical instrument or another medical tool.
  • The specific four-axis device shown in Figure 10 is now examined in more detail. The position, orientation, velocity, and/or acceleration imparted to the force reflective mechanism 140 by the member 134 is sensed and transmitted as a command to the simulation, which implements a virtual reality force field.
  • A force command is generated based on the force field value for the position, orientation, velocity, and/or acceleration of the member 134 given by the simulation.
  • This force command is transmitted to the force reflective mechanism 140.
  • Force is then generated on the member 134 by the force reflective mechanism 140 using four small, brushless DC servo motors.
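One servo cycle of this per-axis force reflection can be sketched as follows. The spring-like rendering and the stiffness values are illustrative assumptions; the patent specifies only that the simulation's force field produces a force command for each of the four axes, which the motors then apply.

```python
AXES = ("extension", "roll", "pitch", "yaw")

def force_commands(pose, target, stiffness):
    """One servo cycle: spring-like force per axis pulling pose toward the
    force field's equilibrium target (e.g. the undeformed tissue surface).

    pose, target, stiffness: dicts keyed by axis name; values are floats.
    Returns the force/torque command for each of the four motors.
    """
    return {a: stiffness[a] * (target[a] - pose[a]) for a in AXES}
```

Run at servo rate, each returned value is handed to the corresponding brushless DC motor so the trainee feels the simulated contact on every axis simultaneously.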
  • The four-axis force-reflection output and four-axis orientation and position control provide the user with direct kinesthetic feedback from the virtual environment traversed.
  • A queuing button (not shown) can be incorporated in order to actuate the instrument within the simulation or control.
  • The invention in its preferred embodiment is applicable to controlling a virtual or simulated medical environment.
  • However, the invention is also well suited to the control of a remote or physical medical instrument.
  • Finally, the present invention is suited for application to any number of axes. For instance, a full six-axis interface could be used for simulating or controlling medical procedures.

Abstract

During an actual medical procedure, the actions of a medical instrument (16) are measured and recorded. These actions are generated on a member representative of the medical instrument to simulate the actual medical procedure. Forces acting upon the medical instrument (16) are sensed and simulated in the member using a force/tactile reflecting mechanism. The preferred embodiment uses virtual reality technology including image processing, three-dimensional graphics and display methods, simulated force/tactile reflection, head/hand movement, position sensing, and sound generation to provide an accurate simulation of endoscopic medical procedures.

Description

METHOD AND SYSTEM FOR
SIMULATING MEDICAL PROCEDURES
INCLUDING VIRTUAL REALITY AND CONTROL
METHOD AND SYSTEM FOR USE THEREIN
Cross-Reference To Related Application
This application is a continuation-in-part of copending U.S. application Serial No. 984,324, filed December 2, 1992.
Technical Field
This invention relates to methods and systems for simulating medical procedures for training and for remote control .
Background Art
Laparoscopic surgery, a specific application of endoscopic surgery, is performed by:
(1) making several small openings into the peritoneal cavity 10 of the body 12 as shown in Figure 1;
(2) insufflating with an inert gas, CO2 (providing a medium for adequate anatomical inspection);
(3) inserting a viewing scope 14 (laparoscope) into one opening 10;
(4) inserting other surgical instruments 16 into the other openings (e.g. staplers, cutting/cauterizing instruments, etc.); and
(5) maneuvering and manipulating these instruments 16 on the basis of tactile feel and the visual information of the surgical site from the inserted viewing scope 14 (usually displayed on a TV monitor).
Because the feel of the instruments 16 and the imagery of the surgical site are restricted to a small local area which is not directly visible, surgeons must train extensively (usually on animals) prior to performing human procedures. This type of surgery is still very much an art rather than a science.
Endoscopic and laparoscopic intervention is being fervently applied to procedures such as cholecystectomies, hernia repairs, nephrectomies, appendectomies, and numerous gynecological procedures, and is praised for fast patient recovery times and decreased hospital and surgery costs. As new applications emerge, training will play a crucial role in technological adoption. A need exists for a training device which will facilitate the transition from more traditional open procedures to endoscopic procedures by fostering the surgical skill necessary for the execution of efficient and complication-free procedures.
U.S. Patent No. 5,149,270, issued to McKeown, discloses an apparatus for practicing endoscopic surgical procedures. The apparatus has a cavity in which an object simulating a human organ is mounted for performing a practice procedure. The cavity is closeable to outside view, thereby forcing the individual practicing the procedure to use and manipulate the instruments relying only upon a viewing scope in an attempt to mimic the actual diagnostic and operating conditions. This configuration, by relying upon physical linkages and organ materials, lacks the versatility, robustness, and repeatability of a programmable simulation device.
International PCT application No. PCT/US92/02186, published 1 October 1992, discloses an automated surgical system and apparatus. An endoscopic instrument and a viewing scope are controlled remotely via a control mechanism in communication with the surgical instruments over a telecommunications link. This device, while directed toward the remote control of surgical instruments, does not address the necessity of force feedback for giving the remote user the feel of actual surgical procedures. Further, the device is directed toward remotely performing actual surgical procedures rather than simulated surgical procedures for the purposes of training.
Summary of the Invention
It is an object of the above invention to provide a method and system for simulating actual medical procedures.
A further object of the above invention is to provide a method for providing a virtual reality in response to a position and orientation of a member representative of a medical instrument.
Moreover, an object of the above invention is to provide a method and system for controlling a medical instrument. An additional object of the above invention is to provide a four-axis controller for a medical instrument.
In carrying out the above objects, the present invention provides a method for simulating actual medical procedures which utilize at least one medical instrument. According to the present invention, the actions of the medical instrument are measured during an actual medical procedure. These measured actions of the medical instrument are generated on a member representative of the medical instrument so as to simulate the actual medical procedure.
In carrying out the above objects, the present invention further provides a system for simulating actual medical procedures which utilize at least one medical instrument. The system comprises means for measuring the actions of the medical instrument during an actual medical procedure. Means are also provided for generating the measured actions on a member representative of the medical instrument whereby the actions of the member simulate the actual medical procedure.
In carrying out the above objects, the present invention also provides a method for providing a virtual reality in response to a position and orientation of a member representative of a medical instrument. An electrical signal is generated for each of a plurality of degrees of freedom of the member as a function of the position and orientation of the member in three-dimensional space. At least one virtual reality force field is generated in response to the generated signals. A force signal is generated for each degree of freedom as a function of the force field. A force on the member is generated for each force signal, thus providing the virtual reality.
In carrying out the above objects, the present invention additionally provides a method for controlling a medical instrument. The position and orientation of a member representative of the medical instrument is measured. Actions are generated on the medical instrument based upon the measured position and orientation of the member. The forces acting upon the medical instrument are measured. Forces are generated on the member based upon the measured forces acting upon the medical instrument.
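The bilateral ("force reflecting") control loop described in the preceding paragraph can be sketched as follows: the member's measured pose drives the instrument as a motion command, and the forces measured on the instrument are reflected back onto the member. The class and method names are illustrative, not the patent's terminology.

```python
class MasterMember:
    """The member the surgeon holds: a pose source and a force sink."""
    def __init__(self):
        self.pose = (0.0, 0.0, 0.0, 0.0)      # extension, roll, pitch, yaw
        self.applied_force = (0.0, 0.0, 0.0, 0.0)

class SlaveInstrument:
    """The controlled medical instrument: follows poses, senses forces."""
    def __init__(self):
        self.pose = (0.0, 0.0, 0.0, 0.0)
        self.sensed_force = (0.0, 0.0, 0.0, 0.0)

def control_cycle(member, instrument):
    """One control cycle: the member's pose becomes the instrument's motion
    command, and the instrument's sensed forces become the member's force
    command, so the surgeon feels what the tool feels."""
    instrument.pose = member.pose
    member.applied_force = instrument.sensed_force
```

In practice this cycle runs at servo rate on the force/tactile controller; scaling or filtering between the two directions (omitted here) would be applied inside the same loop.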
In carrying out the above objects, the present invention also provides a system for controlling a medical instrument. The system comprises means for measuring the position and orientation of a member representative of the medical instrument. Means are provided to generate actions on the medical instrument based upon the measured position and orientation of the member. The system further comprises means for measuring the forces acting upon the medical instrument. Means are provided to generate forces on the member based upon the measured forces acting upon the medical instrument.
In carrying out the above objects, the present invention also provides a four-axis controller for a medical instrument. The system comprises a member representative of the medical instrument. A four-axis interface device, which is responsive to the position and orientation of the member, is secured to the member. Electrical signals representative of the position and orientation of the member are generated in the four-axis interface device. Means for actuating the medical instrument in response to the electrical signals are included. Means for generating a force field based upon a force induced upon the medical instrument are provided. A force signal for each axis of the interface device is generated by further means, with the actuators responsive to these force signals to provide a force to each axis of the interface device for reflecting a force on the medical instrument to the member.
Brief Description Of The Drawings
FIGURE 1 shows the insertion of a scope and an instrument into a body during a laparoscopic medical procedure;
FIGURE 2 is a block diagram representation of the recording apparatus;
FIGURE 3 depicts the systematic scanning of an anatomical feature to generate a complete imagery record;
FIGURE 4 is a block diagram representation of the virtual reality training/replay apparatus;
FIGURE 5 is a block diagram representation of the incorporation of the recording apparatus with the virtual reality training/replay apparatus;
FIGURE 6 is a block diagram representation of using the virtual reality replay system to provide a surgeon with preoperative imagery;
FIGURE 7 is a block diagram representation of using the virtual reality system in conjunction with robotic endoscopic medical equipment;
FIGURE 8 is a block diagram representation of the force/tactile subsystem control concept;
FIGURE 9 is a schematic representation of the laparoscopic simulator; and
FIGURE 10 shows the four-axis force reflecting module within an opening in the laparoscopic simulator.
Best Mode For Carrying Out The Invention
Described is a method and an apparatus which use Virtual Reality Technology including image processing, three dimensional graphics and display methods, simulated force/tactile reflection, programmed stereophonic sound generation, head/hand movement and position sensing to provide an accurate control or simulation of medical procedures, specifically: endoscopic medical procedures including, but not limited to, cholecystectomies, vasectomies, bowel resections, inguinal and bilateral hernia repairs, appendectomies, lymphadenectomies, vagotomies, bronchial surgeries, hysterectomies, nephrectomies, lysis of adhesions, tumor extraction, knee arthroscopies, and abdominal explorations; microsurgery, i.e. vitreoretinal, otologic, laryngeal and stereotactic procedures; intubations, including neonatal procedures; and arterial and venous replacement and bypass methods used, for example, in treatment of innominate and subclavian occlusive disease.
The apparatus can be used to provide realistic simulations of patients on which to practice medical procedures for training and qualification purposes, to provide a means for trainees to see and feel actual (or simulated) surgeries, to display information from previously obtained medical diagnostics or image modalities (CT data, PET data, MRI data, etc.) to the surgeon overlaid on current image information during the surgery, and to provide the man-machine interface for robotic surgery.
The apparatus consists of a two or three dimensional display device, a two or three dimensional sound device, a graphics/image processing engine and storage module capable of real-time medical image generation, and programmable tactile/force reflecting mechanisms which can generate the "feel" of medical instruments and the interaction of these instruments with the anatomical simulation. The preferred embodiment of the invention is designed for simulating endoscopic medical procedures due to the minimally invasive nature of these techniques.
The preferred embodiment of this invention addresses the problems presented by the prior art by providing means for controlled, preprogrammable simulation of a medical procedure (using both active tactile feel and imagery) so that standardized practice and qualification sessions can be performed by surgeons. The same embodiment can also be used to augment the data normally delivered to the surgeon in actual endoscopic surgery, and can provide surrogate feels and imagery to surgeon trainees from actual surgeries being performed by expert surgeons.
The medical procedure simulation system consists of two basic functions:
(1) measuring and recording sights, sounds, and feels generated during actual surgery;
(2) accurately playing back these recorded sights, sounds, and feels, and using the recorded data to generate new information necessary to emulate the responses to alternative actions taken by a surgeon trainee.
Turning now to Figure 2, the recording apparatus 20 that allows the critical events in a medical procedure to be measured and recorded during actual surgeries performed by expert surgeons is shown. Medical instruments 16 having pressure and position sensors 22 are connected to a data digitizer 24. The pressure and position data from the digitizer 24 is recorded by a recording system 26 consisting of a computer and a storage subsystem. A viewing scope 14 having pressure and position sensors 28 is connected to an image digitizer/recorder 30. The image digitizer/recorder 30 is also connected to the recording system 26. The recording system 26 can either compress and store digital image data or store analog video on a computer-controlled video cassette recorder (VCR). The non-image data is stored on digital disk or tape storage. An accurate time base 32 is also connected to the recording subsystem 26 to provide time stamps for the instrument locations, forces (or pressures), and actions such as trigger actuations, as well as for the scope locations, forces (or pressures), and image data captured during the operation. Each data item (force, position, trigger action, or image captured) is time stamped so that it is placed accurately into the time sequence of events during surgery. The recording apparatus 20 in Figure 2 can be directly connected to a playback apparatus (in Figure 5) to provide immediate feedback to a trainee.
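The time-stamping scheme described above can be sketched as a small data structure. This is an illustrative sketch only, not the patent's implementation; the names `Sample` and `ProcedureRecording` are hypothetical. The key idea is that every measurement carries a stamp from a common time base, so samples from independent channels (instrument forces, scope images, trigger actions) can later be merged back into the exact sequence of events during surgery.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Sample:
    """One time-stamped measurement from an instrument or scope channel."""
    timestamp: float   # seconds, from a common time base
    channel: str       # e.g. "instrument_force", "scope_image"
    value: object      # force reading, position vector, or image frame

@dataclass
class ProcedureRecording:
    """Accumulates time-stamped samples from all channels so that
    playback can reproduce the original sequence of events."""
    samples: list = field(default_factory=list)

    def record(self, channel, value, timestamp=None):
        # Stamp each item as it arrives; an explicit timestamp may be
        # supplied when the sensor hardware provides its own clock.
        ts = time.monotonic() if timestamp is None else timestamp
        self.samples.append(Sample(ts, channel, value))

    def replay_order(self):
        # Merge all channels into a single time-ordered stream.
        return sorted(self.samples, key=lambda s: s.timestamp)
```

Because each data item is stamped independently, image and non-image data can be stored on separate media (as the text describes) and still be interleaved correctly at playback time.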
To support the generation of data and images that accommodate various user inputs, which will differ from the mechanics implemented in the operating room and from any particular surgery performed, a systematic scanning of anatomical features is performed. Figure 3 depicts the systematic scanning of anatomical features using the scope 14 to generate a complete imagery record of organ and tissue orientation surrounding the site of medical simulation. This supports generation of in-between data and interpolated views based on recorded data.
The virtual reality training/replay apparatus 40 needed to play back the recorded data is shown in Figure 4. This apparatus 40 generates the virtual space which displays the time stamped and interpolated data stored in the image and data storage system 42. A virtual reality control computer 44 receives image and non-image data from the image and data storage system 42. Position, orientation, and force data are processed by the computer 44 and sent to the force/tactile programmed controller and hand/instrument tracking subsystem 46. The force/tactile programmed controller and hand/instrument tracking subsystem 46 is connected to a tactile/force reflecting subject simulation mechanism 48 in order to provide force feedback.
The virtual reality control computer 44 obtains image data from the image and data storage system 42 based on the position and orientation of the instruments. The control computer 44 processes this data and relays it to an image warp processor 50. The image warping function 50 uses morphing, defined as the correspondence of two images through corresponding polygon patches, as the means for providing interpolated images from data collected during recording sessions. A 3D graphics generator 52 is also connected to the virtual reality control computer 44 in order to generate accurate three dimensional images of endoscopic instruments. A graphics overlay unit 54 is used to overlay the three dimensional images from the 3D graphics generator 52 on the morphed video derived imagery from the image warp processor 50. The output of the graphics overlay unit 54 is fed into a three dimensional display system 56.
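The morphing described above, correspondence of two images through polygon patches, rests on two elementary operations: moving corresponding polygon vertices between the two source images, and blending the pixel values. A minimal sketch of those two operations, with hypothetical function names (the patent does not specify an algorithm beyond polygon correspondence):

```python
def morph_vertices(poly_a, poly_b, t):
    """Linearly interpolate corresponding polygon vertices between two
    images; t=0 gives image A's geometry, t=1 gives image B's."""
    return [((1.0 - t) * ax + t * bx, (1.0 - t) * ay + t * by)
            for (ax, ay), (bx, by) in zip(poly_a, poly_b)]

def cross_dissolve(pix_a, pix_b, t):
    """Blend pixel intensities of the two warped images by the same t."""
    return (1.0 - t) * pix_a + t * pix_b
```

In a full morph, each image is first warped so its polygon patches match the interpolated vertex positions, then the warped images are cross-dissolved; the sketch shows only the interpolation arithmetic.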
An optional (denoted by the dashed lines) trainee head tracking system 58 can be connected to the virtual reality control computer 44 to provide information for the three dimensional display generation function to control generation of view dependent imagery, assuming that the display system 56 is a head mounted system. Alternative embodiments for the display system 56 which use more conventional display technology, such as CRTs, might not require head tracking systems 58. However, future systems are expected to replace CRT viewing of endoscope-derived image data with head-mounted three-dimensional viewing. This viewing arrangement will require the head position sensing subsystem 58.
The inclusion of a head position sensing subsystem 58 also allows the incorporation of a three-dimensional sound generation system 60. The sound generation system receives audio information from the virtual reality control computer 44 and sends this information to a sound system 62. The use of sound generation is also optional since it is not necessary in simulating endoscopic procedures.
Figure 5 shows how the recording apparatus 20 in Figure 2 can be used to acquire image and manipulation data 70 which is then used to compose simulation data sets 72 for replay on the virtual reality replay apparatus 40 from Figure 4. An experienced surgeon 74 performs an actual endoscopic medical procedure using endoscopic instruments 76. The recording system 20 is used to measure and record the sights, sounds, feels, and actions occurring during the actual medical procedure. The sights, sounds, feels, and actions data are stored in the measured data and image storage system 70. Simulation data sets 72 are formed off-line from the measured data 70 by a data composition system 78. The virtual reality training system 40 uses images and data in the simulation database 72 to allow a trainee 80 to explore this data. The trainee 80 can view, hear, and feel operations through manipulation and positioning of the simulated instruments, or based on his head location. Images or data not in the simulation database 72 are generated through interpolation between images/data items which are in the database 72 (interpolation between images is accomplished through morphing -- mapping predetermined image polygons between images).
Figure 5 also shows how the recording apparatus 20 can be used to pick up data and imagery from an in-progress medical procedure for immediate replay on the virtual reality replay/training apparatus 40. In this mode, the virtual reality replay apparatus 40 immediately receives the output of the recording apparatus 20 via a communications channel 82. Therefore, the trainee 80 does not explore the input data and imagery through manipulation, but rather experiences the data stream. This data includes the virtual forces generated by the force/tactile subsystem and applied to the trainee's hands through the simulated instruments. Thus, the trainee 80 can gain intuitive understanding of the medical procedure, including the feelings, sights, and sounds of the operation, without the risks and possible complications associated with medical intervention.
Figure 6 shows how elements of the virtual reality replay system 40 can be used to provide the surgeon 90 with views of preoperative information and imagery 92, such as CT scans and MRI scans, to augment what is available now for endoscopic procedures (for instance, video imagery from the laparoscope). A surgeon 90 performs an actual medical procedure using endoscopic instruments 94. The recording apparatus 20 is used to measure and record the actions and images in the instruments 94 during the procedure. The virtual reality training system 40 incorporates preoperative test data and images 92 with the output of the recording apparatus 20. The use of virtual reality display techniques allows metabolic and anatomical structure graphics to be overlaid on the basic video imagery now available to the surgeon 90 -- this is especially valuable if the surgeon 90 is provided with a three-dimensional viewer (such as a head-mounted three-dimensional viewer) for graphically enhanced anatomical information.
Figure 7 shows how the virtual reality systems described already can also be used to provide feedback from and control information to robotic endoscopic medical equipment 100. A remote surgeon's 102 instruments, viewing scope, and head position are measured and recorded by the virtual reality training system 104. This data is transmitted to the robotic endoscopic surgery equipment 100 via a communications channel 106. A local surgeon's 108 instrument, scope, and head position are also measured and recorded by a second virtual reality training system 110. The robotic instruments are controlled to perform the actions of either the remote surgeon 102 or the local surgeon 108 on the patient 112. The resulting sights, sounds, actions, and feels are experienced by both the remote surgeon 102 and the local surgeon 108. The virtual reality concept described in this document provides the means for sharing control of such robotic equipment 100 between several surgeons or trainees.
The most important subsystems in the virtual reality replay apparatus are the force/tactile controller 46 and the tactile/force reflecting mechanism 48. These systems use force reflection information generated from the simulation database (or model-base) to create the feel of the medical instruments. Figure 8 shows a control concept for creating the feel of medical instruments during a simulation. The hands 120 of a surgeon or trainee grasp and move a member representative of an instrument. The position of the instrument 122 is determined and fed into a geometric model of an organ 124. The geometric model 124 includes data representing the organ's size, position, and surface characteristics (in terms of elasticity, resistance to shearing, slipperiness, etc.). Forces and torques are applied to the surgeon's/trainee's hands 120 from a force/torque subsystem 126 using data from the geometric model 124. Thus, realistic feels are produced on the simulated medical instrument representing instrument-to-organ interaction forces based on the position of the instrument.
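One common way to turn a geometric organ model into interaction forces, offered here only as an illustrative penalty-force sketch (the patent does not specify the force law), is to push back along the surface normal whenever the instrument tip penetrates the modeled surface, scaled by penetration depth and the organ's elasticity. The sketch assumes a spherical organ model for simplicity:

```python
import math

def contact_force(tip_pos, organ_center, organ_radius, stiffness):
    """Penalty-based force for a spherical organ model: if the instrument
    tip penetrates the surface, return a restoring force along the outward
    surface normal proportional to penetration depth and stiffness."""
    offset = [t - c for t, c in zip(tip_pos, organ_center)]
    dist = math.sqrt(sum(d * d for d in offset))
    depth = organ_radius - dist          # > 0 means the tip is inside
    if depth <= 0.0 or dist == 0.0:
        return (0.0, 0.0, 0.0)           # no contact (or degenerate case)
    normal = [d / dist for d in offset]  # outward unit normal at the tip
    return tuple(stiffness * depth * n for n in normal)
```

A real tissue model would add the other surface characteristics the text mentions (shear resistance, slipperiness) as tangential and damping terms, but the radial spring term above is the core of the position-to-force mapping in Figure 8.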
An apparatus for simulating laparoscopic medical procedures, shown in Figure 9, uses the servo system concept described in Figure 8 to provide force-feedback control. The apparatus consists of a box-like enclosure 130 which provides three openings 132 for insertion of instruments 134. When the instruments 134 are inserted, they are clamped to four-axis devices 136 such as the force reflection mechanism 140 shown in Figure 10. This mechanism 140 includes, in effect, a small backdrivable robot which moves in 4 degrees of freedom: 1 linear position (extension) and 3 attitudes (roll, pitch, and yaw). This is accomplished by the extension actuator 142, roll actuator 144, pitch actuator 146, and yaw actuator 148. This mechanism 140 can translate and provide forces/torques under programmed control (generated by the modeling subsystem in Figure 8) which feel to the trainee like the interactions characteristic of the actual medical procedure. These four-axis actuation systems are reduced implementations of the universal force reflection mechanisms described in co-pending application Serial No. 984,324, filed December 2, 1992.
In operation, each of the four-axis devices 136 generates a series of signals 139 representative of the position and orientation of the corresponding instrument 134. These signals are transmitted to a controller 138, which generates a series of force commands 137 that are returned to the four-axis devices 136 to provide force feedback on the corresponding instrument 134.
The controller 138 may be a virtual reality simulation of the environment of a medical procedure, formed either artificially or based upon data collected during an actual medical procedure. The controller 138 could also represent one or more slave devices which respond to follow the actions of the instruments 134, measure the forces acting on the slave device, and transmit force commands 137 back to the four-axis devices 136.
It should be noted that there are a variety of different ways that the four-axis device could be implemented. This device could read the position of a user's hand by means of a laser tracking system and feed back forces by urging the user's hand with jets of air. In the preferred embodiment, however, the user's hand will grip a member 134 representative of a medical instrument or another medical tool. The specific four-axis device shown in Figure 10 is now examined in more detail. The position, orientation, velocity, and/or acceleration provided on the force reflective mechanism 140 by a member 134 is sensed and transmitted as a command to the simulation which implements a virtual reality force field. In turn, a force command is generated based on the force field value for the position, orientation, velocity, and/or acceleration of the member 134 given by the simulation. This force command is transmitted to the force reflective mechanism 140. Force is then generated on the member 134 by the force reflective mechanism 140 using four small, brushless DC servo motors. The four-axis force-reflection output and four-axis orientation and position control provide the user with direct kinesthetic feedback from the virtual environment traversed. A queuing button (not shown) can be incorporated in order to actuate the instrument within the simulation or control.
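The sense-compute-command cycle just described (position sensed on the member, force field evaluated by the simulation, force commanded to each axis) can be sketched as a single servo step. The interface names here are hypothetical stand-ins for the device's sensor and actuator I/O, which in practice would run at a fixed servo rate:

```python
def servo_step(read_axes, force_field, command_axes):
    """One cycle of a four-axis force-reflection loop:
    sense the member's pose, evaluate the simulation's force field
    at that pose, and command a force to each axis actuator."""
    pose = read_axes()          # e.g. (extension, roll, pitch, yaw)
    forces = force_field(pose)  # one force/torque value per axis
    command_axes(forces)        # drive the four actuators
    return pose, forces
```

For example, a force field that behaves like a centering spring on every axis would simply return a force proportional to, and opposing, each axis displacement; the real simulation substitutes the organ-model forces for that toy field.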
The invention in its preferred embodiment is applicable to controlling a virtual or simulated medical environment. However, the invention is also well suited to the control of a remote or physical medical instrument. Further, the present invention is suited for application to any number of axes. For instance, a full six-axis interface could be used for simulating or controlling medical procedures.
While the best mode for carrying out the invention has been described in detail, those familiar with the art to which this invention relates will recognize various alternative designs and embodiments for practicing the invention as defined by the following claims.

Claims

What Is Claimed Is:
1. A method for simulating actual medical procedures, the actual medical procedure utilizing at least one medical instrument, the method comprising the steps of: measuring the actions of the medical instrument during an actual medical procedure; and generating the actions on a member representative of the medical instrument; whereby generating the actions simulates the actual medical procedure.
2. The method of claim 1 further comprising the step of recording the measured actions.
3. The method of claim 2 wherein the step of generating the actions on the member is performed based upon the recorded actions.
4. The method of claim 1 wherein the actual medical procedure includes endoscopic surgery utilizing a viewing scope capable of generating images.
5. The method of claim 4 further comprising the step of displaying the generated images.
6. The method of claim 5 further comprising the step of recording the images generated by the viewing scope wherein the step of displaying is performed based upon the generated images as recorded.
7. The method of claim 4 wherein the actual medical procedure includes preoperative exploration with the viewing scope and wherein the medical instrument comprises the viewing scope.
8. The method of claim 5 further comprising the step of measuring supplemental information, wherein the step of displaying is performed in conjunction with the playback of the supplemental information.
9. The method of claim 8 wherein the supplemental information is audio information.
10. The method of claim 8 wherein the supplemental information is derived from medical diagnostic equipment.
11. The method of claim 6 wherein the step of displaying is performed based upon diagnostic information.
12. A system for simulating actual medical procedures, the actual medical procedure utilizing at least one medical instrument, the system comprising: measuring means for measuring the actions of the medical instrument during an actual medical procedure; and generating means operatively associated with the measuring means for generating the actions on a member representative of the medical instrument; whereby the actions of the member simulate the actual medical procedure.
13. The system of claim 12 further comprising recording means operatively associated with the measuring means for recording the measured actions.
14. The system of claim 13 wherein the generating means is operatively associated with the recording means and wherein the generation of the actions on the member is performed based upon the recorded actions.
15. The system of claim 12 wherein the actual medical procedure includes endoscopic surgery utilizing a viewing scope capable of generating images.
16. The system of claim 15 further comprising display means operatively associated with the viewing scope for displaying the images generated by the viewing scope.
17. The system of claim 16 further comprising recording means operatively associated with the viewing scope and the display means for recording the generated images wherein the display of the generated images is performed based upon the recorded images.
18. The system of claim 15 wherein the actual medical procedure includes preoperative exploration with the viewing scope and wherein the medical instrument comprises the viewing scope.
19. The system of claim 16 further comprising supplemental information means, operatively associated with the display means, for measuring supplemental information, wherein the display of the generated images is performed in conjunction with the playback of the supplemental information.
20. The system of claim 19 wherein the supplemental information means measures audio information.
21. The system of claim 20 wherein the supplemental information means measures information from medical diagnostic equipment.
22. The system of claim 18 wherein the display means further comprise means for displaying diagnostic information.
23. A method for providing a virtual reality in response to a position and orientation of a member representative of a medical instrument, the method comprising the steps of: generating an electrical signal for each of a plurality of degrees of freedom of the member as a function of the position and orientation of the member in three-dimensional space; generating at least one virtual reality force field in response to the generated electrical signals; generating a force signal for each degree of freedom as a function of the force field; and directing a force on the member for each force signal, the generated forces providing the virtual reality.
24. The method of claim 23 wherein the virtual reality force field is also generated in response to information obtained from an actual medical procedure.
25. The method of claim 23 wherein the virtual reality force field is also generated in response to information obtained during pre-medical diagnostic procedures.
26. The method of claim 23 wherein the generated forces are tactile forces.
27. A method for controlling a medical instrument during a medical procedure, the method comprising the steps of: measuring the position and orientation of a member representative of the medical instrument; generating actions on the medical instrument based upon the measured position and orientation of the member; measuring the forces acting upon the medical instrument; and generating forces on the member based upon the measured forces on the medical instrument.
28. The method of claim 27 wherein the step of measuring the forces acting upon the medical instrument includes measuring torques acting upon the medical instrument.
29. The method of claim 28 wherein the step of generating the forces on the member includes generating torques on the member.
30. The method of claim 27 wherein the medical instrument includes an endoscopic medical viewing scope capable of generating images.
31. The method of claim 30 further comprising the step of displaying the generated images.
32. The method of claim 30 wherein the medical procedure includes preoperative exploration with the viewing scope.
33. The method of claim 31 further comprising the step of measuring supplemental information, wherein the step of displaying is performed in conjunction with the playback of the supplemental information.
34. The method of claim 33 wherein the supplemental information is audio information.
35. The method of claim 33 wherein the supplemental information is derived from medical diagnostic equipment.
36. The method of claim 33 wherein the step of displaying is performed based upon diagnostic information.
37. The method of claim 27 further comprising the step of actuating the medical instrument in response to a command input.
38. A system for controlling a medical instrument during a medical procedure, the system comprising: first measuring means for measuring the position and orientation of a member representative of the medical instrument; first generating means operatively associated with the first measuring means for generating actions on the medical instrument based upon the measured position and orientation of the member; second measuring means for measuring the forces acting upon the medical instrument; and second generating means operatively associated with the second measuring means for generating forces on the member based upon the measured forces on the medical instrument.
39. The system of claim 38 wherein the second measuring means includes means for measuring torques acting upon the medical instrument.
40. The system of claim 39 wherein the second generating means includes means for generating torques on the member.
41. The system of claim 38 wherein the medical instrument includes an endoscopic medical viewing scope capable of generating images.
42. The system of claim 41 further comprising display means operatively associated with the viewing scope for displaying the generated images.
43. The system of claim 41 wherein the medical procedure includes preoperative exploration with the viewing scope.
44. The system of claim 41 further comprising supplemental information means, operatively associated with the display means, for measuring supplemental information, wherein the display of the generated images is performed in conjunction with the playback of the supplemental information.
45. The system of claim 44 wherein the supplemental information means measures audio information.
46. The system of claim 45 wherein the supplemental information means measures information from medical diagnostic equipment.
47. The system of claim 43 wherein the display means further comprise means for displaying diagnostic information.
48. The system of claim 38 further comprising actuation means operatively associated with the medical instrument for actuating the medical instrument in response to a command input.
49. A system for providing four-axis control for a medical instrument comprising: a member representative of the medical instrument; a four-axis device securable to the member, the four-axis device being responsive to the position and orientation of the member to generate electrical signals representative of the position and orientation of the member, the four-axis device including an actuator for each axis of the device; means for actuating the medical instrument in response to the electrical signals; means for generating a force field based upon a force induced upon the medical instrument; and means for generating a force signal for each axis of the four-axis device as a function of the force field, wherein the actuators are responsive to their respective force signals to provide a force to each axis of the four-axis device for reflecting the force on the medical instrument to the member; whereby the position and orientation of the member control the medical instrument.
50. The system of claim 49 wherein the four-axis device further comprises an encoder coupled to each actuator for generating the corresponding electrical signals.
51. The system of claim 49 wherein the actuators are electric motors.
52. The system of claim 51 wherein the electric motors are back-drivable.
PCT/US1994/007218 1993-07-06 1994-06-27 Method and system for simulating medical procedures including virtual reality and control method and system for use therein WO1995002233A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP94922027A EP0658265A1 (en) 1993-07-06 1994-06-27 Method and system for simulating medical procedures including virtual reality and control method and system for use therein
AU72516/94A AU7251694A (en) 1993-07-06 1994-06-27 Method and system for simulating medical procedures including virtual reality and control method and system for use therein

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US8765393A 1993-07-06 1993-07-06
US087,653 1993-07-06

Publications (1)

Publication Number Publication Date
WO1995002233A1 true WO1995002233A1 (en) 1995-01-19

Family

ID=22206450

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1994/007218 WO1995002233A1 (en) 1993-07-06 1994-06-27 Method and system for simulating medical procedures including virtual reality and control method and system for use therein

Country Status (3)

Country Link
EP (1) EP0658265A1 (en)
AU (1) AU7251694A (en)
WO (1) WO1995002233A1 (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2743242A1 (en) * 1995-12-28 1997-07-04 Bertrand Dominique Multiple TV Screen Server System for Home Theatre or Advertising
WO1997039680A1 (en) * 1996-04-25 1997-10-30 Dimitrios Kontarinis Monitoring needle puncture in surgical operations
WO1998003954A1 (en) * 1996-07-23 1998-01-29 Medical Simulation Corporation System for training persons to perform minimally invasive surgical procedures
Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2187101A (en) * 1986-01-02 1987-09-03 Champil Abraham Ninan Computer interfacing of endoscope
US4907973A (en) * 1988-11-14 1990-03-13 Hon David C Expert system simulator for modeling realistic internal environments and performance
EP0391376A1 (en) * 1989-04-06 1990-10-10 Nissim Nejat Danon Apparatus for computerized laser surgery
WO1992016141A1 (en) * 1991-03-18 1992-10-01 Wilk Peter J Automated surgical system and apparatus
DE4213584A1 (en) * 1991-04-24 1992-11-05 Olympus Optical Co Medical arrangement with object information reproduction device - detects state of part and displays it to surgeon operating medical device
EP0571827A1 (en) * 1992-05-27 1993-12-01 International Business Machines Corporation System and method for augmentation of endoscopic surgery

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PEDERSEN, PEDER C.: "PROCEEDINGS OF THE 12TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY", 1 November 1990, IEEE, PHILADELPHIA *

Cited By (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5800177A (en) * 1995-03-29 1998-09-01 Gillio; Robert G. Surgical simulator user input device
US5882206A (en) * 1995-03-29 1999-03-16 Gillio; Robert G. Virtual surgery system
US6101530A (en) * 1995-12-13 2000-08-08 Immersion Corporation Force feedback provided over a computer network
US5956484A (en) * 1995-12-13 1999-09-21 Immersion Corporation Method and apparatus for providing force feedback over a computer network
FR2743242A1 (en) * 1995-12-28 1997-07-04 Bertrand Dominique Multiple TV Screen Server System for Home Theatre or Advertising
GR1002806B (en) * 1996-04-25 1997-11-13 Measurement and relay system of (puncturing) information during laparoscopic and, in general, surgical operations.
WO1997039680A1 (en) * 1996-04-25 1997-10-30 Dimitrios Kontarinis Monitoring needle puncture in surgical operations
WO1998003954A1 (en) * 1996-07-23 1998-01-29 Medical Simulation Corporation System for training persons to perform minimally invasive surgical procedures
US5800179A (en) * 1996-07-23 1998-09-01 Medical Simulation Corporation System for training persons to perform minimally invasive surgical procedures
US6062865A (en) * 1996-07-23 2000-05-16 Medical Simulation Corporation System for training persons to perform minimally invasive surgical procedures
US6267599B1 (en) 1996-07-23 2001-07-31 Medical Simulation Corporation System for training persons to perform minimally invasive surgical procedures
WO1998010387A3 (en) * 1996-09-04 1998-07-09 Ht Medical Inc Interventional radiology interface apparatus and method
WO1998010387A2 (en) * 1996-09-04 1998-03-12 Ht Medical Systems, Inc. Interventional radiology interface apparatus and method
US6024576A (en) * 1996-09-06 2000-02-15 Immersion Corporation Hemispherical, high bandwidth mechanical interface for computer systems
US5828197A (en) * 1996-10-25 1998-10-27 Immersion Human Interface Corporation Mechanical interface having multiple grounded actuators
WO1998024083A1 (en) * 1996-11-25 1998-06-04 Dadkhah Shahiar Coronary angioplasty simulator apparatus
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US6259382B1 (en) 1996-11-26 2001-07-10 Immersion Corporation Isotonic-isometric force feedback interface
US7091948B2 (en) 1997-04-25 2006-08-15 Immersion Corporation Design of force sensations for haptic feedback computer interfaces
WO1999017265A1 (en) * 1997-09-26 1999-04-08 Boston Dynamics, Inc. Method and apparatus for surgical training and simulating surgery
JP4648475B2 (en) * 1998-01-28 2011-03-09 イマージョン メディカル,インコーポレイティド Interface apparatus and method for interfacing an instrument to a medical procedure simulation system
JP2010015164A (en) * 1998-01-28 2010-01-21 Immersion Medical Inc Interface device and method for interfacing instrument to medical procedure simulation system
GB2384613A (en) * 1998-01-28 2003-07-30 Ht Medical Systems Inc Interfacing medical instruments with medical simulation systems
JP2003525639A (en) * 1998-01-28 2003-09-02 イマージョン メディカル,インコーポレイティド Interface apparatus and method for interfacing an instrument to a medical procedure simulation system
GB2384613B (en) * 1998-01-28 2003-09-17 Ht Medical Systems Inc Interface device and method for interfacing instruments to medical procedure simulation systems
EP1051698A4 (en) * 1998-01-28 2009-05-06 Immersion Medical Inc Interface device and method for interfacing instruments to vascular access simulation systems
EP1103041A4 (en) * 1998-01-28 2009-03-04 Immersion Medical Inc Interface device and method for interfacing instruments to medical procedure simulation system
JP2011028293A (en) * 1998-01-28 2011-02-10 Immersion Medical Inc Interface device and method for interfacing instrument to medical procedure simulation system
EP1103041A1 (en) * 1998-01-28 2001-05-30 HT Medical Systems, Inc. Interface device and method for interfacing instruments to medical procedure simulation system
EP1051698A2 (en) * 1998-01-28 2000-11-15 HT Medical Systems, Inc. Interface device and method for interfacing instruments to vascular access simulation systems
GB2349730B (en) * 1998-01-28 2003-04-09 Ht Medical Systems Inc Interface device and method for interfacing instruments to medical procedure simulation system
US6726675B1 (en) 1998-03-11 2004-04-27 Navicath Ltd. Remote control catheterization
US9501955B2 (en) * 2001-05-20 2016-11-22 Simbionix Ltd. Endoscopic ultrasonography simulation
US7208671B2 (en) 2001-10-10 2007-04-24 Immersion Corporation Sound data output and manipulation using haptic feedback
US6703550B2 (en) 2001-10-10 2004-03-09 Immersion Corporation Sound data output and manipulation using haptic feedback
WO2004095249A1 (en) * 2003-04-17 2004-11-04 Philippe Bellanger Interactive method and device for providing assistance with manual movements during material processing
CN100442205C (en) * 2003-04-17 2008-12-10 菲利普·贝伦格 Interactive method and device for providing assistance with manual movements during material processing
FR2853983A1 (en) * 2003-04-17 2004-10-22 Philippe Bellanger Manual gesture assisting and training device for design field, has stimuli generator to inform position of tool relative to material, to operator by increase of reality of actions that his job implies
US8007281B2 (en) 2003-09-24 2011-08-30 Toly Christopher C Laparoscopic and endoscopic trainer including a digital camera with multiple camera angles
US7594815B2 (en) * 2003-09-24 2009-09-29 Toly Christopher C Laparoscopic and endoscopic trainer including a digital camera
WO2006016348A1 (en) * 2004-08-13 2006-02-16 Haptica Limited A method and system for generating a surgical training module
US8924334B2 (en) 2004-08-13 2014-12-30 Cae Healthcare Inc. Method and system for generating a surgical training module
US9623209B2 (en) 2008-05-06 2017-04-18 Corindus, Inc. Robotic catheter system
US11717645B2 (en) 2008-05-06 2023-08-08 Corindus, Inc. Robotic catheter system
US10987491B2 (en) 2008-05-06 2021-04-27 Corindus, Inc. Robotic catheter system
US10342953B2 (en) 2008-05-06 2019-07-09 Corindus, Inc. Robotic catheter system
US10881474B2 (en) 2009-10-12 2021-01-05 Corindus, Inc. System and method for navigating a guide wire
US9962229B2 (en) 2009-10-12 2018-05-08 Corindus, Inc. System and method for navigating a guide wire
US11696808B2 (en) 2009-10-12 2023-07-11 Corindus, Inc. System and method for navigating a guide wire
US11918314B2 (en) 2009-10-12 2024-03-05 Corindus, Inc. System and method for navigating a guide wire
DE102010036904A1 (en) * 2010-08-06 2012-02-09 Technische Universität München Haptic measurement device for use in surgical training apparatus for haptic acquisition of human body, has evaluation unit controlling actuator unit based on evaluation of interaction of extension of manipulator with virtual reaction space
US9833293B2 (en) 2010-09-17 2017-12-05 Corindus, Inc. Robotic catheter system
AU2011226872B2 (en) * 2010-10-05 2014-09-11 Biosense Webster (Israel), Ltd. Simulation of an invasive procedure
US8636519B2 (en) 2010-10-05 2014-01-28 Biosense Webster (Israel) Ltd. Simulation of an invasive procedure
EP2439719A1 (en) * 2010-10-05 2012-04-11 Biosense Webster (Israel), Ltd. Simulation of an invasive procedure
ITUA20163903A1 (en) * 2016-05-10 2017-11-10 Univ Degli Studi Genova SIMULATOR OF INTERVENTIONS IN LAPAROSCOPY
EP3612123A4 (en) * 2017-04-20 2021-01-13 Intuitive Surgical Operations, Inc. Systems and methods for constraining a virtual reality surgical system
US11589937B2 (en) 2017-04-20 2023-02-28 Intuitive Surgical Operations, Inc. Systems and methods for constraining a virtual reality surgical system
TWI765650B (en) * 2021-07-12 2022-05-21 謝慶隆 Abdominal surgery simulation training device

Also Published As

Publication number Publication date
AU7251694A (en) 1995-02-06
EP0658265A1 (en) 1995-06-21

Similar Documents

Publication Publication Date Title
US5769640A (en) Method and system for simulating medical procedures including virtual reality and control method and system for use therein
WO1995002233A1 (en) Method and system for simulating medical procedures including virtual reality and control method and system for use therein
JP6916322B2 (en) Simulator system for medical procedure training
CN110800033B (en) Virtual reality laparoscope type tool
US5800177A (en) Surgical simulator user input device
Liu et al. A survey of surgical simulation: applications, technology, and education
EP0426767B1 (en) Internal environment simulator system
US20090311655A1 (en) Surgical procedure capture, modelling, and editing interactive playback
Tendick et al. Human-machine interfaces for minimally invasive surgery
US8616894B2 (en) Virtual operation simulator
WO2008058039A1 (en) Devices and methods for utilizing mechanical surgical devices in a virtual environment
JPH08215211A (en) Apparatus and method for supporting remote operation
KR101816172B1 (en) The simulation system for training and the method thereof
US9230452B2 (en) Device and method for generating a virtual anatomic environment
WO1996016389A1 (en) Medical procedure simulator
Ayache et al. Simulation of endoscopic surgery
CA3110725A1 (en) Vibrotactile method, apparatus and system for training and practicing dental procedures
WO1999017265A1 (en) Method and apparatus for surgical training and simulating surgery
Han et al. Virtual reality simulation of high tibial osteotomy for medical training
AU2020230230B2 (en) Laparoscopic simulator
Weiss et al. A virtual-reality-based haptic surgical training system
JPH11219100A (en) Medical simulator system
JP2004348091A (en) Entity model and operation support system using the same
WO2001088881A2 (en) Tools for endoscopic tutorial system
Rasakatla et al. Robotic Surgical training simulation for dexterity training of hands and fingers (LESUR)

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

WWE Wipo information: entry into national phase

Ref document number: 1994922027

Country of ref document: EP

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWP Wipo information: published in national office

Ref document number: 1994922027

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: CA

WWW Wipo information: withdrawn in national office

Ref document number: 1994922027

Country of ref document: EP