WO2007019546A2 - System, device, and methods for simulating surgical wound debridements - Google Patents

System, device, and methods for simulating surgical wound debridements

Info

Publication number
WO2007019546A2
WO2007019546A2 (PCT/US2006/031063)
Authority
WO
WIPO (PCT)
Prior art keywords
wound
simulated
human body
debridement
simulator
Prior art date
Application number
PCT/US2006/031063
Other languages
French (fr)
Other versions
WO2007019546A3 (en)
Inventor
Lee A. Belfore, II
Jenifer Seevinck
Frederick D. McKenzie
Mark W. Scerbo
Hector Garcia
Sylvia Girtelschmid
Emre Baydogan
Wesley Adam Taggart
R. Bowen Loftin
Jessica R. Crouch
Yuzhong Shen
Leonard J. Weireter
Original Assignee
Old Dominion University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Old Dominion University filed Critical Old Dominion University
Publication of WO2007019546A2 publication Critical patent/WO2007019546A2/en
Publication of WO2007019546A3 publication Critical patent/WO2007019546A3/en

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Abstract

A wound debridement simulator for simulating procedures relating to surgical debridement of a wound is provided. The wound debridement simulator includes a visual display for displaying a simulated rendering of a portion of a human body upon which a wound has been inflicted. The wound debridement simulator also includes a modeling system in communication with the visual display, the system causing the animated rendering to change in response to a simulated touching of the portion of the human body. Additionally, the wound debridement simulator further includes a haptic device in communication with the tissue modeling system, the device simulating force feedback based upon the simulated touching. The wound debridement simulator also includes a training module in communication with both the tissue modeling system and the haptic device, the module causing the system and device to operate in a predefined manner in response to one or more user-supplied inputs.

Description

SYSTEM, DEVICE, AND METHODS FOR SIMULATING SURGICAL WOUND DEBRIDEMENTS
FIELD OF THE INVENTION
[0001] The present invention is related to the field of computer-based simulation, and, more particularly, to the simulation of surgical procedures.
BACKGROUND OF THE INVENTION
[0002] Wound debridement refers generally to procedures for removing necrotic, devitalized, or contaminated tissue, and/or removing foreign objects from a patient's wound. Successful wound debridement promotes healing of the wound.
[0003] Various methods are available for performing wound debridement. The methods include, for example, autolytic debridement, enzymatic debridement, mechanical debridement, and even the use of maggots applied to the wound. In many cases, however, a preferred method is surgical debridement. Wound debridement through surgery offers several distinct advantages. Surgical wound debridement is typically the fastest method of performing debridement. It tends to be the most selective method in the sense that a surgeon has virtually complete control over which tissue is removed and which is left intact. Surgical wound debridement is typically the best method of debridement for wounds afflicted with a large amount of necrotic tissue. It similarly is often the preferred method when infected tissue must also be removed.
[0004] The advantages afforded by surgical wound debridement, however, typically depend critically on the skill of the particular surgeon or other medical professional that performs the procedure. Because of the inherent risks attendant with most surgical procedures, moreover, surgical wound debridement must be performed with a certain amount of skill to avoid worsening the condition of the wound. Skill is similarly needed in performing pre-op and post-op procedures such as scrubbing a wound and changing bandages.
[0005] In most instances, techniques for training surgeons and other medical personnel in performing various procedures relating to wound debridement have changed little from those used in the past. Typically, the procedures are taught using actual patients. Inexperienced practitioners observe how the procedures are performed and later may have the opportunity to perform the procedures themselves under the supervision of an experienced practitioner. Such training affords only limited opportunity for a practitioner to become proficient in performing such procedures or learning to perform them under varied conditions.
[0006] Moreover, the speed with which new practitioners can be trained is limited by the availability of more skilled professionals who are qualified to teach the various procedures. Limitations on the ability to teach the requisite skills to new practitioners and to offer them an opportunity to practice their newly-acquired skills under diverse circumstances could pose a problem in times of war or national emergency, when the need for a greater number of healthcare professionals able to perform these procedures is particularly acute.
[0007] The limitations on conventional teaching devices also can hinder the teaching of emergency wound debridement procedures to non-medical personnel such as soldiers who, of necessity, may be called upon to perform wound debridement on the battlefield or far from a medical facility. Accordingly, there is a need in the art for a way to teach wound debridement procedures to medical professionals more effectively and efficiently. There is also a need for some way in which to extend training to non-medical personnel such as soldiers who may have the need to learn how to perform wound debridement procedures under atypical, and often difficult, circumstances.
SUMMARY OF THE INVENTION
[0008] The present invention provides a system, device, and related methods for simulating a surgical wound debridement. The invention can be used for training medical and non-medical personnel, such as emergency and combat personnel, who are called upon to perform wound debridement under various conditions. The invention provides realistic models, visual graphics, and haptic sensations that result in an effective learning experience. The invention simulates various aspects of wound debridement, including wound cleaning, tissue deformation, and foreign-body extractions. As described herein, the simulative experience afforded by the invention can further include sequencing guidance, performance evaluation and feedback during training sessions, and overall performance assessments designed to test the competency of personnel in performing wound debridements. One embodiment of the invention is a virtual reality simulator that incorporates three-dimensional modeling of portions of the human body that exhibit realistic responses to surgical procedures performed on the body.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] There are shown in the drawings, embodiments which are presently preferred, it being understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown.
[0010] FIG. 1 is a schematic diagram of a wound debridement simulator, according to one embodiment of the invention.
[0011] FIG. 2 is a flowchart illustrative of a method of simulating wound debridement, according to another embodiment of the invention.
[0012] FIG. 3 is a schematic side view of a haptic display device, according to yet another embodiment of the invention.
[0013] FIG. 4 is a schematic top view of the haptic display device in FIG. 3.
[0014] FIG. 5 is a schematic front view of the haptic display device in FIG. 3.
DETAILED DESCRIPTION
[0015] The invention, according to one embodiment, is a simulator for training medical and non-medical personnel to successfully perform various procedures associated with the debridement of a wound under a variety of circumstances. The simulator, more particularly, provides a realistic virtual-reality environment that creates for the user a simulated image of a portion of a human body that has suffered a wound. While viewing the wound, the user can operate the simulator to simulate cleaning the wound, deforming and removing tissue, and extracting foreign objects such as shrapnel or shards of glass from the wound.
[0016] The simulator simulates the look and feel of an actual performance of these procedures. In performing the procedures, the user employs one or more simulative instruments, such as a brush, scalpel, forceps, scissors, and/or irrigator, used for performing the procedures. If two or more simulative instruments are used, the user can employ them concurrently with one another. Additionally, the simulator can simulate reactions of the body to the simulative performance of the various procedures. The simulator, moreover, can be programmed to generate different bodily reactions in relation to the procedures.
[0017] The bodily reactions, more particularly, can include a change with respect to the geometry of the body as well as with respect to the image on the surface. The geometry corresponds to a three-dimensional (3D) model of the body. The 3D model deforms in response to and/or is modified by the simulated surgical processes, such as a deformation that arises from a simulated cutting. An image of the wound, as well as changes thereto (e.g., bleeding), is projected onto, or "painted" on, the surface. The surface, more particularly, is the surface of the 3D model. Orientation of the imaged body or body part is the result of a mathematical transformation that is built into a graphics API.
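For illustration only (this equation does not appear in the original disclosure), the transformation referred to above is conventionally a homogeneous model-view transform applied to every vertex p of the 3D model, with a rotation R and a translation t setting the orientation:

$$
\mathbf{p}' = \mathbf{M}_{\mathrm{view}}\,\mathbf{M}_{\mathrm{model}}\,\mathbf{p},
\qquad
\mathbf{M}_{\mathrm{model}} = \begin{bmatrix} \mathbf{R} & \mathbf{t} \\ \mathbf{0}^{\mathsf{T}} & 1 \end{bmatrix}
$$

Adjusting R and t reorients the imaged body part without altering the underlying tissue model; graphics APIs such as OpenGL expose exactly this kind of matrix stack.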
[0018] The simulative performance of the various procedures provides a user with an effective learning experience, albeit one that does not necessarily require the supervision of an experienced practitioner or entail risks to an actual wound victim. For a more effective learning experience, moreover, the simulator can provide to the user procedural sequencing, performance feedback during the simulative performance of the procedures, and post-performance evaluation of the user's performance.
[0019] Referring initially to FIG. 1, a simulator 100 for simulating procedures relating to surgical debridement of a wound according to one embodiment is schematically illustrated.
[0020] The simulator 100 illustratively includes a visual display 102, a modeling system 104 in communication with the visual display, and a haptic device 106 in communication with the tissue modeling system 104. Additionally, the simulator 100 includes a training module 108 in communication with both the modeling system 104 and the haptic device 106. The simulator 100 can, according to another embodiment, also include a recordation module 112 in communication with both the modeling system 104 and the haptic device 106, as well as an evaluation module 114 in communication with the recordation module.
[0021] The visual display 102 can comprise, for example, a liquid crystal display (LCD), cathode ray tube (CRT) monitor, or similar type of computer-based imaging screen for generating a visual image. Moreover, the user can wear stereo-optic glasses, such as the CrystalEyes® 3 glasses made by StereoGraphics Corporation of San Rafael, California, to view the visual image as a 3-D image.
[0022] The visual display 102 displays a simulated rendering of a portion of a human body upon which a wound has been inflicted. As described more particularly herein, the simulated rendering can be changed based upon user input, the user input causing a change from an image of one particular portion of a human body to another. Accordingly, the visual display 102 can be used to render images of different wounded body portions. Similarly, the visual display 102 also can be changed according to user input to render different types of wounds, including, for example, a gunshot wound, shrapnel wound, or other type of wound.
[0023] The modeling system 104 in communication with the visual display 102 causes the animated rendering to change in response to a simulated touching of the portion of the human body. The modeling system 104 for generating the actual simulation of procedures is, according to one embodiment, realized by integrating multiple, distinct modules. One integrated module is a module for modeling the tissue. The model, for example, can be a physics-based tissue model. Accordingly, the modeling module can implement at least two different physics models for modeling tissue. The first is a mass-spring model (MSM). The second is a finite-element model (FEM).
[0024] The tissue modeling system according to this embodiment also includes a collision detection module. Collision detection is implemented to generate responses to simulated surgical procedures. For example, a scalpel "collides" with tissue when the scalpel intersects the "skin" of the imaged model. Collisions can similarly result from a simulated cutting, probe, or other procedure. A result of a collision can be blood flowing on the surface or one glass shard being pushed against another. A minimal collision check of this kind is sketched below.
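The following is a minimal sketch, not the patent's implementation, of such a collision check: the instrument tip is treated as a small sphere and tested against the vertices of the skin mesh. A production simulator would use triangle-level tests and spatial partitioning (e.g., a bounding-volume hierarchy); the geometry and radii below are illustrative assumptions.

```python
import numpy as np

def tip_collides(tip_pos, tip_radius, mesh_vertices):
    """Coarse proximity test: return (collided, nearest vertex index, depth)."""
    d = np.linalg.norm(mesh_vertices - tip_pos, axis=1)
    i = int(np.argmin(d))
    penetration = tip_radius - d[i]
    return penetration > 0.0, i, max(penetration, 0.0)

# Example: a flat 'skin' patch of vertices and a scalpel tip just touching it.
skin = np.array([[x * 0.01, y * 0.01, 0.0] for x in range(10) for y in range(10)])
hit, idx, depth = tip_collides(np.array([0.05, 0.05, 0.004]), 0.005, skin)
print(hit, idx, depth)   # True, vertex nearest (0.05, 0.05, 0), ~1 mm penetration
```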
[0025] It is to be noted that collision detection is a distinct process separate from modeling in the sense that collision detection can provide potential input to the model, the modeling system 104 providing the integration that generates the resulting response to the input. Moreover, this embodiment of the system 104 utilizes an architecture configured to support real-time updates as well as modular software for effecting component integration to produce the desired results.
[0026] The modeling system 104 significantly extends conventional tissue models so as to facilitate user interaction with the model. The simulation generates changes in appearance in the simulated tissue to correspond, for example, to wound cleaning, bleeding, rinsing, and treating a wound, thereby generating a realistic rendering of wound debridement.
[0027] The modeling system 104 thus can comprise machine-readable code for rendering portions of the human body in a manner that displays the elastic characteristics of skin, tissue, muscle, and similar such bodily components. The modeling system 104 can be manifested as a physics-based model, as described above, or another model that provides a visual fidelity suitable for training. The mass-spring system animating elastic characteristics of tissue, as will be readily understood by one of ordinary skill in the art, can comprise creating a three-dimensional (3-D) mesh of discrete points of the object whose elastic characteristics are to be modeled. Point masses are associated with each node of the 3-D mesh, and damped springs are associated with the mesh edges. The finite-element model for animating such elastic characteristics, as will be readily understood by one of ordinary skill in the art, can comprise creating a three-dimensional mesh of discrete elements whose characteristics can be modeled using a tensor. The tensor characteristics are derived from the desired bulk tissue characteristics.
[0028] If implemented as machine-readable code, the modeling system 104 can be made portable across diverse platforms using various known computer graphics-based libraries and toolkits. The available libraries and toolkits include, for example, the OpenGL environment and the GLUT toolkit.
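As a minimal sketch of the mass-spring scheme just described (illustrative constants, not values from the disclosure), the following performs one damped-spring update with semi-implicit Euler integration for point masses at mesh nodes connected along mesh edges:

```python
import numpy as np

def step(pos, vel, edges, rest_len, mass=0.01, k=50.0, damping=0.5, dt=1e-3):
    """One semi-implicit Euler step for a damped mass-spring mesh."""
    forces = np.zeros_like(pos)
    for (i, j), L0 in zip(edges, rest_len):
        d = pos[j] - pos[i]
        L = np.linalg.norm(d)
        if L < 1e-9:
            continue
        u = d / L
        # Hooke spring force plus damping along the spring axis
        f = k * (L - L0) * u + damping * np.dot(vel[j] - vel[i], u) * u
        forces[i] += f
        forces[j] -= f
    vel = vel + dt * forces / mass
    pos = pos + dt * vel
    return pos, vel

# Two nodes joined by one spring, stretched slightly past its rest length.
pos = np.array([[0.0, 0.0, 0.0], [0.12, 0.0, 0.0]])
vel = np.zeros_like(pos)
pos, vel = step(pos, vel, edges=[(0, 1)], rest_len=[0.10])
```

A finite-element variant would instead assemble element stiffness from the elasticity tensor and solve for nodal displacements, trading speed for physical accuracy.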
[0029] Illustratively, the modeling system 104 comprises a stored set of machine-readable code for rendering portions of the human body in accordance with the elastic characteristics of the various portions so rendered. The machine-readable code can be stored in a memory (not shown) and executed using one or more processors (also not shown) connected with the memory, the execution generating the desired image on the visual display 102.
[0030] Alternatively, the modeling system 104 can be implemented in one or more dedicated hardwired circuits or through an integration of several distinct computing devices connected to the visual display 102. The hardwired circuitry and/or integrated computing devices can be configured to generate visual renderings of portions of the human body in a manner that displays their elastic characteristics using a three-dimensional mesh of discrete points along with point masses at each node and damped springs at edges of the mesh as already described. According to still another embodiment, the modeling system 104 can be implemented as a combination of hardwired circuitry, computing devices, and/or machine-readable code.
[0031] The haptic device 106 in communication with the tissue modeling system 104 generates force feedback (i.e., a "haptic" or tactile sensation) felt by the user in response to simulated touching of the human body. The simulated touching can comprise, for example, a simulated application of a scalpel, forceps, brush, scissors, or fluid from an irrigator. Accordingly, the haptic device 106 can include a mock instrument (not shown). According to one particular embodiment, the haptic device 106 includes a plurality of interchangeable mock instruments configured to give the user the feel of different instruments used for performing a wound debridement, including a scalpel, forceps, brush, scissors, and irrigator.
[0032] The haptic device 106 causes the visual display 102 to render an image of the particular instrument in juxtaposition to the portion of the human body rendered in the same image. As the user moves the mock instrument, the visual image of the instrument moves relative to the image of the human body portion also comprising part of the visual image.
[0033] The degrees of freedom in movement afforded to the user are determined by a mechanical interface (not shown) that is also a component of the haptic device 106. The mechanical interface, more particularly, provides interfaces for input and output between the user, as the mock instrument is manipulated, and one or more processors (also not shown) that cause the visual image to change in response thereto.
[0034] As the user manipulates the mock instrument, the resulting position and/or orientation of the mock instrument is translated by the mechanical interface into a form suitable for interpretation by sensors of the mechanical interface. The haptic device includes electronic circuitry with integrated sensors that track positions and generate the electronic signals used by the system to produce force feedback. The sensors track the movements of the mock instrument and provide suitable electronic signals to the one or more processors, which, in turn, process the position and/or orientation information and cause the image rendered by the visual display 102 to change accordingly. Additionally, the processors generate electronic signals corresponding to force feedback information, the signals being supplied to actuators coupled to the mechanical interface. The actuators generate forces on members of the mechanical apparatus to provide corresponding forces on the mock instrument. The user, accordingly, experiences the forces so generated as realistic simulations of the tactile sensations experienced in performing the particular wound debridement procedure.
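A minimal sketch of the force-feedback computation inside such a loop is given below; the skin is idealized as the plane z = 0, the stiffness and damping constants are illustrative, and the device I/O itself is omitted because it depends on the particular haptic hardware.

```python
import numpy as np

STIFFNESS = 800.0   # N/m, illustrative value
DAMPING = 2.0       # N*s/m, illustrative value

def feedback_force(tip_pos, tip_vel):
    """Penalty force pushing the tip back out of the virtual skin plane z = 0."""
    penetration = -tip_pos[2]          # depth below the plane
    if penetration <= 0.0:
        return np.zeros(3)
    # Spring force along the surface normal, damped to keep contact stable
    fz = STIFFNESS * penetration - DAMPING * tip_vel[2]
    return np.array([0.0, 0.0, max(fz, 0.0)])

# One iteration of the (typically ~1 kHz) haptic loop:
tip_pos = np.array([0.02, 0.01, -0.003])   # tip 3 mm into the tissue
tip_vel = np.array([0.0, 0.0, -0.05])
print(feedback_force(tip_pos, tip_vel))    # approximately [0, 0, 2.5]
```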
[0035] The training module 108 in communication with the tissue modeling system 104 and the haptic device 106 causes the system and device to operate in a predefined manner in response to at least one user-supplied input. According to one embodiment, the training module 108 receives user input in the form of machine-readable data, entered for example via a keyboard or other input/output (I/O) device. Based on the data, the training module 108 causes a particular portion of the human body to be rendered by the visual display 102 and to exhibit a particular type of wound. The data also can cause the visual image, as well as the tactile responses associated therewith, to change. Accordingly, the data induces not only a change in the image rendered by the visual display but also a change in the tactile sensations generated with the haptic device 106.
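One way such user input could select a predefined scenario is sketched below; the scenario names and fields are hypothetical and serve only to illustrate how a single choice can drive both the rendered body portion and the wound type.

```python
# Hypothetical scenario table mapping a user's selection to rendering and
# haptic settings; none of these names come from the original disclosure.
SCENARIOS = {
    "thigh_shrapnel": {"body_part": "thigh", "wound": "shrapnel",
                       "infection": False, "bleeding": "moderate"},
    "forearm_gunshot": {"body_part": "forearm", "wound": "gunshot",
                        "infection": True, "bleeding": "severe"},
}

def configure_session(user_choice):
    """Return the settings that would drive the display and haptic modules."""
    return SCENARIOS[user_choice]

print(configure_session("thigh_shrapnel"))
```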
[0036] More particularly, the training module 108 can cause the visual display 102 to render an image of a wound such as a thigh wound. The training module 108 can cause the visual display 102 to render a specific type of wound, such as a bullet wound or shrapnel wound. Additionally, the training module 108 can cause the visual display to render particular, predefined characteristics, such as certain types of infection or excessive bleeding. The different renderings can be used to simulate various conditions associated with particular types of wounds, thereby providing a more realistic as well as more varied learning experience for the user.
[0037] Illustratively, the training module 108 is implemented in machine-readable code. As with the tissue modeling system 104, when implemented in machine-readable code, the training module can be configured to run on a general-purpose or application-specific computing device 110 having one or more processors and memory elements, as will be readily understood by one of ordinary skill in the art. Alternatively, the training module 108 can be implemented in one or more dedicated hardwired circuits, or as a combination of hardwired circuitry and machine-readable code.
[0038] According to another embodiment, the simulator 100 additionally includes a recordation module 112 in communication with both the tissue modeling system 104 and the haptic device 106. The recordation module 112 records simulated responses of the portion of the human body to the simulated touching. The recordation effected with the recordation module 112 provides a record of the responses induced by the particular manner in which the user performs one or more procedures for accomplishing wound debridement. If the particular wound debridement procedure or procedures are performed well, the responses generated are accordingly positive in nature. Conversely, if one or more of the procedures are not performed satisfactorily, the record will reflect the substandard performance.
[0039] According to still another embodiment, the simulator additionally includes an evaluation module 114. The evaluation module 114 is illustratively in communication with the recordation module 112. The evaluation module 114 generates a performance evaluation based upon the simulated responses recorded by the recordation module 112. The evaluation module 114 can be used to identify techniques of the particular user in performing the simulated wound debridement. In particular, the evaluation module 114 can identify particular problems the user has with respect to performing one or more of the procedures related to wound debridement.
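A minimal sketch of how recorded responses might be stored and scored is shown below; the event names and the scoring rule are assumptions made for illustration, not the evaluation criteria of the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SessionRecord:
    events: List[Dict] = field(default_factory=list)

    def log(self, kind, **details):
        self.events.append({"kind": kind, **details})

def evaluate(record: SessionRecord) -> float:
    """Crude score: penalize cuts into healthy tissue and excessive bleeding."""
    score = 100.0
    for e in record.events:
        if e["kind"] == "cut" and e.get("tissue") == "healthy":
            score -= 10.0
        if e["kind"] == "bleeding" and e.get("volume_ml", 0) > 50:
            score -= 5.0
    return max(score, 0.0)

rec = SessionRecord()
rec.log("cut", tissue="necrotic")
rec.log("cut", tissue="healthy")
rec.log("bleeding", volume_ml=80)
print(evaluate(rec))   # 85.0
```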
[0040] According to yet another embodiment, the simulator 100 additionally, or alternatively, includes a wound debridement procedures module 116. The wound debridement procedure module 116 can be communicatively linked to the visual display 102 and/or an audio rendering device incorporated in the simulator 100. The wound debridement procedure module 116 generates user guidance, in the form of visual and/or audio output, for performing a predefined wound debridement procedure. Accordingly, the wound debridement procedure module 116 can substitute for or provide a supplement to wound debridement training by a medical professional. According to one particular embodiment, an inexperienced user can begin immediate training by "working through" a simulated procedure under the guidance of the visual and/or audio output provided by the procedure module 116.
[0041] When coupled with the recordation and evaluation modules 112, 114, the procedure module 116 provides a particularly effective teaching mechanism. Medical and non-medical personnel alike who are inexperienced in performing procedures related to wound debridement can, as already pointed out, begin immediately working through procedures in the virtual-reality environment provided by the simulator 100. Thus, the user gains hands-on experience at the outset while receiving direct instruction from the procedure module 116. The performance of the user can be evaluated as he or she carries out a procedure. Alternatively, performance parameters can be recorded by the recordation module 112 during the simulated performance of the wound debridement and then evaluated at the conclusion of the procedure by the evaluation module 114.
[0042] Accordingly, the simulator 100 can provide a mechanism for reaching a larger and more diverse group of individuals, non-medical personnel included, who need to be trained in the technique of wound debridement. Without the need for a patient on which to practice the procedures, inexperienced personnel can learn more efficiently and more rapidly each of the various procedures, while also avoiding risks to real-world patients.
[0043] Referring now to FIG. 2, a flowchart sets forth some exemplary steps for carrying out a method of simulating procedures relating to surgical debridement of a wound according to yet another embodiment of the present invention. The method 200 includes, at step 202, rendering a virtual-reality image of a portion of a human body upon which a wound has been inflicted. The method continues at step 204, whereby the virtual-reality image is caused to change in response to a simulated touching of the portion of the human body, the touching corresponding to at least one procedure for performing a wound debridement. The method 200 further includes, at step 206, simulating force feedback based upon the simulated touching.
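Steps 202-206 can be read as one pass through a render/respond/feedback cycle. The self-contained sketch below illustrates that cycle with stand-in components (a scripted tip trajectory and a flat tissue patch at z = 0); it illustrates only the loop structure, not the actual simulator described above.

```python
import numpy as np

class ScriptedHaptic:
    """Stand-in device that replays a fixed tip trajectory and prints forces."""
    def __init__(self, path):
        self._path = iter(path)
    def read_tip(self):
        return next(self._path)
    def apply_force(self, f):
        print("force feedback ->", np.round(f, 3))       # step 206

class FlatTissue:
    """Stand-in tissue: a plane at z = 0 with a simple penalty response."""
    stiffness = 500.0
    def reaction_force(self, tip):
        depth = max(-tip[2], 0.0)        # penetration below the skin plane
        return np.array([0.0, 0.0, self.stiffness * depth])

def run(haptic, tissue, n_steps):
    for _ in range(n_steps):
        tip = haptic.read_tip()                          # simulated touching
        force = tissue.reaction_force(tip)               # tissue response (step 204)
        haptic.apply_force(force)
        print("render frame, tip at", np.round(tip, 3))  # visual update (step 202)

run(ScriptedHaptic([np.array([0.0, 0.0, 0.01]),
                    np.array([0.0, 0.0, -0.002])]), FlatTissue(), 2)
```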
[0044] More particularly, the step of rendering a virtual-reality image can further include causing the visual display to render an image of a particular one of a plurality of predefined portions of the human body. Alternatively, or in addition to this, the step of rendering a virtual-reality image further can include causing the visual display to render an image of a particular one of a plurality of predefined types of wounds.
[0045] According to another embodiment, the method 200 further includes recording at step 208 simulated responses of the portion of the human body to the simulated touching. The method also can include, according to still another embodiment, generating at step 210 a performance evaluation based upon the simulated responses recorded. Additionally, or in lieu of steps 208 and 210, the method according to yet another embodiment can include generating user guidance for performing a predefined wound debridement procedure at step 212. The method illustratively concludes at step 214.
[0046] Yet another embodiment of the invention is a portable haptic display device. The haptic display device can be used to display an image on a reflective surface. The image can respond interactively to manipulations using a haptic device. Accordingly, the device can be used as an augmented/mixed-reality display device useful with a variety of haptic applications. As will be apparent from the following description, the haptic display device can be portably configured to allow easy transport in various environments.
[0047] As shown in FIGS. 3-5, the haptic display device illustratively includes a stand comprising a base portion, an extension extending from the base portion, and a holding unit connected to the extension in which a laptop computer can be positioned. Connected to the stand, as further shown, is an adjustable display surface onto which an image generated by the laptop computer can be displayed. The haptic display can include a tilt adjustment for adjustably aligning the adjustable display in relation to a viewer. According to one particular embodiment, the holding unit is adjustably connected to the base portion, and the device further includes a tilt adjustment for adjustably aligning the holding unit relative to a viewer.
[0048] The display surface of the device preferably is a translucent surface whose translucency can be modified to accommodate a plurality of distinct applications. The holding unit is preferably configured to provide access to peripheral connections of the laptop computer when the laptop computer is positioned within the holding unit. The haptic display device, moreover, can optionally operate with one or more haptic systems. More particularly, the haptic display device can operate with twin or dual haptic systems in order to provide for two-handed interactions. Additional representations of an embodiment of the haptic display device are provided in the APPENDIX.
[0049] The present invention, as noted throughout, can be realized in hardware, software, or a combination of hardware and software. The present invention can be realized in a centralized fashion in one computer system, or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software can be a general purpose computer system with a computer program that, when loaded and executed, controls the computer system such that it carries out the methods described herein.
[0050] The present invention also can be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
[0051] This invention can be embodied in other forms without departing from the spirit or essential attributes thereof. Accordingly, reference should be made to the following claims, rather than to the foregoing specification, as indicating the scope of the invention.

Claims

CLAIMS
We claim:
1. A wound debridement simulator for simulating procedures relating to surgical debridement of a wound, the wound debridement simulator comprising: a visual display for displaying a simulated rendering of a portion of a human body upon which a wound has been inflicted; a modeling system in communication with the visual display, the system causing the animated rendering to change in response to a simulated touching of the portion of the human body; at least one haptic device in communication with the modeling system, the device simulating force feedback based upon the simulated touching; and a training module in communication with both the modeling system and the haptic device, the module causing the system and device to operate in a predefined manner in response to at least one user-supplied input.
2. The wound debridement simulator defined in Claim 1, wherein the training module causes the visual display to render an image of a particular one of a plurality of predefined portions of the human body.
3. The wound debridement simulator defined in Claim 1, wherein the training module causes the visual display to render an image of a particular one of a plurality of predefined types of wounds.
4. The wound debridement simulator as defined in Claim 1, further comprising a recordation module in communication with both the modeling system and the haptic device, the recordation module recording simulated responses of the portion of the human body to the simulated touching.
5. The wound debridement simulator defined in Claim 4, further comprising an evaluation module in communication with the recordation module, the evaluation module generating a performance evaluation based upon the simulated responses recorded.
6. The wound debridement simulator as defined in Claim 1, further comprising a wound debridement procedures module for generating user guidance for performing a predefined wound debridement procedure.
7. A method for simulating procedures relating to surgical debridement of a wound, the method comprising the steps of: rendering a virtual-reality image of a portion of a human body upon which a wound has been inflicted; causing the virtual-reality image to change in response to a simulated touching of the portion of the human body, the touching corresponding to at least one procedure for performing a wound debridement; and simulating force feedback based upon the simulated touching.
8. The method of Claim 7, further comprising causing a visual display to render an image of a particular one of a plurality of predefined portions of the human body.
9. The method of Claim 7, further comprising causing a visual display to render an image of a particular one of a plurality of predefined types of wounds.
10. The method of Claim 7, further comprising recording simulated responses of the portion of the human body to the simulated touching.
11. The method of Claim 10, further comprising generating a performance evaluation based upon the simulated responses recorded.
12. The method of Claim 7, further comprising generating user guidance for performing a predefined wound debridement procedure.
13. A computer-readable storage medium, the storage medium comprising computer instructions for:
rendering a virtual-reality image of a portion of a human body upon which a wound has been inflicted;
causing the virtual-reality image to change in response to a simulated touching of the portion of the human body, the touching corresponding to at least one procedure for performing a wound debridement; and
simulating force feedback based upon the simulated touching.
14. The computer-readable storage medium of Claim 13, further comprising at least one computer instruction for recording simulated responses of the portion of the human body to the simulated touching and generating a performance evaluation based upon the simulated responses recorded.
15. The computer-readable storage medium of Claim 14, further comprising at least one computer instruction for generating user guidance for performing a predefined wound debridement procedure.
16. A portable haptic display device comprising:
a stand comprising a base portion, an extension extending from the base portion, and a holding unit connected to the extension in which a laptop computer can be positioned; and
an adjustable display surface connected to the stand onto which an image generated by the laptop computer can be displayed.
17. The device of Claim 16, further comprising a tilt adjustment for adjustably aligning the adjustable display surface in relation to a viewer.
18. The device of Claim 16, wherein the holding unit is adjustably connected to the base portion, and further comprising a tilt adjustment for adjustably aligning the holding unit relative to a viewer.
19. The device of Claim 16, wherein the display surface comprises a translucent surface whose translucency can be modified to accommodate a plurality of distinct applications.
20. The device of Claim 16, wherein the holding unit is configured to provide access to peripheral connections of a laptop computer when the laptop computer is positioned within the holding unit.
21. The device of Claim 16, wherein the device operates with at least one haptic system.
22. The device of Claim 21, wherein the at least one haptic system comprises a dual haptic system for two-handed interactions.
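For orientation only, and not as a description of the claimed invention, the following minimal Python sketch illustrates the kind of per-step computation contemplated by method claims 7 through 15: deform a simulated tissue patch in response to a simulated touch and return a force-feedback value. The grid, stiffness constant, falloff rule, and all names are assumptions made solely to keep the sketch self-contained and runnable.

```python
# Hypothetical single step of a wound-debridement simulation loop:
# deform a small height-field "tissue" patch under a probe and compute
# a penalty-based feedback force.  Parameters are illustrative only.

STIFFNESS = 200.0      # assumed tissue stiffness (N per unit depth)
RADIUS = 2             # grid cells influenced by the probe

def simulate_touch(heights, probe_xy, probe_depth):
    """Deform `heights` (a 2-D list) around probe_xy and return the
    feedback force magnitude for this frame."""
    px, py = probe_xy
    penetration = max(0.0, probe_depth)
    for dy in range(-RADIUS, RADIUS + 1):
        for dx in range(-RADIUS, RADIUS + 1):
            x, y = px + dx, py + dy
            if 0 <= y < len(heights) and 0 <= x < len(heights[0]):
                # Push nearby surface points down, less so farther away.
                falloff = 1.0 / (1 + abs(dx) + abs(dy))
                heights[y][x] -= penetration * falloff
    return STIFFNESS * penetration   # simple spring-like reaction force


if __name__ == "__main__":
    tissue = [[0.0] * 8 for _ in range(8)]   # flat patch of simulated tissue
    force = simulate_touch(tissue, probe_xy=(4, 4), probe_depth=0.05)
    print(f"feedback force: {force:.1f} N")
    print("deformed row through contact:", [round(h, 3) for h in tissue[4]])
```

A real embodiment would replace the toy height field and linear spring with the tissue models and haptic devices described elsewhere in this application.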
PCT/US2006/031063 2005-08-08 2006-08-08 System, device, and methods for simulating surgical wound debridements WO2007019546A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US70641405P 2005-08-08 2005-08-08
US60/706,414 2005-08-08

Publications (2)

Publication Number Publication Date
WO2007019546A2 true WO2007019546A2 (en) 2007-02-15
WO2007019546A3 WO2007019546A3 (en) 2009-04-02

Family

ID=37728021

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/031063 WO2007019546A2 (en) 2005-08-08 2006-08-08 System, device, and methods for simulating surgical wound debridements

Country Status (1)

Country Link
WO (1) WO2007019546A2 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7202851B2 (en) * 2001-05-04 2007-04-10 Immersion Medical Inc. Haptic interface for palpation simulation
US20050142525A1 (en) * 2003-03-10 2005-06-30 Stephane Cotin Surgical training system for laparoscopic procedures

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BERKLEY ET AL.: 'Real-Time Finite Element Modeling for Surgery Simulation: An Application to Virtual Suturing.' IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS vol. 10, no. 3, May 2004, pages 321 - 324 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441205B (en) * 2008-11-17 2013-04-24 江苏科技大学 Test system of biological soft tissue force feedback touch feeling model building
WO2010083272A1 (en) * 2009-01-15 2010-07-22 Simquest Llc Interactive simulation of biological tissue
WO2010148078A3 (en) * 2009-06-16 2011-07-07 Simquest Llc Hemorrhage control simulator
US9142144B2 (en) 2009-06-16 2015-09-22 Simquest Llc Hemorrhage control simulator
US10729650B2 (en) 2017-01-23 2020-08-04 United States Of America As Represented By The Secretary Of The Air Force Skin punch biopsy and wound-debridgement training model
CN109389590A (en) * 2017-09-28 2019-02-26 上海联影医疗科技有限公司 Colon image data processing system and method
WO2019061202A1 (en) * 2017-09-28 2019-04-04 Shenzhen United Imaging Healthcare Co., Ltd. System and method for processing colon image data
US11216948B2 (en) 2017-09-28 2022-01-04 Shanghai United Imaging Healthcare Co., Ltd. System and method for processing colon image data
CN109389590B (en) * 2017-09-28 2022-02-08 上海联影医疗科技股份有限公司 Colon image data processing system and method
EP4070725A1 (en) * 2021-04-07 2022-10-12 Koninklijke Philips N.V. Predicting ballistic forces on a subject
WO2022214408A1 (en) * 2021-04-07 2022-10-13 Koninklijke Philips N.V. Ballistic forces on a subject
EP4145468A1 (en) * 2021-09-07 2023-03-08 Toyota Jidosha Kabushiki Kaisha Injury estimation system, injury estimation method, and injury estimation program

Also Published As

Publication number Publication date
WO2007019546A3 (en) 2009-04-02

Similar Documents

Publication Publication Date Title
Liu et al. A survey of surgical simulation: applications, technology, and education
Escobar-Castillejos et al. A review of simulators with haptic devices for medical training
Schendel et al. A surgical simulator for planning and performing repair of cleft lips
Meier et al. Virtual reality: surgical application—challenge for the new millennium
Fried et al. The role of virtual reality in surgical training in otorhinolaryngology
WO2007019546A2 (en) System, device, and methods for simulating surgical wound debridements
Lange et al. Virtual reality in surgical training
Okamura et al. Haptics in medicine and clinical skill acquisition [special section intro.]
Mathew et al. Role of immersive (XR) technologies in improving healthcare competencies: a review
Wei et al. Augmented optometry training simulator with multi-point haptics
Müller et al. The virtual reality arthroscopy training simulator
KR20050047548A (en) Device and method for generating a virtual anatomic environment
He et al. Robotic simulators for tissue examination training with multimodal sensory feedback
Behringer et al. Some usability issues of augmented and mixed reality for e-health applications in the medical domain
KR100551201B1 (en) Virtual dental training and evaluation system using haptic interface based volumetric model
Han et al. Virtual reality simulation of high tibial osteotomy for medical training
Nakao et al. Transferring bioelasticity knowledge through haptic interaction
Khwanngern et al. Jaw surgery simulation in virtual reality for medical training
Coles et al. Haptic palpation for the femoral pulse in virtual interventional radiology
Frisoli et al. Simulation of real-time deformable soft tissues for computer assisted surgery
Tai et al. Real-time visuo-haptic surgical simulator for medical education–a review
Perez et al. Cataract surgery simulator for medical education & finite element/3D human eye model
Itsarachaiyot et al. Force acquisition on surgical instruments for virtual reality surgical training system
Gutiérrez-Fernández et al. An immersive haptic-enabled training simulation for paramedics
Montgomery et al. A surgical simulator for cleft lip planning and repair

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06801052

Country of ref document: EP

Kind code of ref document: A2