CA2201877C - Surgical navigation systems including reference and localization frames - Google Patents
- Publication number
- CA2201877C (application CA2201877A)
- Authority
- CA
- Canada
- Prior art keywords
- body elements
- procedure
- during
- relative
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0073—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by tomography, i.e. reconstruction of 3D images from 2D projections
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/17—Guides or aligning means for drills, mills, pins or wires
- A61B17/1739—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0062—Arrangements for scanning
- A61B5/0064—Body surface scanning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1077—Measuring of profiles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/12—Devices for detecting or locating foreign bodies
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/50—Clinical applications
- A61B6/501—Clinical applications involving diagnosis of head, e.g. neuroimaging, craniography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/17—Guides or aligning means for drills, mills, pins or wires
- A61B17/1703—Guides or aligning means for drills, mills, pins or wires using imaging means, e.g. by X-rays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00022—Sensing or detecting at the treatment site
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
- A61B2034/207—Divots for calibration
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2072—Reference field transducer attached to an instrument or patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/363—Use of fiducial points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3925—Markers, e.g. radio-opaque or breast lesions markers ultrasonic
- A61B2090/3929—Active markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
- A61B2090/3945—Active visible markers, e.g. light emitting diodes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4209—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
- A61B8/4227—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames characterised by straps, belts, cuffs or braces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
- A61B90/11—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
Abstract
This invention is a system for use during a medical or surgical procedure on a body. The system generates an image representing the position of one or more body elements (10), (20), (30) using scans generated by a scanner prior to or during the procedure. The image data set has reference points (10A-30C) for each of the body elements, the reference points of a body element having a fixed spatial relation to the body element. The system includes an apparatus (108) for identifying the relative position of each of the reference points of each of the body elements to be displayed. The system also includes a processor (104) for modifying the image data set according to the identified relative position of each of the reference points during the procedure. Also disclosed are devices for use with a surgical navigation system having a sensor array (110), (112) in communication with the device to identify its position.
Description
Background of the Invention

The invention relates generally to systems which use and generate images during medical and surgical procedures, which images assist in executing the procedures and indicate the relative position of various body parts and instruments. In particular, the invention relates to a system for generating images during medical and surgical procedures based on a scan taken prior to or during the procedure and based on the present position of the body parts and instruments during the procedure.
Image guided medical and surgical procedures comprise a technology by which scans, obtained either pre-procedurally or intra-procedurally (i.e., prior to or during a medical or surgical procedure), are used to generate images to guide a doctor during the procedure.
The recent increase in interest in this field is a direct result of the recent advances in scanning technology, especially in devices using computers to generate three dimensional images of parts of the body, such as computed tomography (CT) or magnetic resonance imaging (MRI).
The majority of the advances in diagnostic imaging involve devices which tend to be large, encircle the body part being imaged, and are expensive. Although the scans produced by these devices depict the body part under investigation with high resolution and good spatial fidelity, their cost usually precludes the dedication of a unit to be used during the performance of procedures.
Therefore, image guided surgery is usually performed using images taken preoperatively.
The reliance upon preoperative images has focused image guidance largely on the cranium. The skull, by encasing the brain, serves as a rigid body which largely inhibits changes in anatomy between imaging and surgery. The skull also provides a relatively easy point of reference to which fiducials or a reference system may be attached so that registration of pre-procedural images to the procedural work space can be done simply at the beginning, during, or throughout the procedure. Registration is defined as the process of relating pre-procedural or intra-procedural scans of the anatomy undergoing surgery to the surgical or medical position of the corresponding anatomy. For example, see U.S. Patent No. 5,383,454.
This situation of rigid fixation and absence of anatomical movement between imaging and surgery is unique to the skull and intracranial contents and permits a simple one-to-one registration process as shown in Figure 1. The position during a medical procedure or surgery is in registration with the pre-procedural image data set because of the absence of anatomical movement from the time of the scan until the time of the procedure; in effect, the skull and its intracranial contents comprise a "rigid body," that is, an object which does not deform internally. In almost every other part of the body there is ample opportunity for movement within the anatomy which degrades the fidelity by which the pre-procedural scans depict the intra-procedural anatomy. Therefore, additional innovations are needed to bring image guidance to the rest of the body beyond the cranium.
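For a rigid body, the one-to-one registration just described amounts to finding the rotation and translation that best map the reference points as located in the scan onto the same points as located intra-procedurally. As an illustrative sketch only (the patent does not specify an algorithm, and all function names and point values below are invented), the two-dimensional case has a closed-form least-squares solution:

```python
import math

def register_2d(model_pts, probe_pts):
    """Least-squares planar rigid registration: find the rotation angle and
    translation mapping model_pts (scan frame) onto probe_pts (procedure
    frame), where the two lists contain corresponding reference points."""
    n = len(model_pts)
    mcx = sum(p[0] for p in model_pts) / n
    mcy = sum(p[1] for p in model_pts) / n
    pcx = sum(p[0] for p in probe_pts) / n
    pcy = sum(p[1] for p in probe_pts) / n
    # Closed-form 2D Procrustes: optimal angle from summed cross/dot products
    # of the centroid-centered point pairs.
    cross = dot = 0.0
    for (mx, my), (px, py) in zip(model_pts, probe_pts):
        ax, ay = mx - mcx, my - mcy
        bx, by = px - pcx, py - pcy
        cross += ax * by - ay * bx
        dot += ax * bx + ay * by
    theta = math.atan2(cross, dot)
    c, s = math.cos(theta), math.sin(theta)
    # Translation carries the rotated model centroid onto the probe centroid.
    tx = pcx - (c * mcx - s * mcy)
    ty = pcy - (s * mcx + c * mcy)
    return theta, (tx, ty)

def apply_rigid(theta, t, p):
    """Map a scan-frame point into the procedure frame."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + t[0], s * p[0] + c * p[1] + t[1])
```

The three-dimensional case is solved analogously in closed form via the singular value decomposition (the Kabsch algorithm); at least three non-collinear reference points per rigid body are needed.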
The accuracy of image guided surgery relies upon the ability to generate images during medical and surgical procedures based on scans taken prior to or during the procedure and based on the present position and shape of the body parts during the procedure. Two types of body parts are addressed herein: 1) structures within the body that do not change shape, do not compress, nor deform between the process of imaging and the medical procedure, which are termed "rigid bodies,"
and are exemplified by the bones of the skeleton; and 2) structures within the body that can change shape and deform between the process of imaging and the medical procedure, which structures are termed "semi-rigid bodies," and are exemplified by the liver or prostate. Both types of body parts are likely targets for medical or surgical procedures either for repair, fusion, resection, biopsy, or radiation treatment. Therefore, a technique is needed whereby registration can be performed between the body parts as depicted pre-procedurally on scans and the position and shape of these same body parts as detected intra-procedurally. This technique must take into account that movement can occur between portions of the body which are not rigidly joined, such as bones connected by a joint, or fragments of a broken bone, and that shape deformation can occur for semi-rigid bodies, such as the liver or prostate. In particular, the technique must be able to modify the scanned image data set such that the modified image data set, which is used for localization and display, corresponds to the position and/or shape of the body part(s) of interest
during a medical or surgical procedure. A key to achieving this correspondence is the ability to precisely detect and track the position and/or shape of the body part(s) of interest during the medical or surgical procedure, as well as to track instruments or radiation used during said procedure.
Summary of the Invention

It is an object of this invention to provide a system which allows registration between a body part depicted in pre-procedural images and tracked during surgery.
It is a further object of this invention to provide a system which allows registration between a semi-rigid body such as the liver depicted in pre-procedural images and detected during surgery.
It is a further object of this invention to provide a system which allows registration between multiple body parts such as skeletal elements depicted in pre-procedural images and detected during surgery.
It is a further object of this invention to provide a system which can localize a semi-rigid body that may deform between imaging and a procedure and provide a display during the procedure of the body in its deformed shape.
It is a further object of this invention to provide a system which can localize multiple rigid bodies that move with respect to each other between imaging and a procedure and provide a display during the procedure of the bodies in their displaced positions.
It is another object of this invention to provide a system for use during a medical or surgical procedure on the body, the system generating a display representing the position of one or more body elements during the procedure based on a scan generated by a scanner either prior to or during the procedure.
It is another object of this invention to provide a system for use during a medical or surgical procedure on a body which modifies the scan taken prior to or during a procedure according to the identified relative position of each of the elements during the procedure.
It is another object of this invention to provide a system for use during a medical or surgical procedure on a body which modifies the image data set according to the identified shape of each of the elements during the procedure.
It is another object of this invention to provide a system which generates a display representative of the position of a medical or surgical instrument in relation to the body element(s) during a procedure.
It is a further object of this invention to provide a system for use during image guided medical and surgical procedures which is easily employed by the doctor or surgeon conducting the procedure.
It is another object of this invention to provide a system which determines the relative position and/or shape of body elements during a medical or surgical procedure based on the contour of the body elements, which can avoid the need for exposing the body elements.
It is still another object of this invention to provide a system which employs one or more two dimensional fluoroscopic or x-ray images of body elements to determine their relative position and/or shape in three dimensions.
It is yet a further object of this invention to describe a surgical or medical procedure which employs a display representing the position of the body element(s) during the procedure based on an image data set of the body element(s) generated prior to the procedure.
It is a further object of this invention to provide a system and method for medical or surgical procedures which allows repositioning of body elements during the procedure and still permits the generation of an image showing the relative position of the body elements.
It is a further object of this invention to provide a system and method for medical or surgical procedures which allows reshaping of the body element(s) during the procedure and still permits the generation of an image showing the position and current shape of the body elements.
It is a further object of this invention to provide a system which can localize a body element and provide a display during the procedure of the position of the body element relative to an instrument, such as a forceps, microscope, or laser, so that the instrument can be precisely located relative to the body element.
Other objects and features will be in part apparent and in part pointed out hereinafter.
The invention comprises a system for use during a procedure on a body, said system generating a display representing a position of two or more body elements during the procedure, said system comprising: a memory storing an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a plurality of data points and corresponding to a plurality of reference points for each of the body elements, the reference points of a particular body element having a known spatial relation to the data points of the particular body element; means for identifying, during the procedure, the position of the reference points of each of the body elements relative to the reference points of the other body elements; a processor modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as identified by the identifying means, said processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor and providing a display illustrating the relative position of the body elements during the procedure.
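The processing step claimed above, in which each body element's data points are moved as a unit according to that element's tracked reference points and the results are pooled into a displaced image data set, can be sketched as follows. This is an illustrative sketch only; the per-element transform representation and all names and coordinate values are assumptions, not the patent's implementation:

```python
import math

def rigid_2d(theta, tx, ty):
    """Return a function applying a planar rigid motion: rotate by theta,
    then translate by (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return lambda p: (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

def displaced_image_data_set(elements):
    """elements: list of (transform, data_points) pairs, one pair per body
    element.  Each element's data points move as a unit under that element's
    transform; the displaced image data set is the union of the results."""
    displaced = []
    for xform, points in elements:
        displaced.extend(xform(p) for p in points)
    return displaced

# Two vertebra-like elements: one stays put, one has shifted 5 mm along x
# between scanning and surgery (values are illustrative only).
static = rigid_2d(0.0, 0.0, 0.0)
shifted = rigid_2d(0.0, 5.0, 0.0)
new_set = displaced_image_data_set([(static, [(0.0, 0.0), (1.0, 0.0)]),
                                    (shifted, [(0.0, 0.0)])])
```

Regenerating the display from `new_set` rather than from the original scan is what keeps the on-screen image consistent with the elements' intra-procedural positions.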
The invention also comprises a method for use during a procedure, said method generating a display representing a position of two or more body elements during the procedure, said method comprising the steps of: storing an image data set in a memory, the image data set representing the position of the two or more body elements based on scans taken of the body; reading the image data set stored in the memory, said image data set having a plurality of data points and a plurality of reference points for each of the body elements, the data points of each body element having a spatial relation to the data points of another body element, the reference points of a particular body element having a known spatial relation to the data points of the particular body element; identifying, during the procedure, the position of each of the reference points of each of the body elements relative to the reference points of the other body elements; modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure in order to generate a displaced image data set representing the relative position of the body elements during the procedure; and generating a display based on the displaced image data set illustrating the relative position of the body elements during the procedure.
The invention also comprises a system for use during a procedure on a body, said system generating a display representing a geometrical relationship of two or more body elements during the procedure, said system comprising: a memory for storing an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a geometrical spatial relationship of the body elements and having a plurality of reference points on each of the body elements, the reference points of a particular body element having a known spatial relation on the particular body element; means for identifying, during the procedure, a position of each of the reference points of the body elements to be displayed; a processor modifying the geometrical relationship of the body elements as necessary according to the identified position of each of the reference points as identified by the identifying means, said processor generating a displaced image data set from the modified geometrical relationship of the body elements, said displaced image data set representing the geometrical relationship of the body elements during the procedure; and a display system utilizing the displaced image data set to generate the display.
The invention also comprises a system for use 30 during a procedure on a body, said system generating a display representing a position of two or more body elements during the procedure, said system comprising: a memory configured to store an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a plurality of data points and corresponding to a plurality of reference points for each of the body elements, the reference points of a particular body element having a known spatial relation to the data points of the particular body element; means for identifying the position of the reference points; a digitizer positioned to receive a signal output by the means for identifying, said digitizer being configured to determine a three dimensional position of the reference points for the particular body element; a processor modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as determined by the digitizer, said processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor, illustrating the relative position of the body elements during the procedure.
The invention also comprises a system for displaying relative positions of two or more body elements during a procedure on a body, the system comprising: a memory storing an image data set, the image data set representing the position of the body elements based on scans of the body, and having a plurality of data points correlatable to a plurality of reference points for each of the body elements, the position of reference points of a particular body element relative to the data points for that particular body element being known; a reference system for identifying, during the procedure, the position of the reference points of each of the body elements relative to the reference points of the other body elements; a processor modifying the relative position of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as identified by the reference system, the processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor to display the relative position of the body elements during the procedure.
The invention also comprises a system for use during a medical or surgical procedure on a body, said system being adapted to generate a display from an image data set representing the position of one or more body elements during the procedure based on scans taken of the body, wherein at least some of the body elements are movable with respect to each other, each scan having a reference for each of the one or more body elements, the reference of a particular body element having a known spatial relation to the particular body element, said system comprising: a plurality of reference frames, at least one attached to each of the movable body elements; a localizer for identifying, during the procedure, the position of the reference for each of the body elements to be displayed; a processor for modifying the image data set according to the identified position of the reference during the procedure, as identified by the localizer, said processor being adapted to generate a displaced image data set representing the position and geometry of the body elements according to the identified position of the reference frames during the procedure, as identified by the localizer; and a display utilizing the displaced image data set generated by the processor, illustrating the position and geometry of the body elements during the procedure.
The invention also comprises a method for use during a medical or surgical procedure on a body, said method being adapted to generate a display from an image data set representing the position of one or more body elements during the procedure based on scans taken of the body, wherein at least some of the body elements are movable with respect to each other, each scan having a reference for each of the one or more body elements, the reference of a particular body element having a known spatial relation to the particular body element, wherein there is a plurality of reference frames, at least one attached to each of the movable body elements; said method comprising the steps of:
identifying, during the procedure, the position of the reference for each of the body elements to be displayed;
modifying the image data set according to the identified position of the reference during the procedure, as identified by the localizer; generating a displaced image data set representing the position and geometry of the body elements according to the identified position of the reference frames during the procedure; and illustrating the position and geometry of the body elements during the procedure utilizing the displaced image data set.
In addition, the invention comprises a method for displaying the relative positions of a plurality of body elements during a procedure on a body, the method comprising: storing scan images of body elements, the scan images having a plurality of data points corresponding to a plurality of reference points for each of the body elements, the position of the reference points of a particular body element relative to the data points for that particular body element being known; identifying the position of each of the reference points of each of the body elements relative to the reference points of the other body elements during a procedure; modifying the relative position of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure;
generating a displaced image data set representing the position of the body elements during the procedure; and displaying the relative position of the body elements during the procedure.
Brief Description of the Drawings
Figure 1 is an illustration of the prior art system in which rigid fixation and absence of movement between imaging and surgery permits a one-to-one registration process between the pre-surgical scan and the position in surgery.
Figure 2A is an illustration of operation of the invention in which the pre-procedural image data set is modified in accordance with the intra-procedural position in order to generate a displaced and/or deformed data set representative of the intra-procedural position.
Figure 2B is a block diagram of one preferred embodiment of a system according to the invention.
Figure 3 is an illustration of the pre-procedural alignment of three body elements during scanning.
Figure 4 is an illustration of the intra-procedural alignment of the three body elements of Figure 3 during surgery.
Figure 5 is an illustration of three body elements, one of which has a reference frame attached thereto, in combination with a registration probe.
Figure 6 is an illustration showing ultrasound registration according to the invention in which emitters are attached to the ultrasound for a virtual reference and, optionally, the patient's body for an actual reference.
Figure 7 is an illustration of a fluoroscopic localizer according to the invention for providing projections of an image of the body elements.
Figure 8 is an illustration of a drill guide instrument of the invention wherein the position of a drill guide relative to the body elements may be displayed.
Figures 9 and 10 illustrate a clamped reference frame and a wired reference frame, respectively.
Figure 11 is a schematic diagram of one preferred embodiment of a cranial surgical navigation system according to the invention.
Figure 11A is a top plan view of one preferred embodiment of a cranial reference arc frame according to the invention.
Figure 11B is a side plan view, partially in cross section, of one preferred embodiment of a cranial reference arc frame according to the invention.
Figure 11C is a wiring diagram of one preferred embodiment of a cranial reference arc frame according to the invention.
Figure 12A is a top plan view of one preferred embodiment of a spinal reference arc frame according to the invention.
Figure 12B is a front plan view, partially in cross section, of one preferred embodiment of a spinal reference arc frame according to the invention.
Figure 12C is a side plan view of one preferred embodiment of a spinal reference arc frame according to the invention.
Figure 12D is a top plan view of one preferred embodiment of a thoraco-lumbar mount according to the invention.
Figure 12E is a front plan view, partially in cross section, of one preferred embodiment of a thoraco-lumbar mount according to the invention.
Figure 12F is a side plan view of one preferred embodiment of a thoraco-lumbar mount according to the invention.
Figure 12G is a wiring diagram of one preferred embodiment of a spinal reference arc frame according to the invention.
Figure 13A is a top plan view of one preferred embodiment of a biopsy guide localization frame according to the invention.
Figure 13B is a side plan view, partially in cross section, of one preferred embodiment of a biopsy guide localization frame according to the invention.
Figure 13C is a front plan view of one preferred embodiment of a biopsy guide localization frame according to the invention.
Figure 13D is a top plan view of one preferred embodiment of a drill guide localization frame according to the invention.
Figure 13E is a side plan view, partially in cross section, of one preferred embodiment of a drill guide localization frame according to the invention.
Figure 13F is a top plan view of one preferred embodiment of a drill yoke localization frame according to the invention.
Figure 13G is a side plan view, partially in cross section, of one preferred embodiment of a drill yoke localization frame according to the invention.
Figure 13H is a top plan view of one preferred embodiment of a ventriculostomy probe including an integrated localization frame according to the invention.
Figure 13I is a side plan view, partially in cross section, of one preferred embodiment of a ventriculostomy probe including an integral localization frame according to the invention.
Figure 13J is a wiring diagram of one preferred embodiment of a localization frame according to the invention.
Corresponding reference characters indicate corresponding parts throughout the drawings.
Detailed Description of the Preferred Embodiments
Referring to Fig. 2, an overview of operation of one preferred embodiment of the system according to the invention is illustrated. Prior to a particular procedure, the body elements which will be part of the procedure are scanned to determine their alignment, i.e., their pre-operative position. For example, the alignment may be such as illustrated in Fig. 3 wherein body elements 10, 20, and 30 are more or less aligned in parallel. These body elements may be bones or other rigid bodies. In Fig. 3, three-dimensional skeletal elements 10, 20, 30 are depicted in two dimensions as highly stylized vertebral bodies, with square vertebrae 11, 21, 31, small rectangular pedicles 12, 22, 32, and triangular spinous processes 13, 23, 33. During imaging, scans are taken at intervals through the body parts 10, 20, 30 as represented in Fig. 3 by nine straight lines generally referred to by reference character 40. At least one scan must be obtained through each of the body elements, and the scans taken together constitute a three-dimensional pre-procedural image data set.
Fig. 2B is a block diagram of the system according to the invention. A scanner interface 102 allows a processor 104 to obtain the pre-procedural image data set generated by the scanner and store the data set in pre-procedural image data set memory 106. Preferably, after imaging, processor 104 applies a discrimination process to the pre-procedural image data set so that only the body elements 10, 20, 30 remain in memory 106. If a discrimination process is employed, processor 104 may execute the discrimination process while data is being transferred from the scanner through the scanner interface 102 for storage in memory 106. Alternatively, memory 106 may be used for storing undiscriminated data and a separate memory (not shown) may be provided for storing the discriminated data. In this alternative,
WO 96/11624
processor 104 would transfer the data set from the scanner through scanner interface 102 into memory 106 and then would discriminate the data stored in memory 106 to generate a discriminated image data set which would be stored in the separate memory.
Once the body elements 10, 20, 30 are discriminated and each defined as a single rigid body, they can be repositioned by established software algorithms to form the displaced image data set. Each rigid body element 10, 20, 30 must have at least three recognizable reference points which are visible on the pre-procedural images. These reference points must be accurately detected during the procedure. For body part 10, reference points 10A, 10B, and 10C are located on the spinous process 13; for body part 20, reference points 20A and 20C are located on the vertebra 21 and reference point 20B is located on spinous process 23; and for body part 30, reference points 30A and 30B are located on the spinous process 33 and reference point 30C is located on the vertebra 31. More than one reference point can be selected on each scan through the bone, although the maximal accuracy of registration is achieved by separating the reference points as far as possible. For example, in the case of posterior spinal surgery, it may be preferable to select reference points 10A, 10B, and 10C on the spinous process, which is routinely exposed during such surgery. It is contemplated that system software may allow the manual or automated identification of these same points on the images of the body elements 10, 20, 30. As Fig. 3 is a two-dimensional projection of a three-dimensional process, the reference points will not be limited to a perfect sagittal plane, as depicted.
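The repositioning of a discriminated rigid body from its three reference points can be sketched in a few lines of Python. This is only an illustrative sketch of the kind of established algorithm referred to above, not the system's actual software; all names are hypothetical, and the three reference points are assumed non-collinear:

```python
import math

def _sub(p, q): return tuple(a - b for a, b in zip(p, q))
def _add(p, q): return tuple(a + b for a, b in zip(p, q))
def _dot(p, q): return sum(a * b for a, b in zip(p, q))
def _scale(p, s): return tuple(a * s for a in p)
def _cross(p, q):
    return (p[1]*q[2] - p[2]*q[1], p[2]*q[0] - p[0]*q[2], p[0]*q[1] - p[1]*q[0])
def _unit(p):
    return _scale(p, 1.0 / math.sqrt(_dot(p, p)))

def _frame(a, b, c):
    """Orthonormal basis anchored at reference point a, built from three
    non-collinear reference points of one rigid body element."""
    e1 = _unit(_sub(b, a))
    v = _sub(c, a)
    e2 = _unit(_sub(v, _scale(e1, _dot(v, e1))))
    return a, (e1, e2, _cross(e1, e2))

def make_rigid_map(image_refs, procedure_refs):
    """Return a function mapping image-space points into procedure space,
    given one element's three reference points seen in both spaces."""
    oi, bi = _frame(*image_refs)
    op, bp = _frame(*procedure_refs)
    def map_point(p):
        coords = [_dot(_sub(p, oi), e) for e in bi]  # image-frame coordinates
        out = op
        for s, e in zip(coords, bp):                 # re-express in procedure frame
            out = _add(out, _scale(e, s))
        return out
    return map_point
```

For instance, reference points 10A, 10B, 10C located both in the scan and, via the localizer, in surgical space would yield a map applied to every data point of element 10.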
After imaging, the skeletal body elements 10, 20, 30 may move with respect to each other at the joints or fracture lines. In the procedure room, such as an operating room or a room where a medical procedure will be performed, after positioning the patient for surgery, the body elements will assume a different geometry, such as the geometry depicted in Fig. 4.
As a result of this movement, the pre-procedural image data set stored in memory 106, consisting of the scans through the skeletal elements, does not depict the operative position of the skeletal elements, as shown in Fig. 4. However, the shape of the skeletal elements, as depicted by the scans through each element, is consistent between imaging and procedure since they are rigid bodies, as indicated by the lines 40 through each element in Fig. 4. Therefore, the image data set must be modified to depict the intra-procedural geometry of the skeletal elements. This modification is performed by identifying the location of each reference point of each skeletal element in procedure space. As diagrammatically illustrated in Fig. 2, a localizer 108 (see Figure 13, below, for more details) identifies the location and provides this information so that the pre-procedural data set may be deformed or re-positioned into the displaced data set. As a result, the displaced data set is in registration with the intra-procedural position of the elements 10, 20, 30. Once the locations of the reference points are determined by the localizer 108, processor 104, which is a part of the work station, can execute software which re-positions the images of the skeletal elements to reflect the position of the actual elements in the procedure room, thus forming the displaced set and the registration between the displaced set and the intra-procedural position.
Preferably, a three-dimensional digitizer may be used as the localizer 108 to determine the position in space of the elements 10, 20, 30 during the procedure. In general, the digitizer would include a reference array 110 which receives emissions from a series of emitters. Usually, the emissions consist of some form of energy, such as light, sound or electromagnetic radiation. The reference array 110 is distant from the emitters, which are applied to and positioned in coordination with the elements being localized, and determines the position of the emitters. As is apparent, the emitters may instead be placed distant to the elements and the reference array 110 may be attached to the elements being localized.
Referring to Fig. 2, an alternate preferred embodiment of the system according to the invention, for the case where the body elements are not rigid but rather semi-rigid such that shape deformations may occur to the body elements, is described as follows. Prior to a particular procedure, the body elements which will be part of the procedure are scanned to determine their pre-operative position and shape. For example, the alignment may be such as illustrated in Fig. 3 wherein body elements 10, 20, and 30 are more or less aligned in parallel and have a defined shape. These body elements may be soft tissue such as the prostate or other semi-rigid bodies.
After imaging, the elements 10, 20, 30 may move with respect to each other and also their shape may become deformed. In the procedure room, such as an operating room or a room where a medical procedure will be performed, after positioning the patient for surgery, the body elements may assume a different geometry, such as the geometry depicted in Fig. 4, wherein geometry encompasses both element alignment (position) and shape.
As a result of this changed geometry, the pre-procedural image data set stored in memory 106 does not depict the operative geometry of the body elements, as shown in Fig. 4. Indeed, the shape of the body elements, as depicted by the scans through each element, may have changed between imaging and procedure since they are semi-rigid bodies. Therefore, the image data set must be modified to depict the current geometry of the body elements. This modification is performed by identifying the location of the reference points of each body element in procedure space. As diagrammatically illustrated in Fig. 2, a localizer 108, possibly in communication with a processor 104, identifies the location of the reference points and provides this information so that the pre-procedural data set may be deformed into the displaced data set. Once the locations of the reference points are determined, processor 104, which is a part of the work station, can execute software which modifies the images of the body elements to reflect the geometry of the actual elements in the procedure room, thus forming the displaced set and the registration between the displaced set and the intra-procedural position. As a result, the displaced data set is in registration with the intra-procedural geometry of the elements 10, 20, 30.
According to one preferred embodiment of the invention, a reference frame 116 is attached to one of the body elements 10 at the beginning of the procedure.
Various reference frame embodiments are illustrated in more detail in Figures 11 and 12, below. Reference frame 116 is equipped with a plurality of emitters 114 which together define a three-dimensional intraprocedural coordinate system with respect to the body element 10.
In conventional terms, the reference frame 116 defines the stereotactic space with respect to the body element 10. Emitters 114 communicate with sensors 112 on a reference array 110 located in the procedure room and remote from the reference frame 116 and patient. If the body of the patient is not immobilized during surgery, then multiple reference frames may be required for each body element to define a surgical space with respect to each element. The surgical space may alternatively be defined by rigid fixation of the frame emitters 114 directly (or indirectly, for example, to the skin) to the skeletal elements 10, 20, or 30. In either case, the emitters 114 emit a signal which is received by the sensors 112. The received signal is digitized to compute position, for example, by triangulation. Through such information, the localizer 108, or a digitizer which is part of the localizer 108, can determine the exact three-dimensional position of the frame emitters 114 relative to the sensors 112. Thereby, the localizer 108 or the processor 104 can exactly determine the position of the reference frame 116 relative to the array, which is free to move except during localization, e.g., activation of the emitters 114 on the reference frame 116 and activation of the probe emitters 120. Emitters 114 of the reference frame 116 are energized to provide radiation to the sensors 112, which radiation is received and generates signals provided to the localizer 108 for determining the position of the frame 116 relative to the array 110.
Next, it is necessary to determine the position of the body element 10, which may be a skeletal element, to which the reference frame 116 is affixed or with respect to which it is positioned. In particular, the position of the body element 10 relative to the reference frame 116 must be determined, thereby determining the position of the body element 10 in the surgical space defined by the reference frame 116. After exposure of the reference points 10A, 10B, 10C by surgical dissection, the reference points are touched by the tip of a registration probe 118 equipped with emitters 120. As each of the reference points 10A, 10B, 10C is touched by the tip of the probe 118, the emitters are energized to communicate with the sensors 112 of reference array 110. This communication permits the localizer 108 to determine the position of the registration probe 118, thereby determining the position of the tip of the probe, thereby determining the position of the reference point 10A on which the tip is positioned. By touching each of the reference points 10A, 10B, 10C on each body element 10, 20, 30 involved in the procedure, intra-procedural geometry data is generated and stored in memory 121. This data is related to the corresponding reference points on the pre-procedural images of the same elements by processor 104, which employs software to derive a transformation which allows the determination of the exact procedural position, orientation, and shape in surgical space of each body element, and thereby modifies the pre-procedural image data set stored in memory 106 to produce a displaced image data set which is stored in memory 122.
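Assembling the displaced image data set from the per-element transformations is then mechanical. A hypothetical sketch (the per-element transforms would be those derived from the touched reference points):

```python
def displace_image_data(elements, transforms):
    """Form the displaced image data set: every data point of each
    discriminated body element is moved by that element's own rigid
    transform, leaving the element's shape intact."""
    return {name: [transforms[name](p) for p in points]
            for name, points in elements.items()}
```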
The displaced image data set in memory 122 reflects the geometry of the actual elements 10, 20, 30 during the procedure. Processor 104 displays the displaced image data set on display 124 to provide a visual depiction of the geometry of the body elements 10, 20, 30 during the procedure. This image is used during the procedure to assist in the procedure. In addition, it is contemplated that an instrument, such as a forceps, a laser, a microscope, an endoscope, or a radiation delivery system, which would be used during the procedure may be modified by the addition of emitters. This modified device, when moved into the area of the body elements 10, 20, 30, would be activated so that its emitters would communicate with the reference array 110, thereby permitting the localizer 108 to determine the instrument's position. As a result, processor 104 would modify display 124 to indicate the position of the instrument or the instrument's focal point, such as by positioning a cursor, with respect to the body elements 10, 20, 30. Further, it is contemplated that the addition of emitters on an instrument (effector) may be used with the system in order to create a closed-loop feedback for actively (in the case of robotics) or passively controlling or monitoring the instrument and its position. Such a control loop allows the monitoring of certain procedures, such as the delivery of radiation to the body or the use of a drill, where the object of the procedure is to keep the focal point of the instrument in a safe zone, i.e., a predetermined procedural plan. Such a control loop could also control the operation of a robotically controlled instrument, where the robotics could be driven (directly or indirectly) by processor 104 to control the position of the instrument.
For example, the processor could instruct a robotic arm to control the position of a laser. The laser position could be monitored, such as by emitters on the laser.
The processor would be programmed with the control parameters for the laser so that it would precisely follow a predetermined path.
Reference frame 116 allows the patient to be moved during the procedure without the need for re-registering the position of each of the body elements 10, 20, 30. It is assumed that during the procedure, the body elements are fixed relative to each other. Since the reference frame 116 is fixed (directly or indirectly) to body element 10, movement of the patient results in corresponding movement of the reference frame 116.
Periodically, or after each movement of the patient, frame emitters 114 may be energized to communicate with the sensors 112 of reference array 110 in order to permit localizer 108 to determine the position of the reference frame 116. Since the reference frame 116 is in a known relative position to element 10, and since we have assumed that elements 20 and 30 are in fixed relation to element 10, localizer 108 and/or processor 104 can determine the position of the elements and thereby maintain registration.
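Because the element is rigid with respect to the attached frame, one re-localization of the frame is enough to carry the whole registration along: the frame's motion since the last localization is composed onto each element's pose. A hedged sketch with poses written as (rotation matrix, translation vector) pairs; the names are hypothetical, not the system's actual software:

```python
def rigid_inverse(R, t):
    """Inverse of a rigid pose: (R, t)^-1 = (R^T, -R^T t)."""
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    return Rt, [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]

def compose(Ra, ta, Rb, tb):
    """Pose (Ra, ta) applied after pose (Rb, tb)."""
    R = [[sum(Ra[i][k] * Rb[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    return R, [sum(Ra[i][k] * tb[k] for k in range(3)) + ta[i] for i in range(3)]

def update_registration(frame_old, frame_new, element_old):
    """New pose of a body element after patient movement, given the old and
    newly localized poses of the attached reference frame; the element is
    assumed fixed relative to the frame."""
    motion = compose(*frame_new, *rigid_inverse(*frame_old))
    return compose(*motion, *element_old)
```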
An alternative to touching the reference points A, B, C with the tip of the probe 118 would be to use a contour scanner 126a with emitters 126b attached. Such a device, using some form of energy such as sound or light which is emitted, reflected by the contour, and sensed, would allow the extraction of a contour of the body elements 10, 20, 30, thus serving as a multitude of reference points which would allow registration to occur.
The registration process is analogous to the process described for ultrasound-extracted contours below.
In certain situations, markers may be used on the skin surface as reference points to allow the transformation of the pre-procedural image data set into the displaced image data set. Reciprocally, skin surface fiducials applied at the time of imaging can be used to re-position the body to match the geometry during imaging, as described below.
Localization of body elements 10, 20, 30 may be desired without intra-procedural exposure of the reference points A, B, C on those body elements.
Examples wherein the spine is minimally exposed include percutaneous biopsy of the spine or discectomy, spinal fixation, endoscopy, percutaneous spinal implant insertion, percutaneous fusion, insertion of drug delivery systems, and radiation delivery. In this situation, localization of reference points on the body elements must be determined by some form of imaging which can localize through overlying soft tissue and/or discriminate surrounding tissue and structures. There are currently two imaging techniques which are available to a surgeon in the operating room or a doctor in a procedure room which satisfy the needs of being low cost and portable. Both imaging techniques, ultrasonography and radiography, can produce two- or three-dimensional images which can be employed in the fashion described herein to register a three-dimensional form such as a skeletal element.
The coupling of a three-dimensional digitizer to a probe of an ultrasound device affords benefits in that a contour can be obtained which can be related directly to a reference system that defines three-dimensional coordinates in the procedural work space, i.e., the surgical space. In the context of the present invention, a patient is imaged prior to a procedure to generate a pre-procedural image data set which is stored in memory 106. In the procedure room, the patient's body is immobilized to stabilize the spatial relationship between the body elements 10, 20, 30. A procedural reference system, or surgical space, for the body is established by attaching a reference frame 116 to one of the body elements, by otherwise attaching emitters to the patient or body elements as noted above, or by attaching emitters to a device capable of tracking one of the body elements thereby forming a known relationship with the body element. For example, this could be performed by using the percutaneous placement of a reference frame similar to the one described above, radiopaque markers screwed into the elements, or by placing emitters 130 directly on the skin, as illustrated in Fig. 6, based on the assumption that the skin does not move appreciably during the procedure or with respect to the body elements.
An ultrasound probe 128 equipped with at least three emitters 130 is then placed over the body element of interest. The contour (which can be either two- or three-dimensional) of the body element is then obtained using the ultrasound probe 128. This contour can be expressed directly or indirectly in the procedural coordinates defined by the reference system (surgical space). Emitters 130 communicate with sensors 112 of reference array 110 to indicate the position of the ultrasound probe 128. An ultrasound scanner 131 which energizes probe 128 determines the contour of the body element of interest being scanned. This contour information is provided to processor 104 for storage in intra-procedural geometry data memory 121.
The intra-procedural contour stored in memory 121 is then compared by a contour-matching algorithm to a corresponding contour extracted from the pre-operative image data set stored in memory 106. Alternatively, a pre-procedural contour data set may be stored in memory 134 based on a pre-procedural ultrasound scan which is input into memory 134 via scanner interface 102 prior to the procedure. This comparison process continues until a match is found for each one of the elements. Through this contour-matching process, a registration is obtained between the images of each body element and the corresponding position of each element in the procedural space, thereby allowing the formation of the displaced image data set 122 used for localization and display.
Note that the contours used in the matching process only have to be sufficiently identical to accomplish a precise match; the contours do not have to cover the same extent of the body element.
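One minimal way to realize such a contour-matching step is an exhaustive search over candidate rigid moves, scoring each by the mean squared distance from the observed (possibly partial) contour to the moved pre-procedural contour. This two-dimensional, brute-force Python sketch stands in for whatever matching algorithm the system actually employs; all names and the discretized search grid are assumptions:

```python
import math

def contour_match_2d(model, observed, angles, shifts):
    """Return (error, angle, shift) for the candidate rotation+translation of
    the pre-procedural contour `model` whose points lie closest, on average,
    to the intra-procedural contour `observed` (which may cover only part of
    the element)."""
    best = None
    for a in angles:
        ca, sa = math.cos(a), math.sin(a)
        for dx, dy in shifts:
            moved = [(ca*x - sa*y + dx, sa*x + ca*y + dy) for x, y in model]
            # mean squared nearest-point distance from observed to moved model
            err = sum(min((ox - mx)**2 + (oy - my)**2 for mx, my in moved)
                      for ox, oy in observed) / len(observed)
            if best is None or err < best[0]:
                best = (err, a, (dx, dy))
    return best
```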
In certain instances, the ultrasound registration noted above may not be applicable. For example, ultrasound does not penetrate bone, and the presence of overlying bone would preclude the registration of an underlying skeletal element. Further, the resolution of ultrasound declines as the depth of the tissue being imaged increases and may not be useful when the skeletal element is so deep as to preclude obtaining an accurate ultrasonically generated contour. In these circumstances, a radiological method is indicated, which utilizes the greater penetrating power of x-rays.
Pre-operative imaging occurs as usual and the skeletal elements may be discriminated from the soft tissue in the image data set as above. In particular, a CT scan of the skeletal elements 10, 20, 30 could be taken prior to the procedure. Processor 104 may then discriminate the skeletal elements and store the pre-procedural image data set in memory 106. Next, the patient is immobilized for the procedure. A radiograph of the skeletal anatomy of interest is taken by a radiographic device equipped with emitters detectable by the digitizer. For example, a fluoroscopic localizer 136 is illustrated in Fig. 7. Localizer 136 includes a device which emits x-rays, such as tube 138, and a screen 140 which is sensitive to x-rays, producing an image when x-rays pass through it. This screen is referred to as a fluoroscopic plate. Emitters 142 may be positioned on the tube 138, on the fluoroscopic plate 140, or on both. For devices in which the tube 138 is rigidly attached to the plate 140, emitters need only be provided on either the tube or the plate. Alternatively, the reference array 110 may be attached to the tube or the plate, obviating the need for emitters on this element.
By passing x-rays through the skeletal element 141 of interest, a two-dimensional image based on bone density is produced and recorded by the plate. The image produced by the fluoroscopic localizer 136 is determined by the angle of the tube 138 with respect to the plate 140 and the position of the skeletal elements therebetween, and can be defined with respect to procedure coordinates (surgical space). Fluoroscopic localizer 136 includes a processor which digitizes the image on the plate 140 and provides the digitized image to processor 104 for possible processing and subsequent storage in intra-procedural geometry data memory 121. Processor 104 may simulate the generation of this two-dimensional x-ray image by creating a series of two-dimensional projections of the three-dimensional skeletal elements that have been discriminated in the image data set stored in memory 106.
Each two-dimensional projection would represent the passage of an x-ray beam through the body at a specific angle and distance. In order to form the displaced data set and thus achieve registration, an iterative process is used which selects the two-dimensional projection through the displaced data set that most closely matches the actual radiographic image(s) stored in memory 121.
The described process can utilize more than one radiographic image. Since the processor 104 is also aware of the position of the fluoroscopic localizer 136, because of the emitters 142 thereon, which are in communication with localizer 108, the exact position of the skeletal elements during the procedure is determined.
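The iterative projection-matching loop can be caricatured in a few lines. This sketch is a drastic simplification (parallel rather than cone-beam projection, a single rotational degree of freedom, binary pixels) and every name in it is a hypothetical stand-in for the actual simulation:

```python
import math

def project_silhouette(points, angle, grid=1.0):
    """Parallel projection of an element's discriminated 3-D points onto the
    plate after rotating the element by `angle` about the z axis; returns the
    set of hit pixels. (A real fluoroscope forms a cone-beam image; this is a
    deliberately crude stand-in.)"""
    ca, sa = math.cos(angle), math.sin(angle)
    # rotate about z, then drop the beam axis (y) to land on the plate
    return {(round((ca*x - sa*y) / grid), round(z / grid)) for x, y, z in points}

def best_pose(points, radiograph_pixels, candidate_angles):
    """Skeleton of the iterative search: keep the candidate pose whose
    simulated projection differs least from the actual radiograph."""
    return min(candidate_angles,
               key=lambda a: len(project_silhouette(points, a) ^ radiograph_pixels))
```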
As noted above, the procedural reference system or surgical space for the body can be established by attaching emitters to a device capable of detecting and tracking, i.e. identifying, one of the body elements, thereby forming a known relationship with the body element. For example, the emitters 130 on the ultrasound probe 128, together with or even without the three emitters on the patient's body, form a type of reference frame 116 as depicted in Figure 6 which can be virtually attached to body element 10 by continuously or periodically updating the ultrasound contour of body element 10 stored in intra-procedural geometry data memory 121. The processor 104 then matches this contour to the contour of body element 10 stored in pre-procedural memory 106, thereby continuously or periodically updating the displaced image data set in memory 122 so that registration with the procedural position of the body elements is maintained.
It is contemplated that a virtual reference frame can be accomplished using any number of devices that are capable of detecting and tracking a body element such as radiographic devices (fluoroscope), endoscopes, or contour scanners.
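The contour-matching update described above, repeatedly matching the intra-procedurally detected contour of a body element against its stored pre-procedural contour, resembles iterative closest-point alignment. The following 2-D sketch is an assumption about how such matching could be done, not a method the specification mandates; all names are invented.

```python
import numpy as np

def best_rigid_2d(src, dst):
    """Least-squares rigid transform (rotation r, translation t) mapping
    paired 2-D points src onto dst, via the SVD (Kabsch) method."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    h = (src - sc).T @ (dst - dc)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against reflections
    r = vt.T @ np.diag([1.0, d]) @ u.T
    return r, dc - r @ sc

def icp_contour(detected, stored, iters=20):
    """Match each detected contour point to its nearest stored contour
    point, solve for the rigid motion, and repeat."""
    r_total, t_total = np.eye(2), np.zeros(2)
    cur = detected.copy()
    for _ in range(iters):
        idx = np.argmin(((cur[:, None] - stored[None]) ** 2).sum(-1), axis=1)
        r, t = best_rigid_2d(cur, stored[idx])
        cur = cur @ r.T + t
        r_total, t_total = r @ r_total, r @ t_total + t
    return r_total, t_total   # maps detected points onto the stored contour
```

Re-running such an alignment every time a new contour arrives is one way to keep the displaced image data set registered to the moving anatomy.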
The above solutions achieve registration by the formation of a displaced image data set stored in memory 122 which matches the displacement of the skeletal elements at the time of the procedure. An alternative technique to achieve registration is to ensure that the positions of the skeletal elements during the procedure are identical to those found at the time of imaging. This can be achieved by using a frame that adjusts and immobilizes the patient's position. In this technique, at least three markers are placed on the skin prior to imaging. These markers have to be detectable by the imaging technique employed and are called fiducials. A multiplicity of fiducials is desirable for improving accuracy.
During the procedure, the patient's body is placed on a frame that allows precise positioning. Such frames are commonly used for spinal surgery and could be modified to allow their use during imaging and could be used for repositioning the patient during the procedure.
These frames could be equipped with drive mechanisms that allow the body to be moved slowly through a variety of positions. The fiducials placed at the time of imaging are replaced by emitters. By activating the drive mechanism on the frame, the exact position of the emitters can be determined during the procedure and compared to the position of the fiducials on the pre-procedural image data set stored in memory 106. Once the emitters assume a geometry identical to the geometry of the fiducials of the image data set, it is considered that the skeletal elements will have resumed a geometric relationship identical to the position during the pre-procedural scan, and the procedure can be performed using the unaltered image data set stored in memory 106.
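The comparison described above — deciding when the emitters have assumed a geometry identical to that of the fiducials in the stored image data set — can be illustrated by checking that all pairwise inter-marker distances agree. This sketch assumes known marker correspondence and a tolerance in millimetres; both are assumptions for illustration, not values from the specification.

```python
import numpy as np

def geometry_matches(fiducials, emitters, tol=0.5):
    """True when the tracked emitter constellation has the same internal
    geometry (all pairwise distances) as the fiducials recorded in the
    pre-procedural scan, to within `tol` (assumed millimetres)."""
    def pairwise(p):
        return np.linalg.norm(p[:, None] - p[None], axis=-1)
    return bool(np.max(np.abs(pairwise(fiducials) - pairwise(emitters))) <= tol)
```

A drive-mechanism control loop could poll such a predicate while moving the frame, stopping when the pre-procedural geometry is reproduced.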
In general, instrumentation employed during procedures on the skeleton is somewhat different from that used for cranial applications. Rather than being concerned only with current location, surgery on the skeleton usually consists of placing hardware through bones, taking a biopsy through the bone, or removing fragments. Therefore, the instrumentation has to be specialized for this application.
One instrument that is commonly used is a drill. By placing emitters on a surgical drill, and by having a fixed relationship between the drill body and its tip (usually a drill bit), the direction and position of the drill bit can be determined. At least three emitters would be needed on the drill, as most drills have a complex three-dimensional shape. Alternatively, emitters could be placed on a drill guide tube 800 having emitters 802, and the direction 804 of the screw being placed or hole being made could be determined by the digitizer and indicated on the image data set (see Fig. 8). The skeletal element 806 would also have emitters thereon to indicate its position.
Besides modification of existing instrumentation, new instrumentation is required to provide a reference system for surgery as discussed above. These reference frames, each equipped with at least 3 emitters, require fixation to the bone which prevents movement or rotation.
For open surgery, a clamp-like arrangement, as depicted in Fig. 9, can be used. A clamp 900 is equipped with at least two points 902, 904, 906, 908 which provide fixation to a projection 910 of a skeletal element. By using at least two-point fixation, the clamp 900, which functions as a reference frame, will not rotate with respect to the skeletal element. The clamp includes emitters 912, 914, 916 which communicate with the array to indicate the position of the skeletal element as it is moved during the procedure.
Many procedures deal with bone fragments 940 which are not exposed during surgery, but are simply fixated with either wires or screws 950, 952 introduced through the skin 954. Fig. 10 depicts a reference platform 956 attached to such wires or screws 950, 952 projecting through the skin 954. The platform 956 includes a plurality of emitters 958, 960, 962, 964 which communicate with the array to indicate the position of the bone fragment 940 as it is moved during the procedure.
The reference frame can be slipped over or attached to the projecting screws or wires to establish a reference system. Alternatively, the frame can be attached to only one wire, as long as the method of attachment of the frame to the screw or wire prevents rotation, and the wire or screw cannot rotate within the attached skeletal element.
REFERENCE AND LOCALIZATION FRAMES
Fig. 11 is a schematic diagram of one preferred embodiment of a cranial surgical navigation system according to the invention. Portable system cabinet 102 includes a surgical work station 104 which is supported for viewing by the surgeon or technician using the system. Work station 104 includes a screen 106 for illustrating the various scans and is connected to a personal computer 108 for controlling the monitor 106.
The system also includes an optical digitizer including a camera array 110, a camera mounting stand 112 for supporting the array remote from and in line of sight with the patient, a digitizer control unit 114 on the portable system cabinet 102 and connected to the computer 108, a foot switch 116 for controlling operation of the system and a breakout box 118 for interconnecting the foot switch 116 and the digitizer control unit 114.
Also connected via the breakout box 118 is a reference frame assembly 120 including a reference frame 122 with cable connected to the breakout box 118, a vertical support assembly 124, a head clamp attachment 126 and a horizontal support assembly 128. Optical probe 130 (which is a localization frame) is also connected via cable to the digitizer control unit 114 via the breakout box 118.
In operation, a patient's head (or other "rigid" body element) is affixed to the head clamp attachment 126. To determine the position of optical probe 130 with respect to the head within the head clamp attachment 126, a surgeon would step on pedal 116 to energize the emitters of reference frame 122. The emitters would generate a light signal which would be picked up by camera array 110 and triangulated to determine the position of the head. The emitters of the optical probe 130 would also be energized to emit light signals which are picked up by the camera array to determine the position of the optical probe. Based on the relative position of the head and the probe, control box 114 would illustrate a preoperative scan on the screen of monitor 106 which would indicate the position of the probe relative to and/or within the head.
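The triangulation step mentioned above — recovering an emitter's 3-D position from its observations by the camera array — can be sketched as a least-squares intersection of calibrated camera rays. The ray representation (a center and a direction per camera) is an assumption for illustration; the digitizer's actual computation is not detailed in this description.

```python
import numpy as np

def triangulate(centers, directions):
    """Least-squares 3-D point closest to a set of camera rays, where
    ray i starts at centers[i] and points along directions[i]."""
    a, b = np.zeros((3, 3)), np.zeros(3)
    for c, d in zip(centers, directions):
        d = d / np.linalg.norm(d)
        m = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        a += m
        b += m @ c
    return np.linalg.solve(a, b)         # needs at least 2 non-parallel rays
```

With two or more cameras seeing the same LED, the solve returns the point minimizing the summed squared distance to all rays.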
Fig. 11A is a top plan view of one preferred embodiment of a cranial reference arc frame 122 according to the invention. Reference frame 122 is for use with a surgical navigation system such as illustrated in Fig. 11 having a sensor array such as camera array 110 which is in communication with the reference frame 122 to identify its position. The reference frame 122 includes a base member 132 having an upper base 134 and a base plate 136 which each have a semi-circular configuration and are joined together by screws 138 to form a cavity 140 therebetween. The base and plate may be made of anodized aluminum or other autoclavable material. The top of the upper base may be provided with one or more spring clamps 142 for engaging a Leyla retractor arm. As shown in Fig. 11A, the upper base is provided with five spring clamps 142.
Either or both ends of the reference frame 122 may be provided with a bayonet fitting 144 for engaging a clamp which would also engage a Leyla retractor. One or both ends of the reference frame 122 is also formed into a radial projection 146 for supporting a screw 148 and crank handle 150 used to lock the reference frame to a head clamp such as head clamp 126 shown in Fig. 11 or a Mayfield Triad™ surgical clamp. This allows the reference frame 122 to be placed in a fixed position relative to the head so that any movement of the head would also include corresponding movement of the reference frame 122.
Radial projection 146, screw 148 and handle 150 constitute a coupling on the base member 132 for engaging a structure attached to a body part (the head), thereby providing a fixed reference relative to the head in order to maintain the base member 132 in fixed relation to the head.
Equally spaced about the reference frame 122 are a plurality of LEDs 152 for communicating with the camera array 110. The LEDs 152 are mounted in holes 154 in the upper base 134, which holes 154 are in communication with the cavity 140. Wires 156 connected to each of the terminals of the LEDs 152 are positioned within the cavity 140. The other ends of the wires are connected to a connector 158 for engaging a cable connected to the digitizer 114 of the surgical navigation system. The cable provides signals for activating the LEDs 152. Connector 158 is mounted on a support projection 160 which projects from the base plate 136. This support projection 160 has a channel therein for permitting the wires to be connected to the connector 158. Fig. 11C is a wiring diagram of one preferred embodiment of the reference frame 122 according to the invention. As is illustrated in Fig. 11C, each LED terminal is connected to a separate pin of the connector 158. Although the invention is illustrated as having a connector for engaging a cable, it is contemplated that the reference frame 122 may be battery operated so that no cable is necessary.
The reference frame 122 is essentially a semi-circular arc so that it fits around the head of the patient to allow communication of multiple LEDs 152 on the reference frame 122 with the camera array 110. The multiple LEDs 152 on the reference frame 122 are positioned in a precisely known geometric arrangement so that the calibration of the camera array 110 can be checked continuously by comparing the LEDs' geometric positions as calculated by the digitizer 114 with those precisely known geometric positions. Inconsistencies in this information indicate the need to recalibrate the system or to reposition the reference frame 122 so that it can more accurately communicate with the camera array 110. Frame 122 also includes a calibration divot 162.
In particular, divot 162 is an exactly located depression within the upper base 134 and is used to calibrate, or to check the calibration of, the position of the tip of a probe during the medical or surgical procedure. The precise location of each of the LEDs 152 relative to the calibration divot 162 is known. Therefore, locating a tip of a localization frame probe in the calibration divot 162 allows the calibration or the calibration check of the probe in the following manner. The tip of the probe is located within the calibration divot 162 and the LEDs on the probe are energized to provide light signals to the camera array 110. The LEDs on the reference frame 122 are also energized to communicate with the camera array 110. Using the known position of the divot 162 with respect to the position of each of the LEDs 152 as calculated by the digitizer 114, the location of the calibration divot 162 is compared to the location of the tip of the probe as calculated by the digitizer using the LEDs on the probe, in order to confirm that there is no distortion in the probe tip relative to the divot 162. Distortion in the probe tip indicates the need to recalibrate the probe so that it can more accurately communicate with the camera array 110, or to retire the probe.
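The divot check just described can be illustrated numerically: build a coordinate frame from the probe's LEDs, apply the factory-known tip offset, and compare the result with the divot position known from the reference-frame LEDs. The three-LED frame construction and the tolerance below are illustrative assumptions, not values from the specification.

```python
import numpy as np

def frame_from_leds(p0, p1, p2):
    """Orthonormal frame (origin, 3x3 basis) from three non-collinear LEDs."""
    x = (p1 - p0) / np.linalg.norm(p1 - p0)
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)
    return p0, np.column_stack([x, np.cross(z, x), z])

def tip_position(led_positions, tip_offset):
    """Tip location in camera space, from the probe's measured LED
    positions and the tip offset expressed in the probe's LED frame."""
    origin, basis = frame_from_leds(*led_positions)
    return origin + basis @ tip_offset

def tip_check(led_positions, tip_offset, divot, tol=1.0):
    """Calibration check: the computed tip must land in the divot."""
    return bool(np.linalg.norm(tip_position(led_positions, tip_offset) - divot) <= tol)
```

A failed check would flag the probe for recalibration or retirement, as the text describes.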
Figs. 12A, 12B, and 12C illustrate another preferred embodiment of the reference frame in the form of a spine reference arc frame 200. As with reference frame 122, spine reference arc frame 200 has an upper base 202 which engages a base plate 204 to form a cavity 206 therebetween. As shown in Fig. 12A, the spine reference arc frame 200 has a generally U-shaped configuration with LEDs 208 located at the ends of the legs 209 of the U-shaped member and at the intersection of the legs and base 211 of the U-shaped member.
Projecting laterally from the base 211 is a coupling 210 for engaging a thoraco-lumbar mount 212 as illustrated in Figs. 12D, 12E, and 12F. Also positioned on the base 211 is a calibration divot 214, which is a depression having the same purpose as the calibration divot 162 of the reference frame 122. Coupling 210 has twenty-four evenly spaced teeth 216 arranged in a circular pattern for engaging the twenty-four equally spaced teeth 218 of the thoraco-lumbar mount. This allows the spine reference arc frame 200 to be positioned to form various angles relative to the mount 212. It is contemplated that any other variable position connector may be used to join the spine reference arc frame 200 and the mount 212. Base plate 204 has an opening therein for engaging a connector 220 for receiving a cable to the digitizer control unit 114. The LEDs 208 are connected to the connector 220 by wires 222 as illustrated in wiring diagram Fig. 12G.
Referring to Figs. 12D, 12E, and 12F, thoraco-lumbar mount 212 comprises a clamp shaft 224 having an axial bore therein within which is positioned an actuating shaft 226 which is connected to an actuating knob 228 extending beyond the end of clamp shaft 224.
The end of the actuating shaft 226 opposite the actuating knob 228 has an internal threaded bore 230 which engages external threads of an actuation screw 232. A U-shaped head 234 of screw 232 supports a pivot pin 236 between its legs. The pivot pin passes through the jaws 238 so that the jaws 238 rotate about the pivot pin 236 and move relative to each other, defining a receiving area 240 within which a spinal bone or other body part may be clamped. The jaws 238 have teeth 239 for engaging a spinal bone or other body part and are spring loaded and held in their open position by spring plungers 242. As the actuating knob 228 is turned to engage the threads of actuation screw 232, the screw 232 is drawn into the bore 230, also drawing the jaws into a housing 246. This results in the camming surfaces 244 of housing 246 engaging the follower surfaces 248 of the jaws 238, closing the jaws and closing the receiving area 240 as the jaws are pulled into the housing.
The other end of clamp shaft 224 has a perpendicular projection 250 for supporting the teeth 218 which engage the teeth 216 of the coupling 210 of the spine reference arc frame 200. A spine reference arc clamp screw 252 passes through the array of teeth 218 and engages a threaded opening 254 in the coupling 210 of frame 200. Screw 252 engages opening 254 and locks teeth 216 and teeth 218 together to fix the angle between the spine reference arc frame 200 and the thoraco-lumbar mount 212. As a result, when the mount 212 is connected to a bone by placing the bone in the receiving area 240 and turning the actuating knob 228 to close the jaws 238 and the receiving area, the frame 200 is in a fixed position relative to the bone which is engaged by the jaws. Any movement of the bone results in movement of the frame 200, which can be detected by the camera array 110.
Referring to Figs. 13A, 13B and 13C, one preferred embodiment of a localization biopsy guide frame 300 is illustrated. In general, the frame 300 includes a localization frame 302 which supports a biopsy guide 304 and which also supports a support pin 306. The localization frame 302 is comprised of an upper base 308 and a base plate 310 which join to form a cavity 312 within which the wires 314 connecting to the LEDs 316 are located. As shown in Fig. 13A, the localization frame has an elongated portion 318 and a generally V-shaped portion 320 having legs 322 and 324. An LED 316 is located at the end of each of the legs, and an LED 316 is also located at the ends of the elongated portion 318. As a result the four LEDs 316 form a rectangular array. However, the underlying localization frame 302 does not have a rectangular configuration, which allows it to be adapted for other uses, such as a drill guide assembly as illustrated and described below with regard to Figs. 13D and 13E. In general, the V-shaped portion 320 extends laterally from the elongated portion 318 in order to accomplish the rectangular configuration of the LEDs 316. Note that a rectangular configuration for the LEDs 316 is not required and that, in fact, a trapezoidal configuration for the LEDs 316 may be preferred in order to uniquely distinguish the orientation of the localization frame 302. Support pin 306 passes through the upper base 308 and is essentially parallel to a linear axis defined by the elongated portion 318. The purpose of support pin 306 is to allow clamps to engage it so that the localization biopsy guide frame 300 can be placed in a particular position relative to a body part in order to guide a biopsy needle.
In order to guide a biopsy needle, the localization frame 302 is fitted with a biopsy guide 304 which is mounted to the top of the upper base 308 and held in place by a clamp 328 which engages the upper base 308 via four screws 330. The upper base 308 is also provided with a semicircular channel 332 which forms a seat for receiving the biopsy guide 304. The guide 304 comprises a hollow tube 334 having a collar 336 at one end thereof, which has a threaded radial opening for receiving set screw 338.
The base plate 310 is fitted with a connector 340 for engaging a cable which is connected to the digitizer 114 for providing signals for energizing the LEDs 316. Figure 12G illustrates one preferred embodiment of a wiring diagram which interconnects the connector 340 and four LEDs.
The localization frame 302 is made of the same material as the reference frame 122, i.e., ULTEM™ 1000 black, which is autoclavable. The biopsy guide 304 may be stainless steel or any other autoclavable metal or plastic material. As with the reference frame, the localization frame may be battery operated, thereby avoiding the need for a cable or a connector for engaging the cable.
Figs. 13D and 13E illustrate another localization device in the form of a localization drill guide assembly 350. The assembly 350 includes a localization frame 302 which is the same as the frame used for the localization biopsy guide frame 300, except that it does not have a support pin 306. It does have a semicircular channel 332 in the upper base 308 which receives a handle and drill guide assembly 354 instead of the biopsy guide tube assembly 304. Assembly 354 includes a handle 356 which is used by the surgeon, doctor, technician or nurse conducting the procedure. Handle 356 has a bore 358 therein for receiving a shaft 360 which is seated within the semicircular channel 332.
The shaft terminates in an integral collar 362 which supports a drill guide tube 364. The axis of the drill guide tube 364 is at an angle relative to the axis of the shaft 360 to assist in aligning the drill guide tube 364 relative to the point at which the drill bit will be entering the patient's body. In one preferred embodiment, handle and drill guide assembly 354 is a standard off-the-shelf instrument which is mounted to the channel 332 of the localization frame 302. The handle and drill guide assembly 354 may be a Sofamor Danek Part 870-705. Screws 366 (having heads insulated with high temperature RTV compound) attach the shaft 360 to the upper base 308 of the localization frame 302 and hold the shaft 360 in place within the channel 332. As noted above, the V-shaped portion 320 of the localization frame 302 forms an opening 368 between its legs 322 and 324 so that the drill guide tube 364 may be located therebetween and project downwardly from the plane generally defined by the localization frame 302. This allows the surgeon to sight in the position of the drill guide tube 364 by looking through the tube. Connector 370 is similar to connector 340, except that it provides an angular engagement with the cable which allows for more freedom of movement of the localization drill guide assembly 350.
As with the localization frame noted above, the frame itself is made of ULTEM™ 1000, which is autoclavable. The handle may be wood, plastic, or any other autoclavable material and the shaft, collar and drill guide may be metal, plastic or other autoclavable material, such as stainless steel. Figure 13J illustrates a preferred embodiment of the wiring diagram for the localization drill guide assembly 350.
Figs. 13F and 13G illustrate another localization device in the form of a drill yoke localization frame 400. This frame 400 includes a localization frame 302 of the same configuration as the localization frames for the localization biopsy guide frame 300 and the localization drill guide assembly 350.
Projecting from the underside of the base plate 310 is a support member 402 which also supports a drill yoke 404 in a plane which is essentially perpendicular to the plane defined by the localization frame 302. Yoke 404 is essentially a collar which fits over the housing of a Midas Rex™ drill and is fixedly attached thereto by a set screw 406.
The drill yoke localization frame 400 allows the drill housing to be precisely positioned for use during surgery.
Support member 402 also supports a connector 408 for receiving a cable which is connected to the digitizer control unit 114. Support member 402 has a hollow channel therein so that the connector 408 may be connected to the wires 410 which connect to the LEDs 316.
Figure 13J illustrates one preferred embodiment of a wiring connection between the LEDs 316 and the connector 408.
Figs. 13H and 13I illustrate another localization device in the form of a ventriculostomy probe 500. Probe 500 includes a handle 502 having a bore 504 therein for receiving a support shaft 506 which in turn supports a catheter guide tube 508 along an axis which is parallel to the axis of the handle 502. The handle includes three LEDs 510 mounted along its top surface for communication with the camera array 110. The handle 502 has a hollow channel terminating in a bore 512 for receiving a connector 514. The connector 514 is connected to wires 516 which are also connected to the terminals of the LEDs 510. Figure 13J illustrates one preferred embodiment of a wiring diagram for interconnecting the connector 514 and the LEDs 510. In operation, the tube 508 is positioned within the body, the brain for example, so that a catheter may be inserted within the body. Tube 508 includes a top slot 518 which allows a catheter to be inserted therein. Preferably, the tube tip at its center is collinear with the chip height of all three LEDs 510 so that a linear axis is defined therebetween. Based on this linear axis and the predetermined knowledge of the distance between the tip and the LEDs 510, the camera array 110 and digitizer 114 can determine the position of the tip at any instant during a surgical or medical procedure.
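The tip computation just described — extending the linear axis defined by the three collinear LEDs by the known tip distance — can be sketched as a line fit plus extrapolation. The array layout and the convention that the last LED is nearest the tip are assumptions for illustration.

```python
import numpy as np

def tip_from_collinear_leds(leds, tip_distance):
    """Tip of an instrument whose LEDs are collinear with it: fit the
    LED axis (principal direction of the positions) and step
    `tip_distance` past the LED nearest the tip (last row here)."""
    centroid = leds.mean(axis=0)
    _, _, vt = np.linalg.svd(leds - centroid)
    axis = vt[0]
    if axis @ (leds[-1] - leds[0]) < 0:   # orient rear -> front
        axis = -axis
    return leds[-1] + axis * tip_distance
```

Fitting the axis through all three LEDs, rather than using two of them, averages out small measurement noise in each LED position.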
The system of the invention may be used in the following manner. A reference frame is attached to a body part. For example, cranial reference arc frame 122 may be attached directly to a head via a head clamp such as a Mayfield clamp, or spine reference arc frame 200 may be attached directly to a spinous bone via thoraco-lumbar mount 212. Thereafter, movement of the body part will result in corresponding movement of the attached reference frame. The position of the body part may be tracked by energizing the LEDs of the reference frame to provide a signal to the camera array 110 so that the array can determine and track the position of the reference frame and, consequently, the position of the body part.
A localization frame is used to precisely position an instrument relative to the body part. For example, a localization biopsy guide frame 300 may be used to position a biopsy needle relative to the body part. Alternatively, a localization drill guide assembly 350 may be used to position a drill bit relative to the body part. Alternatively, a drill yoke localization frame 400 may be used to position a drill relative to the body part. Alternatively, a ventriculostomy probe 500 may be used to position a catheter relative to a body part. The position of the instrument may be tracked by energizing the LEDs of the localization frame to provide a signal to the camera array 110 so that the array can determine and track the position of the localization frame and, consequently, the position of the instrument.
During calibration of the system, the position of the reference frame relative to the body part is determined. Markers used during the preoperative scan are located and identified in coordinates of the surgical space as defined by the reference frame. Note that anatomic landmarks may be used as markers. This provides a relationship between the preoperative scan space and the surgical space. Once this relationship is established, the system knows the position of the preoperative scans relative to the reference frame and thus can generate scans which illustrate the position of the localization frame and the instrument relative to the body part. In other words, the system accomplishes image guided surgery. The system is ideally suited for locating small, deep-seated vascular lesions and tumors and for reducing the extent of the microsurgical dissection. It is also useful in identifying boundaries.
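The relationship between preoperative scan space and surgical space described above amounts to a rigid transform fitted to paired marker positions. A common way to compute such a transform — offered here as a sketch, not as the patented procedure — is the SVD (Kabsch) solution; the function names are invented.

```python
import numpy as np

def register_markers(scan_pts, surgical_pts):
    """Least-squares rigid transform (r, t) mapping scan-space marker
    positions onto their surgical-space counterparts (paired points)."""
    sc, dc = scan_pts.mean(axis=0), surgical_pts.mean(axis=0)
    h = (scan_pts - sc).T @ (surgical_pts - dc)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # keep a proper rotation
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, dc - r @ sc

def to_surgical(p, r, t):
    """Map any scan-space point (e.g. a lesion boundary) into surgical space."""
    return r @ p + t
```

Once (r, t) is known, any voxel or contour of the preoperative scan can be displayed in surgical coordinates relative to the reference frame.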
For example, suppose a surgeon is trying to identify a boundary between normal brain and a large supratentorial glioma, which may be clearly shown on the preoperative scans but which may be difficult to visually locate in the operating room during a procedure. The surgeon would take a localization probe and position it at a point near the boundary. The LEDs of the reference frame and localization probe are fired by use of the foot switch 116. As a result, the monitor 106 would provide a screen showing the position of the probe relative to a preoperative scan. By referring to the monitor, the surgeon can now determine the direction in which the probe should be moved to more precisely locate the boundary. Once the boundary is located, microcottonoid markers can be placed at the boundary of the tumor as displayed on the monitor before resection is started.
The placement of ventricular catheters for shunts, ventriculostomy, or reservoirs is also facilitated by the use of the system, especially in patients who have small ventricles or who have an underlying coagulopathy (e.g., liver failure, acquired immunodeficiency syndrome) that makes a single pass desirable. The system can also be useful for performing stereotactic biopsies. For further information regarding the system, see the following articles:
Germano, Isabelle M., The NeuroStation System for Image Guided, Frameless Stereotaxy, Neurosurgery, Vol. 37, No. 2, August 1995.
Smith et al., The NeuroStation™ - A Highly Accurate, Minimally Invasive Solution to Frameless Stereotactic Neurosurgery, Computerized Medical Imaging and Graphics, Vol. 18, No. 4, pp. 247-256, 1994.
In view of the above, it will be seen that the several objects of the invention are achieved and other advantageous results attained.
As various changes could be made in the above constructions, products, and methods without departing from the scope of the invention, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
Image guided medical and surgical procedures comprise a technology by which scans, obtained either pre-procedurally or intra-procedurally (i.e., prior to or during a medical or surgical procedure), are used to generate images to guide a doctor during the procedure.
The recent increase in interest in this field is a direct result of the recent advances in scanning technology, especially in devices using computers to generate three-dimensional images of parts of the body, such as computed tomography (CT) or magnetic resonance imaging (MRI).
The majority of the advances in diagnostic imaging involve devices which tend to be large, encircle the body part being imaged, and are expensive. Although the scans produced by these devices depict the body part under investigation with high resolution and good spatial fidelity, their cost usually precludes the dedication of a unit to be used during the performance of procedures.
Therefore, image guided surgery is usually performed using images taken preoperatively.
The reliance upon preoperative images has focused image guidance largely on the cranium. The skull, by encasing the brain, serves as a rigid body which largely inhibits changes in anatomy between imaging and surgery. The skull also provides a relatively easy point of reference to which fiducials or a reference system may be attached so that registration of pre-procedural images to the procedural work space can be done simply at the beginning, during, or throughout the procedure. Registration is defined as the process of relating pre-procedural or intra-procedural scans of the anatomy undergoing surgery to the surgical or medical position of the corresponding anatomy. For example, see U.S. Patent No. 5,383,454.
This situation of rigid fixation and absence of anatomical movement between imaging and surgery is unique to the skull and intracranial contents and permits a simple one-to-one registration process as shown in Figure 1. The position during a medical procedure or surgery is in registration with the pre-procedural image data set because of the absence of anatomical movement from the time of the scan until the time of the procedure; in effect, the skull and its intracranial contents comprise a "rigid body," that is, an object which does not deform internally. In almost every other part of the body there is ample opportunity for movement within the anatomy which degrades the fidelity with which the pre-procedural scans depict the intra-procedural anatomy. Therefore, additional innovations are needed to bring image guidance to the rest of the body beyond the cranium.
The accuracy of image guided surgery relies upon the ability to generate images during medical and surgical procedures based on scans taken prior to or during the procedure and based on the present position and shape of the body parts during the procedure. Two types of body parts are addressed herein: 1) structures within the body that do not change shape, do not compress, nor deform between the process of imaging and the medical procedure, which are termed "rigid bodies,"
and are exemplified by the bones of the skeleton; and 2) structures within the body that can change shape and deform between the process of imaging and the medical procedure structures are termed "semi-rigid bodies," and are exemplified by the liver or prostate. Both types of body parts are likely targets for medical or surgical procedures either for repair, fusion, resection, biopsy, or radiation treatment. Therefore,. a technique is needed whereby registration can be performed between the body parts as depicted pre-procedurally on scans and the position and shape of these same body parts as detected intro-procedurally. This technique must take into account that movement can occur between portions of the . body which are not rigidly joined, such as bones connected by a joint, or fragments of a broken bone, and that shape deformation can occur for semi-rigid bodies, such as the liver or prostate. In particular, the technique must be able to modify the scanned image dataset such that the modified image dataset which is 4 ~. a 2 2 01 ~ 7 7 pCT~S95112894 used for localization and display, corresponds to position and/or shape of the body parts) of interest .
during a medical or surgical procedure. A key to achieving this correspondence is the ability to precisely detect and track the position and/or shape of the body part(s) of interest during the medical or surgical procedure, as well as to track instruments or radiation used during said procedure.
Summary of the Invention

It is an object of this invention to provide a system which allows registration between a body part depicted in pre-procedural images and tracked during surgery.
It is a further object of this invention to provide a system which allows registration between a semi-rigid body such as the liver depicted in pre-procedural images and detected during surgery.
It is a further object of this invention to provide a system which allows registration between multiple body parts such as skeletal elements depicted in pre-procedural images and detected during surgery.
It is a further object of this invention to provide a system which can localize a semi-rigid body that may deform between imaging and a procedure and provide a display during the procedure of the body in its deformed shape.
It is a further object of this invention to provide a system which can localize multiple rigid bodies that move with respect to each other between imaging and a procedure and provide a display during the procedure of the bodies in their displaced positions.
It is another object of this invention to provide a system for use during a medical or surgical procedure on the body, the system generating a display representing the position of one or more body elements during the procedure based on a scan generated by a scanner either prior to or during the procedure.
It is another object of this invention to provide a system for use during a medical or surgical procedure on a body which modifies the scan taken prior to or during a procedure according to the identified relative position of each of the elements during the procedure.
It is another object of this invention to provide a system for use during a medical or surgical procedure on a body which modifies the image data set according to the identified shape of each of the elements during the procedure.
It is another object of this invention to provide a system which generates a display representative of the position of a medical or surgical instrument in relation to the body element(s) during a procedure.
It is a further object of this invention to provide a system for use during image guided medical and surgical procedures which is easily employed by the doctor or surgeon conducting the procedure.
It is another object of this invention to provide a system which determines the relative position and/or shape of body elements during a medical or surgical procedure based on the contour of the body elements, which can avoid the need for exposing the body elements.
It is still another object of this invention to provide a system which employs one or more two dimensional fluoroscopic or x-ray images of body elements to determine their relative position and/or shape in three dimensions.
It is yet a further object of this invention to describe a surgical or medical procedure which employs a display representing the position of the body element(s) during the procedure based on an image data set of the body element(s) generated prior to the procedure.
It is a further object of this invention to provide a system and method for medical or surgical procedures which allows repositioning of body elements during the procedure and still permits the generation of an image showing the relative position of the body elements.
It is a further object of this invention to provide a system and method for medical or surgical procedures which allows reshaping of the body element(s) during the procedure and still permits the generation of an image showing the position and current shape of the body elements.
It is a further object of this invention to provide a system which can localize a body element and provide a display during the procedure of the position of the body element relative to an instrument, such as a forceps, microscope, or laser, so that the instrument can be precisely located relative to the body element.
Other objects and features will be in part apparent and in part pointed out hereinafter.
The invention comprises a system for use during a procedure on a body, said system generating a display representing a position of two or more body elements during the procedure, said system comprising: a memory storing an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a plurality of data points and corresponding to a plurality of reference points for each of the body elements, the reference points of a particular body element having a known spatial relation to the data points of the particular body element; means for identifying, during the procedure, the position of the reference points of each of the body elements relative to the reference points of the other body elements; a processor modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as identified by the identifying means, said processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor and providing a display illustrating the relative position of the body elements during the procedure.
The invention also comprises a method for use during a procedure, said method generating a display representing a position of two or more body elements during the procedure, said method comprising the steps of: storing an image data set in a memory, the image data set representing the position of the two or more body elements based on scans taken of the body; reading the image data set stored in the memory, said image data set having a plurality of data points and a plurality of reference points for each of the body elements, the data points of each body element having a spatial relation to the data points of another body element, the reference points of a particular body element having a known spatial relation to the data points of the particular body element; identifying, during the procedure, the position of each of the reference points of each of the body elements relative to the reference points of the other body elements; modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure in order to generate a displaced image data set representing the relative position of the body elements during the procedure; and generating a display based on the displaced image data set illustrating the relative position of the body elements during the procedure.
The invention also comprises a system for use during a procedure on a body, said system generating a display representing a geometrical relationship of two or more body elements during the procedure, said system comprising: a memory for storing an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a geometrical spatial relationship of the body elements and having a plurality of reference points on each of the body elements, the reference points of a particular body element having a known spatial relation on the particular body element; means for identifying, during the procedure, a position of each of the reference points of the body elements to be displayed; a processor modifying the geometrical relationship of the body elements as necessary according to the identified position of each of the reference points as identified by the identifying means, said processor generating a displaced image data set from the modified geometrical relationship of the body elements, said displaced image data set representing the geometrical relationship of the body elements during the procedure; and a display system utilizing the displaced image data set to generate the display.
The invention also comprises a system for use during a procedure on a body, said system generating a display representing a position of two or more body elements during the procedure, said system comprising: a memory configured to store an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a plurality of data points and corresponding to a plurality of reference points for each of the body elements, the reference points of a particular body element having a known spatial relation to the data points of the particular body element; means for identifying the position of the reference points; a digitizer positioned to receive a signal output by the means for identifying, said digitizer being configured to determine a three dimensional position of the reference points for the particular body element; a processor modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as determined by the digitizer, said processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor, illustrating the relative position of the body elements during the procedure.
The invention also comprises a system for displaying relative positions of two or more body elements during a procedure on a body, the system comprising: a memory storing an image data set, the image data set representing the position of the body elements based on scans of the body, and having a plurality of data points correlatable to a plurality of reference points for each of the body elements, the position of reference points of a particular body element relative to the data points for that particular body element being known; a reference system for identifying, during the procedure, the position of the reference points of each of the body elements relative to the reference points of the other body elements; a processor modifying the relative position of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as identified by the reference system, the processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor to display the relative position of the body elements during the procedure.
The invention also comprises a system for use during a medical or surgical procedure on a body, said system being adapted to generate a display from an image data set representing the position of one or more body elements during the procedure based on scans taken of the body, wherein at least some of the body elements are movable with respect to each other, each scan having a reference for each of the one or more body elements, the reference of a particular body element having a known spatial relation to the particular body element, said system comprising: a plurality of reference frames, at least one attached to each of the movable body elements; a localizer for identifying, during the procedure, the position of the reference for each of the body elements to be displayed; a processor for modifying the image data set according to the identified position of the reference during the procedure, as identified by the localizer, said processor being adapted to generate a displaced image data set representing the position and geometry of the body elements according to the identified position of the reference frames during the procedure, as identified by the localizer; and a display utilizing the displaced image data set generated by the processor, illustrating the position and geometry of the body elements during the procedure.
The invention also comprises a method for use during a medical or surgical procedure on a body, said method being adapted to generate a display from an image data set representing the position of one or more body elements during the procedure based on scans taken of the body, wherein at least some of the body elements are movable with respect to each other, each scan having a reference for each of the one or more body elements, the reference of a particular body element having a known spatial relation to the particular body element, wherein there is a plurality of reference frames, at least one attached to each of the movable body elements; said method comprising the steps of:
identifying, during the procedure, the position of the reference for each of the body elements to be displayed;
modifying the image data set according to the identified position of the reference during the procedure, as identified by the localizer; generating a displaced image data set representing the position and geometry of the body elements according to the identified position of the reference frames during the procedure; and illustrating the position and geometry of the body elements during the procedure utilizing the displaced image data set.
In addition, the invention comprises a method for displaying the relative positions of a plurality of body elements during a procedure on a body, the method comprising: storing scan images of body elements, the scan images having a plurality of data points corresponding to a plurality of reference points for each of the body elements, the position of the reference points of a particular body element relative to the data points for that particular body element being known; identifying the position of each of the reference points of each of the body elements relative to the reference points of the other body elements during a procedure; modifying the relative position of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure;
generating a displaced image data set representing the position of the body elements during the procedure; and displaying the relative position of the body elements during the procedure.
Brief Description of the Drawings

Figure 1 is an illustration of the prior art system in which rigid fixation and absence of movement between imaging and surgery permits a one-to-one registration process between the pre-surgical scan and the position in surgery.
Figure 2A is an illustration of operation of the invention in which the pre-procedural image data set is modified in accordance with the intra-procedural position in order to generate a displaced and/or deformed data set representative of the intra-procedural position.
Figure 2B is a block diagram of one preferred embodiment of a system according to the invention.
Figure 3 is an illustration of the pre-procedural alignment of three body elements during scanning.
Figure 4 is an illustration of the intra-procedural alignment of the three body elements of Figure 3 during surgery.
Figure 5 is an illustration of three body elements, one of which has a reference frame attached thereto, in combination with a registration probe.
Figure 6 is an illustration showing ultrasound registration according to the invention in which emitters are attached to the ultrasound for a virtual reference and, optionally, the patient's body for an actual reference.
Figure 7 is an illustration of a fluoroscopic localizer according to the invention for providing projections of an image of the body elements.
Figure 8 is an illustration of a drill guide instrument of the invention wherein the position of a drill guide relative to the body elements may be displayed.
Figures 9 and 10 illustrate a clamped reference frame and a wired reference frame, respectively.
Figure 11 is a schematic diagram of one preferred embodiment of a cranial surgical navigation system according to the invention.
Figure 11A is a top plan view of one preferred embodiment of a cranial reference arc frame according to the invention.
Figure 11B is a side plan view, partially in cross section, of one preferred embodiment of a cranial reference arc frame according to the invention.
Figure 11C is a wiring diagram of one preferred embodiment of a cranial reference arc frame according to the invention.
Figure 12A is a top plan view of one preferred embodiment of a spinal reference arc frame according to the invention.
Figure 12B is a front plan view, partially in cross section, of one preferred embodiment of a spinal reference arc frame according to the invention.
Figure 12C is a side plan view of one preferred embodiment of a spinal reference arc frame according to the invention.
Figure 12D is a top plan view of one preferred embodiment of a thoraco-lumbar mount according to the invention.
Figure 12E is a front plan view, partially in cross section, of one preferred embodiment of a thoraco-lumbar mount according to the invention.
Figure 12F is a side plan view of one preferred embodiment of a thoraco-lumbar mount according to the invention.
Figure 12G is a wiring diagram of one preferred embodiment of a spinal reference arc frame according to the invention.
Figure 13A is a top plan view of one preferred embodiment of a biopsy guide localization frame according to the invention.
Figure 13B is a side plan view, partially in cross section, of one preferred embodiment of a biopsy guide localization frame according to the invention.
Figure 13C is a front plan view of one preferred embodiment of a biopsy guide localization frame according to the invention.
Figure 13D is a top plan view of one preferred embodiment of a drill guide localization frame according to the invention.
Figure 13E is a side plan view, partially in cross section, of one preferred embodiment of a drill guide localization frame according to the invention.
Figure 13F is a top plan view of one preferred embodiment of a drill yoke localization frame according to the invention.
Figure 13G is a side plan view, partially in cross section, of one preferred embodiment of a drill yoke localization frame according to the invention.
Figure 13H is a top plan view of one preferred embodiment of a ventriculostomy probe including an integrated localization frame according to the invention.
Figure 13I is a side plan view, partially in cross section, of one preferred embodiment of a ventriculostomy probe including an integral localization frame according to the invention.
Figure 13J is a wiring diagram of one preferred embodiment of a localization frame according to the invention.
Corresponding reference characters indicate corresponding parts throughout the drawings.
Detailed Description of the Preferred Embodiments

Referring to Fig. 2, an overview of operation of one preferred embodiment of the system according to the invention is illustrated. Prior to a particular procedure, the body elements which will be part of the procedure are scanned to determine their alignment, i.e., their pre-operative position. For example, the alignment may be such as illustrated in Fig. 3 wherein body elements 10, 20, and 30 are more or less aligned in parallel. These body elements may be bones or other rigid bodies. In Fig. 3, three-dimensional skeletal elements 10, 20, 30 are depicted in two dimensions as highly stylized vertebral bodies, with square vertebrae 11, 21, 31, small rectangular pedicles 12, 22, 32, and triangular spinous processes 13, 23, 33. During imaging, scans are taken at intervals through the body parts 10, 20, 30 as represented in Fig. 3 by nine straight lines generally referred to by reference character 40. At least one scan must be obtained through each of the body elements, and the scans taken together constitute a three-dimensional pre-procedural image data set.
Fig. 2B is a block diagram of the system according to the invention. A scanner interface 102 allows a processor 104 to obtain the pre-procedural image data set generated by the scanner and store the data set in pre-procedural image data set memory 106. Preferably, after imaging, processor 104 applies a discrimination process to the pre-procedural image data set so that only the body elements 10, 20, 30 remain in memory 106. If a discrimination process is employed, processor 104 may execute the discrimination process while data is being transferred from the scanner through the scanner interface 102 for storage in memory 106. Alternatively, memory 106 may be used for storing undiscriminated data and a separate memory (not shown) may be provided for storing the discriminated data. In this alternative,
processor 104 would transfer the data set from the scanner through scanner interface 102 into memory 106 and then would discriminate the data stored in memory 106 to generate a discriminated image data set which would be stored in the separate memory.
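The discrimination process described above can be sketched as a simple intensity threshold. This is an illustrative example only, not part of the original disclosure: the function name and the threshold value (roughly a CT Hounsfield number typical of bone) are hypothetical, and a clinical system would likely use more sophisticated segmentation.

```python
import numpy as np

def discriminate(volume, threshold=300):
    """Zero out every voxel below an intensity threshold, leaving only
    high-intensity structures such as bone in the image data set.
    The threshold of 300 is illustrative, not taken from the patent."""
    volume = np.asarray(volume, dtype=float)
    return np.where(volume >= threshold, volume, 0.0)

# A tiny 2 x 2 "scan": only the bone-like voxels survive
scan = np.array([[100.0, 400.0],
                 [350.0, 50.0]])
print(discriminate(scan))  # keeps 400 and 350, zeroes the rest
```

Run on whole slices as data arrives from the scanner interface, this kind of per-voxel filter is cheap enough to execute during transfer into memory 106, as the text suggests.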
Once the body elements 10, 20, 30 are discriminated and each defined as a single rigid body, they can be repositioned by established software algorithms to form the displaced image data set. Each rigid body element 10, 20, 30 must have at least three recognizable reference points which are visible on the pre-procedural images. These reference points must be accurately detected during the procedure. For body part 10, reference points 10A, 10B, and 10C are located on the spinous process 13; for body part 20, reference points 20A and 20C are located on the vertebra 21 and reference point 20B is located on spinous process 23; and for body part 30, reference points 30A and 30B are located on the spinous process 33 and reference point 30C is located on the vertebra 31. More than one reference point can be selected on each scan through the bone, although the maximal accuracy of registration is achieved by separating the reference points as far as possible. For example, in the case of posterior spinal surgery, it may be preferable to select reference points 10A, 10B, and 10C on the spinous process, which is routinely exposed during such surgery. It is contemplated that system software may allow the manual or automated identification of these same points on the images of the body elements 10, 20, 30. As Fig. 3 is a two-dimensional projection of a three-dimensional process, the reference points will not be limited to a perfect sagittal plane, as depicted.
After imaging, the skeletal body elements 10, 20, 30 may move with respect to each other at the joints or fracture lines. In the procedure room, such as an operating room or a room where a medical procedure will be performed, after positioning the patient for surgery, the body elements will assume a different geometry, such as the geometry depicted in Fig. 4.
As a result of this movement, the pre-procedural image data set stored in memory 106, consisting of the scans through the skeletal elements, does not depict the operative position of the skeletal elements, as shown in Fig. 4. However, the shape of the skeletal elements, as depicted by the scans through the element, is consistent between imaging and procedure since they are rigid bodies, as indicated by the lines 40 through each element in Fig. 4. Therefore, the image data set must be modified to depict the intra-procedural geometry of the skeletal elements. This modification is performed by identifying the location of each reference point of each skeletal element in procedure space. As diagrammatically illustrated in Fig. 2, a localizer 108 (see Figure 13, below, for more details) identifies the location and provides this information so that the pre-procedural data set may be deformed or re-positioned into the displaced data set. As a result, the displaced data set is in registration with the intra-procedural position of the elements 10, 20, 30. Once the locations of the reference points are determined by the localizer 108, processor 104, which is a part of the work station, can execute software which re-positions the images of the skeletal elements to reflect the position of the actual elements in the procedure room, thus forming the displaced set and the registration between the displaced set and the intra-procedural position.

Preferably, a three-dimensional digitizer may be used as the localizer 108 to determine the position in space of the elements 10, 20, 30 during the procedure. In general, the digitizer would include a reference array 110 which receives emissions from a series of emitters. Usually, the emissions consist of
Preferably, a three-dimensional digitizer may be used as the localizer 108 to determine the position and space of the elements 10, 20, 30 during the ' procedure. In general, the digitizer would include a reference array 110 which receives emissions from a series of emitters. Usually, the emissions consist of l~ . ~ , WO 96!11624 ~ ~ ~ ~ ~ ~ ~ PCT/US95/12894 some sort of energy, such as light, sound or electromagnetic radiation. The reference array 110 is distant from the emitters which are applied to and positioned in coordination with the elements being localized, determining the position of the emitters. As is apparent, the emitters may be placed distant to the elements and the reference array 110 may be attached to~
the elements being localized.
Referring to Fig. 2, an alternate preferred embodiment of the system according to the invention, in the case where the body elements are not rigid but rather semi-rigid such that shape deformations may occur to the body elements, is described as follows. Prior to a particular procedure, the body elements which will be part of the procedure are scanned to determine their pre-operative position and shape. For example, the alignment may be such as illustrated in Fig. 3 wherein body elements 10, 20, and 30 are more or less aligned in parallel and have a defined shape. These body elements may be soft tissue such as the prostate or other semi-rigid bodies.
After imaging, the elements 10, 20, 30 may move with respect to each other and also their shape may become deformed. In the procedure room, such as an operating room or a room where a medical procedure will be performed, after positioning the patient for surgery, the body elements may assume a different geometry, such as the geometry depicted in Fig. 4, where geometry depicts both element alignment (position) and shape.
As a result of this changed geometry, the pre-procedural image data set stored in memory 106 does not depict the operative geometry of the body elements, as shown in Fig. 4. Indeed, the shape of the body elements, as depicted by the scans through the element, may have changed between imaging and procedure since they are semi-rigid bodies. Therefore, the image data set must be
modified to depict the current geometry of the body elements. This modification is performed by identifying the location of the reference points of each body element in procedure space. As diagrammatically illustrated in Fig. 2, a localizer 108, possibly in communication with a processor 104, identifies the location of the reference points and provides this information so that the pre-procedural data set may be deformed into the displaced data set. Once the locations of the reference points are determined, processor 104, which is a part of the work station, can execute software which modifies the images of the body elements to reflect the geometry of the actual elements in the procedure room, thus forming the displaced set and the registration between the displaced set and the intra-procedural position. As a result, the displaced data set is in registration with the intra-procedural geometry of the elements 10, 20, 30.
According to one preferred embodiment of the invention, a reference frame 116 is attached to one of the body elements 10 at the beginning of the procedure.
Various reference frame embodiments are illustrated in more detail in Figures 11 and 12, below. Reference frame 116 is equipped with a plurality of emitters 114 which together define a three-dimensional intraprocedural coordinate system with respect to the body element 10.
In conventional terms, the reference frame 116 defines the stereotactic space with respect to the body element 10. Emitters 114 communicate with sensors 112 on a reference array 110 located in the procedure room and remote from the reference frame 116 and patient. If the body of the patient is not immobilized during surgery, then multiple reference frames may be required for each body element to define a surgical space with respect to each element. The surgical space may alternatively be defined by rigid fixation of the frame emitters 114 directly (or indirectly, for example, to the skin) to the skeletal elements 10, 20, or 30. In either case, the emitters 114 emit a signal which is received by the sensors 112. The received signal is digitized to compute position, for example, by triangulation. Through such information, the localizer 108 or a digitizer which is part of the localizer 108 can determine the exact three-dimensional position of the frame emitters 114 relative to the sensors 112. Thereby, localizer 108 or the processor 104 can exactly determine the position of the reference frame 116 relative to the array, which is free to move except during localization, e.g., activation of the emitters 114 on the reference frame 116 and activation of the probe emitters 120. Emitters 114 of the reference frame 116 are energized to provide radiation to the sensors 112, which radiation is received and generates signals provided to the localizer 108 for determining the position of the frame 116 relative to the array 110.
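The triangulation computation mentioned above can be illustrated as follows. This sketch is not part of the original disclosure; the function name and geometry are hypothetical. It estimates an emitter's three-dimensional position from its measured distances to four or more sensors at known positions by subtracting the first sphere equation from the others, which yields a linear system:

```python
import numpy as np

def trilaterate(sensors, distances):
    """Estimate a 3-D emitter position from its distances to four or
    more sensors at known positions. Subtracting the first sphere
    equation |p - s_i|^2 = d_i^2 from the rest linearizes the problem:
    2*(s_i - s_0) . p = (d_0^2 - d_i^2) + (|s_i|^2 - |s_0|^2)."""
    s = np.asarray(sensors, float)
    d = np.asarray(distances, float)
    A = 2.0 * (s[1:] - s[0])
    b = (d[0] ** 2 - d[1:] ** 2) + (np.sum(s[1:] ** 2, axis=1) - np.sum(s[0] ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Four non-coplanar sensors of a hypothetical reference array 110
sensors = [(0.0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
emitter = np.array([0.3, 0.4, 0.2])
dists = [np.linalg.norm(emitter - np.array(s, float)) for s in sensors]
print(trilaterate(sensors, dists))  # recovers the emitter position
```

With more than four sensors the same least-squares solve averages out measurement noise, which is one reason reference arrays carry several sensors.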
Next, it is necessary to determine the position of the body element 10, which may be a skeletal element, to which the reference frame 116 is affixed or with respect to which it is positioned. In particular, the position of the body element 10 relative to the reference frame 116 must be determined, thereby determining the position of the body element 10 in the surgical space defined by the reference frame 116. After exposure of the reference points 10A, 10B, 10C by surgical dissection, the reference points are touched by the tip of a registration probe 118 equipped with emitters 120. As each of the reference points 10A, 10B, 10C is touched by the tip of the probe 118, the emitters are energized to communicate with the sensors 112 of reference array 110. This communication permits the localizer 108 to determine the position of the registration probe 118, thereby determining the position of the tip of the probe 118, thereby determining the position of the reference point 10A on which the tip is positioned. By touching each of the reference points 10A, 10B, 10C on each body element 10, 20, 30 involved in the procedure, intra-procedural geometry data is generated and stored in memory 121. This data is related to the corresponding reference points on the pre-procedural images of the same elements by processor 104, which employs software to derive a transformation which allows the determination of the exact procedural position, orientation, and shape in surgical space of each body element, and thereby modifies the pre-procedural image data set stored in memory 106 to produce a displaced image data set which is stored in memory 122.
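A transformation of the kind derived by processor 104 can be illustrated with a least-squares rigid registration of the paired reference points. The sketch below is illustrative only, not the patent's actual software; the point coordinates and names are hypothetical. It recovers a rotation and translation from point pairs via the Kabsch/SVD method and applies them to the image data points:

```python
import numpy as np

def derive_transform(pre_pts, proc_pts):
    """Least-squares rotation R and translation t mapping pre-procedural
    reference points onto their intra-procedural counterparts
    (the Kabsch/SVD method for paired point sets)."""
    P, Q = np.asarray(pre_pts, float), np.asarray(proc_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cp).T @ (Q - cq))
    # Guard against a reflection in the least-squares solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

# Hypothetical reference points in image (pre-procedural) space ...
pre = np.array([[0.0, 0, 0], [1, 0, 0], [0, 2, 0], [0, 0, 1.5]])
# ... and the same points as digitized during the procedure
# (here: rotated 30 degrees about z, then translated)
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
Rz = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
proc = pre @ Rz.T + np.array([5.0, -2.0, 1.0])

R, t = derive_transform(pre, proc)
# Applying R, t to every data point of the body element yields its
# contribution to the displaced image data set:
displaced = pre @ R.T + t
```

Because each skeletal element is treated as rigid, one such transform per element suffices; a semi-rigid element would additionally require a deformation model.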
The displaced image data set in memory 122 reflects the geometry of the actual elements 10, 20, 30 during the procedure. Processor 104 displays the displaced image data set on display 124 to provide a visual depiction of the geometry of the body elements 10, 20, 30 during the procedure. This image is used during the procedure to assist in the procedure. In addition, it is contemplated that an instrument, such as a forceps, a laser, a microscope, an endoscope, or a radiation delivery system, which would be used during the procedure may be modified by the addition of emitters. This modified device, when moved into the area of the body elements 10, 20, 30, would be activated so that its emitters would communicate with the reference array 110, thereby permitting localizer 108 to determine the instrument's position. As a result, processor 104 would modify display 124 to indicate the position of the instrument or the instrument's focal point, such as by positioning a cursor, with respect to the body elements 10, 20, 30.

Further, it is contemplated that the addition of emitters on an instrument (effector) may be used with the system in order to create a closed-loop feedback for actively (in the case of robotics) or passively controlling or monitoring the instrument and its position. Such a control loop allows the monitoring of certain procedures such as the delivery of radiation to the body or the use of a drill where the object of the procedure is to keep the focal point of the instrument in a safe zone, i.e., a predetermined procedural plan. Such a control loop could also control the operation of a robotically controlled instrument where the robotics could be driven (directly or indirectly) by processor 104 to control the position of the instrument.
For example, the processor could instruct a robotic arm to control the position of a laser. The laser position could be monitored, such as by emitters on the laser.
The processor would be programmed with the control parameters for the laser so that it would precisely follow a predetermined path.
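The closed-loop monitoring described above can be sketched in outline. This is an illustrative sketch only, not part of the patent disclosure; the spherical safe-zone model, the tolerance values, and the `enable_output` interface are all assumptions.

```python
import math

# Illustrative sketch only (not from the patent): gate an instrument's
# output based on whether its tracked focal point stays inside a
# predetermined "safe zone", here modeled as a sphere in surgical space.
SAFE_CENTER = (0.0, 0.0, 0.0)   # assumed safe-zone center, mm
SAFE_RADIUS = 25.0              # assumed safe-zone radius, mm

def in_safe_zone(focal_point):
    """True if the tracked focal point lies inside the spherical safe zone."""
    dx, dy, dz = (f - c for f, c in zip(focal_point, SAFE_CENTER))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= SAFE_RADIUS

def control_step(focal_point, enable_output):
    """One pass of the closed loop.

    enable_output is a callable standing in for the instrument/robotic
    interface, which the patent leaves unspecified; it receives True to
    permit operation and False to inhibit it.
    """
    ok = in_safe_zone(focal_point)
    enable_output(ok)
    return ok
```

In a passive configuration `enable_output` would merely raise an alert; in the robotic case it would drive the effector directly.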
Reference frame 116 allows the patient to be moved during the procedure without the need for re-registering the position of each of the body elements 10, 20, 30. It is assumed that during the procedure, the body elements are fixed relative to each other. Since the reference frame 116 is fixed (directly or indirectly) to body element 10, movement of the patient results in corresponding movement of the reference frame 116.
Periodically, or after each movement of the patient, array emitters 114 may be energized to communicate with the sensors 112 of reference array 110 in order to permit localizer 108 to determine the position of the reference frame 116. Since the reference frame 116 is in a known relative position to element 10 and since we have assumed that elements 20 and 30 are in fixed relation to element 10, localizer 108 and/or processor 104 can determine the position of the elements and thereby maintain registration.
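The re-registration step can be illustrated with a short sketch: recover the rigid motion of the reference-frame emitters between two localizer readings and apply the same motion to the registered image coordinates. The use of a least-squares (Kabsch) fit here is an assumption; the patent does not specify the transformation algorithm.

```python
import numpy as np

# Illustrative sketch (algorithm assumed, not from the patent): recover the
# rigid motion of the reference frame between two localizer readings and
# apply it to the registered image coordinates, so the patient can move
# without re-touching reference points A, B, C.
def rigid_transform(p_old, p_new):
    """Least-squares rotation R and translation t with R @ p + t mapping
    p_old onto p_new (Kabsch algorithm). Inputs are (N, 3) emitter arrays."""
    c_old, c_new = p_old.mean(axis=0), p_new.mean(axis=0)
    H = (p_old - c_old).T @ (p_new - c_new)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_new - R @ c_old
    return R, t

def update_registration(image_points, frame_old, frame_new):
    """Move registered image-space points by the same rigid motion that the
    reference frame underwent between the two readings."""
    R, t = rigid_transform(frame_old, frame_new)
    return image_points @ R.T + t
```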
An alternative to touching the reference points A, B, C with the tip of the probe 118 would be to use a contour scanner 126a with attached emitters 126b. Such a device, using some form of energy, such as sound or light, which is emitted, reflected by the contour and sensed, would allow the extraction of a contour of the body elements 10, 20, 30, thus serving as a multitude of reference points which would allow registration to occur.
The registration process is analogous to the process described for ultrasound-extracted contours below.
In certain situations, markers may be used on the skin surface as reference points to allow the transformation of the pre-procedural image data set into the displaced image data set. Reciprocally, skin surface fiducials applied at the time of imaging can be used to re-position the body to match the geometry during imaging, as is described below.
Localization of body elements 10, 20, 30 may be desired without intra-procedural exposure of the reference points A, B, C on those body elements.
Examples wherein the spine is minimally exposed include percutaneous biopsy of the spine or discectomy, spinal fixation, endoscopy, percutaneous spinal implant insertion, percutaneous fusion, insertion of drug delivery systems, and radiation delivery. In this situation, localization of reference points on the body elements must be determined by some form of imaging which can localize through overlying soft tissue and/or discriminate surrounding tissues and structures. There are currently two imaging techniques which are available to a surgeon in the operating room or a doctor in a procedure room which satisfy the needs of being low cost and portable. Both imaging techniques, ultrasonography and radiography, can produce two- or three-dimensional images which can be employed in the fashion described herein to register a three-dimensional form such as a skeletal element.
The coupling of a three-dimensional digitizer to a probe of an ultrasound device affords benefits in that a contour can be obtained which can be related directly to a reference system that defines three-dimensional coordinates in the procedural work space, i.e., the surgical space. In the context of the present invention, a patient is imaged prior to a procedure to generate a pre-procedural image data set which is stored in memory 106. In the procedure room, the patient's body is immobilized to stabilize the spatial relationship between the body elements 10, 20, 30. A procedural reference system, surgical space, for the body is established by attaching a reference frame 116 to one of the body elements, by otherwise attaching emitters to the patient or body elements as noted above, or by attaching emitters to a device capable of tracking one of the body elements, thereby forming a known relationship with the body element. For example, this could be performed by using the percutaneous placement of a reference frame similar to the one described above, by radiopaque markers screwed into the elements, or by placing emitters 130 directly on the skin, as illustrated in Fig. 6, based on the assumption that the skin does not move appreciably during the procedure or with respect to the body elements.
An ultrasound probe 128 equipped with at least three emitters 130 is then placed over the body element of interest. The contour (which can be either two- or three-dimensional) of the body element is then obtained using the ultrasound probe 128. This contour can be expressed directly or indirectly in the procedural coordinates defined by the reference system (surgical space). Emitters 130 communicate with sensors 112 of reference array 110 to indicate the position of the ultrasound probe 128. An ultrasound scanner 131 which energizes probe 128 determines the contour of the body element of interest being scanned. This contour information is provided to processor 104 for storage in intra-procedural geometry data memory 121.
The intra-procedural contour stored in memory 121 is then compared by a contour matching algorithm to a corresponding contour extracted from the pre-operative image data set stored in memory 106. Alternatively, a pre-procedural contour data set may be stored in memory 134 based on a pre-procedural ultrasound scan which is input into memory 134 via scanner interface 102 prior to the procedure. This comparison process continues until a match is found for each one of the elements. Through this contour matching process, a registration is obtained between the images of each body element and the corresponding position of each element in the procedural space, thereby allowing the formation of the displaced image data set 122 used for localization and display.
Note that the contours used in the matching process only have to be sufficiently identical to accomplish a precise match - the contours do not have to cover the same extent of the body element.
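A minimal illustration of such a partial-contour comparison follows. The scoring method and tolerance are assumptions, not taken from the patent, which leaves the matching algorithm unspecified; each intra-procedural point is scored against its nearest pre-procedural point, so the intra-procedural contour need only overlap part of the stored contour.

```python
import numpy as np

# Illustrative contour-match scoring (assumed approach). Contours are
# (N, 2) arrays of points; the intra-procedural contour may cover only a
# portion of the pre-procedural one.
def match_score(intra, pre):
    """Mean nearest-neighbor distance from the intra contour to the pre
    contour, in the contour's length units (e.g. mm)."""
    d = np.linalg.norm(intra[:, None, :] - pre[None, :, :], axis=2)
    return d.min(axis=1).mean()

def is_match(intra, pre, tol=1.0):
    """Accept the registration when the residual is within tolerance."""
    return match_score(intra, pre) <= tol
```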
In certain instances, the ultrasound registration noted above may not be applicable. For example, ultrasound does not penetrate bone, and the presence of overlying bone would preclude the registration of an underlying skeletal element. Further, the resolution of ultrasound declines as the depth of the tissue being imaged increases and may not be useful when the skeletal element is so deep as to preclude obtaining an accurate ultrasonically generated contour. In these circumstances, a radiological method is indicated, which utilizes the greater penetrating power of x-rays.
Pre-operative imaging occurs as usual and the skeletal elements may be discriminated from the soft tissue in the image data set as above. In particular, a CT scan of the skeletal elements 10, 20, 30 could be taken prior to the procedure. Processor 104 may then discriminate the skeletal elements and store the pre-procedural image data set in memory 106. Next, the patient is immobilized for the procedure. A radiograph of the skeletal anatomy of interest is taken by a radiographic device equipped with emitters detectable by the digitizer. For example, a fluoroscopic localizer 136 is illustrated in Fig. 7. Localizer 136 includes a device which emits x-rays, such as tube 138, and a screen 140 which is sensitive to x-rays, producing an image when x-rays pass through it. This screen is referred to as a fluoroscopic plate. Emitters 142 may be positioned on the tube 138, or on the fluoroscopic plate 140, or on both. For devices in which the tube 138 is rigidly attached to the plate 140, emitters need only be provided on either the tube or the plate. Alternatively, the reference array 110 may be attached to the tube or the plate, obviating the need for emitters on this element.
By passing x-rays through the skeletal element 141 of interest, a two-dimensional image based on bone density is produced and recorded by the plate. The image produced by the fluoroscopic localizer 136 is determined by the angle of the tube 138 with respect to the plate 140 and the position of the skeletal elements therebetween, and can be defined with respect to procedure coordinates (surgical space). Fluoroscopic localizer 136 includes a processor which digitizes the image on the plate 140 and provides the digitized image to processor 104 for possible processing and subsequent storage in intra-procedural geometry data memory 121. Processor 104 may simulate the generation of this two-dimensional x-ray image by creating a series of two-dimensional projections of the three-dimensional skeletal elements that have been discriminated in the image data set stored in memory 106.
Each two-dimensional projection would represent the passage of an x-ray beam through the body at a specific angle and distance. In order to form the displaced data set and thus achieve registration, an iterative process is used which selects the two-dimensional projection through the displaced data set that most closely matches the actual radiographic image(s) stored in memory 121.
The described process can utilize more than one radiographic image. Since the processor 104 is also aware of the position of the fluoroscopic localizer because of the emitters 142 thereon, which are in communication with localizer 108, the exact position of the skeletal elements during the procedure is determined.
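The iterative projection matching can be sketched as follows. This is an illustrative simplification, not the patent's method: it uses an orthographic projection of discriminated skeletal points and a one-parameter pose search, whereas a fluoroscope produces a perspective (cone-beam) image and a real search would cover full position and orientation.

```python
import numpy as np

# Illustrative sketch (assumed details): simulate a 2D projection of the
# discriminated 3D skeletal points for each candidate pose, and keep the
# pose whose projection best matches the digitized radiograph.
def project(points, angle):
    """Orthographic 2D projection after rotating the points about the z
    axis; the beam is assumed to travel along the rotated y axis."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    rotated = points @ R.T
    return rotated[:, [0, 2]]          # drop the beam (y) coordinate

def best_pose(points, radiograph_pts, candidates):
    """Return the candidate angle minimizing the projection residual
    against corresponding points extracted from the radiograph."""
    def residual(a):
        return np.sum((project(points, a) - radiograph_pts) ** 2)
    return min(candidates, key=residual)
```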
As noted above, the procedural reference system or surgical space for the body can be established by attaching emitters to a device capable of detecting and tracking, i.e. identifying, one of the body elements, thereby forming a known relationship with the body element. For example, the emitters 130 on the ultrasound probe 128, even without the three emitters on the patient's body, form a type of reference frame 116 as depicted in Figure 6 which can be virtually attached to body element 10 by continuously or periodically updating the ultrasound contour of body element 10 stored in intra-procedural geometry data memory 121, which the processor 104 then uses to match to the contour of body element 10 stored in pre-procedural memory 106, thereby continuously or periodically updating the displaced image data set in memory 122 so that registration with the procedural position of the body elements is maintained.
It is contemplated that a virtual reference frame can be accomplished using any number of devices that are capable of detecting and tracking a body element such as radiographic devices (fluoroscope), endoscopes, or contour scanners.
The above solutions achieve registration by the formation of a displaced image data set stored in memory 122 which matches the displacement of the skeletal elements at the time of the procedure. An alternative technique to achieve registration is to ensure that the positions of the skeletal elements during the procedure are identical to those found at the time of imaging. This can be achieved by using a frame that adjusts and immobilizes the patient's position. In this technique, at least three markers are placed on the skin prior to imaging. These markers have to be detectable by the imaging technique employed and are called fiducials. A multiplicity of fiducials is desirable for improving accuracy.
During the procedure, the patient's body is placed on a frame that allows precise positioning. Such frames are commonly used for spinal surgery and could be modified to allow their use during imaging and could be used for repositioning the patient during the procedure.
These frames could be equipped with drive mechanisms that allow the body to be moved slowly through a variety of positions. The fiducials placed at the time of imaging are replaced by emitters. By activating the drive mechanism on the frame, the exact position of the emitters can be determined during the procedure and compared to the position of the fiducials on the pre-procedural image data set stored in memory 106. Once the emitters assume a geometry identical to the geometry of the fiducials of the image data set, it is considered that the skeletal elements will have resumed a geometric relationship identical to the position during the pre-procedural scan, and the procedure can be performed using the unaltered image data set stored in memory 106.
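The test that the emitters have assumed "a geometry identical to the geometry of the fiducials" can be sketched by comparing inter-marker distances, which are unchanged by any rigid motion of the body. The tolerance value and the distance-based criterion are assumptions; the patent does not state how the comparison is performed.

```python
import numpy as np

# Illustrative check (assumed criterion): the emitter constellation matches
# the fiducial constellation from the pre-procedural image data set when
# every pairwise distance agrees within tolerance. Pairwise distances are
# invariant to the unknown rigid motion between imaging and procedure.
def geometry_matches(emitters, fiducials, tol=0.5):
    """True when the two (N, 3) marker sets have the same shape, to within
    tol (same length units as the inputs, e.g. mm)."""
    def pairwise(p):
        return np.linalg.norm(p[:, None, :] - p[None, :, :], axis=2)
    return bool(np.all(np.abs(pairwise(emitters) - pairwise(fiducials)) <= tol))
```

The frame's drive mechanism would be stepped until this predicate becomes true, at which point the unaltered image data set in memory 106 can be used directly.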
In general, instrumentation employed during procedures on the skeleton is somewhat different than that used for cranial applications. Rather than being concerned with the current location, surgery on the skeleton usually consists of placing hardware through bones, taking a biopsy through the bone, or removing fragments. Therefore, the instrumentation has to be specialized for this application.
One instrument that is used commonly is a drill. By placing emitters on a surgical drill, and by having a fixed relationship between the drill body and its tip (usually a drill bit), the direction and position of the drill bit can be determined. At least three emitters would be needed on the drill, as most drills have a complex three-dimensional shape. Alternatively, emitters 802 could be placed on a drill guide tube 800, and the direction 804 of the screw being placed or hole being made could be determined by the digitizer and indicated on the image data set (see Fig. 8). The skeletal element 806 would also have emitters thereon to indicate its position.
Besides modification of existing instrumentation, new instrumentation is required to provide a reference system for surgery as discussed above. These reference frames, each equipped with at least 3 emitters, require fixation to the bone which prevents movement or rotation.
For open surgery, a clamp-like arrangement, as depicted in Fig. 9, can be used. A clamp 900 is equipped with at least two points 902, 904, 906, 908 which provide fixation to a projection 910 of a skeletal element. By using at least two-point fixation, the clamp 900, which functions as a reference frame, will not rotate with respect to the skeletal element. The clamp includes emitters 912, 914, 916 which communicate with the array to indicate the position of the skeletal element as it is moved during the procedure.
Many procedures deal with bone fragments 940 which are not exposed during surgery, but simply fixated with either wires or screws 950, 952 introduced through the skin 954. Fig. 10 depicts a reference platform 956 attached to such wires or screws 950, 952 projecting through the skin 954. The platform 956 includes a plurality of emitters 958, 960, 962, 964 which communicate with the array to indicate the position of the bone fragment 940 as it is moved during the procedure.
The reference frame can be slipped over or attached to the projecting screws or wires to establish a reference system. Alternatively, the frame can be attached to only one wire, as long as the method of attachment of the frame to the screw or wire prevents rotation, and the wire or screw cannot rotate within the attached skeletal element.
REFERENCE AND LOCALIZATION FRAMES
Fig. 11 is a schematic diagram of one preferred embodiment of a cranial surgical navigation system according to the invention. Portable system cabinet 102 includes a surgical work station 104 which is supported for viewing by the surgeon or technician using the system. Work station 104 includes a screen 106 for illustrating the various scans and is connected to a personal computer 108 for controlling the monitor 106.
The system also includes an optical digitizer including a camera array 110, a camera mounting stand 112 for supporting the array remote from and in line of sight with the patient, a digitizer control unit 114 on the portable system cabinet 102 and connected to the computer 108, a foot switch 116 for controlling operation of the system and a breakout box 118 for interconnecting the foot switch 116 and the digitizer control unit 114.
Also connected via the break out box 118 is a reference frame assembly 120 including a reference frame 122 with cable connected to the break out box 118, a vertical support assembly 124, a head clamp attachment 126 and a horizontal support assembly 128. Optical probe 130 (which is a localization frame) is also connected via cable to the digitizer control unit 114 via the break out box 118.
In operation, a patient's head (or other "rigid" body element) is affixed to the head clamp attachment 126. To determine the position of optical probe 130 with respect to the head within the head clamp attachment 126, a surgeon would step on pedal 116 to energize the emitters of reference frame 122. The emitters would generate a light signal which would be picked up by camera array 110 and triangulated to determine the position of the head. The emitters of the optical probe 164 would also be energized to emit light signals which are picked up by the camera array to determine the position of the optical probe 164. Based on the relative position of the head and the probe 164, control box 114 would illustrate a preoperative scan on the screen of monitor 106 which would indicate the position of the probe relative to and/or within the head.
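The triangulation step can be sketched for the two-camera case: each camera in the array reports a ray toward an energized LED, and the LED position is estimated as the midpoint of the closest approach between the two rays. This midpoint method is an assumption for illustration; the patent does not detail the digitizer's mathematics.

```python
import numpy as np

# Illustrative two-camera triangulation (assumed method). Each ray is
# given by an origin (camera position) and a direction toward the LED.
def triangulate(o1, d1, o2, d2):
    """Midpoint of the closest approach between rays o1 + t1*d1 and
    o2 + t2*d2; equals the intersection when the rays actually meet."""
    o1, d1, o2, d2 = (np.asarray(v, float) for v in (o1, d1, o2, d2))
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = o1 - o2
    b = d1 @ d2                         # cosine between the rays
    t1 = (b * (d2 @ w) - (d1 @ w)) / (1.0 - b * b)
    t2 = ((d2 @ w) - b * (d1 @ w)) / (1.0 - b * b)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```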
Fig. 11A is a top plan view of one preferred embodiment of a cranial reference arc frame 122 according to the invention. Reference frame 122 is for use with a surgical navigation system such as illustrated in Fig. 11 having a sensor array such as camera array 110 which is in communication with the reference frame 122 to identify its position. The reference frame 122 includes a base member 132 having an upper base 134 and a base plate 136 which each have a semi-circular configuration and are joined together by screws 138 to form a cavity 140 therebetween. The base and plate may be made of anodized aluminum or other autoclavable material. The top of the upper base may be provided with one or more spring clamps 142 for engaging a Leyla retractor arm. As shown in Fig. 11A, the upper base is provided with five spring clamps 142.
Either or both ends of the reference frame 122 may be provided with a bayonet fitting 144 for engaging a clamp which would also engage a Leyla retractor. One or both ends of the reference frame 122 is also formed into a radial projection 146 for supporting a screw 148 and crank handle 150 used to lock the reference frame to a head clamp such as head clamp 126 shown in Fig. 11 or a Mayfield Triad™ surgical clamp. This allows the reference frame 122 to be placed in a fixed position relative to the head so that any movement of the head would also include corresponding movement of the reference frame 122.
Radial projection 146, screw 148 and handle 150 constitute a coupling on the base member 132 for engaging a structure attached to a body part (the head), thereby providing a fixed reference relative to the head in order to maintain the base member 132 in fixed relation to the head.
Equally spaced about the reference frame 122 are a plurality of LEDs 152 for communicating with the camera array 110. The LEDs 152 are mounted in holes 154 in the upper base 134, which holes 154 are in communication with the cavity 140. Wires 156, which are connected to each of the terminals of the LEDs 152, are positioned within the cavity 140. The other ends of the wires are connected to a connector 158 for engaging a cable connected to the digitizer 114 of the surgical navigation system. The cable provides signals for activating the LEDs 152. Connector 158 is mounted on a support projection 160 which projects from the base plate 136. This support projection 160 has a channel therein for permitting the wires to be connected to the connector 158. Fig. 11C is a wiring diagram of one preferred embodiment of the reference frame 122 according to the invention. As is illustrated in Fig. 11C, each LED terminal is connected to a separate pin of the connector 158. Although the invention is illustrated as having a connector for engaging a cable, it is contemplated that the reference frame 122 may be battery operated so that no cable is necessary.
The reference frame 122 is essentially a semi-circular arc so that it fits around the head of the patient to allow communication of multiple LEDs 152 on the reference frame 122 with the camera array 110. The multiple LEDs 152 on the reference frame 122 are positioned in a precisely known geometric arrangement so that the calibration of the camera array 110 can be checked continuously by comparing the LEDs' geometric positions as calculated by the digitizer 114 with those precisely known geometric positions. Inconsistencies in this information indicate the need to recalibrate the system or to reposition the reference frame 122 so that it can more accurately communicate with the camera array 110. Frame 122 also includes a calibration divot 162.
In particular, divot 162 is an exactly located depression within the upper base 134 and is used to calibrate, or check the calibration of, the position of the tip of a probe during the medical or surgical procedure. The precise location of each of the LEDs 152 relative to the calibration divot 162 is known. Therefore, locating a tip of a localization frame probe in the calibration divot 162 allows the calibration or the calibration check of the probe in the following manner. The tip of the probe is located within the calibration divot 162 and the LEDs on the probe are energized to provide light signals to the camera array 110. The LEDs on the reference frame 122 are also energized to communicate with the camera array 110. Using the known position of the divot 162 with respect to the position of each of the LEDs 152 as calculated by the digitizer 114, the location of the calibration divot 162 is compared to the location of the tip of the probe as calculated by the digitizer using the LEDs on the probe in order to confirm that there is no distortion in the probe tip relative to the divot 162.
Distortion in the probe tip indicates the need to recalibrate the probe so that it can more accurately communicate with the camera array 110, or to retire the probe.
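The calibration check can be sketched as follows: build a local frame from three probe LEDs, carry the factory-known tip offset into surgical space, and compare the result with the known divot location. The frame construction and the tolerance value are assumptions for illustration; the patent does not specify them.

```python
import numpy as np

# Illustrative probe-tip calibration check (assumed details).
def probe_frame(leds):
    """Orthonormal frame (origin, rotation) from three non-collinear
    probe LEDs, taking the first LED as the origin."""
    p0, p1, p2 = (np.asarray(p, float) for p in leds)
    x = p1 - p0
    x /= np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return p0, np.column_stack([x, y, z])

def tip_position(leds, tip_local):
    """World position of the tip, given its known offset in probe-local
    coordinates (the patent calls this a fixed, known relationship)."""
    origin, R = probe_frame(leds)
    return origin + R @ np.asarray(tip_local, float)

def calibration_ok(tip_world, divot_world, tol=0.5):
    """True when the computed tip sits within tol (mm) of the divot."""
    return bool(np.linalg.norm(tip_world - np.asarray(divot_world, float)) <= tol)
```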
Figs. 12A, 12B, and 12C illustrate another preferred embodiment of the reference frame in the form of a spine reference arc frame 200. As with reference frame 122, spine reference arc frame 200 has an upper base 202 which engages a base plate 204 to form a cavity 206 therebetween. As shown in Fig. 12A, the spine reference arc frame 200 has a generally U-shape configuration with LEDs 208 located at the ends of the legs 209 of the U-shaped member and at the intersection of the legs and base 211 of the U-shaped member.
Projecting laterally from the base 211 is a coupling 210 for engaging a thoraco-lumbar mount 212 as illustrated in Figs. 12D, 12E, and 12F. Also positioned on the base 211 is a calibration divot 214, which is a depression having the same purpose as the calibration divot 162 of the reference frame 122. Coupling 210 has twenty-four evenly spaced teeth 216 arranged in a circular pattern for engaging the twenty-four equally spaced teeth 218 of the thoraco-lumbar mount. This allows the spine reference arc frame 200 to be positioned to form various angles relative to the mount 212. It is contemplated that any other variable position connector may be used to join the spine reference arc frame 200 and the mount 212. Base plate 204 has an opening therein for engaging a connector 220 for receiving a cable to the digitizer control unit 114. The LEDs 208 are connected to the connector 220 by wires 222 as illustrated in wiring diagram Fig. 12G.
Referring to Figs. 12D, 12E, and 12F, thoraco-lumbar mount 212 comprises a clamp shaft 224 having an axial bore therein within which is positioned an actuating shaft 226 which is connected to an actuating knob 228 extending beyond the end of clamp shaft 224.
The end of the actuating shaft 226 opposite the actuating knob 228 has an internal threaded bore 230 which engages external threads of an actuation screw 232. A U-shaped head 234 of screw 232 supports a pivot pin 236 between its legs. The pivot pin passes through the jaws 238 so that the jaws 238 rotate about the pivot pin 236 and move relative to each other, defining a receiving area 240 within which a spinal bone or other body part may be clamped. The jaws 238 have teeth 239 for engaging a spinal bone or other body part and are spring loaded and held in their open position by spring plungers 242. As the actuating knob 228 is turned to engage the threads of actuation screw 232, the screw 232 is drawn into the bore 230, also drawing the jaws into a housing 246. This results in the camming surfaces 244 of housing 246 engaging the follower surfaces 248 of the jaws 238, closing the jaws and closing the receiving area 240 as the jaws are pulled into the housing.
The other end of clamp shaft 224 has a perpendicular projection 250 for supporting the teeth 218 which engage the teeth 216 of the coupling 210 of the spine reference arc frame 200. A spine reference arc clamp screw 252 passes through the array of teeth 218 and engages a threaded opening 254 in the coupling 210 of frame 200. Screw 252 engages opening 254 and locks teeth 216 and teeth 218 together to fix the angle between the spine reference arc frame 200 and the thoraco-lumbar mount 212. As a result, when the mount 212 is connected to a bone by placing the bone in the receiving area 240 and turning the actuating knob 228 to close the jaws 238 and the receiving area, the frame 200 is in a fixed position relative to the bone which is engaged by the jaws. Any movement of the bone results in movement of the frame 200 which can be detected by the camera array 110.
Referring to Figs. 13A, 13B and 13C, one preferred embodiment of a localization biopsy guide frame 300 is illustrated. In general, the frame 300 includes a localization frame 302 which supports a biopsy guide 304 and which also supports a support pin 306. The localization frame 302 is comprised of an upper base 308 and a base plate 310 which join to form a cavity 312 within which the wires 314 connecting to the LEDs 316 are located. As shown in Fig. 13A, the localization frame has an elongated portion 318 and a generally V-shaped portion 320 having legs 322 and 324. An LED 316 is located at the end of each of the legs 322 and an LED 316 is also located at the ends of the elongated portion 318. As a result, the four LEDs 316 form a rectangular array. However, the underlying localization frame 302 does not have a rectangular configuration, which allows it to be adapted for other uses, such as a drill guide assembly as illustrated and described below with regard to Figs. 13D and 13E. In general, the V-shaped portion 320 extends laterally from the elongated portion 318 in order to accomplish the rectangular configuration of the LEDs 316. Note that a rectangular configuration for the LEDs 316 is not required and that, in fact, a trapezoidal configuration for the LEDs 316 may be preferred in order to uniquely distinguish the orientation of the localization frame 302. Support pin 306 passes through the upper base 308 and is essentially parallel to a linear axis defined by the elongated portion 318. The purpose of support pin 306 is to allow clamps to engage it so that the localization biopsy guide frame 300 can be placed in a particular position relative to a body part in order to guide a biopsy needle.
In order to guide a biopsy needle, the localization frame 302 is fitted with a biopsy guide 304 which is mounted to the top of the upper base 308 and held in place by a clamp 328 which engages the upper base 308 via four screws 330. The upper base 308 is also provided with a semicircular channel 332 which forms a seat for receiving the biopsy guide 304. The guide 304 comprises a hollow tube 334 having a collar 336 at one end thereof, which has a threaded radial opening for receiving set screw 338.
The base plate 310 is fitted with a connector 340 for engaging a cable which is connected to the digitizer 114 for providing signals for energizing the LEDs 316. Figure 12G illustrates one preferred embodiment of a wiring diagram which interconnects the connector 340 and four LEDs.
The localization frame 302 is made of the same material as the reference frame 122, i.e., ULTEM™ 1000 black, which is autoclavable. The biopsy guide 304 may be stainless steel or any other autoclavable metal or plastic material. As with the reference frame, the localization frame may be battery operated, thereby avoiding the need for a cable or a connector for engaging the cable.
Figs. 13D and 13E illustrate another localization device in the form of a localization drill guide assembly 350. The assembly 350 includes a localization frame 302 which is the same as the frame used for the localization biopsy guide frame 300, except that it does not have a support pin 306. It does have a semicircular channel 332 in the upper base 308 which receives a handle and drill guide assembly 354 instead of the biopsy guide tube assembly 304. Assembly 354 includes a handle 356 which is used by the surgeon, doctor, technician or nurse conducting the procedure.
Handle 356 has a bore 358 therein for receiving a shaft 360 which is seated within the semicircular channel 332.
The shaft terminates into an integral collar 362 which supports a drill guide tube 364. The axis of the drill guide tube 364 is at an angle relative to the axis of the shaft 360 to assist in aligning the drill guide tube 364 relative to the point at which the drill bit will be entering the patient's body. In one preferred embodiment, handle and drill guide assembly 354 is a standard off-the-shelf instrument which is mounted to the channel 332 of the localization frame 302. The handle and drill guide assembly 354 may be a Sofamor Danek Part 870-705. Screws 366 (having heads insulated with high temperature RTV compound) attach the shaft 360 to the upper base 308 of the localization frame 302 and hold the shaft 360 in place within the channel 332. As noted above, the V-shaped portion 320 of the localization frame 302 forms an opening 368 between its legs 322 and 324 so that the drill guide tube 364 may be located therebetween and project downwardly from the plane generally defined by the localization frame 302. This allows the surgeon to sight in the position of the drill guide tube 364 by looking through the tube. Connector 370 is similar to connector 340, except that it provides an angular engagement with the cable which allows for more freedom of movement of the localization drill guide assembly 350.
As with the localization frame noted above, the frame itself is made of ULTEM 1000 which is autoclavable. The handle may be wood, plastic, or any other autoclavable material and the shaft, collar and drill guide may be metal, plastic or other autoclavable material, such as stainless steel. Figure 13J illustrates a preferred embodiment of the wiring diagram for the localization drill guide assembly 350.
Figs. 13F and 13G illustrate another localization device in the form of a drill yoke localization frame 400. This frame 400 includes a localization frame 302 of the same configuration as the localization frames for the localization biopsy guide frame 300 and the localization drill guide assembly 350.
Projecting from the underside of the base plate 310 is a support member 402 which also supports a drill yoke 404 in a plane which is essentially perpendicular to the plane defined by the localization frame 302. Yoke 404 is essentially a collar which fits over the housing of a Midas Rex™ drill and is fixedly attached thereto by a set screw 406.
The drill yoke localization frame 400 allows the drill housing to be precisely positioned for use during surgery.
Support member 402 also supports a connector 408 for receiving a cable which is connected to the digitizer control unit 114. Support member 402 has a hollow channel therein so that the connector 408 may be connected to the wires 410 which connect to the LEDs 316.
Figure 13J illustrates one preferred embodiment of a wiring connection between the LEDs 316 and the connector 408.
Figs. 13H and 13I illustrate another localization device in the form of a ventriculostomy probe 500. Probe 500 includes a handle 502 having a bore 504 therein for receiving a support shaft 506 which in turn supports a catheter guide tube 508 along an axis which is parallel to the axis of the handle 502. The handle includes three LEDs 510 mounted along its top surface for communication with the camera array 110. The handle 502 has a hollow channel terminating in a bore 512 for receiving a connector 514. The connector 514 is connected to wires 516 which are also connected to the terminals of the LEDs 510. Figure 13J illustrates one preferred embodiment of a wiring diagram for interconnecting the connector 514 and the LEDs 510. In operation, the tube 508 is positioned within the body, the brain for example, so that a catheter may be inserted within the body. Tube 508 includes a top slot 518 which allows a catheter to be inserted therein. Preferably, the tube tip at its center is collinear with the chip height of all three LEDs 510 so that a linear axis is defined therebetween. Based on this linear axis and the predetermined knowledge of the distance between the tip and the LEDs 510, the camera array 110 and digitizer 114 can determine the position of the tip at any instant during a surgical or medical procedure.
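The tip-localization geometry described above — a tip collinear with the three LEDs at a known distance — can be sketched as follows. This is an illustrative reconstruction, not the patented implementation; the function names and the sample coordinates are hypothetical, and a real system would first fit a best-fit line through the measured LED centers to reject noise.

```python
import math

def unit(v):
    """Return the unit vector of a 3-component vector."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def tip_position(leds, tip_offset):
    """Extrapolate the probe tip along the axis defined by collinear
    LED centers, given the known distance from the last LED to the tip.

    leds       -- three (x, y, z) LED positions, ordered from the end
                  farthest from the tip to the end nearest the tip
    tip_offset -- known LED-to-tip distance (same units as positions)
    """
    first, last = leds[0], leds[-1]
    axis = unit(tuple(b - a for a, b in zip(first, last)))
    return tuple(c + tip_offset * d for c, d in zip(last, axis))

# LEDs spaced along the z-axis; tip 50 mm beyond the last LED.
print(tip_position([(0, 0, 0), (0, 0, 30), (0, 0, 60)], 50.0))
# -> (0.0, 0.0, 110.0)
```

Because the digitizer reports the three LED positions continuously, re-evaluating this extrapolation each frame tracks the tip in real time.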
The system of the invention may be used in the following manner. A reference frame is attached to a body part. For example, cranial reference arc frame 122 may be attached directly to a head via a head clamp such as a Mayfield clamp or spine reference arc frame 200 may be attached directly to a spinous bone via thoraco-lumbar mount 212. Thereafter, movement of the body part will result in corresponding movement of the attached reference frame. The position of the body part may be tracked by energizing the LEDs of the reference frame to provide a signal to the camera array 110 so that the array can determine and track the position of the reference frame and, consequently, the position of the body part.
A localization frame is used to precisely position an instrument relative to the body part. For example, a localization biopsy guide frame 300 may be used to position a biopsy needle relative to the body part. Alternatively, a localization drill guide assembly 350 may be used to position a drill bit relative to the body part. Alternatively, a drill yoke localization frame 400 may be used to position a drill relative to the body part. Alternatively, a ventriculostomy probe 500 may be used to position a catheter relative to a body part. The position of the instrument may be tracked by energizing the LEDs of the localization frame to provide a signal to the camera array 110 so that the array can determine and track the position of the localization frame and, consequently, the position of the instrument.
During calibration of the system, the position of the reference frame relative to the body part is determined. Markers used during the preoperative scan are located and identified in coordinates of the surgical space as defined by the reference frame. Note that anatomic landmarks may be used as markers. This provides a relationship between the preoperative scan space and the surgical space. Once this relationship is established, the system knows the position of the preoperative scans relative to the reference frame and thus can generate scans which illustrate the position of the localization frame and the instrument relative to the body part. In other words, the system accomplishes image guided surgery. The system is ideally suited for locating small, deep-seated vascular lesions and tumors and for reducing the extent of the microsurgical dissection. It is also useful in identifying boundaries.
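The scan-space-to-surgical-space relationship established during calibration can be sketched as below. This is a deliberately simplified, hypothetical illustration: it assumes the two coordinate frames are already rotationally aligned and recovers only the translation between paired markers, whereas a real registration solves for the full rigid transform (rotation plus translation, typically via an SVD-based method). All names and coordinates are invented for the example.

```python
def centroid(points):
    """Mean of a list of (x, y, z) points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def register_translation(scan_markers, surgical_markers):
    """Estimate the scan-to-surgical offset from paired marker positions.

    Simplification: frames are assumed rotationally aligned, so the
    offset is just the difference of the two marker centroids.
    """
    cs = centroid(scan_markers)
    cw = centroid(surgical_markers)
    return tuple(w - s for s, w in zip(cs, cw))

def scan_to_surgical(point, offset):
    """Map a preoperative-scan point into surgical space."""
    return tuple(p + o for p, o in zip(point, offset))

# Three paired markers; the body part sits at (5, 2, 1) relative to the scan.
scan = [(0, 0, 0), (12, 0, 0), (0, 12, 0)]
surgical = [(5, 2, 1), (17, 2, 1), (5, 14, 1)]
offset = register_translation(scan, surgical)
print(scan_to_surgical((3, 3, 3), offset))   # -> (8.0, 5.0, 4.0)
```

Once the transform is known, any scan-space lesion coordinate can be mapped into the reference-frame coordinates the camera array tracks, which is what lets the display overlay the live probe position on the preoperative images.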
For example, suppose a surgeon is trying to identify a boundary between normal brain and large supratentorial gliomas, which may be clearly shown on the preoperative scans but which may be difficult to visually locate in the operating room during a procedure. The surgeon would take a localization probe and position it at a point near the boundary. The LEDs of the reference frame and localization probe are fired by use of the foot switch 116. As a result, the monitor 106 would provide a screen showing the position of the probe relative to a preoperative scan. By referring to the monitor, the surgeon can now determine the direction in which the probe should be moved to more precisely locate the boundary. Once the boundary is located, microcottonoid markers can be placed at the boundary of the tumor as displayed on the monitor before resection is started.
The placement of ventricular catheters for shunts, ventriculostomy, or reservoirs is also facilitated by the use of the system, especially in patients who have small ventricles or who have underlying coagulopathy (e.g., liver failure, acquired immunodeficiency syndrome) that makes a single pass desirable. The system can also be useful for performing stereotactic biopsies. For further information regarding the system, see the following articles:
Germano, Isabelle M., The NeuroStation System for Image Guided, Frameless Stereotaxy, Neurosurgery, Vol. 37, No. 2, August 1995.
Smith et al., The NeuroStation™ - A Highly Accurate, Minimally Invasive Solution to Frameless Stereotactic Neurosurgery, Computerized Medical Imaging and Graphics, Vol. 18, No. 4, pp. 247-256, 1994.
In view of the above, it will be seen that the several objects of the invention are achieved and other advantageous results attained.
As various changes could be made in the above constructions, products, and methods without departing from the scope of the invention, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
Claims (98)
1. A system for use during a procedure on a body, said system generating a display representing a position of two or more body elements during the procedure, said system comprising:
a memory storing an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a plurality of data points and corresponding to a plurality of reference points for each of the body elements, the reference points of a particular body element having a known spatial relation to the data points of the particular body element;
means for identifying, during the procedure, the position of the reference points of each of the body elements relative to the reference points of the other body elements;
a processor modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as identified by the identifying means, said processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor and providing a display illustrating the relative position of the body elements during the procedure.
2. The system of claim 1 wherein the reference points are defined in relation to the body elements and further comprising:
a medical or surgical instrument;
means for identifying, during the procedure, the position of the medical or surgical instrument relative to one or more of the body elements; and wherein the display is responsive to the medical or surgical instrument position identifying means and provides a display to illustrate during the procedure the relative position of the body elements relative to the medical or surgical instrument.
3. The system of claim 2 wherein the medical or surgical instrument position identifying means determines an orientation of the medical or surgical instrument relative to the body elements and wherein the display illustrates the orientation of the medical or surgical instrument relative to the body elements.
4. The system of claim 1 wherein the identifying means comprises:
a reference array having a location for providing a reference; and means for determining the position of the reference points of the body elements to be displayed relative to the reference array.
5. The system of claim 4 further comprising a registration probe in communication with the reference array and wherein the determining means determines the position of a tip of the registration probe relative to the reference array whereby the position of the reference points of the body elements can be determined by positioning the tip of the registration probe at each of the reference points.
6. The system of claim 5 wherein the reference array includes sensors and wherein the registration probe includes emitters communicating with the sensors of the reference array to indicate the position of the registration probe relative to the reference array.
7. The system of claim 5 wherein the registration probe includes sensors and wherein the reference array includes emitters communicating with the sensors of the registration probe to indicate the position of the registration probe relative to the reference array.
8. The system of claim 4 further comprising a reference frame having a position in known relation to one of the body elements, said reference frame in communication with the reference array, and further comprising means for determining the position of the reference frame relative to the reference array; whereby the body may be moved during the procedure so that the system can determine the position of each of the body elements after movement without again identifying the relative position of each of the reference points of each of the body elements.
9. The system of claim 8 wherein the reference array includes sensors and wherein the reference frame is attached to one of the body elements and includes emitters communicating with the sensors of the reference array to indicate the position of the reference frame relative to the reference array.
10. The system of claim 8 wherein the reference frame includes sensors and wherein the reference array is adapted to be attached to one of the body elements and includes emitters communicating with the sensors of the reference frame to indicate the position of the reference frame relative to the reference array.
11. The system of claim 1 further comprising means for discriminating the body elements of the image data set by creating an image data subset defining each of the body elements.
12. The system of claim 11 wherein the processor translates each of the image data subsets from the position of the body elements at a prior point in time to the position of the body elements during the procedure whereby the displaced data set consists of the translated image data subsets.
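The per-element translation of image data subsets described in claim 12 can be sketched as follows. This is an illustrative, hypothetical reconstruction (the names, the dictionary layout, and the sample vertebrae are invented): each body element's subset is moved independently by the displacement its tracked reference points reveal, and the translated subsets together form the displaced image data set.

```python
def translate_subset(points, delta):
    """Shift every data point of one body element by its tracked offset."""
    return [tuple(p + d for p, d in zip(pt, delta)) for pt in points]

def displaced_data_set(subsets, deltas):
    """Build a displaced image data set: each body element's image data
    subset is translated independently, per its own tracked reference
    points, and the results are recombined.

    subsets -- {element_name: [(x, y, z), ...]} from the preoperative scan
    deltas  -- {element_name: (dx, dy, dz)} measured during the procedure
    """
    return {name: translate_subset(pts, deltas[name])
            for name, pts in subsets.items()}

# Two vertebrae: L4 has not moved, L5 shifted 2 mm along x since the scan.
scan_subsets = {"L4": [(0, 0, 0), (1, 0, 0)], "L5": [(0, 5, 0), (1, 5, 0)]}
moved = displaced_data_set(scan_subsets, {"L4": (0, 0, 0), "L5": (2, 0, 0)})
print(moved["L5"])   # -> [(2, 5, 0), (3, 5, 0)]
```

A full implementation would apply a rigid transform (rotation as well as translation) per element, but the independent-per-element structure is the same.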
13. The system of claim 11 wherein the identifying means comprises a device configured to determine a position of a contour of each of the body elements during the procedure and wherein the processor compares the position of the contour of each of the body elements at a point in time during the procedure as determined by the device to the position of the contour of each of the body elements prior to the point in time as represented by the image data subset.
14. The system of claim 13 wherein the identifying means comprises an ultrasound probe for determining a position of a contour of each of the body elements during the procedure and wherein the processor compares the position of the contour of each of the body elements at a point in time during the procedure as determined by the device to the position of the contour of each of the body elements prior to the point in time as represented by the image data subset whereby the contour of the body elements may be determined without the need for exposing the body elements.
15. The system of claim 13 wherein the identifying means comprises a scanner for determining a position of a contour of each of the body elements during the procedure and wherein the processor compares the position of the contour of each of the body elements at a point in time during the procedure as determined by the device to the position of the contour of each of the body elements prior to the point in time as represented by the image data subset.
16. The system of claim 14 further comprising a reference array and wherein said ultrasound probe has emitters thereon in communication with the reference array and wherein the determining means is adapted to determine the position of the ultrasound probe relative to the reference array whereby the position of the contour of each body element can be determined.
17. The system of claim 14 comprising a reference array having emitters thereon in communication with the ultrasound probe and wherein the determining means is adapted to determine the position of the ultrasound probe relative to the reference array whereby the position of the contour of each body element can be determined.
18. The system of claim 15 further comprising a reference array and wherein said scanner has emitters thereon in communication with the reference array and wherein the determining means is adapted to determine the position of the scanner relative to the reference array whereby the position of the contour of each body element can be determined.
19. The system of claim 15 comprising a reference array having emitters thereon in communication with the scanner and wherein the determining means is adapted to determine the position of the scanner relative to the reference array whereby the position of the contour of each body element can be determined.
20. The system of claim 4 further comprising a fluoroscopic device for determining a position of a projection of each of the body elements during the procedure and wherein the processor compares the position of the projection of each of the body elements at a point in time during the procedure to the position of the projection of each of the body elements prior to the point in time.
21. The system of claim 20 wherein the fluoroscopic device comprises a fluoroscopic tube in fixed relation to a fluoroscopic plate between which the body elements are located for determining a position of a projection of each of the body elements during the procedure and wherein the processor compares the position of the projection of each of the body elements at a point in time during the procedure to the position of the projection of each of the body elements prior to the point in time.
22. The system of claim 21 further comprising a reference array and wherein said fluoroscopic tube or said fluoroscopic plate each have emitters thereon in communication with the reference array and wherein the determining means is adapted to determine the position of the tube and plate relative to the reference array whereby the position of the projection of each body element can be determined.
23. The system of claim 21 comprising a reference array having emitters thereon in communication with the fluoroscopic tube or the fluoroscopic plate and wherein the determining means is adapted to determine the position of the tube or plate relative to the reference array whereby the position of the projection of each body element can be determined.
24. The system of claim 20 wherein the fluoroscopic device determines a position of a projection of each of the body elements during the procedure, and the processor compares the position of the projection of each of the body elements at a point in time during the procedure as determined by the fluoroscopic device to the position of the projection of each of the body elements prior to the point in time as represented by the image data set whereby the projection of the body elements may be determined without the need for exposing the body elements.
25. The system of claim 24 wherein the reference array has emitters or sensors thereon in communication with the fluoroscopic device and wherein the determining means determines the position of the fluoroscopic device relative to the reference array whereby the position of the projection of each body element can be determined.
26. A system according to claim 2, wherein the processor monitors the position of the instrument and deactivates it when the monitored position indicates that it is outside a predefined safe zone.
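The safe-zone monitoring of claim 26 amounts to a containment test on the tracked instrument position. A minimal sketch, with an axis-aligned box standing in for whatever zone geometry an actual system would define (all names and coordinates here are hypothetical):

```python
def inside_safe_zone(pos, zone_min, zone_max):
    """True if the tracked instrument position lies within the
    axis-aligned safe zone defined by two corner points."""
    return all(lo <= p <= hi for p, lo, hi in zip(pos, zone_min, zone_max))

def monitor(instrument_positions, zone_min, zone_max):
    """Deactivate (return False) as soon as the instrument leaves the zone."""
    for pos in instrument_positions:
        if not inside_safe_zone(pos, zone_min, zone_max):
            return False   # processor would deactivate the instrument here
    return True

# Drill wanders outside the 10 mm cube on its third tracked sample.
samples = [(1, 1, 1), (5, 5, 5), (12, 5, 5)]
print(monitor(samples, (0, 0, 0), (10, 10, 10)))   # -> False
```

In practice the zone would be defined in scan coordinates (e.g., around a planned trajectory) and the test run against every position sample the digitizer reports.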
27. A system according to claim 2, further comprising robotics to control the position of the instrument and wherein the processor monitors the position of the instrument and instructs the robotics to control it in a predetermined manner.
28. A system according to claim 2 wherein the means for identifying includes emitters for emitting light, sound or electromagnetic radiation and sensors for sensing the emitted light, sound or electromagnetic radiation.
29. A method for use during a procedure, said method generating a display representing a position of two or more body elements during the procedure, said method comprising the steps of:
storing an image data set in a memory, the image data set representing the position of the two or more body elements based on scans taken of the body;
reading the image data set stored in the memory, said image data set having a plurality of data points and a plurality of reference points for each of the body elements, the data points of each body element having a spatial relation to the data points of another body element, the reference points of a particular body element having a known spatial relation to the data points of the particular body element;
identifying, during the procedure, the position of each of the reference points of each of the body elements relative to the reference points of the other body elements;
modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure in order to generate a displaced image data set representing the relative position of the body elements during the procedure;
and generating a display based on the displaced image data set illustrating the relative position of the body elements during the procedure.
30. The method of claim 29 further comprising the step of repositioning the body elements so that the display based on the displaced data set representing the position of the body elements at a point in time during the procedure is substantially the same as the display based on the image data set generated prior to the point in time whereby the position of the body elements after repositioning is substantially the same as the position of the body elements when the image data set was generated.
31. The method of claim 29 wherein the reference points are part of the body elements and further comprising the steps of:
providing a medical or surgical instrument;
identifying, during the procedure, the position of the medical or surgical instrument relative to one or more of the body elements; and generating a display based on the position of the medical or surgical instrument illustrating the relative position of the body elements and the medical or surgical instrument during the procedure.
32. The method of claim 31 further comprising the steps of determining an orientation of the medical or surgical instrument relative to the body elements and generating a display illustrating the orientation of the medical or surgical instrument relative to the body elements.
33. The method of claim 29 further comprising the steps of:
providing a reference array having a location for providing a reference; and determining the position of the reference points of the body elements to be displayed relative to the reference array.
34. The method of claim 33 further comprising the steps of providing a registration probe in communication with the reference array and determining the position of the reference points of the body elements by positioning the tip of the registration probe at each of the reference points.
35. The method of claim 33 further comprising the steps of providing a reference frame having a position in known relation to one of the body elements, said reference frame in communication with the reference array, and;
determining the position of the reference frame relative to the reference array whereby the body may be moved during the procedure so that the method can determine the position of each of the body elements after movement without again identifying the relative position of each of the reference points of each of the body elements.
36. The method of claim 29 further comprising the step of discriminating the body elements of the image data set by creating an image data subset defining each of the body elements.
37. The method of claim 36 further comprising the step of translating each of the image data subsets from the position of the body elements at a prior point in time to the position of the body elements during the procedure whereby the displaced data set comprises the translated image data subsets.
38. The method of claim 36 comprising the steps of determining a position of a contour of each of the body elements during the procedure and comparing the position of the contour of each of the body elements at a point in time during the procedure to the position of the contour of each of the body elements prior to the point in time as represented by the image data set.
39. The method of claim 29 wherein the identifying step comprises the steps of positioning the body elements between a fluoroscopic tube and a fluoroscope plate in fixed relation to the tube, energizing the tube to generate a projection of each of the elements on the plate, determining the relative position of the fluoroscopic projection of each of the body elements during the procedure and comparing the position represented by the fluoroscopic projection of each of the body elements at a point in time during the procedure to the relative position of the body elements prior to the point in time.
40. A system for use during a procedure on a body, said system generating a display representing a geometrical relationship of two or more body elements during the procedure, said system comprising:
a memory for storing an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a geometrical spatial relationship of the body elements and having a plurality of reference points on each of the body elements, the reference points of a particular body element having a known spatial relation on the particular body element;
means for identifying, during the procedure, a position of each of the reference points of the body elements to be displayed;
a processor modifying the geometrical relationship of the body elements as necessary according to the identified position of each of the reference points as identified by the identifying means, said processor generating a displaced image data set from the modified geometrical relationship of the body elements, said displaced image data set representing the geometrical relationship of the body elements during the procedure; and a display system utilizing the displaced image data set to generate the display.
41. A system for use during a procedure on a body, said system generating a display representing a position of two or more body elements during the procedure, said system comprising:
a memory configured to store an image data set representing the position of the two or more body elements based on scans taken of the body, the image data set having a plurality of data points and corresponding to a plurality of reference points for each of the body elements, the reference points of a particular body element having a known spatial relation to the data points of the particular body element;
means for identifying the position of the reference points;
a digitizer positioned to receive a signal output by the means for identifying, said digitizer being configured to determine a three dimensional position of the reference points for the particular body element;
a processor modifying the spatial relation of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as determined by the digitizer, said processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor, illustrating the relative position of the body elements during the procedure.
42. The system of claim 41 wherein the means for identifying comprises at least three light emitters positioned with respect to the reference points on the particular body element; and light sensors positioned to detect light emitted by the at least three light emitters; and wherein the digitizer is positioned to receive a signal output by the light sensors, said digitizer being configured to determine a three dimensional position of the reference points for the particular body element relative to the light emitters for the other body elements.
43. The system of claim 41 wherein the reference points are in relation to the body elements and further comprising:
a medical or surgical instrument;
means for identifying, during the procedure, the position of the medical or surgical instrument relative to one or more of the body elements; and wherein the display is responsive to the medical or surgical instrument position identifying means and includes means for illustrating, during the procedure, the relative position of the body elements relative to the medical or surgical instrument.
44. A system according to claim 43, wherein the processor monitors the position of the instrument and deactivates it when the monitored position indicates that it is outside a predefined safe zone.
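The safe-zone interlock of claim 44 can be illustrated with a trivial bounds check; this sketch assumes an axis-aligned-box zone (the zone shape and the name `instrument_allowed` are assumptions for illustration, not specified by the claim):

```python
def instrument_allowed(tip_position, zone_min, zone_max):
    """Return True while the tracked instrument tip stays inside an
    axis-aligned safe zone; a controller monitoring this predicate
    would deactivate the instrument when it turns False."""
    return all(lo <= p <= hi
               for p, lo, hi in zip(tip_position, zone_min, zone_max))
```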
45. A system according to claim 43, further comprising robotics to control the position of the instrument and wherein the processor monitors the position of the instrument and instructs the robotics to control it in a predetermined manner.
46. A system according to claim 41 wherein the means for identifying includes emitters for emitting light, sound or electromagnetic radiation and sensors for sensing the emitted light, sound or electromagnetic radiation.
47. The system of claim 43 wherein the medical or surgical instrument position identifying means determines an orientation of the medical or surgical instrument relative to the body elements and wherein the display illustrates the orientation of the medical or surgical instrument relative to the body elements.
48. The system of claim 41 further comprising means for discriminating the body elements of the image data set by creating an image data subset defining each of the body elements.
49. The system of claim 48 wherein the processor translates each of the image data subsets from the position of the body elements at a prior point in time to the position of the body elements during the procedure whereby the displaced data set consists of the translated image data subsets.
50. The system of claim 48 further comprising a contour position determining means for determining a position of a contour of each of the body elements during the procedure and wherein the processor compares the position of the contour of each of the body elements at a point in time during the procedure as determined by the contour position determining means to the position of the contour of each of the body elements prior to the point in time as represented by the image data subset.
51. The system of claim 50 wherein the contour position determining means comprises an ultrasound probe for determining a position of a contour of each of the body elements during the procedure and wherein the processor compares the position of the contour of each of the body elements at a point in time during the procedure as determined by the contour position determining means to the position of the contour of each of the body elements prior to the point in time as represented by the image data subset whereby the contour of the body elements may be determined without the need for exposing the body elements.
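The contour comparison recited in claims 50 and 51 — checking an intraoperatively acquired contour against the contour stored in the image data subset — can be sketched as a nearest-neighbor distance test over point-sampled contours. A minimal illustration (the name `contour_mismatch` and the point-cloud representation are my assumptions):

```python
import numpy as np

def contour_mismatch(model_contour, probe_contour):
    """Mean distance from each intraoperatively acquired contour point
    (e.g. from an ultrasound probe) to its nearest point on the
    preoperative model contour. A small value indicates the body
    element has not moved relative to its image data subset."""
    d = np.linalg.norm(probe_contour[:, None, :] - model_contour[None, :, :],
                       axis=2)
    return d.min(axis=1).mean()
```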
52. The system of claim 50 wherein the contour position determining means comprises a scanner for determining a position of a contour of each of the body elements during the procedure and wherein the processor compares the position of the contour of each of the body elements at a point in time during the procedure as determined by the contour position determining means to the position of the contour of each of the body elements prior to the point in time as represented by the image data subset.
53. The system of claim 51 further comprising a reference array and wherein said ultrasound probe has emitters thereon in communication with the reference array and wherein the contour position determining means determines the position of the ultrasound probe relative to the reference array whereby the position of the contour of each body element can be determined.
54. The system of claim 51 comprising a reference array having emitters thereon in communication with the ultrasound probe and wherein the contour position determining means determines the position of the ultrasound probe relative to the reference array whereby the position of the contour of each body element can be determined.
55. The system of claim 52 further comprising a reference array and wherein said scanner has emitters thereon in communication with the reference array and wherein the contour position determining means determines the position of the scanner relative to the reference array whereby the position of the contour of each body element can be determined.
56. The system of claim 52 further comprising a reference array having emitters thereon in communication with the scanner and wherein the contour position determining means determines the position of the scanner relative to the reference array whereby the position of the contour of each body element can be determined.
57. The system of claim 42 further comprising a fluoroscopic device for determining a position of a projection of each of the body elements during the procedure and wherein the processor compares the position of the projection of each of the body elements at a point in time during the procedure to the position of the projection of each of the body elements prior to the point in time.
58. The system of claim 57 wherein the fluoroscopic device comprises a fluoroscopic tube in fixed relation to a fluoroscopic plate between which the body elements are located for determining a position of a projection of each of the body elements during the procedure and wherein the processor compares the position of the projection of each of the body elements at a point in time during the procedure to the position of the projection of each of the body elements prior to the point in time.
59. The system of claim 58 further comprising a reference array and wherein said fluoroscopic tube or said fluoroscopic plate have emitters thereon in communication with the reference array and wherein the fluoroscopic device includes means for determining the position of the tube and plate relative to the reference array whereby the position of the projection of each body element can be determined.
60. The system of claim 58 further comprising a reference array having emitters thereon in communication with the fluoroscopic tube or the fluoroscopic plate and wherein the fluoroscopic device includes means for determining the position of the tube or plate relative to the reference array whereby the position of the projection of each body element can be determined.
61. The system of claim 41 wherein the means for identifying comprises light emitters and at least three light sensors positioned with respect to the reference points on the particular body element to detect light emitted by the light emitters; and wherein the digitizer is positioned to receive a signal output by the light sensors, said digitizer being configured to determine a three dimensional position of the reference points for the particular body element relative to the light emitters for the other body elements.
62. The system of claim 57 wherein the fluoroscopic device determines a position of a projection of each of the body elements during the procedure, and the processor compares the position of the projection of each of the body elements at a point in time during the procedure as determined by the fluoroscopic device to the position of the projection of each of the body elements prior to the point in time as represented by the image data set whereby the projection of the body elements may be determined without the need for exposing the body elements.
63. The system of claim 62 comprising a reference array having emitters or sensors thereon in communication with the fluoroscopic device and wherein the determining means determines the position of the fluoroscopic device relative to the reference array whereby the position of the projection of each body element can be determined.
64. A system for displaying relative positions of two or more body elements during a procedure on a body, the system comprising:
a memory storing an image data set, the image data set representing the position of the body elements based on scans of the body, and having a plurality of data points correlatable to a plurality of reference points for each of the body elements, the position of reference points of a particular body element relative to the data points for that particular body element being known;
a reference system for identifying, during the procedure, the position of the reference points of each of the body elements relative to the reference points of the other body elements;
a processor modifying the relative position of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure as identified by the reference system, the processor generating a displaced image data set representing the position of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor to display the relative position of the body elements during the procedure.
65. The system of claim 64 wherein the reference system comprises a reference frame in communication with a reference array.
66. The system of claim 64 further comprising a medical or surgical instrument, wherein the reference system identifies, during the procedure, the position of the instrument relative to at least the body elements, and the display illustrates the position of the instrument relative to the body elements based on the identified position of the instrument.
67. A system according to claim 66, wherein the processor monitors the position of the instrument and deactivates it when the monitored position indicates that it is outside a predefined safe zone.
68. A system according to claim 66, further comprising robotics to control the position of the instrument and wherein the processor monitors the position of the instrument and instructs the robotics to control it in a predetermined manner.
69. A system according to claim 64 wherein the reference system includes emitters for emitting light, sound or electromagnetic radiation and sensors for sensing the emitted light, sound or electromagnetic radiation.
70. The system of claim 65 wherein the reference system determines an orientation of the instrument relative to the body elements and the display illustrates the orientation of the medical instrument relative to the body elements.
71. The system of claim 65 wherein the reference array provides a reference, and further comprising a localizer for determining the position of the reference points of the body elements relative to the reference array.
72. The system of claim 71 further comprising a registration probe in communication with the reference array, wherein the localizer determines the position of a tip of the registration probe relative to the reference array and the position of the reference points of the body elements can be determined by positioning the tip of the registration probe at each of the reference points.
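Determining a reference point by touching it with a tracked registration probe, as in claim 72, reduces to transforming the probe's fixed tip offset through the probe's tracked pose. A hypothetical sketch (the names and the pose representation as a rotation matrix plus translation are assumptions):

```python
import numpy as np

def probe_tip_position(R_probe, t_probe, tip_offset):
    """Coordinates of the registration-probe tip in the reference-array
    frame, given the tracked pose (R_probe, t_probe) of the probe body
    and the fixed tip offset expressed in the probe's own frame."""
    return R_probe @ tip_offset + t_probe
```

Touching the tip to each anatomical reference point and recording this position builds the intraoperative point set used for registration.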
73. The system of claim 65 wherein the position of the reference frame is known in relation to one of the body elements, and the reference system determines the position of the reference frame relative to the reference array so that the body may be moved during the procedure so that the system can determine the position of each of the body elements after movement without re-identifying the relative position of each of the reference points of each of the body elements.
74. The system of claim 64 further comprising a processor for discriminating the body elements of the image data set by creating an image data subset defining each of the body elements.
75. The system of claim 74 wherein the processor translates each of the image data subsets from the position of the body elements at a point in time to the position of the body elements at a later point in time so that the displaced data set consists of the translated image data subsets.
76. The system of claim 64 wherein the reference system determines a position of a contour of each of the body elements during the procedure and wherein the processor compares the position of the contour of each of the body elements at a point in time during the procedure as determined by the device to the position of the contour of each of the body elements prior to the point in time as represented by the image data set.
77. The system of claim 76 wherein the device comprises an ultrasound probe.
78. The system of claim 76 wherein the device comprises a scanner.
79. The system of claim 71 further comprising a fluoroscopic device for determining a position of a projection of each of the body elements during the procedure, wherein the processor compares the position of the projection of each of the body elements at a point in time during the procedure to the position of the projection of each of the body elements prior to the point in time.
80. The system of claim 79 wherein the fluoroscopic device comprises a fluoroscopic tube in fixed relation to a fluoroscopic plate between which the body elements are located.
81. A system for use during a medical or surgical procedure on a body, said system being adapted to generate a display from an image data set representing the position of one or more body elements during the procedure based on scans taken of the body, wherein at least some of the body elements are movable with respect to each other, each scan having a reference for each of the one or more body elements, the reference of a particular body element having a known spatial relation to the particular body element, said system comprising:
a plurality of reference frames, at least one attached to each of the movable body elements;
a localizer for identifying, during the procedure, the position of the reference for each of the body elements to be displayed;
a processor for modifying the image data set according to the identified position of the reference during the procedure, as identified by the localizer, said processor being adapted to generate a displaced image data set representing the position and geometry of the body elements according to the identified position of the reference frames during the procedure, as identified by the localizer; and a display utilizing the displaced image data set generated by the processor, illustrating the position and geometry of the body elements during the procedure.
82. A system according to claim 81 wherein each of the body elements has a contour having a known spatial relation to the corresponding body element and wherein the reference relating to each body element comprises the corresponding contour of the body element.
83. A system according to claim 81 wherein each of the body elements has reference points having a known spatial relation to the corresponding body element and wherein the reference relating to each body element comprises the corresponding reference points of the body element.
84. A system according to claim 83, further comprising a reference array for providing a reference, the localizer being adapted for determining the position of the reference points for the body elements to be displayed relative to the reference array.
85. A system according to claim 84 wherein the reference frames each have a position in known relation to their associated body element, said reference frames being adapted to be in communication with the reference array, and wherein the localizer is adapted to determine the position of the reference frames relative to the reference array, the body being movable during the procedure while the body elements remain in fixed relation to their associated reference frame so that the system can determine the position of each of the body elements after movement without again identifying the relative position of each of the reference points for each of the body elements.
86. A system according to claim 84 wherein the reference frames each have a corresponding position in known relation to each of the body elements, said reference frames being adapted to be in communication with the reference array, and wherein the localizer is adapted to determine the position of the reference frames relative to the reference array, the body being movable during the procedure while the body elements remain in known relation to the reference frames so that the system can determine the position of each of the body elements after movement without again identifying the relative position of each of the reference points for each of the body elements.
87. A system according to any one of claims 84-86, wherein the reference array includes one of sensors or emitters and wherein each one of the plurality of reference frames is adapted to be attached to each of the body elements, the plurality of reference frames including the other of sensors or emitters communicating with the emitters or sensors, respectively, of the reference array to indicate the position of the reference frames relative to the reference array.
88. A system according to any one of claims 84-87, wherein the reference array includes sensors and wherein the reference frames are capable of one of continuously and periodically detecting and tracking at least one of the body elements and include emitters communicating with the sensors of the reference array to indicate the position of the reference frames relative to the reference array.
89. A system according to any one of claims 81-88, further comprising a device for determining a position of a contour of each of the body elements during the procedure and wherein the processor is adapted to compare the position of the contour of each of the body elements at a point in time during the procedure as determined by the device to the position of the contour of each of the body elements prior to the point in time as represented by the image data subset.
90. A system according to any one of claims 81-89, wherein the localizer comprises a fluoroscopic device for determining a position of a projection of each of the body elements during the procedure and wherein the processor compares the position of the projection of each of the body elements at a point in time during the procedure to the position of the projection of each of the body elements prior to the point in time.
91. A system according to any one of claims 81-90 wherein at least one of the body elements comprises a semi-rigid body element such as soft tissue.
92. A system according to any one of claims 81-91 wherein the processor modifies the image data set by translation or transformation to generate a displaced image data set representing the position and geometry of the body elements during the procedure.
93. A method for use during a medical or surgical procedure on a body, said method being adapted to generate a display from an image data set representing the position of one or more body elements during the procedure based on scans taken of the body, wherein at least some of the body elements are movable with respect to each other, each scan having a reference for each of the one or more body elements, the reference of a particular body element having a known spatial relation to the particular body element, wherein there is a plurality of reference frames, at least one attached to each of the movable body elements; said method comprising the steps of:
identifying, during the procedure, the position of the reference for each of the body elements to be displayed;
modifying the image data set according to the identified position of the reference during the procedure, as identified by the localizer;
generating a displaced image data set representing the position and geometry of the body elements according to the identified position of the reference frames during the procedure; and illustrating the position and geometry of the body elements during the procedure utilizing the displaced image data set.
94. A method for displaying the relative positions of a plurality of body elements during a procedure on a body, the method comprising:
storing scan images of body elements, the scan images having a plurality of data points corresponding to a plurality of reference points for each of the body elements, the position of the reference points of a particular body element relative to the data points for that particular body element being known;
identifying the position of each of the reference points of each of the body elements relative to the reference points of the other body elements during a procedure;
modifying the relative position of the data points of one body element relative to the data points of another body element according to the identified relative position of the reference points during the procedure;
generating a displaced image data set representing the position of the body elements during the procedure; and displaying the relative position of the body elements during the procedure.
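Read together, the storing / identifying / modifying / generating / displaying steps of claim 94 amount to a per-body-element rigid registration followed by merging the moved subsets. A hypothetical end-to-end sketch in Python (names such as `displaced_image_data_set` are mine, not the patent's, and correspondence-based Kabsch fitting is one assumed way to realize the "modifying" step):

```python
import numpy as np

def kabsch(src, dst):
    """Rigid fit (R, t) with dst ≈ src @ R.T + t (Kabsch algorithm)."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, dc - R @ sc

def displaced_image_data_set(subsets, ref_scan, ref_now):
    """For each body element, apply the transform implied by its
    scanned vs. intraoperatively identified reference points, then
    merge the moved subsets into one displaced data set for display."""
    moved = []
    for name, pts in subsets.items():
        R, t = kabsch(ref_scan[name], ref_now[name])
        moved.append(pts @ R.T + t)
    return np.vstack(moved)
```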
95. The method of claim 94 further comprising discriminating the contour of the body elements of the image data set and creating an image data subset defining each of the body elements.
96. The method of claim 95 further comprising scanning the body elements during the procedure to determine the contour of the body elements and comparing the position of the contour of each of the body elements at a point in time during the procedure to the position of the contour of each of the body elements prior to the point in time as represented by the image data subset.
97. A system according to claim 4, further comprising a plurality of reference frames, each having a corresponding position in known relation to each of the body elements, said reference frames being adapted to be in communication with the reference array, and wherein the localizer is adapted to determine the position of the reference frames relative to the reference array, the body being movable during the procedure while the body elements remain in known relation to the reference frames so that the system can determine the position of each of the body elements after movement without again identifying the relative position of each of the reference points for each of the body elements.
98. A system according to claim 97, wherein the reference array includes sensors and wherein each one of the plurality of reference frames is adapted to be attached to each of the body elements, the plurality of reference frames including emitters communicating with the sensors of the reference array to indicate the position of the reference frames relative to the reference array.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US31961594A | 1994-10-07 | 1994-10-07 | |
US08/319,615 | 1994-10-07 | ||
US341595P | 1995-09-08 | 1995-09-08 | |
US60/003,415 | 1995-09-08 | ||
PCT/US1995/012894 WO1996011624A2 (en) | 1994-10-07 | 1995-10-05 | Surgical navigation systems including reference and localization frames |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2201877A1 CA2201877A1 (en) | 1996-04-25 |
CA2201877C true CA2201877C (en) | 2004-06-08 |
Family
ID=26671725
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002201877A Expired - Lifetime CA2201877C (en) | 1994-10-07 | 1995-10-05 | Surgical navigation systems including reference and localization frames |
Country Status (8)
Country | Link |
---|---|
US (3) | US6236875B1 (en) |
EP (3) | EP0869745B8 (en) |
JP (1) | JP3492697B2 (en) |
AT (3) | ATE320226T1 (en) |
AU (1) | AU3950595A (en) |
CA (1) | CA2201877C (en) |
DE (4) | DE29521895U1 (en) |
WO (1) | WO1996011624A2 (en) |
Families Citing this family (688)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2652928B1 (en) | 1989-10-05 | 1994-07-29 | Diadix Sa | INTERACTIVE LOCAL INTERVENTION SYSTEM WITHIN A AREA OF A NON-HOMOGENEOUS STRUCTURE. |
US6347240B1 (en) * | 1990-10-19 | 2002-02-12 | St. Louis University | System and method for use in displaying images of a body part |
US7074179B2 (en) * | 1992-08-10 | 2006-07-11 | Intuitive Surgical Inc | Method and apparatus for performing minimally invasive cardiac procedures |
US8603095B2 (en) | 1994-09-02 | 2013-12-10 | Puget Bio Ventures LLC | Apparatuses for femoral and tibial resection |
US6695848B2 (en) | 1994-09-02 | 2004-02-24 | Hudson Surgical Design, Inc. | Methods for femoral and tibial resection |
DE29521895U1 (en) * | 1994-10-07 | 1998-09-10 | Univ St Louis | Surgical navigation system comprising reference and localization frames |
US6978166B2 (en) * | 1994-10-07 | 2005-12-20 | Saint Louis University | System for use in displaying images of a body part |
US5592939A (en) | 1995-06-14 | 1997-01-14 | Martinelli; Michael A. | Method and system for navigating a catheter probe |
US6256529B1 (en) * | 1995-07-26 | 2001-07-03 | Burdette Medical Systems, Inc. | Virtual reality 3D visualization for surgical procedures |
US5951571A (en) * | 1996-09-19 | 1999-09-14 | Surgical Navigation Specialist Inc. | Method and apparatus for correlating a body with an image of the body |
US6132441A (en) | 1996-11-22 | 2000-10-17 | Computer Motion, Inc. | Rigidly-linked articulating wrist with decoupled motion transmission |
US6731966B1 (en) | 1997-03-04 | 2004-05-04 | Zachary S. Spigelman | Systems and methods for targeting a lesion |
US6119033A (en) * | 1997-03-04 | 2000-09-12 | Biotrack, Inc. | Method of monitoring a location of an area of interest within a patient during a medical procedure |
SE511291C2 (en) * | 1997-03-18 | 1999-09-06 | Anders Widmark | Procedure, arrangement and reference organ for radiation therapy |
DE19751761B4 (en) * | 1997-04-11 | 2006-06-22 | Brainlab Ag | System and method for currently accurate detection of treatment targets |
US5970499A (en) | 1997-04-11 | 1999-10-19 | Smith; Kurt R. | Method and apparatus for producing and accessing composite data |
US6752812B1 (en) | 1997-05-15 | 2004-06-22 | Regent Of The University Of Minnesota | Remote actuation of trajectory guide |
WO1999001078A2 (en) * | 1997-07-03 | 1999-01-14 | Koninklijke Philips Electronics N.V. | Image-guided surgery system |
US6096050A (en) * | 1997-09-19 | 2000-08-01 | Surgical Navigation Specialist Inc. | Method and apparatus for correlating a body with an image of the body |
US6226548B1 (en) | 1997-09-24 | 2001-05-01 | Surgical Navigation Technologies, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
US5999837A (en) * | 1997-09-26 | 1999-12-07 | Picker International, Inc. | Localizing and orienting probe for view devices |
US5987960A (en) * | 1997-09-26 | 1999-11-23 | Picker International, Inc. | Tool calibrator |
US6081336A (en) * | 1997-09-26 | 2000-06-27 | Picker International, Inc. | Microscope calibrator |
US6021343A (en) * | 1997-11-20 | 2000-02-01 | Surgical Navigation Technologies | Image guided awl/tap/screwdriver |
CA2333583C (en) * | 1997-11-24 | 2005-11-08 | Everette C. Burdette | Real time brachytherapy spatial registration and visualization system |
US6348058B1 (en) * | 1997-12-12 | 2002-02-19 | Surgical Navigation Technologies, Inc. | Image guided spinal surgery guide, system, and method for use thereof |
WO1999033406A1 (en) | 1997-12-31 | 1999-07-08 | Surgical Navigation Technologies, Inc. | Wireless probe system for use with a stereotactic surgical device |
EP1051123A1 (en) * | 1998-01-28 | 2000-11-15 | Eric Richard Cosman | Optical object tracking system |
FR2779339B1 (en) | 1998-06-09 | 2000-10-13 | Integrated Surgical Systems Sa | MATCHING METHOD AND APPARATUS FOR ROBOTIC SURGERY, AND MATCHING DEVICE COMPRISING APPLICATION |
JP4132009B2 (en) | 1998-06-22 | 2008-08-13 | アーオー テクノロジー アクチエンゲゼルシャフト | Fiducial matching using fiducial screws |
US6459927B1 (en) | 1999-07-06 | 2002-10-01 | Neutar, Llc | Customizable fixture for patient positioning |
US6327491B1 (en) | 1998-07-06 | 2001-12-04 | Neutar, Llc | Customized surgical fixture |
US6482182B1 (en) | 1998-09-03 | 2002-11-19 | Surgical Navigation Technologies, Inc. | Anchoring system for a brain lead |
EP1115328A4 (en) * | 1998-09-24 | 2004-11-10 | Super Dimension Ltd | System and method for determining the location of a catheter during an intra-body medical procedure |
US6195577B1 (en) | 1998-10-08 | 2001-02-27 | Regents Of The University Of Minnesota | Method and apparatus for positioning a device in a body |
US6340363B1 (en) | 1998-10-09 | 2002-01-22 | Surgical Navigation Technologies, Inc. | Image guided vertebral distractor and method for tracking the position of vertebrae |
JP4101951B2 (en) * | 1998-11-10 | 2008-06-18 | オリンパス株式会社 | Surgical microscope |
US6430434B1 (en) * | 1998-12-14 | 2002-08-06 | Integrated Surgical Systems, Inc. | Method for determining the location and orientation of a bone for computer-assisted orthopedic procedures using intraoperatively attached markers |
FR2788428B1 (en) * | 1999-01-18 | 2001-06-22 | Georges Bettega | SURGICAL IMPLANT, POSITIONING TOOL FOR THIS IMPLANT, MARKER SUPPORT FOR THIS IMPLANT AND CORRESPONDING SURGICAL KIT |
US6144875A (en) * | 1999-03-16 | 2000-11-07 | Accuray Incorporated | Apparatus and method for compensating for respiratory and patient motion during treatment |
ES2260901T3 (en) | 1999-03-17 | 2006-11-01 | Synthes Ag Chur | IN SITU PLANNING AND GUIDE DEVICE FOR A LIGAMENT GRAFT. |
CA2370960C (en) | 1999-04-20 | 2006-06-13 | Synthes (U.S.A.) | Device for the percutaneous obtainment of 3d-coordinates on the surface of a human or animal organ |
EP1173105B1 (en) | 1999-04-22 | 2004-10-27 | Medtronic Surgical Navigation Technologies | Apparatus and method for image guided surgery |
JP4693246B2 (en) * | 1999-05-03 | 2011-06-01 | アーオー テクノロジー アクチエンゲゼルシャフト | Position detecting device having gravity vector direction calculation auxiliary means |
AU778448B2 (en) | 1999-05-07 | 2004-12-02 | University Of Virginia Patent Foundation | Method and system for fusing a spinal region |
US6805697B1 (en) * | 1999-05-07 | 2004-10-19 | University Of Virginia Patent Foundation | Method and system for fusing a spinal region |
US6928490B1 (en) | 1999-05-20 | 2005-08-09 | St. Louis University | Networking infrastructure for an operating room |
US6390982B1 (en) * | 1999-07-23 | 2002-05-21 | Univ Florida | Ultrasonic guidance of target structures for medical procedures |
US6674916B1 (en) * | 1999-10-18 | 2004-01-06 | Z-Kat, Inc. | Interpolation in transform space for multiple rigid object registration |
US6379302B1 (en) | 1999-10-28 | 2002-04-30 | Surgical Navigation Technologies Inc. | Navigation information overlay onto ultrasound imagery |
US6499488B1 (en) * | 1999-10-28 | 2002-12-31 | Winchester Development Associates | Surgical sensor |
US11331150B2 (en) | 1999-10-28 | 2022-05-17 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US6235038B1 (en) | 1999-10-28 | 2001-05-22 | Medtronic Surgical Navigation Technologies | System for translation of electromagnetic and optical localization systems |
US8644907B2 (en) | 1999-10-28 | 2014-02-04 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
EP1229847B1 (en) | 1999-11-15 | 2006-04-05 | SYNTHES AG Chur | Device for the determination of reduction parameters for the subsequent reduction of a fractured bone |
DE19963440C2 (en) * | 1999-12-28 | 2003-02-20 | Siemens Ag | Method and system for visualizing an object |
US7635390B1 (en) | 2000-01-14 | 2009-12-22 | Marctec, Llc | Joint replacement component having a modular articulating surface |
DE10005880B4 (en) * | 2000-02-10 | 2004-09-16 | Grieshammer, Thomas, Dr. | Device for inserting screws into a vertebra |
WO2001064124A1 (en) * | 2000-03-01 | 2001-09-07 | Surgical Navigation Technologies, Inc. | Multiple cannula image guided tool for image guided procedures |
US8517923B2 (en) | 2000-04-03 | 2013-08-27 | Intuitive Surgical Operations, Inc. | Apparatus and methods for facilitating treatment of tissue via improved delivery of energy based and non-energy based modalities |
US6468203B2 (en) | 2000-04-03 | 2002-10-22 | Neoguide Systems, Inc. | Steerable endoscope and improved method of insertion |
US6858005B2 (en) | 2000-04-03 | 2005-02-22 | Neo Guide Systems, Inc. | Tendon-driven endoscope and methods of insertion |
US8888688B2 (en) | 2000-04-03 | 2014-11-18 | Intuitive Surgical Operations, Inc. | Connector device for a controllable instrument |
US6610007B2 (en) | 2000-04-03 | 2003-08-26 | Neoguide Systems, Inc. | Steerable segmented endoscope and method of insertion |
US6535756B1 (en) | 2000-04-07 | 2003-03-18 | Surgical Navigation Technologies, Inc. | Trajectory storage apparatus and method for surgical navigation system |
US7660621B2 (en) | 2000-04-07 | 2010-02-09 | Medtronic, Inc. | Medical device introducer |
US7366561B2 (en) * | 2000-04-07 | 2008-04-29 | Medtronic, Inc. | Robotic trajectory guide |
US20030135102A1 (en) * | 2000-05-18 | 2003-07-17 | Burdette Everette C. | Method and system for registration and guidance of intravascular treatment |
US6478802B2 (en) * | 2000-06-09 | 2002-11-12 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus for display of an image guided drill bit |
US7228165B1 (en) * | 2000-06-26 | 2007-06-05 | Boston Scientific Scimed, Inc. | Apparatus and method for performing a tissue resection procedure |
US6837892B2 (en) * | 2000-07-24 | 2005-01-04 | Mazor Surgical Technologies Ltd. | Miniature bone-mounted surgical robot |
US7359748B1 (en) * | 2000-07-26 | 2008-04-15 | Rhett Drugge | Apparatus for total immersion photography |
WO2002013714A1 (en) | 2000-08-17 | 2002-02-21 | Image Guided Neurologies, Inc. | Trajectory guide with instrument immobilizer |
US6907281B2 (en) * | 2000-09-07 | 2005-06-14 | Ge Medical Systems | Fast mapping of volumetric density data onto a two-dimensional screen |
FR2814667B1 (en) * | 2000-09-29 | 2002-12-20 | Bertrand Lombard | STEREOTAXIC FRAME, TRANSMITTING BLOCK HOLDER DEVICE AND RELATED SURGICAL NAVIGATION METHOD AND DEVICE |
JP4674948B2 (en) * | 2000-09-29 | 2011-04-20 | オリンパス株式会社 | Surgical navigation device and method of operating surgical navigation device |
EP1197185B1 (en) * | 2000-10-11 | 2004-07-14 | Stryker Leibinger GmbH & Co. KG | Device for determining or tracking the position of a bone |
US6556857B1 (en) | 2000-10-24 | 2003-04-29 | Sdgi Holdings, Inc. | Rotation locking driver for image guided instruments |
IL140136A (en) * | 2000-12-06 | 2010-06-16 | Intumed Ltd | Apparatus for self-guided intubation |
DE50002672D1 (en) * | 2000-12-19 | 2003-07-31 | Brainlab Ag | Method and device for navigation-assisted dental treatment |
US7043961B2 (en) * | 2001-01-30 | 2006-05-16 | Z-Kat, Inc. | Tool calibrator and tracker system |
US6678546B2 (en) * | 2001-01-30 | 2004-01-13 | Fischer Imaging Corporation | Medical instrument guidance using stereo radiolocation |
US20090182226A1 (en) * | 2001-02-15 | 2009-07-16 | Barry Weitzner | Catheter tracking system |
DE10108139A1 (en) * | 2001-02-20 | 2002-08-29 | Boegl Max Bauunternehmung Gmbh | Method for measuring and/or machining a workpiece |
US7547307B2 (en) * | 2001-02-27 | 2009-06-16 | Smith & Nephew, Inc. | Computer assisted knee arthroplasty instrumentation, systems, and processes |
WO2002067800A2 (en) | 2001-02-27 | 2002-09-06 | Smith & Nephew, Inc. | Surgical navigation systems and processes for high tibial osteotomy |
US8062377B2 (en) | 2001-03-05 | 2011-11-22 | Hudson Surgical Design, Inc. | Methods and apparatus for knee arthroplasty |
US7594917B2 (en) * | 2001-03-13 | 2009-09-29 | Ethicon, Inc. | Method and apparatus for fixing a graft in a bone tunnel |
US7195642B2 (en) | 2001-03-13 | 2007-03-27 | Mckernan Daniel J | Method and apparatus for fixing a graft in a bone tunnel |
US6517546B2 (en) * | 2001-03-13 | 2003-02-11 | Gregory R. Whittaker | Method and apparatus for fixing a graft in a bone tunnel |
CN100556370C (en) * | 2001-03-26 | 2009-11-04 | Lb医药有限公司 | Method and apparatus system for the removal or processing of material |
JP2002306509A (en) * | 2001-04-10 | 2002-10-22 | Olympus Optical Co Ltd | Remote operation supporting system |
GB0109444D0 (en) | 2001-04-17 | 2001-06-06 | Unilever Plc | Toothbrush usage monitoring system |
US7526112B2 (en) | 2001-04-30 | 2009-04-28 | Chase Medical, L.P. | System and method for facilitating cardiac intervention |
US7327862B2 (en) * | 2001-04-30 | 2008-02-05 | Chase Medical, L.P. | System and method for facilitating cardiac intervention |
EP1401323A4 (en) * | 2001-05-31 | 2009-06-03 | Image Navigation Ltd | Image guided implantology methods |
US6636757B1 (en) | 2001-06-04 | 2003-10-21 | Surgical Navigation Technologies, Inc. | Method and apparatus for electromagnetic navigation of a surgical probe near a metal object |
US6887245B2 (en) * | 2001-06-11 | 2005-05-03 | Ge Medical Systems Global Technology Company, Llc | Surgical drill for use with a computer assisted surgery system |
US6482209B1 (en) * | 2001-06-14 | 2002-11-19 | Gerard A. Engh | Apparatus and method for sculpting the surface of a joint |
US6723102B2 (en) | 2001-06-14 | 2004-04-20 | Alexandria Research Technologies, Llc | Apparatus and method for minimally invasive total joint replacement |
US6584339B2 (en) | 2001-06-27 | 2003-06-24 | Vanderbilt University | Method and apparatus for collecting and processing physical space data for use while performing image-guided surgery |
US7063705B2 (en) | 2001-06-29 | 2006-06-20 | Sdgi Holdings, Inc. | Fluoroscopic locator and registration device |
AU2002352034B2 (en) * | 2001-06-29 | 2007-01-18 | Warsaw Orthopedic, Inc. | Fluoroscopic locator and registration device |
US7708741B1 (en) | 2001-08-28 | 2010-05-04 | Marctec, Llc | Method of preparing bones for knee replacement surgery |
WO2003032837A1 (en) * | 2001-10-12 | 2003-04-24 | University Of Florida | Computer controlled guidance of a biopsy needle |
ATE261273T1 (en) * | 2001-12-18 | 2004-03-15 | Brainlab Ag | PROJECTION OF PATIENT IMAGE DATA FROM FLUOROSCOPIC OR SLICE IMAGE CAPTURE METHODS ONTO VIDEO IMAGES |
EP1469781B1 (en) | 2002-01-09 | 2016-06-29 | Intuitive Surgical Operations, Inc. | Apparatus for endoscopic colectomy |
US6947786B2 (en) | 2002-02-28 | 2005-09-20 | Surgical Navigation Technologies, Inc. | Method and apparatus for perspective inversion |
US8010180B2 (en) | 2002-03-06 | 2011-08-30 | Mako Surgical Corp. | Haptic guidance system and method |
US7457658B2 (en) * | 2002-03-07 | 2008-11-25 | Medtronic, Inc. | Algorithm for accurate three-dimensional reconstruction of non-linear implanted medical devices in vivo |
US6971991B2 (en) * | 2002-03-08 | 2005-12-06 | Imperium, Inc. | Apparatus for multimodal plane wave ultrasound imaging |
US9155544B2 (en) | 2002-03-20 | 2015-10-13 | P Tech, Llc | Robotic systems and methods |
US9375203B2 (en) | 2002-03-25 | 2016-06-28 | Kieran Murphy Llc | Biopsy needle |
US7927368B2 (en) | 2002-03-25 | 2011-04-19 | Kieran Murphy Llc | Device viewable under an imaging beam |
US20030181810A1 (en) | 2002-03-25 | 2003-09-25 | Murphy Kieran P. | Kit for image guided surgical procedures |
US6990368B2 (en) | 2002-04-04 | 2006-01-24 | Surgical Navigation Technologies, Inc. | Method and apparatus for virtual digital subtraction angiography |
US6887247B1 (en) | 2002-04-17 | 2005-05-03 | Orthosoft Inc. | CAS drill guide and drill tracking system |
US7998062B2 (en) | 2004-03-29 | 2011-08-16 | Superdimension, Ltd. | Endoscope structures and techniques for navigating to a target in branched structure |
US6980849B2 (en) * | 2002-04-17 | 2005-12-27 | Ricardo Sasso | Instrumentation and method for performing image-guided spinal surgery using an anterior surgical approach |
US6993374B2 (en) * | 2002-04-17 | 2006-01-31 | Ricardo Sasso | Instrumentation and method for mounting a surgical navigation reference device to a patient |
US8180429B2 (en) * | 2002-04-17 | 2012-05-15 | Warsaw Orthopedic, Inc. | Instrumentation and method for mounting a surgical navigation reference device to a patient |
US7213598B2 (en) | 2002-05-28 | 2007-05-08 | Brainlab Ag | Navigation-calibrating rotationally asymmetrical medical instruments or implants |
US6736821B2 (en) | 2002-06-18 | 2004-05-18 | Sdgi Holdings, Inc. | System and method of mating implants and vertebral bodies |
US7477763B2 (en) | 2002-06-18 | 2009-01-13 | Boston Scientific Scimed, Inc. | Computer generated representation of the imaging pattern of an imaging device |
US7107091B2 (en) * | 2002-07-25 | 2006-09-12 | Orthosoft Inc. | Multiple bone tracking |
US7787934B2 (en) * | 2002-07-29 | 2010-08-31 | Medtronic, Inc. | Fiducial marker devices, tools, and methods |
US20040019265A1 (en) * | 2002-07-29 | 2004-01-29 | Mazzocchi Rudy A. | Fiducial marker devices, tools, and methods |
US7720522B2 (en) * | 2003-02-25 | 2010-05-18 | Medtronic, Inc. | Fiducial marker devices, tools, and methods |
US7187800B2 (en) * | 2002-08-02 | 2007-03-06 | Computerized Medical Systems, Inc. | Method and apparatus for image segmentation using Jensen-Shannon divergence and Jensen-Renyi divergence |
DE10235795B4 (en) | 2002-08-05 | 2018-10-31 | Siemens Healthcare Gmbh | Medical device |
EP2151215B1 (en) * | 2002-08-09 | 2012-09-19 | Kinamed, Inc. | Non-imaging tracking tools for hip replacement surgery |
CA2633137C (en) | 2002-08-13 | 2012-10-23 | The Governors Of The University Of Calgary | Microsurgical robot system |
US7306603B2 (en) | 2002-08-21 | 2007-12-11 | Innovative Spinal Technologies | Device and method for percutaneous placement of lumbar pedicle screws and connecting rods |
EP1542591A2 (en) * | 2002-08-29 | 2005-06-22 | Computerized Medical Systems, Inc. | Methods and systems for localizing a medical imaging probe and for spatial registration and mapping of a biopsy needle during a tissue biopsy |
DE10241069B4 (en) * | 2002-09-05 | 2004-07-15 | Aesculap Ag & Co. Kg | Device for detecting the contour of a surface |
US7704260B2 (en) | 2002-09-17 | 2010-04-27 | Medtronic, Inc. | Low profile instrument immobilizer |
US7166114B2 (en) | 2002-09-18 | 2007-01-23 | Stryker Leibinger Gmbh & Co Kg | Method and system for calibrating a surgical tool and adapter thereof |
US20040068263A1 (en) * | 2002-10-04 | 2004-04-08 | Benoit Chouinard | CAS bone reference with articulated support |
FI113615B (en) * | 2002-10-17 | 2004-05-31 | Nexstim Oy | Three-dimensional modeling of skull shape and content |
WO2004039249A1 (en) * | 2002-10-29 | 2004-05-13 | Olympus Corporation | Endoscope information processor and processing method |
US7697972B2 (en) | 2002-11-19 | 2010-04-13 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
US7599730B2 (en) | 2002-11-19 | 2009-10-06 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
US7094241B2 (en) | 2002-11-27 | 2006-08-22 | Zimmer Technology, Inc. | Method and apparatus for achieving correct limb alignment in unicondylar knee arthroplasty |
US8388540B2 (en) * | 2002-12-13 | 2013-03-05 | Boston Scientific Scimed, Inc. | Method and apparatus for orienting a medical image |
US20040120558A1 (en) * | 2002-12-18 | 2004-06-24 | Sabol John M | Computer assisted data reconciliation method and apparatus |
US7490085B2 (en) * | 2002-12-18 | 2009-02-10 | Ge Medical Systems Global Technology Company, Llc | Computer-assisted data processing system and method incorporating automated learning |
US7636596B2 (en) | 2002-12-20 | 2009-12-22 | Medtronic, Inc. | Organ access device and method |
US7029477B2 (en) * | 2002-12-20 | 2006-04-18 | Zimmer Technology, Inc. | Surgical instrument and positioning method |
US8246602B2 (en) | 2002-12-23 | 2012-08-21 | Medtronic, Inc. | Catheters with tracking elements and permeable membranes |
US7226456B2 (en) | 2002-12-31 | 2007-06-05 | Depuy Acromed, Inc. | Trackable medical tool for use in image guided surgery |
WO2004068406A2 (en) * | 2003-01-30 | 2004-08-12 | Chase Medical, L.P. | A method and system for image processing and contour assessment |
US7660623B2 (en) | 2003-01-30 | 2010-02-09 | Medtronic Navigation, Inc. | Six degree of freedom alignment display for medical procedures |
US20050043609A1 (en) * | 2003-01-30 | 2005-02-24 | Gregory Murphy | System and method for facilitating cardiac intervention |
US20040171930A1 (en) | 2003-02-04 | 2004-09-02 | Zimmer Technology, Inc. | Guidance system for rotary surgical instrument |
US6925339B2 (en) | 2003-02-04 | 2005-08-02 | Zimmer Technology, Inc. | Implant registration device for surgical navigation system |
US20050267354A1 (en) * | 2003-02-04 | 2005-12-01 | Joel Marquart | System and method for providing computer assistance with spinal fixation procedures |
US20040152955A1 (en) * | 2003-02-04 | 2004-08-05 | Mcginley Shawn E. | Guidance system for rotary surgical instrument |
US7458977B2 (en) | 2003-02-04 | 2008-12-02 | Zimmer Technology, Inc. | Surgical navigation instrument useful in marking anatomical structures |
US6988009B2 (en) * | 2003-02-04 | 2006-01-17 | Zimmer Technology, Inc. | Implant registration device for surgical navigation system |
US7559935B2 (en) * | 2003-02-20 | 2009-07-14 | Medtronic, Inc. | Target depth locators for trajectory guide for introducing an instrument |
US7896889B2 (en) | 2003-02-20 | 2011-03-01 | Medtronic, Inc. | Trajectory guide with angled or patterned lumens or height adjustment |
DE10309500A1 (en) * | 2003-02-26 | 2004-09-16 | Aesculap Ag & Co. Kg | Patella reference device |
US20040176683A1 (en) * | 2003-03-07 | 2004-09-09 | Katherine Whitin | Method and apparatus for tracking insertion depth |
US8882657B2 (en) | 2003-03-07 | 2014-11-11 | Intuitive Surgical Operations, Inc. | Instrument having radio frequency identification systems and methods for use |
US20070055142A1 (en) * | 2003-03-14 | 2007-03-08 | Webler William E | Method and apparatus for image guided position tracking during percutaneous procedures |
US20050049672A1 (en) * | 2003-03-24 | 2005-03-03 | Murphy Kieran P. | Stent delivery system and method using a balloon for a self-expandable stent |
WO2004087235A2 (en) * | 2003-03-27 | 2004-10-14 | Cierra, Inc. | Methods and apparatus for treatment of patent foramen ovale |
US7972330B2 (en) * | 2003-03-27 | 2011-07-05 | Terumo Kabushiki Kaisha | Methods and apparatus for closing a layered tissue defect |
US7186251B2 (en) * | 2003-03-27 | 2007-03-06 | Cierra, Inc. | Energy based devices and methods for treatment of patent foramen ovale |
US8021362B2 (en) * | 2003-03-27 | 2011-09-20 | Terumo Kabushiki Kaisha | Methods and apparatus for closing a layered tissue defect |
US7165552B2 (en) * | 2003-03-27 | 2007-01-23 | Cierra, Inc. | Methods and apparatus for treatment of patent foramen ovale |
US6939348B2 (en) * | 2003-03-27 | 2005-09-06 | Cierra, Inc. | Energy based devices and methods for treatment of patent foramen ovale |
US7293562B2 (en) * | 2003-03-27 | 2007-11-13 | Cierra, Inc. | Energy based devices and methods for treatment of anatomic tissue defects |
US20070016005A1 (en) * | 2003-05-21 | 2007-01-18 | Koninklijke Philips Electronics N.V. | Apparatus and method for recording the movement of organs of the body |
WO2004107959A2 (en) * | 2003-05-30 | 2004-12-16 | Schaerer Mayfield Usa, Inc. | Stylus for surgical navigation system |
WO2005000140A2 (en) | 2003-06-02 | 2005-01-06 | Murphy Stephen B | Virtual trial reduction system for hip arthroplasty and coordinate systems therefor |
US7831295B2 (en) * | 2003-06-05 | 2010-11-09 | Aesculap Ag & Co. Kg | Localization device cross check |
US7311701B2 (en) * | 2003-06-10 | 2007-12-25 | Cierra, Inc. | Methods and apparatus for non-invasively treating atrial fibrillation using high intensity focused ultrasound |
US20040260199A1 (en) * | 2003-06-19 | 2004-12-23 | Wilson-Cook Medical, Inc. | Cytology collection device |
US6932823B2 (en) | 2003-06-24 | 2005-08-23 | Zimmer Technology, Inc. | Detachable support arm for surgical navigation system reference array |
US7873403B2 (en) * | 2003-07-15 | 2011-01-18 | Brainlab Ag | Method and device for determining a three-dimensional form of a body from two-dimensional projection images |
US7463823B2 (en) * | 2003-07-24 | 2008-12-09 | Brainlab Ag | Stereoscopic visualization device for patient image data and video images |
US7343030B2 (en) * | 2003-08-05 | 2008-03-11 | Imquant, Inc. | Dynamic tumor treatment system |
US8055323B2 (en) * | 2003-08-05 | 2011-11-08 | Imquant, Inc. | Stereotactic system and method for defining a tumor treatment region |
US20050053200A1 (en) * | 2003-08-07 | 2005-03-10 | Predrag Sukovic | Intra-operative CT scanner |
US8150495B2 (en) | 2003-08-11 | 2012-04-03 | Veran Medical Technologies, Inc. | Bodily sealants and methods and apparatus for image-guided delivery of same |
US7398116B2 (en) * | 2003-08-11 | 2008-07-08 | Veran Medical Technologies, Inc. | Methods, apparatuses, and systems useful in conducting image guided interventions |
US20050049485A1 (en) * | 2003-08-27 | 2005-03-03 | Harmon Kim R. | Multiple configuration array for a surgical navigation system |
EP2316328B1 (en) | 2003-09-15 | 2012-05-09 | Super Dimension Ltd. | Wrap-around holding device for use with bronchoscopes |
EP1667749B1 (en) | 2003-09-15 | 2009-08-05 | Super Dimension Ltd. | System of accessories for use with bronchoscopes |
US7862570B2 (en) | 2003-10-03 | 2011-01-04 | Smith & Nephew, Inc. | Surgical positioners |
WO2005034757A1 (en) * | 2003-10-03 | 2005-04-21 | Xoran Technologies, Inc. | Ct imaging system for robotic intervention |
US7764985B2 (en) | 2003-10-20 | 2010-07-27 | Smith & Nephew, Inc. | Surgical navigation system component fault interfaces and related processes |
US20050101970A1 (en) * | 2003-11-06 | 2005-05-12 | Rosenberg William S. | Functional image-guided placement of bone screws, path optimization and orthopedic surgery |
US7794467B2 (en) | 2003-11-14 | 2010-09-14 | Smith & Nephew, Inc. | Adjustable surgical cutting systems |
US20050109855A1 (en) | 2003-11-25 | 2005-05-26 | Mccombs Daniel | Methods and apparatuses for providing a navigational array |
US20070163139A1 (en) * | 2003-11-26 | 2007-07-19 | Russell Donald G | Markers, methods of marking, and marking systems for use in association with images |
US20050119570A1 (en) * | 2003-12-01 | 2005-06-02 | Stephen Lewis | Ultrasonic image and visualization aid |
US20070014452A1 (en) * | 2003-12-01 | 2007-01-18 | Mitta Suresh | Method and system for image processing and assessment of a state of a heart |
JP4155915B2 (en) * | 2003-12-03 | 2008-09-24 | ニスカ株式会社 | Sheet post-processing apparatus and image forming apparatus |
DE10357184A1 (en) * | 2003-12-08 | 2005-07-07 | Siemens Ag | Combination of different images of a bodily region under investigation, producing display images from an assembled three-dimensional fluorescence data image set |
US7771436B2 (en) | 2003-12-10 | 2010-08-10 | Stryker Leibinger Gmbh & Co. Kg. | Surgical navigation tracker, system and method |
US7873400B2 (en) * | 2003-12-10 | 2011-01-18 | Stryker Leibinger Gmbh & Co. Kg. | Adapter for surgical navigation trackers |
US9393039B2 (en) | 2003-12-17 | 2016-07-19 | Brainlab Ag | Universal instrument or instrument set for computer guided surgery |
EP1543789B1 (en) * | 2003-12-17 | 2006-10-04 | BrainLAB AG | Universal instrument or set of instruments for navigation in computer-aided surgery |
US7641661B2 (en) | 2003-12-26 | 2010-01-05 | Zimmer Technology, Inc. | Adjustable resection guide |
US20060030854A1 (en) | 2004-02-02 | 2006-02-09 | Haines Timothy G | Methods and apparatus for wireplasty bone resection |
US20060015115A1 (en) | 2004-03-08 | 2006-01-19 | Haines Timothy G | Methods and apparatus for pivotable guide surfaces for arthroplasty |
US7857814B2 (en) | 2004-01-14 | 2010-12-28 | Hudson Surgical Design, Inc. | Methods and apparatus for minimally invasive arthroplasty |
US8114083B2 (en) | 2004-01-14 | 2012-02-14 | Hudson Surgical Design, Inc. | Methods and apparatus for improved drilling and milling tools for resection |
US8021368B2 (en) | 2004-01-14 | 2011-09-20 | Hudson Surgical Design, Inc. | Methods and apparatus for improved cutting tools for resection |
US7815645B2 (en) | 2004-01-14 | 2010-10-19 | Hudson Surgical Design, Inc. | Methods and apparatus for pinplasty bone resection |
EP1711119A1 (en) | 2004-01-23 | 2006-10-18 | Traxyz Medical, Inc. | Methods and apparatus for performing procedures on target locations in the body |
US7333643B2 (en) * | 2004-01-30 | 2008-02-19 | Chase Medical, L.P. | System and method for facilitating cardiac intervention |
US20050187562A1 (en) * | 2004-02-03 | 2005-08-25 | Grimm James E. | Orthopaedic component inserter for use with a surgical navigation system |
US8764725B2 (en) | 2004-02-09 | 2014-07-01 | Covidien Lp | Directional anchoring mechanism, method and applications thereof |
EP1563799B2 (en) * | 2004-02-11 | 2012-11-28 | BrainLAB AG | Adjustable marker arrangement |
US20050182422A1 (en) | 2004-02-13 | 2005-08-18 | Schulte Gregory T. | Apparatus for securing a therapy delivery device within a burr hole and method for making same |
EP1734878B1 (en) * | 2004-02-20 | 2011-10-19 | Hector O. Pacheco | Method for determining the size of pedicle screws |
US8046049B2 (en) * | 2004-02-23 | 2011-10-25 | Biosense Webster, Inc. | Robotically guided catheter |
US20050215888A1 (en) * | 2004-03-05 | 2005-09-29 | Grimm James E | Universal support arm and tracking array |
US20060052691A1 (en) * | 2004-03-05 | 2006-03-09 | Hall Maleata Y | Adjustable navigated tracking element mount |
US20050203539A1 (en) * | 2004-03-08 | 2005-09-15 | Grimm James E. | Navigated stemmed orthopaedic implant inserter |
US8114086B2 (en) | 2004-03-08 | 2012-02-14 | Zimmer Technology, Inc. | Navigated cut guide locator |
US7993341B2 (en) * | 2004-03-08 | 2011-08-09 | Zimmer Technology, Inc. | Navigated orthopaedic guide and method |
FR2867376B1 (en) * | 2004-03-12 | 2007-01-05 | Tornier Sa | DEVICE AND ASSEMBLY FOR DETERMINING THE POSITION OF A PORTION OF A HUMAN BODY |
EP1723605A1 (en) * | 2004-03-12 | 2006-11-22 | Bracco Imaging, S.P.A. | Accuracy evaluation of video-based augmented reality enhanced surgical navigation systems |
WO2005091978A2 (en) * | 2004-03-22 | 2005-10-06 | Vanderbilt University | System and methods for surgical instrument disablement via image-guided position feedback |
US7520848B2 (en) * | 2004-04-09 | 2009-04-21 | The Board Of Trustees Of The Leland Stanford Junior University | Robotic apparatus for targeting and producing deep, focused transcranial magnetic stimulation |
US8052591B2 (en) * | 2006-05-05 | 2011-11-08 | The Board Of Trustees Of The Leland Stanford Junior University | Trajectory-based deep-brain stereotactic transcranial magnetic stimulation |
EP1737375B1 (en) | 2004-04-21 | 2021-08-11 | Smith & Nephew, Inc | Computer-aided navigation systems for shoulder arthroplasty |
US7711405B2 (en) * | 2004-04-28 | 2010-05-04 | Siemens Corporation | Method of registering pre-operative high field closed magnetic resonance images with intra-operative low field open interventional magnetic resonance images |
FR2871363B1 (en) * | 2004-06-15 | 2006-09-01 | Medtech Sa | ROBOTIZED GUIDING DEVICE FOR SURGICAL TOOL |
US7367975B2 (en) | 2004-06-21 | 2008-05-06 | Cierra, Inc. | Energy based devices and methods for treatment of anatomic tissue defects |
US20060036148A1 (en) * | 2004-07-23 | 2006-02-16 | Grimm James E | Navigated surgical sizing guide |
US8167888B2 (en) * | 2004-08-06 | 2012-05-01 | Zimmer Technology, Inc. | Tibial spacer blocks and femoral cutting guide |
US8016835B2 (en) * | 2004-08-06 | 2011-09-13 | Depuy Spine, Inc. | Rigidly guided implant placement with control assist |
US8182491B2 (en) | 2004-08-06 | 2012-05-22 | Depuy Spine, Inc. | Rigidly guided implant placement |
US8131342B2 (en) * | 2004-08-24 | 2012-03-06 | General Electric Company | Method and system for field mapping using integral methodology |
US7397475B2 (en) * | 2004-09-02 | 2008-07-08 | Siemens Medical Solutions Usa, Inc. | Interactive atlas extracted from volume data |
EP1647236A1 (en) | 2004-10-15 | 2006-04-19 | BrainLAB AG | Apparatus and method for the position checking of markers |
US9579121B2 (en) * | 2004-10-28 | 2017-02-28 | Nico Corporation | Holding arrangement for a surgical access system |
US9216015B2 (en) | 2004-10-28 | 2015-12-22 | Vycor Medical, Inc. | Apparatus and methods for performing brain surgery |
US7497863B2 (en) | 2004-12-04 | 2009-03-03 | Medtronic, Inc. | Instrument guiding stage apparatus and method for using same |
US7744606B2 (en) | 2004-12-04 | 2010-06-29 | Medtronic, Inc. | Multi-lumen instrument guide |
US7621874B2 (en) * | 2004-12-14 | 2009-11-24 | Scimed Life Systems, Inc. | Systems and methods for improved three-dimensional imaging of a body lumen |
US20060161059A1 (en) * | 2005-01-20 | 2006-07-20 | Zimmer Technology, Inc. | Variable geometry reference array |
US20060229641A1 (en) * | 2005-01-28 | 2006-10-12 | Rajiv Gupta | Guidance and insertion system |
US20060173268A1 (en) * | 2005-01-28 | 2006-08-03 | General Electric Company | Methods and systems for controlling acquisition of images |
US20060181482A1 (en) * | 2005-02-03 | 2006-08-17 | Iaquinto John M | Apparatus for providing visual data during an operation |
US20060184003A1 (en) * | 2005-02-03 | 2006-08-17 | Lewin Jonathan S | Intra-procedurally determining the position of an internal anatomical target location using an externally measurable parameter |
US8177788B2 (en) | 2005-02-22 | 2012-05-15 | Smith & Nephew, Inc. | In-line milling system |
CA2816973C (en) * | 2005-03-07 | 2014-11-04 | Hector O. Pacheco | Cannula for insertion in the elongated opening of a pedicle |
AU2006235506B2 (en) | 2005-04-11 | 2011-06-30 | Terumo Kabushiki Kaisha | Methods and apparatus to achieve a closure of a layered tissue defect |
EP3135238B1 (en) * | 2005-04-18 | 2019-05-29 | Image Navigation Ltd | Methods and apparatus for dental implantation |
JP4914574B2 (en) * | 2005-04-18 | 2012-04-11 | オリンパスメディカルシステムズ株式会社 | Endoscope shape detection device |
US20060287583A1 (en) | 2005-06-17 | 2006-12-21 | Pool Cover Corporation | Surgical access instruments for use with delicate tissues |
US7840256B2 (en) * | 2005-06-27 | 2010-11-23 | Biomet Manufacturing Corporation | Image guided tracking array and method |
US20070083100A1 (en) * | 2005-07-20 | 2007-04-12 | Sebastian Schulz-Stubner | Ventriculostomy Catheter with In Situ Ultrasound Capability |
DE102005039657A1 (en) * | 2005-08-22 | 2007-03-22 | Siemens Ag | Method for representing a medical instrument, e.g. a catheter, on an X-ray diagnostic device by superimposing the instrument's actual position on an image of a three-dimensional data record |
US20070066881A1 (en) | 2005-09-13 | 2007-03-22 | Edwards Jerome R | Apparatus and method for image guided accuracy verification |
EP1924198B1 (en) | 2005-09-13 | 2019-04-03 | Veran Medical Technologies, Inc. | Apparatus for image guided accuracy verification |
US7643862B2 (en) | 2005-09-15 | 2010-01-05 | Biomet Manufacturing Corporation | Virtual mouse for use in surgical navigation |
US20070073133A1 (en) * | 2005-09-15 | 2007-03-29 | Schoenefeld Ryan J | Virtual mouse for use in surgical navigation |
US20070066917A1 (en) * | 2005-09-20 | 2007-03-22 | Hodorek Robert A | Method for simulating prosthetic implant selection and placement |
US7835784B2 (en) | 2005-09-21 | 2010-11-16 | Medtronic Navigation, Inc. | Method and apparatus for positioning a reference frame |
US20070078678A1 (en) * | 2005-09-30 | 2007-04-05 | Disilvestro Mark R | System and method for performing a computer assisted orthopaedic surgical procedure |
US7981038B2 (en) * | 2005-10-11 | 2011-07-19 | Carnegie Mellon University | Sensor guided catheter navigation system |
US8357181B2 (en) | 2005-10-27 | 2013-01-22 | Warsaw Orthopedic, Inc. | Intervertebral prosthetic device for spinal stabilization and method of implanting same |
EP1956962B1 (en) | 2005-11-22 | 2020-09-16 | Intuitive Surgical Operations, Inc. | System for determining the shape of a bendable instrument |
US8083879B2 (en) | 2005-11-23 | 2011-12-27 | Intuitive Surgical Operations, Inc. | Non-metallic, multi-strand control cable for steerable instruments |
WO2007064937A1 (en) * | 2005-12-02 | 2007-06-07 | University Of Rochester | Image-guided therapy delivery and diagnostic needle system |
US20070161888A1 (en) * | 2005-12-30 | 2007-07-12 | Sherman Jason T | System and method for registering a bone of a patient with a computer assisted orthopaedic surgery system |
US20070167741A1 (en) * | 2005-12-30 | 2007-07-19 | Sherman Jason T | Apparatus and method for registering a bone of a patient with a computer assisted orthopaedic surgery system |
US7957789B2 (en) | 2005-12-30 | 2011-06-07 | Medtronic, Inc. | Therapy delivery system including a navigation element |
US7525309B2 (en) | 2005-12-30 | 2009-04-28 | Depuy Products, Inc. | Magnetic sensor array |
US8862200B2 (en) * | 2005-12-30 | 2014-10-14 | DePuy Synthes Products, LLC | Method for determining a position of a magnetic source |
US20070156066A1 (en) * | 2006-01-03 | 2007-07-05 | Zimmer Technology, Inc. | Device for determining the shape of an anatomic surface |
US7520880B2 (en) * | 2006-01-09 | 2009-04-21 | Zimmer Technology, Inc. | Adjustable surgical support base with integral hinge |
US7744600B2 (en) | 2006-01-10 | 2010-06-29 | Zimmer Technology, Inc. | Bone resection guide and method |
US9168102B2 (en) | 2006-01-18 | 2015-10-27 | Medtronic Navigation, Inc. | Method and apparatus for providing a container to a sterile environment |
US8083795B2 (en) | 2006-01-18 | 2011-12-27 | Warsaw Orthopedic, Inc. | Intervertebral prosthetic device for spinal stabilization and method of manufacturing same |
US7780671B2 (en) | 2006-01-23 | 2010-08-24 | Zimmer Technology, Inc. | Bone resection apparatus and method for knee surgery |
CA2640075C (en) * | 2006-01-24 | 2012-07-17 | Leucadia 6, Llc | Methods for determining pedicle base circumference, pedicle isthmus and center of the pedicle isthmus for pedicle screw or instrument placement in spinal surgery |
US7662183B2 (en) * | 2006-01-24 | 2010-02-16 | Timothy Haines | Dynamic spinal implants incorporating cartilage bearing graft material |
US7885705B2 (en) | 2006-02-10 | 2011-02-08 | Murphy Stephen B | System and method for facilitating hip surgery |
US8219178B2 (en) | 2007-02-16 | 2012-07-10 | Catholic Healthcare West | Method and system for performing invasive medical procedures using a surgical robot |
US10893912B2 (en) | 2006-02-16 | 2021-01-19 | Globus Medical Inc. | Surgical tool systems and methods |
US10357184B2 (en) | 2012-06-21 | 2019-07-23 | Globus Medical, Inc. | Surgical tool systems and method |
US10653497B2 (en) | 2006-02-16 | 2020-05-19 | Globus Medical, Inc. | Surgical tool systems and methods |
US20070239153A1 (en) * | 2006-02-22 | 2007-10-11 | Hodorek Robert A | Computer assisted surgery system using alternative energy technology |
US8323290B2 (en) * | 2006-03-03 | 2012-12-04 | Biomet Manufacturing Corp. | Tensor for use in surgical navigation |
US8626953B2 (en) | 2006-03-03 | 2014-01-07 | St. Louis University | System and method of communicating data for a hospital |
JP5407014B2 (en) | 2006-03-17 | 2014-02-05 | Zimmer, Inc. | A method for determining the contour of the surface of the bone to be excised and evaluating the fit of the prosthesis to the bone |
US20070225725A1 (en) * | 2006-03-21 | 2007-09-27 | Zimmer Technology, Inc. | Modular acetabular component inserter |
WO2007110076A1 (en) * | 2006-03-24 | 2007-10-04 | B-K Medical Aps | Biopsy system |
US20070236514A1 (en) * | 2006-03-29 | 2007-10-11 | Bracco Imaging Spa | Methods and Apparatuses for Stereoscopic Image Guided Surgical Navigation |
US8267850B2 (en) * | 2007-11-27 | 2012-09-18 | Cervel Neurotech, Inc. | Transcranial magnet stimulation of deep brain targets |
US9352167B2 (en) | 2006-05-05 | 2016-05-31 | Rio Grande Neurosciences, Inc. | Enhanced spatial summation for deep-brain transcranial magnetic stimulation |
EP1857068B1 (en) * | 2006-05-11 | 2009-02-25 | BrainLAB AG | Device for tracking rigid, difficult or impossible to register body structures |
US8190389B2 (en) | 2006-05-17 | 2012-05-29 | Acclarent, Inc. | Adapter for attaching electromagnetic image guidance components to a medical device |
US8425418B2 (en) * | 2006-05-18 | 2013-04-23 | Eigen, Llc | Method of ultrasonic imaging and biopsy of the prostate |
WO2007136768A2 (en) | 2006-05-19 | 2007-11-29 | Mako Surgical Corp. | Method and apparatus for controlling a haptic device |
US8568299B2 (en) | 2006-05-19 | 2013-10-29 | Intuitive Surgical Operations, Inc. | Methods and apparatus for displaying three-dimensional orientation of a steerable distal tip of an endoscope |
US7894653B2 (en) * | 2006-05-23 | 2011-02-22 | Siemens Medical Solutions Usa, Inc. | Automatic organ detection using machine learning and classification algorithms |
RU2445007C2 (en) * | 2006-05-24 | 2012-03-20 | Koninklijke Philips Electronics N.V. | Superposition of coordinate systems |
US8635082B2 (en) | 2006-05-25 | 2014-01-21 | DePuy Synthes Products, LLC | Method and system for managing inventories of orthopaedic implants |
US8560047B2 (en) | 2006-06-16 | 2013-10-15 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
JP4816281B2 (en) * | 2006-06-22 | 2011-11-16 | Fuji Xerox Co., Ltd. | Document use management system, document management server and program thereof |
JP4875416B2 (en) | 2006-06-27 | 2012-02-15 | Olympus Medical Systems Corp. | Medical guide system |
AU2007265472B2 (en) * | 2006-06-28 | 2011-08-11 | Pacheco, Hector O | Templating and placing artificial discs in spine |
DE102006032127B4 (en) | 2006-07-05 | 2008-04-30 | Aesculap AG & Co. KG | Calibration method and calibration device for a surgical referencing unit |
US8333771B2 (en) | 2006-08-04 | 2012-12-18 | Magrod, Llc | System for pushing and pulling surgical implants into position in vivo via a tether |
US7976546B2 (en) * | 2006-08-04 | 2011-07-12 | Magrod, Llc | Magnetic targeting system for facilitating navigation |
US8092458B2 (en) | 2006-08-04 | 2012-01-10 | Magrod, Llc | Magnetic targeting system and method of using the same |
US8092461B2 (en) * | 2006-08-04 | 2012-01-10 | Magrod, Llc | Method and apparatus for facilitating navigation of an implant |
US8565853B2 (en) | 2006-08-11 | 2013-10-22 | DePuy Synthes Products, LLC | Simulated bone or tissue manipulation |
US20080051677A1 (en) * | 2006-08-23 | 2008-02-28 | Warsaw Orthopedic, Inc. | Method and apparatus for osteochondral autograft transplantation |
US7747306B2 (en) | 2006-09-01 | 2010-06-29 | Warsaw Orthopedic, Inc. | Osteochondral implant procedure |
US20080123910A1 (en) * | 2006-09-19 | 2008-05-29 | Bracco Imaging Spa | Method and system for providing accuracy evaluation of image guided surgery |
US8660635B2 (en) | 2006-09-29 | 2014-02-25 | Medtronic, Inc. | Method and apparatus for optimizing a computer assisted surgical procedure |
US7535991B2 (en) | 2006-10-16 | 2009-05-19 | Oraya Therapeutics, Inc. | Portable orthovoltage radiotherapy |
US7620147B2 (en) | 2006-12-13 | 2009-11-17 | Oraya Therapeutics, Inc. | Orthovoltage radiotherapy |
US8064664B2 (en) | 2006-10-18 | 2011-11-22 | Eigen, Inc. | Alignment method for registering medical images |
US7879040B2 (en) | 2006-10-23 | 2011-02-01 | Warsaw Orthopedic, Inc. | Method and apparatus for osteochondral autograft transplantation |
US7804989B2 (en) * | 2006-10-30 | 2010-09-28 | Eigen, Inc. | Object recognition system for medical imaging |
US8852192B2 (en) | 2006-11-13 | 2014-10-07 | Warsaw Orthopedic, Inc. | Method and apparatus for osteochondral autograft transplantation |
US20080140070A1 (en) * | 2006-12-07 | 2008-06-12 | Cierra, Inc. | Multi-electrode apparatus for tissue welding and ablation |
US20080140112A1 (en) * | 2006-12-07 | 2008-06-12 | Cierra, Inc. | Method for orienting a multi-electrode apparatus |
US20080140069A1 (en) * | 2006-12-07 | 2008-06-12 | Cierra, Inc. | Multi-electrode apparatus for tissue welding and ablation |
US8068648B2 (en) * | 2006-12-21 | 2011-11-29 | Depuy Products, Inc. | Method and system for registering a bone of a patient with a computer assisted orthopaedic surgery system |
US20080161687A1 (en) * | 2006-12-29 | 2008-07-03 | Suri Jasjit S | Repeat biopsy system |
US7780349B2 (en) * | 2007-01-03 | 2010-08-24 | James G. Schwade | Apparatus and method for robotic radiosurgery beam geometry quality assurance |
US8473030B2 (en) | 2007-01-12 | 2013-06-25 | Medtronic Vascular, Inc. | Vessel position and configuration imaging apparatus and methods |
US8175350B2 (en) * | 2007-01-15 | 2012-05-08 | Eigen, Inc. | Method for tissue culture extraction |
US20080183068A1 (en) * | 2007-01-25 | 2008-07-31 | Warsaw Orthopedic, Inc. | Integrated Visualization of Surgical Navigational and Neural Monitoring Information |
US20080183188A1 (en) * | 2007-01-25 | 2008-07-31 | Warsaw Orthopedic, Inc. | Integrated Surgical Navigational and Neuromonitoring System |
US7987001B2 (en) | 2007-01-25 | 2011-07-26 | Warsaw Orthopedic, Inc. | Surgical navigational and neuromonitoring instrument |
US20080183074A1 (en) * | 2007-01-25 | 2008-07-31 | Warsaw Orthopedic, Inc. | Method and apparatus for coordinated display of anatomical and neuromonitoring information |
US8374673B2 (en) | 2007-01-25 | 2013-02-12 | Warsaw Orthopedic, Inc. | Integrated surgical navigational and neuromonitoring system having automated surgical assistance and control |
US20080186378A1 (en) * | 2007-02-06 | 2008-08-07 | Feimo Shen | Method and apparatus for guiding towards targets during motion |
EP1955668B1 (en) * | 2007-02-07 | 2012-04-04 | BrainLAB AG | Method and device for the determination of alignment information during sonographically navigable repositioning of bone fragments |
EP2114255A4 (en) * | 2007-03-03 | 2012-08-15 | Activiews Ltd | Method, system and computer product for planning needle procedures |
US7856130B2 (en) * | 2007-03-28 | 2010-12-21 | Eigen, Inc. | Object recognition system for medical imaging |
JP5527731B2 (en) * | 2007-04-16 | 2014-06-25 | NeuroArm Surgical Ltd. | Methods, devices, and systems useful for registration |
EP1982666A1 (en) | 2007-04-19 | 2008-10-22 | Weber Instrumente GmbH | Device for identifying a spatial position |
WO2008130354A1 (en) * | 2007-04-24 | 2008-10-30 | Medtronic, Inc. | Intraoperative image registration |
US8010177B2 (en) * | 2007-04-24 | 2011-08-30 | Medtronic, Inc. | Intraoperative image registration |
US20090012509A1 (en) * | 2007-04-24 | 2009-01-08 | Medtronic, Inc. | Navigated Soft Tissue Penetrating Laser System |
US9289270B2 (en) | 2007-04-24 | 2016-03-22 | Medtronic, Inc. | Method and apparatus for performing a navigated procedure |
US8934961B2 (en) | 2007-05-18 | 2015-01-13 | Biomet Manufacturing, Llc | Trackable diagnostic scope apparatus and methods of use |
US8506558B2 (en) | 2008-01-11 | 2013-08-13 | Oraya Therapeutics, Inc. | System and method for performing an ocular irradiation procedure |
EP2606864B1 (en) * | 2007-06-04 | 2016-09-14 | Oraya Therapeutics, Inc. | Assembly for positioning, stabilizing and treating an eye |
US8363783B2 (en) | 2007-06-04 | 2013-01-29 | Oraya Therapeutics, Inc. | Method and device for ocular alignment and coupling of ocular structures |
WO2008151446A1 (en) * | 2007-06-15 | 2008-12-18 | Orthosoft Inc. | Computer-assisted surgery system and method |
JP2009136523A (en) * | 2007-12-07 | 2009-06-25 | Ge Medical Systems Global Technology Co Llc | Ultrasonic diagnosis apparatus, radiofrequency wave cautery treatment device, ultrasonic diagnosis and treatment system, and ultrasonic diagnosis and treatment apparatus |
US20100256436A1 (en) * | 2007-07-31 | 2010-10-07 | Partsch Michael J | Device and method for treating hypertension via non-invasive neuromodulation |
US20100185042A1 (en) * | 2007-08-05 | 2010-07-22 | Schneider M Bret | Control and coordination of transcranial magnetic stimulation electromagnets for modulation of deep brain targets |
US20090099405A1 (en) * | 2007-08-05 | 2009-04-16 | Neostim, Inc. | Monophasic multi-coil arrays for transcranial magnetic stimulation |
US8956274B2 (en) * | 2007-08-05 | 2015-02-17 | Cervel Neurotech, Inc. | Transcranial magnetic stimulation field shaping |
WO2009055634A1 (en) * | 2007-10-24 | 2009-04-30 | Neostim Inc. | Intra-session control of transcranial magnetic stimulation |
WO2009023680A1 (en) * | 2007-08-13 | 2009-02-19 | Neostim, Inc. | Gantry and switches for position-based triggering of tms pulses in moving coils |
US20090048515A1 (en) * | 2007-08-14 | 2009-02-19 | Suri Jasjit S | Biopsy planning system |
US9179983B2 (en) | 2007-08-14 | 2015-11-10 | Zimmer, Inc. | Method of determining a contour of an anatomical structure and selecting an orthopaedic implant to replicate the anatomical structure |
AU2008288967A1 (en) * | 2007-08-20 | 2009-02-26 | Cervel Neurotech, Inc. | Firing patterns for deep brain transcranial magnetic stimulation |
WO2009033192A1 (en) * | 2007-09-09 | 2009-03-12 | Neostim, Inc. | Focused magnetic fields |
US8548569B2 (en) * | 2007-09-24 | 2013-10-01 | MRI Interventions, Inc. | Head fixation assemblies for medical procedures |
US8265949B2 (en) | 2007-09-27 | 2012-09-11 | Depuy Products, Inc. | Customized patient surgical plan |
US8905920B2 (en) | 2007-09-27 | 2014-12-09 | Covidien Lp | Bronchoscope adapter and method |
US7703653B2 (en) | 2007-09-28 | 2010-04-27 | Tyco Healthcare Group Lp | Articulation mechanism for surgical instrument |
CN102652687B (en) | 2007-09-30 | 2015-08-19 | DePuy Products, Inc. | Customized patient-specific orthopaedic surgical instrumentation |
US8265910B2 (en) * | 2007-10-09 | 2012-09-11 | Cervel Neurotech, Inc. | Display of modeled magnetic fields |
US9220398B2 (en) | 2007-10-11 | 2015-12-29 | Intuitive Surgical Operations, Inc. | System for managing Bowden cables in articulating instruments |
US8391952B2 (en) * | 2007-10-11 | 2013-03-05 | General Electric Company | Coil arrangement for an electromagnetic tracking system |
US8571277B2 (en) * | 2007-10-18 | 2013-10-29 | Eigen, Llc | Image interpolation for medical imaging |
US20100286468A1 (en) * | 2007-10-26 | 2010-11-11 | David J Mishelevich | Transcranial magnetic stimulation with protection of magnet-adjacent structures |
US20090118641A1 (en) * | 2007-11-02 | 2009-05-07 | Jacques Van Dam | Devices, Methods, and Kits for a Biopsy Device |
US7942829B2 (en) * | 2007-11-06 | 2011-05-17 | Eigen, Inc. | Biopsy planning and display apparatus |
KR20090066776A (en) * | 2007-12-20 | 2009-06-24 | Electronics and Telecommunications Research Institute | Localization service framework for estimating robot position and method thereof |
AU2008341062B2 (en) * | 2007-12-21 | 2014-12-18 | Smith & Nephew, Inc. | Multiple portal guide |
US9826992B2 (en) * | 2007-12-21 | 2017-11-28 | Smith & Nephew, Inc. | Multiple portal guide |
US7801271B2 (en) | 2007-12-23 | 2010-09-21 | Oraya Therapeutics, Inc. | Methods and devices for orthovoltage ocular radiotherapy and treatment planning |
EP3272395B1 (en) | 2007-12-23 | 2019-07-17 | Carl Zeiss Meditec, Inc. | Devices for detecting, controlling, and predicting radiation delivery |
US8571637B2 (en) | 2008-01-21 | 2013-10-29 | Biomet Manufacturing, Llc | Patella tracking method and apparatus for use in surgical navigation |
US20090324041A1 (en) * | 2008-01-23 | 2009-12-31 | Eigen, Llc | Apparatus for real-time 3d biopsy |
EP2249690B1 (en) | 2008-02-06 | 2021-09-29 | Intuitive Surgical Operations, Inc. | A segmented instrument having braking capabilities |
US8182418B2 (en) | 2008-02-25 | 2012-05-22 | Intuitive Surgical Operations, Inc. | Systems and methods for articulating an elongate body |
US20100001996A1 (en) * | 2008-02-28 | 2010-01-07 | Eigen, Llc | Apparatus for guiding towards targets during motion using GPU processing |
US20090222011A1 (en) * | 2008-02-28 | 2009-09-03 | Warsaw Orthopedic, Inc. | Targeting surgical instrument for use in spinal disc replacement and methods for use in spinal disc replacement |
WO2009117833A1 (en) * | 2008-03-25 | 2009-10-01 | Orthosoft Inc. | Method and system for planning/guiding alterations to a bone |
US9575140B2 (en) | 2008-04-03 | 2017-02-21 | Covidien Lp | Magnetic interference detection system and method |
EP2106765B1 (en) | 2008-04-03 | 2013-04-03 | BrainLAB AG | Pictorial orientation aid for medical instruments |
US8549888B2 (en) | 2008-04-04 | 2013-10-08 | Nuvasive, Inc. | System and device for designing and forming a surgical implant |
JP5318643B2 (en) * | 2008-04-23 | 2013-10-16 | Central Research Institute of Electric Power Industry | Method, apparatus and program for evaluating thermal insulation performance of coating layer |
JP5318656B2 (en) * | 2008-05-27 | 2013-10-16 | Central Research Institute of Electric Power Industry | Method, apparatus and program for evaluating thermal insulation performance of coating layer |
EP2297673B1 (en) | 2008-06-03 | 2020-04-22 | Covidien LP | Feature-based registration method |
US8218847B2 (en) | 2008-06-06 | 2012-07-10 | Superdimension, Ltd. | Hybrid registration method |
CA2743458C (en) * | 2008-06-18 | 2016-08-16 | Eyelab Group, Llc | System and method for determining volume-related parameters of ocular and other biological tissues |
US8932207B2 (en) | 2008-07-10 | 2015-01-13 | Covidien Lp | Integrated multi-functional endoscopic tool |
US8551074B2 (en) | 2008-09-08 | 2013-10-08 | Bayer Pharma AG | Connector system having a compressible sealing element and a flared fluid path element |
JP4253356B1 (en) * | 2008-09-13 | 2009-04-08 | Hidemi Miyazaki | Navigation surgical antenna holder |
US11298113B2 (en) | 2008-10-01 | 2022-04-12 | Covidien Lp | Device for needle biopsy with integrated needle protection |
US9782565B2 (en) | 2008-10-01 | 2017-10-10 | Covidien Lp | Endoscopic ultrasound-guided biliary access system |
US9186128B2 (en) | 2008-10-01 | 2015-11-17 | Covidien Lp | Needle biopsy device |
US8968210B2 (en) | 2008-10-01 | 2015-03-03 | Covidien LP | Device for needle biopsy with integrated needle protection |
US9332973B2 (en) | 2008-10-01 | 2016-05-10 | Covidien Lp | Needle biopsy device with exchangeable needle and integrated needle protection |
US8388659B1 (en) | 2008-10-17 | 2013-03-05 | Theken Spine, Llc | Spondylolisthesis screw and instrument for implantation |
US8795148B2 (en) * | 2009-10-26 | 2014-08-05 | Cervel Neurotech, Inc. | Sub-motor-threshold stimulation of deep brain targets using transcranial magnetic stimulation |
US8175681B2 (en) | 2008-12-16 | 2012-05-08 | Medtronic Navigation Inc. | Combination of electromagnetic and electropotential localization |
US8723628B2 (en) | 2009-01-07 | 2014-05-13 | Cervel Neurotech, Inc. | Shaped coils for transcranial magnetic stimulation |
US10070849B2 (en) * | 2009-02-20 | 2018-09-11 | Covidien Lp | Marking articulating direction for surgical instrument |
US8366719B2 (en) | 2009-03-18 | 2013-02-05 | Integrated Spinal Concepts, Inc. | Image-guided minimal-step placement of screw into bone |
US10004387B2 (en) | 2009-03-26 | 2018-06-26 | Intuitive Surgical Operations, Inc. | Method and system for assisting an operator in endoscopic navigation |
US8337397B2 (en) | 2009-03-26 | 2012-12-25 | Intuitive Surgical Operations, Inc. | Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient |
US8611984B2 (en) | 2009-04-08 | 2013-12-17 | Covidien Lp | Locatable catheter |
US20110009739A1 (en) * | 2009-07-13 | 2011-01-13 | Phillips Scott B | Transcranial ultrasound transducer with stereotactic conduit for placement of ventricular catheter |
US10039607B2 (en) | 2009-08-27 | 2018-08-07 | Brainlab Ag | Disposable and radiolucent reference array for optical tracking |
US8494613B2 (en) | 2009-08-31 | 2013-07-23 | Medtronic, Inc. | Combination localization system |
US8494614B2 (en) | 2009-08-31 | 2013-07-23 | Regents Of The University Of Minnesota | Combination localization system |
US8409098B2 (en) * | 2009-10-14 | 2013-04-02 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Method and apparatus for collection of cardiac geometry based on optical or magnetic tracking |
US8376938B2 (en) * | 2009-11-20 | 2013-02-19 | Ethicon Endo-Surgery, Inc. | Discrete flexion head for single port device |
US8435174B2 (en) * | 2009-12-11 | 2013-05-07 | Ethicon Endo-Surgery, Inc. | Methods and devices for accessing a body cavity |
US8231570B2 (en) * | 2009-12-11 | 2012-07-31 | Ethicon Endo-Surgery, Inc. | Inverted conical expandable retractor |
US8460186B2 (en) * | 2009-12-11 | 2013-06-11 | Ethicon Endo-Surgery, Inc. | Methods and devices for providing access through tissue to a surgical site |
US8517932B2 (en) * | 2009-12-11 | 2013-08-27 | Ethicon Endo-Surgery, Inc. | Methods and devices for providing access through tissue to a surgical site |
US8414483B2 (en) * | 2009-12-11 | 2013-04-09 | Ethicon Endo-Surgery, Inc. | Methods and devices for providing access into a body cavity |
US8282546B2 (en) * | 2009-12-11 | 2012-10-09 | Ethicon Endo-Surgery, Inc. | Inverted conical expandable retractor with coil spring |
US8353873B2 (en) * | 2009-12-11 | 2013-01-15 | Ethicon Endo-Surgery, Inc. | Methods and devices for providing access through tissue to a surgical site |
US8357088B2 (en) * | 2009-12-11 | 2013-01-22 | Ethicon Endo-Surgery, Inc. | Methods and devices for providing access into a body cavity |
US8500633B2 (en) * | 2009-12-11 | 2013-08-06 | Ethicon Endo-Surgery, Inc. | Methods and devices for providing surgical access through tissue to a surgical site |
US8444557B2 (en) * | 2009-12-11 | 2013-05-21 | Ethicon Endo-Surgery, Inc. | Methods and devices for providing access through tissue to a surgical site |
US8571282B2 (en) * | 2009-12-24 | 2013-10-29 | Albert Davydov | Method and apparatus for measuring spinal characteristics of a patient |
US9011448B2 (en) * | 2009-12-31 | 2015-04-21 | Orthosensor Inc. | Orthopedic navigation system with sensorized devices |
JP5624328B2 (en) * | 2010-01-04 | 2014-11-12 | Toshiba Corporation | Medical image diagnostic apparatus and image processing apparatus |
EP2526377A4 (en) * | 2010-01-19 | 2014-03-12 | Orthosoft Inc | Tracking system and method |
US8428328B2 (en) | 2010-02-01 | 2013-04-23 | Superdimension, Ltd. | Region-growing algorithm |
IT1401669B1 (en) * | 2010-04-07 | 2013-08-02 | Sofar S.p.A. | Robotic surgery system with improved control |
JP5553672B2 (en) * | 2010-04-26 | 2014-07-16 | Canon Inc. | Acoustic wave measuring apparatus and acoustic wave measuring method |
CA2797302C (en) | 2010-04-28 | 2019-01-15 | Ryerson University | System and methods for intraoperative guidance feedback |
EP2566392A4 (en) | 2010-05-04 | 2015-07-15 | Pathfinder Therapeutics Inc | System and method for abdominal surface matching using pseudo-features |
WO2011159834A1 (en) | 2010-06-15 | 2011-12-22 | Superdimension, Ltd. | Locatable expandable working channel and method |
US8672837B2 (en) | 2010-06-24 | 2014-03-18 | Hansen Medical, Inc. | Methods and devices for controlling a shapeable medical device |
WO2012009603A2 (en) | 2010-07-16 | 2012-01-19 | Cervel Neurotech, Inc. | Transcranial magnetic stimulation for altering susceptibility of tissue to pharmaceuticals and radiation |
US20130303887A1 (en) | 2010-08-20 | 2013-11-14 | Veran Medical Technologies, Inc. | Apparatus and method for four dimensional soft tissue navigation |
US20120316486A1 (en) | 2010-08-20 | 2012-12-13 | Andrew Cheung | Surgical Component Navigation Systems And Methods |
US20120046536A1 (en) * | 2010-08-20 | 2012-02-23 | Manhattan Technologies, Llc | Surgical Instrument Navigation Systems and Methods |
US8551108B2 (en) | 2010-08-31 | 2013-10-08 | Orthosoft Inc. | Tool and method for digital acquisition of a tibial mechanical axis |
US8702592B2 (en) * | 2010-09-30 | 2014-04-22 | David Allan Langlois | System and method for inhibiting injury to a patient during laparoscopic surgery |
US11231787B2 (en) | 2010-10-06 | 2022-01-25 | Nuvasive, Inc. | Imaging system and method for use in surgical and interventional medical procedures |
US8526700B2 (en) | 2010-10-06 | 2013-09-03 | Robert E. Isaacs | Imaging system and method for surgical and interventional medical procedures |
US9785246B2 (en) | 2010-10-06 | 2017-10-10 | Nuvasive, Inc. | Imaging system and method for use in surgical and interventional medical procedures |
US8603078B2 (en) | 2010-10-13 | 2013-12-10 | Ethicon Endo-Surgery, Inc. | Methods and devices for guiding and supporting surgical instruments |
US9223418B2 (en) | 2010-12-15 | 2015-12-29 | Microsoft Technology Licensing, Llc | Pen digitizer |
US9378444B2 (en) * | 2010-12-23 | 2016-06-28 | Microsoft Technology Licensing, Llc | Encoded micro pattern |
US9921712B2 (en) | 2010-12-29 | 2018-03-20 | Mako Surgical Corp. | System and method for providing substantially stable control of a surgical tool |
US9119655B2 (en) | 2012-08-03 | 2015-09-01 | Stryker Corporation | Surgical manipulator capable of controlling a surgical instrument in multiple modes |
USD667549S1 (en) * | 2011-03-03 | 2012-09-18 | Karl Storz GmbH & Co. KG | Cervical localization instrument |
US9308050B2 (en) | 2011-04-01 | 2016-04-12 | Ecole Polytechnique Federale De Lausanne (Epfl) | Robotic system and method for spinal and other surgeries |
US11911117B2 (en) | 2011-06-27 | 2024-02-27 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US10219811B2 (en) | 2011-06-27 | 2019-03-05 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US9498231B2 (en) | 2011-06-27 | 2016-11-22 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US8617176B2 (en) | 2011-08-24 | 2013-12-31 | Depuy Mitek, Llc | Cross pinning guide devices and methods |
WO2013033566A1 (en) | 2011-09-02 | 2013-03-07 | Stryker Corporation | Surgical instrument including a cutting accessory extending from a housing and actuators that establish the position of the cutting accessory relative to the housing |
WO2013044043A1 (en) * | 2011-09-22 | 2013-03-28 | Kyle Robert Lynn | Ultrasound tracking adapter |
US9452276B2 (en) | 2011-10-14 | 2016-09-27 | Intuitive Surgical Operations, Inc. | Catheter with removable vision probe |
US10238837B2 (en) | 2011-10-14 | 2019-03-26 | Intuitive Surgical Operations, Inc. | Catheters with control modes for interchangeable probes |
US20130303944A1 (en) | 2012-05-14 | 2013-11-14 | Intuitive Surgical Operations, Inc. | Off-axis electromagnetic sensor |
US9387048B2 (en) | 2011-10-14 | 2016-07-12 | Intuitive Surgical Operations, Inc. | Catheter sensor systems |
US9510771B1 (en) | 2011-10-28 | 2016-12-06 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
US10159456B2 (en) | 2011-11-22 | 2018-12-25 | Ge Medical Systems Israel, Ltd | Systems and methods for biopsy guidance using a biopsy unit including at least one of an imaging detector or ultrasound probe concurrently mounted with a biopsy guide |
RU2619990C2 (en) * | 2011-12-27 | 2017-05-22 | Koninklijke Philips N.V. | Intraoperative monitoring of tracking system quality |
EP4056111A3 (en) | 2012-02-22 | 2022-12-07 | Veran Medical Technologies, Inc. | Systems, methods, and devices for four dimensional soft tissue navigation |
TWI463964B (en) * | 2012-03-03 | 2014-12-11 | Univ China Medical | System and apparatus for an image guided navigation system in surgery |
WO2013134623A1 (en) | 2012-03-08 | 2013-09-12 | Neutar, Llc | Patient and procedure customized fixation and targeting devices for stereotactic frames |
US11207132B2 (en) | 2012-03-12 | 2021-12-28 | Nuvasive, Inc. | Systems and methods for performing spinal surgery |
CN103479403B (en) | 2012-06-08 | 2016-06-22 | Chang Gung University | System and method for guiding focused ultrasound energy delivery with a surgical navigation system |
US11298196B2 (en) | 2012-06-21 | 2022-04-12 | Globus Medical Inc. | Surgical robotic automation with tracking markers and controlled tool advancement |
US11253327B2 (en) | 2012-06-21 | 2022-02-22 | Globus Medical, Inc. | Systems and methods for automatically changing an end-effector on a surgical robot |
US10350013B2 (en) | 2012-06-21 | 2019-07-16 | Globus Medical, Inc. | Surgical tool systems and methods |
US11857149B2 (en) | 2012-06-21 | 2024-01-02 | Globus Medical, Inc. | Surgical robotic systems with target trajectory deviation monitoring and related methods |
US11864839B2 (en) | 2012-06-21 | 2024-01-09 | Globus Medical Inc. | Methods of adjusting a virtual implant and related surgical navigation systems |
US10231791B2 (en) | 2012-06-21 | 2019-03-19 | Globus Medical, Inc. | Infrared signal based position recognition system for use with a robot-assisted surgery |
US10136954B2 (en) | 2012-06-21 | 2018-11-27 | Globus Medical, Inc. | Surgical tool systems and method |
US10624710B2 (en) | 2012-06-21 | 2020-04-21 | Globus Medical, Inc. | System and method for measuring depth of instrumentation |
US11317971B2 (en) | 2012-06-21 | 2022-05-03 | Globus Medical, Inc. | Systems and methods related to robotic guidance in surgery |
US11399900B2 (en) | 2012-06-21 | 2022-08-02 | Globus Medical, Inc. | Robotic systems providing co-registration using natural fiducials and related methods |
US11793570B2 (en) | 2012-06-21 | 2023-10-24 | Globus Medical Inc. | Surgical robotic automation with tracking markers |
US11395706B2 (en) | 2012-06-21 | 2022-07-26 | Globus Medical Inc. | Surgical robot platform |
US11116576B2 (en) | 2012-06-21 | 2021-09-14 | Globus Medical Inc. | Dynamic reference arrays and methods of use |
US11857266B2 (en) | 2012-06-21 | 2024-01-02 | Globus Medical, Inc. | System for a surveillance marker in robotic-assisted surgery |
JP2015528713A (en) | 2012-06-21 | 2015-10-01 | Globus Medical, Inc. | Surgical robot platform |
US11607149B2 (en) | 2012-06-21 | 2023-03-21 | Globus Medical Inc. | Surgical tool systems and method |
US11864745B2 (en) | 2012-06-21 | 2024-01-09 | Globus Medical, Inc. | Surgical robotic system with retractor |
US11045267B2 (en) | 2012-06-21 | 2021-06-29 | Globus Medical, Inc. | Surgical robotic automation with tracking markers |
US10758315B2 (en) | 2012-06-21 | 2020-09-01 | Globus Medical Inc. | Method and system for improving 2D-3D registration convergence |
US20130345757A1 (en) | 2012-06-22 | 2013-12-26 | Shawn D. Stad | Image Guided Intra-Operative Contouring Aid |
US9226796B2 (en) | 2012-08-03 | 2016-01-05 | Stryker Corporation | Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path |
WO2014022786A2 (en) | 2012-08-03 | 2014-02-06 | Stryker Corporation | Systems and methods for robotic surgery |
US9820818B2 (en) | 2012-08-03 | 2017-11-21 | Stryker Corporation | System and method for controlling a surgical manipulator based on implant parameters |
US9993305B2 (en) | 2012-08-08 | 2018-06-12 | Ortoma Ab | Method and system for computer assisted surgery |
WO2014032041A1 (en) * | 2012-08-24 | 2014-02-27 | Old Dominion University Research Foundation | Method and system for image registration |
US9008757B2 (en) | 2012-09-26 | 2015-04-14 | Stryker Corporation | Navigation system including optical and non-optical sensors |
US9339309B1 (en) | 2012-10-11 | 2016-05-17 | Nuvasive, Inc. | Systems and methods for inserting cross-connectors |
US9269140B2 (en) | 2012-12-10 | 2016-02-23 | The Cleveland Clinic Foundation | Image fusion with automated compensation for brain deformation |
RU2663649C2 (en) | 2013-02-28 | 2018-08-07 | Koninklijke Philips N.V. | Segmentation of large objects from multiple three-dimensional views |
CN108175503B (en) | 2013-03-13 | 2022-03-18 | 史赛克公司 | System for arranging objects in an operating room in preparation for a surgical procedure |
WO2014165060A2 (en) | 2013-03-13 | 2014-10-09 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US9057600B2 (en) | 2013-03-13 | 2015-06-16 | Hansen Medical, Inc. | Reducing incremental measurement sensor error |
US9241742B2 (en) | 2013-03-14 | 2016-01-26 | DePuy Synthes Products, Inc. | Methods and devices for polyaxial screw alignment |
US10347380B2 (en) * | 2013-03-14 | 2019-07-09 | Think Surgical, Inc. | Intra-operative registration of anatomical structures |
US9014851B2 (en) | 2013-03-15 | 2015-04-21 | Hansen Medical, Inc. | Systems and methods for tracking robotically controlled medical instruments |
US10105149B2 (en) | 2013-03-15 | 2018-10-23 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US9283046B2 (en) | 2013-03-15 | 2016-03-15 | Hansen Medical, Inc. | User interface for active drive apparatus with finite range of motion |
US9271663B2 (en) | 2013-03-15 | 2016-03-01 | Hansen Medical, Inc. | Flexible instrument localization from both remote and elongation sensors |
US9629595B2 (en) | 2013-03-15 | 2017-04-25 | Hansen Medical, Inc. | Systems and methods for localizing, tracking and/or controlling medical instruments |
US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
US10854111B2 (en) | 2013-06-12 | 2020-12-01 | University Of Florida Research Foundation, Inc. | Simulation system and methods for surgical training |
WO2015003224A1 (en) | 2013-07-09 | 2015-01-15 | Cryptych Pty Ltd | Spinal surgery navigation |
FR3010628B1 (en) | 2013-09-18 | 2015-10-16 | Medicrea International | METHOD FOR REALIZING THE IDEAL CURVATURE OF A ROD OF A VERTEBRAL OSTEOSYNTHESIS EQUIPMENT FOR STRENGTHENING THE VERTEBRAL COLUMN OF A PATIENT |
US9283048B2 (en) | 2013-10-04 | 2016-03-15 | KB Medical SA | Apparatus and systems for precise guidance of surgical tools |
US9848922B2 (en) | 2013-10-09 | 2017-12-26 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
FR3012030B1 (en) | 2013-10-18 | 2015-12-25 | Medicrea International | METHOD FOR REALIZING THE IDEAL CURVATURE OF A ROD OF A VERTEBRAL OSTEOSYNTHESIS EQUIPMENT FOR STRENGTHENING THE VERTEBRAL COLUMN OF A PATIENT |
WO2015066280A1 (en) * | 2013-10-30 | 2015-05-07 | Brigham And Women's Hospital, Inc. | Ventriculostomy guidance device |
US9241771B2 (en) | 2014-01-15 | 2016-01-26 | KB Medical SA | Notched apparatus for guidance of an insertable instrument along an axis during spinal surgery |
WO2015121311A1 (en) | 2014-02-11 | 2015-08-20 | KB Medical SA | Sterile handle for controlling a robotic surgical system from a sterile field |
EP3110361A4 (en) * | 2014-02-28 | 2017-11-01 | Cedars-Sinai Medical Center | Surgical drape |
EP2923669B1 (en) | 2014-03-24 | 2017-06-28 | Hansen Medical, Inc. | Systems and devices for catheter driving instinctiveness |
US10537393B2 (en) * | 2014-04-04 | 2020-01-21 | Izi Medical Products, Llc | Medical device for surgical navigation system and corresponding method of manufacturing |
US20150305612A1 (en) | 2014-04-23 | 2015-10-29 | Mark Hunter | Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter |
US20150305650A1 (en) | 2014-04-23 | 2015-10-29 | Mark Hunter | Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue |
CN106659537B (en) | 2014-04-24 | 2019-06-11 | Kb医疗公司 | The surgical instrument holder used in conjunction with robotic surgical system |
JP2017515593A (en) | 2014-05-13 | 2017-06-15 | ビコール メディカル,インコーポレイティド | Guide system attachment for a surgical introducer |
US10952593B2 (en) | 2014-06-10 | 2021-03-23 | Covidien Lp | Bronchoscope adapter |
US9974500B2 (en) | 2014-07-11 | 2018-05-22 | Ge Medical Systems Israel, Ltd. | Systems and methods for open imaging |
EP3169252A1 (en) | 2014-07-14 | 2017-05-24 | KB Medical SA | Anti-skid surgical instrument for use in preparing holes in bone tissue |
US10357378B2 (en) * | 2014-07-22 | 2019-07-23 | Zimmer, Inc. | Devices and methods for trochanteric osteotomy |
US9993177B2 (en) | 2014-08-28 | 2018-06-12 | DePuy Synthes Products, Inc. | Systems and methods for intraoperatively measuring anatomical orientation |
US9950194B2 (en) | 2014-09-09 | 2018-04-24 | Mevion Medical Systems, Inc. | Patient positioning system |
US10070972B2 (en) | 2014-10-03 | 2018-09-11 | Hospital For Special Surgery | System and method for intraoperative joint contact mechanics measurement |
USD820984S1 (en) | 2014-10-07 | 2018-06-19 | Synaptive Medical (Barbados) Inc. | Pointer tool |
US10433893B1 (en) | 2014-10-17 | 2019-10-08 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
USD788915S1 (en) * | 2014-10-17 | 2017-06-06 | Synaptive Medical (Barbados) Inc. | Port tracking tool |
FR3030222B1 (en) * | 2014-12-23 | 2021-09-24 | Yann Glard | SURGICAL GUIDANCE SYSTEM |
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
EP3258872B1 (en) | 2015-02-18 | 2023-04-26 | KB Medical SA | Systems for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique |
DE102015204867A1 (en) | 2015-03-18 | 2016-09-22 | Kuka Roboter Gmbh | Robot system and method for operating a teleoperative process |
CN107645924B (en) | 2015-04-15 | 2021-04-20 | 莫比乌斯成像公司 | Integrated medical imaging and surgical robotic system |
US10426555B2 (en) | 2015-06-03 | 2019-10-01 | Covidien Lp | Medical instrument with sensor for use in a system and method for electromagnetic navigation |
US10058394B2 (en) | 2015-07-31 | 2018-08-28 | Globus Medical, Inc. | Robot arm and methods of use |
US10646298B2 (en) | 2015-07-31 | 2020-05-12 | Globus Medical, Inc. | Robot arm and methods of use |
US10080615B2 (en) | 2015-08-12 | 2018-09-25 | Globus Medical, Inc. | Devices and methods for temporary mounting of parts to bone |
JP6894431B2 (en) | 2015-08-31 | 2021-06-30 | ケービー メディカル エスアー | Robotic surgical system and method |
US10034716B2 (en) | 2015-09-14 | 2018-07-31 | Globus Medical, Inc. | Surgical robotic systems and methods thereof |
US9727963B2 (en) | 2015-09-18 | 2017-08-08 | Auris Surgical Robotics, Inc. | Navigation of tubular networks |
US10716544B2 (en) | 2015-10-08 | 2020-07-21 | Zmk Medical Technologies Inc. | System for 3D multi-parametric ultrasound imaging |
US9771092B2 (en) | 2015-10-13 | 2017-09-26 | Globus Medical, Inc. | Stabilizer wheel assembly and methods of use |
US10327861B2 (en) | 2015-10-22 | 2019-06-25 | Straight Shot, LLC | Surgical implant alignment device |
US9962134B2 (en) | 2015-10-28 | 2018-05-08 | Medtronic Navigation, Inc. | Apparatus and method for maintaining image quality while minimizing X-ray dosage of a patient |
WO2017079655A2 (en) | 2015-11-04 | 2017-05-11 | Mcafee Paul C | Methods and apparatus for spinal reconstructive surgery and measuring spinal length and intervertebral spacing, tension and rotation |
AU2015414802B2 (en) * | 2015-11-19 | 2020-12-24 | Eos Imaging | Method of preoperative planning to correct spine misalignment of a patient |
US10143526B2 (en) | 2015-11-30 | 2018-12-04 | Auris Health, Inc. | Robot-assisted driving systems and methods |
EP3184071A1 (en) | 2015-12-22 | 2017-06-28 | SpineMind AG | Device for intraoperative image-guided navigation during surgical interventions in the vicinity of the spine and the adjoining thoracic, pelvis or head area |
US9554411B1 (en) | 2015-12-30 | 2017-01-24 | DePuy Synthes Products, Inc. | Systems and methods for wirelessly powering or communicating with sterile-packed devices |
US10335241B2 (en) | 2015-12-30 | 2019-07-02 | DePuy Synthes Products, Inc. | Method and apparatus for intraoperative measurements of anatomical orientation |
KR20180099702A (en) | 2015-12-31 | 2018-09-05 | 스트리커 코포레이션 | System and method for performing surgery on a patient at a target site defined by a virtual object |
US10448910B2 (en) | 2016-02-03 | 2019-10-22 | Globus Medical, Inc. | Portable medical imaging system |
US10117632B2 (en) | 2016-02-03 | 2018-11-06 | Globus Medical, Inc. | Portable medical imaging system with beam scanning collimator |
US11058378B2 (en) | 2016-02-03 | 2021-07-13 | Globus Medical, Inc. | Portable medical imaging system |
US11883217B2 (en) | 2016-02-03 | 2024-01-30 | Globus Medical, Inc. | Portable medical imaging system and method |
US10842453B2 (en) | 2016-02-03 | 2020-11-24 | Globus Medical, Inc. | Portable medical imaging system |
WO2017139556A1 (en) | 2016-02-12 | 2017-08-17 | Medos International Sarl | Systems and methods for intraoperatively measuring anatomical orientation |
US10866119B2 (en) | 2016-03-14 | 2020-12-15 | Globus Medical, Inc. | Metal detector for detecting insertion of a surgical device into a hollow tube |
US10478254B2 (en) | 2016-05-16 | 2019-11-19 | Covidien Lp | System and method to access lung tissue |
EP3484376A4 (en) | 2016-07-12 | 2020-07-22 | Mobius Imaging LLC | Multi-stage dilator and cannula system and method |
US11037464B2 (en) | 2016-07-21 | 2021-06-15 | Auris Health, Inc. | System with emulator movement tracking for controlling medical devices |
EP3506817A4 (en) | 2016-08-30 | 2020-07-22 | The Regents of The University of California | Methods for biomedical targeting and delivery and devices and systems for practicing the same |
US10820835B2 (en) | 2016-09-12 | 2020-11-03 | Medos International Sarl | Systems and methods for anatomical alignment |
WO2018053282A1 (en) | 2016-09-16 | 2018-03-22 | GYS Tech, LLC d/b/a Cardan Robotics | System and method for mounting a robotic arm in a surgical robotic system |
US20180098816A1 (en) * | 2016-10-06 | 2018-04-12 | Biosense Webster (Israel) Ltd. | Pre-Operative Registration of Anatomical Images with a Position-Tracking System Using Ultrasound |
WO2018075784A1 (en) | 2016-10-21 | 2018-04-26 | Syverson Benjamin | Methods and systems for setting trajectories and target locations for image guided surgery |
US11751948B2 (en) | 2016-10-25 | 2023-09-12 | Mobius Imaging, Llc | Methods and systems for robot-assisted surgery |
US11653979B2 (en) | 2016-10-27 | 2023-05-23 | Leucadia 6, Llc | Intraoperative fluoroscopic registration of vertebral bodies |
US10638952B2 (en) | 2016-10-28 | 2020-05-05 | Covidien Lp | Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system |
US10615500B2 (en) | 2016-10-28 | 2020-04-07 | Covidien Lp | System and method for designing electromagnetic navigation antenna assemblies |
US10517505B2 (en) | 2016-10-28 | 2019-12-31 | Covidien Lp | Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system |
US10722311B2 (en) | 2016-10-28 | 2020-07-28 | Covidien Lp | System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map |
US10446931B2 (en) | 2016-10-28 | 2019-10-15 | Covidien Lp | Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same |
US10792106B2 (en) | 2016-10-28 | 2020-10-06 | Covidien Lp | System for calibrating an electromagnetic navigation system |
US10751126B2 (en) | 2016-10-28 | 2020-08-25 | Covidien Lp | System and method for generating a map for electromagnetic navigation |
US10418705B2 (en) | 2016-10-28 | 2019-09-17 | Covidien Lp | Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same |
US10376258B2 (en) | 2016-11-07 | 2019-08-13 | Vycor Medical, Inc. | Surgical introducer with guidance system receptacle |
US10543016B2 (en) | 2016-11-07 | 2020-01-28 | Vycor Medical, Inc. | Surgical introducer with guidance system receptacle |
US10510171B2 (en) | 2016-11-29 | 2019-12-17 | Biosense Webster (Israel) Ltd. | Visualization of anatomical cavities |
WO2018109556A1 (en) | 2016-12-12 | 2018-06-21 | Medicrea International | Systems and methods for patient-specific spinal implants |
EP3554414A1 (en) | 2016-12-16 | 2019-10-23 | MAKO Surgical Corp. | Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site |
US10244926B2 (en) | 2016-12-28 | 2019-04-02 | Auris Health, Inc. | Detecting endolumenal buckling of flexible instruments |
JP7233841B2 (en) | 2017-01-18 | 2023-03-07 | ケービー メディカル エスアー | Robotic Navigation for Robotic Surgical Systems |
US11071594B2 (en) | 2017-03-16 | 2021-07-27 | KB Medical SA | Robotic navigation of robotic surgical systems |
US10682129B2 (en) | 2017-03-23 | 2020-06-16 | Mobius Imaging, Llc | Robotic end effector with adjustable inner diameter |
CN108990412B (en) | 2017-03-31 | 2022-03-22 | 奥瑞斯健康公司 | Robot system for cavity network navigation compensating physiological noise |
US11089975B2 (en) | 2017-03-31 | 2021-08-17 | DePuy Synthes Products, Inc. | Systems, devices and methods for enhancing operative accuracy using inertial measurement units |
EP4108201B1 (en) | 2017-04-21 | 2024-03-27 | Medicrea International | A system for developing one or more patient-specific spinal implants |
EP4344658A2 (en) | 2017-05-10 | 2024-04-03 | MAKO Surgical Corp. | Robotic spine surgery system |
US11033341B2 (en) | 2017-05-10 | 2021-06-15 | Mako Surgical Corp. | Robotic spine surgery system and methods |
US10980509B2 (en) * | 2017-05-11 | 2021-04-20 | Siemens Medical Solutions Usa, Inc. | Deformable registration of preoperative volumes and intraoperative ultrasound images from a tracked transducer |
US10022192B1 (en) | 2017-06-23 | 2018-07-17 | Auris Health, Inc. | Automatically-initialized robotic systems for navigation of luminal networks |
JP7229989B2 (en) | 2017-07-17 | 2023-02-28 | ボイジャー セラピューティクス インコーポレイテッド | Trajectory array guide system |
US10675094B2 (en) | 2017-07-21 | 2020-06-09 | Globus Medical Inc. | Robot surgical platform |
AU2018214021A1 (en) * | 2017-08-10 | 2019-02-28 | Biosense Webster (Israel) Ltd. | Method and apparatus for performing facial registration |
US11660145B2 (en) | 2017-08-11 | 2023-05-30 | Mobius Imaging Llc | Method and apparatus for attaching a reference marker to a patient |
US11534211B2 (en) | 2017-10-04 | 2022-12-27 | Mobius Imaging Llc | Systems and methods for performing lateral-access spine surgery |
AU2018346790A1 (en) | 2017-10-05 | 2020-04-30 | Mobius Imaging, Llc | Methods and systems for performing computer assisted surgery |
US11058493B2 (en) | 2017-10-13 | 2021-07-13 | Auris Health, Inc. | Robotic system configured for navigation path tracing |
US10555778B2 (en) | 2017-10-13 | 2020-02-11 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
US10639079B2 (en) | 2017-10-24 | 2020-05-05 | Straight Shot, LLC | Surgical implant alignment device |
US10792052B2 (en) * | 2017-10-26 | 2020-10-06 | Arthrex, Inc. | Surgical drill guide |
US11219489B2 (en) | 2017-10-31 | 2022-01-11 | Covidien Lp | Devices and systems for providing sensors in parallel with medical tools |
US11382666B2 (en) | 2017-11-09 | 2022-07-12 | Globus Medical Inc. | Methods providing bend plans for surgical rods and related controllers and computer program products |
US11794338B2 (en) | 2017-11-09 | 2023-10-24 | Globus Medical Inc. | Robotic rod benders and related mechanical and motor housings |
FR3073135B1 (en) | 2017-11-09 | 2019-11-15 | Quantum Surgical | ROBOTIC DEVICE FOR MINI-INVASIVE MEDICAL INTERVENTION ON SOFT TISSUE |
EP3492032B1 (en) | 2017-11-09 | 2023-01-04 | Globus Medical, Inc. | Surgical robotic systems for bending surgical rods |
US11134862B2 (en) | 2017-11-10 | 2021-10-05 | Globus Medical, Inc. | Methods of selecting surgical implants and related devices |
US10918422B2 (en) | 2017-12-01 | 2021-02-16 | Medicrea International | Method and apparatus for inhibiting proximal junctional failure |
CN110831534B (en) | 2017-12-08 | 2023-04-28 | 奥瑞斯健康公司 | System and method for medical instrument navigation and targeting |
EP3684562A4 (en) | 2017-12-14 | 2021-06-30 | Auris Health, Inc. | System and method for estimating instrument location |
WO2019125964A1 (en) | 2017-12-18 | 2019-06-27 | Auris Health, Inc. | Methods and systems for instrument tracking and navigation within luminal networks |
CN111970984A (en) | 2018-01-12 | 2020-11-20 | Peter L. Bono | Robot operation control system
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
CA3092738A1 (en) * | 2018-03-01 | 2019-09-06 | Archeoptix Biomedical Inc. | System for intracranial imaging and treatment |
US10524866B2 (en) | 2018-03-28 | 2020-01-07 | Auris Health, Inc. | Systems and methods for registration of location sensors |
JP7225259B2 (en) | 2018-03-28 | 2023-02-20 | オーリス ヘルス インコーポレイテッド | Systems and methods for indicating probable location of instruments |
US10573023B2 (en) | 2018-04-09 | 2020-02-25 | Globus Medical, Inc. | Predictive visualization of medical imaging scanner component movement |
WO2019222495A1 (en) | 2018-05-18 | 2019-11-21 | Auris Health, Inc. | Controllers for robotically-enabled teleoperated systems |
US11191594B2 (en) | 2018-05-25 | 2021-12-07 | Mako Surgical Corp. | Versatile tracking arrays for a navigation system and methods of recovering registration using the same |
CN109700529B (en) * | 2018-05-29 | 2021-09-03 | 常州锦瑟医疗信息科技有限公司 | Navigation system for bendable rigid tissue |
WO2019231895A1 (en) | 2018-05-30 | 2019-12-05 | Auris Health, Inc. | Systems and methods for location sensor-based branch prediction |
MX2020012904A (en) | 2018-05-31 | 2021-02-26 | Auris Health Inc | Image-based airway analysis and mapping. |
EP3801189A4 (en) | 2018-05-31 | 2022-02-23 | Auris Health, Inc. | Path-based navigation of tubular networks |
CN112236083A (en) | 2018-05-31 | 2021-01-15 | 奥瑞斯健康公司 | Robotic system and method for navigating a luminal network detecting physiological noise |
US11051829B2 (en) | 2018-06-26 | 2021-07-06 | DePuy Synthes Products, Inc. | Customized patient-specific orthopaedic surgical instrument |
US11291507B2 (en) | 2018-07-16 | 2022-04-05 | Mako Surgical Corp. | System and method for image based registration and calibration |
US11298186B2 (en) * | 2018-08-02 | 2022-04-12 | Point Robotics Medtech Inc. | Surgery assistive system and method for obtaining surface information thereof |
US11337742B2 (en) | 2018-11-05 | 2022-05-24 | Globus Medical Inc | Compliant orthopedic driver |
EP3876860A1 (en) | 2018-11-06 | 2021-09-15 | Bono, Peter L. | Robotic surgical system and method |
US11278360B2 (en) | 2018-11-16 | 2022-03-22 | Globus Medical, Inc. | End-effectors for surgical robotic systems having sealed optical components |
US11602402B2 (en) | 2018-12-04 | 2023-03-14 | Globus Medical, Inc. | Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems |
US11744655B2 (en) | 2018-12-04 | 2023-09-05 | Globus Medical, Inc. | Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems |
US11806084B2 (en) | 2019-03-22 | 2023-11-07 | Globus Medical, Inc. | System for neuronavigation registration and robotic trajectory guidance, and related methods and devices |
US11317978B2 (en) | 2019-03-22 | 2022-05-03 | Globus Medical, Inc. | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
US11419616B2 (en) | 2019-03-22 | 2022-08-23 | Globus Medical, Inc. | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
US20200297357A1 (en) | 2019-03-22 | 2020-09-24 | Globus Medical, Inc. | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
US11571265B2 (en) | 2019-03-22 | 2023-02-07 | Globus Medical Inc. | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
US11382549B2 (en) | 2019-03-22 | 2022-07-12 | Globus Medical, Inc. | System for neuronavigation registration and robotic trajectory guidance, and related methods and devices |
US11877801B2 (en) | 2019-04-02 | 2024-01-23 | Medicrea International | Systems, methods, and devices for developing patient-specific spinal implants, treatments, operations, and/or procedures |
US11925417B2 (en) | 2019-04-02 | 2024-03-12 | Medicrea International | Systems, methods, and devices for developing patient-specific spinal implants, treatments, operations, and/or procedures |
US11045179B2 (en) | 2019-05-20 | 2021-06-29 | Global Medical Inc | Robot-mounted retractor system |
EP3989793A4 (en) | 2019-06-28 | 2023-07-19 | Auris Health, Inc. | Console overlay and methods of using same |
US11628023B2 (en) | 2019-07-10 | 2023-04-18 | Globus Medical, Inc. | Robotic navigational system for interbody implants |
KR20220058569A (en) | 2019-08-30 | 2022-05-09 | 아우리스 헬스, 인코포레이티드 | System and method for weight-based registration of position sensors |
WO2021038495A1 (en) | 2019-08-30 | 2021-03-04 | Auris Health, Inc. | Instrument image reliability systems and methods |
US11571171B2 (en) | 2019-09-24 | 2023-02-07 | Globus Medical, Inc. | Compound curve cable chain |
US11864857B2 (en) | 2019-09-27 | 2024-01-09 | Globus Medical, Inc. | Surgical robot with passive end effector |
US11426178B2 (en) | 2019-09-27 | 2022-08-30 | Globus Medical Inc. | Systems and methods for navigating a pin guide driver |
US11890066B2 (en) | 2019-09-30 | 2024-02-06 | Globus Medical, Inc | Surgical robot with passive end effector |
US11510684B2 (en) | 2019-10-14 | 2022-11-29 | Globus Medical, Inc. | Rotary motion passive end effector for surgical robots in orthopedic surgeries |
US11769251B2 (en) | 2019-12-26 | 2023-09-26 | Medicrea International | Systems and methods for medical image analysis |
WO2021137109A1 (en) | 2019-12-31 | 2021-07-08 | Auris Health, Inc. | Alignment techniques for percutaneous access |
CN114929148A (en) | 2019-12-31 | 2022-08-19 | 奥瑞斯健康公司 | Alignment interface for percutaneous access |
US11298195B2 (en) | 2019-12-31 | 2022-04-12 | Auris Health, Inc. | Anatomical feature identification and targeting |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
US11253216B2 (en) | 2020-04-28 | 2022-02-22 | Globus Medical Inc. | Fixtures for fluoroscopic imaging systems and related navigation systems and methods |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US11317973B2 (en) | 2020-06-09 | 2022-05-03 | Globus Medical, Inc. | Camera tracking bar for computer assisted navigation during surgery |
US11382713B2 (en) | 2020-06-16 | 2022-07-12 | Globus Medical, Inc. | Navigated surgical system with eye to XR headset display calibration |
US11877807B2 (en) | 2020-07-10 | 2024-01-23 | Globus Medical, Inc | Instruments for navigated orthopedic surgeries |
US11793588B2 (en) | 2020-07-23 | 2023-10-24 | Globus Medical, Inc. | Sterile draping of robotic arms |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
US11523785B2 (en) | 2020-09-24 | 2022-12-13 | Globus Medical, Inc. | Increased cone beam computed tomography volume length without requiring stitching or longitudinal C-arm movement |
US11911112B2 (en) | 2020-10-27 | 2024-02-27 | Globus Medical, Inc. | Robotic navigational system |
US11941814B2 (en) | 2020-11-04 | 2024-03-26 | Globus Medical Inc. | Auto segmentation using 2-D images taken during 3-D imaging spin |
US11717350B2 (en) | 2020-11-24 | 2023-08-08 | Globus Medical Inc. | Methods for robotic assistance and navigation in spinal surgery and related systems |
CN117412724A (en) * | 2021-05-28 | 2024-01-16 | 柯惠有限合伙公司 | System and method for evaluating breath hold during intra-procedural imaging |
US11857273B2 (en) | 2021-07-06 | 2024-01-02 | Globus Medical, Inc. | Ultrasonic robotic surgical navigation |
US11439444B1 (en) | 2021-07-22 | 2022-09-13 | Globus Medical, Inc. | Screw tower and rod reduction tool |
US11918304B2 (en) | 2021-12-20 | 2024-03-05 | Globus Medical, Inc | Flat panel registration fixture and method of using same |
Family Cites Families (192)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3821469A (en) | 1972-05-15 | 1974-06-28 | Amperex Electronic Corp | Graphical data device |
US3868565A (en) | 1973-07-30 | 1975-02-25 | Jack Kuipers | Object tracking and orientation determination means, system and process |
JPS5531539Y2 (en) | 1974-08-02 | 1980-07-28 | ||
DE2443558B2 (en) | 1974-09-11 | 1979-01-04 | Siemens Ag, 1000 Berlin Und 8000 Muenchen | Device for puncturing internal organs and vessels |
US3963028A (en) | 1975-02-06 | 1976-06-15 | Texas Medical Products, Inc. | Suction wand |
US3983474A (en) | 1975-02-21 | 1976-09-28 | Polhemus Navigation Sciences, Inc. | Tracking and determining orientation of object using coordinate transformation means, system and process |
US4068156A (en) | 1977-03-01 | 1978-01-10 | Martin Marietta Corporation | Rate control system for manipulator arms |
US4182312A (en) | 1977-05-20 | 1980-01-08 | Mushabac David R | Dental probe |
US4117337A (en) | 1977-11-03 | 1978-09-26 | General Electric Company | Patient positioning indication arrangement for a computed tomography system |
FR2416480A1 (en) | 1978-02-03 | 1979-08-31 | Thomson Csf | RADIANT SOURCE LOCATION DEVICE AND STEERING TRACKING SYSTEM INCLUDING SUCH A DEVICE |
DE7805301U1 (en) | 1978-02-22 | 1978-07-06 | Howmedica International, Inc. Zweigniederlassung Kiel, 2300 Kiel | Distal aiming device for locking nailing |
SU745515A1 (en) * | 1978-02-27 | 1980-07-05 | Научно-Исследовательский Институт Экспериментальной Медицины Амн Ссср | Stereotaxic apparatus |
US4341200A (en) | 1978-08-21 | 1982-07-27 | Ametek, Inc. | Solar energy collector sub-assemblies and combinations thereof |
DE2852949A1 (en) | 1978-12-05 | 1980-06-19 | Theo Dipl Ing Rieger | Cassette or container coding - has mechanical or magnetic code related to contents for mechanical or opto-electronic sensing
US4259725A (en) | 1979-03-01 | 1981-03-31 | General Electric Company | Cursor generator for use in computerized tomography and other image display systems |
US4341220A (en) | 1979-04-13 | 1982-07-27 | Pfizer Inc. | Stereotactic surgery apparatus and method |
US4608977A (en) | 1979-08-29 | 1986-09-02 | Brown Russell A | System using computed tomography as for selective body treatment |
US4419012A (en) | 1979-09-11 | 1983-12-06 | Elliott Brothers (London) Limited | Position measuring system |
US4398540A (en) | 1979-11-05 | 1983-08-16 | Tokyo Shibaura Denki Kabushiki Kaisha | Compound mode ultrasound diagnosis apparatus |
DE2948986C2 (en) | 1979-12-05 | 1982-10-28 | Siemens AG, 1000 Berlin und 8000 München | Medical examination facility |
DE8006965U1 (en) | 1980-03-14 | 1981-08-27 | Robert Bosch Gmbh, 7000 Stuttgart | AUXILIARY HANDLE FOR A HAND-HELD POWER TOOL
US4638798A (en) | 1980-09-10 | 1987-01-27 | Shelden C Hunter | Stereotactic method and apparatus for locating and treating or removing lesions |
US4358856A (en) | 1980-10-31 | 1982-11-09 | General Electric Company | Multiaxial x-ray apparatus |
AU7986682A (en) | 1981-02-12 | 1982-08-19 | New York University | Apparatus for stereotactic surgery |
NL8101722A (en) | 1981-04-08 | 1982-11-01 | Philips Nv | CONTOUR METER. |
FI64282C (en) | 1981-06-04 | 1983-11-10 | Instrumentarium Oy | DIAGNOSTIC APPARATUS FOR DETERMINING THE STRUCTURE AND COMPOSITION OF TISSUES
US4465069A (en) | 1981-06-04 | 1984-08-14 | Barbier Jean Y | Cranial insertion of surgical needle utilizing computer-assisted tomography |
US4407298A (en) | 1981-07-16 | 1983-10-04 | Critikon Inc. | Connector for thermodilution catheter |
US4396945A (en) | 1981-08-19 | 1983-08-02 | Solid Photography Inc. | Method of sensing the position and orientation of elements in space |
US4473074A (en) | 1981-09-28 | 1984-09-25 | Xanar, Inc. | Microsurgical laser device |
DE3205915A1 (en) | 1982-02-19 | 1983-09-15 | Fred Dr. 6907 Nußloch Wunschik | Device for puncturing intracorporeal organs |
US4457311A (en) | 1982-09-03 | 1984-07-03 | Medtronic, Inc. | Ultrasound imaging system for scanning the human back |
US4506676A (en) | 1982-09-10 | 1985-03-26 | Duska Alois A | Radiographic localization technique |
US4961422A (en) | 1983-01-21 | 1990-10-09 | Marchosky J Alexander | Method and apparatus for volumetric interstitial conductive hyperthermia |
US4585350A (en) | 1983-01-28 | 1986-04-29 | Pryor Timothy R | Pulsed robotic inspection |
US4651732A (en) | 1983-03-17 | 1987-03-24 | Frederick Philip R | Three-dimensional light guidance system for invasive procedures |
JPS59218513A (en) | 1983-05-26 | 1984-12-08 | Fanuc Ltd | Arc control method of industrial robot |
NL8302228A (en) | 1983-06-22 | 1985-01-16 | Optische Ind De Oude Delft Nv | MEASURING SYSTEM FOR USING A TRIANGULAR PRINCIPLE, CONTACT-FREE MEASURING A DISTANCE GIVEN BY A SURFACE CONTOUR TO AN OBJECTIVE LEVEL. |
SE8306243L (en) | 1983-11-14 | 1985-05-15 | Cytex Medicinteknik Ab | LOCATION METHODOLOGY |
DE3342675A1 (en) | 1983-11-25 | 1985-06-05 | Fa. Carl Zeiss, 7920 Heidenheim | METHOD AND DEVICE FOR CONTACTLESS MEASUREMENT OF OBJECTS |
US4753528A (en) | 1983-12-13 | 1988-06-28 | Quantime, Inc. | Laser archery distance device |
US4549555A (en) | 1984-02-17 | 1985-10-29 | Orthothronics Limited Partnership | Knee laxity evaluator and motion module/digitizer arrangement |
US4841967A (en) | 1984-01-30 | 1989-06-27 | Chang Ming Z | Positioning device for percutaneous needle insertion |
US4674057A (en) | 1984-02-14 | 1987-06-16 | Lockheed Corporation | Ultrasonic ranging control system for industrial robots |
US4571834A (en) | 1984-02-17 | 1986-02-25 | Orthotronics Limited Partnership | Knee laxity evaluator and motion module/digitizer arrangement |
US4583538A (en) | 1984-05-04 | 1986-04-22 | Onik Gary M | Method and apparatus for stereotaxic placement of probes in the body utilizing CT scanner localization |
US4649504A (en) | 1984-05-22 | 1987-03-10 | Cae Electronics, Ltd. | Optical position and orientation measurement techniques |
US4775235A (en) | 1984-06-08 | 1988-10-04 | Robotic Vision Systems, Inc. | Optical spot scanning system for use in three-dimensional object inspection |
DE3423135A1 (en) | 1984-06-22 | 1986-01-02 | Dornier Gmbh, 7990 Friedrichshafen | METHOD FOR READING A DISTANCE IMAGE LINE |
JPS6149205A (en) | 1984-08-16 | 1986-03-11 | Seiko Instr & Electronics Ltd | Robot control system |
US4705395A (en) | 1984-10-03 | 1987-11-10 | Diffracto Ltd. | Triangulation data integrity |
JPS6186606A (en) | 1984-10-05 | 1986-05-02 | Hitachi Ltd | Configuration measuring method without contact |
US4821206A (en) | 1984-11-27 | 1989-04-11 | Photo Acoustic Technology, Inc. | Ultrasonic apparatus for positioning a robot hand |
US4592352A (en) | 1984-11-30 | 1986-06-03 | Patil Arun A | Computer-assisted tomography stereotactic system |
US4706665A (en) | 1984-12-17 | 1987-11-17 | Gouda Kasim I | Frame for stereotactic surgery |
DE3500605A1 (en) | 1985-01-10 | 1986-07-10 | Markus Dr. 5300 Bonn Hansen | DEVICE FOR MEASURING THE POSITIONS AND MOVEMENTS OF THE LOWER JAW RELATIVE TO THE UPPER JAW |
DE3502634A1 (en) | 1985-01-26 | 1985-06-20 | Deutsche Forschungs- und Versuchsanstalt für Luft- und Raumfahrt e.V., 5000 Köln | OPTICAL-ELECTRONIC DISTANCE METER |
USD291246S (en) | 1985-03-05 | 1987-08-04 | Zimmer, Inc. | Drill guide instrument for surgical use or the like |
DE3508730A1 (en) | 1985-03-12 | 1986-09-18 | Siemens AG, 1000 Berlin und 8000 München | Measuring device for medical purposes |
US4782239A (en) | 1985-04-05 | 1988-11-01 | Nippon Kogaku K. K. | Optical position measuring apparatus |
US4672306A (en) | 1985-04-08 | 1987-06-09 | Tektronix, Inc. | Electronic probe having automatic readout of identification and status |
US4737921A (en) | 1985-06-03 | 1988-04-12 | Dynamic Digital Displays, Inc. | Three dimensional medical image display system |
SE447848B (en) | 1985-06-14 | 1986-12-15 | Anders Bengtsson | INSTRUMENT FOR MEASURING SURFACE TOPOGRAPHY
US4743771A (en) | 1985-06-17 | 1988-05-10 | View Engineering, Inc. | Z-axis height measurement system |
US4805615A (en) | 1985-07-02 | 1989-02-21 | Carol Mark P | Method and apparatus for performing stereotactic surgery |
US4686997A (en) | 1985-07-03 | 1987-08-18 | The United States Of America As Represented By The Secretary Of The Air Force | Skeletal bone remodeling studies using guided trephine sample |
US4705401A (en) | 1985-08-12 | 1987-11-10 | Cyberware Laboratory Inc. | Rapid three-dimensional surface digitizer |
US4737032A (en) | 1985-08-26 | 1988-04-12 | Cyberware Laboratory, Inc. | Surface mensuration sensor |
JPH0619243B2 (en) | 1985-09-19 | 1994-03-16 | 株式会社トプコン | Coordinate measuring method and apparatus thereof |
IL76517A (en) | 1985-09-27 | 1989-02-28 | Nessim Igal Levy | Distance measuring device |
US4709156A (en) | 1985-11-27 | 1987-11-24 | Ex-Cell-O Corporation | Method and apparatus for inspecting a surface |
US4794262A (en) | 1985-12-03 | 1988-12-27 | Yukio Sato | Method and apparatus for measuring profile of three-dimensional object |
DE3543867C3 (en) | 1985-12-12 | 1994-10-06 | Wolf Gmbh Richard | Device for the spatial location and destruction of concrements in body cavities |
US4742815A (en) | 1986-01-02 | 1988-05-10 | Ninan Champil A | Computer monitoring of endoscope |
US4722056A (en) | 1986-02-18 | 1988-01-26 | Trustees Of Dartmouth College | Reference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope |
JP2685071B2 (en) | 1986-03-10 | 1997-12-03 | 三菱電機株式会社 | Numerical control unit |
US4776749A (en) | 1986-03-25 | 1988-10-11 | Northrop Corporation | Robotic device |
EP0239409A1 (en) | 1986-03-28 | 1987-09-30 | Life Technology Research Foundation | Robot for surgical operation |
SE469321B (en) | 1986-04-14 | 1993-06-21 | Joenkoepings Laens Landsting | SET AND DEVICE TO MAKE A MODIFIED THREE-DIMENSIONAL IMAGE OF AN ELASTIC DEFORMABLE PURPOSE |
US5078140A (en) | 1986-05-08 | 1992-01-07 | Kwoh Yik S | Imaging device - aided robotic stereotaxis system |
US4822163A (en) | 1986-06-26 | 1989-04-18 | Robotic Vision Systems, Inc. | Tracking vision sensor |
US4767934A (en) | 1986-07-02 | 1988-08-30 | Honeywell Inc. | Active ranging system |
US4723544A (en) | 1986-07-09 | 1988-02-09 | Moore Robert R | Hemispherical vectoring needle guide for discolysis |
US4791934A (en) | 1986-08-07 | 1988-12-20 | Picker International, Inc. | Computer tomography assisted stereotactic surgery system and method |
US4733969A (en) | 1986-09-08 | 1988-03-29 | Cyberoptics Corporation | Laser probe for determining distance |
US4743770A (en) | 1986-09-22 | 1988-05-10 | Mitutoyo Mfg. Co., Ltd. | Profile-measuring light probe using a change in reflection factor in the proximity of a critical angle of light |
US4761072A (en) | 1986-09-30 | 1988-08-02 | Diffracto Ltd. | Electro-optical sensors for manual control |
US4933843A (en) | 1986-11-06 | 1990-06-12 | Storz Instrument Company | Control system for ophthalmic surgical instruments |
US4750487A (en) | 1986-11-24 | 1988-06-14 | Zanetti Paul H | Stereotactic frame |
US4764015A (en) | 1986-12-31 | 1988-08-16 | Owens-Illinois Television Products Inc. | Method and apparatus for non-contact spatial measurement |
US4733662A (en) | 1987-01-20 | 1988-03-29 | Minnesota Mining And Manufacturing Company | Tissue gripping and cutting assembly for surgical instrument |
US4837669A (en) | 1987-01-28 | 1989-06-06 | Manville Corporation | Low profile industrial luminaire |
US5005142A (en) | 1987-01-30 | 1991-04-02 | Westinghouse Electric Corp. | Smart sensor system for diagnostic monitoring |
DE8701668U1 (en) | 1987-02-04 | 1987-04-02 | Aesculap-Werke Ag Vormals Jetter & Scheerer, 7200 Tuttlingen, De | |
DE3703422A1 (en) | 1987-02-05 | 1988-08-18 | Zeiss Carl Fa | OPTOELECTRONIC DISTANCE SENSOR |
US4753128A (en) | 1987-03-09 | 1988-06-28 | Gmf Robotics Corporation | Robot with spring pivot balancing mechanism |
US4745290A (en) | 1987-03-19 | 1988-05-17 | David Frankel | Method and apparatus for use in making custom shoes |
US4762016A (en) | 1987-03-27 | 1988-08-09 | The Regents Of The University Of California | Robotic manipulator having three degrees of freedom |
US4875478A (en) | 1987-04-10 | 1989-10-24 | Chen Harry H | Portable compression grid & needle holder |
US4793355A (en) | 1987-04-17 | 1988-12-27 | Biomagnetic Technologies, Inc. | Apparatus for process for making biomagnetic measurements |
US4733661A (en) | 1987-04-27 | 1988-03-29 | Palestrant Aubrey M | Guidance device for C.T. guided drainage and biopsy procedures |
US4809694A (en) | 1987-05-19 | 1989-03-07 | Ferrara Vincent L | Biopsy guide |
DE3717871C3 (en) | 1987-05-27 | 1995-05-04 | Georg Prof Dr Schloendorff | Method and device for reproducible visual representation of a surgical intervention |
US4836778A (en) | 1987-05-26 | 1989-06-06 | Vexcel Corporation | Mandibular motion monitoring system |
DE3884800D1 (en) | 1987-05-27 | 1993-11-11 | Schloendorff Georg Prof Dr | METHOD AND DEVICE FOR REPRODUCIBLE OPTICAL PRESENTATION OF A SURGICAL OPERATION. |
US4835710A (en) | 1987-07-17 | 1989-05-30 | Cincinnati Milacron Inc. | Method of moving and orienting a tool along a curved path |
US4829373A (en) | 1987-08-03 | 1989-05-09 | Vexcel Corporation | Stereo mensuration apparatus |
DE3844716C2 (en) | 1987-08-24 | 2001-02-22 | Mitsubishi Electric Corp | Ionised particle beam therapy device |
US4931056A (en) | 1987-09-04 | 1990-06-05 | Neurodynamics, Inc. | Catheter guide apparatus for perpendicular insertion into a cranium orifice |
US4991579A (en) | 1987-11-10 | 1991-02-12 | Allen George S | Method and apparatus for providing related images over time of a portion of the anatomy using fiducial implants |
US5079699A (en) | 1987-11-27 | 1992-01-07 | Picker International, Inc. | Quick three-dimensional display |
US5027818A (en) | 1987-12-03 | 1991-07-02 | University Of Florida | Dosimetric technique for stereotactic radiosurgery same |
US4938762A (en) | 1987-12-16 | 1990-07-03 | Protek Ag | Reference system for implantation of condylar total knee prostheses |
US5251127A (en) | 1988-02-01 | 1993-10-05 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
EP0326768A3 (en) | 1988-02-01 | 1991-01-23 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
US4913683A (en) * | 1988-07-05 | 1990-04-03 | Medical Engineering Corporation | Infusion stent system |
US5050608A (en) | 1988-07-12 | 1991-09-24 | Medirand, Inc. | System for indicating a position to be operated in a patient's body |
US4896673A (en) | 1988-07-15 | 1990-01-30 | Medstone International, Inc. | Method and apparatus for stone localization using ultrasound imaging |
US4982188A (en) | 1988-09-20 | 1991-01-01 | Grumman Aerospace Corporation | System for measuring positional characteristics of an ejected object |
US5099846A (en) | 1988-12-23 | 1992-03-31 | Hardy Tyrone L | Method and apparatus for video presentation from a variety of scanner imaging sources |
DE3904595C1 (en) | 1989-02-16 | 1990-04-19 | Deutsches Krebsforschungszentrum Stiftung Des Oeffentlichen Rechts, 6900 Heidelberg, De | Device for determining the spatial coordinates of stereotactic target points by means of X-ray pictures |
US5197476A (en) | 1989-03-16 | 1993-03-30 | Christopher Nowacki | Locating target in human body |
US5257998A (en) | 1989-09-20 | 1993-11-02 | Mitaka Kohki Co., Ltd. | Medical three-dimensional locating apparatus |
ES2085885T3 (en) | 1989-11-08 | 1996-06-16 | George S Allen | MECHANICAL ARM FOR INTERACTIVE SURGERY SYSTEM DIRECTED BY IMAGES. |
US5222499A (en) | 1989-11-15 | 1993-06-29 | Allen George S | Method and apparatus for imaging the anatomy |
US5047036A (en) | 1989-11-17 | 1991-09-10 | Koutrouvelis Panos G | Stereotactic device |
CA2003497C (en) | 1989-11-21 | 1999-04-06 | Michael M. Greenberg | Probe-correlated viewing of anatomical image data |
US5078142A (en) | 1989-11-21 | 1992-01-07 | Fischer Imaging Corporation | Precision mammographic needle biopsy system |
US5080662A (en) | 1989-11-27 | 1992-01-14 | Paul Kamaljit S | Spinal stereotaxic device and method |
US5224049A (en) | 1990-04-10 | 1993-06-29 | Mushabac David R | Method, system and mold assembly for use in preparing a dental prosthesis |
US5163430A (en) | 1990-04-27 | 1992-11-17 | Medco, Inc. | Method and apparatus for performing stereotactic surgery |
US5107839A (en) | 1990-05-04 | 1992-04-28 | Pavel V. Houdek | Computer controlled stereotaxic radiotherapy system and method |
US5086401A (en) | 1990-05-11 | 1992-02-04 | International Business Machines Corporation | Image-directed robotic system for precise robotic surgery including redundant consistency checking |
US5295483A (en) | 1990-05-11 | 1994-03-22 | Christopher Nowacki | Locating target in human body |
US5017139A (en) | 1990-07-05 | 1991-05-21 | Mushabac David R | Mechanical support for hand-held dental/medical instrument |
ATE126994T1 (en) | 1990-07-31 | 1995-09-15 | Faro Medical Technologies Inc | COMPUTER-ASSISTED SURGICAL DEVICE. |
DE4024402C1 (en) | 1990-08-01 | 1991-10-31 | Dr.Ing.H.C. F. Porsche Ag, 7000 Stuttgart, De | |
US5193106A (en) | 1990-08-28 | 1993-03-09 | Desena Danforth | X-ray identification marker |
US5198877A (en) | 1990-10-15 | 1993-03-30 | Pixsys, Inc. | Method and apparatus for three-dimensional non-contact shape sensing |
US5207223A (en) | 1990-10-19 | 1993-05-04 | Accuray, Inc. | Apparatus for and method of performing stereotaxic surgery |
EP1690511B1 (en) | 1990-10-19 | 2010-07-14 | St. Louis University | Surgical probe locating system for head use |
US5059789A (en) | 1990-10-22 | 1991-10-22 | International Business Machines Corp. | Optical position and orientation sensor |
US5295200A (en) | 1991-01-09 | 1994-03-15 | Board Of Regents, The University Of Texas System | Method and apparatus for determining the alignment of an object |
US6006126A (en) * | 1991-01-28 | 1999-12-21 | Cosman; Eric R. | System and method for stereotactic registration of image scan data |
US5662111A (en) | 1991-01-28 | 1997-09-02 | Cosman; Eric R. | Process of stereotactic optical navigation |
US5291889A (en) | 1991-05-23 | 1994-03-08 | Vanguard Imaging Ltd. | Apparatus and method for spatially positioning images |
US5279309A (en) * | 1991-06-13 | 1994-01-18 | International Business Machines Corporation | Signaling device and method for monitoring positions in a surgical operation |
US5261404A (en) | 1991-07-08 | 1993-11-16 | Mick Peter R | Three-dimensional mammal anatomy imaging system and method |
US5249581A (en) | 1991-07-15 | 1993-10-05 | Horbal Mark T | Precision bone alignment |
DE4134481C2 (en) | 1991-10-18 | 1998-04-09 | Zeiss Carl Fa | Surgical microscope for computer-aided, stereotactic microsurgery |
US5371778A (en) | 1991-11-29 | 1994-12-06 | Picker International, Inc. | Concurrent display and adjustment of 3D projection, coronal slice, sagittal slice, and transverse slice images |
DE4207901C3 (en) | 1992-03-12 | 1999-10-07 | Aesculap Ag & Co Kg | Method and device for displaying a work area in a three-dimensional structure |
US5389101A (en) | 1992-04-21 | 1995-02-14 | University Of Utah | Apparatus and method for photogrammetric surgical localization |
US5603318A (en) | 1992-04-21 | 1997-02-18 | University Of Utah Research Foundation | Apparatus and method for photogrammetric surgical localization |
FR2691093B1 (en) | 1992-05-12 | 1996-06-14 | Univ Joseph Fourier | ROBOT FOR GUIDANCE OF GESTURES AND CONTROL METHOD. |
US5357953A (en) | 1992-05-21 | 1994-10-25 | Puritan-Bennett Corporation | Measurement device and method of calibration |
FR2694881B1 (en) | 1992-07-31 | 1996-09-06 | Univ Joseph Fourier | METHOD FOR DETERMINING THE POSITION OF AN ORGAN. |
USD349573S (en) | 1992-08-17 | 1994-08-09 | Flexbar Machine Corp. | Quick-lock holder for laparoscopic instrument |
US5368030A (en) | 1992-09-09 | 1994-11-29 | Izi Corporation | Non-invasive multi-modality radiographic surface markers |
US5647361A (en) | 1992-09-28 | 1997-07-15 | Fonar Corporation | Magnetic resonance imaging method and apparatus for guiding invasive therapy |
DE4233978C1 (en) | 1992-10-08 | 1994-04-21 | Leibinger Gmbh | Body marking device for medical examinations |
US5732703A (en) | 1992-11-30 | 1998-03-31 | The Cleveland Clinic Foundation | Stereotaxy wand and tool guide |
US5309913A (en) * | 1992-11-30 | 1994-05-10 | The Cleveland Clinic Foundation | Frameless stereotaxy system |
US5517990A (en) | 1992-11-30 | 1996-05-21 | The Cleveland Clinic Foundation | Stereotaxy wand and tool guide |
US5305091A (en) | 1992-12-07 | 1994-04-19 | Oreo Products Inc. | Optical coordinate measuring system for large objects |
FR2699271B1 (en) | 1992-12-15 | 1995-03-17 | Univ Joseph Fourier | Method for determining the femoral anchor point of a cruciate knee ligament. |
US5730130A (en) | 1993-02-12 | 1998-03-24 | Johnson & Johnson Professional, Inc. | Localization cap for fiducial markers |
US5611147A (en) | 1993-02-23 | 1997-03-18 | Faro Technologies, Inc. | Three dimensional coordinate measuring apparatus |
US5483961A (en) | 1993-03-19 | 1996-01-16 | Kelly; Patrick J. | Magnetic field digitizer for stereotactic surgery |
ZA942812B (en) * | 1993-04-22 | 1995-11-22 | Pixsys Inc | System for locating the relative positions of objects in three dimensional space |
USD353668S (en) | 1993-05-24 | 1994-12-20 | Minnesota Mining And Manufacturing Company | Pneumoneedle |
US5391199A (en) | 1993-07-20 | 1995-02-21 | Biosense, Inc. | Apparatus and method for treating cardiac arrhythmias |
FR2709656B1 (en) * | 1993-09-07 | 1995-12-01 | Deemed Int Sa | Installation for computer-assisted microsurgery operation and methods implemented by said installation. |
DE4330873A1 (en) | 1993-09-13 | 1995-03-16 | Zeiss Carl Fa | Coordinate measuring device with a probe and electronics for processing the probe signal |
US5558091A (en) | 1993-10-06 | 1996-09-24 | Biosense, Inc. | Magnetic determination of position and orientation |
US5399146A (en) | 1993-12-13 | 1995-03-21 | Nowacki; Christopher | Isocentric lithotripter |
USD357534S (en) | 1993-12-15 | 1995-04-18 | Zimmer, Inc. | Surgical parallel drill guide instrument |
US5531227A (en) | 1994-01-28 | 1996-07-02 | Schneider Medical Technologies, Inc. | Imaging device and method |
USD359557S (en) | 1994-02-09 | 1995-06-20 | Zimmer, Inc. | Orthopaedic drill guide |
GB9405299D0 (en) * | 1994-03-17 | 1994-04-27 | Roke Manor Research | Improvements in or relating to video-based systems for computer assisted surgery and localisation |
US5490196A (en) | 1994-03-18 | 1996-02-06 | Metorex International Oy | Multi energy system for x-ray imaging applications |
US5531520A (en) | 1994-09-01 | 1996-07-02 | Massachusetts Institute Of Technology | System and method of registration of three-dimensional data sets including anatomical body data |
DE4432890B4 (en) | 1994-09-15 | 2004-02-19 | Brainlab Ag | Device for detecting the position of irradiation target points |
US5803089A (en) * | 1994-09-15 | 1998-09-08 | Visualization Technology, Inc. | Position tracking and imaging system for use in medical applications |
DE29521895U1 (en) * | 1994-10-07 | 1998-09-10 | Univ St Louis | Surgical navigation system comprising reference and localization frames |
US5617857A (en) | 1995-06-06 | 1997-04-08 | Image Guided Technologies, Inc. | Imaging system having interactive medical instruments and methods |
US5722594A (en) * | 1995-08-14 | 1998-03-03 | Farr; Sandra L. | Convertible child stroller/package carrier |
US5638819A (en) | 1995-08-29 | 1997-06-17 | Manwaring; Kim H. | Method and apparatus for guiding an instrument to a target |
US5649936A (en) | 1995-09-19 | 1997-07-22 | Real; Douglas D. | Stereotactic guide apparatus for use with neurosurgical headframe |
US5769861A (en) | 1995-09-28 | 1998-06-23 | Brainlab Med. Computersysteme Gmbh | Method and devices for localizing an instrument |
US5682886A (en) | 1995-12-26 | 1997-11-04 | Musculographics Inc | Computer-assisted surgical system |
US5799055A (en) * | 1996-05-15 | 1998-08-25 | Northwestern University | Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy |
- 1995
- 1995-10-05 DE DE29521895U patent/DE29521895U1/en not_active Expired - Lifetime
- 1995-10-05 DE DE69528998T patent/DE69528998T2/en not_active Expired - Lifetime
- 1995-10-05 AU AU39505/95A patent/AU3950595A/en not_active Abandoned
- 1995-10-05 DE DE69532829T patent/DE69532829T2/en not_active Expired - Lifetime
- 1995-10-05 EP EP95937379A patent/EP0869745B8/en not_active Expired - Lifetime
- 1995-10-05 WO PCT/US1995/012894 patent/WO1996011624A2/en active IP Right Grant
- 1995-10-05 EP EP01130400A patent/EP1201199B1/en not_active Expired - Lifetime
- 1995-10-05 AT AT01130400T patent/ATE320226T1/en not_active IP Right Cessation
- 1995-10-05 US US08/809,404 patent/US6236875B1/en not_active Expired - Lifetime
- 1995-10-05 CA CA002201877A patent/CA2201877C/en not_active Expired - Lifetime
- 1995-10-05 JP JP51333296A patent/JP3492697B2/en not_active Expired - Lifetime
- 1995-10-05 AT AT99114109T patent/ATE262844T1/en not_active IP Right Cessation
- 1995-10-05 AT AT95937379T patent/ATE228338T1/en not_active IP Right Cessation
- 1995-10-05 EP EP99114109A patent/EP0950379B1/en not_active Expired - Lifetime
- 1995-10-05 DE DE69534862T patent/DE69534862T2/en not_active Expired - Lifetime
- 1998
- 1998-06-26 US US09/105,067 patent/US6490467B1/en not_active Expired - Lifetime
- 2001
- 2001-04-12 US US09/832,848 patent/US7139601B2/en not_active Expired - Lifetime
Also Published As
Publication number | Publication date |
---|---|
DE29521895U1 (en) | 1998-09-10 |
DE69528998T2 (en) | 2003-07-03 |
DE69532829D1 (en) | 2004-05-06 |
EP0869745B1 (en) | 2002-11-27 |
US6490467B1 (en) | 2002-12-03 |
DE69532829T2 (en) | 2005-01-27 |
JP2002510214A (en) | 2002-04-02 |
EP0869745A4 (en) | 1998-10-14 |
US7139601B2 (en) | 2006-11-21 |
EP1201199A2 (en) | 2002-05-02 |
DE69528998D1 (en) | 2003-01-09 |
EP0950379A3 (en) | 2003-01-02 |
ATE320226T1 (en) | 2006-04-15 |
EP0869745B8 (en) | 2003-04-16 |
US6236875B1 (en) | 2001-05-22 |
WO1996011624A2 (en) | 1996-04-25 |
DE69534862D1 (en) | 2006-05-11 |
US20020035321A1 (en) | 2002-03-21 |
AU3950595A (en) | 1996-05-06 |
EP0950379A2 (en) | 1999-10-20 |
JP3492697B2 (en) | 2004-02-03 |
WO1996011624A3 (en) | 1996-07-18 |
ATE262844T1 (en) | 2004-04-15 |
DE69534862T2 (en) | 2006-08-17 |
EP1201199B1 (en) | 2006-03-15 |
EP0869745A2 (en) | 1998-10-14 |
EP0950379B1 (en) | 2004-03-31 |
ATE228338T1 (en) | 2002-12-15 |
CA2201877A1 (en) | 1996-04-25 |
EP1201199A3 (en) | 2003-02-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2201877C (en) | Surgical navigation systems including reference and localization frames | |
US6434415B1 (en) | System for use in displaying images of a body part | |
US6978166B2 (en) | System for use in displaying images of a body part | |
US6167145A (en) | Bone navigation system | |
EP4159149A1 (en) | Surgical navigation system, computer for performing surgical navigation method, and storage medium | |
USRE44305E1 (en) | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation | |
US5999837A (en) | Localizing and orienting probe for view devices | |
EP1820465B1 (en) | Universal image registration interface | |
JP2013508103A (en) | Automatic registration of images for image guided surgery | |
US11672607B2 (en) | Systems, devices, and methods for surgical navigation with anatomical tracking | |
CN117064557B (en) | Surgical robot for orthopedic surgery | |
EP3733112A1 (en) | System for robotic trajectory guidance for navigated biopsy needle | |
US20230404686A1 (en) | Coupler For Robotic End Effector | |
US20200297451A1 (en) | System for robotic trajectory guidance for navigated biopsy needle, and related methods and devices | |
Langlotz et al. | Computer-assisted Minimally Invasive Spine Surgery: State of the Art | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request | | |
MKEX | Expiry | | Effective date: 20151005 |