CA2255041C - Stereotactic surgical procedure apparatus and method - Google Patents

Stereotactic surgical procedure apparatus and method Download PDF

Info

Publication number
CA2255041C
CA2255041C CA002255041A CA2255041A CA2255041C CA 2255041 C CA2255041 C CA 2255041C CA 002255041 A CA002255041 A CA 002255041A CA 2255041 A CA2255041 A CA 2255041A CA 2255041 C CA2255041 C CA 2255041C
Authority
CA
Canada
Prior art keywords
image
computer
block
trajectory
surgical device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
CA002255041A
Other languages
French (fr)
Other versions
CA2255041A1 (en
Inventor
Michael A. Peshkin
Julio J. Santos-Munne
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern University
Original Assignee
Northwestern University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed litigation Critical https://patents.darts-ip.com/?family=27095352&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=CA2255041(C) "Global patent litigation dataset” by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Northwestern University filed Critical Northwestern University
Publication of CA2255041A1 publication Critical patent/CA2255041A1/en
Application granted granted Critical
Publication of CA2255041C publication Critical patent/CA2255041C/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/46Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment with special arrangements for interfacing with the operator or the patient
    • A61B6/461Displaying means of special interest
    • A61B6/464Displaying means of special interest involving a plurality of displays
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
    • G06T3/08
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/344Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/363Use of fiducial points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10116X-ray image
    • G06T2207/10121Fluoroscopy
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Abstract

An apparatus and method are provided for coordinating two fluoroscope images which permit accurate computer-based planning of the insertion point and angle of approach of a needle, drill, screw, nail, wire or other surgical instrumentation into the body of a patient and subsequently guide the surgeon in performing the insertion in accordance with the plan.

Description

STEREOTACTIC SURGICAL PROCEDURE APPARATL~ AND METHOD
Background and Summary of the Invention The present invention relates to an apparatus and method for planning and guiding insertion of an object along a linear trajectory into a body. More particularly, the l0 present invention relates to an apparatus and method for coordinating two captured fluoroscope images to permit effective three-dimensional planning of the trajectory using only two-dimensional images, Numerous medical interventions involve placing a needle, drill, screw, nail, wire or other device in the body. In some cases the angle and position of the device are both of critical importance, for example in the drilling of a hole for a screw along the axis of a spinal pedicle. In other cases, it is primarily the positioning of the end-point of the device which is important, for example in placing a biopsy needle into a suspected tumor.
In still other cases, the objective is only to define a point rather than a fine, for example in targeting a tumor for radiation therapy. Many other examples exist, especially in the field
2 0 of orthopaedics.
The present invention is also relevant to the development of percutaneous technique. Executing a linear trajectory for the insertion of instrumentation into the body through the skin is more difficult than open surgical technique, but the reduced invasiveness and trauma of percutaneous placement makes it desirable.
Fluoroscopy is frequently used by surgeons to assist medical procedures.
Continuous fluoroscopy during a surgical procedure is undesirable because it exposes the surgeon's hands to radiation. Furthermore, regardless of whether intermittent or continuous fluoroscopy is used, the resulting images are two-dimensional while insertion of the surgical instrument requites three-dimensional awareness by the surgeon.
3 0 The apparatus and method of the present invention involve acquisition and storage of two separate fluoroscopic images of the body, taken from two different angles.
Typically, although not necessarily, these would be an anteriorlposterior (A/P) image taken front-to-back of the patient, and a sagittal image taken side-to-side.
These two fluoroscopic images are displayed on two adjacent computer monitors. The surgeon uses a trackball or other computer input device to specify on the monitors an insertion point and an insertion trajectory.
A mechanical positioning device is then used to position a guide through which the surgeon performs the insertion of the surgical instrument. The positioning device may either be an active computer controlled manipulator such as a robot, or it may be a manually adjusted mechanical device which is set numerically in accordance with an output from the computer.
The apparatus and method of the present invention establish the projective geometric relationships relating each of two acquired fluoroscopic images to the three-dimensional workspace around and within the patient's body, despite essentially arbitrary positioning of the fluoroscope. The two images then become a coordinated pair, which permits three-dimensional planning that might otherwise be expected to require a computed tomography (CT) scan.
While the acquisition and display of two approximately orthogonal images may be expected to present the surgeon with the greatest ability to plan in three dimensions, two images are not strictly necessary. It is possible to use a single captured image for some procedures, particularly if the surgeon has adjusted the beam axis of the 2 0 fluoroscope into alignment with the intended trajectory. Furthermore, more than two images could also be acquired and coordinated, should that be advantageous.
Several other approaches to stereotactic or robotic surgery, planned on a computer screen displaying medical images, have been described by other workers, and will be listed below. Some background is given here before discussing prior art. The 2 5 method and apparatus of the present invention constitute a technique we call coordinated fluoroscopy. Coordinated fluoroscopy is a technique for REGISTRATION and for SURGICAL PLANNING. It allows registration based on the acquired fluoroscopic images themselves, without requiring any additional measuring devices. It allows three-dimensional surgical planning based on fluoroscopic views from two angles, without 3 0 requiring three-dimensional imaging such as computed tomography (CT), and without requiring that the two fluoroscopic images be acquired from orthogonal fluoroscope poses.

REGISTRATION
Registration is a key step in any image-guided surgical system.
Registration is the determination of the correspondence between points of the image upon which a surgical plan is prepared, and points of the workspace in the vicinity of (and within) the patient. If a numerically controlled tool (whether robotic or manual) is to be used, the coordinate system of that device must also be brought into registry with the image.
It is common to accomplish registration with the help of a global positioning device, usually optical, which can measure the three-dimensional coordinates of markers placed anywhere over a large volume of space. Coordinated fluoroscopy avoids the necessity for this expensive and inconvenient device, instead deriving registration directly from the acquired fluoroscopic images themselves.
Coordinated fluoroscopy uses a "registration artifact" which is held in a fixed position relative to the patient while one or more fluoroscopic images are acquired from different angles (poses).
There is no need to constrain the fluoroscope poses at which these various images are acquired, for instance to require that they be orthogonal, nor is there a need to instrument the fluoroscope so that the pose angles can be measured. Instead, pose information is extracted after-the-fact from the images. It is a substantial benefit-of the present invention that surgeons can acquire fluoroscopic images using fluoroscope poses of their own choosing, as they are accustomed.
The registration artifact contains a plurality of features (fiducials) which are designed to be easily identifiable on a fluoroscopic image. The embodiment described here uses eight small steel spheres embedded in a radiolucent matrix. The positions of 2 5 these fiducials are known relative to a coordinate system fixed in the artifact, either by design or by measurement.
From the two-dimensional locations of the projections of these fiducials in a fluoroscopic image, we can determine the geometric projections that carry a general three dimensional point anywhere in the vicinity of the artifact into a projected point on 3 0 the image. This establishes registration between image and workspace.
Several images can each be registered relative to the same registration artifact, thus also bringing all the images into registry with one another.
-4-Identification of the geometric projections, as discussed above, would not be possible with raw fluoroscope images, which are highly nonlinear and distorted. It is necessary first to map and compensate for these distortions. It is usefi~l to be aware of the necessity of distortion compensation when comparing the present invention to prior art.
SURGICAL PLANNING
Surgical planning is also a key step in image-guided surgery. Planning of three-dimensional surgical procedures might be expected to be done on a 1o three-dimensional dataset, such as can be reconstructed from computed tomography (CT) data. However, surgeons are accustomed to planning on two-dimensional images:
radiographs or fluoroscopic images. Indeed even when CT data is available, planning is usually done on individual two-dimensional CT "slices" rather than on a three-dimensional reconstruction.
The coordinates of the endpoints of a line segment representing an intended screw, biopsy needle, or drilled hole are of course three-dimensional, as are the coordinates of a single point within the body marking the present location of a tumor or a fragment of shrapnel. In surgical planning such points can be specified on a two-dimensional image, or on each of several two-dimensional images. Each such 2 o two-dimensional image is a projection of the same three-dimensional space.
It is necessary to convert the two-dimensional coordinates of specified points on each of several images into a three-dimensional coordinate which can be used to guide a tool along a desired trajectory or to a desired point within the body.
To do so one must have knowledge of the geometric relationship of the projections that created the 2 5 images.
In the absence of such geometric knowledge a point specified on one image and a point independently specified on another image may in fact not correspond to any single point within the body. This is so because a point specified on a two-dimensional image is the projection of a LINE in space. The implied point in 3 0 three-dimensions is the intersection of two such lines, one implied by the point specified on each image. Two such Iines created independently may be skew, intersecting nowhere.
Similarly, line segments for an intended procedure can not be chosen independently on
-5-two images, otherwise they will in general not correspond to a well-defined three-dimensional line segment.
In coordinated fluoroscopy, the geometric projections that relate the two images to a single three-dimensional coordinate system are established before planning commences. The points chosen by the surgeon on two (or more) images can therefore be constrained by the software such that they DO correspond to a well-defined point in three-dimensions. In practice, as a surgeon adjusts an intended point or line segment on one image, the point or line segment displayed on the other images) continuously updates and adjusts as well. One cannot draw "arbitrary" points or line segments independently on the images; the software only allows one to draw points or line segments that correspond to a well-defined point or line segment in three-dimensions.
The benefits of planning on geometrically coordinated images as described above are threefold:
1 ) Once the surgeon has selected a point or a line segment on two images, the three-dimensional point or line segment to which the selections correspond is filly defined and ready to be executed.
2) An axial view such as could be attained from a CT slice is generally unattainable fluoroscopically. The angle that is most easily visualized in axial view, known as the transverse angle, is therefore difficult to select or execute under 2 0 fluoroscopy. In coordinated fluoroscopy the transverse angle is implicitly specified by the surgeon by selecting line segments on two images. This may assist the surgeon in visualizing and planning the transverse angle for a procedure.
3) In conventional fluoroscopy, image dilation due to beam divergence is of unknown extent, making accurate measurement of anatomic distances difficult. In 2 5 coordinated fluoroscopy the actual in-situ length of an intended line segment can be determined by the software. This is usefirl for selecting appropriate screw length, as well as for other purposes.
BACKGROUND
3 0 Lavalle et al. in Grenoble, France have developed a system for spinal surgery which uses computed tomography as an image source. The CT data is assembled into a three-dimensional data set which can then be resliced at will on orthogonal planes.
-6-Surgical planning proceeds on three mutually orthogonal planes simultaneously.
Registration is performed by using an optical tracking device to digitize arbitrary surface points of the vertebrae, and matches those surface points to the CT data set.
Nolte et al. in Bern, Switzerland have developed a very similar spinal system to Lavalle et al. Registration differs in that the optical tracking device is used to digitize specific anatomic landmarks rather than general surface contours. The features are then pointed out manually in CT data, allowing a match to be made.
P. Finlay in High Wycombie, England has developed a fluoroscopic system for head-of femur (hip) fractures. Accuracy requirements in this procedure are not very great, so fluoroscope distortion compensation is not needed. Its absence also precludes identification of the geometric projections from images as is done in the present invention.
Instead, the two fluoroscope poses are required to be orthogonal and the C-arm must not be moved along the floor in between the two images. Registration is accomplished by noting various features of a surgical tool which appears in the images, and by highlighting a marker wire which also appears in the field of view of the fluoroscope.
Potamianos et al. in London, England have developed a system for kidney biopsy and similar soft-tissue procedures. It incorporates a digitizing mechanical arm to which a biopsy needle is attached, and which can be moved about manually by the surgeon. Surgical planning per se is absent; instead a line segment representing the 2 0 present position of needle is displayed superimposed upon captured (static) fluoroscope images, as the needle is moved manually near and within the patient.
Phillips et al. in Hull, England have developed a system for orthopaedic procedures. It uses a optical tracking device as well as a fluoroscope.
Registration is accomplished by instrumenting the fluoroscope with light emitting diodes and tracking 2 5 them with the optical tracker. Surgical planning software is specific to the surgical procedure, and tends to offer medical opinion rather than just display a trajectory as in the present invention. For intramedullary nail placement, for instance, the surgeon outlines target holes in an intramedullary prosthetic, and software calculates a trajectory through them.
3o U.S. Patent 4,750,487 (Zanetti) describes a stereotactic frame which overlays a patient. A single anterior/posterior fluorograph is then acquired, in which a crosshairs affixed to the frame is visible. By measuring the displacement of the crosshairs _7_ from the desired target, a motion of the frame can be accomplished which brings the two into alignment. This invention does not facilitate three-dimensional stereotaxy as does the present invention.
U. S. Patent 5,078,140 (Kwoh) describes a stereotactic and robotic system for neurosurgery. It uses CT images.
ASPECTS OF THE INVENTION
According to the present invention, a method is provided for planning a stereotactic surgical procedure for a linear trajectory insertion of surgical instrumentation 1 o into a body using a fluoroscope for generating images of the body. The method includes placing adjacent to the body a registration artifact containing a plurality of fiducials;
displaying on a computer monitor an image of the patient's body and the registration artifact; receiving a user or automatic algorithmic input to identify two-dimensional coordinates of the fiducials of the registration artifact displayed on the first monitor; and registering the image by creating a geometric model having parameters, said model projecting three-dimensional coordinates into image points, and numerically optimizing the parameters of the geometric model such that the projections of the known three-dimensional coordinates of the fiducials best fit the identified two-dimensional coordinates in the image.
2 0 The method fizrther includes displaying on a second computer monitor a second image, taken of the patient's body and the registration artifact but from an angle different from that of the first image, and receiving a user or automatic algorithmic input to identify two-dimensional coordinates of the fiducials displayed on the second computer monitor; and registering the second image by creating a geometric model having parameters, said model projecting three-dimensional coordinates into image points, and numerically optimizing the parameters of the geometric model such that the projections of the known three-dimensional coordinates of the fiducials best fit the identified two-dimensional coordinates in the second image.
The method, whether one or two images have been acquired, further 3 o includes the step of receiving a user input to select on a computer monitor an entry point for a surgical instrument. In the case of two images, also receiving a user input to select on a computer monitor the position, length, and angles of a virtual guidewire representing _g-the trajectory for the surgical instrument: and drawing a segment, to be known as a PROJECTED GUIDEWIRE, on the image(s). When there are two images, the projected guidewires are constrained to correspond geometrically to the same three-dimensional segment in space, to be known as the VIRTUAL GUIDEWIRE.
The method further includes receiving a user input to move either end of a projected guidewire, by revising the virtual guidewire of which the projected guidewire(s) are projections, and by redrawing the projected guidewires in correspondence with the revised virtual guidewire.
The method further includes receiving a user input to change the length of the virtual guidewire, and redrawing the projected guidwire(s) in correspondence with the revised virtual guidewire. A special case is that the length is zero, so that what is planned is a virtual targetpoint rather than a virtual guidewire.
The method further includes receiving a user input to change the sagittal, transverse, or coronal angles) of the virtual guidewire, updating the orientation of the virtual guidewire based on the new angles, and redrawing the projected guidewire(s) in correspondence with the revised virtual guidewire.
The method further includes producing an output to adjust the coordinates of a tool guide such that the projection of the axis of the guide in an 2 0 image is brought into correspondence with the entry point displayed on the computer monitor.
The method further includes producing an output to adjust the coordinates of a tool guide such that it is brought into correspondence with the virtual guidewire: or producing an output to adjust the coordinates of a tool guide 2 5 such that the position of the guide along its axis is offset by a preselected distance from one endpoint of the virtual guidewire, in order to control the location within the body of the surgical instrument to be inserted.
The method further includes transmitting said coordinates to a robot or other automatic mechanical device, or displaying said coordinates such 30 that human operator may manually adjust a mechanical device.
In accordance with one aspect of the present invention there is provided a computer-aided method for planning a surgical procedure comprising:
registering to a known coordinate frame a first two-dimensional, fluoroscopic image of a body's anatomy taken at a first observation angle; displaying the first -8a-image; and drawing in the displayed first image a representation of at least one of a surgical device to be placed in the body based on the registration of the first image with the known coordinate frame.
In accordance with another aspect of the present invention there is provided a computer readable storage medium encoded with instructions, which, when read by a computer, enable a computer to undertake a process comprising: registering to a known coordinate frame a first two-dimensional, fluoroscopic image of a body's anatomy taken at a first observation angle;
displaying the first image; and drawing in the displayed first image a representation of a surgical device to be placed in the body based on the registration of the first image with the known coordinate frame.
Brief Descriution of the Drawings The detailed description particularly refers to the accompanying figures in which:

Fig. 1 is a diagrammatic illustration of the stereotactic surgical apparatus of the present invention for coordinating images from a fluoroscope, planning a linear trajectory medical intervention, and controlling a robot to control the linear trajectory medical intervention;
Fig. 2 is a perspective view of a registration artifact and tool guide of the present invention;
Fig. 3a is a sample screen display of the user interface which includes an anterior/posterior (A/P) taken by the fluoroscope and displayed on a first computer monitor along with a number of the buttons and entry fields necessary to run the program;
to Fig. 3b is a sample screen display which includes a sagittal image taken by the fluoroscope and displayed on a second computer monitor along with a number of the buttons and entry fields necessary to run the program;
Fig. 3c is a flow chart of the steps performed by the computer during a main program loop;
Fig. 4 is a flow chart illustrating the steps performed by the computer to acquire an A/P image from the fluoroscope;
Fig. 5 is a flow chart illustrating the steps performed by the computer to acquire a sagittal image from the fluoroscope;
Fig. 6 is a flow chart illustrating the steps performed by the computer and the user to select or identify A/P fiducials from the A/P image displayed in Fig. 3a;
Fig. 7 is a flow chart of the steps performed by the computer and the user to select or identify sagittal fiducials displayed on the sagittal image of Fig. 3b;
Fig. 8 is a flow chart illustrating the steps performed by the computer to register the A/P image;
Fig. 9 is a flow chart illustrating the steps performed by the computer to register the sagittal image;
Fig. 10 is a flow chart illustrating the steps performed by the computer for changing a transverse angle of the virtual guidewire;
Fig. 11 is a flow chart illustrating the steps performed by the computer to 3 o change the length of the virtual guidewire used in the stereotactic surgical procedure;
Fig. 12 is a flow chart illustrating the steps performed by the computer to change a sagittal angle of the virtual guidewire;

Fig. 13 is a flow chart illustrating the steps performed by the computer to change the approach angle of the robot;
Fig. 14 is a flow chart illustrating the steps performed by the computer to move the robot illustrated in Fig. 1 to the planned position and orientation;
Fig. 15 is a flow chart illustrating the steps performed by the computer to move the end effector of the robot along the axis of the tool guide;
Fig. 16 is a flow chart illustrating the steps performed by the computer when the computer receives a user input based on a cursor in the A/P image area of Fig.
3 a;
to Fig. 17 is a flow chart illustrating the steps performed by the computer when the computer receives a user input based on a cursor in the sagittal image area in Fig. 3b; and Fig. 18 is a flow chart illustrating the steps performed by the computer when the computer receives a user input based on a cursor in the robot control areas of Figs.3a-b.
Detailed Description of Drawings Referring now to the drawings, Fig. 1 illustrates the stereotactic system 10 for linear trajectory medical interventions using calibrated and coordinated fluoroscopy.
2 o The apparatus and method of the present invention is designed to utilize images from a fluoroscope 12 such as a standard C-arm which generates fluoroscopic or x-ray images of a body on a surgical table 14. The imaging arm 16 is moveable so that both anterior/posterior (A/P) and sagittal or side images of the body can be taken.
A robot 18 is situated adjacent the surgical table 14. Illustratively, the 2 5 robot is a PUMA-560 robot. The robot 18 includes a movable arm assembly 20 having an end flange 22. An alignment or registration artifact 24 is coupled to the end flange 22 of robot 18.
The registration artifact 24 is best illustrated in Fig. 2. The artifact 24 is X-ray and visually transparent with the exception of 8 opaque spheres or fiducials 26, and 3 o an aperture 30 to hold a tool guide 28 through the artifact 24. Initially, the artifact 24 is positioned roughly over the area of interest of body 32 and within the field of view of the fluoroscope 16. Therefore, the fiducials 26 show up as distinct dots on the A/P and sagittal images as discussed below. The shape of the artifact is designed so that the image dots from the fiducials 26 will not over shadow each other and is sensitive to any angular deviations. The robot arm 20 can adjust the artifact 24 in three-dimensions about X-axis 34, Y-axis 36, or Z-axis 38 illustrated in Fig. 1.
The coordinated fluoroscopic control system of the present invention is controlled by computer 40, which includes a microprocessor 42, and internal RAM 44, and a hard disk drive 46. Computer 40 is coupled to two separate graphics monitors 48 and 50. The first graphics monitor 48 displays a sagittal image taken by the C-arm 12.
The second monitor 50 displays an A/P image taken by the C-arm 12. Computer 40 l0 further includes a serial communication port 52 which is coupled to a controller 53 of robot 18. Computer 40 is also coupled to C-arm 12 for receiving the images from the C-arm 12 through an image acquisition card 54. Computer 40 is also coupled to an input device 56 which is illustratively a keyboard having a track ball input control 58. Track ball input 58 controls a cursor on both monitor 48, 50.
The displays on monitors 48 and 50 are illustrated in Figs. 3a and 3b.
Referring now to Fig. 3b, the sagittal image is displayed in area 62 on monitor 48. All eight fiducials 26 should appear in the sagittal image area 62. If not, the artifact 24 or the C-arm 12 should be adjusted. As discussed in detailed below, computer 40 displays a top entry point 64 and a bottom point 66 of a projected guidewire 68. The projected 2 o guidewire 68 is a line segment which is displayed on the sagittal image area representing the position of the instrumentation to be inserted during the stereotactic surgical procedure. A line of sight 70 is also displayed in the sagittal image area 62.
Various user option buttons are displayed on monitor 48. The surgeon or operator can access these options by moving the cursor to the buttons and clicking or by 2 5 selecting the appropriate function keys (F 1, F2, etc. ) on the keyboard.
The option buttons displayed on monitor 48 include button 72 (fianction F2) for acquiring the sagittal image, button 74 (F4) for selecting sagittal fiducials, and button 76 (F6) for registering the sagittal image. In addition, button 78 (F 10) is provided for setting the sagittal angle, button 80 (F8) is provided for setting the screw length, and button 82 (F12) is provided 3 o for moving the robot along an axis of the tool guide. Finally, the display screen includes a robot control area 84. The operator can move the cursor and click in the robot control area 84 to control robot 18 as discussed below.

Referring to Fig. 3a, the A/P image displayed on the display screen of monitor SO is illustrated. The A/P image is displayed in area 86 of the screen. Again, all eight fiducials 26 should appear within the A/P image area 86. The top insertion point of the virtual guidewire is illustrated at location 88, and the bottom point is located at location 90. The projection of the guidewire onto the A/P image is illustrated by line segment 92.
Computer 40 also displays various option buttons on monitor 50. Button 94 (F1) is provided for acquiring the A/P image. Button 96 (F3) is provided for selecting the A/P fiducials. Button 98 {FS) is provided for registering the AP image.
Button 100 (F7) is provided for setting a transverse angle of the virtual guidewire, and button 102 (F9) is provided for setting an approach angle for the robot. Button 104 (Fl 1) is provided for moving the robot. Computer 40 also displays a robot control area 84. The operator can move the cursor and click in the robot control area 84 to control robot 18 as discussed in detail below.
The present invention allows the surgeon to select the point of entry for the surgical instrument by moving the top point of the projected guidewire 88 in the A/P
image area 86. The operator can also adjust the bottom point of the projected guidewire 90 to specify the transverse and sagittal angle. In addition, the operator can adjust the top point of the projected guidewire 64 to specify the position on the line of sight and bottom 2 o point of the projected guidewire 66 to specify the sagittal and transverse angle in the sagittal image area 62. Therefore, the surgeon can select the desired position and orientation of the surgical instrument into the body.
The computer 40 is programmed with software to correct spatial distortions from the optics of the fluoroscope 12. The system of the present invention 2 5 permits ei~ective three-dimensional planning of the stereotactic surgical procedure using only a pair of two dimensional fluorographic images displayed on the adjacent monitors 48 and S0. It is not required to use a CT slice in order to fully specify the location of the surgical instrument. The computer 40 establishes the direct geometric relationship between the A/P and sagittal images, despite image distortions and the essentially random 3 0 or free-hand positioning of the C-arm 12, to establish the A/P and sagittal images. The improved system of the present invention can establish this exact geometric relationship within sub-millimeter accuracy.

Once the sagittal and A/P images are registered, points or lines chosen by the surgeon on one of the A/P image or the sagittal image are immediately displayed by computer 40 as corresponding projections on the other image. Therefore, using the sagittal image on monitor 48 and the A/P image on monitor 50, the surgeon can stereotactically plan the linear trajectory without the requirement of CT scan slice.
Accordingly, the procedure of the present invention can be performed without the very expensive CT scan devices which can cost in excess of $ I million.
Details of the operation of the software for controlling the system of the present invention are illustrated in Figs. 3c-18.
to Alt of the notations, subscripts and mathematical formulae, equations, and explanations are included in the attached Appendix. Throughout the flow charts described Figs. 4-18, reference will be made to the Appendix and to the numbered Sections [1] through [15] set forth in the Appendix.
The main program begins at block 110 of Fig. 3c. Computer 40 creates a parent window at block 112 and then draws buttons on a main window as illustrated at block 114. Computer 40 then creates a sagittal child window on monitor 48 as illustrated at block 116. Computer 40 also creates an A/P child window on monitor SO as illustrated at block 118. Computer 40 then determines whether a button or key has been pressed at block 120. If not, computer 20 waits as illustrated at block 122 and then 2 o returns to block 120 to wait for a button or key to be pressed.
If a button or key was pressed at block 120, computer 40 determines whether the Acquire A/P Image button 94 or the Fl key was pressed at block 124. If so, computer 40 advances to block 166 of Fig. 4. If not, computer 40 determines whether the Acquire Sagittal Image button 94 or the F2 key was pressed at block 126. If so, the 2 5 computer 40 advances to block 200 of Fig. 5. If not, computer 40 determines whether the Select A/P Fiducial button 96 or the F3 key was pressed at block I28. If so, computer 40 advances to block 234 of Fig. 6. If button 96 or the F3 key was not pressed at block 128, computer 40 determines whether the Select Sagittal Fiducial button 74 or the F4 key was selected as illustrated at block 130. If so, computer 40 advances to block 276 of Fig. 7. If 3 0 not, computer 40 advances to block 132.
In block 132, computer 40 determines whether the Register A/P Image button 98 or the FS key was pressed. If so, computer 40 advances to block 324 of Fig. 8.

if not, computer 40 determines whether the Register Sagittal Image button 76 or the F6 was pressed as illustrated at block 134. If so, computer 40 advances to block 3 SO of Fig.
9. If not, computer 40 advances to block 136.
From block 136, computer 40 determines whether the Transverse Angle button 100 or the F7 key was pressed as illustrated at block 138. If so, computer 40 advances to block 376 of Fig. 10. If not, computer 40 determines whether the screw Length button 80 or F8 key was pressed as illustrated at block 140. If so, computer 40 advances to block 388 of Fig. 11. If not, computer 40 determines whether the Sagittal Angle button 78 or the F10 key was pressed as illustrated at block 142. If so, computer l0 40 advances to block 400 of Fig. 12. If not, computer 40 determines whether the Approach Angle button 102 or the F9 key was pressed as illustrated at block 144. If so, computer 40 advances to block 412 of Fig. 13. If not, computer 40 advances to block 146.
In block 146, computer 40 determines whether the Move Robot button 104 or the Fll key was pressed. If so, computer 40 advances to block 422 of Fig. 14. If not, computer 40 determines whether the Move Robot Along Axis button 82 or the key was pressed as illustrated at block 148. If so, computer 40 advances to block 452 of Fig. 15. If not, computer 40 determines whether the A/P Image area of monitor 50 has been selected by clicking when the cursor is in the A/P image area 86 as illustrated at 2 0 block 150. If so, computer 40 advances to block 476 of Fig. 16. If not, computer 40 then determines whether the Sagittal Image area was selected by positioning the cursor in the sagittal image area 62 on monitor 48 and clicking. If so, computer 40 advances to block 506 of Fig. 17. if not, computer 40 advances to block 154.
From block 154, computer 40 determines whether the robot control area 2 5 54 or 106 was selected by moving the cursor and clicking in the Robot Control area 84 on monitor 48 or the Robot Control area 106 on monitor 50. If the Robot Control was selected, computer 40 advances block 536 of Fig. 18. If the Robot Control was not selected, computer 40 advances to block 158 to determine whether the "Q" key was pressed indicating the operator desires to quit the main program. If the "Q"
button was 30 pressed, then computer 40 frees all allocated memory as illustrated at block 160 and ends the main program as illustrated at block 162. If the "Q" button was not pressed at block -I S-158, computer 40 advances back to block 122, waiting for a another button or key to be pressed.
The various functions performed by the system of the present invention will be described in detail. If the Acquire A/P Image button 94 or the F1 key is pressed the, computer 40 advances to block 166 of Fig. 4. Computer 40 then determines whether the image acquisition card is in a passthrough mode at block 168. Button 94 and the F1 key are toggle buttons. When the button 94 or the F 1 key is initially pressed, the card is in passthrough mode and images from the C-arm 12 are transmitted directly to the monitor 50. Whatever image is being taken by the C-arm is seen on the monitor 50 in the 1o A/P image area 86. Therefore, if the card is not in the pass-through mode at block 168, pressing button 94 or the F1 key sets the pass-through mode at block 170.
Computer 40 then returns to wait for the next command as illustrated at block 172. When the button 94 or the F1 key is pressed again after the image acquisition card within the computer 40 is in pass-through mode, it freezes the live image and captures the A/P image as illustrated at block 174. This captured image is then displayed on monitor SO as illustrated at block 176. Computer 40 then disables and dims buttons F11, F12 and FS, and enables and brightens button 96 and key F3 as illustrated at block 178. In other words, after the A/P
image has been captured, computer 40 allows the operator to have the option to select the A/P fiducials through button 96 or key F3.
2 o Computer 40 then assigns a NULL tool as illustrated at block 180. The NULL tool of the robot is the three-dimensional location of end flange 22 of robot 18. In other words, the end flange 22 establishes a three-dimensional position for the robot, without depending on the particular surgical instrumentation which may be attached to the end flange 22. Computer 40 determines whether the NULL tool was properly assigned at block 182. If not, computer 40 generates an error message "Tool Not Assigned!" as illustrated at block 184. Computer 40 then waits for the next command as illustrated at block 186. If the NULL tool is assigned properly at block 182, computer 40 gets the current position of the end flange from the robot controller 53 as illustrated at block 188. Computer 40 then determines whether the sagittal image is displayed on 3 0 monitor 48 as illustrated at block 190. If not, computer 40 sends a message of "Acquire Sagittal Image" as illustrated at block 192, and then returns to wait for the next command at block 194. If the sagittal image is displayed at block 190, computer 40 sends the message "Select the Fiducials" as illustrated at block 196. Computer 40 then returns to wait for the next command at block 198.
If the Acquire Sagittal Image button 72 or the F2 key is pressed, computer 40 advances to block 200 of Fig. 5. Computer 40 then determines whether the image acquisition card is in a pass-through mode at block 202. Button 72 and the F2 key are toggle buttons. If the card is not in the pass-through mode at block 202, pressing button 72 or the F2 key sets the pass-through mode at block 204. Computer 40 then returns to wait for the next command as illustrated at block 206. When the button 72 or the F2 key is pressed again after the image acquisition card within the computer 40 is in pass-through l0 mode, it freezes the live image and captures the sagittal image as illustrated at block 208.
This captured image is then displayed on monitor 48 as illustrated at block 210.
Computer 40 then disables and dims buttons F11, F12 and F6, and enables and brightens button 74 and key F3 as illustrated at block 212. In other words, after the sagittal image has been captured, computer 40 allows the operator to have the option to select the sagittal fiducials through button 74 or key F4.
Computer 40 then assigns a NULL tool as illustrated at block 214.
Computer 40 determines whether the NULL tool was properly assigned at block 216. If not, computer 40 generates an error message "Tool Not Assigned!" as illustrated at block 218. Computer 40 then waits for the next command as illustrated at block 220.
If the 2 o NULL tool is assigned properly at block 216, computer 40 gets the current position of the end flange 22 from the robot controller 53 as illustrated at block 222.
Computer 40 then determines whether the A/P image is displayed on monitor 50 as illustrated at block 224.
If not, computer 40 sends a message of "Acquire A/P Image" as illustrated at block 226, and then returns to wait for the next command at block 228. If the A/P image is displayed 2 5 at block 224, computer 40 sends the message "Select the Fiducials" as illustrated at block 230. Computer 40 then returns to wait for the next command at block 232.
If the Select A/P Fiducials button 96 or the F3 key button is pressed, computer 40 advances to block 234 of Fig. 6. Computer 40 first determines whether the A/P image is displayed on monitor 50 as illustrated at block 236. If not, computer 40 3 o generates an error message, "Acquire A/P Image" as illustrated at block 23 8. Computer 40 then returns to wait for the next command as illustrated at block 240.

If the A/P image is displayed at block 236, computer 40 displays a square cursor on the display screen of monitor SO as illustrated at block 242.
Computer 40 then resets the number of located fiducials to zero as illustrated at block 244.
Next, computer 40 waits for the trackball button to be clicked by the operator as illustrated as block 246.
Once the trackball button is clicked over a fiducial shadow, computer 40 generates a beep as illustrated at block 248. Computer 40 then performs edge detection around the selected mouse cursor coordinate as illustrated at block 250. Such edge detection is performed using a gradient base method developed by John Canny and described in the article referenced in Section [1] of the attached Appendix. Such article is hereby to incorporated by reference and made a part of this detailed description.
Computer 40 then determines whether at least 3 edge pixels were found during the edge detection step as illustrated at block 252. If not, computer 40 generates an error message of "Try Again Closer to the Fiducial" as illustrated at block 254.
Computer 40 then returns to block 246 to wait for the mouse button to be clicked again.
If at least three edge pixels were found at block 252, computer 40 maps the edge pixels to their calibrated image coordinates using equation [13] from the attached Appendix as illustrated at block 256.
Computer 40 then finds the center of the fiducial shadow generated by the fiducials 26 using the calibrated edge pixels as set forth in equation [ 14]
of the Appendix.
This step is illustrated at block 258. Computer 40 then advances to block 262 of Fig. b.
From block 262, computer 40 draws a circle around the center of the fiducial shadow.
Computer 40 then determines whether all eight of the fiducials 26 have been located in the A!P image as illustrated at block 264. If not, computer 40 returns to block 246 of Fig. 6 and then waits for the mouse button to be clicked again over a 2 5 different fiducial shadow.
If all eight fiducials have been located at block 264, computer 40 then saves the established image coordinates of all the fiducials in the computer memory as illustrated at block 268. Computer 40 then enables and brightens the Register A/P Image Button 98 and FS key as illustrated at block 270. Computer 40 then transmits the 3 0 message "Register A/P Image" as illustrated at block 272.

Next, computer 40 automatically advances to location ENTRY1 ofFig. 8 as illustrated at Block 274. Computer 40 does not wait for an operator to press a button to move to location ENTRY 1 of Fig. 8.
If the Select Sagittal Fiducials or the F4 key button is pressed, computer 40 advances to block 276 of Fig. 7. Computer 40 first determines whether the sagittal image is displayed on monitor 48 as illustrated at block 278. If not, computer generates an error message, "Acquire Sagittal Image" as illustrated at block 280.
Computer 40 then returns to wait for the next command as illustrated at block 282.
If the sagittal image is displayed at block 278, computer 40 displays a l0 square cursor on the display screen of monitor 48 as illustrated at block 290. Computer 40 then resets the number of located fiducials to zero as illustrated at block 292. Next, computer 40 waits for the trackball button to be clicked by the operator as illustrated as block 294. Once the trackball button is clicked, computer 40 generates a beep as illustrated at block 296. Computer 40 then performs edge detection around the selected trackball cursor coordinate as illustrated at block 298. Such edge detection is performed using a gradient base method developed by John Canny and described in the article referenced in Section [ 1 J of the attached Appendix.
Computer 40 then determines whether at least 3 edge pixels were found during the edge detection step as illustrated at block 300. If not, computer 40 generates 2 0 an error message of "Try Again Closer to the Fiducial" as illustrated at block 302.
Computer 40 then returns to block 294 to wait for the trackball button to be clicked again. If at least three edge pixels were found at block 300, computer 40 maps the edge pixels to their calibrated image coordinates using equation [13J from the attached Appendix as illustrated at block 304.
2 5 Computer 40 then finds the center of the fiducial shadow generated by the fiducials 26 using the calibrated edge pixels as set forth in equation [14J of the Appendix.
This step is illustrated at block 306. Computer 40 then advances to block 310.
From block 310, computer 40 draws a circle around the center of the fiducial shadow.
Computer 40 then determines whether all eight of the fiducials 26 have been located in the 3 0 sagittal image as illustrated at block 312. If not, computer 40 returns to block 294 and then waits for the trackball button to be clicked again.

If all eight fiducials have been located at block 312, computer 40 then saves the established image coordinates of all the fiducials in the computer memory as illustrated at block 316. Computer 40 then enables and brightens the Register sagittal Image Button 76 and the F6 key as illustrated at block 318. Computer 40 then transmits a message of "Register Sagittal Image" as illustrated at block 320.
Next, computer 40 automatically advances to location ENTRY2 of Fig. 9 as illustrated at block 322. Computer 40 does not wait for an operator to press a button to move to location ENTRY2 of Fig. 9.
If the Register A/P Image button 98 or the FS key was pressed, computer 40 advances to block 324 of Fig. 8. Computer 40 first determines whether all of the A/P
fiducials have been found as illustrated at block 326. If not, computer 40 generates an error message of "Haven't Selected All the Fiducials" as illustrated at block 328.
Computer 40 then returns to wait for the next command as illustrated at block 330.
If all the A/P fiducials have been found at block 326, computer 40 advances to block 332. As discussed above, computer 40 also automatically advances to block 332 from block 274 of Fig. 6 after all the fiducials have been selected.
In block 332, computer 40 first recalls all the two-dimensional coordinates of the A/P fiducial centers. Next, the computer 40 reads in data from a file of the three-dimensional coordinates of the center of the fiducials 26 as illustrated at block 334.
2 o The three-dimensional coordinates of the fiducials 26 are obtained using a Coordinate Measurement Machine (CMM). Therefore, this data provides information related to the actual location of the fiducials 26. Typically, these CMMed coordinates are obtained from the manufacturer of the registration artifact 24.
Next, computer 40 optimizes the parameters of a geometric model which projects three dimensional coordinates into corresponding image points. The optimized model is encapsulated in a registration matrix as set forth in section [3].
Optimization is performed by minimizing (in a least squares sense) the deviation between the model's projections of the three-dimensional coordinates read at block 334, and the two-dimensional coordinates read at block 332. The Levenberg-Marquardt method is 3 o used for optimization, as described in equation [2] of the attached Appendix and as illustrated at block 336 . Computer 40 then constructs a registration matrix as set forth in section [3] ofthe attached Appendix. This step is illustrated at block 338.

Computer 40 next determines whether the sagittal image has been registered as illustrated at block 340. If not, computer 40 generates a message of "Perform Sagittal Registration" as illustrated at block 342. Computer 40 then returns to wait for the next command as illustrated at block 344.
If the sagittal image has been registered at block 340, computer 40 generates a display message of "Pick Entry Point" as illustrated at block 346.
Computer 40 then returns to wait for the next command as illustrated at block 348.
If the Register sagittal Image button 76 or the F6 key have been pressed, computer 40 advances to block 3 SO of Fig. 9. Computer 40 first determines whether all to ofthe sagittal fiducials have been found as illustrated at block 352. If not, computer 40 generates an error message of "Haven't Selected All the Fiducials" as illustrated at block 354. Computer 40 then returns to wait for the next command as illustrated at block 356.
If all the sagittal fiducials have been found at block 352, computer 40 advances to block 358. As discussed above, computer 40 also automatically advances to block 358 from block 322 of Fig. 7 after all the fiducials have been selected.
In block 358, computer 40 first recalls all the two-dimensional coordinates of the sagittal fiducial centers. Next, the computer 40 reads in data from a file of the three-dimensional coordinates of the center of the fiducials 26 as illustrated at block 360.
The coordinates of the fiducials 26 are obtained using a Coordinate Measurement 2 0 Machine (CMM). Therefore, this data provides information related to the actual location of the fiducials 26. Typically, these coordinates are obtained from the manufacturer of the registration artifact 24.
Next, computer 40 optimizes the fit between the three-dimensional coordinates read at block 360 and the two-dimensional coordinates read at block 358 2 5 using the Levenberg-Marquardt method described in equation [2] of the attached Appendix as illustrated at block 362. Computer 40 then constructs a registration matrix as set forth in section [4] of the attached Appendix. This step is illustrated at block 364.
Computer 40 next determines whether the A/P image has been registered as illustrated at block 366. If not, computer 40 generates a message of "Perform A/P Registration" as illustrated at block 368. Computer 40 then returns to wait for the next command as illustrated at block 370.

If the A/P image has been registered at block 366, computer 40 generates a message of "Pick Entry Point" as illustrated at block 372. Computer 40 then returns to wait for the next command as illustrated at block 374.
If the transverse angle button 100 or the F7 key is pressed, computer 40 advances to block 376 of Fig. 10. The transverse angle is the angle about the X-axis 34 of Fig. 1, using the right-hand rule. To adjust the transverse angle, the operator places the cursor in the Entry Field button 101 of Fig. 3a as illustrated at block 378 of Fig. 10. The operator then enters a numeric value for the transverse angle as illustrated at block 380. Computer 40 then reads the new transverse angle and updates the orientation of the virtual guidewire using the equations set forth in section [6] of the attached Appendix. This step is illustrated at block 382. Next, computer 40 redraws the virtual guidewire projection 92 in the A/P image area 86 and the projection 68 in the sagittal image area 62 based on the new transverse angle, using the equations set forth in section [7] of the attached Appendix, as illustrated at block 384. Computer 40 then returns to wait for the next command as illustrated at block 386.
If the Screw Length button 80 or the F8 key was pressed, computer 40 advances to block 388 of Fig. 11. The cursor is then placed on the entry field 81 of Fig. 3b as illustrated at block 390. The operator then enters the numeric value for the new screw length as illustrated at block 392. Computer 40 reads the new screw length and updates the length of the virtual guidewire using the equations set forth in section [11] of the Appendix. This step is illustrated at block 394. Next, computer 40 redraws the projected guidewire 92 in the A/P image area 86 and the projected guidewire 68 in the sagittal image area 62 using the equations set forth in section [7] of the Appendix. These steps are illustrated at block 396. Next, computer 40 returns to wait for the next command as illustrated at block 398.
If the Sagittal Angle button 78 or the F10 key is pressed, computer 40 advances to block 400 of Fig. 12 to adjust the sagittal angle. The sagittal angle is the angle about the Y-axis 36 of Fig. 1, using the right-hand rule.
The cursor is placed in an entry field 79 of Fig. 3b as illustrated at block 402. Next, the operator enters a numeric value for the sagittal angle as illustrated at block 404. Computer 40 then reads the value of the new sagittal angle and updates the orientation of the virtual guidewire using the equations set forth in section [10] of the Appendix. These steps are illustrated at block 406. Next, computer 40 redraws the projected guidewire 92 in the A/P image area 86 and the projected guidewire 68 in the sagittal image area 62 using the equations set forth in section [7] of the Appendix. These steps are illustrated at block 408. Computer 40 then returns to wait for the next instruction as illustrated at block 410.
If the Approach Angle button 102 or the F9 key was pressed, computer 40 advances to block 412 of Fig. 13. The approach angle is the angle taken about the Z-axis 38 of Fig. 1, using the right-hand rule.
The cursor is placed in the entry field 103 of Fig. 3a as illustrated at block 414. The operator then enters a numeric value for the new approach angle as illustrated at block 416. The computer 40 then reads the new approach angle as illustrated at block 418. Computer 40 then returns to wait for the next command as illustrated at block 420.
Only two angles are needed to plan a linear trajectory in space; for this particular procedure, the transverse angle and the sagittal angle are used, as sketched below. The approach angle permits the surgeon to control movement of the robot; it plays no part in planning the trajectory.
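As an illustration of why the two planning angles fix the trajectory, the sketch below computes the guidewire direction from them alone. It assumes the Rot(y, sagittal) Rot(x, transverse) composition of section [8] of the Appendix and a guidewire that initially points along -Z; the function name is illustrative.

```python
import numpy as np

def guidewire_direction(sagittal_deg, transverse_deg):
    """Unit vector of the planned trajectory from the two planning angles
    (right-hand rule about the Y and X axes, respectively)."""
    a = np.radians(sagittal_deg)    # alpha: rotation about Y
    b = np.radians(transverse_deg)  # beta: rotation about X
    rot_y = np.array([[np.cos(a), 0, np.sin(a)],
                      [0, 1, 0],
                      [-np.sin(a), 0, np.cos(a)]])
    rot_x = np.array([[1, 0, 0],
                      [0, np.cos(b), -np.sin(b)],
                      [0, np.sin(b), np.cos(b)]])
    # The virtual guidewire initially points down -Z; see section [6].
    return rot_y @ rot_x @ np.array([0.0, 0.0, -1.0])
```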
If the Move Robot button 104 or the F11 key is pressed, computer 40 advances to block 422 of Fig. 14. Computer 40 first recalls the approach angle from memory as illustrated at block 424. Next, computer 40 recalls the sagittal angle, the transverse angle, and the three-dimensional coordinates of the top point of the virtual guidewire as illustrated at block 426. Next, computer 40 calculates the planned position and orientation using the equations in section [12] of the Appendix. This step is set forth at block 428. Next, computer 40 reads in data from a file related to the specific surgical end-effector being used for the surgical procedure as illustrated at block 430. This data includes the three-dimensional coordinates from the Coordinate Measurement Machine (CMM).
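The orientation part of that calculation can be sketched as the chain of rotations named in section [12] of the Appendix. This is a simplified reading: the constrained solve that completes the final plan matrix is omitted, and applying the approach rotation as a plain premultiplication is an assumption, not the patent's exact construction.

```python
import numpy as np

def rot(axis, deg):
    """Elementary rotation matrix about 'x', 'y', or 'z' (degrees)."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return {'x': np.array([[1, 0, 0], [0, c, -s], [0, s, c]]),
            'y': np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]]),
            'z': np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])}[axis]

def planned_orientation(sagittal, transverse, approach):
    tool = rot('z', -90) @ rot('y', -90)                     # [Tool]
    plan = rot('y', sagittal) @ rot('x', transverse) @ tool  # [Plan]
    return rot('z', approach) @ plan   # simplified stand-in for [Approach]
```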
Computer 40 determines whether the surgical end-effector is properly assigned at block 434. If not, computer 40 generates an error message of "Surgical End-Effector Not Assigned" as illustrated at block 436. Computer 40 then returns to wait for the next command as illustrated at block 438.
If the surgical end-effector is properly assigned at block 434, computer 40 sends a command through serial communication port 50 to the robot controller 53 to move the robot to the planned position and orientation as illustrated at block 440.
Computer 40 assigns the "NULL" end-effector as illustrated at block 442.
Computer 40 determines whether the NULL end-effector was properly assigned at block 444.
If not, computer 40 generates an error message of "NULL end-effector Not Assigned" as illustrated at block 446. Computer 40 then returns to wait for the next command at block 448. If the NULL end-effector is properly assigned at block 444, computer 40 returns to wait for the next command as illustrated at block 450.
If the Move Robot Along Axis button 82 of Fig. 3b is selected, computer 40 advances to block 452 of Fig. 15. Computer 40 has already moved the robot to the proper orientation during the steps of Fig. 14. Therefore, the steps of Fig. 15 are designed to move the robot along the tool guide axis defined by the tool guide 28 of Fig. 2, toward and away from the body on the table 14. Computer 40 determines whether a thread entitled "Move Robot Axis" has been dispatched at block 454. This thread program runs by itself until it is stopped. If the program has not been started at block 454, computer 40 starts it as illustrated at block 456. Computer 40 then returns to wait for additional instructions at block 458. If the thread program has started at block 454, computer 40 determines whether the Page Up button has been pressed at block 460. If not, computer 40 determines whether the Page Down button has been pressed at block 462. If not, computer 40 returns to block 464 to wait for the next command.
If the Page Up button was pressed at block 460, computer 40 determines whether the Page Up button is still being pressed at block 466. If not, computer 40 returns to wait for the next command as illustrated at block 468. If the Page Up button is still being pressed at block 466, computer 40 sends a VAL command from communication port 50 to robot controller 53 to move the robot in the positive tool guide axis direction as illustrated at block 470. The positive tool guide axis direction is up, away from the patient. Computer 40 then returns to block 466.
If the Page Down button has been pressed at block 462, computer 40 determines whether the Page Down button is still being pressed at block 472. If not, computer 40 returns at block 468 to wait for the next command. If the Page Down button is still being pressed at block 472, computer 40 sends a VAL command to move the robot in the negative tool guide axis direction as illustrated at block 474. The negative tool guide axis direction is down, toward the patient. Computer 40 then returns to block 472.
In other words, the control steps of Fig. 15 permit the operator to move the robot along its tool guide axis, as sketched below. Once the robot is moving in either the positive or negative direction, it keeps moving until the Page Up or Page Down key is released. The entire robot moves in order to maintain the end-effector 24 and the tool guide 28 in the same orientation along the planned axis. For example, the end-effector 24 of robot 18 may be maintained in an orientation that is 45° relative to the Z-axis 38 of Fig. 1. VAL is the program control language for the PUMA-560 controller 53. It is understood that other robots, controllers, and programming languages may be used in accordance with the present invention.
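The jog behavior can be pictured as a small polling loop. The sketch below assumes a pyserial link to the controller; the command strings are placeholders, since the actual VAL syntax sent to the PUMA-560 controller is not reproduced here.

```python
import serial  # pyserial

# Placeholder command strings (assumptions, not real VAL syntax).
JOG_UP_CMD = b"JOG +Z\r"    # one step along +tool axis, away from patient
JOG_DOWN_CMD = b"JOG -Z\r"  # one step along -tool axis, toward patient

def jog_along_tool_axis(port_name, key_is_down, direction_up=True):
    """Keep stepping the robot along the tool guide axis while the
    Page Up / Page Down key remains pressed (blocks 466-474)."""
    cmd = JOG_UP_CMD if direction_up else JOG_DOWN_CMD
    with serial.Serial(port_name, baudrate=9600, timeout=1) as link:
        while key_is_down():   # poll the key state each iteration
            link.write(cmd)    # one incremental move per iteration
```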
If a cursor is over the A/P image area 86 of Fig. 3a, computer 40 advances to block 476 of Fig. 16. Computer 40 waits for the trackball to be clicked in the A/P image area 86 as illustrated at block 478. Once the trackball has been clicked at block 478, computer 40 determines whether both the A/P image and the sagittal image have been registered as illustrated at block 480. If not, computer 40 does nothing and returns to block 482 to wait for the next command.
If the A/P and the sagittal images have been registered at block 480, computer 40 determines whether the projected guidewire is drawn as illustrated at block 484. If not, computer 40 assumes that the operator intends to draw the projected guidewire. Therefore, computer 40 draws a cross hair at trackball coordinate (u, v) as illustrated at block 486. Next, computer 40 draws a curve representing the line of sight on the sagittal image using the equations of section [5] of the attached Appendix as illustrated at block 488. A curve, rather than a straight line, is drawn because of the distortion inherent in the fluoroscope's image intensifier: an X-ray image of a straight line is itself a curve. The drawing procedure is sketched below. Once the line of sight indicator 70 is drawn on the sagittal image area 62 of Fig. 3b, computer 40 returns to wait for the next command as illustrated at block 490.
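The curve drawing reduces to sampling the geometric line and pushing each sample through the un-calibration polynomial of section [15] of the Appendix. A minimal sketch, assuming 25-element quartic coefficient arrays produced by the calibration program:

```python
import numpy as np

def uncalibrate(x, y, coeff_u, coeff_v):
    """Map calibrated image coordinates back to distorted (real) pixel
    coordinates with the quartic polynomial of section [15].
    coeff_u and coeff_v are 25-element arrays, one term per power
    x^i * y^j with i, j running from 4 down to 0."""
    basis = np.array([x**i * y**j for i in range(4, -1, -1)
                                  for j in range(4, -1, -1)])
    return float(coeff_u @ basis), float(coeff_v @ basis)

def distorted_polyline(p1, p2, coeff_u, coeff_v, n=50):
    """Sample n points on the straight segment p1-p2 in calibrated space
    and return the curved polyline to draw on the raw fluoroscope image."""
    ts = np.linspace(0.0, 1.0, n)
    pts = [(1 - t) * np.asarray(p1, float) + t * np.asarray(p2, float)
           for t in ts]
    return [uncalibrate(x, y, coeff_u, coeff_v) for x, y in pts]
```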
If the projected guidewire is already drawn at block 484, computer 40 determines whether the trackball coordinates are within five pixels of the top point 88 in the A/P image area 86. This step is illustrated at block 492. If the cursor coordinates are within five pixels of the top point 88, computer 40 erases the projected guidewire as illustrated at block 494 and returns to wait for the next command as illustrated at block 496.
If the trackball cursor coordinates are not within five pixels from the top point 88 at block 492, computer 40 determines whether the trackball coordinates are within five pixels of the bottom point 90 as illustrated at block 498. If not, computer 40 returns to wait for the next command as illustrated at block 490. If the trackball cursor coordinates are within five pixels from the bottom point 90 at block 498, computer 40 determines whether the trackball has been clicked again as illustrated at block 500. If so, computer 40 returns to block 490 to wait for the next command. If not, computer 40 updates the transverse or sagittal angle as illustrated at block 502 based on movement of the trackball. The transverse angle value is incremented if the trackball is being moved up. The transverse angle value is decreased if the trackball is moving down.
The sagittal angle value is incremented if the trackball is being moved right. The sagittal angle value is decreased if the trackball is moving left. The incrementing factor is 0.1° per pixel. The equations for this step are set forth in section [6] of the Appendix.
After the transverse and/or sagittal angle has been updated at block 502, computer 40 redraws the projected guidewire 92 in the A/P image area 86 and the projected guidewire 68 in the sagittal image area 62 using the equations in section [7] of the attached Appendix. These steps are illustrated at block 504. Computer 40 then returns to block 500.
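A minimal sketch of the angle update of block 502, with the 0.1 degree-per-pixel factor from the text; the function and parameter names are illustrative:

```python
DEG_PER_PIXEL = 0.1  # 0.1 degree of angle per pixel of trackball motion

def update_angles(transverse, sagittal, dx_pixels, dy_pixels):
    """Block 502, sketched: vertical trackball motion edits the transverse
    angle, horizontal motion edits the sagittal angle."""
    transverse += DEG_PER_PIXEL * dy_pixels  # up increments, down decrements
    sagittal += DEG_PER_PIXEL * dx_pixels    # right increments, left decrements
    return transverse, sagittal
```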
If the cursor is over the sagittal image area 62 of Fig. 3b, computer 40 advances to block 506 of Fig. 17. Computer 40 determines whether the line of sight has been drawn at block 508. If not, computer 40 returns to wait for the next command at block 510. If the line of sight has been drawn at block 508, computer 40 draws the projected guidewire 92 in the A/P image area 86 and the projected guidewire 68 in the sagittal image area 62 using the equations in section [8] of the Appendix.
These steps are illustrated at block 512. Computer 40 also checks whether the robot has been initialized at block 513; if it has, computer 40 enables and brightens the "Move Robot" button 104 and the "Move Along Drill Axis" button 82, and the F11 and F12 keys, at block 513.5. Next, computer 40 waits for the trackball to be clicked in the sagittal image area 62 as illustrated at block 514. If the robot has not been initialized, computer 40 likewise waits for the trackball to be clicked in the sagittal image area 62 as illustrated at block 514. Next, computer 40 determines whether the trackball cursor coordinates are within five pixels of the top point 64 as illustrated at block 516. If not, computer 40 determines whether the trackball coordinates are within five pixels of the bottom point 66 as illustrated at block 518. If not, computer 40 returns at block 520 to wait for the next command.
If the trackball coordinates are within five pixels of the top point 64 at block 516, computer 40 determines whether the trackball has been clicked again at block 522. If so, computer 40 returns at block 524 to wait for the next command. If not, computer 40 updates the position of the virtual guidewire 68 by moving it along the line of sight in the same direction as the trackball movements. The incrementing ratio is 0.1 mm/pixel. This step is illustrated at block 526. The computer uses the equations set forth in section [9] of the Appendix to update the virtual guidewire position, as sketched below.
Computer 40 then redraws the projected guidewire 68 in the sagittal image area 62 and also redraws the projected guidewire 92 in the A/P image area 86 as illustrated at block 528, using the equations set forth in section [7] of the Appendix. Computer 40 then returns to block 522.
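A sketch of this depth update, following the interpolation form of section [9] of the Appendix; the endpoint names and function signature are assumptions:

```python
import numpy as np

MM_PER_PIXEL = 0.1  # from the text: 0.1 mm of depth per pixel of motion

def slide_along_line_of_sight(ls_p1, ls_p2, depth, d_pixels):
    """Move the virtual guidewire's top point along the line of sight
    bound by ls_p1 and ls_p2, as in section [9] of the Appendix."""
    depth = depth + MM_PER_PIXEL * d_pixels
    p1, p2 = np.asarray(ls_p1, float), np.asarray(ls_p2, float)
    top = p1 - depth * (p2 - p1)  # interpolation form used in [9]
    return depth, top
```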
If the trackball coordinates are within five pixels of the bottom point 66 at block 518, computer 40 determines whether the trackball has been clicked again at block 530. If so, computer 40 returns at block 524 to wait for the next command. If not, computer 40 assumes that the operator wants to adjust the position of bottom point 66.
Therefore, computer 40 updates the sagittal and/or transverse angle as illustrated at block 532 based on movement of the trackball. The transverse angle value is incremented if the trackball is being moved up and decreased if the trackball is moving down. The sagittal angle value is incremented if the trackball is being moved to the right and decreased if the trackball is moving to the left. The incrementing ratio is 0.1°/pixel. Computer 40 uses the equations of section [10] of the Appendix for these steps as illustrated at block 532. Next, computer 40 redraws the projected guidewire 68 in the sagittal image area 62 and the projected guidewire 92 in the A/P image area 86 as illustrated at block 534, using the equations set forth in section [7] of the Appendix. Computer 40 then returns to block 530.
If the Robot Control area 84 of Fig. 3a-b is selected, computer 40 advances to block 536 of Fig. 18. Computer 40 then displays a menu giving the user options at block 538. The first option is an "Initialize Robot" option.
Computer 40 determines whether the Initialize Robot menu item was selected at block 540.
If so, computer 40 opens the serial communication port 52 for communication with the robot controller 53 as illustrated at block 542. Computer 40 sends the VAL program language commands required to initialize the robot controller 53 as illustrated at block 544.
Computer 40 determines whether VAL was initialized properly at block 546. If VAL was not initialized properly, computer 40 generates a message of "VAL Not Initialized" as illustrated at block 548. Computer 40 then returns at block 550.
If VAL was properly initialized at block 546, computer 40 transmits the preestablished HOME and START positions to the robot controller 53 as illustrated at block 552. The HOME and START positions are two positions in the work space of the robot. In addition, computer 40 initializes the preestablished NULL end-effector and SURGICAL end-effector as illustrated at block 554. In other words, computer 40 sends specifications of the precise configuration of the specific surgical instrument that is going to be used. The controller 53 is thereby programmed to move the robot to these positions. During operation, computer 40 can instruct the controller 53 to move to the particular HOME or START position. In addition, controller 53 will recognize instructions for the particular surgical end-effector which was initialized during step 554.
Next, the robot speed is set to a very slow speed as illustrated at block 556. For example, the robot speed is set to 5 out of 256. Next, computer 40 checks whether the virtual guidewire has been planned; if it has, computer 40 enables and brightens the "Move Robot" button 104 and the "Move Robot Along Tool Axis" button 82 and the F11 and F12 keys, as illustrated at block 557.5. Computer 40 then returns to wait for the next instruction as illustrated at block 559.
If the virtual guidewire has not been planned, computer 40 returns to wait for the next instruction as illustrated at block 558.
If an option entitled "Move to a Predefined Location" was selected from the pop-up menu 538 and the robot was already initialized as illustrated at block 560, computer 40 displays a dialog box with options to move the robot to the predefined locations as illustrated at block 562. In other words, a dialog box with options to move the robot to the HOME position or the START position is displayed. The operator can select one of these options at block 562. Computer 40 then sends a VAL command to controller 53 to move the robot 18 to the specified location as illustrated at block 564. Computer 40 then returns at block 568 to wait for the next command.
If computer 40 determines that the option "Assigned Predefined Tool" was selected from the menu 538 and the robot has already been initialized as illustrated at block 570, computer 40 displays a dialog box with options to assign the predefined tools established during the initialization step at block 554. This step is illustrated at block 574. In other words, computer 40 displays a dialog box for assigning either the NULL end-effector or the SURGICAL end-effector at block 574. Once the desired tool is selected, computer 40 transmits to controller 53 the VAL command to assign the specified end-effector as illustrated at block 576. Computer 40 then returns to wait for the next command at block 578. If the "Assigned Predefined Tool" item was not selected or the robot was not initialized at block 570, computer 40 returns at block 572 to wait for the next command.
Although the invention has been described in detail with reference to a certain preferred embodiment, variations and modifications exist within the scope and spirit of the present invention as described and defined in the following claims.

APPENDIX

Notation used throughout the flowchart:

WCS        World Coordinate System
CCS        C-arm Coordinate System
(x, y, z)  Used for 3D coordinates in the WCS and the CCS.
(X, Y)     Used for calibrated image coordinates.
(u, v)     Used for real image coordinates.
\(\alpha\)   Sagittal Angle.
\(\beta\)    Transverse Angle.
\(\gamma\)   Approach Angle.

Subscripts:

w = WCS, c = CCS       Specify the coordinate system; only used with 3D coordinates.
t = top, b = bottom    Specify a point on the virtual guidewire.
a = A/P, s = Sagittal  Specify to which image the information pertains.

[1] J. Canny; "A Computational Approach to Edge Detection"; IEEE Transactions on Pattern Analysis and Machine Intelligence; Vol. 8, Nov. 1986, pp. 679-698.
[2] Mathematics involved in performing the Levenberg-Marquardt optimization method:

\[
R = \begin{bmatrix}
\cos\phi\cos\theta & \cos\phi\sin\theta\sin\psi - \sin\phi\cos\psi & \cos\phi\sin\theta\cos\psi + \sin\phi\sin\psi \\
\sin\phi\cos\theta & \sin\phi\sin\theta\sin\psi + \cos\phi\cos\psi & \sin\phi\sin\theta\cos\psi - \cos\phi\sin\psi \\
-\sin\theta & \cos\theta\sin\psi & \cos\theta\cos\psi
\end{bmatrix}
\]

\[
u_c(x_i, a) = \frac{R_{11}x + R_{12}y + R_{13}z + t_x}{R_{31}x + R_{32}y + R_{33}z + t_z}\,f
\qquad
v_c(x_i, a) = \frac{R_{21}x + R_{22}y + R_{23}z + t_y}{R_{31}x + R_{32}y + R_{33}z + t_z}\,f
\]

The quantity minimized is

\[
\sum_i \left( (u_i - u_c(x_i, a))^2 + (v_i - v_c(x_i, a))^2 \right)
\]

where \(x_i = [x, y, z]_i\) are the 3D coordinates of the fiducials, \((u_i, v_i)\) are the 2D coordinates of the centers of the fiducials, and \(a = [\phi, \theta, \psi, t_x, t_y, t_z]\) are the six parameters that define a six degree-of-freedom pose.
[3] Once the fit has been performed, I construct the homogeneous transformation matrix that corresponds to the optimized parameters \(a = [\phi, \theta, \psi, t_x, t_y, t_z]\); this is the registration matrix for the A/P image, denoted \([M_a]\) below:

\[
[M_a] = \begin{bmatrix}
\cos\phi\cos\theta & \cos\phi\sin\theta\sin\psi - \sin\phi\cos\psi & \cos\phi\sin\theta\cos\psi + \sin\phi\sin\psi & p_x \\
\sin\phi\cos\theta & \sin\phi\sin\theta\sin\psi + \cos\phi\cos\psi & \sin\phi\sin\theta\cos\psi - \cos\phi\sin\psi & p_y \\
-\sin\theta & \cos\theta\sin\psi & \cos\theta\cos\psi & p_z \\
0 & 0 & 0 & 1
\end{bmatrix}
\]

[4] Once the fit has been performed for the sagittal image, I construct the homogeneous transformation matrix that corresponds to its optimized parameters \(a = [\phi, \theta, \psi, t_x, t_y, t_z]\) in the same way; this is the sagittal registration matrix, denoted \([M_s]\):

\[
[M_s] = \begin{bmatrix}
\cos\phi\cos\theta & \cos\phi\sin\theta\sin\psi - \sin\phi\cos\psi & \cos\phi\sin\theta\cos\psi + \sin\phi\sin\psi & p_x \\
\sin\phi\cos\theta & \sin\phi\sin\theta\sin\psi + \cos\phi\cos\psi & \sin\phi\sin\theta\cos\psi - \cos\phi\sin\psi & p_y \\
-\sin\theta & \cos\theta\sin\psi & \cos\theta\cos\psi & p_z \\
0 & 0 & 0 & 1
\end{bmatrix}
\]

[5] The line of sight is calculated in the following way:
The line of sight is bound by (0, 0, 0) and \((u_c, v_c, f)\) in the CCS.
Note: \((u_c, v_c)\) is the calibrated equivalent of \((u, v)\); see [13].

In homogeneous coordinates, the two endpoints are mapped into the WCS through the inverse of the A/P registration matrix:

\[
\begin{bmatrix}
LSx_{w1} & LSx_{w2} \\
LSy_{w1} & LSy_{w2} \\
LSz_{w1} & LSz_{w2} \\
1 & 1
\end{bmatrix}
= [M_a]^{-1}
\begin{bmatrix}
u_c & 0 \\
v_c & 0 \\
f & 0 \\
1 & 1
\end{bmatrix}
\]

and then into the C-arm coordinate system of the sagittal image:

\[
\begin{bmatrix}
x_{cs1} & x_{cs2} \\
y_{cs1} & y_{cs2} \\
z_{cs1} & z_{cs2} \\
1 & 1
\end{bmatrix}
= [M_s]
\begin{bmatrix}
LSx_{w1} & LSx_{w2} \\
LSy_{w1} & LSy_{w2} \\
LSz_{w1} & LSz_{w2} \\
1 & 1
\end{bmatrix}
\]

\[
u_1 = \frac{x_{cs1}}{z_{cs1}}\,f, \quad v_1 = \frac{y_{cs1}}{z_{cs1}}\,f, \qquad
u_2 = \frac{x_{cs2}}{z_{cs2}}\,f, \quad v_2 = \frac{y_{cs2}}{z_{cs2}}\,f
\]

Due to the inherent distortion in the fluoroscopic images, the line of sight is drawn as a curve on the image. This is done by un-calibrating 50 points on the line bound by \((u_1, v_1)\) and \((u_2, v_2)\) as in [15] and drawing a polyline through them.
[6] Recall that the virtual guidewire is a 3D object bound by the top point \((0, 0, 0)_{wt}\) and the bottom point \((0, 0, -screwlength)_{wb}\).

\[
\beta = \beta + 0.1 \times (\#\ \text{pixels moved by the trackball})
\]

\[
\begin{bmatrix}
Vx_{wt} & Vx_{wb} \\
Vy_{wt} & Vy_{wb} \\
Vz_{wt} & Vz_{wb} \\
1 & 1
\end{bmatrix}
= [T(\alpha, \beta, t_x, t_y, t_z)]
\begin{bmatrix}
0 & 0 \\
0 & 0 \\
0 & -screwlength \\
1 & 1
\end{bmatrix}
\]

[7] With \((Vx_{wt}, Vy_{wt}, Vz_{wt})\) and \((Vx_{wb}, Vy_{wb}, Vz_{wb})\), the virtual guidewire's projection is drawn on both the A/P and sagittal images using the following equations:
\[
\begin{bmatrix}
x_{cat} & x_{cab} \\
y_{cat} & y_{cab} \\
z_{cat} & z_{cab} \\
1 & 1
\end{bmatrix}
= [M_a]
\begin{bmatrix}
Vx_{wt} & Vx_{wb} \\
Vy_{wt} & Vy_{wb} \\
Vz_{wt} & Vz_{wb} \\
1 & 1
\end{bmatrix}
\qquad
\begin{bmatrix}
x_{cst} & x_{csb} \\
y_{cst} & y_{csb} \\
z_{cst} & z_{csb} \\
1 & 1
\end{bmatrix}
= [M_s]
\begin{bmatrix}
Vx_{wt} & Vx_{wb} \\
Vy_{wt} & Vy_{wb} \\
Vz_{wt} & Vz_{wb} \\
1 & 1
\end{bmatrix}
\]

\[
u_{at} = \frac{x_{cat}}{z_{cat}}\,f, \quad v_{at} = \frac{y_{cat}}{z_{cat}}\,f, \qquad
u_{ab} = \frac{x_{cab}}{z_{cab}}\,f, \quad v_{ab} = \frac{y_{cab}}{z_{cab}}\,f
\]

\[
u_{st} = \frac{x_{cst}}{z_{cst}}\,f, \quad v_{st} = \frac{y_{cst}}{z_{cst}}\,f, \qquad
u_{sb} = \frac{x_{csb}}{z_{csb}}\,f, \quad v_{sb} = \frac{y_{csb}}{z_{csb}}\,f
\]

Due to the distortion in fluoroscopic images, the projected guidewire is drawn as a curve. This is done by un-calibrating 20 points on the line bound by \((u_{at}, v_{at})\) and \((u_{ab}, v_{ab})\) as in [15] and drawing a polyline through them on the A/P image, and similarly for the sagittal image using \((u_{st}, v_{st})\) and \((u_{sb}, v_{sb})\).
[8] To draw the virtual guidewire's projection, two points, (0, 0, 0) and (0, 0, -screwlength), in the WCS are transformed so that the top point (0, 0, 0) lies on the line of sight. The virtual guidewire length is initially set to 30 mm.

The projected guidewire is drawn using the following math. Initially:

\[
depth = 0.2, \qquad screwlength = 30\ \text{mm}, \qquad \alpha = 0, \quad \beta = 0
\]

\((t_x, t_y, t_z)\) is constrained to lie on the line of sight bound by \((LSx_{w1}, LSy_{w1}, LSz_{w1})\) and \((LSx_{w2}, LSy_{w2}, LSz_{w2})\), thus:

\[
t_x = LSx_{w1} - depth \times (LSx_{w2} - LSx_{w1})
\]
\[
t_y = LSy_{w1} - depth \times (LSy_{w2} - LSy_{w1})
\]
\[
t_z = LSz_{w1} - depth \times (LSz_{w2} - LSz_{w1})
\]

\[
\begin{bmatrix}
Vx_{wt} & Vx_{wb} \\
Vy_{wt} & Vy_{wb} \\
Vz_{wt} & Vz_{wb} \\
1 & 1
\end{bmatrix}
= [T(\alpha, \beta, t_x, t_y, t_z)]
\begin{bmatrix}
0 & 0 \\
0 & 0 \\
0 & -screwlength \\
1 & 1
\end{bmatrix}
\]

T is composed of the following transformations:

\[
T = \text{Trans}(t_x, t_y, t_z)\,\text{Rot}(y, \alpha)\,\text{Rot}(x, \beta)
\]

or

\[
[T(\alpha, \beta, t_x, t_y, t_z)] =
\begin{bmatrix}
\cos\alpha & \sin\alpha\sin\beta & \sin\alpha\cos\beta & t_x \\
0 & \cos\beta & -\sin\beta & t_y \\
-\sin\alpha & \cos\alpha\sin\beta & \cos\alpha\cos\beta & t_z \\
0 & 0 & 0 & 1
\end{bmatrix}
\]

In order to draw the projected guidewire on the images, the points \((Vx_{wt}, Vy_{wt}, Vz_{wt})\) and \((Vx_{wb}, Vy_{wb}, Vz_{wb})\) are used in conjunction with [7].
[9] Recall that the virtual guidewire is a 3D object bound by \((0, 0, 0)_{wt}\) and \((0, 0, -screwlength)_{wb}\).

\[
depth = depth + 0.1 \times (\#\ \text{pixels moved by the trackball})
\]
\[
t_x = LSx_{w1} - depth \times (LSx_{w2} - LSx_{w1})
\]
\[
t_y = LSy_{w1} - depth \times (LSy_{w2} - LSy_{w1})
\]
\[
t_z = LSz_{w1} - depth \times (LSz_{w2} - LSz_{w1})
\]

The guidewire endpoints are then recomputed with \([T(\alpha, \beta, t_x, t_y, t_z)]\) exactly as in [6].

[10] Recall that the virtual guidewire is a 3D object bound by \((0, 0, 0)_{wt}\) and \((0, 0, -screwlength)_{wb}\).

\[
\alpha = \alpha + 0.1 \times (\#\ \text{pixels moved by the trackball})
\]

The guidewire endpoints are then recomputed with \([T(\alpha, \beta, t_x, t_y, t_z)]\) exactly as in [6].

[11] Recall that the virtual guidewire is a 3D object bound by \((0, 0, 0)_{wt}\) and \((0, 0, -screwlength)_{wb}\). With the new screw length, the endpoints are again transformed by \([T(\alpha, \beta, t_x, t_y, t_z)]\) as in [6].
[12] Given

\[
[Tool] = [\text{Rot}(z, -90)]\,[\text{Rot}(y, -90)]
\]
\[
[Plan] = [\text{Rot}(y, \alpha)]\,[\text{Rot}(x, \beta)]\,[Tool]
\]
\[
[Approach] = [\text{Rot}(z, \gamma)]
\]

and a final plan whose first column is specified,

\[
[FinalPlan] = \begin{bmatrix} FP_{Nx} & \cdot & \cdot \\ FP_{Ny} & \cdot & \cdot \\ FP_{Nz} & \cdot & \cdot \end{bmatrix},
\]

I use the following constraints to determine the remaining two vectors that complete [FP]. These matrices are all of the form

\[
\begin{bmatrix} N_x & O_x & A_x \\ N_y & O_y & A_y \\ N_z & O_z & A_z \end{bmatrix}.
\]

Note: the first vector (N) is maintained from [Plan] since it is the drill guide axis.

Constraints:

1) \(FP_{Ax}^2 + FP_{Ay}^2 + FP_{Az}^2 = 1\)
2) \(A_N \cdot FP_A = 0\)
3) \(FP_N \cdot FP_A = 0\)

\[
D = -\frac{A_{Ny}\,FP_{Nz} - FP_{Ny}\,A_{Nz}}{FP_{Nx}\,A_{Ny} - A_{Nx}\,FP_{Ny}}
\qquad
E = \frac{A_{Nx}\,FP_{Nz} - FP_{Nx}\,A_{Nz}}{FP_{Nx}\,A_{Ny} - A_{Nx}\,FP_{Ny}}
\]

\[
FP_{Az} = \frac{1}{\sqrt{D^2 + E^2 + 1}}, \qquad FP_{Ax} = D\,FP_{Az}, \qquad FP_{Ay} = E\,FP_{Az}
\]

\(FP_O\) is determined using \(FP_O = FP_N \times FP_A\). Hence,

\[
[FinalPlan] = \begin{bmatrix} FP_{Nx} & FP_{Ox} & FP_{Ax} \\ FP_{Ny} & FP_{Oy} & FP_{Ay} \\ FP_{Nz} & FP_{Oz} & FP_{Az} \end{bmatrix}.
\]

Since the PUMA-560 robot uses an Euler representation for specifying an orientation (Euler representation = \(\text{Rot}(z, \phi)\,\text{Rot}(y, \theta)\,\text{Rot}(z, \psi)\)), the inverse solution of [FP] is determined in the following manner, following the reference cited at the end of this Appendix:

\[
\phi = \arctan(FP_{Ay},\; FP_{Ax})
\]
\[
\theta = \arctan(FP_{Ax}\cos\phi + FP_{Ay}\sin\phi,\; FP_{Az})
\]
\[
\psi = \arctan(-FP_{Nx}\sin\phi + FP_{Ny}\cos\phi,\; -FP_{Ox}\sin\phi + FP_{Oy}\cos\phi)
\]

Adding a PUMA-specific offset to \(\phi\) and \(\theta\), the final position and orientation is established:

\[
\text{Final pose} = (\phi + 90,\; \theta - 90,\; \psi,\; t_x,\; t_y,\; t_z)
\]
[13] The calibrated coordinates \((x, y)\) of the edge-pixels \((u, v)\) are determined using a quartic polynomial equation as follows:

\[
x = a_0 u^4 v^4 + a_1 u^4 v^3 + a_2 u^4 v^2 + \ldots + a_{23}\,u v + a_{24}
\]
\[
y = b_0 u^4 v^4 + b_1 u^4 v^3 + b_2 u^4 v^2 + \ldots + b_{23}\,u v + b_{24}
\]

The sets of parameters a and b are previously determined using the image calibration program.
[14] The center of the fiducial shadow is found by fitting the equation of a circle to the edge-pixels using a pseudo-inverse approach:

\[
\begin{bmatrix} x_0^2 + y_0^2 \\ \vdots \\ x_n^2 + y_n^2 \end{bmatrix}
= \begin{bmatrix} x_0 & y_0 & 1 \\ \vdots & \vdots & \vdots \\ x_n & y_n & 1 \end{bmatrix}
\begin{bmatrix} 2h \\ 2k \\ r^2 - h^2 - k^2 \end{bmatrix}
\]

or \(A = B P\). Using the pseudo-inverse,

\[
P = (B^T B)^{-1} B^T A
\]

Once P is established, the center of the fiducial \((h, k)\) is determined as follows:

\[
h = \frac{P_1}{2}, \qquad k = \frac{P_2}{2}
\]
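The circle fit of [14] can be written compactly with a least-squares solve, which is the numerically stable equivalent of forming the pseudo-inverse explicitly; a minimal sketch:

```python
import numpy as np

def fiducial_center(edge_pixels):
    """Fit x^2 + y^2 = 2hx + 2ky + (r^2 - h^2 - k^2) to the edge pixels
    of one fiducial shadow and return the center (h, k), as in [14]."""
    pts = np.asarray(edge_pixels, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    B = np.column_stack([x, y, np.ones_like(x)])
    A = x**2 + y**2
    # lstsq solves B P = A in the least-squares sense,
    # equivalent to P = (B^T B)^-1 B^T A
    P, *_ = np.linalg.lstsq(B, A, rcond=None)
    return P[0] / 2.0, P[1] / 2.0
```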
[15] The un-calibrated (distorted) coordinates \((u, v)\) corresponding to the calibrated coordinates \((x, y)\) are determined using a quartic polynomial equation as follows:

\[
u = a_0 x^4 y^4 + a_1 x^4 y^3 + a_2 x^4 y^2 + \ldots + a_{23}\,x y + a_{24}
\]
\[
v = b_0 x^4 y^4 + b_1 x^4 y^3 + b_2 x^4 y^2 + \ldots + b_{23}\,x y + b_{24}
\]

The sets of parameters a and b are previously determined using a separate calibration program.
1. Robot Manipulators: Mathematics, Programming and Control; Richard P. Paul; The MIT Press, Cambridge, Massachusetts and London, England, 1983.

Claims (54)

Claims:
1. A computer-aided method for planning a surgical procedure comprising:
registering to a known coordinate frame a first two-dimensional, fluoroscopic image of a body's anatomy taken at a first observation angle;
displaying the first image; and drawing in the displayed first image a representation of at least one of a trajectory, position, and orientation of a surgical device to be placed in the body based on the registration of the first image with the known coordinate frame.
2. The method of claim 1 wherein drawing in the displayed first image the representation of the at least one of trajectory, position, and orientation of the surgical device is in response to a user indicating at least one positioning parameter for the surgical device, the at least one positioning parameter defined in reference to the known coordinate frame.
3. The method of claim 2 wherein the at least one positioning parameter includes an approach angle of the surgical device.
4. The method of claim 2 wherein the at least one positioning parameter is defined in reference to the first image.
5. The method of claim 4 wherein the at least one positioning parameter includes a point in the body.
6. The method of claim 4 wherein the user indicates the at least one positioning parameter by positioning a cursor displayed within the first image.
7. The method of claim 1 wherein a user indicates at least one parameter defining the at least one of trajectory, position, and orientation of the surgical device.
8. The method of claim 7 wherein the at least one parameter includes a length of the surgical device.
9. The method of claim 1 further comprising:
registering to the known coordinate frame a second two-dimensional, fluoroscopic image of the body's anatomy taken at a second observation angle;
displaying the second image; and drawing in the displayed second image the representation of the at least one of trajectory, position, and orientation of the surgical device based on the registration of the second image with the known coordinate frame.
10. The method of claim 9 wherein drawing the representation of the at least one of trajectory, position, and orientation of the surgical device in the second image is in response to a user indicating on the displayed first image a change in position of the representation of the at least one of trajectory, position, and orientation of the surgical device in the first image.
11. The method of claim 1 wherein the representation of the at least one of trajectory, position, and orientation of the surgical device is a projection of a virtual guidewire defining, at least in part, a trajectory of insertion of the surgical device into the body.
12. The method of claim 1 wherein the representation of the at least one of trajectory, position, and orientation of the surgical device is a projection of a virtual guidewire having a length corresponding to a length of the surgical device to be inserted into the body.
13. The method of claim 1 further comprising transmitting to a positioning mechanism coordinates for indicating the position of the surgical device represented in the first image.
14. The method of claim 13 further comprising manipulating the positioning mechanism such that a guide coupled to the positioning mechanism is substantially aligned with the representation of the at least one of trajectory, position, and orientation of the surgical device in the image.
15. The method of claim 1 further comprising displaying information for indicating the position of the surgical device represented in the first image.
16. A computer readable storage medium encoded with instructions, which, when read by a computer, enable a computer to undertake a process comprising:
registering to a known coordinate frame a first two-dimensional, fluoroscopic image of a body's anatomy taken at a first observation angle;
displaying the first image; and drawing in the displayed first image a representation of at least one of a trajectory, position, and orientation of a surgical device to be placed in the body based on the registration of the first image with the known coordinate frame.
17. The computer readable storage medium of claim 16 wherein drawing in the displayed first image the representation of the at least one of trajectory, position, and orientation of the surgical device is in response to a user indicating at least one positioning parameter for the surgical device.
18. The computer readable storage medium of claim 17 wherein the at least one positioning parameter for the surgical device is defined in reference to the known coordinate frame.
19. The computer readable storage medium of claim 18 wherein the indication of the at least one positioning parameter is a reference on the displayed first image controlled by a user.
20. The computer readable storage medium of claim 16 wherein the process further comprises:
registering to the known coordinate frame a second two-dimensional, fluoroscopic image of the body's anatomy taken at a second observation angle;
displaying the second image; and drawing in the displayed second image the representation of the at least one of trajectory, position, and orientation of the surgical device based on the registration of the second image with the known coordinate frame.
21. The computer readable storage medium of claim 20 wherein drawing the representation of the at least one of trajectory, position, and orientation of the surgical device in the second image is in response to an input received from a user indicating a position of the surgical device.
22. The computer readable storage medium of claim 20 wherein drawing the representation of the at least one of trajectory, position, and orientation of the surgical device in the second image is in response to an input received from a user indicating a position of the representation of the at least one of trajectory, position, and orientation of the surgical device in the displayed first image.
23. The computer readable storage medium of claim 20 wherein drawing the representation of the at least one of trajectory, position, and orientation of the surgical device in the second image is in response to an input indicating a change in position of the representation of the at least one of trajectory, position, and orientation of the surgical device in the first image.
24. The computer readable storage medium of claim 20 wherein registering to the known coordinate frame the first image and the second image includes registering known coordinates of a plurality of fiducials within the reference frame with positions of the plurality of fiducials in the first and second images.
25. A computer-aided method for planning a surgical procedure comprising:
registering a first two-dimensional, fluoroscopic image of a body's anatomy taken at a first observation angle with a second two-dimensional fluoroscopic image of the body's anatomy taken at a second observation angle;
displaying the first image;
drawing within the displayed first image a representation of at least one of a trajectory, position, and orientation of a surgical device to be placed in the body based on an input indicating a position of the surgical device;
displaying the second image; and drawing in the displayed second image the representation of the at least one of trajectory, position, and orientation of the surgical device.
26. The method of claim 25 wherein drawing the representation of the at least one of trajectory, position, and orientation of the surgical device in the second image is based, at least in part, on positioning in the displayed first image of the representation of the at least one of trajectory, position, and orientation of the surgical device in the first image.
27. The method of claim 25 wherein drawing in the first image and drawing in the second image the representation of the at least one of trajectory, position, and orientation of the surgical device is at least in part in response to a user indicating at least one positioning parameter for the surgical device.
28. The method of claim 27 wherein the at least one positioning parameter for the surgical device is defined in reference to a known coordinate frame to which the first and the second images are registered.
29. The method of claim 27 wherein the at least one positioning parameter includes an approach angle of the surgical device.
30. The method of claim 27 wherein the at least one positioning parameter includes a point in the body.
31. The method of claim 27 wherein the user indicates the at least one positioning parameter by positioning a reference displayed within the first or second images.
32. The method of claim 25 wherein a user indicates at least one parameter defining the at least one of trajectory, position, and orientation of the surgical device.
33. The method of claim 25 further comprising transmitting to a positioning mechanism coordinates for indicating the position of the representation of the at least one of trajectory, position, and orientation of the surgical device in the first image.
34. The method of claim 33 further comprising manipulating the positioning mechanism such that a guide coupled to the positioning mechanism is substantially aligned with the representation of the at least one of trajectory, position, and orientation of the surgical device in the image.
35. The method of claim 25 further comprising displaying information for indicating the position within a known coordinate frame of reference for the surgical device for use in manually positioning a guide.
36. The method of claim 25 wherein registering the first and second images includes registering a plurality of fiducials having known coordinates within a known coordinate frame of reference with images of the plurality of fiducials within the respective first and second images.
37. A computer readable storage medium encoded with instructions, which, when read by a computer, enable a computer to undertake a process comprising:
receiving a first two-dimensional, fluoroscopic image taken of a patient's body and a plurality of radio-opaque fiducials placed adjacent the body at known positions; and registering the fluoroscopic image by optimizing parameters of a known geometric model such that projections of the plurality of fiducials into the first image best fit positions of the plurality of fiducials in the image.
38. The computer readable storage medium of claim 37 wherein the process further comprises:
receiving a second, two-dimensional fluoroscopic image taken of the patient's body and the plurality of fiducials from a position different from the first fluoroscopic image; and registering the second fluoroscopic image by optimizing parameters of the known geometric model such that projections of the plurality of fiducials into the second image best fit positions of the plurality of fiducials in the second image.
39. The computer readable storage medium of claim 38 wherein the process further comprises:
receiving input indicating a point on one of the first and second images, wherein the point corresponds to a point of a virtual object;
receiving input indicating at least one of a position, length, and orientation of the virtual object; and drawing on the first image a first projection of the virtual object and drawing on the second image a second projection of the virtual object.
40. The computer readable storage medium of claim 39 further comprising:

receiving input indicating a change to the at least one of position, length, and orientation of the virtual object; and redrawing the first projection on the first image and the second projection on the second image based on the change to the at least one of position, length, and orientation of the virtual object.
41. The computer readable storage medium of claim 39 wherein the virtual object is a representation of at least one of a trajectory, position, and orientation of a surgical device and the first and second projections are also representations of the at least one of trajectory, position, and orientation of the surgical device.
42. The computer readable storage medium of claim 38 wherein the process further comprises:
receiving an input indicating a position of a virtual object within the body;
and drawing on the first and the second images a projection of the virtual object in the indicated position.
43. The computer readable storage medium of claim 42 wherein the process further comprises:
receiving an input indicating a change in the position of the virtual object to a second position; and drawing on the first and the second images the projection of the virtual object in the second position.
44. The computer readable storage medium of claim 37 wherein registering the fluoroscopic image further comprises:
displaying the fluoroscopic image; and receiving an input from a user indicating on the fluoroscopic image the position of each of the plurality of fiducials within the image.
45. The computer readable storage medium of claim 37 wherein the process further comprises linearizing the fluoroscopic image before registering the image.
46. A method comprising:
receiving a first two-dimensional, fluoroscopic image taken of a patient's body and a plurality of radio-opaque fiducials placed adjacent the body at known positions; and registering the fluoroscopic image by optimizing parameters of a known geometric model such that projections of the plurality of fiducials into the first image best fit positions of the plurality of fiducials in the image.
47. The method of claim 46 further comprising:
receiving a second, two-dimensional fluoroscopic image taken of the patient's body and the plurality of fiducials from a position different from the first fluoroscopic image; and registering the second fluoroscopic image by optimizing parameters of the known geometric model such that projections of the plurality of fiducials into the second image best fit positions of the plurality of fiducials in the second image.
48. The method of claim 47 further comprising:
receiving input indicating on one of the first and second images a trajectory of a surgical instrument with respect to the body; and drawing on the other of the first and second images a corresponding representation of the trajectory projected into said other of the first and second images.
49. The method of claim 48 further comprising:
receiving input indicating a change to a position or orientation of the trajectory within said one of the first and second image; and redrawing within said other of the first and second images the corresponding representation of the trajectory based on the change in the position or orientation.
50. The method of claim 48 wherein the trajectory of the surgical instrument is represented by a virtual object.
51. The method of claim 47 further comprising:
receiving an input indicating a position of a trajectory of a virtual object within the body; and drawing on the first and the second images a representation of the trajectory in the indicated position.
52. The method of claim 51 further comprising:
receiving an input indicating a change in the position of the trajectory to a second position; and drawing in the first and the second images the representation of the trajectory in the second position.
53. The method of claim 46 further comprising:
displaying the fluoroscopic image; and receiving an input from a user indicating on the fluoroscopic image the position of each of the plurality of fiducials within the image.
54. The method of claim 46 further comprising linearizing the fluoroscopic image before registering the image.
CA002255041A 1996-05-15 1997-05-14 Stereotactic surgical procedure apparatus and method Expired - Lifetime CA2255041C (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US64831396A 1996-05-15 1996-05-15
US08/648,313 1996-05-15
US08/649,798 US5799055A (en) 1996-05-15 1996-05-17 Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US08/649,798 1996-05-17
PCT/US1997/008128 WO1997042898A1 (en) 1996-05-15 1997-05-14 Stereotactic surgical procedure apparatus and method

Publications (2)

Publication Number Publication Date
CA2255041A1 CA2255041A1 (en) 1997-11-20
CA2255041C true CA2255041C (en) 2006-11-21


Families Citing this family (408)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2652928B1 (en) 1989-10-05 1994-07-29 Diadix Sa INTERACTIVE LOCAL INTERVENTION SYSTEM WITHIN A AREA OF A NON-HOMOGENEOUS STRUCTURE.
US5603318A (en) 1992-04-21 1997-02-18 University Of Utah Research Foundation Apparatus and method for photogrammetric surgical localization
US5913820A (en) 1992-08-14 1999-06-22 British Telecommunications Public Limited Company Position location system
US6695848B2 (en) 1994-09-02 2004-02-24 Hudson Surgical Design, Inc. Methods for femoral and tibial resection
US8603095B2 (en) 1994-09-02 2013-12-10 Puget Bio Ventures LLC Apparatuses for femoral and tibial resection
EP0951874A3 (en) 1994-09-15 2000-06-14 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications using a reference unit secured to a patients head
JP3492697B2 (en) * 1994-10-07 2004-02-03 セントルイス ユニバーシティー Surgical guidance device with reference and localization frame
US5592939A (en) 1995-06-14 1997-01-14 Martinelli; Michael A. Method and system for navigating a catheter probe
US5799055A (en) * 1996-05-15 1998-08-25 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
USRE40176E1 (en) * 1996-05-15 2008-03-25 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US6226418B1 (en) 1997-11-07 2001-05-01 Washington University Rapid convolution based large deformation image matching via landmark and volume imagery
US6611630B1 (en) 1996-07-10 2003-08-26 Washington University Method and apparatus for automatic shape characterization
US6009212A (en) 1996-07-10 1999-12-28 Washington University Method and apparatus for image registration
US6408107B1 (en) 1996-07-10 2002-06-18 Michael I. Miller Rapid convolution based large deformation image matching via landmark and volume imagery
US6314310B1 (en) * 1997-02-14 2001-11-06 Biosense, Inc. X-ray guided surgical location system with extended mapping volume
US6708184B2 (en) 1997-04-11 2004-03-16 Medtronic/Surgical Navigation Technologies Method and apparatus for producing and accessing composite data using a device having a distributed communication controller interface
US5970499A (en) 1997-04-11 1999-10-19 Smith; Kurt R. Method and apparatus for producing and accessing composite data
US6226548B1 (en) 1997-09-24 2001-05-01 Surgical Navigation Technologies, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
US6021343A (en) 1997-11-20 2000-02-01 Surgical Navigation Technologies Image guided awl/tap/screwdriver
US20030135115A1 (en) * 1997-11-24 2003-07-17 Burdette Everette C. Method and apparatus for spatial registration and mapping of a biopsy needle during a tissue biopsy
US6149592A (en) * 1997-11-26 2000-11-21 Picker International, Inc. Integrated fluoroscopic projection image data, volumetric image data, and surgical device position data
US6064904A (en) * 1997-11-28 2000-05-16 Picker International, Inc. Frameless stereotactic CT scanner with virtual needle display for planning image guided interventional procedures
US6348058B1 (en) 1997-12-12 2002-02-19 Surgical Navigation Technologies, Inc. Image guided spinal surgery guide, system, and method for use thereof
US8303576B2 (en) 1998-02-24 2012-11-06 Hansen Medical, Inc. Interchangeable surgical instrument
US7758569B2 (en) * 1998-02-24 2010-07-20 Hansen Medical, Inc. Interchangeable surgical instrument
US7901399B2 (en) 1998-02-24 2011-03-08 Hansen Medical, Inc. Interchangeable surgical instrument
ES2304794T3 (en) 1998-06-22 2008-10-16 Ao Technology Ag PAREO OF LOCATION THROUGH LOCALIZATION SCREWS.
US6118845A (en) 1998-06-29 2000-09-12 Surgical Navigation Technologies, Inc. System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers
US20040030244A1 (en) * 1999-08-06 2004-02-12 Garibaldi Jeffrey M. Method and apparatus for magnetically controlling catheters in body lumens and cavities
US6477400B1 (en) * 1998-08-20 2002-11-05 Sofamor Danek Holdings, Inc. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US6482182B1 (en) 1998-09-03 2002-11-19 Surgical Navigation Technologies, Inc. Anchoring system for a brain lead
WO2000021442A1 (en) 1998-10-09 2000-04-20 Surgical Navigation Technologies, Inc. Image guided vertebral distractor
DE19848765C2 (en) 1998-10-22 2000-12-21 Brainlab Med Computersyst Gmbh Position verification in camera images
US6214018B1 (en) * 1998-11-04 2001-04-10 Trex Medical Corporation Method and apparatus for removing tissue from a region of interest using stereotactic radiographic guidance
US6430434B1 (en) * 1998-12-14 2002-08-06 Integrated Surgical Systems, Inc. Method for determining the location and orientation of a bone for computer-assisted orthopedic procedures using intraoperatively attached markers
IL144320A0 (en) * 1999-01-15 2002-05-23 Z Kat Inc Apparatus and method for measuring anatomical objects using coordinated fluoroscopy
US6285902B1 (en) 1999-02-10 2001-09-04 Surgical Insights, Inc. Computer assisted targeting device for use in orthopaedic surgery
CA2594492A1 (en) * 1999-03-07 2000-09-14 Active Implants Corporation Method and apparatus for computerized surgery
AU748703B2 (en) 1999-03-17 2002-06-13 Ao Technology Ag Imaging and planning device for ligament graft placement
US6470207B1 (en) 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
JP4636696B2 (en) 1999-04-20 2011-02-23 アーオー テクノロジー アクチエンゲゼルシャフト Device for percutaneous acquisition of 3D coordinates on the surface of a human or animal organ
DE19917867B4 (en) 1999-04-20 2005-04-21 Brainlab Ag Method and device for image support in the treatment of treatment objectives with integration of X-ray detection and navigation system
US6491699B1 (en) 1999-04-20 2002-12-10 Surgical Navigation Technologies, Inc. Instrument guidance method and system for image guided surgery
EP1504726A1 (en) * 1999-04-22 2005-02-09 Medtronic Surgical Navigation Technologies Apparatus for image guided surgery
US6689142B1 (en) 1999-04-26 2004-02-10 Scimed Life Systems, Inc. Apparatus and methods for guiding a needle
ES2201700T3 (en) * 1999-05-03 2004-03-16 Synthes Ag Chur DEVICE DETECTION DEVICE EQUIPPED WITH AUXILIARY MEDIA ALLOWING TO DETERMINE THE DIRECTION OF THE GRAVITY VECTOR.
US6626899B2 (en) 1999-06-25 2003-09-30 Nidus Medical, Llc Apparatus and methods for treating tissue
JP3608448B2 (en) * 1999-08-31 2005-01-12 株式会社日立製作所 Treatment device
US6206891B1 (en) * 1999-09-14 2001-03-27 Medeye Medical Technology Ltd. Device and method for calibration of a stereotactic localization system
US6381485B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies, Inc. Registration of human anatomy integrated for electromagnetic localization
US6499488B1 (en) 1999-10-28 2002-12-31 Winchester Development Associates Surgical sensor
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigaton, Inc. Method and apparatus for surgical navigation
US6474341B1 (en) 1999-10-28 2002-11-05 Surgical Navigation Technologies, Inc. Surgical communication and power system
US6493573B1 (en) 1999-10-28 2002-12-10 Winchester Development Associates Method and system for navigating a catheter probe in the presence of field-influencing objects
US11331150B2 (en) 1999-10-28 2022-05-17 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US8239001B2 (en) 2003-10-17 2012-08-07 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6235038B1 (en) 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
US7366562B2 (en) 2003-10-17 2008-04-29 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6671538B1 (en) * 1999-11-26 2003-12-30 Koninklijke Philips Electronics, N.V. Interface system for use with imaging devices to facilitate visualization of image-guided interventional procedure planning
US7635390B1 (en) * 2000-01-14 2009-12-22 Marctec, Llc Joint replacement component having a modular articulating surface
US6702821B2 (en) 2000-01-14 2004-03-09 The Bonutti 2003 Trust A Instrumentation for minimally invasive joint replacement and methods for using same
US7104996B2 (en) * 2000-01-14 2006-09-12 Marctec. Llc Method of performing surgery
US7689014B2 (en) * 2000-01-18 2010-03-30 Z-Kat Inc Apparatus and method for measuring anatomical objects using coordinated fluoroscopy
DE10009166A1 (en) * 2000-02-26 2001-08-30 Philips Corp Intellectual Pty Procedure for the localization of objects in interventional radiology
US6725080B2 (en) 2000-03-01 2004-04-20 Surgical Navigation Technologies, Inc. Multiple cannula image guided tool for image guided procedures
US6701174B1 (en) 2000-04-07 2004-03-02 Carnegie Mellon University Computer-aided bone distraction
US20040068187A1 (en) * 2000-04-07 2004-04-08 Krause Norman M. Computer-aided orthopedic surgery
US6711432B1 (en) 2000-10-23 2004-03-23 Carnegie Mellon University Computer-aided orthopedic surgery
US6535756B1 (en) 2000-04-07 2003-03-18 Surgical Navigation Technologies, Inc. Trajectory storage apparatus and method for surgical navigation system
US6856827B2 (en) * 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6490475B1 (en) * 2000-04-28 2002-12-03 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6856826B2 (en) * 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
ATE221751T1 (en) * 2000-05-09 2002-08-15 Brainlab Ag METHOD FOR REGISTERING A PATIENT DATA SET FROM AN IMAGING PROCESS IN NAVIGATION-ASSISTED SURGICAL PROCEDURES USING X-RAY IMAGE ASSIGNMENT
US7085400B1 (en) * 2000-06-14 2006-08-01 Surgical Navigation Technologies, Inc. System and method for image based sensor calibration
US20020029047A1 (en) * 2000-06-16 2002-03-07 Benedicte Bascle Method and apparatus for needle placement and entry point determination in percutaneous procedures
US7228165B1 (en) 2000-06-26 2007-06-05 Boston Scientific Scimed, Inc. Apparatus and method for performing a tissue resection procedure
DE10033723C1 (en) * 2000-07-12 2002-02-21 Siemens Ag Surgical instrument position and orientation visualization device for surgical operation has data representing instrument position and orientation projected onto surface of patient's body
US6837892B2 (en) * 2000-07-24 2005-01-04 Mazor Surgical Technologies Ltd. Miniature bone-mounted surgical robot
KR20000064078A (en) * 2000-08-17 2000-11-06 오창근 The Technique of 3 Dimensional Modelling using Real Multiple Photographs
US6907281B2 (en) * 2000-09-07 2005-06-14 Ge Medical Systems Fast mapping of volumetric density data onto a two-dimensional screen
EP1323120B1 (en) * 2000-09-25 2018-11-14 Z-Kat Inc. Fluoroscopic registration artifact with optical and/or magnetic markers
US6493574B1 (en) 2000-09-28 2002-12-10 Koninklijke Philips Electronics, N.V. Calibration phantom and recognition algorithm for automatic coordinate transformation in diagnostic imaging
AU2002212642A1 (en) * 2000-10-18 2002-04-29 Paieon Inc. Method and system for measuring dimensions of an organ
US6718194B2 (en) * 2000-11-17 2004-04-06 Ge Medical Systems Global Technology Company, Llc Computer assisted intramedullary rod surgery system with enhanced features
US6917827B2 (en) 2000-11-17 2005-07-12 Ge Medical Systems Global Technology Company, Llc Enhanced graphic features for computer assisted surgery system
CA2334495A1 (en) * 2001-02-06 2002-08-06 Surgical Navigation Specialists, Inc. Computer-aided positioning method and system
US20040181149A1 (en) * 2001-02-07 2004-09-16 Ulrich Langlotz Device and method for intraoperative navigation
US7766894B2 (en) 2001-02-15 2010-08-03 Hansen Medical, Inc. Coaxial catheter system
WO2002067784A2 (en) 2001-02-27 2002-09-06 Smith & Nephew, Inc. Surgical navigation systems and processes for unicompartmental knee
US7547307B2 (en) * 2001-02-27 2009-06-16 Smith & Nephew, Inc. Computer assisted knee arthroplasty instrumentation, systems, and processes
US8062377B2 (en) 2001-03-05 2011-11-22 Hudson Surgical Design, Inc. Methods and apparatus for knee arthroplasty
EP1260179B1 (en) * 2001-05-22 2003-03-26 BrainLAB AG X-ray image registration device with a medical navigation system
US6636757B1 (en) 2001-06-04 2003-10-21 Surgical Navigation Technologies, Inc. Method and apparatus for electromagnetic navigation of a surgical probe near a metal object
US7206434B2 (en) * 2001-07-10 2007-04-17 Vistas Unlimited, Inc. Method and system for measurement of the duration an area is included in an image stream
US7708741B1 (en) * 2001-08-28 2010-05-04 Marctec, Llc Method of preparing bones for knee replacement surgery
WO2003032837A1 (en) 2001-10-12 2003-04-24 University Of Florida Computer controlled guidance of a biopsy needle
US7383073B1 (en) 2001-10-16 2008-06-03 Z-Kat Inc. Digital minimally invasive surgery system
US7169155B2 (en) * 2001-12-14 2007-01-30 Scimed Life Systems, Inc. Methods and apparatus for guiding a needle
ATE261274T1 (en) 2002-01-18 2004-03-15 Brainlab Ag Method and device for associating digital image information with the navigation data of a medical navigation system
ES2217210T3 (en) * 2002-02-22 2004-11-01 Brainlab Ag Reduced-height calibration instrument
US6947786B2 (en) 2002-02-28 2005-09-20 Surgical Navigation Technologies, Inc. Method and apparatus for perspective inversion
US8010180B2 (en) 2002-03-06 2011-08-30 Mako Surgical Corp. Haptic guidance system and method
TW200304608A (en) 2002-03-06 2003-10-01 Z Kat Inc System and method for using a haptic device in combination with a computer-assisted surgery system
US11202676B2 (en) 2002-03-06 2021-12-21 Mako Surgical Corp. Neural monitor-based dynamic haptics
US7831292B2 (en) * 2002-03-06 2010-11-09 Mako Surgical Corp. Guidance system and method for surgical procedures with improved feedback
US8996169B2 (en) 2011-12-29 2015-03-31 Mako Surgical Corp. Neural monitor-based dynamic haptics
US9155544B2 (en) * 2002-03-20 2015-10-13 P Tech, Llc Robotic systems and methods
US7611522B2 (en) * 2003-06-02 2009-11-03 Nuvasive, Inc. Gravity dependent pedicle screw tap hole guide and data processing device
EP1348394B1 (en) 2002-03-27 2006-02-22 BrainLAB AG Planning or navigation assistance by generic obtained patient data with two-dimensional adaptation
EP1348393B1 (en) 2002-03-27 2007-03-21 BrainLAB AG Medical navigation or pre-operative treatment planning supported by generic patient data
US6990368B2 (en) 2002-04-04 2006-01-24 Surgical Navigation Technologies, Inc. Method and apparatus for virtual digital subtraction angiography
US6980849B2 (en) * 2002-04-17 2005-12-27 Ricardo Sasso Instrumentation and method for performing image-guided spinal surgery using an anterior surgical approach
US8180429B2 (en) * 2002-04-17 2012-05-15 Warsaw Orthopedic, Inc. Instrumentation and method for mounting a surgical navigation reference device to a patient
US7998062B2 (en) 2004-03-29 2011-08-16 Superdimension, Ltd. Endoscope structures and techniques for navigating to a target in branched structure
US6993374B2 (en) * 2002-04-17 2006-01-31 Ricardo Sasso Instrumentation and method for mounting a surgical navigation reference device to a patient
US7787932B2 (en) * 2002-04-26 2010-08-31 Brainlab Ag Planning and navigation assistance using two-dimensionally adapted generic and detected patient data
US7299805B2 (en) 2002-06-07 2007-11-27 Marctec, Llc Scaffold and method for implanting cells
WO2003105659A2 (en) 2002-06-17 2003-12-24 Mazor Surgical Technologies Ltd. Robot for use with orthopaedic inserts
CA2633137C (en) 2002-08-13 2012-10-23 The Governors Of The University Of Calgary Microsurgical robot system
US20040044295A1 (en) * 2002-08-19 2004-03-04 Orthosoft Inc. Graphical user interface for computer-assisted surgery
DE10393169T5 (en) * 2002-08-26 2006-02-02 Orthosoft, Inc., Montreal A method of placing multiple implants during surgery using a computer-aided surgery system
ES2224007T3 (en) * 2002-09-24 2005-03-01 Brainlab Ag Device and method for determining the opening angle of a joint
US7599730B2 (en) 2002-11-19 2009-10-06 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US7697972B2 (en) 2002-11-19 2010-04-13 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US7789885B2 (en) * 2003-01-15 2010-09-07 Biomet Manufacturing Corp. Instrumentation for knee resection
US7887542B2 (en) * 2003-01-15 2011-02-15 Biomet Manufacturing Corp. Method and apparatus for less invasive knee resection
US7837690B2 (en) * 2003-01-15 2010-11-23 Biomet Manufacturing Corp. Method and apparatus for less invasive knee resection
US8551100B2 (en) 2003-01-15 2013-10-08 Biomet Manufacturing, Llc Instrumentation for knee resection
US7542791B2 (en) 2003-01-30 2009-06-02 Medtronic Navigation, Inc. Method and apparatus for preplanning a surgical procedure
US7660623B2 (en) 2003-01-30 2010-02-09 Medtronic Navigation, Inc. Six degree of freedom alignment display for medical procedures
US20050267354A1 (en) * 2003-02-04 2005-12-01 Joel Marquart System and method for providing computer assistance with spinal fixation procedures
WO2004070580A2 (en) 2003-02-04 2004-08-19 Z-Kat, Inc. Computer-assisted knee replacement apparatus and method
WO2004069040A2 (en) * 2003-02-04 2004-08-19 Z-Kat, Inc. Method and apparatus for computer assistance with intramedullary nail procedure
WO2004069036A2 (en) * 2003-02-04 2004-08-19 Z-Kat, Inc. Computer-assisted knee replacement apparatus and method
US7111401B2 (en) * 2003-02-04 2006-09-26 Eveready Battery Company, Inc. Razor head having skin controlling means
US7194120B2 (en) * 2003-05-29 2007-03-20 Board Of Regents, The University Of Texas System Methods and systems for image-guided placement of implants
US6836702B1 (en) * 2003-06-11 2004-12-28 Abb Ab Method for fine tuning of a robot program
US7873403B2 (en) * 2003-07-15 2011-01-18 Brainlab Ag Method and device for determining a three-dimensional form of a body from two-dimensional projection images
US7209538B2 (en) * 2003-08-07 2007-04-24 Xoran Technologies, Inc. Intraoperative stereo imaging system
US7313430B2 (en) 2003-08-28 2007-12-25 Medtronic Navigation, Inc. Method and apparatus for performing stereotactic surgery
ES2432616T3 (en) 2003-09-15 2013-12-04 Covidien Lp Accessory system for use with bronchoscopes
EP2316328B1 (en) 2003-09-15 2012-05-09 Super Dimension Ltd. Wrap-around holding device for use with bronchoscopes
EP1677679A1 (en) * 2003-10-03 2006-07-12 Xoran Technologies, Inc. CT imaging system for robotic intervention
US7862570B2 (en) 2003-10-03 2011-01-04 Smith & Nephew, Inc. Surgical positioners
US7835778B2 (en) 2003-10-16 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for surgical navigation of a multiple piece construct for implantation
US7840253B2 (en) 2003-10-17 2010-11-23 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US20050085822A1 (en) * 2003-10-20 2005-04-21 Thornberry Robert C. Surgical navigation system component fault interfaces and related processes
US7764985B2 (en) 2003-10-20 2010-07-27 Smith & Nephew, Inc. Surgical navigation system component fault interfaces and related processes
WO2005039417A1 (en) * 2003-10-22 2005-05-06 Schaerer Mayfield Technologies Gmbh Method for fluoroscopy-based neuronavigation
US7794467B2 (en) 2003-11-14 2010-09-14 Smith & Nephew, Inc. Adjustable surgical cutting systems
US20050109855A1 (en) * 2003-11-25 2005-05-26 Mccombs Daniel Methods and apparatuses for providing a navigational array
US7488324B1 (en) 2003-12-08 2009-02-10 Biomet Manufacturing Corporation Femoral guide for implanting a femoral knee prosthesis
US8021368B2 (en) 2004-01-14 2011-09-20 Hudson Surgical Design, Inc. Methods and apparatus for improved cutting tools for resection
US20060030854A1 (en) 2004-02-02 2006-02-09 Haines Timothy G Methods and apparatus for wireplasty bone resection
US8114083B2 (en) 2004-01-14 2012-02-14 Hudson Surgical Design, Inc. Methods and apparatus for improved drilling and milling tools for resection
US7815645B2 (en) 2004-01-14 2010-10-19 Hudson Surgical Design, Inc. Methods and apparatus for pinplasty bone resection
US7857814B2 (en) 2004-01-14 2010-12-28 Hudson Surgical Design, Inc. Methods and apparatus for minimally invasive arthroplasty
US20050182317A1 (en) * 2004-01-29 2005-08-18 Haddad Souheil F. Method and apparatus for locating medical devices in tissue
US20060030855A1 (en) 2004-03-08 2006-02-09 Haines Timothy G Methods and apparatus for improved profile based resection
US20050267353A1 (en) * 2004-02-04 2005-12-01 Joel Marquart Computer-assisted knee replacement apparatus and method
US20050281465A1 (en) * 2004-02-04 2005-12-22 Joel Marquart Method and apparatus for computer assistance with total hip replacement procedure
US8764725B2 (en) 2004-02-09 2014-07-01 Covidien Lp Directional anchoring mechanism, method and applications thereof
AU2005216091B2 (en) * 2004-02-20 2009-09-17 Hector O. Pacheco Method for improving pedicle screw placement in spinal surgery
US7477776B2 (en) * 2004-03-01 2009-01-13 Brainlab Ag Method and apparatus for determining a plane of symmetry of a three-dimensional object
EP1570800B1 (en) * 2004-03-01 2007-04-11 BrainLAB AG Method and device for determining the symmetrical plane of a three dimensional object
CA2460119A1 (en) * 2004-03-04 2005-09-04 Orthosoft Inc. Graphical user interface for computer-assisted surgery
EP1720480A1 (en) 2004-03-05 2006-11-15 Hansen Medical, Inc. Robotic catheter system
US7976539B2 (en) 2004-03-05 2011-07-12 Hansen Medical, Inc. System and method for denaturing and fixing collagenous tissue
US20070073306A1 (en) * 2004-03-08 2007-03-29 Ryan Lakin Cutting block for surgical navigation
US7641660B2 (en) 2004-03-08 2010-01-05 Biomet Manufacturing Corporation Method, apparatus, and system for image guided bone cutting
EP1744670A2 (en) * 2004-03-22 2007-01-24 Vanderbilt University System and methods for surgical instrument disablement via image-guided position feedback
AU2005231404B9 (en) * 2004-03-31 2012-04-26 Smith & Nephew, Inc. Methods and apparatuses for providing a reference array input device
US20050228404A1 (en) * 2004-04-12 2005-10-13 Dirk Vandevelde Surgical navigation system component automated imaging navigation and related processes
WO2005104978A1 (en) 2004-04-21 2005-11-10 Smith & Nephew, Inc. Computer-aided methods, systems, and apparatuses for shoulder arthroplasty
US7567834B2 (en) 2004-05-03 2009-07-28 Medtronic Navigation, Inc. Method and apparatus for implantation between two vertebral bodies
US7182767B2 (en) 2004-05-19 2007-02-27 Howmedica Osteonics Corp. Navigated lateral/medial femoral resection guide
US20050267359A1 (en) * 2004-05-27 2005-12-01 General Electric Company System, method, and article of manufacture for guiding an end effector to a target position within a person
FR2871363B1 (en) * 2004-06-15 2006-09-01 Medtech Sa Robotized guiding device for a surgical tool
US20050279368A1 (en) * 2004-06-16 2005-12-22 Mccombs Daniel L Computer assisted surgery input/output systems and processes
SE0401928D0 (en) * 2004-07-26 2004-07-26 Stig Lindequist Method and arrangement for positioning a tool
US20060063998A1 (en) * 2004-09-21 2006-03-23 Von Jako Ron Navigation and visualization of an access needle system
US9492241B2 (en) 2005-01-13 2016-11-15 Mazor Robotics Ltd. Image guided robotic system for keyhole neurosurgery
JP2008531091A (en) 2005-02-22 2008-08-14 スミス アンド ネフュー インコーポレーテッド In-line milling system
US7623902B2 (en) 2005-03-07 2009-11-24 Leucadia 6, Llc System and methods for improved access to vertebral bodies for kyphoplasty, vertebroplasty, vertebral body biopsy or screw placement
US7695479B1 (en) 2005-04-12 2010-04-13 Biomet Manufacturing Corp. Femoral sizer
US7725169B2 (en) * 2005-04-15 2010-05-25 The Board Of Trustees Of The University Of Illinois Contrast enhanced spectroscopic optical coherence tomography
US20070016008A1 (en) * 2005-06-23 2007-01-18 Ryan Schoenefeld Selective gesturing input to a surgical navigation system
US7840256B2 (en) 2005-06-27 2010-11-23 Biomet Manufacturing Corporation Image guided tracking array and method
WO2007005976A1 (en) 2005-07-01 2007-01-11 Hansen Medical, Inc. Robotic catheter system
EP1741469A1 (en) * 2005-07-08 2007-01-10 Engineers & Doctors Wallstén Medical A/S Method of guiding irradiation equipment
DE102005032523B4 (en) * 2005-07-12 2009-11-05 Siemens Ag Method for the pre-interventional planning of a 2D fluoroscopy projection
DE102005044033B4 (en) * 2005-09-14 2010-11-18 Cas Innovations Gmbh & Co. Kg Positioning system for percutaneous interventions
US20070073133A1 (en) * 2005-09-15 2007-03-29 Schoenefeld Ryan J Virtual mouse for use in surgical navigation
US20070073136A1 (en) * 2005-09-15 2007-03-29 Robert Metzger Bone milling with image guided surgery
US7643862B2 (en) 2005-09-15 2010-01-05 Biomet Manufacturing Corporation Virtual mouse for use in surgical navigation
US7835784B2 (en) 2005-09-21 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US9168102B2 (en) 2006-01-18 2015-10-27 Medtronic Navigation, Inc. Method and apparatus for providing a container to a sterile environment
US7662183B2 (en) * 2006-01-24 2010-02-16 Timothy Haines Dynamic spinal implants incorporating cartilage bearing graft material
US7787129B2 (en) * 2006-01-31 2010-08-31 The Board Of Trustees Of The University Of Illinois Method and apparatus for measurement of optical properties in tissue
JP2007215577A (en) * 2006-02-14 2007-08-30 Fujifilm Corp Endoscopic instrument and diagnostic system
US8858561B2 (en) 2006-06-09 2014-10-14 Biomet Manufacturing, LLC Patient-specific alignment guide
US8298237B2 (en) 2006-06-09 2012-10-30 Biomet Manufacturing Corp. Patient-specific alignment guide for multiple incisions
US7967868B2 (en) 2007-04-17 2011-06-28 Biomet Manufacturing Corp. Patient-modified implant and associated method
US8535387B2 (en) 2006-02-27 2013-09-17 Biomet Manufacturing, Llc Patient-specific tools and implants
US9918740B2 (en) 2006-02-27 2018-03-20 Biomet Manufacturing, Llc Backup surgical instrument system and method
US8070752B2 (en) 2006-02-27 2011-12-06 Biomet Manufacturing Corp. Patient specific alignment guide and inter-operative adjustment
US10278711B2 (en) 2006-02-27 2019-05-07 Biomet Manufacturing, Llc Patient-specific femoral guide
US7780672B2 (en) * 2006-02-27 2010-08-24 Biomet Manufacturing Corp. Femoral adjustment device and associated method
US8092465B2 (en) 2006-06-09 2012-01-10 Biomet Manufacturing Corp. Patient specific knee alignment guide and associated method
US8282646B2 (en) 2006-02-27 2012-10-09 Biomet Manufacturing Corp. Patient specific knee alignment guide and associated method
US8591516B2 (en) 2006-02-27 2013-11-26 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US8473305B2 (en) 2007-04-17 2013-06-25 Biomet Manufacturing Corp. Method and apparatus for manufacturing an implant
US9339278B2 (en) 2006-02-27 2016-05-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US8864769B2 (en) 2006-02-27 2014-10-21 Biomet Manufacturing, Llc Alignment guides with patient-specific anchoring elements
US8603180B2 (en) 2006-02-27 2013-12-10 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US9289253B2 (en) 2006-02-27 2016-03-22 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9345548B2 (en) 2006-02-27 2016-05-24 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US9173661B2 (en) 2006-02-27 2015-11-03 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US8608749B2 (en) 2006-02-27 2013-12-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US8133234B2 (en) 2006-02-27 2012-03-13 Biomet Manufacturing Corp. Patient specific acetabular guide and method
US9113971B2 (en) 2006-02-27 2015-08-25 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US8407067B2 (en) 2007-04-17 2013-03-26 Biomet Manufacturing Corp. Method and apparatus for manufacturing an implant
US8568487B2 (en) 2006-02-27 2013-10-29 Biomet Manufacturing, Llc Patient-specific hip joint devices
US8608748B2 (en) 2006-02-27 2013-12-17 Biomet Manufacturing, Llc Patient specific guides
US9907659B2 (en) 2007-04-17 2018-03-06 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US8377066B2 (en) 2006-02-27 2013-02-19 Biomet Manufacturing Corp. Patient-specific elbow guides and associated methods
US20150335438A1 (en) 2006-02-27 2015-11-26 Biomet Manufacturing, Llc. Patient-specific augments
US8241293B2 (en) 2006-02-27 2012-08-14 Biomet Manufacturing Corp. Patient specific high tibia osteotomy
US8165659B2 (en) 2006-03-22 2012-04-24 Garrett Sheffer Modeling method and apparatus for use in surgical navigation
US8112292B2 (en) 2006-04-21 2012-02-07 Medtronic Navigation, Inc. Method and apparatus for optimizing a therapy
US7920162B2 (en) * 2006-05-16 2011-04-05 Stryker Leibinger Gmbh & Co. Kg Display method and system for surgical procedures
AU2007254173B2 (en) * 2006-05-17 2013-07-25 Nuvasive, Inc. Surgical trajectory monitoring system and related methods
EP2023843B1 (en) 2006-05-19 2016-03-09 Mako Surgical Corp. System for verifying calibration of a surgical device
US7695520B2 (en) * 2006-05-31 2010-04-13 Biomet Manufacturing Corp. Prosthesis and implementation system
US9795399B2 (en) 2006-06-09 2017-10-24 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
EP1868157A1 (en) 2006-06-14 2007-12-19 BrainLAB AG Shape reconstruction using X-ray images
US8560047B2 (en) 2006-06-16 2013-10-15 Board Of Regents Of The University Of Nebraska Method and apparatus for computer aided surgery
EP2032023A4 (en) * 2006-06-28 2011-08-10 Hector O Pacheco Templating and placing artificial discs in the spine
US8335553B2 (en) * 2006-09-25 2012-12-18 Mazor Robotics Ltd. CT-free spinal surgical imaging system
US8660635B2 (en) 2006-09-29 2014-02-25 Medtronic, Inc. Method and apparatus for optimizing a computer assisted surgical procedure
ATE462353T1 (en) * 2006-11-10 2010-04-15 Koninkl Philips Electronics Nv Preventing metal artifacts during needle guidance under (Xper) CT
EP1925256A1 (en) 2006-11-24 2008-05-28 BrainLAB AG Method and device for registering an anatomical structure with markers
US8116550B2 (en) * 2006-12-20 2012-02-14 Cytyc Corporation Method and system for locating and focusing on fiducial marks on specimen slides
US20080163118A1 (en) * 2006-12-29 2008-07-03 Jason Wolf Representation of file relationships
EP2144568B1 (en) * 2007-05-10 2021-03-17 Koninklijke Philips N.V. Targeting method
US8934961B2 (en) 2007-05-18 2015-01-13 Biomet Manufacturing, Llc Trackable diagnostic scope apparatus and methods of use
US20080312528A1 (en) * 2007-06-15 2008-12-18 Bertolina James A Guidance of medical instrument using fluoroscopy scanner with multiple x-ray sources
EP2207495A2 (en) * 2007-06-15 2010-07-21 Rainer Burgkart Method for determining a direction of intervention of a tool and performing the tool intervention
US20080319491A1 (en) * 2007-06-19 2008-12-25 Ryan Schoenefeld Patient-matched surgical component and methods of use
GB0714940D0 (en) * 2007-08-01 2007-09-12 Depuy Orthopaedie Gmbh Image processing
US8265949B2 (en) 2007-09-27 2012-09-11 Depuy Products, Inc. Customized patient surgical plan
US8905920B2 (en) 2007-09-27 2014-12-09 Covidien Lp Bronchoscope adapter and method
US8357111B2 (en) 2007-09-30 2013-01-22 Depuy Products, Inc. Method and system for designing patient-specific orthopaedic surgical instruments
CN102670275B (en) 2007-09-30 2016-01-20 Depuy Products, Inc. Customized patient-specific orthopaedic surgical instrumentation
WO2009055034A1 (en) 2007-10-24 2009-04-30 Nuvasive, Inc. Surgical trajectory monitoring system and related methods
ES2595366T3 (en) * 2008-01-09 2016-12-29 Stryker European Holdings I, Llc Computer-assisted stereotactic surgery system based on a three-dimensional visualization
US8983580B2 (en) 2008-01-18 2015-03-17 The Board Of Trustees Of The University Of Illinois Low-coherence interferometry and optical coherence tomography for image-guided surgical treatment of solid tumors
US7751057B2 (en) 2008-01-18 2010-07-06 The Board Of Trustees Of The University Of Illinois Magnetomotive optical coherence tomography
US8115934B2 (en) 2008-01-18 2012-02-14 The Board Of Trustees Of The University Of Illinois Device and method for imaging the ear using optical coherence tomography
US8571637B2 (en) 2008-01-21 2013-10-29 Biomet Manufacturing, Llc Patella tracking method and apparatus for use in surgical navigation
US10875182B2 (en) 2008-03-20 2020-12-29 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
WO2009122273A2 (en) 2008-04-03 2009-10-08 Superdimension, Ltd. Magnetic interference detection system and method
FR2929709B1 * 2008-04-04 2010-04-23 Spectroscan Method of radio-synthetic examination of specimens
US8549888B2 (en) 2008-04-04 2013-10-08 Nuvasive, Inc. System and device for designing and forming a surgical implant
US8473032B2 (en) 2008-06-03 2013-06-25 Superdimension, Ltd. Feature-based registration method
US8218847B2 (en) 2008-06-06 2012-07-10 Superdimension, Ltd. Hybrid registration method
US8932207B2 (en) 2008-07-10 2015-01-13 Covidien Lp Integrated multi-functional endoscopic tool
US8165658B2 (en) * 2008-09-26 2012-04-24 Medtronic, Inc. Method and apparatus for positioning a guide relative to a base
EP2353147B1 (en) * 2008-11-28 2021-05-19 Fujifilm Medical Systems U.S.A. Inc. System and method for propagation of spine labeling
US8175681B2 (en) 2008-12-16 2012-05-08 Medtronic Navigation Inc. Combination of electromagnetic and electropotential localization
US8170641B2 (en) 2009-02-20 2012-05-01 Biomet Manufacturing Corp. Method of imaging an extremity of a patient
US20120069965A1 (en) * 2009-02-28 2012-03-22 Stellenbosch University Method for positioning an instrument
US8611984B2 (en) 2009-04-08 2013-12-17 Covidien Lp Locatable catheter
US9254123B2 (en) 2009-04-29 2016-02-09 Hansen Medical, Inc. Flexible and steerable elongate instruments with shape control and support elements
DE102009028503B4 (en) 2009-08-13 2013-11-14 Biomet Manufacturing Corp. Resection template for the resection of bones, method for producing such a resection template and operation set for performing knee joint surgery
US8494614B2 (en) 2009-08-31 2013-07-23 Regents Of The University Of Minnesota Combination localization system
US8494613B2 (en) 2009-08-31 2013-07-23 Medtronic, Inc. Combination localization system
AU2010292136B2 (en) * 2009-09-10 2013-07-04 Blue Ortho Alignment guides for use in computer assisted orthopedic surgery to prepare a bone element for an implant
WO2011039427A1 (en) * 2009-09-30 2011-04-07 Spectroscan Sarl Method of radio-synthetic examination of specimens
WO2011041428A2 (en) 2009-10-01 2011-04-07 Mako Surgical Corp. Surgical system for positioning prosthetic component and/or for constraining movement of surgical tool
US11154981B2 (en) * 2010-02-04 2021-10-26 Teladoc Health, Inc. Robot user interface for telepresence robot system
US8632547B2 (en) 2010-02-26 2014-01-21 Biomet Sports Medicine, Llc Patient-specific osteotomy devices and methods
AU2011200764B2 (en) * 2010-03-01 2013-06-13 Stryker European Operations Holdings Llc Computer assisted surgery system
US10588647B2 (en) * 2010-03-01 2020-03-17 Stryker European Holdings I, Llc Computer assisted surgery system
US9066727B2 (en) 2010-03-04 2015-06-30 Materialise Nv Patient-specific computed tomography guides
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
DE102010023345B4 (en) * 2010-06-10 2020-10-29 Siemens Healthcare Gmbh Medical biplane x-ray system and method for controlling the biplane x-ray system
WO2011159834A1 (en) 2010-06-15 2011-12-22 Superdimension, Ltd. Locatable expandable working channel and method
US8908937B2 (en) * 2010-07-08 2014-12-09 Biomet Manufacturing, Llc Method and device for digital image templating
EP2593023B1 (en) 2010-07-16 2018-09-19 Stryker European Holdings I, LLC Surgical targeting system and method
FR2963693B1 2010-08-04 2013-05-03 Medtech Method for the automated and assisted acquisition of anatomical surfaces
US9271744B2 (en) 2010-09-29 2016-03-01 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US9968376B2 (en) 2010-11-29 2018-05-15 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
WO2012100211A2 (en) 2011-01-20 2012-07-26 Hansen Medical, Inc. System and method for endoluminal and transluminal therapy
US8718837B2 (en) 2011-01-28 2014-05-06 Intouch Technologies Interfacing with a mobile telepresence robot
US8917290B2 (en) 2011-01-31 2014-12-23 Biomet Manufacturing, Llc Digital image templating
US9241745B2 (en) 2011-03-07 2016-01-26 Biomet Manufacturing, Llc Patient-specific femoral version guide
US8715289B2 (en) 2011-04-15 2014-05-06 Biomet Manufacturing, Llc Patient-specific numerically controlled instrument
US9675400B2 (en) 2011-04-19 2017-06-13 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US8956364B2 (en) 2011-04-29 2015-02-17 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
US8668700B2 (en) 2011-04-29 2014-03-11 Biomet Manufacturing, Llc Patient-specific convertible guides
US8532807B2 (en) 2011-06-06 2013-09-10 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US9084618B2 (en) 2011-06-13 2015-07-21 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US11911117B2 (en) 2011-06-27 2024-02-27 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US9498231B2 (en) 2011-06-27 2016-11-22 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
CA2840397A1 (en) 2011-06-27 2013-04-11 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US20130001121A1 (en) 2011-07-01 2013-01-03 Biomet Manufacturing Corp. Backup kit for a patient-specific arthroplasty kit assembly
US8764760B2 (en) 2011-07-01 2014-07-01 Biomet Manufacturing, Llc Patient-specific bone-cutting guidance instruments and methods
US20130030363A1 (en) 2011-07-29 2013-01-31 Hansen Medical, Inc. Systems and methods utilizing shape sensing fibers
US8597365B2 (en) 2011-08-04 2013-12-03 Biomet Manufacturing, Llc Patient-specific pelvic implants for acetabular reconstruction
US9295497B2 (en) 2011-08-31 2016-03-29 Biomet Manufacturing, Llc Patient-specific sacroiliac and pedicle guides
US9066734B2 (en) 2011-08-31 2015-06-30 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9386993B2 (en) 2011-09-29 2016-07-12 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US9451973B2 (en) 2011-10-27 2016-09-27 Biomet Manufacturing, Llc Patient specific glenoid guide
US9301812B2 (en) 2011-10-27 2016-04-05 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
KR20130046337A (en) 2011-10-27 2013-05-07 Samsung Electronics Co., Ltd. Multi-view device and control method thereof, display apparatus and control method thereof, and display system
US9554910B2 (en) 2011-10-27 2017-01-31 Biomet Manufacturing, Llc Patient-specific glenoid guide and implants
EP2770918B1 (en) 2011-10-27 2017-07-19 Biomet Manufacturing, LLC Patient-specific glenoid guides
FR2983059B1 2011-11-30 2014-11-28 Medtech Robotic-assisted method of positioning a surgical instrument relative to the body of a patient, and device for carrying out said method
US9237950B2 (en) 2012-02-02 2016-01-19 Biomet Manufacturing, Llc Implant with patient-specific porous structure
WO2013175471A1 (en) 2012-05-22 2013-11-28 Mazor Robotics Ltd. On-site verification of implant positioning
US10499961B2 (en) 2012-05-23 2019-12-10 Stryker European Holdings I, Llc Entry portal navigation
EP2667352B1 (en) 2012-05-23 2020-01-01 Stryker European Holdings I, LLC Virtual 3D overlay as reduction aid for complex fractures
WO2013174402A1 (en) 2012-05-23 2013-11-28 Stryker Trauma Gmbh Locking screw length measurement
JP2015528713A (en) * 2012-06-21 2015-10-01 Globus Medical Inc. Surgical robot platform
US10799298B2 (en) * 2012-06-21 2020-10-13 Globus Medical Inc. Robotic fluoroscopic navigation
US10758315B2 (en) * 2012-06-21 2020-09-01 Globus Medical Inc. Method and system for improving 2D-3D registration convergence
WO2014048447A1 (en) 2012-09-27 2014-04-03 Stryker Trauma Gmbh Rotational position determination
KR101433242B1 (en) * 2012-11-16 2014-08-25 Kyungpook National University Industry-Academic Cooperation Foundation Reduction surgical robot and method for driving control thereof
US20140148673A1 (en) 2012-11-28 2014-05-29 Hansen Medical, Inc. Method of anchoring a pullwire directly to an articulatable region in a catheter
US9060788B2 (en) 2012-12-11 2015-06-23 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9204977B2 (en) 2012-12-11 2015-12-08 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9888967B2 (en) * 2012-12-31 2018-02-13 Mako Surgical Corp. Systems and methods for guiding a user during surgical planning
US9839438B2 (en) 2013-03-11 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9579107B2 (en) 2013-03-12 2017-02-28 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US9498233B2 (en) 2013-03-13 2016-11-22 Biomet Manufacturing, Llc. Universal acetabular guide and associated hardware
US9826981B2 (en) 2013-03-13 2017-11-28 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US20140277334A1 (en) 2013-03-14 2014-09-18 Hansen Medical, Inc. Active drives for robotic catheter manipulators
US9326822B2 (en) 2013-03-14 2016-05-03 Hansen Medical, Inc. Active drives for robotic catheter manipulators
US9408669B2 (en) 2013-03-15 2016-08-09 Hansen Medical, Inc. Active drive mechanism with finite range of motion
US9517145B2 (en) 2013-03-15 2016-12-13 Biomet Manufacturing, Llc Guide alignment system and method
US20140276936A1 (en) 2013-03-15 2014-09-18 Hansen Medical, Inc. Active drive mechanism for simultaneous rotation and translation
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US9454643B2 (en) * 2013-05-02 2016-09-27 Smith & Nephew, Inc. Surface and image integration for model evaluation and landmark determination
US20150112349A1 (en) 2013-10-21 2015-04-23 Biomet Manufacturing, Llc Ligament Guide Registration
EP3119336A4 (en) * 2014-03-17 2017-11-15 Intuitive Surgical Operations, Inc. Methods of controlling motion of under-actuated joints in a surgical set-up structure
CN103954313B (en) * 2014-04-09 2017-01-18 Ocean University of China Three-dimensional coordinate frame for wind-wave flume
US10046140B2 (en) 2014-04-21 2018-08-14 Hansen Medical, Inc. Devices, systems, and methods for controlling active drive systems
US10282488B2 (en) 2014-04-25 2019-05-07 Biomet Manufacturing, Llc HTO guide with optional guided ACL/PCL tunnels
US9408616B2 (en) 2014-05-12 2016-08-09 Biomet Manufacturing, Llc Humeral cut guide
US9561040B2 (en) 2014-06-03 2017-02-07 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9839436B2 (en) 2014-06-03 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US10952593B2 (en) 2014-06-10 2021-03-23 Covidien Lp Bronchoscope adapter
EP3164080A4 (en) * 2014-07-06 2018-06-27 Garcia-Bengochea, Javier Methods and devices for surgical access
US9826994B2 (en) 2014-09-29 2017-11-28 Biomet Manufacturing, Llc Adjustable glenoid pin insertion guide
US9833245B2 (en) 2014-09-29 2017-12-05 Biomet Sports Medicine, Llc Tibial tubercule osteotomy
US10123846B2 (en) * 2014-11-13 2018-11-13 Intuitive Surgical Operations, Inc. User-interface control using master controller
EP3866174A1 (en) 2014-11-13 2021-08-18 Intuitive Surgical Operations, Inc. Interaction between user-interface and master controller
US9820868B2 (en) 2015-03-30 2017-11-21 Biomet Manufacturing, Llc Method and apparatus for a pin apparatus
US10959783B2 (en) 2015-04-15 2021-03-30 Mobius Imaging, Llc Integrated medical imaging and surgical robotic system
US10426555B2 (en) 2015-06-03 2019-10-01 Covidien Lp Medical instrument with sensor for use in a system and method for electromagnetic navigation
US10568647B2 (en) 2015-06-25 2020-02-25 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10226262B2 (en) 2015-06-25 2019-03-12 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10687830B2 (en) 2015-07-06 2020-06-23 Javier Garcia-Bengochea Methods and devices for surgical access
US9962134B2 (en) 2015-10-28 2018-05-08 Medtronic Navigation, Inc. Apparatus and method for maintaining image quality while minimizing X-ray dosage of a patient
US10010372B1 (en) 2016-01-06 2018-07-03 Paul Beck Marker positioning apparatus
US10004564B1 (en) 2016-01-06 2018-06-26 Paul Beck Accurate radiographic calibration using multiple images
ES2877761T3 (en) 2016-03-02 2021-11-17 Nuvasive Inc Systems and methods for spinal correction surgical planning
US10478254B2 (en) 2016-05-16 2019-11-19 Covidien Lp System and method to access lung tissue
US10695133B2 (en) 2016-07-12 2020-06-30 Mobius Imaging Llc Multi-stage dilator and cannula system and method
US10463439B2 (en) 2016-08-26 2019-11-05 Auris Health, Inc. Steerable catheter with shaft load distributions
US11241559B2 (en) 2016-08-29 2022-02-08 Auris Health, Inc. Active drive for guidewire manipulation
WO2018053282A1 (en) 2016-09-16 2018-03-22 GYS Tech, LLC d/b/a Cardan Robotics System and method for mounting a robotic arm in a surgical robotic system
US11350995B2 (en) 2016-10-05 2022-06-07 Nuvasive, Inc. Surgical navigation systems and methods
EP3528735A4 (en) 2016-10-21 2020-04-29 Mobius Imaging LLC Methods and systems for setting trajectories and target locations for image guided surgery
FR3057757B1 2016-10-21 2021-04-16 Medtech Automatic registration device and method for 3D intra-operative images
WO2018081136A2 (en) 2016-10-25 2018-05-03 Eugene Gregerson Methods and systems for robot-assisted surgery
US11653979B2 (en) 2016-10-27 2023-05-23 Leucadia 6, Llc Intraoperative fluoroscopic registration of vertebral bodies
US10615500B2 (en) 2016-10-28 2020-04-07 Covidien Lp System and method for designing electromagnetic navigation antenna assemblies
US10751126B2 (en) 2016-10-28 2020-08-25 Covidien Lp System and method for generating a map for electromagnetic navigation
US10638952B2 (en) 2016-10-28 2020-05-05 Covidien Lp Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system
US10446931B2 (en) 2016-10-28 2019-10-15 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10722311B2 (en) 2016-10-28 2020-07-28 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
US10517505B2 (en) 2016-10-28 2019-12-31 Covidien Lp Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system
US10418705B2 (en) 2016-10-28 2019-09-17 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10792106B2 (en) 2016-10-28 2020-10-06 Covidien Lp System for calibrating an electromagnetic navigation system
US10499997B2 (en) 2017-01-03 2019-12-10 Mako Surgical Corp. Systems and methods for surgical navigation
US10722310B2 (en) 2017-03-13 2020-07-28 Zimmer Biomet CMF and Thoracic, LLC Virtual surgery planning system and method
US10682129B2 (en) 2017-03-23 2020-06-16 Mobius Imaging, Llc Robotic end effector with adjustable inner diameter
US11862302B2 (en) 2017-04-24 2024-01-02 Teladoc Health, Inc. Automated transcription and documentation of tele-health encounters
EP3621545B1 (en) 2017-05-10 2024-02-21 MAKO Surgical Corp. Robotic spine surgery system
US11033341B2 (en) 2017-05-10 2021-06-15 Mako Surgical Corp. Robotic spine surgery system and methods
KR101937236B1 (en) 2017-05-12 2019-01-11 Coreline Soft Co., Ltd. System and method of computer assistance for the image-guided reduction of a fracture
US10483007B2 (en) 2017-07-25 2019-11-19 Intouch Technologies, Inc. Modular telehealth cart with thermal imaging and touch screen user interface
EP3664741B1 (en) 2017-08-11 2023-02-22 Mobius Imaging LLC Apparatus for attaching a reference marker to a patient
US11636944B2 (en) 2017-08-25 2023-04-25 Teladoc Health, Inc. Connectivity infrastructure for a telehealth platform
WO2019070997A1 (en) 2017-10-04 2019-04-11 GYS Tech, LLC d/b/a Cardan Robotics Systems and methods for performing lateral-access spine surgery
AU2018346790A1 (en) 2017-10-05 2020-04-30 Mobius Imaging, Llc Methods and systems for performing computer assisted surgery
US11219489B2 (en) 2017-10-31 2022-01-11 Covidien Lp Devices and systems for providing sensors in parallel with medical tools
TWI641358B (en) * 2017-12-26 2018-11-21 National Cheng Kung University Percutaneous spine puncture guiding system and puncture orientation configuring method
US10617299B2 (en) 2018-04-27 2020-04-14 Intouch Technologies, Inc. Telehealth cart that supports a removable tablet with seamless audio/video switching
US11051829B2 (en) 2018-06-26 2021-07-06 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic surgical instrument
CN109238136A (en) * 2018-08-09 2019-01-18 Guangzhou Yiyuan Plastic & Hardware Mould Co., Ltd. Headstock headlight measurement method
CN109363771B (en) * 2018-12-06 2021-08-06 Anhui Exso Medical Robot Co., Ltd. Femoral neck fracture multi-tunnel nail implantation positioning system integrating intraoperative 2D planning information
US11065065B2 (en) * 2019-02-22 2021-07-20 Warsaw Orthopedic, Inc. Spinal implant system and methods of use
EP3733112A1 (en) 2019-05-03 2020-11-04 Globus Medical, Inc. System for robotic trajectory guidance for navigated biopsy needle
US11871998B2 (en) 2019-12-06 2024-01-16 Stryker European Operations Limited Gravity based patient image orientation detection
EP4014912A1 (en) * 2020-12-21 2022-06-22 Metamorphosis GmbH Artificial-intelligence-based registration of x-ray images

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4722336A (en) * 1985-01-25 1988-02-02 Michael Kim Placement guide
US4805615A (en) * 1985-07-02 1989-02-21 Carol Mark P Method and apparatus for performing stereotactic surgery
US5078140A (en) 1986-05-08 1992-01-07 Kwoh Yik S Imaging device - aided robotic stereotaxis system
US4750487A (en) 1986-11-24 1988-06-14 Zanetti Paul H Stereotactic frame
US5099846A (en) * 1988-12-23 1992-03-31 Hardy Tyrone L Method and apparatus for video presentation from a variety of scanner imaging sources
US5265610A (en) * 1991-09-03 1993-11-30 General Electric Company Multi-planar X-ray fluoroscopy system using radiofrequency fields
US5274551A (en) * 1991-11-29 1993-12-28 General Electric Company Method and apparatus for real-time navigation assist in interventional radiological procedures
US5389101A (en) * 1992-04-21 1995-02-14 University Of Utah Apparatus and method for photogrammetric surgical localization
GB9405299D0 (en) * 1994-03-17 1994-04-27 Roke Manor Research Improvements in or relating to video-based systems for computer assisted surgery and localisation
US5868673A (en) * 1995-03-28 1999-02-09 Sonometrics Corporation System for carrying out surgery, biopsy and ablation of a tumor or other physical anomaly
US5799055A (en) * 1996-05-15 1998-08-25 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy

Also Published As

Publication number Publication date
WO1997042898A1 (en) 1997-11-20
JP2000510730A (en) 2000-08-22
US6198794B1 (en) 2001-03-06
EP0955935A4 (en) 2002-02-27
AU3066497A (en) 1997-12-05
IL127027A (en) 2003-05-29
IL127027A0 (en) 1999-09-22
US6069932A (en) 2000-05-30
JP5065783B2 (en) 2012-11-07
CA2255041A1 (en) 1997-11-20
TW384217B (en) 2000-03-11
JP4469423B2 (en) 2010-05-26
US5799055A (en) 1998-08-25
EP0955935A1 (en) 1999-11-17
NZ332764A (en) 2000-08-25
JP2007307399A (en) 2007-11-29

Similar Documents

Publication Publication Date Title
CA2255041C (en) Stereotactic surgical procedure apparatus and method
US11653905B2 (en) Systems and methods for tracking robotically controlled medical instruments
EP3254621B1 (en) 3d image special calibrator, surgical localizing system and method
US6097994A (en) Apparatus and method for determining the correct insertion depth for a biopsy needle
USRE43952E1 (en) Interactive system for local intervention inside a non-homogeneous structure
EP3032456B1 (en) System and method for optical position measurement and guidance of a rigid or semi-flexible tool to a target
US6285902B1 (en) Computer assisted targeting device for use in orthopaedic surgery
US8781630B2 (en) Imaging platform to provide integrated navigation capabilities for surgical guidance
USRE40176E1 (en) Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US5930329A (en) Apparatus and method for detection and localization of a biopsy needle or similar surgical tool in a radiographic image
KR20180104763A (en) Position estimation and correction system and method for fusion imaging system in imaging
Zamorano et al. Computer-assisted neurosurgery system: Wayne State University hardware and software configuration
WO1996010949A1 (en) Video-based surgical targeting system
CN113017834B (en) Joint replacement operation navigation device and method
CN105555221A (en) Medical needle path display
KR20240021747A (en) Medical robots for ultrasound-guided needle placement
US6028912A (en) Apparatus and method for point reconstruction and metric measurement on radiographic images
Doignon et al. The role of insertion points in the detection and positioning of instruments in laparoscopy for robotic tasks
CN112869856B (en) Two-dimensional image guided intramedullary needle distal locking robot system and locking method thereof
KR20000011134A (en) Stereotactic surgical procedure apparatus and method
AU773931B2 (en) Stereotactic surgical procedure apparatus and method
EP1786349A1 (en) Method and arrangement for positioning a tool
CN117425448A (en) Ultrasound-probe-equipped robot for real-time guidance of percutaneous interventional therapy

Legal Events

Date Code Title Description
EEER Examination request
MKEX Expiry

Effective date: 20170515