US20020077543A1 - Method and apparatus for tracking a medical instrument based on image registration - Google Patents

Method and apparatus for tracking a medical instrument based on image registration

Info

Publication number
US20020077543A1
US20020077543A1 (application US09/892,402)
Authority
US
United States
Prior art keywords
instrument
image
operating space
patient
target site
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/892,402
Other versions
US6782287B2
Inventor
Robert Grzeszczuk
Ramin Shahidi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CALIFORNIA INSTITUTE OF COMPUTER ASSISTED SURGERY Inc
Stryker European Operations Holdings LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US09/892,402 (granted as US6782287B2)
Application filed by Individual
Assigned to CBYON, INC. (assignment of assignors interest; assignor: GRZESZCZUK, ROBERT)
Assigned to THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY (assignment of assignors interest; assignor: SHAHIDI, RAMIN)
Publication of US20020077543A1
Assigned to GENERAL ELECTRIC COMPANY (assignment of assignors interest; assignor: CBYON, INC.)
Publication of US6782287B2
Application granted
Assigned to SHAHIDI, RAMIN (assignment of assignors interest; assignor: THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY)
Assigned to SHAHIDI, RAMIN (change of assignee address)
Assigned to CALIFORNIA INSTITUTE OF COMPUTER ASSISTED SURGERY, INC. (assignment of assignors interest; assignor: SHAHIDI, RAMIN)
Assigned to GENERAL ELECTRIC COMPANY (corrective assignment to correct the receiving party address previously recorded at reel 015251, frame 0840; assignor: CBYON, INC.)
Assigned to STRYKER EUROPEAN HOLDINGS I, LLC (assignment of assignors interest; assignor: GENERAL ELECTRIC COMPANY)
Anticipated expiration
Assigned to STRYKER EUROPEAN HOLDINGS III, LLC (nunc pro tunc assignment; assignor: STRYKER EUROPEAN HOLDINGS I, LLC)
Assigned to STRYKER EUROPEAN OPERATIONS HOLDINGS LLC (change of name; assignor: STRYKER EUROPEAN HOLDINGS III, LLC)
Status: Expired - Lifetime

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/221Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems

Definitions

  • If the fluoroscope images have a pixel pitch of 1.0 mm, the position of a 2-3 mm diameter spherical fiducial can be found with a precision of ±0.2 mm. This yields a translational registration precision of 0.2 mm or better.
  • The rotational precision depends on the fiducial spacing and the angle of projection of the fiducial configuration in the fluoroscope image. For fiducials spaced 25 mm apart, a typical projection angle will resolve out-of-plane rotations with a precision of ±0.5 degrees.
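  • As a rough plausibility check of those figures (an editorial illustration, not part of the patent text, and based on a simplified geometric model): if each fiducial position is known to about ±0.2 mm and two fiducials are 25 mm apart, the smallest resolvable out-of-plane tilt is roughly arcsin(0.2/25), which is of the same order as the stated ±0.5 degrees.

```python
import math

fiducial_precision_mm = 0.2   # stated per-fiducial localization precision
fiducial_spacing_mm = 25.0    # stated fiducial spacing

# Simplified model (an assumption): the smallest resolvable out-of-plane tilt
# displaces one fiducial by about one localization error relative to its neighbour.
tilt_deg = math.degrees(math.asin(fiducial_precision_mm / fiducial_spacing_mm))
print(round(tilt_deg, 2))     # ~0.46, of the same order as the stated +/-0.5 degrees
```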
  • The technique of the present invention employs fluoroscopic imaging and registration using percutaneously implanted markers or skeletal anatomy as a minimally invasive approach. This helps in guiding surgical tools using pre-operative 3-D diagnostic scans. While the technique of the present invention does not use a real-time DRF for the sake of target movement monitoring, periodic re-registration is much more practical than in conventional approaches: misregistration can be detected and eliminated by simply re-acquiring two new fluoroscopic images and running a fairly automatic procedure that requires little or no intervention by the operator.
  • The registration technique has been adopted from the Cyberknife radiation treatment methodology, which has been shown to register either fiducial markers or skeletal anatomy with sub-millimeter and sub-degree precision in all six degrees of freedom, and with computational efficiency that allows the entire registration process to be completed in approximately one second. Additional digital re-projection techniques using off-the-shelf computer graphics hardware will further enhance the robustness, accuracy, and performance of the registration method.
  • The registration method of the present invention, coupled with more sophisticated visualization and biomechanical modeling techniques, can potentially be generalized to handle non-rigid deformations resulting from inter-vertebral displacement. This would allow the techniques of the present invention to be applied in clinical scenarios that involve more than a single vertebral body, without cumbersome patient fixation or making assumptions about unchanged pose between the time of the scan and intra-operative positioning.
  • The ability to intra-operatively register articulated, deformable spinal anatomy in near real-time and with high accuracy would be a critical improvement over existing systems.
  • An important differentiating factor of the present invention is that it provides the ability to track surgical tools with respect to the patient's anatomy as defined by a pre-operative diagnostic scan. Unlike traditional approaches where optical tracking is used to follow the surgical tools, the techniques of the present invention employ fluoroscopy for both registration and tool tracking with respect to a 3-D model from the diagnostic scan. This permits the system of the present invention to be applied in the context of minimally invasive percutaneous procedures where the tool may not be exposed and visible to the tracking device. This is particularly advantageous for flexible and/or articulated effectors, which cannot be tracked optically.
  • Another benefit of the present invention's approach is more effective use of the fluoroscope with less exposure to ionizing radiation on the part of the patient as well as the surgeon, because instrumented fluoroscopy can be used in a more controlled manner than during conventional freehand imaging.
  • The present invention does not use a fixed fiducial registration structure, nor does it require a fixed camera position.
  • Various aspects of the present invention, such as registering the operating space to the pre-operative data set, registering the C-Arm to the pre-operative data set to place the C-Arm's camera in the reference frame of the pre-operative data set, and constructing and displaying images/3-D composite renditions to track the tool as it is moved in the operating space, may be implemented by a program of instructions (i.e., software).
  • Software implementing one or more of these aspects may be written to run with existing software used for computer-assisted/image-guided surgery.
  • The software for any or all of these tasks may be fetched by the processor from RAM in computer system 22 for execution.
  • The software may be stored in a storage medium in storage device 36 and transferred to RAM when in use. Alternatively, the software may be transferred to RAM through communications device 40. More broadly, the software may be conveyed by any medium that is readable by the CPU. Such media may include, in addition to various magnetic and optical media, various communication paths throughout the electromagnetic spectrum, including infrared signals, signals transmitted through a network or the Internet, and carrier waves encoded to transmit the software.
  • The above-described aspects of the invention may also be implemented with functionally equivalent hardware using discrete components, application-specific integrated circuits (ASICs), digital signal processing circuits, or the like.
  • Such hardware may be physically integrated with the computer processor(s) or may be a separate device which may be embodied on a computer card that can be inserted into an available card slot in the computer.
  • The present invention provides a method, apparatus and system for tracking a surgical tool, flexible or rigid, and localizing it with respect to the patient's anatomy and pre-operative 3-D diagnostic scans, using intra-operative fluoroscopy for in situ registration, without external calibration devices or a fixed camera position.
  • The resulting system leverages equipment already commonly available in operating rooms while providing a new, cost-effective medical instrument tracking technique that is free of many current limitations in the field.
  • The computer-assisted tracking system of the present invention, which provides 3-D navigation and guidance for a surgeon, offers many advantages. It improves accuracy, reduces risk, minimizes invasiveness, and shortens the time it takes to perform a variety of neurosurgical and orthopedic procedures, particularly of the spine.

Abstract

An apparatus, method and system for tracking a medical instrument, as it is moved in an operating space to a patient target site in the space, by constructing a composite, 3-D rendition of at least a part of the operating space based on an algorithm that registers preoperative 3-D diagnostic scans of the operating space with real-time, stereo x-ray or radiograph images of the operating space. The invention has particular utility in tracking a flexible medical instrument and/or a medical instrument that moves inside the patient's body and is not visible to the surgeon.

Description

  • This application claims priority to U.S. Provisional Patent Application Serial No. 60/214,324 filed Jun. 27, 2000, which is incorporated in its entirety herein by reference.[0001]
  • FIELD OF THE INVENTION
  • The present invention relates to an apparatus, method and system for tracking a medical instrument in three-dimensional (3-D) space based on diagnostic scan data and intra-operative stereo images. The invention has particular application in tracking instruments, both flexible and rigid, as they are moved inside a patient's body. The invention also relates to a processor-readable medium embodying a program of instructions (e.g., software) which may be employed with the apparatus or system for implementing aspects of the tracking method. [0002]
  • References [0003]
  • [1] R. Hofstetter, M. Slomczynski, M. Sati and L.-P. Nolte, "Fluoroscopy as an Imaging Means for Computer-Assisted Surgical Navigation," Computer Aided Surgery 4:65-76 (1999). [0004]
  • [2] R. Y. Tsai, “An Efficient and Accurate Camera Calibration Technique for 3D Machine Vision”, Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, Miami Beach, Fla., 1986, pages 364-374. [0005]
  • [3] M. J. Murphy, "An automatic six-degree-of-freedom image registration algorithm for image-guided frameless stereotaxic radiosurgery," Medical Physics 24(6) (June 1997). [0006]
  • [4] J. Weese, G. P. Penney, T. M. Buzug, C. Fassnacht and C. Lorenz, "2D/3D registration of pre-operative CT images and intra-operative X-ray projections for image guided surgery," in CARS97, H. U. Lemke, M. W. Vannier and K. Inamura, eds., pages 833-838 (1997). [0007]
  • [5] M. Roth, C. Brack, R. Burgkart, A. Czopf, H. Gotte and A. Schweikard, "Multi-view contourless registration of bone structures using single calibrated X-ray fluoroscope," CARS99, pages 756-761 (1999). [0008]
  • BACKGROUND OF THE INVENTION
  • Various scanning techniques are known for imaging and mapping body structures, which provide information regarding the location of a target site in a patient's body for surgical or diagnostic procedures. One such technique employs still photography, videography, radiological x-rays, or angiography to produce a 2-D projection of a 3-D object. [0009]
  • Another technique involves (1) acquiring 2-D image scans of the operating space and internal anatomical structures of interest either pre- or intra-operatively; (2) reconstructing 3-D images based on the acquired 2-D scans; and (3) segmenting the 3-D images. The scans are typically computerized tomographic (CT) scans, positron emission tomography (PET) scans, or magnetic resonance image (MRI) scans. [0010]
  • The image scans are registered with the patient to provide a basis for localizing or tracking a medical instrument with respect to anatomical features or other elements in the images, as the instrument is moved within the operating field during surgery. Registration involves the point-for-point mapping of the image space to the patient space, allowing corresponding points to be mapped together. Corresponding points are those points that represent the same anatomical features in two spaces. [0011]
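  • For concreteness, such a point-for-point mapping is commonly represented as a rigid transform estimated from corresponding points. The sketch below is an editorial illustration only (the point lists are made up, and the present invention itself registers without digitized point pairs or fiducials); it shows the standard least-squares fit between two point sets.

```python
import numpy as np

def fit_rigid_transform(image_pts, patient_pts):
    """Least-squares rigid transform (R, t) mapping image_pts onto patient_pts.

    Classic Kabsch/Procrustes solution; both inputs are (N, 3) arrays of
    corresponding points, i.e. the same anatomical features in the two spaces.
    """
    ci, cp = image_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (image_pts - ci).T @ (patient_pts - cp)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                     # proper rotation
    t = cp - R @ ci
    return R, t

# Made-up corresponding points (mm): image space vs. patient space.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])                      # rotation about z
t_true = np.array([0.1, 0.0, 0.9])
image_pts = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
patient_pts = image_pts @ R_true.T + t_true
R, t = fit_rigid_transform(image_pts, patient_pts)
print(np.allclose(image_pts @ R.T + t, patient_pts))       # corresponding points map together
```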
  • With registration established, appropriate equipment can be used to track the medical instrument relative to internal structures of the patient as it is navigated in and around the patient target site during surgery. Images of the target site are displayed to assist the user (e.g., the surgeon) in navigating to the target site. Conventional tracking equipment includes a structure to define a 3-D reference coordinate system relative to the patient or operating space. [0012]
  • One such structure used for instrument localization or tracking in neurosurgery is a large ring-like device which surrounds the patient's head and is fixed relative thereto. The ring establishes a 3-D coordinate system with respect to the patient's head. A separate calibration unit having an array of rod elements is fixed to the ring to surround the head during the generation of scan and/or 2-D images. The rods, which have known coordinates in the 3-D coordinate system defined by the ring, produce spots in the scans. Other features in the volume scans can then be assigned coordinates in the 3-D coordinate system by correlation with the known coordinates of the spots produced by the rod elements. [0013]
  • After the images are made, the calibration unit is detached from the ring, and a guidance arc calibrated to the 3-D coordinate system of the ring is attached in its place. The arc provides coordinate reference information to guide the instrument which is usually attached to the arc. [0014]
  • Cranial implants of radio-opaque or MRI-opaque materials have also been used as a localization structure. Three or more of such implants are made and used to establish a 3-D coordinate system. [0015]
  • Another type of localization device is a fiducial structure which is positioned in the operating space for calibrating the operating space in terms of a 3-D coordinate framework. The fiducial structure includes a set of fiducial points connected by a frame constructed to hold the points in fixed spatial relation to each other. The 3-D operating space framework is derived by computation from the two 2-D projections of a calibration image pair obtained from video cameras while the fiducial structure is positioned in the operating space. After the calibration image pair is made, the fiducial structure is removed, and a standard projection algorithm is used to reconstruct the operating space framework from the calibration image pair. Such a framework is then aligned with a 3-D volume scan framework and can be used to locate and track a medical instrument in the operating space, so long as the cameras remain in fixed positions relative to the operating space. [0016]
  • A basic disadvantage with these conventional 3-D reference frame structures is that they add an extra degree of complication to the tracking process by establishing a coordinate framework for the operating space. [0017]
  • In the area of computer-assisted spine surgery, various systems have been proposed for registration and localization. These systems are generally similar in the methodology used and the functionality provided, with the majority of such systems employing optical trackers for the purpose of registration and localization. Typically, the vertebra of interest is fully exposed intra-operatively, and a small number of distinct anatomical landmarks are digitized for the purpose of coarse registration. Subsequently, a larger number of points are digitized on the surface of the vertebra to refine the registration with a surface matching technique. The procedure is often cumbersome, time consuming, and of limited accuracy. This is mainly due to difficulties in identifying characteristic anatomical landmarks in a reproducible fashion and inherent inaccuracies of surface matching techniques. While dynamic reference frames (DRFs) are commonly used to monitor target movement, any safeguarding against DRF misregistration requires the entire process, including the laborious manual digitization part, to be repeated. The problem is exacerbated in procedures involving multiple vertebrae (e.g., cage placements). Such systems are also of limited use in the context of percutaneous procedures, because they rely on the target structure being directly visible to the optical tracking device. [0018]
  • Recently, there has been some interest in fluoroscopy as an intra-operative imaging modality [1]. The relatively low cost and pervasiveness of C-Arm devices in modern operating rooms (ORs) drives this interest. Most of these attempts focus on improving conventional 2-D navigation techniques via tracking of the C-Arm and re-projecting preoperative CT data onto multiple planes. Such techniques are helpful in lowering the amount of ionizing radiation delivered to the patient and the OR staff during free-hand navigation and also in providing more information to the surgeon about the relative position of the surgical tools with respect to the patient's anatomy. However, they essentially automate and streamline the current workflow and rely on the surgeon's ability to create a complex spatial model mentally. [0019]
  • Thus, there is a need for a more efficient and effective technique for performing registration and localization to track a medical instrument in an operating space that eliminates the need to establish an operating space framework and the complications associated with DRFs. [0020]
  • SUMMARY OF THE INVENTION
  • It is therefore an object of this invention to provide a technique that employs stereoscopic registration in order to relate a patient's anatomy to pre-operative diagnostic scans in 3-D without the aid of an external calibration device. [0021]
  • It is another object of this invention to provide a technique that is able to track a surgical tool and localize it with respect to the patient's anatomy and pre-operative diagnostic scans using intra-operative fluoroscopy for in situ registration. [0022]
  • Advantageously, the technique of the present invention does not require a fixed fiducial registration structure, nor does it require a fixed camera position relative to the operating space. Moreover, the technique provides a way of tracking a surgical tool with respect to a 3-D model, instead of using a 2-D projection. [0023]
  • In one aspect, the invention provides an apparatus for use in an image-guided surgical or a diagnostic procedure for tracking an instrument in an operating space that includes a target site of a patient. The apparatus comprises a data-storage medium for storing scan data representing scans of the operating space including the patient target site; an image capture device adapted to capture at least two stereo images of the operating space including the patient target site and such instrument during the image-guided surgical or diagnostic procedure; a display device; and a processor in communication with the data-storage medium, the image capture device, and the display device. The processor is operable to register selected scan data with the stereo images without using a fiducial structure, construct a composite, three-dimensional rendition showing features from the selected scan data and the stereo images, and display the composite rendition on the display device, so as to enable a user to track the instrument as it is moved within the operating space to a selected position with respect to the patient target site. [0024]
  • The processor preferably constructs the composite rendition using stereo photogrammetry to extract three-dimensional information from projective images, and the image capture device preferably comprises a plurality of x-ray devices adapted to capture at least two stereo radiographic images. [0025]
  • The apparatus is capable of tracking an instrument, which may be flexible or rigid, not visible to a user. In particular, the apparatus may be used to track a flexible instrument such as a flexible catheter through a vascular network in a patient. In such an arrangement, the data-storage medium of the apparatus stores scan data representing scans of the vascular network, and the image capture device, in the form of an x-ray device, is adapted to capture at least two stereo radiographic images of the vascular network and such instrument during the image-guided surgical or diagnostic procedure. The composite rendition is displayed on the display device, so as to enable a user to track and navigate the flexible instrument as it is moved through a selected path in the vascular network to a selected position in the patient's body. [0026]
  • In another aspect, the invention involves a method for use in an image-guided surgical or a diagnostic procedure for tracking an instrument, which may be flexible or rigid, in an operating space that includes a target site of a patient. The method may be used to track such an instrument whether or not it is visible to a user. The method comprises storing scan data representing scans of the operating space including the patient target site; capturing at least two stereo images of the operating space including the patient target site and such instrument during the image-guided surgical or diagnostic procedure; registering selected scan data with the stereo images without using a fiducial structure; constructing a composite, three-dimensional rendition showing features from the selected scan data and the stereo images; and displaying the composite rendition, so as to enable a user to track the instrument as it is moved within the operating space to a selected position with respect to the patient target site. [0027]
  • Preferably, the constructing comprises constructing the composite rendition using stereo photogrammetry to extract three-dimensional information from projective images, and the capturing comprises capturing at least two radiographic images of the operating space including the patient target site and such instrument during the surgical or diagnostic procedure. [0028]
  • Another aspect of the invention provides a processor-readable medium embodying a program of instructions for execution by a processor for performing a method, used in an image-guided surgical or a diagnostic procedure, for tracking an instrument in an operating space that includes a target site of a patient. The program of instructions comprises instructions for performing the above-described method.[0029]
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 is a partially perspective, partially schematic view of an image-guided surgery system, according to embodiments of the invention. [0030]
  • FIG. 2 is a schematic diagram depicting the architecture of a computer system which may be used in the image-guided surgery system. [0031]
  • FIG. 3 is a flow chart illustrating an image-guided surgical method, according to embodiments of the invention.[0032]
  • DETAILED DESCRIPTION OF THE INVENTION
  • The general set-up of fluoroscopic imaging for computer-assisted surgery is described in document [1] listed in the “References” section of this application. The contents of that document are incorporated by reference herein. [0033]
  • I. System Overview [0034]
  • Referring to FIG. 1, a mobile fluoroscopic device 12 is used for intra-operative, free-hand imaging of selected portions of the anatomy of a patient 10. Fluoroscopic device 12 is preferably a C-Arm of the type which may be obtained from General Electric, Milwaukee, Wis. The mobile fluoroscopic device includes an X-ray camera 14 and an image intensifier 16. The system also includes a surgical instrument 18, which may be any of a variety of devices including a catheter having a flexible or rigid construction. The C-arm/image intensifier and the surgical instrument are each equipped with emitters 16a and 18a, respectively, that define local coordinate systems for those components. The emitters may be infrared light-emitting diode (LED) markers which are in communication with a tracking device or position sensor 20, which may be an optical/electronic device, such as an Optotrak available from Northern Digital, Waterloo, Ontario, Canada. The position sensor tracks these components within an operating space 19, and supplies the data needed to perform coordinate transformations between the various local coordinate systems to a computer system 22, such as a workstation computer of the type available from Sun Microsystems, Mountain View, Calif. or Silicon Graphics Inc., Mountain View, Calif. The NTSC video output of camera 14 is also processed by the computer system. A video framegrabber board, such as an SLIC-Video available from Osprey Systems, Cary, N.C., may also be employed to allow loading of gray-scale images from the video buffer of the C-arm to the computer system. [0035]
  • The general architecture of such a computer system is shown in more detail in FIG. 2. The computer system includes a central processing unit (CPU) 30 that provides computing resources and controls the computer. CPU 30 may be implemented with a microprocessor or the like, and may also include a graphics processor and/or a floating point coprocessor for mathematical computations. Computer 22 also includes system memory 32, which may be in the form of random-access memory (RAM) and read-only memory (ROM). Input device(s) 34, such as a keyboard, mouse, foot pedal, stylus, etc., are used to input data into the computer. Storage device(s) 36 include a storage medium such as magnetic tape or disk, or optical disk, e.g., a compact disk, that are used to record programs of instructions for operating systems, utilities and applications. The storage device(s) may be internal, such as a hard disk, and may also include a disk drive for reading data and software embodied on external storage media such as compact disks, etc. Storage device 36 may be used to store one or more programs and data that implement various aspects of the present invention, including the registration and tracking procedures. One or more display devices 38 are used to display various images to the surgeon during the surgical procedure. Display device(s) 38 are preferably high-resolution device(s). The computer may also include communications device(s) 40, such as a modem or other network device for making connection to a network, such as a local area network (LAN), the Internet, etc. With such an arrangement, program(s) and/or data that implement various aspects of the present invention may be transmitted to computer 22 from a remote location (e.g., a server or another workstation) over a network. All major system components of the computer may connect to a bus 42, which may be more than one physical bus. Bus 42 is preferably a high-bandwidth bus to improve the speed of image display during the procedure. [0036]
  • Referring to the flow chart of FIG. 3, an overview of the general steps of an image-guided surgical method according to embodiments of the present invention is illustrated. Initially, in step 301, intrinsic camera calibration is performed. In step 302, image data acquired by camera 14 is used to register a pre-operative CT data set (e.g., 130 slices, 512×512, 1.0 mm thick) to the patient's reference frame. [0037]
  • During the course of the registration procedure, the system operator performs an initial registration of the patient's anatomy to the pre-operative CT data set by taking at least two protocoled fluoroscopic views (e.g., an AP view and a lateral view) of the operating space 19, including a patient target site 50 (e.g., a target vertebra). These images are then used to compute the C-Arm-to-CT data set registration in step 303 using a fully automatic technique described in detail below. As a result, the position and orientation of the C-Arm's camera is obtained in the reference frame of the CT data set. [0038]
  • This information is then used, together with the position and orientation of the camera in the reference frame of the tracking device, to follow the future relative motions of the camera. Tracking data is used to monitor the relative position changes of camera 14 with respect to the patient during free-hand navigation. [0039]
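  • In practice this amounts to composing homogeneous transforms: the one-time C-Arm-to-CT registration is chained with the tracker's report of how the camera has moved since that registration. A minimal sketch with made-up 4x4 matrices follows; it assumes the patient stays fixed relative to the tracker between re-registrations.

```python
import numpy as np

def rigid(R, t):
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Registration result (step 303): camera pose in the CT reference frame at time 0.
T_ct_cam0 = rigid(np.eye(3), [50.0, 0.0, 200.0])            # hypothetical values
# Tracker readings: camera pose in the tracker frame at time 0 and at a later time k.
T_trk_cam0 = rigid(np.eye(3), [0.0, 0.0, 1000.0])
T_trk_camk = rigid(np.eye(3), [0.0, 20.0, 1000.0])          # camera moved 20 mm

# Follow the camera: CT<-cam(k) = CT<-cam(0) * cam(0)<-tracker * tracker<-cam(k),
# assuming the patient has not moved relative to the tracker since registration.
T_ct_camk = T_ct_cam0 @ np.linalg.inv(T_trk_cam0) @ T_trk_camk
print(T_ct_camk[:3, 3])                                     # camera now at [50, 20, 200] in CT space
```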
  • Misregistration, due to either patient movement or system error, can be detected at any time by comparing the predicted Digitally Reconstructed Radiograph (DRR) to the actual fluoroscopic image, at which time the objects can be re-registered within seconds. [0040]
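  • A simple form of such a check is to score the predicted DRR against the live fluoroscopic frame with an image-similarity measure and trigger re-registration when the score drops. The sketch below uses normalized cross-correlation and an arbitrary threshold; the synthetic arrays merely stand in for a rendered DRR and an acquired image.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two images (1.0 = identical up to gain/offset)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def registration_ok(predicted_drr, fluoro_image, threshold=0.9):
    """Flag misregistration (patient movement or system error) when the predicted
    DRR no longer resembles the acquired fluoroscopic image. Threshold is illustrative."""
    return ncc(predicted_drr, fluoro_image) >= threshold

# Synthetic stand-ins: a "DRR" and a shifted copy representing a moved patient.
rng = np.random.default_rng(0)
drr = rng.random((128, 128))
print(registration_ok(drr, drr))                          # True  -> still registered
print(registration_ok(drr, np.roll(drr, 15, axis=0)))     # False -> re-acquire and re-register
```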
  • With the surgical tool visible in at least two fluoroscopic views, the tool is then back-projected into the reference frame of the CT data set using standard stereoscopic techniques that are well known in the computer vision community (step 304). The position and orientation of the tool can then be visualized with respect to a 3-D model of the anatomical structure of interest. Using this composite rendition, in which the tool is present in the reference frame of the scan data, the tool is tracked (step 305). The tool can also be tracked externally (e.g., using the tracking device already employed, or a robotic interface) to facilitate a variety of surgical procedure types. [0041]
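  • The back-projection referred to here is ordinary two-view triangulation: with the projection matrix of each registered view expressed in the CT reference frame, the tool feature's pixel coordinates in the two views determine its 3-D position through a linear least-squares (DLT) solve. A sketch with made-up AP and lateral projection matrices:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point from two registered views.

    P1, P2 : 3x4 projection matrices expressed in the CT reference frame.
    uv1, uv2 : pixel coordinates (u, v) of the same tool feature in each view.
    """
    A = np.vstack([uv1[0] * P1[2] - P1[0],
                   uv1[1] * P1[2] - P1[1],
                   uv2[0] * P2[2] - P2[0],
                   uv2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Hypothetical AP and lateral views: 1000 px focal length, image centre (256, 256).
K = np.array([[1000.0, 0.0, 256.0], [0.0, 1000.0, 256.0], [0.0, 0.0, 1.0]])
P_ap = K @ np.hstack([np.eye(3), [[0.0], [0.0], [1000.0]]])              # AP camera 1000 mm from the site
R_lat = np.array([[0.0, 0.0, -1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])   # lateral view: rotated about y
C_lat = np.array([[-1000.0], [0.0], [0.0]])                              # lateral camera centre (mm)
P_lat = K @ np.hstack([R_lat, -R_lat @ C_lat])

tip = np.array([5.0, -3.0, 10.0])                                        # "true" tool tip in CT space (mm)
print(triangulate(P_ap, P_lat, project(P_ap, tip), project(P_lat, tip))) # ~[5, -3, 10]
```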
  • II. Intrinsic Camera Calibration [0042]
  • Before use, the system is calibrated to eliminate imaging distortions and extract parameters required to characterize the imaging system. During the process of intrinsic camera calibration, a set of parameters that characterize the internal geometry of the imaging system is obtained. All of these parameters are well documented in the computer vision literature [2]. They include effective focal length (i.e., the exact distance from the X-ray source to the image plane), focal spot (i.e., the exact place where the optical axis pierces the image plane), magnification factors, and pincushion distortions. It is also well understood that many of the C-Arm's intrinsic parameters vary with orientation. For example, the focal length and focal spot will be different for vertical and horizontal positioning, due to mechanical sag of the arm. Similarly, the pincushion distortion depends on the orientation of the device with respect to the Earth's magnetic field, and external source electromagnetic fields. [0043]
  • For these reasons, a calibration jig 24 is placed in the field of view of the X-ray camera, e.g., fixed to the face of the image intensifier, in order to adaptively calibrate the system every time an image is acquired. The intrinsic parameters thus generated are used during the registration phase. It should be noted that the calibration jig 24 is used exclusively for the purpose of correcting distortion of the imaging system and not for the purpose of registration. [0044]
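  • One common way to use such a jig (sketched below as an assumption, since the patent does not spell out the correction math) is to detect the jig markers in each acquired image, fit a smooth polynomial mapping from their distorted positions to their known ideal positions, and apply that mapping to the image coordinates used downstream.

```python
import numpy as np

def monomials(points, order):
    """Design matrix of 2-D monomials u^i * v^j with i + j <= order."""
    u, v = points[:, 0], points[:, 1]
    return np.stack([u**i * v**j
                     for i in range(order + 1)
                     for j in range(order + 1 - i)], axis=1)

def fit_dewarp(measured, ideal, order=3):
    """Least-squares polynomial map from distorted to ideal image coordinates."""
    coeffs, *_ = np.linalg.lstsq(monomials(measured, order), ideal, rcond=None)
    return coeffs, order

def apply_dewarp(points, model):
    coeffs, order = model
    return monomials(points, order) @ coeffs

# Synthetic jig: a 7x7 marker grid seen through mild radial ("pincushion-like") distortion.
g = np.linspace(-100.0, 100.0, 7)
ideal = np.array([(x, y) for x in g for y in g])
r2 = (ideal ** 2).sum(axis=1, keepdims=True)
measured = ideal * (1.0 + 2e-6 * r2)                      # stand-in distortion model
model = fit_dewarp(measured, ideal)
before = np.abs(measured - ideal).max()
after = np.abs(apply_dewarp(measured, model) - ideal).max()
print(before, after)                                      # residual shrinks after correction
```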
  • III. Extrinsic Camera Calibration and Registration [0045]
  • Once the intrinsic camera parameters are measured, the imaging system can be registered to the patient. Given a CT study of the relevant anatomy in the operating space and a fluoroscopic image of the same anatomy, the task involves finding the position and orientation of camera 14 in the reference frame of the CT study that produced the image. For this purpose, a technique used by a commercial frameless image-guided radiosurgery system (Cyberknife, Accuray Inc., Sunnyvale, Calif., USA) [3] is employed. In this approach, the system hypothesizes about the camera's actual six degrees of freedom (DOF) position by introducing a virtual camera and computing a simulated radiograph (e.g., a DRR). The radiograph represents the actual patient position. If the DRR matches the radiograph exactly, the virtual camera position identifies the actual camera position and thus the registration is found; otherwise another hypothesis is formed. In practice, the accuracy of such a basic radiograph-to-DRR registration method can be substantially improved if two or more views of the same anatomy can be used. In this scenario, it is assumed that the relative positions of the cameras are known (e.g., from tracking device 20). Therefore, instead of finding two or more independent camera positions [4], the task can be reformulated as finding the pose of the CT study with respect to the rigid configuration of multiple cameras. In essence, the registration process can be viewed as an extrinsic calibration of an abstract imaging system consisting of multiple, rigidly mounted cameras. [0046]
  • The radiograph-to-DRR registration procedure has three parts: (1) exploring the range of possible patient positions to be represented in the DRRs (i.e., hypothesis formation strategy), (2) identifying those elements in the images (image features) that most effectively capture information about the patient pose, and then (3) using a comparison statistic or cost function for the two sets of image feature data to indicate when the best match has been achieved. [0047]
  • There are three fundamental ways of generating hypotheses for possible patient placement. One approach, called matched filter area correlation, generates a single reference image from a DRR representing the desired patient position from the camera point of view. This reference image, or part thereof, is shifted and rotated relative to the acquired image until the best fit is achieved, in the manner of a sliding-window matched filter. A second approach, referred to herein as interpolative area correlation, consists of calculating a set of reference DRRs that samples a full range of possible patient positions and orientations, and then making an interpolative comparison of the acquired fluoroscopic image with each of the DRRs. Using two cameras, either of the aforementioned methods can accurately measure all three translational and one rotational degree of freedom, provided there are no out-of-plane degrees of freedom [6]. A third method consists of iteratively re-projecting DRRs while perturbing the pose of the patient image in the CT study, until a DRR is made that matches the fluoroscopic image. The iterative re-projection technique can accurately measure all six DOFs and is the approach taken in the present invention. [0048]
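The iterative re-projection strategy can be summarized as a six-parameter optimization. The sketch below is a schematic outline only: make_drr and cost stand in for the DRR projector and the comparison statistic discussed below, and the camera keyword and the Powell optimizer are assumptions, not the patent's stated implementation.

```python
# Schematic 6-DOF pose search by iterative re-projection.
# make_drr(pose, camera=...) and cost(drr, fluoro) are hypothetical stand-ins
# for the DRR projector and the similarity/cost function.
import numpy as np
from scipy.optimize import minimize

def register(fluoro_a, fluoro_b, make_drr, cost):
    def objective(pose):                   # pose = (tx, ty, tz, rx, ry, rz)
        drr_a = make_drr(pose, camera="A")
        drr_b = make_drr(pose, camera="B")
        # Two rigidly related views together constrain all six DOFs.
        return cost(drr_a, fluoro_a) + cost(drr_b, fluoro_b)
    result = minimize(objective, np.zeros(6), method="Powell")
    return result.x
```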
  • Comparison of the DRRs and acquired radiographs is a problem in pattern recognition. The sets of image data used to compare two images are called feature vectors. The most primitive feature vector is simply the complete set of pixel gray-scale values, where each pixel's brightness is the magnitude of a vector component. More sophisticated feature vectors are usually sought to emphasize the important large-scale structure in the image and minimize the extraneous information and noise. Formal systems of feature extraction involve recasting the original gray-scale feature vector on an orthogonal system of eigenvectors that relate to large-scale patterns that are not necessarily physical structures. Heuristic feature vectors identify the positions and shapes of physical edges, boundaries, and other discernible structures. [0049]
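As a concrete illustration of these feature-vector choices (not prescribed by the patent), the raw gray-scale vector is simply the flattened image, and one way to obtain an orthogonal eigenvector basis is a principal-component decomposition of a set of reference DRRs; the function names below are hypothetical.

```python
# Two illustrative feature-vector constructions.
import numpy as np

def gray_feature(image):
    return image.astype(float).ravel()     # raw gray-scale feature vector

def eigen_basis(reference_images, k=16):
    # Orthogonal eigenvector basis from reference DRRs (PCA-style illustration
    # of "recasting on an orthogonal system of eigenvectors").
    X = np.stack([gray_feature(im) for im in reference_images])
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return vt[:k], mean                    # top-k large-scale patterns

def eigen_feature(image, basis, mean):
    return basis @ (gray_feature(image) - mean)
```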
  • The DRR and radiograph feature vectors A and B are compared using a similarity statistic or cost function. This can be simply the cross-correlation coefficient r = A·B = cos θ, where A and B have been normalized to unit length. (The vectors can be centered on their means before normalization, which gives Pearson's correlation coefficient.) A more general and flexible comparison can be made with the chi-squared statistic χ² = Σᵢ(Aᵢ − Bᵢ)²/wᵢ², where each vector component is weighted according to both its reliability and usefulness, and the vectors are not necessarily normalized. When the vectors are normalized, and all vector components carry equal weight, χ² is proportional to 1 − r. [0050]
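The two statistics can be written out directly; the short sketch below is illustrative only and also makes the stated relationship explicit: with unit-length vectors and equal weights, χ² works out to 2(1 − r), i.e. proportional to 1 − r.

```python
# Cross-correlation and chi-squared comparison of feature vectors A and B.
import numpy as np

def cross_correlation(a, b):
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(a @ b)                    # r = A.B = cos(theta)

def chi_squared(a, b, w):
    return float(np.sum((a - b) ** 2 / w ** 2))

# For unit-length a, b and unit weights: chi_squared(a, b, 1.0) == 2 * (1 - r),
# consistent with the proportionality to 1 - r noted above.
```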
  • IV. Image Processing [0051]
  • For fiducial-based registration, the fluoroscope image contrast is expanded and then thresholded to highlight the fiducial shadows. The image-plane coordinates of the fiducials are automatically extracted using one of three methods: (1) A Sobel edge filter is convolved across the image to further enhance the fiducials, the image is thresholded again, and then x and y moments are computed in the neighborhood of the bright fiducial edge features; (2) a matched filter representing the shadow of the fiducial is convolved across the image and the convolution maxima are isolated by thresholding; (3) if spherical fiducials have been used, a circular Hough transform is applied to the image, resulting in a bright maximum at the center of each fiducial shadow. Registration of skeletal landmarks is accomplished by edge-filtering the fluoroscope and DRR reference images and then locating the points where the anatomical edges intersect line segments at fixed positions in the images. These points make up the feature vectors for the radiographs and DRRs. With eight to ten feature points in each fluoroscope view, the translational and rotational registration can again achieve ±0.2 mm and ±0.5 degrees precision, respectively. [0052]
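The sketch below illustrates only the first of the three extraction options (contrast expansion, thresholding, Sobel enhancement, re-thresholding, and moment computation), with scikit-image and SciPy as assumed libraries; the threshold polarity and the 0.5 edge cutoff are placeholders that depend on the imaging chain.

```python
# Illustrative fiducial localization via option (1): Sobel edge enhancement,
# thresholding, and x/y moments (centroids) of the bright edge features.
import numpy as np
from scipy import ndimage
from skimage import exposure, filters

def locate_fiducials(fluoro):
    img = exposure.rescale_intensity(fluoro.astype(float), out_range=(0.0, 1.0))
    shadows = img > filters.threshold_otsu(img)     # placeholder polarity
    edges = filters.sobel(shadows.astype(float))    # enhance fiducial edges
    peaks = edges > 0.5 * edges.max()               # threshold again
    labels, n = ndimage.label(peaks)
    # Centroid (x/y moments) of each connected bright edge feature.
    return ndimage.center_of_mass(peaks, labels, range(1, n + 1))
```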
  • If the fluoroscope images have a pixel pitch of 1.0 mm, the position of a 2-3 mm diameter spherical fiducial can be found with a precision of ±0.2 mm. This yields a translational registration precision of 0.2 mm or better. The rotational precision depends on the fiducial spacing and the angle of projection of the fiducial configuration in the fluoroscope image. For fiducials spaced 25 mm apart a typical projection angle will resolve out-of-plane rotations with a precision of ±0.5 degrees. [0053]
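A quick back-of-the-envelope check (ours, not from the patent text) is consistent with these figures: a ±0.2 mm localization error across a 25 mm fiducial baseline corresponds to an angular uncertainty of roughly atan(0.2/25) ≈ 0.46 degrees, i.e. about the quoted ±0.5 degrees.

```python
# Sanity check of the quoted rotational precision (illustrative arithmetic).
import math
print(math.degrees(math.atan2(0.2, 25.0)))   # ~0.46 degrees
```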
  • V. Features and Advantages [0054]
  • Various factors differentiating the technique and solution of the present invention from those of others include: the selection of fluoroscopy as the in situ imaging technique, the use of stereo photogrammetry to extract 3-D information from projective images, and the use of a robust, precise and practical registration method. [0055]
  • Unlike all currently available 3-D spinal navigation packages, which require full exposure of the vertebral body for registration and real-time optical tracking, the technique of the present invention employs fluoroscopic imaging and registration using percutaneously implanted markers or skeletal anatomy as a minimally invasive approach. This aids in guiding surgical tools using pre-operative 3-D diagnostic scans. While the technique of the present invention does not use a real-time DRF for target movement monitoring, periodic re-registration is much more practical than in conventional approaches: misregistration can be detected and eliminated simply by re-acquiring two new fluoroscopic images and running a largely automatic procedure that requires little or no intervention by the operator. The registration technique has been adopted from the Cyberknife radiation treatment methodology, which has been shown to register either fiducial markers or skeletal anatomy with sub-millimeter and sub-degree precision in all six degrees of freedom, and with a computational efficiency that allows the entire registration process to complete in approximately one second. Additional digital re-projection techniques using off-the-shelf computer graphics hardware will further enhance the robustness, accuracy, and performance of the registration method. [0056]
  • Similarly, the registration method of the present invention coupled with more sophisticated visualization and biomechanical modeling techniques can potentially be generalized to handle non-rigid deformations resulting from inter-vertebral displacement. This would allow the techniques of the present invention to be applied in clinical scenarios that involve more than a single vertebral body, without cumbersome patient fixation or making assumptions about unchanged pose between the time of the scan and intra-operative positioning. The ability to intra-operatively register articulated, deformable spinal anatomy in near real-time and with high accuracy would be a critical improvement over existing systems. [0057]
  • An important differentiating factor of the present invention is that it provides the ability to track surgical tools with respect to the patient's anatomy as defined by a pre-operative diagnostic scan. Unlike traditional approaches where optical tracking is used to follow the surgical tools, the techniques of the present invention employ fluoroscopy for both registration and tool tracking with respect to a 3-D model from the diagnostic scan. This permits the system of the present invention to be applied in the context of minimally invasive percutaneous procedures where the tool may not be exposed and visible to the tracking device. This is particularly advantageous for flexible and/or articulated effectors, which cannot be tracked optically. [0058]
  • Another benefit of the present invention's approach is more effective use of the fluoroscope with less exposure to ionizing radiation on the part of the patient as well as the surgeon, because instrumented fluoroscopy can be used in a more controlled manner than during conventional freehand imaging. [0059]
  • Finally, the present invention does not use a fixed fiducial registration structure, nor require a fixed camera position. [0060]
  • VI. System Implementation [0061]
  • As previously noted, various aspects of the present invention, such as registering the operating space to the pre-operative data set, registering the C-Arm to the pre-operative data set to place the C-Arm's camera in the reference frame of the pre-operative data set, and constructing and displaying images/3-D composite renditions to track the tool as it is moved in the operating space may be implemented by a program of instructions (i.e., software). Software implementing one or more of these aspects may be written to run with existing software used for computer-assisted/image-guided surgery. [0062]
  • The software for any or all of these tasks may be fetched by the processor from RAM in [0063] computer system 22 for execution. The software may be stored in a storage medium in storage device 36 and transferred to RAM when in use. Alternatively, the software may be transferred to RAM through communications device 40. More broadly, the software may be conveyed by any medium that is readable by the CPU. Such media may include, in addition to various magnetic and optical media, various communication paths throughout the electromagnetic spectrum including infrared signals, signals transmitted through a network or the Internet, and carrier waves encoded to transmit the software.
  • As an alternative to software implementation, the above-described aspects of the invention also may be implemented with functionally equivalent hardware using discrete components, application specific integrated circuits (ASICs), digital signal processing circuits, or the like. Such hardware may be physically integrated with the computer processor(s) or may be a separate device which may be embodied on a computer card that can be inserted into an available card slot in the computer. [0064]
  • Thus, the above-described aspects of the invention can be implemented using software, functionally equivalent hardware, or a combination thereof. The diagrams and accompanying description provide the functional information one skilled in the art would require to implement a system that performs the required functions using any of these approaches. [0065]
  • As should be readily apparent from the foregoing description, the present invention provides a method, apparatus and system for tracking a surgical tool, flexible or rigid, and localizing it with respect to the patient's anatomy and pre-operative 3-D diagnostic scans, using intra-operative fluoroscopy for in situ registration without external calibration devices or a fixed camera position. The resulting system leverages equipment already commonly available in operating rooms while providing a new, cost-effective medical instrument tracking technique that is free of many current limitations in the field. The computer-assisted tracking system of the present invention, which provides 3-D navigation and guidance for a surgeon, offers many advantages. It improves accuracy, reduces risk, minimizes invasiveness, and shortens the time required to perform a variety of neurosurgical and orthopedic procedures, particularly of the spine. [0066]
  • While embodiments of the invention have been described, it will be apparent to those skilled in the art in light of the foregoing description that many further alternatives, modifications and variations are possible. The invention described herein is intended to embrace all such alternatives, modifications and variations as may fall within the spirit and scope of the appended claims. [0067]

Claims (14)

What is claimed:
1. An apparatus for use in an image-guided surgical or a diagnostic procedure for tracking an instrument in an operating space that includes a target site of a patient, the apparatus comprising:
(a) a data-storage medium for storing scan data representing scans of the operating space including the patient target site;
(b) an image capture device adapted to capture at least two stereo images of the operating space including the patient target site and such instrument during the image-guided surgical or diagnostic procedure;
(c) a display device; and
(d) a processor in communication with the data-storage medium, the image capture device, and the display device for:
(i) registering selected scan data with the stereo images without using a fiducial structure,
(ii) constructing a composite, three-dimensional rendition showing features from the selected scan data and the stereo images, and
(iii) displaying the composite rendition on the display device, so as to enable a user to track the instrument as it is moved within the operating space to a selected position with respect to the patient target site.
2. The apparatus of claim 1, wherein the processor constructs the composite rendition using stereo photogrammetry to extract three-dimensional information from projective images.
3. The apparatus of claim 1, wherein the image capture device comprises a plurality of x-ray devices, and the at least two captured stereo images are radiographic images.
4. The apparatus of claim 1, wherein the apparatus is capable of tracking an instrument not visible to a user.
5. The apparatus of claim 1, wherein the apparatus is capable of tracking a flexible instrument not visible to a user.
6. An apparatus for use in an image-guided surgical or a diagnostic procedure for tracking a flexible instrument such as a flexible catheter through a vascular network in a patient, the apparatus comprising:
(a) a data-storage medium for storing scan data representing scans of the vascular network;
(b) an x-ray device adapted to capture at least two stereo radiographic images of the vascular network and such instrument during the image-guided surgical or diagnostic procedure;
(c) a display device; and
(d) a processor in communication with the data-storage medium, the x-ray device, and the display device for:
(i) registering selected scan data with the stereo images without using a fiducial structure,
(ii) constructing a three-dimensional, composite rendition showing features from the selected scan data and the stereo images, and
(iii) displaying the composite rendition on the display device, so as to enable a user to track and navigate the flexible instrument as it is moved through a selected path in the vascular network to a selected position in the patient's body.
7. A method for use in an image-guided surgical or a diagnostic procedure for tracking an instrument in an operating space that includes a target site of a patient, the method comprising:
(a) storing scan data representing scans of the operating space including the patient target site;
(b) capturing at least two stereo images of the operating space including the patient target site and such instrument during the image-guided surgical or diagnostic procedure;
(c) registering selected scan data with the stereo images without using a fiducial structure;
(d) constructing a composite, three-dimensional rendition showing features from the selected scan data and the stereo images; and
(e) displaying the composite rendition, so as to enable a user to track the instrument as it is moved within the operating space to a selected position with respect to the patient target site.
8. The method of claim 7, wherein the constructing (d) comprises constructing the composite rendition using stereo photogrammetry to extract three-dimensional information from projective images.
9. The method of claim 7, wherein the capturing (b) comprises capturing at least two radiographic images of the operating space including the patient target site and such instrument during the image-guided surgical or diagnostic procedure.
10. The method of claim 7, wherein the tracked instrument is not visible to a user.
11. The method of claim 7, wherein the tracked instrument is a flexible instrument not visible to a user.
12. A processor-readable medium embodying a program of instructions for execution by a processor for performing a method, used in an image-guided surgical or a diagnostic procedure, for tracking an instrument in an operating space that includes a target site of a patient, the program of instructions comprising:
(a) instructions for storing scan data representing scans of the operating space including the patient target site;
(b) instructions for capturing at least two stereo images of the operating space including the patient target site and such instrument during the image-guided surgical or diagnostic procedure;
(c) instructions for registering selected scan data with the stereo images without using a fiducial structure;
(d) instructions for constructing a composite, three-dimensional rendition showing features from the selected scan data and the stereo images; and
(e) instructions for displaying the composite rendition, so as to enable a user to track the instrument as it is moved within the operating space to a selected position with respect to the patient target site.
13. The processor-readable medium of claim 12, wherein the constructing instructions (d) comprises instructions for constructing the composite rendition using stereo photogrammetry to extract three-dimensional information from projective images.
14. The processor-readable medium of claim 12, wherein the capturing instructions (b) comprises instructions for capturing at least two radiographic images of the operating space including the patient target site and such instrument during the image-guided surgical or diagnostic procedure.
US09/892,402 2000-06-27 2001-06-26 Method and apparatus for tracking a medical instrument based on image registration Expired - Lifetime US6782287B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/892,402 US6782287B2 (en) 2000-06-27 2001-06-26 Method and apparatus for tracking a medical instrument based on image registration

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US21432400P 2000-06-27 2000-06-27
US09/892,402 US6782287B2 (en) 2000-06-27 2001-06-26 Method and apparatus for tracking a medical instrument based on image registration

Publications (2)

Publication Number Publication Date
US20020077543A1 true US20020077543A1 (en) 2002-06-20
US6782287B2 US6782287B2 (en) 2004-08-24

Family

ID=22798640

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/892,402 Expired - Lifetime US6782287B2 (en) 2000-06-27 2001-06-26 Method and apparatus for tracking a medical instrument based on image registration

Country Status (3)

Country Link
US (1) US6782287B2 (en)
AU (1) AU2001278181A1 (en)
WO (1) WO2002000103A2 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3949229A (en) 1974-06-24 1976-04-06 Albert Richard D X-ray scanning method and apparatus
US5662111A (en) * 1991-01-28 1997-09-02 Cosman; Eric R. Process of stereotactic optical navigation
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
DE4417944A1 (en) * 1994-05-21 1995-11-23 Zeiss Carl Fa Process for correlating different coordinate systems in computer-assisted, stereotactic surgery
US6259943B1 (en) * 1995-02-16 2001-07-10 Sherwood Services Ag Frameless to frame-based registration system
US6061439A (en) * 1997-03-27 2000-05-09 Nortel Networks Corporation Method and apparatus for providing subscriber services to a telephone
AU4305201A (en) * 1999-11-29 2001-06-04 Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for transforming view orientations in image-guided surgery
US6490475B1 (en) * 2000-04-28 2002-12-03 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6347240B1 (en) * 1990-10-19 2002-02-12 St. Louis University System and method for use in displaying images of a body part
US5531520A (en) * 1994-09-01 1996-07-02 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets including anatomical body data
US5901199A (en) * 1996-07-11 1999-05-04 The Board Of Trustees Of The Leland Stanford Junior University High-speed inter-modality image registration via iterative feature matching
US6314310B1 (en) * 1997-02-14 2001-11-06 Biosense, Inc. X-ray guided surgical location system with extended mapping volume
US6285902B1 (en) * 1999-02-10 2001-09-04 Surgical Insights, Inc. Computer assisted targeting device for use in orthopaedic surgery

Cited By (236)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8632552B2 (en) 2000-01-14 2014-01-21 Bonutti Skeletal Innovations Llc Method of preparing a femur and tibia in knee arthroplasty
US20100228257A1 (en) * 2000-01-14 2010-09-09 Bonutti Peter M Joint replacement component
US8425522B2 (en) 2000-01-14 2013-04-23 Bonutti Skeletal Innovations Llc Joint replacement method
US8784495B2 (en) 2000-01-14 2014-07-22 Bonutti Skeletal Innovations Llc Segmental knee arthroplasty
US9795394B2 (en) 2000-01-14 2017-10-24 Bonutti Skeletal Innovations Llc Method for placing implant using robotic system
US9101443B2 (en) 2000-01-14 2015-08-11 Bonutti Skeletal Innovations Llc Methods for robotic arthroplasty
US9192459B2 (en) 2000-01-14 2015-11-24 Bonutti Skeletal Innovations Llc Method of performing total knee arthroplasty
US20040087852A1 (en) * 2001-02-06 2004-05-06 Edward Chen Computer-assisted surgical positioning method and system
US20040181149A1 (en) * 2001-02-07 2004-09-16 Ulrich Langlotz Device and method for intraoperative navigation
US8858557B2 (en) 2001-08-28 2014-10-14 Bonutti Skeletal Innovations Llc Method of preparing a femur and tibia in knee arthroplasty
US8623030B2 (en) 2001-08-28 2014-01-07 Bonutti Skeletal Innovations Llc Robotic arthroplasty system including navigation
US10231739B1 (en) 2001-08-28 2019-03-19 Bonutti Skeletal Innovations Llc System and method for robotic surgery
US8840629B2 (en) 2001-08-28 2014-09-23 Bonutti Skeletal Innovations Llc Robotic arthroplasty system including navigation
US10470780B2 (en) 2001-08-28 2019-11-12 Bonutti Skeletal Innovations Llc Systems and methods for ligament balancing in robotic surgery
US10321918B2 (en) 2001-08-28 2019-06-18 Bonutti Skeletal Innovations Llc Methods for robotic surgery using a cannula
US8641726B2 (en) 2001-08-28 2014-02-04 Bonutti Skeletal Innovations Llc Method for robotic arthroplasty using navigation
US9060797B2 (en) 2001-08-28 2015-06-23 Bonutti Skeletal Innovations Llc Method of preparing a femur and tibia in knee arthroplasty
US9763683B2 (en) 2001-08-28 2017-09-19 Bonutti Skeletal Innovations Llc Method for performing surgical procedures using optical cutting guides
US8834490B2 (en) 2001-08-28 2014-09-16 Bonutti Skeletal Innovations Llc Method for robotic arthroplasty using navigation
US20100045783A1 (en) * 2001-10-19 2010-02-25 Andrei State Methods and systems for dynamic virtual convergence and head mountable display using same
US6956202B2 (en) * 2001-11-16 2005-10-18 Koninklijke Philips Electronics N.V. Method and device for calibrating an image pick-up device sensitive to magnetic fields and for imaging by means of such an image pick-up device
US20030095638A1 (en) * 2001-11-16 2003-05-22 Joerg Sabczynski Method and device for calibrating an image pick-up device sensitive to magnetic fields and for imaging by means of such an image pick-up device
US20040034300A1 (en) * 2002-08-19 2004-02-19 Laurent Verard Method and apparatus for virtual endoscopy
US20060015030A1 (en) * 2002-08-26 2006-01-19 Orthosoft Inc. Method for placing multiple implants during a surgery using a computer aided surgery system
US20040127788A1 (en) * 2002-09-09 2004-07-01 Arata Louis K. Image guided interventional method and apparatus
US7359746B2 (en) * 2002-09-09 2008-04-15 Z-Kat, Inc. Image guided interventional method and apparatus
US7166114B2 (en) 2002-09-18 2007-01-23 Stryker Leibinger Gmbh & Co Kg Method and system for calibrating a surgical tool and adapter thereof
US7869861B2 (en) * 2002-10-25 2011-01-11 Howmedica Leibinger Inc. Flexible tracking article and method of using the same
US20040147839A1 (en) * 2002-10-25 2004-07-29 Moctezuma De La Barrera Jose Luis Flexible tracking article and method of using the same
US8457719B2 (en) 2002-10-25 2013-06-04 Stryker Corporation Flexible tracking article and method of using the same
US20110077510A1 (en) * 2002-10-25 2011-03-31 Jose Luis Moctezuma De La Barrera Flexible Tracking Article And Method Of Using The Same
US20040167654A1 (en) * 2003-02-04 2004-08-26 Zimmer Technology, Inc. Implant registration device for surgical navigation system
US20040153191A1 (en) * 2003-02-04 2004-08-05 Grimm James E. Implant registration device for surgical navigation system
US6988009B2 (en) * 2003-02-04 2006-01-17 Zimmer Technology, Inc. Implant registration device for surgical navigation system
US6925339B2 (en) * 2003-02-04 2005-08-02 Zimmer Technology, Inc. Implant registration device for surgical navigation system
US8090431B2 (en) 2003-03-10 2012-01-03 University Of Iowa Research Foundation Systems and methods for bioluminescent computed tomographic reconstruction
US20040249260A1 (en) * 2003-03-10 2004-12-09 Ge Wang Systems and methods for bioluminescent computed tomographic reconstruction
WO2004081865A3 (en) * 2003-03-10 2005-07-07 Univ Iowa Res Found Systems and methods for bioluminescent computed tomographic reconstruction
WO2004081865A2 (en) * 2003-03-10 2004-09-23 University Of Iowa Research Foundation Systems and methods for bioluminescent computed tomographic reconstruction
US20090290771A1 (en) * 2003-04-25 2009-11-26 Surgical Navigation Technologies, Inc. Method and Apparatus for Performing 2D to 3D Registration
US8036441B2 (en) * 2003-04-25 2011-10-11 Medtronic Navigation, Inc. Method and apparatus for performing 2D to 3D registration
US20130217952A1 (en) * 2003-07-21 2013-08-22 Vanderbilt University Ophthalmic orbital surgery apparatus and method and image-guided navigation system
US10470725B2 (en) 2003-08-11 2019-11-12 Veran Medical Technologies, Inc. Method, apparatuses, and systems useful in conducting image guided interventions
US11426134B2 (en) * 2003-08-11 2022-08-30 Veran Medical Technologies, Inc. Methods, apparatuses and systems useful in conducting image guided interventions
US7853307B2 (en) 2003-08-11 2010-12-14 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
US8483801B2 (en) 2003-08-11 2013-07-09 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
US8150495B2 (en) 2003-08-11 2012-04-03 Veran Medical Technologies, Inc. Bodily sealants and methods and apparatus for image-guided delivery of same
US11154283B2 (en) 2003-08-11 2021-10-26 Veran Medical Technologies, Inc. Bodily sealants and methods and apparatus for image-guided delivery of same
US20110054309A1 (en) * 2003-08-11 2011-03-03 Edwards Jerome R Methods, apparatuses, and systems useful in conducting image guided interventions
US20100239153A1 (en) * 2003-08-29 2010-09-23 Accuray Incorporated Image guided radiosurgery method and apparatus using registration of 2d radiographic images with digitally reconstructed radiographs of 3d scan data
US8280491B2 (en) 2003-08-29 2012-10-02 Accuray Incorporated Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data
US20070116341A1 (en) * 2003-08-29 2007-05-24 Dongshan Fu Apparatus and method for determining measure of similarity between images
US7480399B2 (en) 2003-08-29 2009-01-20 Accuray, Inc. Apparatus and method for determining measure of similarity between images
US20050049478A1 (en) * 2003-08-29 2005-03-03 Gopinath Kuduvalli Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data
US7756567B2 (en) * 2003-08-29 2010-07-13 Accuray Incorporated Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data
US20050059887A1 (en) * 2003-09-16 2005-03-17 Hassan Mostafavi Localization of a target using in vivo markers
US7873400B2 (en) 2003-12-10 2011-01-18 Stryker Leibinger Gmbh & Co. Kg. Adapter for surgical navigation trackers
US7771436B2 (en) 2003-12-10 2010-08-10 Stryker Leibinger Gmbh & Co. Kg. Surgical navigation tracker, system and method
US7519415B2 (en) * 2003-12-19 2009-04-14 Siemens Aktiengesellschaft Method and apparatus for image support of an operative procedure implemented with a medical instrument
US20050163279A1 (en) * 2003-12-19 2005-07-28 Matthias Mitschke Method and apparatus for image support of an operative procedure implemented with a medical instrument
EP1727468B1 (en) * 2004-03-04 2009-04-29 Philips Intellectual Property & Standards GmbH Apparatus for the processing of perfusion images
EP2014233A1 (en) * 2004-03-04 2009-01-14 Philips Intellectual Property & Standards GmbH Apparatus for the processing of perfusion images
US20050195587A1 (en) * 2004-03-08 2005-09-08 Moctezuma De La Barrera Jose L. Enhanced illumination device and method
US7567833B2 (en) 2004-03-08 2009-07-28 Stryker Leibinger Gmbh & Co. Kg Enhanced illumination device and method
US20050272991A1 (en) * 2004-04-22 2005-12-08 Chenyang Xu Method and system for registering pre-procedural images with intra-procedural images using a pre-computed knowledge base
US7620223B2 (en) * 2004-04-22 2009-11-17 Siemens Medical Solutions Usa, Inc. Method and system for registering pre-procedural images with intra-procedural images using a pre-computed knowledge base
WO2006011935A3 (en) * 2004-06-30 2007-05-18 Accuray Inc Roi selection in image registration
US20080159612A1 (en) * 2004-06-30 2008-07-03 Dongshan Fu DRR generation using a non-linear attenuation model
US20060002615A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US20080101673A1 (en) * 2004-06-30 2008-05-01 Dongshan Fu Fiducial-less tracking with non-rigid image registration
US20060002631A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. ROI selection in image registration
US7522779B2 (en) 2004-06-30 2009-04-21 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US20090091567A1 (en) * 2004-06-30 2009-04-09 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US7505617B2 (en) 2004-06-30 2009-03-17 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US7231076B2 (en) * 2004-06-30 2007-06-12 Accuray, Inc. ROI selection in image registration
WO2006011935A2 (en) * 2004-06-30 2006-02-02 Accuray, Inc. Roi selection in image registration
US7840093B2 (en) 2004-06-30 2010-11-23 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US7327865B2 (en) 2004-06-30 2008-02-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US7426318B2 (en) 2004-06-30 2008-09-16 Accuray, Inc. Motion field generation for non-rigid image registration
US7366278B2 (en) 2004-06-30 2008-04-29 Accuray, Inc. DRR generation using a non-linear attenuation model
US20060002632A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Motion field generation for non-rigid image registration
US20060002630A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US20060002601A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. DRR generation using a non-linear attenuation model
US20060029917A1 (en) * 2004-08-06 2006-02-09 Sui Leung K Navigation surgical training model, apparatus having the same and method thereof
US8021162B2 (en) * 2004-08-06 2011-09-20 The Chinese University Of Hong Kong Navigation surgical training model, apparatus having the same and method thereof
US20170095666A1 (en) * 2004-09-08 2017-04-06 Daniel H. Kim Methods for stimulating a dorsal root ganglion
US10232180B2 (en) 2004-09-08 2019-03-19 The Board Of Trustees Of The Leland Stanford Junior University Selective stimulation to modulate the sympathetic nervous system
US10159838B2 (en) * 2004-09-08 2018-12-25 The Board Of Trustees Of The Leland Stanford Junior University Methods for stimulating a dorsal root ganglion
US9474914B2 (en) 2004-09-30 2016-10-25 Accuray Incorporated Tracking of moving targets
US8989349B2 (en) * 2004-09-30 2015-03-24 Accuray, Inc. Dynamic tracking of moving targets
US20110092793A1 (en) * 2004-09-30 2011-04-21 Accuray, Inc. Dynamic tracking of moving targets
US20060074292A1 (en) * 2004-09-30 2006-04-06 Accuray, Inc. Dynamic tracking of moving targets
US20060100510A1 (en) * 2004-10-21 2006-05-11 Remy Klausz Method and apparatus for using tomography for placement of an instrument
US8280490B2 (en) * 2004-12-02 2012-10-02 Siemens Aktiengesellschaft Registration aid for medical images
US20060184014A1 (en) * 2004-12-02 2006-08-17 Manfred Pfeiler Registration aid for medical images
US20070016011A1 (en) * 2005-05-18 2007-01-18 Robert Schmidt Instrument position recording in medical navigation
US20080069422A1 (en) * 2005-06-23 2008-03-20 Bai Wang DRR generation and enhancement using a dedicated graphics device
US20060291710A1 (en) * 2005-06-23 2006-12-28 Bai Wang DRR generation and enhancement using a dedicated graphics device
US7330578B2 (en) 2005-06-23 2008-02-12 Accuray Inc. DRR generation and enhancement using a dedicated graphics device
US7840256B2 (en) * 2005-06-27 2010-11-23 Biomet Manufacturing Corporation Image guided tracking array and method
US20070016009A1 (en) * 2005-06-27 2007-01-18 Lakin Ryan C Image guided tracking array and method
US9314222B2 (en) * 2005-07-07 2016-04-19 Stereotaxis, Inc. Operation of a remote medical navigation system using ultrasound image
US20090062646A1 (en) * 2005-07-07 2009-03-05 Creighton Iv Francis M Operation of a remote medical navigation system using ultrasound image
US20070189457A1 (en) * 2005-08-22 2007-08-16 Frank Deinzer Method for displaying a device in a 3-D image of a volumetric data set
US9218664B2 (en) 2005-09-13 2015-12-22 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US11304629B2 (en) 2005-09-13 2022-04-19 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US7920909B2 (en) 2005-09-13 2011-04-05 Veran Medical Technologies, Inc. Apparatus and method for automatic image guided accuracy verification
US10617332B2 (en) 2005-09-13 2020-04-14 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US11304630B2 (en) 2005-09-13 2022-04-19 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US9218663B2 (en) 2005-09-13 2015-12-22 Veran Medical Technologies, Inc. Apparatus and method for automatic image guided accuracy verification
US20070073136A1 (en) * 2005-09-15 2007-03-29 Robert Metzger Bone milling with image guided surgery
US20070100223A1 (en) * 2005-10-14 2007-05-03 Rui Liao Method and system for cardiac imaging and catheter guidance for radio frequency (RF) ablation
US7684647B2 (en) 2005-11-16 2010-03-23 Accuray Incorporated Rigid body tracking for radiosurgery
US20070127845A1 (en) * 2005-11-16 2007-06-07 Dongshan Fu Multi-phase registration of 2-D X-ray images to 3-D volume studies
US7835500B2 (en) 2005-11-16 2010-11-16 Accuray Incorporated Multi-phase registration of 2-D X-ray images to 3-D volume studies
US20070110289A1 (en) * 2005-11-16 2007-05-17 Dongshan Fu Rigid body tracking for radiosurgery
US20110057930A1 (en) * 2006-07-26 2011-03-10 Inneroptic Technology Inc. System and method of using high-speed, high-resolution depth extraction to provide three-dimensional imagery for endoscopy
US11481868B2 (en) 2006-08-02 2022-10-25 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US10127629B2 (en) 2006-08-02 2018-11-13 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US8482606B2 (en) 2006-08-02 2013-07-09 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US10733700B2 (en) 2006-08-02 2020-08-04 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US7728868B2 (en) 2006-08-02 2010-06-01 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US8350902B2 (en) 2006-08-02 2013-01-08 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9659345B2 (en) 2006-08-02 2017-05-23 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US20080037843A1 (en) * 2006-08-11 2008-02-14 Accuray Incorporated Image segmentation for DRR generation and image registration
US7962196B2 (en) * 2006-08-11 2011-06-14 Brainlab Ag Method and system for determining the location of a medical instrument relative to a body structure
US20080039716A1 (en) * 2006-08-11 2008-02-14 Gregor Tuma Method and system for determining the location of a medical instrument relative to a body structure
US20100030232A1 (en) * 2006-09-25 2010-02-04 Eli Zehavi System for positioning of surgical inserts and tools
US8394144B2 (en) * 2006-09-25 2013-03-12 Mazor Surgical Technologies Ltd. System for positioning of surgical inserts and tools
US8320992B2 (en) * 2006-10-05 2012-11-27 Visionsense Ltd. Method and system for superimposing three dimensional medical information on a three dimensional image
US20080119728A1 (en) * 2006-10-05 2008-05-22 Visionsense Ltd. Method and system for superimposing three dimensional medical information on a three dimensional image
US20080119712A1 (en) * 2006-11-20 2008-05-22 General Electric Company Systems and Methods for Automated Image Registration
US20080118116A1 (en) * 2006-11-20 2008-05-22 General Electric Company Systems and methods for tracking a surgical instrument and for conveying tracking information via a network
US20110046483A1 (en) * 2008-01-24 2011-02-24 Henry Fuchs Methods, systems, and computer readable media for image guided ablation
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US8340379B2 (en) 2008-03-07 2012-12-25 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US8831310B2 (en) 2008-03-07 2014-09-09 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US20090312629A1 (en) * 2008-06-13 2009-12-17 Inneroptic Technology Inc. Correction of relative tracking errors based on a fiducial
US9364294B2 (en) 2009-02-17 2016-06-14 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US10136951B2 (en) 2009-02-17 2018-11-27 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US9398936B2 (en) 2009-02-17 2016-07-26 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464575B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8585598B2 (en) 2009-02-17 2013-11-19 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US10398513B2 (en) 2009-02-17 2019-09-03 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US20110043612A1 (en) * 2009-07-31 2011-02-24 Inneroptic Technology Inc. Dual-tube stereoscope
US20110082351A1 (en) * 2009-10-07 2011-04-07 Inneroptic Technology, Inc. Representing measurement information during a medical procedure
US8483351B2 (en) 2009-10-28 2013-07-09 Virginia Tech Intellectual Properties, Inc. Cardiac computed tomography methods and systems using fast exact/quasi-exact filtered back projection algorithms
US9259271B2 (en) * 2009-11-27 2016-02-16 Mehran Anvari Automated in-bore MR guided robotic diagnostic and therapeutic system
US20130158565A1 (en) * 2009-11-27 2013-06-20 Mcmaster University Automated in-bore mr guided robotic diagnostic and therapeutic system
US9282947B2 (en) 2009-12-01 2016-03-15 Inneroptic Technology, Inc. Imager focusing based on intraoperative data
US9107698B2 (en) 2010-04-12 2015-08-18 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8781186B2 (en) 2010-05-04 2014-07-15 Pathfinder Therapeutics, Inc. System and method for abdominal surface matching using pseudo-features
US10165928B2 (en) 2010-08-20 2019-01-01 Mark Hunter Systems, instruments, and methods for four dimensional soft tissue navigation
US10898057B2 (en) 2010-08-20 2021-01-26 Veran Medical Technologies, Inc. Apparatus and method for airway registration and navigation
US11690527B2 (en) 2010-08-20 2023-07-04 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US10264947B2 (en) 2010-08-20 2019-04-23 Veran Medical Technologies, Inc. Apparatus and method for airway registration and navigation
US11109740B2 (en) 2010-08-20 2021-09-07 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US8696549B2 (en) 2010-08-20 2014-04-15 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US20120294497A1 (en) * 2011-05-20 2012-11-22 Varian Medical Systems, Inc. Method and Apparatus Pertaining to Images Used for Radiation-Treatment Planning
US9014454B2 (en) * 2011-05-20 2015-04-21 Varian Medical Systems, Inc. Method and apparatus pertaining to images used for radiation-treatment planning
US9707043B2 (en) 2011-09-02 2017-07-18 Stryker Corporation Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing
US11135014B2 (en) 2011-09-02 2021-10-05 Stryker Corporation Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing
US9622823B2 (en) 2011-09-02 2017-04-18 Stryker Corporation Method for repairing focal defects in tissue of a patient
US11896314B2 (en) 2011-09-02 2024-02-13 Stryker Corporation Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing
US10813697B2 (en) 2011-09-02 2020-10-27 Stryker Corporation Methods of preparing tissue of a patient to receive an implant
US9345552B2 (en) 2011-09-02 2016-05-24 Stryker Corporation Method of performing a minimally invasive procedure on a hip joint of a patient to relieve femoral acetabular impingement
USRE49094E1 (en) 2011-10-28 2022-06-07 Nuvasive, Inc. Systems and methods for performing spine surgery
US9510771B1 (en) 2011-10-28 2016-12-06 Nuvasive, Inc. Systems and methods for performing spine surgery
US20140343572A1 (en) * 2011-12-15 2014-11-20 Ao Technology Ag Method and a device for computer assisted surgery
USRE48834E1 (en) * 2011-12-15 2021-11-30 Synthes Gmbh Method and a device for computer assisted surgery
US9687308B2 (en) * 2011-12-15 2017-06-27 AO Technology AG Method and a device for computer assisted surgery
US20130218137A1 (en) * 2011-12-30 2013-08-22 Mako Surgical Corp. Integrated surgery system
US11109917B2 (en) 2011-12-30 2021-09-07 Mako Surgical Corp. Integrated surgery method and system
US10363102B2 (en) 2011-12-30 2019-07-30 Mako Surgical Corp. Integrated surgery method
US11779409B2 (en) 2011-12-30 2023-10-10 Mako Surgical Corp. Surgical system with workflow monitoring
US8670816B2 (en) 2012-01-30 2014-03-11 Inneroptic Technology, Inc. Multiple medical device guidance
US9138165B2 (en) 2012-02-22 2015-09-22 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US11403753B2 (en) 2012-02-22 2022-08-02 Veran Medical Technologies, Inc. Surgical catheter having side exiting medical instrument and related systems and methods for four dimensional soft tissue navigation
US11830198B2 (en) 2012-02-22 2023-11-28 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US10460437B2 (en) 2012-02-22 2019-10-29 Veran Medical Technologies, Inc. Method for placing a localization element in an organ of a patient for four dimensional soft tissue navigation
US10249036B2 (en) 2012-02-22 2019-04-02 Veran Medical Technologies, Inc. Surgical catheter having side exiting medical instrument and related systems and methods for four dimensional soft tissue navigation
US10140704B2 (en) 2012-02-22 2018-11-27 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US9972082B2 (en) 2012-02-22 2018-05-15 Veran Medical Technologies, Inc. Steerable surgical catheter having biopsy devices and related systems and methods for four dimensional soft tissue navigation
US11551359B2 (en) 2012-02-22 2023-01-10 Veran Medical Technologies, Inc Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US10977789B2 (en) 2012-02-22 2021-04-13 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
CN113974689A (en) * 2012-03-07 2022-01-28 齐特奥股份有限公司 Space alignment apparatus
US20130293690A1 (en) * 2012-05-07 2013-11-07 Eric S. Olson Medical device navigation system stereoscopic display
US10842461B2 (en) * 2012-06-21 2020-11-24 Globus Medical, Inc. Systems and methods of checking registrations for surgical systems
US20170119339A1 (en) * 2012-06-21 2017-05-04 Globus Medical, Inc. Systems and methods of checking registrations for surgical systems
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US9848922B2 (en) 2013-10-09 2017-12-26 Nuvasive, Inc. Systems and methods for performing spine surgery
US11553968B2 (en) 2014-04-23 2023-01-17 Veran Medical Technologies, Inc. Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US10617324B2 (en) 2014-04-23 2020-04-14 Veran Medical Technologies, Inc Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue
US10624701B2 (en) 2014-04-23 2020-04-21 Veran Medical Technologies, Inc. Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US11684429B2 (en) 2014-10-02 2023-06-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US10820944B2 (en) 2014-10-02 2020-11-03 Inneroptic Technology, Inc. Affected region display based on a variance parameter associated with a medical device
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US11534245B2 (en) 2014-12-12 2022-12-27 Inneroptic Technology, Inc. Surgical guidance intersection display
US10820946B2 (en) 2014-12-12 2020-11-03 Inneroptic Technology, Inc. Surgical guidance intersection display
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US11931117B2 (en) 2014-12-12 2024-03-19 Inneroptic Technology, Inc. Surgical guidance intersection display
US10959786B2 (en) * 2015-06-05 2021-03-30 Wenzel Spine, Inc. Methods for data processing for intra-operative navigation systems
US20160354161A1 (en) * 2015-06-05 2016-12-08 Ortho Kinematics, Inc. Methods for data processing for intra-operative navigation systems
US20210220057A1 (en) * 2015-06-05 2021-07-22 Wenzel Spine, Inc. Surgical navigation processors and systems
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US11103200B2 (en) 2015-07-22 2021-08-31 Inneroptic Technology, Inc. Medical device approaches
WO2017075085A1 (en) * 2015-10-28 2017-05-04 Endochoice, Inc. Device and method for tracking the position of an endoscope within a patient's body
US11529197B2 (en) 2015-10-28 2022-12-20 Endochoice, Inc. Device and method for tracking the position of an endoscope within a patient's body
US20170119474A1 (en) * 2015-10-28 2017-05-04 Endochoice, Inc. Device and Method for Tracking the Position of an Endoscope within a Patient's Body
US10325380B2 (en) 2016-01-12 2019-06-18 University Of Iowa Research Foundation Precise, low-cost orthopaedic surgical simulator
US11179136B2 (en) 2016-02-17 2021-11-23 Inneroptic Technology, Inc. Loupe display
US10433814B2 (en) 2016-02-17 2019-10-08 Inneroptic Technology, Inc. Loupe display
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US11707203B2 (en) 2016-10-11 2023-07-25 Wenzel Spine, Inc. Systems for generating image-based measurements during diagnosis
US10772686B2 (en) 2016-10-27 2020-09-15 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US11369439B2 (en) 2016-10-27 2022-06-28 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
US20210205022A1 (en) * 2018-02-07 2021-07-08 Ao Technology Ag Reference device for real-time tracking of bone and/or surgical objects in computer-assisted surgery
US11403966B2 (en) 2018-04-07 2022-08-02 University Of Iowa Research Foundation Fracture reduction simulator
US11875702B2 (en) 2018-04-07 2024-01-16 University Of Iowa Research Foundation Fracture reduction simulator
CN111096795A (en) * 2018-10-26 2020-05-05 韦伯斯特生物官能(以色列)有限公司 Release mode for a robot
CN109363771A (en) * 2018-12-06 2019-02-22 安徽埃克索医疗机器人有限公司 Femoral neck fracture multi-tunnel nail placement positioning system fusing 2D planning information
CN114098971A (en) * 2020-08-27 2022-03-01 杭州三坛医疗科技有限公司 Imaging, navigation and positioning method and device of orthopedic surgery robot and storage medium
WO2022192690A1 (en) * 2021-03-12 2022-09-15 True Digital Surgery Automated touchless registration for surgical navigation
US20230012440A1 (en) * 2021-07-12 2023-01-12 Mazor Robotics Ltd. Systems, devices, and methods for identifying and locating a region of interest
US11847809B2 (en) * 2021-07-12 2023-12-19 Mazor Robotics Ltd. Systems, devices, and methods for identifying and locating a region of interest
WO2023154548A1 (en) * 2022-02-14 2023-08-17 Nview Medical Inc. Surgical navigation system with distributed patient reference tracking
CN114831732A (en) * 2022-07-04 2022-08-02 真健康(北京)医疗科技有限公司 Puncture position verification method and device based on X-ray image
CN115719386A (en) * 2022-11-16 2023-02-28 南京博视医疗科技有限公司 Calibration device and method of laser treatment system based on line scanning
CN116725662A (en) * 2023-08-11 2023-09-12 北京维卓致远医疗科技发展有限责任公司 Fracture surgery planning method, device and storable medium based on two-dimensional images

Also Published As

Publication number Publication date
WO2002000103A2 (en) 2002-01-03
WO2002000103A3 (en) 2002-06-27
US6782287B2 (en) 2004-08-24
AU2001278181A1 (en) 2002-01-08

Similar Documents

Publication Publication Date Title
US6782287B2 (en) Method and apparatus for tracking a medical instrument based on image registration
US10762627B2 (en) Method and a system for registering a 3D pre acquired image coordinates system with a medical positioning system coordinate system and with a 2D image coordinate system
US8682413B2 (en) Systems and methods for automated tracker-driven image selection
US8131031B2 (en) Systems and methods for inferred patient annotation
US7831096B2 (en) Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use
US9320569B2 (en) Systems and methods for implant distance measurement
US6856827B2 (en) Fluoroscopic tracking and visualization system
US6856826B2 (en) Fluoroscopic tracking and visualization system
US11759272B2 (en) System and method for registration between coordinate systems and navigation
US6484049B1 (en) Fluoroscopic tracking and visualization system
US7885441B2 (en) Systems and methods for implant virtual review
US20080119725A1 (en) Systems and Methods for Visual Verification of CT Registration and Feedback
US20080119712A1 (en) Systems and Methods for Automated Image Registration
US10426414B2 (en) System for tracking an ultrasonic probe in a body part
US11559266B2 (en) System and method for local three dimensional volume reconstruction using a standard fluoroscope
US20080154120A1 (en) Systems and methods for intraoperative measurements on navigated placements of implants
US11918297B2 (en) System and method for registration between coordinate systems and navigation
US20220054199A1 (en) Robotic surgery systems and surgical guidance methods thereof
US20080119724A1 (en) Systems and methods for intraoperative implant placement analysis
US20050288574A1 (en) Wireless (disposable) fiducial based registration and EM distortion based surface registration
US9477686B2 (en) Systems and methods for annotation and sorting of surgical images
Grzeszczuk et al. A fluoroscopic X-ray registration process for three-dimensional surgical navigation
Abbasi et al. Clinical fluoroscopic fiducial-based registration of the vertebral body in spinal neuronavigation
Oentoro et al. High-accuracy registration of intraoperative CT imaging

Legal Events

Date Code Title Description
AS Assignment

Owner name: CBYON, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GRZESZCZUK, ROBERT;REEL/FRAME:012464/0278

Effective date: 20011017

Owner name: BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY, THE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHAHIDI, RAMIN;REEL/FRAME:012464/0285

Effective date: 20011010

AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, WISCONSIN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CBYON, INC.;REEL/FRAME:015251/0840

Effective date: 20040206

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: SHAHIDI, RAMIN, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY;REEL/FRAME:018249/0019

Effective date: 20060913

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: SHAHIDI, RAMIN, CALIFORNIA

Free format text: CHANGE OF ASSIGNEE ADDRESS;ASSIGNOR:SHAHIDI, RAMIN;REEL/FRAME:020184/0435

Effective date: 20071130

AS Assignment

Owner name: CALIFORNIA INSTITUTE OF COMPUTER ASSISTED SURGERY, INC.

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHAHIDI, RAMIN;REEL/FRAME:024320/0259

Effective date: 20100430

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE RECEIVING PARTY ADDRESS PREVIOUSLY RECORDED AT REEL: 015251 FRAME: 0840. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:CBYON, INC.;REEL/FRAME:044322/0732

Effective date: 20040206

AS Assignment

Owner name: STRYKER EUROPEAN HOLDINGS I, LLC, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAL ELECTRIC COMPANY;REEL/FRAME:046020/0621

Effective date: 20171206

AS Assignment

Owner name: STRYKER EUROPEAN HOLDINGS III, LLC, DELAWARE

Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:STRYKER EUROPEAN HOLDINGS I, LLC;REEL/FRAME:056969/0771

Effective date: 20210219

Owner name: STRYKER EUROPEAN OPERATIONS HOLDINGS LLC, MICHIGAN

Free format text: CHANGE OF NAME;ASSIGNOR:STRYKER EUROPEAN HOLDINGS III, LLC;REEL/FRAME:056969/0893

Effective date: 20190226