WO2013055707A1 - Interventional in-situ image-guidance by fusing ultrasound video - Google Patents


Info

Publication number
WO2013055707A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
instrument
imaging
image
location
Prior art date
Application number
PCT/US2012/059406
Other languages
French (fr)
Inventor
Emad Boctor
Gregory Hager
Dorothee HEISENBERG
Philipp STOLKA
Original Assignee
Clear Guide Medical, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Clear Guide Medical, Llc filed Critical Clear Guide Medical, Llc
Priority to JP2014535792A priority Critical patent/JP2015505679A/en
Priority to EP12840772.3A priority patent/EP2763591A4/en
Priority to CA2851659A priority patent/CA2851659A1/en
Publication of WO2013055707A1 publication Critical patent/WO2013055707A1/en
Priority to IL232026A priority patent/IL232026A0/en

Classifications

    • A HUMAN NECESSITIES > A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE > A61B DIAGNOSIS; SURGERY; IDENTIFICATION (parent hierarchy shared by all classifications listed below)
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
        • A61B 5/0059 ... using light, e.g. diagnosis by transillumination, diascopy, fluorescence
            • A61B 5/0075 ... by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
            • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
        • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
            • A61B 5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
                • A61B 5/062 ... using magnetic field
        • A61B 5/74 Details of notification to user or communication with user or patient; user input means
            • A61B 5/742 ... using visual displays
    • A61B 6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
        • A61B 6/02 Devices for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
            • A61B 6/03 Computerised tomographs
                • A61B 6/032 Transmission computed tomography [CT]
        • A61B 6/12 Devices for detecting or locating foreign bodies
        • A61B 6/44 Constructional features of apparatus for radiation diagnosis
            • A61B 6/4417 ... related to combined acquisition of different diagnostic modalities
            • A61B 6/4429 ... related to the mounting of source units and detector units
                • A61B 6/4435 ... the source unit and the detector unit being coupled by a rigid structure
                    • A61B 6/4441 ... the rigid structure being a C-arm or U-arm
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
        • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
            • A61B 8/0833 ... involving detecting or locating foreign bodies or organic structures
                • A61B 8/0841 ... for locating instruments
        • A61B 8/42 Details of probe positioning or probe attachment to the patient
            • A61B 8/4245 ... involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
                • A61B 8/4254 ... using sensors mounted on the probe
        • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
            • A61B 8/4416 ... related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
        • A61B 8/48 Diagnostic techniques
            • A61B 8/488 Diagnostic techniques involving Doppler signals
        • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
            • A61B 8/5215 ... involving processing of medical diagnostic data
                • A61B 8/5238 ... for combining image data of patient, e.g. merging several images from different acquisition modes into one image
                    • A61B 8/5261 ... combining images from different diagnostic modalities, e.g. ultrasound and X-ray
        • A61B 8/58 Testing, adjusting or calibrating the diagnostic device
            • A61B 8/587 Calibration phantoms
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
        • A61B 2017/00681 Aspects not otherwise provided for
            • A61B 2017/00707 Dummies, phantoms; Devices simulating patient or parts of patient
            • A61B 2017/00725 Calibration or performance testing
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
        • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
            • A61B 2034/2046 Tracking techniques
                • A61B 2034/2048 ... using an accelerometer or inertia sensor
                • A61B 2034/2063 Acoustic tracking systems, e.g. using ultrasound
        • A61B 34/25 User interfaces for surgical systems
    • A61B 46/00 Surgical drapes
        • A61B 46/20 Surgical drapes specially adapted for patients
            • A61B 2046/205 Adhesive drapes
        • A61B 46/40 Drape material, e.g. laminates; Manufacture thereof
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
        • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
            • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
                • A61B 2090/365 ... augmented reality, i.e. correlating a live optical image with another image
                • A61B 2090/366 ... using projection of images directly onto the body
            • A61B 90/37 Surgical systems with images on a monitor during operation
                • A61B 2090/371 ... with simultaneous use of two cameras
                • A61B 2090/376 ... using X-rays, e.g. fluoroscopy
                    • A61B 2090/3762 ... using computed tomography systems [CT]
                • A61B 2090/378 ... using ultrasound

Definitions

  • the field of the currently claimed embodiments of this invention relates to imaging devices and to augmentation devices for these imaging devices, and more particularly to such devices that have one or more cameras, one or more projectors, and/or a set of local sensors for observation and imaging of, projecting onto, and tracking within and around a region of interest.
  • Image-guided surgery can be defined as a surgical or interventional procedure where the doctor uses indirect visualization to operate, i.e. by employing imaging instruments in real time, such as fiber-optic guides, internal video cameras, flexible or rigid endoscopes, ultrasonography, etc.
  • Most image-guided surgical procedures are minimally invasive.
  • IGS systems allow the surgeon to have more information available at the surgical site while performing a procedure.
  • these systems display 3D patient information and render the surgical instrument in this display with respect to the anatomy and a preoperative plan.
  • the 3D patient information can be a preoperative scan such as CT or MRI to which the patient is registered during the procedure, or it can be a real-time imaging modality such as ultrasound or fluoroscopy.
  • In minimally invasive surgery (MIS), a procedure or intervention is performed either through small openings in the body or percutaneously (e.g. in ablation or biopsy procedures).
  • MIS techniques provide for reductions in patient discomfort, healing time, risk of complications, and help improve overall patient outcomes.
  • Tracking technologies can be categorized into the following groups: 1) mechanical-based tracking including active robots (DaVinci robots [http://www.intuitivesurgical.com, August 2nd, 2010]) and passive-encoded mechanical arms (Faro mechanical arms [http://products.faro.com product-overview, August 2nd, 2010]), 2) optical-based tracking (NDI OptoTrak [http://www.ndigital.com, August 2nd, 2010], MicronTracker [http://www.clarontech.com, August 2nd, 2010]), 3) acoustic-based tracking, and 4) electromagnetic (EM)-based tracking (Ascension Technology [http://www.ascension-tech.com, August 2nd, 2010]).
  • Ultrasound is one useful imaging modality for image-guided interventions including ablative procedures, biopsy, radiation therapy, and surgery.
  • ultrasound-guided intervention research is performed by integrating a tracking system (either optical or EM methods) with an ultrasound (US) imaging system to, for example, track and guide liver ablations, or in external beam radiation therapy
  • E.M. Boctor, M. DeOliviera, M. Choti, R. Ghanem, R.H. Taylor, G. Hager, G. Fichtinger, "Ultrasound Monitoring of Tissue Ablation via Deformation Model and Shape Priors", International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2006; H. Rivaz, I. Fleming, L.
  • An augmentation device for an imaging system has a bracket structured to be attachable to an imaging component, a projector attached to the bracket, and one or more cameras observing the surrounding environment.
  • the projector is arranged and configured to project an image onto a surface in conjunction with imaging by the camera system. This system can be used for registration to the imaged surface, and guidance for placement of the device on the surface, or guidance of needles or other instruments to interact with the surface or below the surface.
  • a system that consists of a single camera and projector, whereby one of the camera or projector is aligned with the ultrasound plane and the other is off-axis, and a combination of tracking and display is used to provide guidance.
  • the camera and projector configuration can be preserved using sterile probe covers that contain a special transparent sterile window.
  • the projection image may be time-multiplexed in synchrony with the camera or cameras to alternately optimize projection for tracking (maximize needle presence), guidance (overlay cues), or surfaces (optimize stereo reconstruction).
  • the projection pattern may also be spatially modulated or multiplexed for different purposes, e.g. projecting a pattern in one area and guidance in other areas.
  • An adaptive pattern, both in space and time, can include the following:
  • a method to guide a tool by actively tracking the tool and projecting:
  • proximity markers to indicate general "closeness", e.g. by color changes
  • target markers to point towards, e.g. crosshairs, circles, bulls-eyes, etc.
  • alignment markers to line up with, e.g. lines, fans, polygons
  • This guidance approach and information can be either registered to the underlying image or environment (i.e. the overlay symbols correspond to target location, size, or areas to avoid), or it can be location-independent guidance (e.g. location, color, size, and shape, but also auditory cues such as audio volume, sound clips, and/or frequency changes, indicate to the user where to direct the tools or the probe).
  • the combination of the camera and projector can be used to construct intuitive and sterile user interfaces on the patient surface, or on any other projectable surface.
  • standard icons and buttons can be projected onto the patient, and a finger or needle can be tracked and used to activate these buttons.
  • This tracking can also be used in non-visual user interfaces, e.g. for gesture tracking without projected visual feedback.
  • the projection system may make use of the geometry computed by the stereo system to correct for the curvature of the body when projecting information onto it.
  • the system can include overlay guidance to place the imaging device on a surface (e.g. Ultrasound probe) or move it to a specific pose (e.g. C-arm X-ray).
  • a method to guide an interventional tool by matching the tool's shadow to an artificial shadow; this single-shadow alignment can be used for one degree of freedom, with additional active tracking for the remaining degrees of freedom.
  • the shadow can be a single line; the shadow can be a line of different thickness; the shadow can be of different colors; the shadow can be used as part of structured light pattern.
  • Adaptive projection to overcome interference (e.g. overlay guidance can interfere with needle tracking tasks).
  • guidance "lines" composed of, e.g., a "string-of-pearls" series of circles/discs/ellipses etc. can improve alignment performance for the user.
  • the apparent thickness of guidance lines/structures can be modified based on detected tool width, distance to projector, distance to surface, excessive intervention duration, etc., to improve alignment performance.
  • Two projectors can uniquely provide two independent shadows that define the intended/optimal guide for the tool.
  • Using a combination of mirrors and a beam splitter, one projector can be divided into two virtual projectors and hence provide the same number of independent shadows.
  • a guidance system (one example) - Overlaying crosshairs and/or extrapolated needle pose lines onto live ultrasound views on-screen (both in-plane and out-of-plane) or projected onto the patient, see, e.g., Figure 34; Projecting paired symbols (circles, triangles etc.) that change size, color, and relative position depending on the current targeting error vector; Overlaying alignment lines onto single/stereo/multiple camera views that denote desired needle poses, allowing the user to line up the camera image of the needle with the target pose, as well as lines denoting the currently-tracked needle pose for quality control purposes; Projecting needle alignment lines onto the surface, denoting both target pose (for guidance) as well as currently-tracked pose (for quality control), from one or more projectors;
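  • A minimal illustrative sketch (not part of the patent text) of how the paired-symbol guidance above could be driven: a targeting error vector between the extrapolated needle line and the target is mapped to a symbol radius and color. The function names and the 20 mm error scale are assumptions.

        import numpy as np

        def targeting_error(tip, direction, target):
            # Error vector from the target to the closest point on the extrapolated needle line.
            d = direction / np.linalg.norm(direction)
            closest = tip + np.dot(target - tip, d) * d
            return closest - target

        def symbol_parameters(err, max_err_mm=20.0):
            # Map error magnitude to a symbol radius and a green-to-red color (assumed scheme).
            e = min(np.linalg.norm(err), max_err_mm) / max_err_mm  # 0 = on target, 1 = far off
            radius_px = 10 + 40 * e
            color_rgb = (int(255 * e), int(255 * (1 - e)), 0)
            return radius_px, color_rgb

        err = targeting_error(np.array([0., 0., 0.]), np.array([0., 0.2, 1.]), np.array([5., 2., 80.]))
        print(symbol_parameters(err))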
  • the system may use the pose of the needle in air to optimize ultrasound detection of the needle in the body, and vice versa. For example, by predicting the location of the needle tip, the ultrasound system can automatically set the transmit focus location, the needle steering parameters, etc.
  • the system may make use of the projected insertion point as a "capture range" for possible needle poses, discard candidates outside that range, or detect when computed 3D poses violate the expected targeting behavior.
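  • A minimal Python/NumPy sketch of the two ideas above (illustrative only; the 15 mm capture radius, the focus-depth clamp, and the function names are assumptions, not values from the patent):

        import numpy as np

        def within_capture_range(candidate_entry, projected_insertion_point, radius_mm=15.0):
            # Keep only needle-pose candidates whose entry point lies near the projected insertion point.
            return np.linalg.norm(candidate_entry - projected_insertion_point) <= radius_mm

        def transmit_focus_depth(expected_tip_us_mm):
            # Place the ultrasound transmit focus at the expected tip depth, clamped to a plausible range.
            return float(np.clip(expected_tip_us_mm[1], 5.0, 120.0))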
  • An approach to indicate depth of penetration of the tool can be performed by detecting fiducials on the needle, and tracking those fiducials over time. For example, these may be dark rings on the needle itself, which can be counted using the vision system, or they may be a reflective element attached to the end of the needle, and the depth may be computed by subtracting the location of the fiducial in space from the patient surface, and then subtracting that result from the entire length of the needle.
  • Depth guidance by directly projecting a fiducial landmark onto the needle shaft
  • As additional depth guidance, the display of the system may passively indicate the number of fiducial rings that should remain outside the patient at the correct depth for the current system pose, providing the user with a perceptual cue that they can use to determine manually whether they are at the correct depth.
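  • A minimal sketch of the depth computation described above (needle length minus the portion still outside the skin), assuming the end fiducial and the skin entry point are both available in the same 3D camera frame; function names are illustrative:

        import numpy as np

        def insertion_depth(fiducial_pos, entry_point, needle_length):
            # Inserted length = total needle length minus the segment between the end fiducial and the skin entry.
            outside = np.linalg.norm(np.asarray(fiducial_pos) - np.asarray(entry_point))
            return max(0.0, needle_length - outside)

        def rings_remaining_outside(depth, ring_spacing, total_rings):
            # How many equally spaced shaft rings should still be visible at the correct depth.
            return max(0, total_rings - int(depth // ring_spacing))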
  • the camera/projector configuration can be rotated 90 degrees to allow guidance for both in-plane and out-of-plane interventions.
  • the mounting bracket can be modular so that cameras and projectors can be added; for example, start with one projector and add one camera, or start with one projector and two cameras and add an additional projector.
  • the camera and projector can be added at different locations (e.g. a camera and projector for in-plane intervention, with an additional projector facing the out-of-plane view).
  • a calibration method that simultaneously calibrates the US transducer, the projector, and the stereo cameras. The method is based on a calibration object constructed from a known geometry:
  • A double-wedge phantom attached to a planar surface (as in Figure 26A), or a multi-line phantom (as in Figure 26B). Both are alternative designs of possible phantoms that can be used to estimate the rigid-body transformation between the ultrasound coordinate frame and the camera coordinate frame.
  • a phantom with a well-known geometry comprising an ultrasound-visible component and an optically-visible component (as in Figures 26A and 26B) is simultaneously observed in both modalities.
  • Pose recovery of both components in their respective modality allows reconstruction of the pose of the cameras and the ultrasound transducer relative to the phantom, and thus the calibration of their pose relative to each other. See also Figure 33.
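  • Once corresponding phantom landmarks have been localized in the ultrasound frame and in the camera frame, the relative pose reduces to a paired-point rigid registration; a standard SVD-based (Kabsch) solution is sketched below as an illustration, not as the patent's specific algorithm:

        import numpy as np

        def rigid_transform(P, Q):
            # Least-squares rigid transform (R, t) mapping the N x 3 point set P onto Q.
            Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
            H = (P - Pc).T @ (Q - Qc)
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
            R = Vt.T @ D @ U.T
            t = Qc - R @ Pc
            return R, t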
  • a method to accurately measure the location of the projector relative to the location of the cameras and probe. One means of doing so is to observe that visible rays cast by the projector and observed by the cameras form straight lines in space that intersect at the optical center of the projector.
  • the system can calculate a series of 3D points which can then be extrapolated to compute the center of projection. This can be performed with nearly any planar or nonplanar series of projection surfaces.
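  • A sketch of that projector-center recovery: each projected ray, observed as 3D points on several surfaces, is fit with a line, and the least-squares intersection of all lines estimates the center of projection. Python/NumPy, with illustrative function names; at least two non-parallel rays are required:

        import numpy as np

        def fit_line(points):
            # Fit a 3D line to an N x 3 point set; return (centroid, unit direction) via SVD.
            c = points.mean(axis=0)
            _, _, Vt = np.linalg.svd(points - c)
            return c, Vt[0]

        def intersect_lines(lines):
            # Least-squares point closest to all lines, each given as (point, unit direction).
            A, b = np.zeros((3, 3)), np.zeros(3)
            for c, d in lines:
                M = np.eye(3) - np.outer(d, d)
                A += M
                b += M @ c
            return np.linalg.solve(A, b)  # singular if all rays are parallel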
  • a temporal calibration method that simultaneously synchronizes the ultrasound data stream with both camera streams and with the projector stream.
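  • One common way to realize such a temporal calibration (an assumption here, not the patent's stated method) is to extract a per-frame motion signal from each stream, e.g. probe motion magnitude, and estimate the offset from the peak of their cross-correlation:

        import numpy as np

        def estimate_lag_seconds(sig_a, sig_b, fps):
            # Offset between two equally sampled motion signals; positive means sig_a lags sig_b.
            a = (sig_a - sig_a.mean()) / (sig_a.std() + 1e-9)
            b = (sig_b - sig_b.mean()) / (sig_b.std() + 1e-9)
            corr = np.correlate(a, b, mode="full")
            lag_samples = int(corr.argmax()) - (len(b) - 1)
            return lag_samples / fps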
  • Drapes that are transparent to the structured light system
  • Drapes that are IR-transparent or wavelength-specific to allow patient surface or organ scanning
  • the projector may make use of light-activated dyes that have been "printed on the patient", or may contain an auxiliary controlled laser for this purpose.
  • a depth imaging system composed of more than two cameras, for example with three cameras, where cameras 1 and 2 are optimized for far range, cameras 2 and 3 for mid-range, and cameras 1 and 3 for close range.
  • the overall configuration may be augmented by and/or controlled from a handheld device such as a tablet computer for 1) ultrasound machine operation, 2) for
  • Augmentation hardware to construct a display system that maintains registration with the probe and that can be used for both visualization and guidance.
  • the probe may have an associated display that can be detached and that shows relevant pre-operative CT information based on its position in space. It may also overlay targeting information.
  • the computational resources used by the device may be augmented with additional computation located elsewhere.
  • This remote computation might be used to process information coming from the device (e.g. to perform a computationally intense registration process); it may be used to recall information useful to the function of the device (e.g. to compare this patient with other similar patients to provide "best practice" treatment options), or it may be used to provide information that directs the device (e.g. transferring the indication of a lesion in a CT image to a remote center for biopsy).
  • the trajectory of a needle can be calculated by visual tracking and thence projected into the ultrasound image. If the needle in the image is inconsistent with this projection, it is a cue that there is a system discrepancy. Conversely, if the needle is detected in the ultrasound image, it can be projected back into the video image to confirm that the external pose of the needle is consistent with that tracked image.
  • An active quality control method is to simultaneously track the needle in both the ultrasound and video images, and to use those computed values to detect needle bending and to either update the likely trajectory of the needle, alert the user that they are putting pressure on the needle, or both.
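  • A sketch of the consistency check: the camera-tracked needle line is mapped into the ultrasound image plane through the camera-to-ultrasound calibration and compared with the needle detected in the ultrasound image. The z = 0 image-plane convention, scale factor, and function names are assumptions:

        import numpy as np

        def needle_in_us_image(tip_cam, dir_cam, R_us_cam, t_us_cam, px_per_mm):
            # Map a camera-frame needle line into ultrasound image (pixel) coordinates,
            # assuming the ultrasound image plane is z = 0 of the ultrasound frame.
            tip_us = R_us_cam @ tip_cam + t_us_cam
            dir_us = R_us_cam @ dir_cam
            if abs(dir_us[2]) < 1e-9:
                return None                    # needle parallel to the imaging plane
            s = -tip_us[2] / dir_us[2]         # parameter of intersection with z = 0
            hit = tip_us + s * dir_us
            return hit[:2] * px_per_mm

        def consistency_error(predicted_px, detected_px):
            # Pixel distance between vision-predicted and ultrasound-detected needle crossings;
            # a large value flags a registration problem or a bent needle.
            return float(np.linalg.norm(np.asarray(predicted_px) - np.asarray(detected_px)))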
  • the projection center may lie on or near the plane of the ultrasound system.
  • the projector can project a single line or shadow that indicates where this plane is.
  • a needle or similar tool placed in the correct plane will become bright.
  • a video camera outside this plane can view the scene, and this image can be displayed on a screen. Indeed, it may be included with the ultrasound view.
  • the clinician can view both the external and internal guidance of the needle simultaneously on the same screen.
  • Guidance to achieve a particular angle can be superimposed on the camera image, so that the intersection of the ultrasound plane and the plane formed by the superimposed guidance forms a line that is the desired trajectory of the needle.
  • a second embodiment of the simultaneous camera/projector guidance would be to place a camera along the ultrasound plane, and to place the projector off-plane.
  • the geometry is similar, but now the camera superimposed image is used to define the plane, and a line is projected by the projector to define the needle trajectory.
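  • In both configurations the desired needle trajectory is the line where the ultrasound plane and the guidance plane intersect; a small illustrative sketch of that computation:

        import numpy as np

        def plane_intersection(n1, p1, n2, p2):
            # Intersection line of two planes, each given by a unit normal and a point on the plane.
            n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
            d = np.cross(n1, n2)
            if np.linalg.norm(d) < 1e-9:
                return None                    # planes are (nearly) parallel
            A = np.vstack([n1, n2])
            b = np.array([np.dot(n1, p1), np.dot(n2, p2)])
            point = np.linalg.lstsq(A, b, rcond=None)[0]  # minimum-norm point on both planes
            return point, d / np.linalg.norm(d)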
  • Further variations include combinations of single or multiple cameras or projectors, where at least one of either is mounted on the mobile device itself as well as mounted statically in the environment, with registration between the mobile and fixed components maintained at all times to make guidance possible. This registration maintenance can be achieved e.g. by detecting and tracking known features present in the environment and/or projected into the common field of interest.
  • An augmentation system that may use multi-band projection with both visible and invisible bands (such as with IR in various ways), simultaneously or time-multiplexed.
  • the invention may use multi-projector setups for shadow reduction, intensity enhancement, or passive stereo guidance.
  • An augmentation device with stereo projection: in order to create a stereo projection, the projection system may make use of mirrors and splitters to make one projector into two (or more) by using "arms" etc. to split the image.
  • the projection system may make use of polarization for 3D guidance or use dual-arm or dual-device projection with polarized light and (passive) glasses for 3D in-situ ultrasound guidance display.
  • the projection may be onto a screen consisting of any of: a fog screen, switchable film, or UV-fluorescent glass, as almost-in-situ projection surfaces.
  • An augmentation device where one of the cameras or a dedicated camera is outward-looking to track the user, to help correct the visualization for geometric distortion or probe motion. This may also be used to solve the parallax problem when projecting in 3D.
  • the augmentation device can estimate relative motion.
  • the projection system may project a fixed pattern upwards onto the environment to support tracking with stereo cameras (limited degrees of freedom, depending on environment structure and the direction of motion)
  • a projection system that, in addition to projecting onto the patient surface, might instead project onto other rigid or deformable objects in the workspace or the reading room.
  • the camera might reconstruct a sheet of paper in space, and the projector could project the CT data of a preoperative scan onto the paper. As the paper is deformed the CT data would be altered to reflect the data that it would "slice through” if it were inside the body. This would allow the visualization of curved surfaces or curvilinear structures.
  • the system may have an electronic or printable signature that records the essential targeting information in an easy-to-use way. This information may be loaded or scanned visually by the device itself when the patient is re-imaged.
  • This may include providing training for those learning about diagnostic or interventional ultrasound, or making it possible for the general population to make use of ultrasound-based treatments for illness (e.g. automated carotid scanning in pharmacies).
  • nondestructive inspection of an airplane wing may use ultrasound or X-ray, but in either case requires exact guidance to the inspection location (e.g. a wing attachment) in question.
  • the methods described above can provide this guidance.
  • the system could provide guidance for e.g. throwing darts, hitting a pool ball, or a similar game.
  • Figure 1 shows an embodiment of an augmentation device for an imaging system according to an embodiment of the current invention.
  • Figure 2 is a schematic illustration of the augmentation device of Figure 1 in which the bracket is not shown.
  • Figures 3A-3I are schematic illustrations of augmentation devices and imaging systems according to some embodiments of the current invention.
  • Figure 4 is a schematic illustration of a system for (MRI-)image-guided surgery according to an embodiment of the current invention.
  • Figure 5 shows representational illustrations of three camera configurations according to different embodiments of the invention: a stereo camera arrangement (left), a single camera arrangement (center), and an omnidirectional camera arrangement (right).
  • Figures 6A and 6B are schematic illustrations of an augmentation device for a handheld imaging system according to an embodiment including a switchable semi-transparent screen for projection purposes.
  • Figure 7 is a schematic illustration of an augmentation device for a handheld imaging system according to an embodiment including a laser-based system for photoacoustic imaging (utilizing both tissue- and airborne laser and ultrasound waves) for needle tracking and improved imaging quality in some applications.
  • Figures 8A and 8B are schematic illustrations of one possible approach for needle guidance, using projected guidance information overlaid directly onto the imaged surface, with an intuitive dynamic symbol scheme for position/orientation correction support.
  • Figure 9 shows the appearance of a needle touching a surface in a structured light system for an example according to an embodiment of the current application.
  • Figure 10 shows surface registration results using CPD on points acquired from CT and a ToF camera for an example according to an embodiment of the current application.
  • Figure 11 shows a comparison of SNR and CNR values that show a large improvement in quality and reliability of strain calculation when the RF pairs are selected using our automatic frame selection method for an example according to an embodiment of the current application.
  • Figure 12 shows a breast phantom imaged with a three-color sine wave pattern; at right, the corresponding 3D reconstruction, for an example according to an embodiment of the current application.
  • Figure 13 shows laparoscopic partial nephrectomy guided by US elasticity imaging for an example according to an embodiment of the current application. Left: System concept and overview. Right: Augmented visualization.
  • Figure 14 shows laparoscopic partial nephrectomy guided by US probe placed outside the body for an example according to an embodiment of the current application.
  • Figure 15 shows an example of a photoacoustic-based registration method according to an embodiment of the current application.
  • the pulsed laser projector initiates a pattern that can generate PA signals in the US space.
  • fusion of both US and camera spaces can be easily established using a point-to-point real-time registration method.
  • Figure 16 shows ground truth (left image) reconstructed by the complete projection data according to an embodiment of the current application.
  • the middle one is reconstructed using the truncated sinogram with 200 channels trimmed from both sides.
  • the right one is reconstructed using the truncated data and the extracted trust region (rectangle support).
  • Figure 17 is a schematic illustration showing projection of live ultrasound
  • Figure 18 is a schematic illustration of different structured-light patterns shown with varying spatial frequencies.
  • Figure 19 is a schematic illustration of different structured-light patterns, with and without edges, to aid the detection of straight needles.
  • Figure 20 is a schematic illustration of randomizing through different patterns over time to increase the data density for stereo surface reconstruction.
  • Figure 21 is a schematic illustration of use of a camera/projection unit combination outside of an imaging device next to the patient; here projecting structured-light patterns onto the skin as well as onto a semi-transparent or switchable-film screen above the patient.
  • Figure 22 is a schematic illustration showing that, using a switchable-film, fluorescent, or similar semi-transparent screen, simultaneous projection onto both the patient and the screen is possible.
  • Figure 23 is a schematic illustration of dual-shadow passive guidance - by projecting one line from each projection center, two light planes are created that intersect at the desired needle pose and allow passive alignment.
  • Figure 24 is a schematic illustration of semi-active, single-shadow guidance
  • the needle can be passively aligned in one plane and actively in the remaining degrees of freedom.
  • Figure 25 is a schematic illustration of using "bulby" (bottom) as opposed to straight lines (top) to improve needle guidance performance and usability because of the additional directional information to the user.
  • Figure 26A is a schematic illustration of a setup for camera-ultrasound calibration with a double-wedge phantom.
  • the ultrasound probe becomes aligned with the wedges' central plane during a manual sweep, and simultaneously a stereo view of a grid allows reconstruction of the camera pose relative to the well-known phantom.
  • Figure 26B is an illustration of a multi-line phantom. This figure shows another configuration of a known geometry that can uniquely identify the pose of the ultrasound imaging frame, and relate the ultrasound image to the known optical landmark (the checkerboard). Hence the calibration can be performed from a single image.
  • Figure 27 is a schematic illustration showing how estimation of the needle pose in camera coordinates allows optimization of ultrasound imaging parameters (such as focus depth) for best needle or target imaging.
  • Figure 28A is a schematic illustration of target/directional symbols that indicate the changes to the needle pose to be made by the user in order to align with the target.
  • Figure 28B is a schematic illustration of dual-shadow approach for passive guidance.
  • Figure 28C is a schematic illustration showing how direct projection of target/critical regions onto the surface allows freehand navigation by the user.
  • Figure 29 is a schematic illustration showing how projection of visible rays from the projection center onto arbitrary surfaces allows reconstruction of lines that in turn allow the projection center to be reconstructed in camera coordinates, helping to calibrate cameras and projectors.
  • Figure 30 is a schematic illustration showing how the system uses the projected insertion point as a "capture range" reference, discarding/not tracking needles that point too far away from it.
  • Figure 31 is a schematic illustration of passive needle alignment using one projector, one camera: Alignment of the needle with the projected line constrains the pose to a plane, while alignment with a line overlaid onto the camera image imposes another plane; together defining a needle insertion pose.
  • Figure 32 is a schematic illustration of double-shadow passive guidance with a single projector and dual-mirror attachment: The single projection cone is split into two virtual cones from different virtual centers, thus allowing passive alignment with limited hardware overhead.
  • Figure 33 is a picture illustrating how double-wedges show up in ultrasound and how they are automatically detected/segmented (the green triangle). This is the pose recovery based on ultrasound images.
  • Figure 34 is a screenshot of the system's graphical user interface showing the image overlay for out-of-plane views (the top section, with the green crosshair+line crossing the horizontal gray “ultrasound plane” line).
  • In Figures 5 and 17 through 32, projected images are shown in blue, and camera views are shown in red. Additionally, C denotes cameras 1 and 2; P is the projector; P' is the projected image (blue); C is camera views (red); N is a needle or instrument; M is a mirror; B is the base; US is ultrasound; I is the imaging system; SLS is the structured light surface; O is the object or patient surface; and S is a semi-transparent or switchable-film screen (except for Figures 24 and 32, where S is a real (cast) line shadow, and S' are projected shadow lines for alignment).
  • Some embodiments of this invention describe IGI (image-guided intervention)-enabling "platform technology" going beyond the current paradigm of relatively narrow image guidance and tracking. It simultaneously aims to overcome limitations of tracking, registration, visualization, and guidance; specifically using and integrating techniques related, e.g., to needle identification and tracking using 3D computer vision, structured light, and photoacoustic effects; multi-modality registration with novel combinations of orthogonal imaging modalities; and imaging device tracking using local sensing approaches; among others.
  • the current invention covers a wide range of different embodiments, sharing a tightly integrated common core of components and methods used for general imaging, projection, vision, and local sensing.
  • Some embodiments of the current invention are directed to combining a group of complementary technologies to provide a local sensing approach that can provide enabling technology for the tracking of medical imaging devices, for example, with the potential to significantly reduce errors and increase positive patient outcomes.
  • This approach can provide a platform technology for the tracking of ultrasound probes and other imaging devices, intervention guidance, and information visualization according to some embodiments of the current invention.
  • Some embodiments of the current invention allow the segmentation, tracking, and guidance of needles and other tools (using visual, ultrasound, and possibly other imaging and localization modalities), allowing for example the integration with the above-mentioned probe tracking capabilities into a complete tracked, image-guided intervention system.
  • the same set of sensors can enable interactive, in-place visualization using additional projection components.
  • This visualization can include current or pre-operative imaging data or fused displays thereof, but also navigation information such as guidance overlays.
  • the same projection components can help in surface acquisition and multi- modality registration, capable of reliable and rapid fusion with pre-operative plans, in diverse systems such as handheld ultrasound probes, MRI/CT/C-arm imaging systems, wireless capsule endoscopy, and conventional endoscopic procedures, for example.
  • Such devices can allow imaging procedures with improved sensitivity and specificity as compared to the current state of the art. This can open up several possible application scenarios that previously required harmful X-ray/CT or expensive MRI imaging, and/or external tracking, and/or expensive, imprecise, time-consuming, or impractical hardware setups, or that were simply afflicted with an inherent lack of precision and guarantee of success, such as:
  • biopsies, RF/HIFU ablations etc. can allow 2D- or 3D-ultrasound-based needle guidance without external tracking,
  • brachytherapy can allow 3D-ultrasound acquisition and needle guidance for precise brachytherapy seed placement
  • cone-beam CT reconstruction can enable high-quality C-arm CT reconstructions with reduced radiation dose and focused field of view
  • gastroenterology can perform localization and trajectory reconstruction for wireless capsule endoscopes
  • Some embodiments of the current invention can provide several advantages over existing technologies, such as combinations of:
  • some embodiments of the current invention are directed to devices and methods for the tracking of ultrasound probes and other imaging devices.
  • By combining ultrasound imaging with image analysis algorithms, probe-mounted cameras, and very low-cost, independent optical-inertial sensors, it is possible to reconstruct the position and trajectory of the device and of possible tools or other objects by incrementally tracking their current motion, according to an embodiment of the current invention.
  • This can provide several possible application scenarios that previously required expensive, imprecise, or impractical hardware setups. Examples can include the generation of freehand three- dimensional ultrasound volumes without the need for external tracking, 3D ultrasound-based needle guidance without external tracking, improved multi-modal registration, simplified image overlay, or localization and trajectory reconstruction for wireless capsule endoscopes over extended periods of time, for example.
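  • The incremental tracking idea can be illustrated by dead-reckoning the probe pose from per-frame gyroscope rates and per-frame translation increments (e.g. from the optical sensor); this is a simplified sketch with assumed inputs, ignoring drift correction:

        import numpy as np

        def so3_exp(omega):
            # Rotation matrix for a small rotation vector (Rodrigues' formula).
            theta = np.linalg.norm(omega)
            if theta < 1e-12:
                return np.eye(3)
            k = omega / theta
            K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
            return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

        def integrate_trajectory(gyro_rates, translations, dt):
            # Accumulate probe poses from per-frame angular rates (rad/s) and probe-frame translations.
            R, t, poses = np.eye(3), np.zeros(3), []
            for w, dp in zip(gyro_rates, translations):
                R = R @ so3_exp(np.asarray(w) * dt)
                t = t + R @ np.asarray(dp)
                poses.append((R.copy(), t.copy()))
            return poses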
  • the same set of sensors can enable interactive, in-place visualization using additional projection components according to some embodiments of the current invention.
  • FIG. 1 is an illustration of an embodiment of an augmentation device 100 for an imaging system according to an embodiment of the current invention.
  • the augmentation device 100 includes a bracket 102 that is structured to be attachable to an imaging component 104 of the imaging system.
  • the imaging component 104 is an ultrasound probe and the bracket 102 is structured to be attached to a probe handle of the ultrasound probe.
  • the bracket 102 can be structured to be attachable to other handheld instruments for image-guided surgery, such as surgical orthopedic power tools or stand-alone handheld brackets, for example.
  • the bracket 102 can be structured to be attachable to the C-arm of an X-ray system or an MRI system, for example.
  • the augmentation device 100 also includes a projector 106 attached to the bracket 102.
  • the projector 106 is arranged and configured to project an image onto a surface in conjunction with imaging by the imaging component 104.
  • the projector 106 can be at least one of a visible light imaging projector, a laser imaging projector, a pulsed laser, or a projector of a fixed or selectable pattern (using visible, laser, or infrared/ultraviolet light).
  • the use of different spectral ranges and power intensities enables different capabilities, such as infrared for structured light illumination simultaneous with e.g.
  • a fixed pattern projector can include, for example, a light source arranged to project through a slide, a mask, a reticle, or some other light-patterning structure such that a predetermined pattern is projected onto the region of interest. This can be used, for example, for projecting structured light patterns (such as grids or locally unique patterns) onto the region of interest.
  • Another use for such projectors can be the overlay of user guidance information onto the region of interest, such as dynamic needle-insertion-supporting symbols (circles and crosses, cf. Figure 8). Such a projector can be made to be very compact in some applications.
  • a projector of a selectable pattern can be similar to the fixed pattern device, but with a mechanism to select and/or exchange the light-patterning component.
  • a rotating component could be used in which one of a plurality of predetermined light-patterning sections is moved into the path of light from the light source to be projected onto the region of interest.
  • said projector(s) can be a stand-alone element of the system, or combined with a subset of other components described in the current invention, i.e. not necessarily integrated in one bracket or holder with another imaging device.
  • the projector(s) may be synchronized with the camera(s), imaging unit, and/or switchable film screens.
  • the augmentation device 100 can also include at least one camera 108 attached to the bracket 102.
  • a second camera 110 can also be attached to the bracket 102, either with or without the projector, to provide stereo vision, for example.
  • the camera can be at least one of a visible-light camera, an infra-red camera, or a time-of-flight camera in some embodiments of the current invention.
  • the camera(s) can be standalone or integrated with one or more projection units in one device as well, depending on the application. They may have to be synchronized with the projector(s) and/or switchable film glass screens as well.
  • Additional cameras and/or projectors could be provided - either physically attached to the main device, some other component, or free-standing - without departing from the general concepts of the current invention.
  • the cameras need not be traditional perspective cameras, but may be of other types such as catadioptric or other omni-directional designs, line scan, and so forth. See, e.g., Figure 5.
  • the camera 108 and/or 110 can be arranged to observe a surface region close to the imaging component 104 during its operation.
  • the two cameras 108 and 110 can be arranged and configured for stereo observation of the region of interest.
  • one of the cameras 108 and 110, or an additional camera, or two or more, can be arranged to track the user's face location during visualization to provide information regarding the viewing position of the user. This can permit, for example, the projection of information onto the region of interest in such a way that it takes into account the position of the viewer, e.g. to address the parallax problem.
  • Figure 2 is a schematic illustration of the augmentation device 100 of Figure 1 in which the bracket 102 is not shown for clarity.
  • Figure 2 illustrates further optional local sensing components that can be included in the augmentation device 100 according to some embodiments of the current invention.
  • the augmentation device 100 can include a local sensor system 112 attached to the bracket 102.
  • the local sensor system 112 can be part of a conventional tracking system, such as an EM tracking system, for example.
  • the local sensor system 112 can provide position and/or orientation information of the imaging component 104 to permit tracking of the imaging component 104 while in use without the need for external reference frames such as with conventional optical or EM tracking systems.
  • Such local sensor systems can also help in the tracking (e.g.
  • the local sensor system 112 can include at least one of an optical, inertial, or capacitive sensor, for example.
  • the local sensor system 112 includes an inertial sensor component 114 which can include one or more gyroscopes and/or linear accelerometers, for example.
  • the local sensor system 112 has a three-axis gyro system that provides rotation information about three orthogonal axes of rotation.
  • the three-axis gyro system can be a micro-electromechanical system (MEMS) three-axis gyro system, for example.
  • the local sensor system 112 can alternatively, or in addition, include one or more linear accelerometers that provide acceleration information along one or more orthogonal axes in an embodiment of the current invention.
  • the linear accelerometers can be, for example, MEMS accelerometers.
  • the local sensor system 112 can include an optical sensor system 116 arranged to detect motion of the imaging component 104 with respect to a surface.
  • the optical sensor system 116 can be similar to the sensor system of a conventional optical mouse (using visible, IR, or laser light), for example.
  • the optical sensor system 116 can be optimized or otherwise customized for the particular application. This may include the use of (potentially stereo) cameras with specialized feature and device tracking algorithms (such as scale-invariant feature transform/SIFT and simultaneous localization and mapping/SLAM, respectively) to track the device, various surface features, or surface region patches over time, supporting a variety of capabilities such as trajectory reconstruction or stereo surface reconstruction.
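  • A minimal sketch of such feature-based patch tracking using OpenCV's SIFT detector and RANSAC homography estimation (assuming an opencv-python build that includes SIFT); this illustrates the general approach rather than the specific algorithms of the device:

        import cv2
        import numpy as np

        def track_surface_patch(prev_gray, curr_gray):
            # Match SIFT features between consecutive frames and estimate the patch homography.
            sift = cv2.SIFT_create()
            k1, d1 = sift.detectAndCompute(prev_gray, None)
            k2, d2 = sift.detectAndCompute(curr_gray, None)
            if d1 is None or d2 is None:
                return None
            matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d1, d2, k=2)
            good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe ratio test
            if len(good) < 4:
                return None
            src = np.float32([k1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
            dst = np.float32([k2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # RANSAC rejects outliers
            return H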
  • the local sensor system 112 can include a local ultrasound sensor system to make use of the airborne photoacoustic effect.
  • one or more pulsed laser projectors direct laser energy towards the patient tissue surface, the surrounding area, or both, and airborne ultrasound receivers placed around the probe itself help to detect and localize potential objects such as tools or needles in the immediate vicinity of the device.
  • the projector 106 can be arranged to project an image onto a local environment adjacent to the imaging component 104.
  • the projector 106 can be adapted to project a pattern onto a surface in view of the cameras 108 and 110 to facilitate stereo object recognition and tracking of objects in view of the cameras.
  • structured light can be projected onto the skin or an organ of a patient according to some embodiments of the current invention.
  • the projector 106 can be configured to project an image that is based on ultrasound imaging data obtained from the ultrasound imaging device.
  • the projector 106 can be configured to project an image based on imaging data obtained from an x-ray computed tomography imaging device or a magnetic resonance imaging device, for example. Additionally, preoperative data or real-time guidance information could also be projected by the projector 106.
  • the invention may include the projection of the ultrasound data, and simultaneously that projection may be used to improve stereo reconstruction performance. See, e.g., Figure 17.
  • parameters of the projected pattern may include (a) spatial frequencies (both the presence of edges vs. smoother transitions as well as color patch sizes) - to adapt to surface distance, apparent structure sizes, or camera resolutions, see, e.g., Figures 18 and 19, - or (b) colors - to adapt to surface properties such as skin type or environment conditions such as ambient lighting, or (c) to randomize/iterate through different patterns over time, see, e.g., Figure 20.
  • Both structured-light patterns as well as projected guidance symbols contribute to surface reconstruction performance, but can also be detrimental to overall system performance, e.g. when straight edges interfere with needle tracking. In such cases, projection patterns and guidance symbols can be adapted to optimize system metrics (such as tracking success/robustness, surface outlier ratio etc.), e.g. by introducing more curvy features.
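  • As an illustration of adapting the projected pattern, the sketch below generates a three-color sine-wave structured-light pattern whose spatial period can be chosen according to the criteria above; the sizes and phases are arbitrary example values, not parameters from the patent:

        import numpy as np

        def sine_color_pattern(width, height, period_px, phases=(0.0, 2.1, 4.2)):
            # Three-color sine-wave pattern; period_px sets the spatial frequency.
            x = np.arange(width)
            img = np.zeros((height, width, 3), dtype=np.uint8)
            for ch, phase in enumerate(phases):
                stripe = 127.5 * (1.0 + np.sin(2.0 * np.pi * x / period_px + phase))
                img[:, :, ch] = np.tile(stripe.astype(np.uint8), (height, 1))
            return img

        pattern = sine_color_pattern(1280, 800, period_px=64)  # coarser pattern for a more distant surface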
  • the augmentation device 100 can also include a communication system that is in communication with at least one of the local sensor system 112, camera 108, camera 110, or projector 106 according to some embodiments of the current invention.
  • the communication system can be a wireless communication system according to some embodiments, such as, but not limited to, a Bluetooth wireless communication system.
  • Although Figures 1 and 2 illustrate the imaging system as an ultrasound imaging system and the bracket 102 as structured to be attached to an ultrasound probe handle 104, the broad concepts of the current invention are not limited to this example.
  • the bracket can be structured to be attachable to other imaging systems, such as, but not limited to, x-ray and magnetic resonance imaging systems, for example.
  • FIG. 3A is a schematic illustration of an augmentation device 200 attached to the C-arm 202 of an x-ray imaging system.
  • the augmentation device 200 is illustrated as having a projector 204, a first camera 206 and a second camera 208.
  • Conventional and/or local sensor systems can also be optionally included in the augmentation device 200, improving the localization of single C-arm X-ray images by enhancing C-arm angular encoder resolution and estimation robustness against structural deformation.
  • the x-ray source 210 typically projects an x-ray beam that is not wide enough to encompass the patient's body completely, resulting in severe truncation artifacts in the reconstruction of so-called cone beam CT (CBCT) image data.
  • the camera 206 and/or camera 208 can provide information on the amount of extension of the patient beyond the beam width.
  • This information can be gathered for each angle as the C-arm 202 is rotated around the patient 212 and be incorporated into the processing of the CBCT image to at least partially compensate for the limited beam width and reduce truncation artifacts.
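  • As a schematic illustration only (not the compensation algorithm of the invention), one simple way to use a camera-measured overhang is to extrapolate each truncated projection row before reconstruction; the cosine taper and all sizes below are assumptions.

```python
# Schematic sketch: pad a truncated CBCT projection row using a camera-derived
# estimate of how far the patient extends beyond the detector field of view.
import numpy as np

def pad_projection_row(row, overhang_px_left, overhang_px_right):
    """Extend a 1-D detector row by smoothly tapering the edge values to zero
    over the camera-estimated overhang widths (in detector pixels)."""
    left = row[0] * 0.5 * (1 + np.cos(np.linspace(np.pi, 0, overhang_px_left)))
    right = row[-1] * 0.5 * (1 + np.cos(np.linspace(0, np.pi, overhang_px_right)))
    return np.concatenate([left, row, right])

row = np.ones(512)            # a fully truncated (saturated at the edges) projection row
padded = pad_projection_row(row, overhang_px_left=80, overhang_px_right=40)
print(len(padded))            # 512 + 80 + 40 detector pixels
```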
  • conventional and/or local sensors can provide accurate data of the precise angle of illumination by the x-ray source, for example (more precise than potential C-arm encoders themselves, and potentially less susceptible to arm deformation under varying orientations).
  • Other uses of the camera-projection combination units are surface-supported multi-modality registration, or visual needle or tool tracking, or guidance information overlay.
  • The arrangement of Figure 3A is very similar to that of an augmentation device for an MRI system.
  • FIG. 3B is a schematic illustration of a system for image-guided surgery 400 according to some embodiments of the current invention.
  • the system for image-guided surgery 400 includes an imaging system 402, and a projector 404 configured to project an image onto a region of interest during imaging by the imaging system 402.
  • the projector 404 can be arranged proximate the imaging system 402, as illustrated, or it could be attached to or integrated with the imaging system.
  • the imaging system 402 is illustrated schematically as an x-ray imaging system.
  • the invention is not limited to this particular example.
  • the imaging system could also be an ultrasound imaging system or a magnetic resonance imaging system, for example.
  • the projector 404 can be at least one of a white light imaging projector, a laser light imaging projector, a pulsed laser, or a projector of a fixed or selectable pattern, for example.
  • the system for image-guided surgery 400 can also include a camera 406 arranged to capture an image of a region of interest during imaging by the imaging system.
  • a second camera 408 could also be included in some embodiments of the current invention.
  • a third, fourth or even more cameras could also be included in some embodiments.
  • the region of interest being observed by the imaging system 402 can be substantially the same as the region of interest being observed with the camera 406 and/or camera 408.
  • the cameras 406 and 408 can be at least one of a visible-light camera, an infra-red camera or a time-of-flight camera, for example.
  • Each of the cameras 406, 408, etc. can be arranged proximate the imaging system 402 or attached to or integrated with the imaging system 402.
  • the system for image-guided surgery 400 can also include one or more sensor systems, such as sensor systems 410 and 412, for example.
  • the sensor systems 410 and 412 are part of a conventional EM sensor system.
  • other conventional sensor systems such as optical tracking systems could be used instead of or in addition to the EM sensor systems illustrated.
  • one or more local sensor systems such as local sensor system 112 could also be included instead of sensor systems 410 and/or 412.
  • the sensor systems 410 and/or 412 could be attached to any one of the imaging system 402, the projector 404, camera 406 or camera 408, for example.
  • Each of the projector 404 and cameras 406 and 408 could be grouped together or separate and could be attached to or made integral with the imaging system 402, or arranged proximate the imaging system 402, for example.
  • Figure 4 illustrates one possible use of a camera/projection combination unit in conjunction with a medical imaging device such as MRI or CT.
  • Image-guided interventions based on these modalities suffer from registration difficulties arising from the fact that in-place interventions are awkward or impossible due to space constraints within the imaging device bores, among other reasons. Therefore, a multi-modality image registration system supporting the interactive overlay of potentially fused pre- and intra-operative image data could support or enable e.g. needle-based percutaneous interventions with massively reduced imaging requirements in terms of duration, radiation exposure, cost etc.
  • a camera/projection unit outside the main imaging system could track the patient, reconstruct the body surface using e.g. structured light and stereo reconstruction, and register and track needles and other tools relative to it.
  • handheld units comprising switchable film glass screens could be tracked optically and used as interactive overlay projection surfaces, see, e.g., Figure 21.
  • the tracking accuracy for such screens could be improved by attaching (at least inertial) local sensor systems to said screens, allowing better orientation estimation than using visual cues alone.
  • the screens need not impede the (potentially structured-light-supported) reconstruction of the underlying patient surface, nor block the user's view of that surface, as they can be rapidly switched (up to hundreds of times per second) alternating between a transparent mode to allow pattern and guidance information projection onto the surface, and an opaque mode to block and display other user-targeted data, e.g. in a tracked 3D data visualization fashion.
  • Such switchable film glass screens can also be attached to handheld imaging devices such as ultrasound probes and the afore-mentioned brackets as in Figure 6.
  • imaging and/or guidance data can be displayed on a handheld screen - in opaque mode - directly adjacent to imaging devices in the region of interest, instead of on a remote monitor screen; in transparent mode, structured light projection and/or surface reconstruction are not impeded by the screen, see, e.g., Figure 22.
  • the data is projected onto or through the switchable screen using the afore-mentioned projection units, allowing a more compact handheld design or even remote projection.
  • these screens (handheld or bracket-mounted) can also be realized using e.g. UV-sensitive/fluorescent glass, requiring a (potentially multi-spectral, for color reproduction) UV projector to create bright images on the screen, but making active control of screen mode switching unnecessary.
  • overlay data projection onto the screen and structured light projection onto the patient surface can be run in parallel, provided the structured light uses a frequency unimpeded by the glass.
  • Figure 7 describes a possible extension to the augmentation device ("bracket") described for handheld imaging devices, comprising one or more pulsed lasers as projection units that are directed through fibers towards the patient surface, exciting tissue-borne photoacoustic effects, and towards the sides of the imaging device, emitting the laser pulse into the environment, allowing airborne photoacoustic imaging.
  • the handheld imaging device and/or the augmentation device comprise ultrasound receivers around the device itself, pointing into the environment. Both photoacoustic channels can be used e.g. to enable in-body and out-of-body tool tracking or out-of-plane needle detection and tracking, improving both detectability and visibility of tools/needles under various circumstances.
  • the photoacoustic effect can be used together with its structured-light aspect for registration between endoscopic video and ultrasound.
  • a unique pattern of light incidence locations is generated on the endoscope-facing surface side of observed organs.
  • One or more camera units next to the projection unit in the endoscopic device observe the pattern, potentially reconstructing its three-dimensional shape on the organ surface.
  • a distant ultrasound imaging device on the opposite side of the organ under observation receives the resulting photoacoustic wave patterns and is able to reconstruct and localize their origins, corresponding to the pulsed-laser incidence locations.
  • Figure 8 outlines one possible approach to display needle guidance information to the user by means of direct projection onto the surface in the region of interest in a parallax-independent fashion, so the user position is not relevant to the method's success (the same method can be applied to projection e.g. onto a device-affixed screen as described above, or onto handheld screens). Using e.g. a combination of moving symbols varying in color/size/thickness etc., the five degrees of freedom governing a needle insertion (two each for insertion point location and needle orientation, and one for insertion depth and/or target distance) can be intuitively displayed to the user.
  • the position and color of a projected circle on the surface indicate the intersection of the line between the current needle position and the target location with the patient surface, and said intersection point's distance from a planned insertion point.
  • the position, color, and size of a projected cross can encode the current orientation of the needle with respect to the correct orientation towards the target location, as well as the needle's distance from the target.
  • the orientation deviation is also indicated by an arrow pointing towards the proper position/orientation configuration.
  • guidance information necessary to adjust the needle orientation can be projected as a virtual shadow onto the surface next to the needle insertion point, prompting the user to minimize the shadow length to properly orient the needle for insertion.
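  • A minimal sketch of the guidance geometry described above is given below, assuming a locally planar patient surface; all input values (needle tip/tail, target, planned entry point, tolerance) are hypothetical examples, not values from the invention.

```python
# Sketch: where the needle-to-target line crosses the (locally planar) skin,
# how far that is from the planned entry point, and the needle's angular error.
import numpy as np

def line_plane_intersection(p0, direction, plane_point, plane_normal):
    t = np.dot(plane_normal, plane_point - p0) / np.dot(plane_normal, direction)
    return p0 + t * direction

needle_tip    = np.array([10.0, 5.0, 40.0])   # mm, tracked by the cameras
needle_tail   = np.array([15.0, 8.0, 90.0])   # mm
target        = np.array([0.0, 0.0, -30.0])   # mm, subsurface target
planned_entry = np.array([2.0, 1.0, 0.0])     # mm, planned skin entry point
plane_point, plane_normal = np.zeros(3), np.array([0.0, 0.0, 1.0])  # skin plane z = 0

# Circle: intersection of the needle-to-target line with the skin, and its
# distance from the planned entry point (which could drive the circle's color).
to_target = target - needle_tip
circle = line_plane_intersection(needle_tip, to_target, plane_point, plane_normal)
entry_error_mm = np.linalg.norm(circle - planned_entry)

# Cross: angular deviation of the needle axis from the correct orientation.
needle_axis = (needle_tip - needle_tail) / np.linalg.norm(needle_tip - needle_tail)
correct_axis = to_target / np.linalg.norm(to_target)
angle_error_deg = np.degrees(np.arccos(np.clip(np.dot(needle_axis, correct_axis), -1, 1)))

print(f"entry-point error {entry_error_mm:.1f} mm, orientation error {angle_error_deg:.1f} deg")
```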
  • Needle guidance may be active, by projecting crosshairs or other targeting information for all degrees of freedom as described above. Needle guidance may also make use of shadows as a means of alignment.
  • a "single-shadow alignment" can be used for 1 degree of freedom with additional active tracking/gui dance for remaining degree of freedom, e.g. circles or crosshairs, see, e.g., Figure 24.
  • stereo guidance may make use of shadows, active light planes, or other similar methods, see, e.g., Figures 23 and 32.
  • needle guidance may be passive (without needle tracking) by using simple alignment either in stereo views/cameras or in dual projector shadows or patterns.
  • Specific projection patterns may be used to enhance the speed or reliability of tracking. Examples include specific shadow "brush types" or profiles to help quickly and precisely align the needle shadow with the projected shadow ("bulby lines" etc.). See, e.g., Figure 25. Other patterns may be better for rough vs. precise alignments.
  • the system may also make use of "shadows" or projections of critical areas or forbidden regions onto the patient surface, using pre-op CT/MRI or a non-patient-specific atlas to define a "roadmap" for an intervention, see, e.g., Figure 25.
  • While the above-mentioned user guidance display is independent of the user viewing direction, several other information displays (such as some variations on the image- guided intervention system shown in Figure 4) may benefit from knowledge about the location of the user's eyes relative to the imaging device, the augmentation device, another handheld camera/projection unit, and/or projection screens or the patient surface.
  • Such information can be gathered using one or more optical (e.g. visible- or infrared-light) cameras pointing away from the imaging region of interest towards regions of space where the user face may be expected (such as upwards from a handheld ultrasound imaging device) combined with face-detection capabilities to determine the user's eye location, for example.
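  • A minimal face-detection sketch of the kind described above is given below (using a standard OpenCV Haar cascade on an upward-looking camera frame); the input file name and the crude eye-line estimate are assumptions for illustration.

```python
# Sketch: locate the user's face in an upward-looking camera frame and take the
# upper third of the face box as a rough eye-line estimate.
import cv2

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
frame = cv2.imread("upward_camera_frame.png")         # hypothetical input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    eye_region_center = (x + w // 2, y + h // 3)      # crude eye-line estimate
    print("approximate eye location (px):", eye_region_center)
```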
  • the local sensor system can include inertial sensors 506, such as a three-axis gyro system, for example.
  • the local sensor system 504 can include a three-axis MEMS gyro system.
  • the local sensor system 504 can include optical position sensors 508, 510 to detect motion of the capsule imaging device 500.
  • the local sensor system 504 can permit the capsule imaging device 500 to record position information along with imaging data to facilitate registering image data with specific portions of a patient's anatomy after recovery of the capsule imaging device 500, for example.
  • Some embodiments of the current invention can provide an augmentation of existing devices which comprises a combination of different sensors: an inertial measurement unit based on a 3-axis accelerometer; one or two optical displacement tracking units (OTUs) for lateral surface displacement measurement; one, two or more optical video cameras; and a (possibly handheld and/or linear) ultrasound (US) probe, for example.
  • the latter may be replaced or accompanied by a photoacoustic (PA) arrangement, i.e. one or more active lasers, a photoacoustically active extension, and possibly one or more separate US receiver arrays.
  • an embodiment of the current invention may include a miniature projection device capable of projecting at least two distinct features.
  • These sensors may be mounted, e.g. on a common bracket or holder, onto the handheld US probe, with the OTUs pointing towards and close to the scanning surface (if more than one, then preferably at opposite sides of the US array), the cameras mounted (e.g., in a stereo arrangement) so they can capture the environment of the scanning area, possible needles or tools, and/or the operating room environment, and the accelerometer in a basically arbitrary but fixed location on the common holder.
  • the projection device may be pointing mainly onto the scanning surface.
  • one PA laser may point towards the PA extension, while the same or another laser may point outwards, with US receiver arrays suitably arranged to capture possible reflected US echoes. Different combinations of the mentioned sensors are possible.
  • the mounting bracket need not be limited to a fixed position or orientation.
  • the augmentation device may be mounted on a re-configurable/rotatable setup to re-orient the device from in-plane to out-of-plane projection and guidance depending on the needs of the operator.
  • the mounting mechanism may also be configurable to allow elevation of the augmentation device to accommodate different user habits (low/high needle grips etc.).
  • the mounting system may also be modular and allow users to add cameras, add projectors, add mechanical guides e.g. for elevation angle control as needed for the application.
  • an interstitial needle or other tool may be used.
  • the needle or tool may have markers attached for better optical visibility outside the patient body.
  • the needle or tool may be optimized for good ultrasound visibility if it is to be inserted into the body.
  • the needle or tool may be combined with inertial tracking components (e.g. accelerometers).
  • additional markers may optionally be used for the definition of registration or reference positions on the patient body surface. These may be optically distinct spots or arrangements of geometrical features designed for visibility and optimized optical feature extraction.
  • for some applications, the device to be augmented by the proposed invention may be a handheld US probe; for others it may be a wireless capsule endoscope (WCE); and other devices are possible for suitably defined applications, where said applications may benefit from the added tracking and navigational capabilities of the proposed invention.
  • an embodiment of the invention includes a software system for opto-inertial probe tracking (OIT).
  • The probe position is reconstructed incrementally as P(t) = P(0) + Σ_i R(i) Δp(i), where the R(i) are the orientations directly sampled from the accelerometers and/or incrementally tracked from relative displacements between the OTUs (if more than one) at time i, and the Δp(i) are the lateral displacements at time i as measured by the OTUs.
  • P(0) is an arbitrarily chosen initial reference position.
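  • A minimal sketch of this dead-reckoning update P(t) = P(0) + Σ_i R(i) Δp(i) is given below; the orientation matrices and displacement samples are placeholder data, not sensor readings from the described device.

```python
# Minimal opto-inertial dead reckoning: P(t) = P(0) + sum_i R(i) * dp(i).
# R(i): 3x3 orientation from the accelerometer/gyro; dp(i): lateral displacement
# from the optical tracking unit(s), expressed in the probe frame.
import numpy as np

def integrate_trajectory(P0, orientations, displacements):
    P = np.array(P0, dtype=float)
    trajectory = [P.copy()]
    for R_i, dp_i in zip(orientations, displacements):
        P = P + R_i @ dp_i            # rotate the local displacement into the world frame
        trajectory.append(P.copy())
    return np.array(trajectory)

# Two placeholder samples: no rotation, then a 90-degree yaw.
R_yaw90 = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
orientations = [np.eye(3), R_yaw90]
displacements = [np.array([1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]

print(integrate_trajectory([0, 0, 0], orientations, displacements))
```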
  • a software system for speckle-based probe tracking is included.
  • An (ultrasound-image-based) speckle decorrelation analysis (SDA) algorithm provides very high-precision 1-DoF translation (distance) information for single ultrasound image patch pairs by decorrelation, and 6-DoF information for the complete ultrasound image when combined with planar 2D-2D registration techniques.
  • Suitable image patch pairs are preselected by means of FDS (fully developed speckle) detection. Precision of distance estimation is improved by basing the statistics on a larger set of input pairs.
  • Another approach can be the integration of opto-inertial tracking information into a maximum-a-posteriori (MAP) displacement estimation.
  • sensor data fusion between OIT and SDA can be performed using a Kalman filter.
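  • As a toy illustration of such a fusion (not the invention's filter design), the sketch below runs a one-dimensional Kalman filter that uses the opto-inertial displacement as the motion input and the speckle-decorrelation position estimate as the measurement; all noise variances and data are illustrative assumptions.

```python
# Toy 1-D Kalman fusion of two displacement sources per time step:
# an opto-inertial (OIT) step used for prediction and a speckle-decorrelation
# (SDA) position used as the measurement.
import numpy as np

def fuse(oit_steps, sda_positions, q=0.05, r=0.02):
    x, p = 0.0, 1.0                           # state (position) and its variance
    fused = []
    for u, z in zip(oit_steps, sda_positions):
        x, p = x + u, p + q                   # predict with the OIT displacement
        k = p / (p + r)                       # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p   # correct with the SDA measurement
        fused.append(x)
    return np.array(fused)

oit_steps = [0.10, 0.11, 0.09, 0.10]          # mm per frame from OIT
sda_positions = [0.09, 0.21, 0.30, 0.41]      # mm cumulative from SDA
print(fuse(oit_steps, sda_positions))
```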
  • a software system for camera-based probe tracking and needle and/or tool tracking and calibration can be included.
  • the holder-mounted camera(s) can detect and segment e.g. a needle in the vicinity of the system.
  • P1 being the needle insertion point into the patient tissue (or alternatively, the surface intersection point in a water container) and P2 being the end or another suitably distant point on the needle
  • a third point P3 being the needle intersection point in the US image frame
  • Another method for calibrating an ultrasound device, a pair of cameras, and a projection device proceeds as follows.
  • the projector projects a pattern onto a planar target.
  • the planar target is observed by the cameras, and is simultaneously measured by the ultrasound probe. Several such images are acquired.
  • Features on the planar target are used to produce a calibration for the camera system.
  • the position of the plane in space can be calculated by the camera system.
  • the projector can be calibrated using the same information.
  • the corresponding position of the intersection of the ultrasound beam with the plane produces a line in the ultrasound image. Processing of several such lines allows the computation of the relative position of the cameras and the ultrasound probe.
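  • One building block of the procedure above is standard camera calibration from several views of the planar target; the sketch below uses an assumed checkerboard purely for illustration, and does not show the ultrasound-line processing or the final camera-to-probe solution.

```python
# Sketch: camera intrinsics (and per-view plane poses) from several views of a
# planar target. The checkerboard geometry and file names are assumptions.
import cv2
import numpy as np
import glob

pattern = (9, 6)                       # inner corners of the assumed checkerboard
square = 10.0                          # mm
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points, size = [], [], None
for path in glob.glob("planar_target_*.png"):      # hypothetical saved camera views
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print("reprojection RMS (px):", rms)
# The per-view rvecs/tvecs give the plane pose; together with the ultrasound
# line observations, these allow solving for the camera-to-probe transform.
```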
  • Synchronizing one or more cameras with an ultrasound system can be accomplished whereby a trigger signal is derived from or generated by the ultrasound system, and this trigger signal is used to trigger camera acquisition.
  • the trigger signal may come from the ultrasound data acquisition hardware, or from the video display associated with the ultrasound system. The same trigger signal may be used to trigger a projection device to show a particular image or pattern.
  • An alternative is a method of software temporal synchronization whereby the camera pair and ultrasound system are moved periodically above a target. The motion of the target in both camera and ultrasound is measured, and the temporal difference is computed by matching or fitting the two trajectories.
  • a method for doing so is disclosed in N. Padoy, G.D. Hager, Spatio-Temporal Registration of Multiple Trajectories, Proceedings of Medical Image Computing and Computer-Assisted Intervention (MICCAI), Toronto, Canada, September 2011.
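  • As a simplified illustration (not the method of the cited reference), the temporal offset between the two measured target trajectories can be estimated by cross-correlating their 1-D motion signals once resampled to a common frame rate; the signals below are synthetic.

```python
# Sketch: estimate the camera/ultrasound time offset by cross-correlating two
# trajectory signals assumed to be resampled to the same frame rate.
import numpy as np

fps = 30.0
t = np.arange(0, 10, 1 / fps)
cam_track = np.sin(2 * np.pi * 0.5 * t)          # target height seen by the cameras
true_lag = 7                                     # frames of ultrasound delay (synthetic)
us_track = np.roll(cam_track, true_lag)

cam0 = cam_track - cam_track.mean()
us0 = us_track - us_track.mean()
xcorr = np.correlate(us0, cam0, mode="full")
lag_frames = np.argmax(xcorr) - (len(cam0) - 1)
print("estimated offset: %d frames (%.1f ms)" % (lag_frames, 1000 * lag_frames / fps))
```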
  • Trigger-based synchronization also provides a means for interleaving patterns for guidance and for other purposes such as stereo reconstruction, whereby a trigger signal causes the projector to switch between patterns.
  • the pattern used by the camera system is invisible to the naked eye so that the user is not distracted by the transition.
  • Calibration can also be accomplished by using a specially constructed volume, as shown in Figures 26A and 26B.
  • the ultrasound system is swept over the volume while the volume is simultaneously observed by the camera system.
  • the surface models from both ultrasound and the camera system are registered to a computational model of the shape, and from this the relative position of the camera and ultrasound system is computed.
  • An alternative implementation is to use nanocapsules that rupture under ultrasound irradiation, creating an opaque layer in a disposable calibration phantom.
  • needle bending can be inferred from a single 2D US image frame and the operator properly notified.
  • 3D image data registration is also aided by the camera(s) overlooking the patient skin surface.
  • three degrees of freedom (tilt, roll, and height) can be constrained using the cameras, facilitating registration of 3D US and e.g. CT or similar modalities by restricting the registration search space (making it faster) or providing initial transformation estimates (making it easier and/or more reliable).
  • This may be facilitated by the application of optical markers onto the patient skin surface, which will also help in the creation of an explicit fixed reference coordinate system for integration of multiple 3D volumes.
  • drapes may be used that are designed to specifically enhance the performance of the system, whereby such drapes contain an easily detected pattern, fiducials, or other reference points, and the drapes adhere to the patient.
  • drapes may also be transparent, allowing the cameras to see the patient directly through the drapes. Drapes may be specially colored to differentiate them from needles to be tracked. The drapes are preferably configured to enhance the ability of the cameras to compute probe motion.
  • Sterility can be preserved by using sterile probe coverings that contain special transparent areas for the cameras and projector to preserve sterility while also preserving or enhancing the function of the cameras and projectors.
  • pressure-sensitive drapes may be used to indicate tissue deformation under the US probe.
  • such drapes could be used to enhance ultrasound elasticity measurement.
  • the pressure-sensitive drapes may be used to monitor the use of the device by noting the level of pressure applied and correcting the registration and display based on that information.
  • the camera(s) provide additional data for pose tracking.
  • this will consist of redundant rotational motion information in addition to opto-inertial tracking.
  • in some cases this information could not be recovered from OIT alone (e.g. yaw motions on a horizontal plane in case of surface-tracking loss of one or both optical translation detectors, or tilt motions without translational components around a vertical axis).
  • This information may originate from a general optical-flow-based rotation estimation, or specifically from tracking of specially applied optical markers onto the patient skin surface, which will also help in the creation of an explicit fixed reference coordinate system for integration of multiple 3D volumes.
  • the camera(s) can provide needle translation information. This can serve as input for ultrasound elasticity imaging algorithms to constrain the search space (in direction and magnitude) for the displacement estimation step by tracking the needle and transforming estimated needle motion into expected motion components in the US frame, using the aforementioned calibration matrix X.
  • the camera(s) can provide dense textured 3D image data of the needle insertion area. This can be used to provide enhanced visualization to the operator, e.g. as a view of the insertion trajectory as projected down along the needle shaft towards the skin surface, using actual needle/patient images.
  • the system may use the pose (location and orientation) of the needle in air to optimize ultrasound to detect the needle in the body and vice-versa, see, e.g., Figure 27.
  • it may be of interest for the cameras in the depth imaging system to have differing fields of view and depth ranges.
  • the cameras may at times be a few tens of centimeters from the surface, and at other times nearly a meter away.
  • integration of a micro-projector unit can provide an additional, real-time, interactive visual user interface, e.g. for guidance purposes.
  • By projecting navigation data onto the patient skin in the vicinity of the probe, the operator need not take his or her eyes away from the intervention site to properly target subsurface regions.
  • By tracking the needle using the aforementioned camera(s), the projected needle entry point (the intersection of the patient skin surface with the extension of the needle shaft), given the current needle position and orientation, can be projected using a suitable representation (e.g. a red dot).
  • similarly, the planned entry point can be indicated using a suitable representation (e.g. a green dot).
  • guidance can be visually provided to the user in a variety of ways, either (a) on-screen or (b) projected through one or more projectors, e.g. directly onto the patient surface near the probe.
  • this guidance can be provided either (a) separately or (b) as an overlay to a secondary image stream, such as ultrasound images or mono- or multi-ocular camera views.
  • this guidance can be either (a) registered to the underlying image or environment geometry such that overlaid symbols correspond to environment features (such as target areas) in location and possibly size and/or shape, or (b) location-independent such that symbol properties, e.g. location, color, size, shape, but also auditory cues such as audio volume, sound clips, and/or frequency changes indicate to the user where to direct the tools or the probe.
  • Guidance symbols can include - in order of increasing specificity - (a) proximity markers (to indicate general "closeness” by e.g. color-changing backgrounds, frames, or image tints, or auditory cues), (b) target markers (to point towards e.g. crosshairs, circles, bulls-eyes etc.), see, e.g., Figure 28A, (c) alignment markers (to line up with e.g. lines, fans, polygons), see, e.g., Figure 28B, or (d) area demarcations (to avoid e.g. shapes denoting critical regions, geometrically or anatomically inaccessible regions etc.), see, e.g., Figure 28C.
  • Overlaid guidance symbols can interfere with overall system performance, e.g. when tracking needles; so adaptation of projected graphic primitives (such as replacing lines with elliptic or curvy structures) can reduce artifacts.
  • guidance "lines” composed of e.g. "string-of-pearls” series of circles/discs/ellipses etc. can improve alignment performance for the user.
  • the apparent thickness of guidance lines/structures can be modified based on detected tool width, distance to projector, distance to surface, excessive intervention duration, etc., to improve alignment performance.
  • Specific - non-exhaustive - examples of the above concepts include: a) overlaying crosshairs and/or extrapolated needle pose lines onto live ultrasound views onscreen or projected onto the patient; b) projecting paired symbols (circles, triangles etc.) that change size, color, and relative position depending on the current targeting error vector; c) overlaying alignment lines onto single/stereo/multiple camera views that denote desired needle poses, allowing the user to line up the camera image of the needle with the target pose, as well as lines denoting the currently-tracked needle pose for quality control purposes; and d) projecting needle alignment lines onto the surface, denoting both target pose (for guidance) as well as currently-tracked pose (for quality control), from one or more projectors.
  • An important aspect of this system is a high accuracy estimate of the location of the projector relative to the probe and to the video camera.
  • One means of doing so is to observe that visible rays emitted by the projector form straight lines in space that intersect at the optical center of the projector.
  • the system can calculate a series of 3D points which can then be extrapolated to compute the center of projection. See, e.g., Figure 29. This can be performed with nearly any planar or nonplanar series of projection surfaces.
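  • A minimal sketch of that extrapolation step is given below: 3-D lines are fit through the reconstructed points of each projected ray, and the point closest to all lines in the least-squares sense estimates the center of projection. The data are synthetic; in practice the points come from stereo reconstruction.

```python
# Sketch: projector center of projection as the least-squares intersection of
# 3-D lines, one line per projected ray.
import numpy as np

def closest_point_to_lines(points_on_lines, directions):
    """Solve sum_i (I - d_i d_i^T)(x - p_i) = 0 for the point x nearest all lines."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points_on_lines, directions):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

true_center = np.array([0.0, 0.0, 500.0])          # mm, synthetic ground truth
rng = np.random.default_rng(1)
dirs = rng.normal(size=(5, 3))
pts = [true_center + 300.0 * d / np.linalg.norm(d) for d in dirs]   # one point on each ray

print(closest_point_to_lines(pts, dirs))            # ~ [0, 0, 500]
```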
  • the overall configuration may be augmented by and/or controlled from a hand-held device such as a tablet computer for 1) ultrasound machine operation, 2) visualization, and 3) registration to the patient for transparent information overlay, by using one or more cameras on the tablet computer.
  • the computational resources used by the device may be augmented with additional computation located elsewhere.
  • This remote computation might be used to process information coming from the device (e.g. to perform a computationally intense registration process), it may be used to recall information useful to the function of the device (e.g. to compare this patient with other similar patients to provide "best practice" treatment options), or it may be used to provide information that directs the device (e.g. transferring the indication of a lesion in a CT image to a remote center for biopsy).
  • the use of external computation may be measured and associated with the costs of using the device.
  • guidance can be provided to indicate the correct depth of penetration. This can be performed by detecting fiducials on the needle, and tracking those fiducials over time. For example, these may be dark rings on the needle itself, which can be counted using the vision system, or they may be a reflective element attached to the end of the needle, and the depth may be computed by subtracting the location of the fiducial in space from the patient surface, and then subtracting that result from the entire length of the needle.
  • the display of the system may passively indicate the number of fiducial rings that should remain outside the patient at the correct depth for the current system pose, providing the user with a perceptual cue that they can use to determine manually if they are at the correct depth.
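  • A tiny worked sketch of this depth bookkeeping is given below; the needle length, ring spacing, and target depth are hypothetical dimensions used only for illustration.

```python
# Sketch: insertion depth = needle length - length remaining outside the patient,
# with ring fiducials counted by the vision system. All dimensions hypothetical.
needle_length_mm = 150.0
ring_spacing_mm = 10.0
rings_visible_outside = 9                       # counted by the vision system

outside_mm = rings_visible_outside * ring_spacing_mm
inserted_mm = needle_length_mm - outside_mm

target_depth_mm = 40.0
rings_at_target = round((needle_length_mm - target_depth_mm) / ring_spacing_mm)
print(f"inserted {inserted_mm} mm; {rings_at_target} rings should remain visible at target depth")
```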
  • the system may make use of the projected insertion point as "capture range" for possible needle poses, discard candidates outside that range, or detect when computed 3D poses violate the expected targeting behavior, see, e.g., Figure 30.
  • the PA laser can fire directly and diffusely at the tissue wall, exciting a PA sound wave emanating from there that is received with the mentioned passive US array and can be used for diagnostic purposes.
  • the diagnostic outcome can be linked to a particular location along the GI tract.
  • Some embodiments of the current invention can allow reconstructing a 2D ultrasound probe's 6-DoF ("degrees of freedom") trajectory robustly, without the need for an external tracking device.
  • the same mechanism can be e.g. applied to (wireless) capsule endoscopes as well. This can be achieved by cooperative sets of local sensors that incrementally track a probe's location through its sequence of motions.
  • an (ultrasound-image-based) speckle decorrelation analysis (SDA) algorithm provides very high-precision 1-DoF translation (distance) information for image patch pairs by decorrelation, and 6-DoF information for the complete ultrasound image when combined with planar 2D-2D registration techniques. Precision of distance estimation is improved by basing the statistics on a larger set of input pairs. (The parallelized approach with a larger input image set can significantly increase speed and reliability.)
  • where a full transmit/receive ultrasound transceiver cannot be integrated (e.g. because of space or energy constraints, as in a wireless capsule endoscope), only an ultrasound receiver can be used according to some embodiments of the current invention.
  • the activation energy in this case comes from an embedded laser. Regular laser discharges excite irregularities in the surrounding tissue and generate photoacoustic impulses that can be picked up with the receiver. This can help to track surfaces and subsurface features using ultrasound and thus provide additional information for probe localization.
  • a component, bracket, or holder housing a set of optical, inertial, and/or capacitive (OIC) sensors represents an independent source of (ultrasound-image-free) motion information.
  • Optical displacement trackers (e.g. from optical mice or cameras) generate local translation data across the scan surface (e.g. skin or intestinal wall), while accelerometers and/or gyroscopes provide absolute orientation and/or rotation motion data.
  • two or more optical video cameras are attached to the ultrasound probe, possibly in stereo fashion, at vantage points that let them view the surrounding environment, including any or all of the patient skin surface, possible tools and/or needles, possible additional markers, and parts of the operation room environment. This way, they serve to provide calibration, image data registration support, additional tracking input data, additional input data supporting ultrasound elasticity imaging, needle bending detection input, and/or textured 3D environment model data for enhanced visualization.
  • When used medically, it may be necessary for the camera-projector device to be maintained in a sterile environment. This may be accomplished in a number of ways.
  • the housing may be resistant to sterilizing agents, and perhaps be cleaned by wiping. It may also be placed in a sterile bag cover. In this case, it may be advantageous to create a "window" of solid plastic in the cover that attaches to the cameras and projector. This window may be attached mechanically, or magnetically, or by static electric attraction ("static cling").
  • Another way of maintaining sterility is to produce a sterile (possibly disposable) housing that the projector-camera device mounts into.
  • One embodiment includes a display system that maintains registration with the probe and which can be used for both visualization and guidance.
  • the probe may have an associated display that can be detached and which shows relevant preoperative CT information based on its position in space. It may also overlay targeting information.
  • One example would include a pair of glasses that were registered to the probe and were able to provide "see through” or "heads up” display to the user.
  • Cameras associated with the augmentation system can be used to perform quality control.
  • the trajectory of a needle can be calculated by visual tracking and thence projected into the ultrasound image. If the needle in the image is inconsistent with this projection, it is a cue that there is a system discrepancy. Conversely, if the needle is detected in the ultrasound image, it can be projected back into the video image to confirm that the external pose of the needle is consistent with that tracked image.
  • the system may simultaneously track the needle in both ultrasound and video images, and use those computed values to detect needle bending and to either update the likely trajectory of the needle, or to alert the user that they are putting pressure on the needle, or both.
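  • In its simplest form, such a consistency check maps the optically tracked needle tip into the ultrasound frame through a calibration transform and compares it against the tip segmented in the ultrasound image; the sketch below uses a hypothetical calibration matrix and tolerance.

```python
# Sketch of the cross-modality consistency check; the calibration transform X,
# positions, and tolerance are placeholder values for illustration only.
import numpy as np

X = np.eye(4)                                   # camera-to-US calibration (placeholder)
X[:3, 3] = [5.0, -2.0, 0.0]                     # hypothetical translation, mm

tip_camera = np.array([20.0, 10.0, 60.0, 1.0])  # homogeneous tip position from video tracking
tip_in_us = (X @ tip_camera)[:3]

tip_segmented_us = np.array([25.5, 8.2, 60.1])  # tip found in the ultrasound image (mm)
discrepancy = np.linalg.norm(tip_in_us - tip_segmented_us)

if discrepancy > 3.0:                           # illustrative tolerance
    print(f"warning: {discrepancy:.1f} mm mismatch - possible needle bending or calibration drift")
else:
    print(f"consistent within {discrepancy:.1f} mm")
```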
  • Quality control can also be performed by processing the ultrasound image to determine that it has the expected structure. For example, if the depth setting of the ultrasound machine differs from that expected by the probe, the structure of the image will differ in detectable ways from that expected in this case - for example the wrong amount of "black space" on the image, or wrong annotations on the screen.
  • the projection center may lie on or near the plane of the ultrasound system.
  • the projector can project a single line or shadow that indicates where this plane is.
  • a needle or similar tool placed in the correct plane will become bright or dark, respectively.
  • a video camera outside this plane can view the scene, and this image can be displayed on a screen. Indeed, it may be included with the ultrasound view. In this case, the clinician can view both the external and internal guidance of the needle simultaneously on the same screen.
  • Guidance to achieve a particular angle can be superimposed on the camera image, so that the intersection of the ultrasound plane and the plane formed by the superimposed guidance forms a line that is the desired trajectory of the needle, see, e.g., Figure 31.
  • a camera may be located along the ultrasound plane, and the projector is located off-plane. The geometry is similar, but according to this embodiment, the camera superimposed image is used to define the plane, and a line is projected by the projector to define the needle trajectory.
  • Further variations include combinations of single or multiple cameras or projectors, where at least one of either is mounted on the mobile device itself as well as mounted statically in the environment, with registration between the mobile and fixed components maintained at all times to make guidance possible. This registration maintenance can be achieved e.g. by detecting and tracking known features present in the environment and/or projected into the common field of interest.
  • the registration component of the system may take advantage of its ability to
  • a micro-projection device integrated into the ultrasound probe bracket can provide the operator with an interactive, real-time visualization modality, displaying relevant data like needle intersection points, optimal entry points, and other supporting data directly in the intervention location by projecting these onto the patient skin surface near the probe.
  • the combination of the camera and projector can be used to construct intuitive and sterile user interfaces on the patient surface, or on any other projectable surface.
  • standard icons and buttons can be projected onto the patient, and a finger or needle can be tracked and used to activate these buttons.
  • This tracking can also be used in non-visual user interfaces, e.g. for gesture tracking without projected visual feedback.
  • the probe may be registered in body coordinates.
  • the system may then project guidance as to how to move the probe to visualize a given target. For example, suppose that a tumor is identified in a diagnostic image, or in a previous scan. After registration, the projection system can project an arrow on the patient showing in which direction the probe should move.
  • this method can be used to guide a user to visualize a particular organ based on a prior model of the patient or a patient-specific scan, or could be used to aid in tracking or orienting relative to a given target. For example, it may be desirable to place a gating window (e.g. for Doppler ultrasound) on a particular target or to maintain it therein.
  • a gating window e.g. for Doppler ultrasound
  • the augmentation system may use multi-band projection with both visible and invisible bands (such as with IR in various ways), simultaneously or time-multiplexed.
  • the invention may use multi-projector setups for shadow reduction, intensity enhancement, or passive stereo guidance.
  • the projection image may be time-multiplexed in synchrony with the camera or cameras to alternately optimize projection for tracking (maximizing needle presence), guidance (overlaying cues), or surfaces (optimizing stereo reconstruction).
  • the projection pattern may also be spatially modulated or multiplexed for different purposes, e.g. projecting a pattern in one area and guidance in other areas.
  • the projection system may make use of mirrors for making one projector two (or more) by using "arms” etc. to split the image or to accomplish omnidirectional projection, see, e.g., Figure 32.
  • the projection system may make use of polarization for 3D guidance or use dual-arm or dual-device projection with polarized light and (passive) glasses for 3D in-situ ultrasound guidance display.
  • the projection may project onto a screen, including a fog screen, switchable film, and UV-fluorescent glass, as almost-in-situ projection surfaces.
  • the projection system may make use of the geometry computed by the stereo system to correct for the curvature of the body when projecting information onto it.
  • the projection system may include outward-looking cameras to track the user to help correct visualization from geometric distortion or probe motion. This may also be used to solve the parallax problem when projecting in 3D.
  • the projection system may project a fixed pattern upwards onto the environment to support tracking with stereo cameras (limited degrees of freedom, depending on environment structure).
  • the system may make use of 3D information that is computed from the projected pattern, it may make use of image appearance information that comes from objects in the world, or it may use both appearance and depth information. It may be useful to synchronize the projection in such a way that images with the pattern and without are obtained. Methods for performing 3D reference positioning using depth and intensity information are well known in the art.
  • the projector may make use of light-activated dyes that have been "printed on patient” or may contain an auxiliary controlled laser for this purpose.
  • the projector might instead project onto other rigid or deformable objects in the workspace.
  • the camera may reconstruct a sheet of paper in space, and the projector could project the CT data of a preoperative scan onto the paper. As the paper is deformed the CT data would be altered to reflect the data that it would "slice through” if it were inside the body. This would allow the visualization of curved surfaces or curvilinear structures.
  • the system may have an electronic or printable signature that records the essential targeting information in an easy-to-use way. This information may be loaded or scanned visually by the device itself when the patient is re-imaged.
  • An interesting use of the above method of probe and needle guidance is to make ultrasound treatment accessible for non-experts. This may include providing training for those learning about diagnostic or interventional ultrasound, or to make it possible for the general population to make use of ultrasound-based treatments for illness. These methods could also monitor the use of an imaging probe and/or needles etc. and indicate when the user is poorly trained.
  • An example of the application of the above would be to have an ultrasound system installed at a pharmacy, and to perform automated carotid artery examination by an unskilled user.
  • nondestructive inspection of an airplane wing may use ultrasound or x-ray, but in either case requires exact guidance to the inspection location (e.g. a wing attachment) in question.
  • the methods described above can provide this guidance.
  • the system could provide guidance for e.g. throwing darts, hitting a pool ball, or a similar game.
  • Example 1: Ultrasound-guided Liver Ablation Therapy.
  • Targeting Limitations: One common feature of current ablative methodology is the necessity for precise placement of the end-effector tip in specific locations, typically within the volumetric center of the tumor, in order to achieve adequate destruction. The tumor and zone of surrounding normal parenchyma can then be ablated. Tumors are identified by preoperative imaging, primarily CT and MR, and then operatively (or laparoscopically) localized by intra-operative ultrasonography (IOUS). When performed percutaneously, trans-abdominal ultrasonography is most commonly used. Current methodology requires visual comparison of preoperative diagnostic imaging with real-time procedural imaging, often requiring subjective comparison of cross-sectional imaging to IOUS.
  • The impact of radiological complete response on tumor targeting is an important emerging problem in liver-directed therapy. Specifically, this problem relates to the inability to identify the target tumor at the time of therapy.
  • Effective combination systemic chemotherapeutic regimens are being used with increasing frequency prior to liver-directed therapy to treat potential micro-metastatic disease as a neo-adjuvant approach, particularly for colorectal metastases [Gruenberger-2008]. This allows the opportunity to use the liver tumor as a gauge to determine chemo-responsiveness as an aid to planning subsequent post-procedural chemotherapy.
  • the target lesion often cannot be identified during the subsequent resection or ablation.
  • A time-of-flight (ToF) camera can replace the SLS configuration to provide the surface data [Billings-2011] (Figure 10).
  • In this case the ToF camera is not attached to the ultrasound probe, and an external tracker is used to track both components. The projector can still be attached to the ultrasound probe.
  • Another embodiment consists of an SLS or ToF camera to provide surface information and a projector attached to the ultrasound probe.
  • the camera configuration (i.e. SLS) should be able to extract surface data, track the intervention tool, and track the probe surface, and hence can locate the needle in the US image coordinate frame.
  • This embodiment requires offline calibration to estimate the transformation between the probe surface shape and the actual location of the ultrasound image.
  • a projector can still be used to overlay the needle location and visualize guidance information.
  • an embodiment can also consist of only projectors and local sensors.
  • Figure 7 describes a system composed of pulsed laser projector to track an interventional tool in air and in tissue using photoacoustic (PA) phenomenon [Boctor-2010].
  • Interventional tools can convert pulsed light energy into an acoustic wave that can be picked up by multiple acoustic sensors placed on the probe surface, to which known triangulation algorithms can then be applied to locate the needle. It is important to note that one can apply laser light directly to the needle, i.e. attach a fiber optic configuration to the needle end; the needle can also conduct the generated acoustic wave (i.e.
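  • As an illustration of such a triangulation step (a simplified sketch, not the invention's algorithm), the source of the photoacoustic wave can be localized from its arrival times at several known sensor positions by nonlinear least squares; the data below are synthetic, and the speed of sound and emission time are assumed known.

```python
# Sketch: localize a photoacoustic source from arrival times at known sensor
# positions (multilateration) via nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

c = 1.54  # mm/us, approximate speed of sound in soft tissue

sensors = np.array([[0, 0, 0], [20, 0, 0], [0, 20, 0], [20, 20, 0], [10, 10, 0]], float)
true_source = np.array([12.0, 7.0, 30.0])
arrival_times = np.linalg.norm(sensors - true_source, axis=1) / c   # us, noiseless synthetic data

def residuals(src):
    return np.linalg.norm(sensors - src, axis=1) / c - arrival_times

estimate = least_squares(residuals, x0=np.array([10.0, 10.0, 20.0])).x
print("estimated source (mm):", np.round(estimate, 2))
```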
  • One possible embodiment is to integrate both an ultrasound probe with an endoscopic camera held on one endoscopic channel and having the projector component connected in a separate channel.
  • This projector can enable structured light, and the endoscopic camera performs surface estimation to help perform hybrid surface/ultrasound registration with a pre-operative modality.
  • the projector can be a pulsed laser projector that can enable PA effects and the ultrasound probe attached to the camera can generate PA images for region of interest.
  • Siperstein AE. Resection versus laparoscopic radiofrequency thermal ablation of solitary colorectal liver metastasis. J Gastrointest Surg. 2008 Nov;12(11):1967-72.
  • Example 2: Monitoring Neo-adjuvant Chemotherapy using Advanced Ultrasound Imaging
  • Neo-adjuvant chemotherapy (NAC) is quickly replacing adjuvant (postoperative) chemotherapy as the standard in the management of these patients.
  • NAC is often administered to women with operable stage II or III breast cancer [Kaufmann- 2006].
  • the benefit of NAC is twofold. First, NAC has the ability to increase the rate of breast conserving therapy. Studies have shown that more than fifty percent of women, who would otherwise be candidates for mastectomy only, become eligible for breast conserving therapy because of NAC-induced tumor shrinkage [Hortabagyi-1988, Bonadonna-1998].
  • Second, NAC allows in vivo chemo-sensitivity assessment.
  • the ability to detect early drug resistance will prompt change from the ineffective to an effective regimen. Consequently, physicians may decrease toxicity and perhaps improve outcome.
  • the metric most commonly used to determine in-vivo efficacy is the change in tumor size during NAC.
  • Ultrasound is a safe modality which easily lends itself to serial use.
  • B-Mode ultrasound does not appear to be sensitive enough to determine subtle changes in tumor size.
  • Ultrasound elasticity imaging (USEI) has emerged as a potentially useful augmentation to conventional ultrasound imaging. USEI has been made possible by two discoveries: (1) different tissues may have significant differences in their mechanical properties, and (2) the information encoded in the coherent scattering (a.k.a. speckle) may be sufficient to calculate these differences following a mechanical stimulus [Ophir-1991].
  • An embodiment for this application is to use an ultrasound probe and an SLS configuration attached to an external passive arm.
  • On day one, the probe is placed on the region of interest and the SLS configuration captures the breast surface information and the ultrasound probe surface, providing substantial input for the following tasks: 1) the US probe can be tracked, and hence a 3D US volume can be reconstructed from 2D images (if the US probe is a 2D probe), or the resulting small volumes from a 3D probe can be stitched together to form a panoramic volume; 2) the US probe can be tracked during the elastography scan, and this tracking information can be integrated into the EI algorithm to enhance the quality [Foroughi-2010] (Figure 11); and 3) registration between the ultrasound probe's location in the first treatment session and subsequent sessions can be easily recovered using the SLS surface information (as shown in Figure 12) for both the US probe and the breast.
  • Boctor-2005 Boctor EM, DeOliviera M, Awad M, Taylor RH, Fichtinger
  • Ophir-1991 Ophir J, et al. Elastography: a quantitative method for imaging the elasticity of biological tissues.
  • Partridge-2002 Partridge SC, Gibbs JE, Lu Y, Esserman LJ, Sudilovsky D, Hylton NM. "Accuracy of MR imaging for revealing residual breast cancer in patients who have undergone neoadjuvant chemotherapy," AJR Am J Roentgenol. 2002 Nov;179(5):1193-9.
  • Valero-1996 Valero V, Buzdar AU, Hortobagyi GN, "Locally Advanced
  • Varghese-2004 Varghese T, Shi H. Elastographic imaging of thermal lesions in liver in-vivo using diaphragmatic stimuli. Ultrason Imaging. 2004 Jan;26(1):18-28.
  • Kidney cancer is the most lethal of all genitourinary tumors, resulting in greater than 13,000 deaths in 2008 out of 55,000 new cases diagnosed [61]. Further, the rate at which kidney cancer is diagnosed is increasing [1,2,62]. "Small" localized tumors currently represent approximately 66% of new diagnoses of renal cell carcinoma [63].
  • Figure 13 shows the first system, in which an SLS component is held on a laparoscopic arm, together with a laparoscopic ultrasound probe and an external tracking device to track both the US probe and the SLS [Stolka-2010].
  • The SLS can scan the kidney surface and the probe surface, and track both the kidney and the US probe.
  • our invention is concerned with hybrid surface/ultrasound registration. In this embodiment the SLS will scan the kidney surface, and together with a few ultrasound images a reliable registration with preoperative data can be performed; augmented visualization, similar to the one shown in Figure 13, can then be provided using the attached projector.
  • the second embodiment is shown in Figure 14 where an ultrasound probe is located outside the patient and facing directly towards the superficial side of the kidney.
  • a laparoscopic tool holds an SLS configuration.
  • the SLS system provides kidney surface information in real-time and the 3DUS also images the same surface (tissue-air interface).
  • registration can also be performed using the photoacoustic effect (Figure 15).
  • the projector in the SLS configuration can be a pulsed laser projector with a fixed pattern. Photoacoustic signals will be generated at specified points, which form a known, calibrated pattern. The ultrasound imager can detect these points' PA signals. Then a straightforward point-to-point registration can be performed to establish real-time registration between the camera/projector space and the ultrasound space.
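  • Such a point-to-point registration with known correspondences can be solved in closed form, e.g. with the SVD-based Kabsch/Procrustes method; the sketch below uses synthetic matched point sets purely for illustration.

```python
# Sketch: closed-form rigid registration between pattern points known in the
# camera/projector frame and the corresponding photoacoustic points localized
# in the ultrasound frame (correspondences assumed known; synthetic data).
import numpy as np

def rigid_register(A, B):
    """Find R, t such that R @ A_i + t ~= B_i (A, B are matched N x 3 arrays)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cb - R @ ca

rng = np.random.default_rng(0)
pattern_cam = rng.uniform(0, 50, size=(6, 3))                 # pattern points, camera frame
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)  # synthetic ground-truth pose
t_true = np.array([5.0, -3.0, 40.0])
pattern_us = pattern_cam @ R_true.T + t_true                  # same points, ultrasound frame

R, t = rigid_register(pattern_cam, pattern_us)
print(np.allclose(R, R_true), np.allclose(t, t_true))         # True True
```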
  • An ultrasound probe can easily be introduced into the C-arm scene without adding to or changing the current setup.
  • the SLS configuration is capable of tracking the US probe. It is important to note that in many pediatric interventional applications, there is a need to integrate an ultrasound imager into the C-arm suite. In these scenarios, the SLS configuration can be attached to the C-arm, to the ultrasound probe, or separately to an arm.
  • This ultrasound/C-arm system can consist of more than one SLS configuration, or a combination of these sensors.
  • the camera or multiple cameras can be fixed to the C-arm where the projector can be attached to the US probe.
  • A C-arm is moving equipment and cannot be considered a rigid body, i.e. there is a small rocking/vibrating motion that needs to be measured/calibrated at the manufacturing site, and these numbers are used for compensation during reconstruction. If a faulty condition occurs that alters this calibration, the manufacturer needs to be informed so the system can be re-calibrated. These faulty conditions are hard to detect, and repeated QC calibration is infeasible and expensive.
  • Our accurate surface tracker should be able to determine the motion of the C-arm and continuously, in the background, compare it to the manufacturer calibration. Once a faulty condition happens, our system should be able to discover it and possibly correct it.
  • Hafez-1999 Hafez KS, Fergany AF, Novick AC. Nephron sparing surgery for localized renal cell carcinoma: impact of tumor size on patient survival, tumor recurrence and TNM staging. J Urol 1999 Dec;162(6):1930-3.

Abstract

An augmentation device for an imaging system has a bracket structured to be attachable to an imaging component, and a projector attached to the bracket. The projector is arranged and configured to project an image onto a surface in conjunction with imaging by the imaging system. A system for image-guided surgery has an imaging system, and a projector configured to project an image or pattern onto a region of interest during imaging by the imaging system. A capsule imaging device has an imaging system, and a local sensor system. The local sensor system provides information to reconstruct positions of the capsule endoscope free from external monitoring equipment.

Description

INTERVENTIONAL IN-SITU IMAGE GUIDANCE BY FUSING ULTRASOUND
AND VIDEO
CROSS-REFERENCE OF RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional Application No.
US61/545,168 filed October 9, 2011, U.S. Provisional Application No. US61/603,625, filed February 27, 2012, and U.S. Provisional Application No. US61/657,441, filed June 8, 2012, the entire contents of which are hereby incorporated by reference.
BACKGROUND
1. Field of Invention
[0002] The field of the currently claimed embodiments of this invention relate to imaging devices and to augmentation devices for these imaging devices, and more particularly to such devices that have one or more of a camera, one or more of a projector, and/or a set of local sensors for observation and imaging of, projecting onto, and tracking within and around a region of interest.
2. Discussion of Related Art
[0003] Image-guided surgery (IGS) can be defined as a surgical or intervention procedure where the doctor uses indirect visualization to operate, i.e. by employing imaging instruments in real time, such as fiber-optic guides, internal video cameras, flexible or rigid endoscopes, ultrasonography etc. Most image-guided surgical procedures are minimally invasive. IGS systems allow the surgeon to have more information available at the surgical site while performing a procedure. In general, these systems display 3D patient information and render the surgical instrument in this display with respect to the anatomy and a preoperative plan. The 3D patient information can be a preoperative scan such as CT or MRI to which the patient is registered during the procedure, or it can be a real-time imaging modality such as ultrasound or fluoroscopy. Such guidance assistance is particularly crucial for minimally invasive surgery (MIS), where a procedure or intervention is performed either through small openings in the body or percutaneously (e.g. in ablation or biopsy procedures). MIS techniques provide for reductions in patient discomfort, healing time, risk of complications, and help improve overall patient outcomes.
[0004] In image-guided interventions, the tracking and localization of imaging devices and medical tools during procedures are exceptionally important and are considered the main enabling technology in IGS systems. Tracking technologies can be easily categorized into the following groups: 1) mechanical-based tracking including active robots (DaVinci robots [http://www.intuitivesurgical.com, August 2nd, 2010]) and passive-encoded mechanical arms (Faro mechanical arms [http://products.faro.com/product-overview, August 2nd, 2010]), 2) optical-based tracking (NDI OptoTrak [http://www.ndigital.com, August 2nd, 2010], MicronTracker [http://www.clarontech.com, August 2nd, 2010]), 3) acoustic-based tracking, and 4) electromagnetic (EM)-based tracking (Ascension Technology [http://www.ascension-tech.com, August 2nd, 2010]).
[0005] Ultrasound is one useful imaging modality for image-guided interventions including ablative procedures, biopsy, radiation therapy, and surgery. In the literature and in research labs, ultrasound-guided intervention research is performed by integrating a tracking system (either optical or EM methods) with an ultrasound (US) imaging system to, for example, track and guide liver ablations, or in external beam radiation therapy [E.M. Boctor, M. DeOliviera, M. Choti, R. Ghanem, R.H. Taylor, G. Hager, G. Fichtinger, "Ultrasound Monitoring of Tissue Ablation via Deformation Model and Shape Priors", International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2006; H. Rivaz, I. Fleming, L. Assumpcao, G. Fichtinger, U. Hamper, M. Choti, G. Hager, and E. Boctor, "Ablation monitoring with elastography: 2D in-vivo and 3D ex-vivo studies", International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2008; H. Rivaz, P. Foroughi, I. Fleming, R. Zellars, E. Boctor, and G. Hager, "Tracked Regularized Ultrasound Elastography for Targeting Breast Radiotherapy", Medical Image Computing and Computer Assisted Intervention (MICCAI) 2009] . On the commercial side, Siemens and GE Ultrasound Medical Systems recently launched a new interventional system, where an EM tracking device is integrated into high-end cart-based systems. Small EM sensors are integrated into the ultrasound probe, and similar sensors are attached and fixed to the intervention tool of interest.
[0006] Limitations of the current approach on both the research and commercial sides can be attributed to the available tracking technologies and to the feasibility of integrating these systems and using them in clinical environments. For example, mechanical-based trackers are considered expensive and intrusive solutions, i.e. they require large space and limit user motion. Acoustic tracking does not provide sufficient navigation accuracy, leaving optical and EM tracking as the most successful and commercially available tracking technologies. However, both technologies require intrusive setups with a base camera (in case of optical tracking methods) or a reference EM transmitter (in case of EM methods). Additionally, optical rigid-body or EM sensors have to be attached to the imager and all needed tools, hence require offline calibration and sterilization steps. Furthermore, none of these systems natively assist multi-modality fusion (registration e.g. between pre-operative CT/MRI plans and intra-operative ultrasound), and do not contribute to direct or augmented visualization either. Thus there remains a need for improved imaging devices for use in image-guided surgery.
SUMMARY OF THE INVENTION
[0007] An augmentation device for an imaging system according to an embodiment of the current invention has a bracket structured to be attachable to an imaging component, a projector attached to the bracket, and one or more cameras observing the surrounding environment. The projector is arranged and configured to project an image onto a surface in conjunction with imaging by the camera system. This system can be used for registration to the imaged surface, and guidance for placement of the device on the surface, or guidance of needles or other instruments to interact with the surface or below the surface.
[0008] A system that consists of a single camera and projector, whereby one of the camera or projector is aligned with the ultrasound plane and the other is off-axis, and a combination of tracking and display is used to provide guidance.
[0009] The camera and projector configuration can be preserved using a sterile probe covering that contains a special transparent sterile window.
[0010] A structured pattern that simultaneously displays the ultrasound image and is also used to reconstruct the surface in 3D.
[0011] The projection image may be time-multiplexed in synchrony with the camera or cameras to alternately optimize projection for tracking (maximizing needle presence), guidance (overlay cues), or surfaces (optimizing stereo reconstruction). The projection pattern may also be spatially modulated or multiplexed for different purposes, e.g. projecting a pattern in one area and guidance in other areas.
[0012] An adaptive pattern, in both space and time, including the following:
[0013] • Spatial frequencies of the pattern adapted to surface distance, apparent structure sizes, or camera resolution, or
• Colors adapted to surface properties and environment, or
• Randomizing/iterating through different patterns over time.
• Both the pattern and the projected guidance can be integrated and optimized to reconstruct surfaces.
[0014] A real-time feedback and quality control system to actively choose the right pattern design.
[0015] Calculating system metrics - tracking success, robustness, surface outlier ratio - to choose the right pattern.
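For illustration, a minimal sketch of how such metrics might drive pattern selection; the metric names, weights, and candidate patterns below are illustrative assumptions rather than part of the disclosure:

```python
# Metric-driven selection of the projected pattern.
def score_pattern(metrics, w_track=1.0, w_outlier=1.0):
    """metrics: dict with 'tracking_success' in [0,1] and
    'surface_outlier_ratio' in [0,1], measured while the pattern is active."""
    return (w_track * metrics['tracking_success']
            - w_outlier * metrics['surface_outlier_ratio'])

def choose_pattern(candidates):
    """candidates: {pattern_name: metrics dict}. Returns the best pattern name."""
    return max(candidates, key=lambda name: score_pattern(candidates[name]))

# Example usage with made-up measurements:
candidates = {
    'fine_grid':   {'tracking_success': 0.92, 'surface_outlier_ratio': 0.15},
    'coarse_dots': {'tracking_success': 0.85, 'surface_outlier_ratio': 0.05},
}
best = choose_pattern(candidates)   # -> 'coarse_dots' with these numbers
```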
[0016] A method to guide a tool by actively tracking the tool and projecting:
• proximity markers (to indicate general "closeness" by e.g. color-changing backgrounds, frames, or image tints, or auditory cues),
• target markers (to point towards, e.g. crosshairs, circles, bulls-eyes etc.),
• alignment markers (to line up with, e.g. lines, fans, polygons),
• area demarcations (to avoid, e.g. shapes denoting critical regions, geometrically or anatomically inaccessible regions etc.),
• patterns formed from circles on the edges of the field of view to allow other information to be projected inside the center of the field of view, or
• a combination of the above.
[0017] The guidance can be on screen, projected onto the patient, or a combination of both; we claim the guidance method either as separate or as an overlay to a secondary imaging system, such as ultrasound images or mono- or multi-ocular views.
[0018] This guidance approach and information can either be registered to the underlying image or environment (i.e. the overlay symbols correspond to target location, size, or areas to avoid), or it can be location-independent guidance (e.g. location, color, size, or shape of the symbols, but also auditory cues such as audio volume, sound clips, and/or frequency changes, indicating to the user where to direct the tools or the probe).
[0019] The combination of the camera and projector can be used to construct intuitive and sterile user interfaces on the patient surface, or on any other projectable surface. For example, standard icons and buttons can be projected onto the patient, and a finger or needle can be tracked and used to activate these buttons. This tracking can also be used in non- visual user interfaces, e.g. for gesture tracking without projected visual feedback.
[0020] The projection system may make use of the geometry computed by the stereo system to correct for the curvature of the body when projecting information onto it.
[0021] The system can include overlay guidance to place the imaging device on a surface (e.g. Ultrasound probe) or move it to a specific pose (e.g. C-arm X-ray). For example, by making use of the ability of an ultrasound probe or similar imaging device to acquire images from within the body while the video imaging system captures images from outside the body, it is possible to register the probe in body coordinates, and to project guidance as to how to move the probe to visualize a given target. For example, suppose that a tumor is identified in a diagnostic image, or in a previous scan. After registration the projection system can project an arrow on the patient showing in which direction the probe should move. One of ordinary skill will realize that these same ideas can be used to guide a user to visualize a particular organ based on a prior model of the patient or a patient-specific scan, or could be used to aid in tracking or orienting relative to a given target. For example, it may be desirable to place a gating window (e.g. for Doppler ultrasound) on a particular target or to maintain it therein.
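As an illustrative sketch (assuming a registered target position, the current probe contact point, and a reconstructed skin normal; the names and the locally planar skin assumption are not from the disclosure), the projected arrow direction could be computed as follows:

```python
# Compute the on-skin arrow telling the user in which direction to move the
# probe so that a registered target enters the image.
import numpy as np

def probe_move_arrow(target_world, probe_pos_world, skin_normal_world):
    """target_world: 3D target from the registered prior scan (world frame).
    probe_pos_world: current probe contact point.
    skin_normal_world: normal of the skin at the contact point
    (e.g. from the reconstructed surface)."""
    to_target = target_world - probe_pos_world
    n = skin_normal_world / np.linalg.norm(skin_normal_world)
    # Project the offset onto the skin tangent plane: that is the direction
    # in which the probe should slide along the surface.
    in_plane = to_target - np.dot(to_target, n) * n
    dist = np.linalg.norm(in_plane)
    direction = in_plane / dist if dist > 1e-6 else np.zeros(3)
    return direction, dist   # arrow direction and how far to move
```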
[0022] It is often the case that a patient is imaged multiple times, for example to provide guidance for radiative cancer therapy. In this case, the images around the target could be recorded, and, upon subsequent imaging, these images would be used to provide guidance on how to move the probe toward a desired target, and an indication when the previous imaging position is reached.
[0023] A method to guide an interventional tool by matching the tool's shadow to an artificial shadow - this single-shadow alignment can be used for one degree of freedom, with additional active tracking for the remaining degrees of freedom. The shadow can be a single line; the shadow can be a line of different thickness; the shadow can be of different colors; the shadow can be used as part of a structured light pattern.
[0024] Adaptive projection to overcome interference (e.g. overlay guidance can interfere with needle tracking tasks): guidance "lines" composed of e.g. "string-of-pearls" series of circles/discs/ellipses etc. can improve alignment performance for the user.
[0025] Additionally, the apparent thickness of guidance lines/structures can be modified based on detected tool width, distance to projector, distance to surface, excessive intervention duration etc. to improve alignment performance
[0026] A method based on a double shadow, or more shadows, depending on the number of projectors or virtual projectors available.
[0027] Two projectors can uniquely provide two independent shadows that can define the intended/optimal guidance of the tool.
[0028] Using a combination of mirrors and a beam splitter, one projector can be divided into two projectors and hence provide the same number of independent shadows.
[0029] A method of guidance to avoid critical structures, by projecting onto the patient surface information registered from a pre-operative modality.
[0030] A guidance system (one example) - Overlaying crosshairs and/or extrapolated needle pose lines onto live ultrasound views on-screen (both in-plane and out-of-plane) or projected onto the patient, see, e.g., Figure 34; Projecting paired symbols (circles, triangles etc.) that change size, color, and relative position depending on the current targeting error vector; Overlaying alignment lines onto single/stereo/multiple camera views that denote desired needle poses, allowing the user to line up the camera image of the needle with the target pose, as well as lines denoting the currently-tracked needle pose for quality control purposes; Projecting needle alignment lines onto the surface, denoting both target pose (for guidance) as well as currently-tracked pose (for quality control), from one or more projectors;
[0031] The system may use the pose of the needle in air to optimize the ultrasound system's detection of the needle in the body, and vice versa. For example, by anticipating the location of the needle tip, the ultrasound system can automatically set the transmit focus location, the needle steering parameters, etc.
[0032] When using the projector for needle guidance, the system may make use of the projected insertion point as "capture range" for possible needle poses, discard candidates outside that range, or detect when computed 3D poses violate the expected targeting behavior.
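A minimal sketch of such capture-range gating, with illustrative thresholds and names (not taken from the disclosure):

```python
# Discard needle-pose candidates whose entry point lies too far from the
# projected insertion point, or whose direction deviates too much from the
# planned trajectory.
import numpy as np

def within_capture_range(candidate_entry, candidate_dir,
                         planned_entry, planned_dir,
                         max_offset_mm=15.0, max_angle_deg=25.0):
    offset = np.linalg.norm(candidate_entry - planned_entry)
    cos_a = np.clip(
        np.dot(candidate_dir, planned_dir)
        / (np.linalg.norm(candidate_dir) * np.linalg.norm(planned_dir)),
        -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_a))
    return offset <= max_offset_mm and angle <= max_angle_deg

def filter_candidates(candidates, planned_entry, planned_dir):
    """candidates: list of (entry_point, direction) pairs in camera frame."""
    return [c for c in candidates
            if within_capture_range(c[0], c[1], planned_entry, planned_dir)]
```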
[0033] An approach to indicate the depth of penetration of the tool. This can be performed by detecting fiducials on the needle and tracking those fiducials over time. For example, these may be dark rings on the needle itself, which can be counted using the vision system, or they may be a reflective element attached to the end of the needle; the depth may then be computed by measuring the distance from the fiducial in space to the patient surface, and subtracting that result from the entire length of the needle.
[0034] Depth guidance by directly projecting onto the needle shaft a fiducial landmark (e.g. a black line or spot of light), indicating to what point the needle should be inserted.
[0035] An additional depth guidance claim: the display of the system may passively indicate the number of fiducial rings that should remain outside the patient at the correct depth for the current system pose, providing the user with a perceptual cue that they can use to determine manually whether they are at the correct depth.
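For illustration, a minimal sketch of the depth computation and the passive ring-count cue described above; the needle length, ring spacing, and function names are assumptions:

```python
# Insertion depth from an end-mounted fiducial, and the passive ring cue.
import numpy as np

def insertion_depth(fiducial_pos, entry_point, needle_length_mm):
    """fiducial_pos: 3D position of the end fiducial (camera frame).
    entry_point: 3D skin entry point from the reconstructed surface.
    Depth inside the body = full needle length - length remaining outside."""
    outside_mm = np.linalg.norm(fiducial_pos - entry_point)
    return max(needle_length_mm - outside_mm, 0.0)

def rings_remaining_outside(target_depth_mm, needle_length_mm, ring_spacing_mm):
    """Passive cue: how many shaft rings should still be visible when the
    needle tip has reached the target depth."""
    outside_at_target = needle_length_mm - target_depth_mm
    return int(outside_at_target // ring_spacing_mm)
```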
[0036] An apparatus and method to provide adaptable mounting bracket:
• The camera/projector configuration can rotate 90 degrees to allow the guidance for both in-plane and out-of-plane intervention
• The bracket height can be adjusted
• The mounting bracket can be modular, allowing cameras and projectors to be added - for example, starting with one projector and adding one camera, or starting with one projector and two cameras and adding an additional projector.
[0037] The cameras and projectors can be added at different locations (e.g. a camera and projector for in-plane intervention, with an additional projector facing the out-of-plane view).
[0038] A calibration method that simultaneously calibrates US, projector and stereo cameras. The method is based on a calibration object constructed from a known geometry:
• Double-wedge phantom attached to a planar surface (as in Fig. 26A), or multi-line phantom (as in Fig. 26B). Both are alternative designs of possible phantoms that can be used in estimating the rigid-body transformation between the ultrasound coordinate frame and the camera coordinate frame. In principle, a phantom with a well-known geometry comprising an ultrasound-visible component and an optically-visible component (as in Figs. 26A and 26B) is simultaneously observed in both modalities. Pose recovery of both components in their respective modality allows reconstruction of the pose of the cameras and the ultrasound transducer relative to the phantom, and thus the calibration of their relative pose to each other; a sketch of this rigid-body estimation from corresponding point pairs follows this list. See also Figure 33.
• Multi-line phantom with a known geometry connected to the surface and observed by the cameras.
• Complex-shape phantom with known geometry from a previous volumetric scan. By registering both live surface/US data with the corresponding preoperative data, we can calibrate the system.
• Features in phantoms can be introduced by using nanocapsules that can rupture with ultrasound waves and create a visible mark (observed by cameras)
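For illustration, a minimal sketch of the rigid-body estimation underlying such phantom-based calibration, assuming corresponding phantom features have already been localized as 3D points in both the camera frame and the ultrasound frame; this is a generic least-squares (SVD) fit, not a specific algorithm from the disclosure:

```python
# Rigid camera-to-ultrasound transform from corresponding phantom points.
import numpy as np

def rigid_transform(points_cam, points_us):
    """points_cam, points_us: Nx3 arrays of corresponding 3D points.
    Returns R (3x3), t (3,) such that points_us ~= R @ points_cam + t."""
    ca, cb = points_cam.mean(axis=0), points_us.mean(axis=0)
    H = (points_cam - ca).T @ (points_us - cb)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t
```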
[0039] A method to accurately measure the location of the projector relative to the location of the cameras and probe. One means of doing so is to observe that visible rays projected from the projector will form straight lines in space that intersect at the optical center of the projector. Thus, with stereo cameras or a similar imaging system observing several surfaces upon which these rays fall, the system can calculate a series of 3D points which can then be extrapolated to compute the center of projection. This can be performed with nearly any planar or nonplanar series of projection surfaces.
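A minimal sketch of this extrapolation, assuming each projected ray has been reconstructed by the stereo cameras as a set of 3D points on successive surfaces (names are illustrative): the center of projection is the point closest, in the least-squares sense, to all fitted ray lines.

```python
# Recover the projector's center of projection from observed rays.
import numpy as np

def fit_line(points):
    """Fit a 3D line to Nx3 points; returns (point_on_line, unit_direction)."""
    centroid = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - centroid)
    return centroid, Vt[0]

def center_of_projection(rays):
    """rays: list of Nx3 point arrays, one array per projected ray."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for pts in rays:
        p, d = fit_line(pts)
        P = np.eye(3) - np.outer(d, d)   # projection orthogonal to the ray
        A += P
        b += P @ p
    return np.linalg.solve(A, b)          # least-squares intersection point
```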
[0040] A temporal calibration method that simultaneously synchronizes the ultrasound data stream with both camera streams and with the projector streams:
[0041] Calibration can be performed using a hardware trigger approach.
[0042] A software approach can be utilized by moving the US probe periodically above a target; correlating both streams should estimate the amount of internal lag.
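For illustration, a minimal sketch of the software synchronization approach, assuming a 1-D motion signal has been extracted from each stream (e.g. target depth in ultrasound, probe height in the camera view) and resampled to a common rate; signal names and the sampling rate are assumptions:

```python
# Estimate the lag between two streams by cross-correlation.
import numpy as np

def estimate_lag(signal_us, signal_cam, sample_rate_hz):
    """Both signals resampled to the same rate; returns the lag in seconds
    (positive if the ultrasound stream trails the camera stream)."""
    a = (signal_us - np.mean(signal_us)) / (np.std(signal_us) + 1e-12)
    b = (signal_cam - np.mean(signal_cam)) / (np.std(signal_cam) + 1e-12)
    corr = np.correlate(a, b, mode='full')
    lag_samples = np.argmax(corr) - (len(b) - 1)
    return lag_samples / sample_rate_hz
```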
[0043] A method to synchronize projection output to allow time and space multiplexing (interleaving) patterns for both guidance and stereo structures.
[0044] A system that utilizes custom-made drapes with the following features:
• Drapes that are transparent to the structured light system
• Drapes that are IR transparent or wavelength-specific to allow patient surface or organ scanning
• Drapes made of textured material or carrying a detectable reference frame to allow direct surface tracking and registration/reconstruction
• Drapes made of light-sensitive materials utilizing fluorescence and/or phosphorescence effects, to help create an interface for the user to click on
• Drapes that are pressure sensitive - color changes with probe pressure or with changes in pressure due to breathing
[0045] The projector may make use of light-activated dyes that have been "printed on patient" or may contain an auxiliary controlled laser for this purpose.
[0046] A depth imaging system composed from more than two cameras. For example with three cameras where camera 1 and 2 are optimized for far range, camera 2 and 3 for mid-range, and camera 1 and 3 for close range.
[0047] Augmentation hardware to the original apparatus, depending on the application. The overall configuration may be augmented by and/or controlled from a handheld device such as a tablet computer for 1) ultrasound machine operation; 2) visualization; and 3) in addition, by using one or more cameras on the tablet computer, for registration to the patient for transparent information overlay.
[0048] Augmentation hardware to construct a display system that maintains registration with the probe and which can be used for both visualization and guidance. For example, the probe may have an associated display that can be detached and which shows relevant pre-operative CT information based on its position in space. It may also overlay targeting information.
[0049] The computational resources used by the device may be augmented with additional computation located elsewhere.
[0050] This remote computation might be used to process information coming from the device (e.g. to perform a computationally intense registration process); it may be used to recall information useful to the function of the device (e.g. to compare this patient with other similar patients to provide "best practice" treatment options); or it may be used to provide information that directs the device (e.g. transferring the indication of a lesion in a CT image to a remote center for biopsy).
[0051] Quality control method for the overall system performance. The trajectory of a needle can be calculated by visual tracking and thence projected into the ultrasound image. If the needle in the image is inconsistent with this projection, it is a cue that there is a system discrepancy. Conversely, if the needle is detected in the ultrasound image, it can be projected back into the video image to confirm that the external pose of the needle is consistent with that tracked image.
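A minimal sketch of such a consistency check, assuming the camera-to-ultrasound calibration (R, t) is known and the needle has been detected in both modalities; names and the tolerance are illustrative:

```python
# Compare the visually tracked needle, mapped into the ultrasound frame,
# against the needle segmented in the ultrasound image.
import numpy as np

def to_us_frame(points_cam, R_cam2us, t_cam2us):
    return (R_cam2us @ points_cam.T).T + t_cam2us

def needle_discrepancy(needle_pts_cam, needle_pts_us, R_cam2us, t_cam2us):
    """needle_pts_cam: Nx3 points on the visually tracked needle axis.
    needle_pts_us: Mx3 points on the needle segmented in ultrasound,
    expressed in the ultrasound (metric) frame.
    Returns the mean perpendicular distance in mm."""
    predicted = to_us_frame(needle_pts_cam, R_cam2us, t_cam2us)
    p0, p1 = predicted[0], predicted[-1]
    d = (p1 - p0) / np.linalg.norm(p1 - p0)
    v = needle_pts_us - p0
    perp = v - np.outer(v @ d, d)       # components orthogonal to the line
    return float(np.mean(np.linalg.norm(perp, axis=1)))

def is_consistent(mean_dist_mm, tol_mm=2.0):
    return mean_dist_mm <= tol_mm
```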
[0052] An active quality control method to simultaneously track the needle in both ultrasound and video images, and to use those computed values to detect needle bending and to either update the likely trajectory of the needle, or to alert the user that they are putting pressure on the needle, or both.
[0053] A guidance system based on camera/projector simultaneous interaction. In one embodiment, the projection center may lie on or near the plane of the ultrasound system. In this case, the projector can project a single line or shadow that indicates where this plane is. A needle or similar tool placed in the correct plane will become bright. A video camera outside this plane can view the scene, and this image can be displayed on a screen. Indeed, it may be included with the ultrasound view. In this case, the clinician can view both the external and internal guidance of the needle simultaneously on the same screen. Guidance to achieve a particular angle can be superimposed on the camera image, so that the intersection of the ultrasound plane and the plane formed by the superimposed guidance forms a line that is the desired trajectory of the needle.
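For illustration, the desired trajectory described above is simply the intersection line of two planes; a minimal sketch (planes given in Hessian normal form, n · x = c; names are illustrative):

```python
# Intersection line of the ultrasound plane and the overlay-defined plane.
import numpy as np

def plane_intersection(n1, c1, n2, c2):
    """Returns (point_on_line, unit_direction), or None if the planes are
    parallel. n1, n2 are unit normals; c1, c2 are the plane offsets."""
    direction = np.cross(n1, n2)
    norm = np.linalg.norm(direction)
    if norm < 1e-9:
        return None
    # Solve for one point satisfying both plane equations; the third row
    # picks the point on the line that is closest to the origin.
    A = np.vstack([n1, n2, direction])
    b = np.array([c1, c2, 0.0])
    point = np.linalg.solve(A, b)
    return point, direction / norm
```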
[0054] A second embodiment of the simultaneous camera/projector guidance. A variation on this would be to place a camera along the ultrasound plane, and to place the projector off-plane. The geometry is similar, but now the camera superimposed image is used to define the plane, and a line is projected by the projector to define the needle trajectory.
[0055] Further variations include combinations of single or multiple cameras or projectors, where at least one of either is mounted on the mobile device itself as well as mounted statically in the environment, with registration between the mobile and fixed components maintained at all times to make guidance possible. This registration maintenance can be achieved e.g. by detecting and tracking known features present in the environment and/or projected into the common field of interest.
[0056] An augmentation system that may use multi-band projection with both visible and invisible bands (such as with IR in various ways), simultaneously or time-multiplexed. As noted above, the invention may use multi-projector setups for shadow reduction, intensity enhancement, or passive stereo guidance.
[0057] An augmentation device with stereo projection. In order to create a stereo projection, the projection system may make use of mirrors and splitters for making one projector into two (or more) by using "arms" etc. to split the image, or to accomplish omnidirectional projection.
[0058] The projection system may make use of polarization for 3D guidance or use dual-arm or dual-device projection with polarized light and (passive) glasses for 3D in-situ ultrasound guidance display. The projection may be onto a screen consisting of any of: a fog screen, switchable film, or UV-fluorescent glass as almost-in-situ projection surfaces.
[0059] An augmentation device where one of the cameras or a dedicated camera is outward-looking to track the user to help correct visualization from geometric distortion or probe motion. This may also be used to solve the parallax problem when projecting in 3D.
[0060] The augmentation device can estimate relative motion. The projection system may project a fixed pattern upwards onto the environment to support tracking with stereo cameras (limited degrees of freedom, depending on environment structure and the direction of motion).
[0061] A projection system that, in addition to projecting onto the patient surface, might instead project onto other rigid or deformable objects in the workspace or the reading room. For example, the camera might reconstruct a sheet of paper in space, and the projector could project the CT data of a preoperative scan onto the paper. As the paper is deformed, the CT data would be altered to reflect the data that it would "slice through" if it were inside the body. This would allow the visualization of curved surfaces or curvilinear structures.
[0062] A data entry approach that can improve the usability of guidance methods: the system may have an electronic or printable signature that records the essential targeting information in an easy-to-use way. This information may be loaded or scanned visually by the device itself when the patient is re-imaged.
[0063] An approach that benefits from a conventional database and a new visual database (enabled by the described technology) and provides unique training targeted to the needed population.
[0064] This may include providing training for those learning about diagnostic or interventional ultrasound; or to make it possible for the general population to make use of ultrasound-based treatments for illness (automated carotid scanning in pharmacies).
[0065] These methods could also monitor the use of an imaging probe and/or needles etc. and indicate when the user is poorly trained.
[0066] There are many other applications for these ideas that extend beyond ultrasound and medicine. For example, nondestructive inspection of a plane wing may use ultrasound or x-ray, but in either case requires exact guidance to the inspection location (e.g. a wing attachment) in question. The methods described above can provide this guidance. In a more common setting, the system could provide guidance for e.g. throwing darts, hitting a pool ball, or a similar game.
BRIEF DESCRIPTION OF THE DRAWINGS
[0067] Further objectives and advantages will become apparent from a consideration of the description, drawings, and examples.
[0068] Figure 1 shows an embodiment of an augmentation device for an imaging system according to an embodiment of the current invention.
[0069] Figure 2 is a schematic illustration of the augmentation device of Figure 1 in which the bracket is not shown.
[0070] Figures 3A-3I are schematic illustrations of augmentation devices and imaging systems according to some embodiments of the current invention.
[0071] Figure 4 is a schematic illustration of a system for (MRI-)image-guided surgery according to an embodiment of the current invention.
[0072] Figure 5 shows representational illustrations of three camera configurations according to different embodiments of the invention, a stereo camera arrangement (left), a single camera arrangement (center) and an omnidirectional camera arrangement (right).
[0073] Figures 6A and 6B are schematic illustrations of an augmentation device for a handheld imaging system according to an embodiment including a switchable semi- transparent screen for projection purposes.
[0074] Figure 7 is a schematic illustration of an augmentation device for a handheld imaging system according to an embodiment including a laser-based system for photoacoustic imaging (utilizing both tissue- and airborne laser and ultrasound waves) for needle tracking and improved imaging quality in some applications.
[0075] Figures 8A and 8B are schematic illustrations of one possible approach for needle guidance, using projected guidance information overlaid directly onto the imaged surface, with an intuitive dynamic symbol scheme for position/orientation correction support.
[0076] Figure 9 shows the appearance of a needle touching a surface in a structured light system for an example according to an embodiment of the current application.
[0077] Figure 10 shows surface registration results using CPD on points acquired from CT and a ToF camera for an example according to an embodiment of the current application.
[0078] Figure 11 shows a comparison of SNR and CNR values that show a large improvement in quality and reliability of strain calculation when the RF pairs are selected using our automatic frame selection method for an example according to an embodiment of the current application.
[0079] Figure 12 shows a breast phantom imaged with a three-color sine wave pattern; right the corresponding 3D reconstruction for an example according to an embodiment of the current application.
[0080] Figure 13 shows laparoscopic partial nephrectomy guided by US elasticity imaging for an example according to an embodiment of the current application. Left: System concept and overview. Right: Augmented visualization.
[0081] Figure 14 shows laparoscopic partial nephrectomy guided by US probe placed outside the body for an example according to an embodiment of the current application.
[0082] Figure 15 shows an example of a photoacoustic-based registration method according to an embodiment of the current application. The pulsed laser projector initiates a pattern that can generate PA signals in the US space. Hence, fusion of both US and Camera spaces can be easily established using point-to-point real-time registration method.
[0083] Figure 16 shows ground truth (left image) reconstructed from the complete projection data according to an embodiment of the current application. The middle one is reconstructed using the truncated sinogram with 200 channels trimmed from both sides. The right one is constructed using the truncated data and the extracted trust region (rectangle support).
[0084] Figure 17 is a schematic illustration showing projection of live ultrasound (useful as structured-light pattern and for guidance) onto the skin surface.
[0085] Figure 18 is a schematic illustration of different structured-light patterns shown with varying spatial frequencies.
[0086] Figure 19 is a schematic illustration of different structured-light patterns, with and without edges, to aid the detection of straight needles.
[0087] Figure 20 is a schematic illustration of randomizing through different patterns over time to increase the data density for stereo surface reconstruction.
[0088] Figure 21 is a schematic illustration of use of a camera/projection unit combination outside of an imaging device next to the patient; here projecting structured-light patterns onto the skin as well as onto a semi-transparent or switchable-film screen above the patient.
[0089] Figure 22 is a schematic illustration of using a switchable-film, fluorescent, or similar semi-transparent screen, simultaneous projection onto both the patient and the screen is possible.
[0090] Figure 23 is a schematic illustration of dual-shadow passive guidance - by projecting one line from each projection center, two light planes are created that intersect at the desired needle pose and allow passive alignment.
[0091] Figure 24 is a schematic illustration of semi-active, single-shadow guidance - by projecting one line and additional guidance symbols (based on needle tracking results), the needle can be passively aligned in one plane and actively in the remaining degrees of freedom.
[0092] Figure 25 is a schematic illustration of using "bulby" (bottom) as opposed to straight lines (top) to improve needle guidance performance and usability because of the additional directional information to the user. Critical regions projected onto the surface, from the point of view of the projector, needle, or other viewpoints.
[0093] Figure 26A is a schematic illustration of a setup for camera-ultrasound calibration with double- wedge phantom. The ultrasound probe becomes aligned with the wedges' central plane during a manual sweep, and simultaneously a stereo view of a grid allows to reconstruct the camera pose relative to the well-known phantom.
[0094] Figure 26B is an illustration of a multi-line phantom. This figure shows another configuration of a known geometry that can uniquely identify the pose of the ultrasound imaging frame, and relate the ultrasound image to the known optical landmark (the checker board). Hence the calibration can be performed from a single image.
[0095] Figure 27 is a schematic illustration of estimation of the needle pose in camera coordinates, which allows optimization of ultrasound imaging parameters (such as focus depth) for best needle or target imaging.
[0096] Figure 28A is a schematic illustration of target/directional symbols indicate the changes to the needle pose to be made by the user in order to align with the target.
[0097] Figure 28B is a schematic illustration of dual-shadow approach for passive guidance.
[0098] Figure 28C is a schematic illustration of direct projection of target/critical regions onto the surface allows freehand navigation by the user.
[0099] Figure 29 is a schematic illustration of projection of visible rays from the projection center onto arbitrary surfaces allows to reconstruct lines that in turn allow to reconstruct the projection center in camera coordinates, helping to calibrate cameras and projectors.
[00100] Figure 30 is a schematic illustration of the system using the projected insertion points as a "capture range" reference, discarding/not tracking needles that point too far away from it.
[00101] Figure 31 is a schematic illustration of passive needle alignment using one projector and one camera: Alignment of the needle with the projected line constrains the pose to a plane, while alignment with a line overlaid onto the camera image imposes another plane; together they define a needle insertion pose.
[00102] Figure 32 is a schematic illustration of double-shadow passive guidance with a single projector and dual-mirror attachment: The single projection cone is split into two virtual cones from different virtual centers, thus allowing passive alignment with limited hardware overhead.
[00103] Figure 33 is a picture illustrating how double-wedges show up in ultrasound and how they are automatically detected/segmented (the green triangle). This is the pose recovery based on ultrasound images.
[00104] Figure 34 is a screenshot of the system's graphical user interface showing the image overlay for out-of-plane views (the top section, with the green crosshair+line crossing the horizontal gray "ultrasound plane" line).
[00105] In Figures 5 and 17 through 32, projected images are shown in blue, and camera views are shown in red. Additionally, C denotes cameras 1&2; P is projector, P' is projected image (blue); C is camera views (red); N is needle or instrument; M is mirror; B is base, US is ultrasound, I is imaging system, SLS is structured light surface, O is object or patient surface, and S is for a semi-transparent or switchable-film screen (except for Figures 24 and 32, where S is a real (cast) line shadow, and S' are projected shadow lines for alignment).
DETAILED DESCRIPTION
[00106] Some embodiments of the current invention are discussed in detail below. In describing embodiments, specific terminology is employed for the sake of clarity. However, the invention is not intended to be limited to the specific terminology so selected. A person skilled in the relevant art will recognize that other equivalent components can be employed and other methods developed without departing from the broad concepts of the current invention. All references cited anywhere in this specification are incorporated by reference as if each had been individually incorporated.
[00107] Some embodiments of this invention describe IGI-(image-guided interventions)-enabling "platform technology" going beyond the current paradigm of relatively narrow image-guidance and tracking. It simultaneously aims to overcome limitations of tracking, registration, visualization, and guidance; specifically using and integrating techniques e.g. related to needle identification and tracking using 3D computer vision, structured light, and photoacoustic effects; multi-modality registration with novel combinations of orthogonal imaging modalities; and imaging device tracking using local sensing approaches; among others.
[00108] The current invention covers a wide range of different embodiments, sharing a tightly integrated common core of components and methods used for general imaging, projection, vision, and local sensing.
[00109] Some embodiments of the current invention are directed to combining a group of complementary technologies to provide a local sensing approach that can provide enabling technology for the tracking of medical imaging devices, for example, with the potential to significantly reduce errors and increase positive patient outcomes. This approach can provide a platform technology for the tracking of ultrasound probes and other imaging devices, intervention guidance, and information visualization according to some embodiments of the current invention. By combining ultrasound imaging with image analysis algorithms, probe-mounted camera and projection units, and very low-cost, independent optical-inertial sensors, according to some embodiments of the current invention, it is possible to reconstruct the position and trajectory of the device and possible tools or other objects by incrementally tracking their current motion.
[00110] Some embodiments of the current invention allow the segmentation, tracking, and guidance of needles and other tools (using visual, ultrasound, and possibly other imaging and localization modalities), allowing for example the integration with the above-mentioned probe tracking capabilities into a complete tracked, image-guided intervention system.
[00111] The same set of sensors can enable interactive, in-place visualization using additional projection components. This visualization can include current or pre-operative imaging data or fused displays thereof, but also navigation information such as guidance overlays.
[00112] The same projection components can help in surface acquisition and multi- modality registration, capable of reliable and rapid fusion with pre-operative plans, in diverse systems such as handheld ultrasound probes, MRI/CT/C-arm imaging systems, wireless capsule endoscopy, and conventional endoscopic procedures, for example.
[00113] Such devices can allow imaging procedures with improved sensitivity and specificity as compared to the current state of the art. This can open up several possible application scenarios that previously required harmful X-ray/CT or expensive MRI imaging, and/or external tracking, and/or expensive, imprecise, time-consuming, or impractical hardware setups, or that were simply afflicted with an inherent lack of precision and guarantee of success, such as:
• diagnostic imaging in cancer therapy, prenatal imaging etc.: can allow the
generation of freehand three-dimensional ultrasound volumes without the need for external tracking,
• biopsies, RF/HIFU ablations etc.: can allow 2D- or 3D-ultrasound-based needle guidance without external tracking,
• brachytherapy: can allow 3D-ultrasound acquisition and needle guidance for precise brachytherapy seed placement,
• cone -beam CT reconstruction: can enable high-quality C-arm CT reconstructions with reduced radiation dose and focused field of view,
• gastroenterology: can perform localization and trajectory reconstruction for
wireless capsule endoscopes over extended periods of time, and
• other applications relying on tracked imaging and tracked tools. [00114] Some embodiments of the current invention can provide several advantages over existing technologies, such as combinations of:
• single-plane US-to-CT/MRI registration - no need for tedious acquisition of US volumes,
• low-cost tracking - no optical or electro-magnetic (EM) tracking sensors on handheld imaging probes, tools, or needles, and no calibrations necessary,
• in-place visualization - guidance information and imaging data is not displayed on a remote screen, but shown projected on the region of interest or over it onto a screen,
• local, compact, and non-intrusive solution - ideal tracking system for hand-held and compact ultrasound systems that are primarily used in intervention and point-of-care clinical suites, but also for general needle/tool tracking under visual tracking in other interventional settings,
• improved quality of cone-beam CT - truncation artifacts are minimized.
• improved tracking and multi-modality imaging for capsule endoscopes - enables localization and diagnosis of suspicious findings,
• improved registration of percutaneous ultrasound and endoscopic video, using pulsed-laser photoacoustic imaging.
[00115] For example, some embodiments of the current invention are directed to devices and methods for the tracking of ultrasound probes and other imaging devices. By combining ultrasound imaging with image analysis algorithms, probe-mounted cameras, and very low-cost, independent optical-inertial sensors, it is possible to reconstruct the position and trajectory of the device and possible tools or other objects by incrementally tracking their current motion according to an embodiment of the current invention. This can provide several possible application scenarios that previously required expensive, imprecise, or impractical hardware setups. Examples can include the generation of freehand three- dimensional ultrasound volumes without the need for external tracking, 3D ultrasound-based needle guidance without external tracking, improved multi-modal registration, simplified image overlay, or localization and trajectory reconstruction for wireless capsule endoscopes over extended periods of time, for example.
[00116] The same set of sensors can enable interactive, in-place visualization using additional projection components according to some embodiments of the current invention.
[00117] Current sonographic procedures mostly use handheld 2D ultrasound (US) probes that return planar image slices through the scanned 3D volume (the "region of interest"/ROI). In this case, in order to gain sufficient understanding of the clinical situation, the sonographer needs to scan the ROI from many different positions and angles and mentally assemble a representation of the underlying 3D geometry. Providing a computer system with the sequence of 2D images together with the transformations between successive images ("path") can serve to algorithmically perform this reconstruction of a complete 3D US volume. While this path can be provided by conventional optical, EM etc. tracking devices, a solution of substantially lower cost would hugely increase the use of 3D ultrasound.
[00118] For percutaneous interventions requiring needle guidance, prediction of the needle trajectory is currently based on tracking with sensors attached to the distal (external) needle end and on mental extrapolation of the trajectory, relying on the operator's experience. An integrated system with 3D ultrasound, needle tracking, needle trajectory prediction and interactive user guidance would be highly beneficial.
[00119] Figure 1 is an illustration of an embodiment of an augmentation device 100 for an imaging system according to an embodiment of the current invention. The augmentation device 100 includes a bracket 102 that is structured to be attachable to an imaging component 104 of the imaging system. In the example of Figure 1 , the imaging component 104 is an ultrasound probe and the bracket 102 is structured to be attached to a probe handle of the ultrasound probe. However, the broad concepts of the current invention are not limited to only this example. The bracket 102 can be structured to be attachable to other handheld instruments for image-guided surgery, such as surgical orthopedic power tools or stand-alone handheld brackets, for example. In other embodiments, the bracket 102 can be structured to be attachable to the C-arm of an X-ray system or an MRI system, for example.
[00120] The augmentation device 100 also includes a projector 106 attached to the bracket 102. The projector 106 is arranged and configured to project an image onto a surface in conjunction with imaging by the imaging component 104. The projector 106 can be at least one of a visible light imaging projector, a laser imaging projector, a pulsed laser, or a projector of a fixed or selectable pattern (using visible, laser, or infrared/ultraviolet light). Depending on the application, the use of different spectral ranges and power intensities enables different capabilities, such as infrared for structured light illumination simultaneous with e.g. visible overlays; ultraviolet for UV-sensitive transparent glass screens (such as MediaGlass, Superlmaging Inc.); or pulsed laser for photoacoustic imaging, for example. A fixed pattern projector can include, for example, a light source arranged to project through a slide, a mask, a reticle, or some other light-patterning structure such that a predetermined pattern is projected onto the region of interest. This can be used, for example, for projecting structured light patterns (such as grids or locally unique patterns) onto the region of interest. Another use for such projectors can be the overlay of user guidance information onto the region of interest, such as dynamic needle-insertion-supporting symbols (circles and crosses, cf. Figure 8). Such a projector can be made to be very compact in some applications. A projector of a selectable pattern can be similar to the fixed pattern device, but with a mechanism to select and/or exchange the light-patterning component. For example, a rotating component could be used in which one of a plurality of predetermined light- patterning sections is moved into the path of light from the light source to be projected onto the region of interest. In other embodiments, said projector(s) can be a stand-alone element of the system, or combined with a subset of other components described in the current invention, i.e. not necessarily integrated in one bracket or holder with another imaging device. In some embodiments, the projector(s) may be synchronized with the camera(s), imaging unit, and/or switchable film screens.
[00121] The augmentation device 100 can also include at least one camera 108 attached to the bracket 102. In some embodiments, a second camera 110 can also be attached to the bracket 102, either with or without the projector, to provide stereo vision, for example. The camera can be at least one of a visible-light camera, an infra-red camera, or a time-of-flight camera in some embodiments of the current invention. The camera(s) can be standalone or integrated with one or more projection units in one device as well, depending on the application. They may have to be synchronized with the projector(s) and/or switchable film glass screens as well.
[00122] Additional cameras and/or projectors could be provided - either physically attached to the main device, some other component, or free-standing - without departing from the general concepts of the current invention. The cameras need not be traditional perspective cameras, but may be of other types such as catadioptric or other omni-directional designs, line scan, and so forth. See, e.g., Figure 5.
[00123] The camera 108 and/or 110 can be arranged to observe a surface region close to the imaging component 104 during its operation. In the embodiment of Figure 1, the two cameras 108 and 110 can be arranged and configured for stereo observation of the region of interest. Alternatively, one of the cameras 108 and 110, or an additional camera, or two, or more, can be arranged to track the user's face location during visualization to provide information regarding the viewing position of the user. This can permit, for example, the projection of information onto the region of interest in such a way that it takes into account the position of the viewer, e.g. to address the parallax problem.
[00124] Figure 2 is a schematic illustration of the augmentation device 100 of Figure 1 in which the bracket 102 is not shown for clarity. Figure 2 illustrates further optional local sensing components that can be included in the augmentation device 100 according to some embodiments of the current invention. For example, the augmentation device 100 can include a local sensor system 112 attached to the bracket 102. The local sensor system 112 can be part of a conventional tracking system, such as an EM tracking system, for example. Alternatively, the local sensor system 112 can provide position and/or orientation information of the imaging component 104 to permit tracking of the imaging component 104 while in use without the need for external reference frames such as with conventional optical or EM tracking systems. Such local sensor systems can also help in the tracking (e.g. determining the orientation) of handheld screens (Figure 4) or capsule endoscopes, not just of imaging components. In some embodiments, the local sensor system 112 can include at least one of an optical, inertial, or capacitive sensor, for example. In some embodiments, the local sensor system 112 includes an inertial sensor component 114 which can include one or more gyroscopes and/or linear accelerometers, for example. In one embodiment, the local sensor system 112 has a three-axis gyro system that provides rotation information about three orthogonal axes of rotation. The three-axis gyro system can be a micro-electromechanical system (MEMS) three-axis gyro system, for example. The local sensor system 112 can alternatively, or in addition, include one or more linear accelerometers that provide acceleration information along one or more orthogonal axes in an embodiment of the current invention. The linear accelerometers can be, for example, MEMS accelerometers.
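For illustration, a minimal sketch of how three-axis gyro readings could be integrated in the background to track the probe's incremental orientation; this is a generic dead-reckoning step under the stated assumptions, not the specific estimator of the disclosure:

```python
# Integrate body-frame angular rates from a three-axis MEMS gyro into an
# incremental orientation update (Rodrigues' formula).
import numpy as np

def skew(w):
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def integrate_gyro(R, omega_rad_s, dt):
    """R: current 3x3 orientation of the probe.
    omega_rad_s: body-frame angular rates (rad/s) from the gyro.
    dt: sample interval in seconds."""
    theta = np.linalg.norm(omega_rad_s) * dt
    if theta < 1e-12:
        return R
    axis = omega_rad_s / np.linalg.norm(omega_rad_s)
    K = skew(axis)
    dR = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    return R @ dR
```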
[00125] In addition to, or instead of, the inertial sensor component 114, the local sensor system 112 can include an optical sensor system 116 arranged to detect motion of the imaging component 104 with respect to a surface. The optical sensor system 116 can be similar to the sensor system of a conventional optical mouse (using visible, IR, or laser light), for example. However, in other embodiments, the optical sensor system 116 can be optimized or otherwise customized for the particular application. This may include the use of (potentially stereo) cameras with specialized feature and device tracking algorithms (such as scale-invariant feature transform/SIFT and simultaneous localization and mapping/SLAM, respectively) to track the device, various surface features, or surface region patches over time, supporting a variety of capabilities such as trajectory reconstruction or stereo surface reconstruction.
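As an illustrative sketch of camera-based incremental motion estimation of this kind, the example below matches ORB features between successive frames and recovers rotation plus a translation direction from the essential matrix; it is a generic OpenCV-based stand-in (not SIFT/SLAM specifically), and scale would have to come from another cue such as the structured-light surface:

```python
# Feature-based estimate of the incremental camera motion between frames.
import cv2
import numpy as np

def incremental_motion(prev_gray, curr_gray, K):
    """prev_gray, curr_gray: consecutive grayscale frames; K: 3x3 intrinsics."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K,
                                   method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t   # rotation and unit translation direction between frames
```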
[00126] In addition to, or instead of, the inertial sensor component 114, the local sensor system 112 can include a local ultrasound sensor system to make use of the airborne photoacoustic effect. In this embodiment, one or more pulsed laser projectors direct laser energy towards the patient tissue surface, the surrounding area, or both, and airborne ultrasound receivers placed around the probe itself help to detect and localize potential objects such as tools or needles in the immediate vicinity of the device.
[00127] In some embodiments, the projector 106 can be arranged to project an image onto a local environment adjacent to the imaging component 104. For example, the projector 106 can be adapted to project a pattern onto a surface in view of the cameras 108 and 110 to facilitate stereo object recognition and tracking of objects in view of the cameras. For example, structured light can be projected onto the skin or an organ of a patient according to some embodiments of the current invention. According to some embodiments, the projector 106 can be configured to project an image that is based on ultrasound imaging data obtained from the ultrasound imaging device. In some embodiments, the projector 106 can be configured to project an image based on imaging data obtained from an x-ray computed tomography imaging device or a magnetic resonance imaging device, for example. Additionally, preoperative data or real-time guidance information could also be projected by the projector 106.
[00128] Although reconstruction using stereo vision is improved by projecting a pattern that aids in stereo matching performance, projecting a traditional structured light pattern may be distracting to the surgeon. However, the speckle pattern of an ultrasound image provides a natural form of texture that can also be informative to the surgeon. Thus, the invention may include the projection of the ultrasound data, and simultaneously that projection may be used to improve stereo reconstruction performance. See, e.g., Figure 17.
[00129] Alternatively, to improve stereo matching performance for surface reconstruction, it may prove useful to modify parameters of the projected pattern, both within an image as well as over time. Such parameters may include (a) spatial frequencies (both the presence of edges vs. smoother transitions as well as color patch sizes) - to adapt to surface distance, apparent structure sizes, or camera resolutions (see, e.g., Figures 18 and 19); or (b) colors - to adapt to surface properties such as skin type or environment conditions such as ambient lighting; or (c) randomizing/iterating through different patterns over time (see, e.g., Figure 20). Both structured-light patterns as well as projected guidance symbols contribute to surface reconstruction performance, but can also be detrimental to overall system performance, e.g. when straight edges interfere with needle tracking. In such cases, projection patterns and guidance symbols can be adapted to optimize system metrics (such as tracking success/robustness, surface outlier ratio etc.), e.g. by introducing more curvy features.
[00130] The augmentation device 100 can also include a communication system that is in communication with at least one of the local sensor system 112, camera 108, camera 110 or projector 106 according to some embodiments of the current invention. The communication system can be a wireless communication system according to some embodiments, such as, but not limited to, a Bluetooth wireless communication system.
[00131] Although Figures 1 and 2 illustrate the imaging system as an ultrasound imaging system and that the bracket 102 is structured to be attached to an ultrasound probe handle 104, the broad concepts of the current invention are not limited to this example. The bracket can be structured to be attachable to other imaging systems, such as, but not limited to, x-ray and magnetic resonance imaging systems, for example.
[00132] Figure 3A is a schematic illustration of an augmentation device 200 attached to the C-arm 202 of an x-ray imaging system. In this example, the augmentation device 200 is illustrated as having a projector 204, a first camera 206 and a second camera 208. Conventional and/or local sensor systems can also be optionally included in the augmentation device 200, improving the localization of single C-arm X-ray images by enhancing C-arm angular encoder resolution and estimation robustness against structural deformation.
[00133] In operation, the x-ray source 210 typically projects an x-ray beam that is not wide enough to encompass the patient's body completely, resulting in severe truncation artifacts in the reconstruction of so-called cone beam CT (CBCT) image data. The camera 206 and/or camera 208 can provide information on the amount of extension of the patient beyond the beam width. This information can be gathered for each angle as the C-arm 202 is rotated around the patient 212 and be incorporated into the processing of the CBCT image to at least partially compensate for the limited beam width and reduce truncation artifacts. In addition, conventional and/or local sensors can provide accurate data of the precise angle of illumination by the x-ray source, for example (more precise than potential C-arm encoders themselves, and potentially less susceptible to arm deformation under varying orientations). Other uses of the camera-projection combination units are surface-supported multi-modality registration, or visual needle or tool tracking, or guidance information overlay. One can see that the embodiment of Figure 3A is very similar to the arrangement of an augmentation device for an MRI system.
[00134] Figure 3B is a schematic illustration of a system for image-guided surgery 400 according to some embodiments of the current invention. The system for image-guided surgery 400 includes an imaging system 402, and a projector 404 configured to project an image onto a region of interest during imaging by the imaging system 402. The projector 404 can be arranged proximate the imaging system 402, as illustrated, or it could be attached to or integrated with the imaging system. In this case, the imaging system 402 is illustrated schematically as an x-ray imaging system. However, the invention is not limited to this particular example. As in the previous embodiments, the imaging system could also be an ultrasound imaging system or a magnetic resonance imaging system, for example. The projector 404 can be at least one of a white light imaging projector, a laser light imaging projector, a pulsed laser, or a projector of a fixed or selectable pattern, for example.
[00135] The system for image-guided surgery 400 can also include a camera 406 arranged to capture an image of a region of interest during imaging by the imaging system. A second camera 408 could also be included in some embodiments of the current invention. A third, fourth or even more cameras could also be included in some embodiments. The region of interest being observed by the imaging system 402 can be substantially the same as the region of interest being observed with the camera 406 and/or camera 408. The cameras 406 and 408 can be at least one of a visible-light camera, an infra-red camera or a time-of-flight camera, for example. Each of the cameras 406, 408, etc. can be arranged proximate the imaging system 402 or attached to or integrated with the imaging system 402.
[00136] The system for image-guided surgery 400 can also include one or more sensor systems, such as sensor systems 410 and 412, for example. In this example, the sensor systems 410 and 412 are part of a conventional EM sensor system. However, other conventional sensor systems such as optical tracking systems could be used instead of or in addition to the EM sensor systems illustrated. Alternatively, or in addition, one or more local sensor systems such as local sensor system 112 could also be included instead of sensor systems 410 and/or 412. The sensor systems 410 and/or 412 could be attached to any one of the imaging system 402, the projector 404, camera 406 or camera 408, for example. Each of the projector 404 and cameras 406 and 408 could be grouped together or separate and could be attached to or made integral with the imaging system 402, or arranged proximate the imaging system 402, for example.
[00137] Figure 4 illustrates one possible use of a camera/projection combination unit in conjunction with a medical imaging device such as MRI or CT. Image-guided interventions based on these modalities suffer from registration difficulties arising from the fact that in-place interventions are awkward or impossible due to space constraints within the imaging device bores, among other reasons. Therefore, a multi-modality image registration system supporting the interactive overlay of potentially fused pre- and intra-operative image data could support or enable e.g. needle-based percutaneous interventions with massively reduced imaging requirements in terms of duration, radiation exposure, cost etc. A camera/projection unit outside the main imaging system could track the patient, reconstruct the body surface using e.g. structured light and stereo reconstruction, and register and track needles and other tools relative to it. Furthermore, handheld units comprising switchable film glass screens could be tracked optically and used as interactive overlay projection surfaces, see, e.g., Figure 21. The tracking accuracy for such screens could be improved by attaching (at least inertial) local sensor systems to said screens, allowing better orientation estimation than using visual cues alone. The screens need not impede the (potentially structured-light-supported) reconstruction of the underlying patient surface, nor block the user's view of that surface, as they can be rapidly switched (up to hundreds of times per second), alternating between a transparent mode to allow pattern and guidance information projection onto the surface, and an opaque mode to block and display other user-targeted data, e.g. in a tracked 3D data visualization fashion.
[00138] Such switchable film glass screens can also be attached to handheld imaging devices such as ultrasound probes and the afore-mentioned brackets as in Figure 6. This way, imaging and/or guidance data can be displayed on a handheld screen - in opaque mode - directly adjacent to imaging devices in the region of interest, instead of on a remote monitor screen. Furthermore - in transparent mode - structured light projection and/or surface reconstruction are not impeded by the screen, see, e.g., Figure 22. In both cases the data is projected onto or through the switchable screen using the afore-mentioned projection units, allowing a more compact handheld design or even remote projection. Furthermore, these screens (handheld or bracket-mounted) can also be realized using e.g. UV-sensitive/fluorescent glass, requiring a (potentially multi-spectral for color reproduction) UV projector to create bright images on the screen, but making active control of screen mode switching unnecessary. In the latter case, overlay data projection onto the screen and structured light projection onto the patient surface can be run in parallel, provided the structured light uses a frequency unimpeded by the glass.
[00139] Figure 7 describes a possible extension to the augmentation device ("bracket") described for handheld imaging devices, comprising one or more pulsed lasers as projection units that are directed through fibers towards the patient surface, exciting tissue-borne photoacoustic effects, and towards the sides of the imaging device, emitting the laser pulse into the environment, allowing airborne photoacoustic imaging. For the latter, the handheld imaging device and/or the augmentation device comprise ultrasound receivers around the device itself, pointing into the environment. Both photoacoustic channels can be used e.g. to enable in-body and out-of-body tool tracking or out-of-plane needle detection and tracking, improving both detectability and visibility of tools/needles under various circumstances.
[00140] In endoscopic systems the photoacoustic effect can be used together with its structured-light aspect for registration between endoscopic video and ultrasound. By emitting pulsed laser patterns from a projection unit in an endoscopic setup, a unique pattern of light incidence locations is generated on the endoscope-facing surface side of observed organs. One or more camera units next to the projection unit in the endoscopic device observe the pattern, potentially reconstructing its three-dimensional shape on the organ surface. At the same time, a distant ultrasound imaging device on the opposite side of the organ under observation receives the resulting photoacoustic wave patterns and is able to reconstruct and localize their origins, corresponding to the pulsed-laser incidence locations. This "rear-projection" scheme allows simple registration between both sides - endoscope and ultrasound - of the system.
[00141] Figure 8 outlines one possible approach to display needle guidance information to the user by means of direct projection onto the surface in the region of interest in a parallax-independent fashion, so the user position is not relevant to the method's success (the same method can be applied to projection e.g. onto a device-affixed screen as described above, or onto handheld screens). Using e.g. a combination of moving, potentially color/size/thickness/etc.-coded circles and crosses, the five degrees of freedom governing a needle insertion (two each for insertion point location and needle orientation, and one for insertion depth and/or target distance) can be intuitively displayed to the user. In one possible implementation, the position and color of a projected circle on the surface indicate the intersection of the line between the current needle position and the target location with the patient surface, and said intersection point's distance from a planned insertion point. The position, color, and size of a projected cross can encode the current orientation of the needle with respect to the correct orientation towards the target location, as well as the needle's distance from the target. The orientation deviation is also indicated by an arrow pointing towards the proper position/orientation configuration. In another implementation, guidance information necessary to adjust the needle orientation can be projected as a virtual shadow onto the surface next to the needle insertion point, prompting the user to minimize the shadow length to properly orient the needle for insertion.
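By way of a non-limiting illustration of the insertion-point guidance just described, the following Python sketch (hypothetical function and variable names; the skin around the entry site is locally approximated by a plane) computes where the line from the current needle position through the target pierces the surface, and how far that point lies from the planned insertion point - quantities that could drive the projected circle's position and color:

```python
import numpy as np

def guidance_circle(needle_tip, target, plane_point, plane_normal, planned_entry):
    """Locate the projected guidance circle for the insertion-point degrees of freedom.

    The circle is drawn where the line from the current needle tip through the target
    pierces the (locally planar) skin; its distance from the planned entry point can
    drive the circle's color and size.  All inputs are 3-vectors.
    """
    d = np.asarray(target, float) - np.asarray(needle_tip, float)    # needle-to-target line
    n = np.asarray(plane_normal, float)
    t = np.dot(n, np.asarray(plane_point, float) - needle_tip) / np.dot(n, d)
    hit = needle_tip + t * d                                          # intersection with skin plane
    error = np.linalg.norm(hit - np.asarray(planned_entry, float))    # offset from planned entry
    return hit, error

# Illustrative geometry (mm): skin plane z = 0, target 40 mm below, planned entry at the origin.
hit, err = guidance_circle(needle_tip=[5, 0, 30], target=[0, 0, -40],
                           plane_point=[0, 0, 0], plane_normal=[0, 0, 1],
                           planned_entry=[0, 0, 0])
print(hit, err)   # circle at roughly [2.86, 0, 0], about 2.9 mm from the planned entry point
```

The remaining degrees of freedom (needle orientation and insertion depth) could be encoded analogously, for example by the projected cross and arrow described above.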
[00142] Needle guidance may be active, by projecting crosshairs or other targeting information for all degrees of freedom as described above. Needle guidance may also make use of shadows as a means of alignment. A "single-shadow alignment" can be used for 1 degree of freedom with additional active tracking/guidance for the remaining degrees of freedom, e.g. circles or crosshairs, see, e.g., Figure 24. Alternatively, if multiple projectors are available, then stereo guidance may make use of shadows, active light planes, or other similar methods, see, e.g., Figures 23 and 32. In this case, needle guidance may be passive (without needle tracking) by using simple alignment either in stereo views/cameras or in dual projector shadows or patterns.
[00143] Specific projection patterns may be used to enhance the speed or reliability of tracking. Examples include specific shadow "brush types" or profiles to help quickly and precisely align the needle shadow with the projected shadow ("bulby lines" etc.). See, e.g., Figure 25. Other patterns may be better suited for rough vs. precise alignments.
[00144] The system may also make use of "shadows" or projections of critical areas or forbidden regions onto the patient surface, using pre-op CT/MRI or a non-patient-specific atlas to define a "roadmap" for an intervention, see, e.g., Figure 25.
[00145] While the above-mentioned user guidance display is independent of the user viewing direction, several other information displays (such as some variations on the image-guided intervention system shown in Figure 4) may benefit from knowledge about the location of the user's eyes relative to the imaging device, the augmentation device, another handheld camera/projection unit, and/or projection screens or the patient surface. Such information can be gathered using one or more optical (e.g. visible- or infrared-light) cameras pointing away from the imaging region of interest towards regions of space where the user's face may be expected (such as upwards from a handheld ultrasound imaging device), combined with face-detection capabilities to determine the user's eye location, for example.
EXAMPLES
[00146] The following provides some examples according to some embodiments of the current invention. These examples are provided to facilitate a description of some of the concepts of the invention and are not intended to limit the broad concepts of the invention.
[00147] The local sensor system can include inertial sensors 506, such as a three-axis gyro system, for example. For example, the local sensor system 504 can include a three-axis MEMS gyro system. In some embodiments, the local sensor system 504 can include optical position sensors 508, 510 to detect motion of the capsule imaging device 500. The local sensor system 504 can permit the capsule imaging device 500 to record position information along with imaging data to facilitate registering image data with specific portions of a patient's anatomy after recovery of the capsule imaging device 500, for example.
[00148] Some embodiments of the current invention can provide an augmentation of existing devices which comprises a combination of different sensors: an inertial measurement unit based on a 3-axis accelerometer; one or two optical displacement tracking units (OTUs) for lateral surface displacement measurement; one, two or more optical video cameras; and a (possibly handheld and/or linear) ultrasound (US) probe, for example. The latter may be replaced or accompanied by a photoacoustic (PA) arrangement, i.e. one or more active lasers, a photoacoustically active extension, and possibly one or more separate US receiver arrays. Furthermore, an embodiment of the current invention may include a miniature projection device capable of projecting at least two distinct features.
[00149] These sensors (or a combination thereof) may be mounted, e.g. on a common bracket or holder, onto the handheld US probe, with the OTUs pointing towards and close to the scanning surface (if more than one, then preferably at opposite sides of the US array), the cameras mounted (e.g., in a stereo arrangement) so they can capture the environment of the scanning area, possible needles or tools, and/or the operating room environment, and the accelerometer in a basically arbitrary but fixed location on the common holder. In a particular embodiment, the projection device may point mainly onto the scanning surface. In another particular embodiment, one PA laser may point towards the PA extension, while the same or another laser may point outwards, with US receiver arrays suitably arranged to capture possible reflected US echoes. Different combinations of the mentioned sensors are possible.
[00150] The mounting bracket need not be limited to a fixed position or orientation.
The augmentation device may be mounted on a re-configurable/rotatable setup to re-orient the device from in-plane to out-of-plane projection and guidance depending on the needs of the operator. The mounting mechanism may also be configurable to allow elevation of the augmentation device to accommodate different user habits (low/high needle grips etc.). The mounting system may also be modular and allow users to add cameras, add projectors, or add mechanical guides, e.g. for elevation angle control, as needed for the application.
[00151] For particular applications and/or embodiments, an interstitial needle or other tool may be used. The needle or tool may have markers attached for better optical visibility outside the patient body. Furthermore, the needle or tool may be optimized for good ultrasound visibility if they are supposed to be inserted into the body. In particular embodiments the needle or tool may be combined with inertial tracking components (i.e. accelerometers).
[00152] For particular applications and/or embodiments, additional markers may optionally be used for the definition of registration or reference positions on the patient body surface. These may be optically distinct spots or arrangements of geometrical features designed for visibility and optimized optical feature extraction.
[00153] For particular applications and/or embodiments, the device to be augmented by the proposed invention may be a handheld US probe; for others it may be a wireless capsule endoscope (WCE); and other devices are possible for suitably defined applications, where said applications may benefit from the added tracking and navigational capabilities of the proposed invention.
Software Components:
[00154] In one embodiment (handheld US probe tracking), the invention includes a software system for opto-inertial probe tracking (OIT). The OTUs generate local translation data across the scan surface (e.g. skin or intestinal wall), while accelerometers and/or gyroscopes provide absolute orientation and/or rotation motion data. Their streams of local data are combined over time to reconstruct an n-DoF probe trajectory with n = 2...6, depending on the actual OIC sensor combination and the current pose/motion of the probe.
[00155] In general, the current pose Q(t) = (P(t), R(t)) can be computed incrementally with

P(t) = P(0) + Σ_{i=0}^{t−1} R(i) Δp(i)

where the R(i) are the orientations directly sampled from the accelerometers and/or incrementally tracked from relative displacements between the OTUs (if more than one) at time i, and Δp(i) are the lateral displacements at time i as measured by the OTUs. P(0) is an arbitrarily chosen initial reference position.
[00156] In one embodiment (handheld US probe tracking), a software system for speckle-based probe tracking is included. An (ultrasound-image-based) speckle decorrelation analysis (SDA) algorithm provides very high-precision 1-DoF translation (distance) information for single ultrasound image patch pairs by decorrelation, and 6-DoF information for the complete ultrasound image when combined with planar 2D-2D registration techniques. Suitable image patch pairs are preselected by means of FDS (fully developed speckle) detection. Precision of distance estimation is improved by basing the statistics on a larger set of input pairs.
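By way of a non-limiting numerical illustration of the incremental pose update of paragraph [00155], the following Python sketch (hypothetical names; it assumes each orientation sample is available as a 3x3 rotation matrix and each OTU sample is a lateral displacement expressed in the probe frame) accumulates P(t):

```python
import numpy as np

def integrate_trajectory(orientations, displacements, p0=np.zeros(3)):
    """Accumulate P(t) = P(0) + sum_i R(i) * dp(i) from opto-inertial samples.

    orientations  -- sequence of 3x3 rotation matrices R(i) (from accelerometers/gyroscopes)
    displacements -- sequence of 3-vectors dp(i) (lateral OTU readings, probe frame)
    p0            -- arbitrarily chosen initial reference position P(0)
    """
    positions = [np.asarray(p0, dtype=float)]
    for R, dp in zip(orientations, displacements):
        # Rotate the local OTU displacement into the fixed reference frame and
        # add it to the previously accumulated position.
        positions.append(positions[-1] + np.asarray(R) @ np.asarray(dp, dtype=float))
    return np.array(positions)

# Illustrative check: pure translation with identity orientation.
Rs = [np.eye(3)] * 4
dps = [np.array([1.0, 0.0, 0.0])] * 4
print(integrate_trajectory(Rs, dps)[-1])   # -> [4. 0. 0.]
```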
[00157] Both approaches (opto-inertial tracking and SDA) may be combined to achieve greater efficiency and/or robustness. This can be achieved by dropping the FDS detection step in the SDA and instead relying on opto-inertial tracking to constrain the set of patch pairs to be considered, thus implicitly increasing the ratio of suitable FDS patches without explicit FDS classification.
[00158] Another approach can be the integration of opto-inertial tracking information into a maximum-a-posteriori (MAP) displacement estimation. In yet another approach, sensor data fusion between OIT and SDA can be performed using a Kalman filter.
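As a minimal sketch of the OIT/SDA fusion idea - not the full Kalman filter, which would also carry a motion model and state covariance - the following Python snippet performs the scalar variance-weighted measurement update that fuses one opto-inertial and one speckle-decorrelation displacement estimate; all noise values are illustrative assumptions:

```python
def fuse_displacement(oit_est, oit_var, sda_est, sda_var):
    """Fuse two independent scalar displacement estimates (Kalman measurement update).

    Returns the variance-weighted estimate and its (reduced) variance.
    """
    gain = oit_var / (oit_var + sda_var)          # weight given to the SDA measurement
    fused = oit_est + gain * (sda_est - oit_est)  # standard scalar Kalman update
    fused_var = (1.0 - gain) * oit_var
    return fused, fused_var

# Illustrative numbers only: OIT says 0.9 mm (var 0.04), SDA says 1.1 mm (var 0.01).
print(fuse_displacement(0.9, 0.04, 1.1, 0.01))    # -> (1.06, 0.008)
```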
[00159] In one embodiment (handheld US probe tracking), a software system for camera-based probe tracking and needle and/or tool tracking and calibration can be included.
[00160] The holder-mounted camera(s) can detect and segment e.g. a needle in the vicinity of the system. By detecting two points P1 and P2, with P1 being the needle insertion point into the patient tissue (or alternatively, the surface intersection point in a water container) and P2 being the end or another suitably distant point on the needle, and a third point P3 being the needle intersection point in the US image frame, it is possible to calibrate the camera-US probe system in one step in closed form by following
(P2 − P1) × (P1 − X·P3) = 0

with X being the sought calibration matrix linking the US frame and the camera(s).
[00161] Another method for calibrating an ultrasound device, a pair of cameras, and a projection device proceeds as follows. The projector projects a pattern onto a planar target. The planar target is observed by the cameras, and is simultaneously measured by the ultrasound probe. Several such images are acquired. Features on the planar target are used to produce a calibration for the camera system. Using this calibration, the position of the plane in space can be calculated by the camera system. The projector can be calibrated using the same information. The corresponding position of the intersection of the ultrasound beam with the plane produces a line in the ultrasound image. Processing of several such lines allows the computation of the relative position of the cameras and the ultrasound probe.
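By way of a non-limiting illustration, the closed-form constraint of paragraph [00160] can be expressed as a residual that vanishes when the US-detected needle point, mapped through a candidate calibration X, lies on the camera-observed needle line; the Python sketch below (hypothetical names, X assumed to be a 4x4 homogeneous transform from US image coordinates to camera coordinates) could serve as the error term in a least-squares solver over several needle poses:

```python
import numpy as np

def collinearity_residual(X, p1_cam, p2_cam, p3_us):
    """Residual of the calibration constraint (P2 - P1) x (P1 - X*P3) = 0.

    X      -- candidate 4x4 homogeneous transform from US image to camera coordinates
    p1_cam -- needle insertion/surface point observed by the cameras (3-vector)
    p2_cam -- second, suitably distant point on the needle observed by the cameras (3-vector)
    p3_us  -- needle intersection point in the US image, scaled to metric units (3-vector)
    The residual is zero when the mapped US point lies on the camera-observed needle line.
    """
    p3_cam = (np.asarray(X) @ np.append(p3_us, 1.0))[:3]
    return np.cross(np.asarray(p2_cam) - p1_cam, np.asarray(p1_cam) - p3_cam)

# Quick check with a perfect (identity) calibration and collinear points:
print(collinearity_residual(np.eye(4), [0, 0, 0], [0, 0, 10], [0, 0, 4]))   # -> [0. 0. 0.]
```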
[00162] In order to ensure high accuracy, synchronization of the imaging components is necessary. Synchronizing one or more cameras with an ultrasound system can be accomplished whereby a trigger signal is derived from or generated by the ultrasound system, and this trigger signal is used to trigger camera acquisition. The trigger signal may come from the ultrasound data acquisition hardware, or from the video display associated with the ultrasound system. The same trigger signal may be used to trigger a projection device to show a particular image or pattern.
[00163] An alternative is a method of software temporal synchronization whereby the camera pair and ultrasound system are moved periodically above a target. The motion of the target in both camera and ultrasound is measured, and the temporal difference is computed by matching or fitting the two trajectories. A method for doing so is disclosed in N. Padoy, G.D. Hager, Spatio-Temporal Registration of Multiple Trajectories, Proceedings of Medical Image Computing and Computer-Assisted Intervention (MICCAI), Toronto, Canada, September 2011.
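A minimal sketch of such software synchronization, assuming both devices observe the same periodically moved target at a common sampling rate and that a single scalar coordinate of the target is extracted from each stream (the cited spatio-temporal registration method is more general), is the following cross-correlation-based lag estimate:

```python
import numpy as np

def estimate_lag(cam_trace, us_trace):
    """Estimate the temporal offset (in samples) between two motion traces of the
    same periodically moved target, via normalized cross-correlation."""
    a = np.asarray(cam_trace, float)
    b = np.asarray(us_trace, float)
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    corr = np.correlate(a, b, mode="full")
    return np.argmax(corr) - (len(b) - 1)   # positive lag: camera trace trails the US trace

# Synthetic check: the camera trace trails the ultrasound trace by 5 samples.
t = np.linspace(0, 4 * np.pi, 200)
us = np.sin(t)
cam = np.roll(us, 5)
print(estimate_lag(cam, us))   # -> 5
```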
[00164] This also provides a means for interleaving patterns for guidance and for other purposes such as stereo reconstruction, whereby a trigger signal causes the projector to switch between patterns. Preferentially, the pattern used by the camera system is invisible to the naked eye so that the user is not distracted by the transition.
[00165] Calibration can also be accomplished by using a specially constructed volume, as shown in Figures 26A and 26B. The ultrasound system is swept over the volume while the volume is simultaneously observed by the camera system. The surface models from both ultrasound and the camera system are registered to a computational model of the shape, and from this the relative position of the camera and ultrasound system is computed.
[00166] An alternative implementation is to use nanocapsules that rupture under ultrasound irradiation, creating an opaque layer in a disposable calibration phantom.
[00167] Furthermore, if the above-mentioned calibration condition does not hold at some point in time (detectable by the camera(s)), needle bending can be inferred from a single 2D US image frame and the operator properly notified.
[00168] Furthermore, 3D image data registration is also aided by the camera(s) overlooking the patient skin surface. Even under adverse geometrical conditions, three degrees of freedom (tilt, roll, and height) can be constrained using the cameras, facilitating registration of 3D US and e.g. CT or similar modalities by restricting the registration search space (making it faster) or providing initial transformation estimates (making it easier and/or more reliable). This may be facilitated by the application of optical markers onto the patient skin surface, which will also help in the creation of an explicit fixed reference coordinate system for integration of multiple 3D volumes.
[00169] Alternatively, drapes may be used that are designed to specifically enhance the performance of the system, whereby such drapes contain an easily detected pattern, fiducials, or other reference points, and the drapes adhere to the patient. Drapes may also be transparent, allowing the cameras to see the patient directly through them. Drapes may be specially colored to differentiate them from needles to be tracked. The drapes are preferably configured to enhance the ability of the cameras to compute probe motion.
[00170] Sterility can be preserved by using sterile probe coverings that contain special transparent areas for the cameras and projector to preserve sterility while also preserving or enhancing the function of the cameras and projectors.
[00171] In some embodiments, it may be useful to make use of pressure-sensitive drapes to indicate tissue deformation under the US probe. For example, such drapes could be used to enhance ultrasound elasticity measurement. The pressure-sensitive drapes may be used to monitor the use of the device by noting the level of pressure applied and correcting the registration and display based on that information.
[00172] Furthermore, the camera(s) provide additional data for pose tracking. In general, this will consist of redundant rotational motion information in addition to opto- inertial tracking. In special cases however, this information could not be recovered from OIT (e.g. yaw motions on a horizontal plane in case of surface tracking loss of one or both optical translation detectors, or tilt motion without translational components around a vertical axis). This information may originate from a general optical-flow-based rotation estimation, or specifically from tracking of specially applied optical markers onto the patient skin surface, which will also help in the creation of an explicit fixed reference coordinate system for integration of multiple 3D volumes.
[00173] Furthermore, by detecting and segmenting the extracorporeal parts of a needle, the camera(s) can provide needle translation information. This can serve as input for ultrasound elasticity imaging algorithms to constrain the search space (in direction and magnitude) for the displacement estimation step by tracking the needle and transforming estimated needle motion into expected motion components in the US frame, using the aforementioned calibration matrix X.
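By way of a non-limiting illustration, and assuming the calibration matrix X is represented as a 4x4 homogeneous transform from camera to US image coordinates, the estimated needle displacement can be mapped into the US frame as in the following Python sketch (hypothetical names); the resulting vector can then bound the displacement search in the elasticity estimation step:

```python
import numpy as np

def needle_motion_in_us_frame(X_cam_to_us, dp_cam):
    """Map a needle displacement estimated in camera coordinates into the ultrasound
    image frame using the homogeneous calibration matrix X (4x4).  Displacements are
    direction vectors, so only the rotational part of X applies."""
    return np.asarray(X_cam_to_us)[:3, :3] @ np.asarray(dp_cam, float)

# Illustrative: camera frame rotated 90 degrees about z relative to the US frame.
X = np.array([[0, -1, 0, 10],
              [1,  0, 0, -5],
              [0,  0, 1,  0],
              [0,  0, 0,  1]], float)
print(needle_motion_in_us_frame(X, [1.0, 0.0, 0.0]))   # -> [0. 1. 0.]
```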
[00174] Furthermore, the camera(s) can provide dense textured 3D image data of the needle insertion area. This can be used to provide enhanced visualization to the operator, e.g. as a view of the insertion trajectory as projected down along the needle shaft towards the skin surface, using actual needle/patient images.
[00175] The system may use the pose (location and orientation) of the needle in air to optimize ultrasound to detect the needle in the body and vice-versa, see, e.g., Figure 27.
[00176] It may be of interest to have differing fields of view and depth ranges in the depth imaging system. For example, on the surface, the cameras may be a few tens of centimeters from the surface, but at other times nearly a meter. In this case, it may be useful to have multiple depth-ranging configurations built into the same head, mount, or bracket assembly, e.g. using three or four video cameras or multiple depth sensors, additionally at different relative orientations and/or set to different focal lengths.
[00177] For particular applications and/or embodiments, integration of a micro- projector unit can provide an additional, real-time, interactive visual user interface e.g. for guidance purposes. Projecting navigation data onto the patient skin in the vicinity of the probe, the operator need not take his eyes away from the intervention site to properly target subsurface regions. Tracking the needle using the aforementioned camera(s), the projected needle entry point (intersection of patient skin surface and extension of the needle shaft) given the current needle position and orientation can be projected using a suitable representation (e.g. a red dot). Furthermore, an optimal needle entry point given the current needle position and orientation can be projected onto the patient skin surface using a suitable representation (e.g. a green dot). These can be positioned in real-time, allowing interactive repositioning of the needle before skin puncture without the need for external tracking.
[00178] As noted previously, guidance can be visually provided to the user in a variety of ways, either (a) on-screen or (b) projected through one or more projectors, e.g. directly onto the patient surface near the probe.
[00179] Also, this guidance can be provided either (a) separately or (b) as an overlay to a secondary image stream, such as ultrasound images or mono- or multi-ocular camera views. Also, this guidance can be either (a) registered to the underlying image or environment geometry such that overlaid symbols correspond to environment features (such as target areas) in location and possibly size and/or shape, or (b) location-independent such that symbol properties, e.g. location, color, size, shape, but also auditory cues such as audio volume, sound clips, and/or frequency changes indicate to the user where to direct the tools or the probe.
[00180] Guidance symbols can include - in order of increasing specificity - (a) proximity markers (to indicate general "closeness" by e.g. color-changing backgrounds, frames, or image tints, or auditory cues), (b) target markers (to point towards e.g. crosshairs, circles, bulls-eyes etc.), see, e.g., Figure 28A, (c) alignment markers (to line up with e.g. lines, fans, polygons), see, e.g., Figure 28B, or (d) area demarcations (to avoid e.g. shapes denoting critical regions, geometrically or anatomically inaccessible regions etc.), see, e.g., Figure 28C.
[00181] Overlaid guidance symbols can interfere with overall system performance, e.g. when tracking needles; so adaptation of projected graphic primitives (such as replacing lines with elliptic or curvy structures) can reduce artifacts. Additionally, guidance "lines" composed of e.g. "string-of-pearls" series of circles/discs/ellipses etc. can improve alignment performance for the user. Additionally, the apparent thickness of guidance lines/structures can be modified based on detected tool width, distance to projector, distance to surface, excessive intervention duration, etc., to improve alignment performance.
[00182] Specific - non-exhaustive - examples of the above concepts include: a) overlaying crosshairs and/or extrapolated needle pose lines onto live ultrasound views onscreen or projected onto the patient; b) projecting paired symbols (circles, triangles etc.) that change size, color, and relative position depending on the current targeting error vector; c) overlaying alignment lines onto single/stereo/multiple camera views that denote desired needle poses, allowing the user to line up the camera image of the needle with the target pose, as well as lines denoting the currently-tracked needle pose for quality control purposes; and d) projecting needle alignment lines onto the surface, denoting both target pose (for guidance) as well as currently-tracked pose (for quality control), from one or more projectors.
[00183] An important aspect of this system is a high accuracy estimate of the location of the projector relative to the probe and to the video camera. One means of doing so is to observe that visible rays projected from the camera will form straight lines in space that intersect at the optical center of the projector. Thus, with stereo cameras or a similar imaging system observing several surfaces upon which these rays fall, the system can calculate a series of 3D points which can then be extrapolated to compute the center of projection. See, e.g., Figure 29. This can be performed with nearly any planar or nonplanar series of projection surfaces.
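A minimal sketch of that extrapolation, assuming each projected ray has been reconstructed by the stereo cameras as a 3D point plus a direction (e.g. by fitting a line to its intersections with two or more projection surfaces; all names are hypothetical), is the following least-squares estimate of the point closest to all rays:

```python
import numpy as np

def center_of_projection(points, directions):
    """Least-squares 3D point closest to a set of rays.

    points     -- (N,3) array: one reconstructed 3D point on each projected ray
    directions -- (N,3) array: corresponding ray direction vectors (any length)
    Returns the point minimizing the sum of squared distances to all rays,
    i.e. an estimate of the projector's optical center.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)   # projection onto the plane orthogonal to the ray
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

# Synthetic check: rays through a known center c, sampled where they hit some surface.
c = np.array([0.0, 0.0, 0.5])
dirs = np.array([[0.1, 0.0, 1.0], [-0.1, 0.2, 1.0], [0.0, -0.3, 1.0]])
pts = c + 2.0 * dirs
print(center_of_projection(pts, dirs))    # -> approximately [0, 0, 0.5]
```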
[00184] Different combinations of software components are possible for different applications and/or different hardware embodiments. Also, the overall configuration may be augmented by and/or controlled from a hand-held device such as a tablet computer for 1) ultrasound machine operation; 2) visualization; and 3) in addition, by using one or more cameras on the tablet computer, registration to the patient for transparent information overlay.
[00185] The computational resources used by the device may be augmented with additional computation located elsewhere. This remote computation might be used to process information coming from the device (e.g. to perform a computationally intense registration process), it may be used to recall information useful to the function of the device (e.g. to compare this patient with other similar patients to provide "best practice" treatment options), or it may be used to provide information that directs the device (e.g. transferring the indication of a lesion in a CT image to a remote center for biopsy). The use of external computation may be measured and associated with the costs of using the device.
[00186] In addition to providing guidance on the needle trajectory, guidance can be provided to indicate the correct depth of penetration. This can be performed by detecting fiducials on the needle, and tracking those fiducials over time. For example, these may be dark rings on the needle itself, which can be counted using the vision system, or they may be a reflective element attached to the end of the needle, and the depth may be computed by subtracting the location of the fiducial in space from the patient surface, and then subtracting that result from the entire length of the needle.
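By way of a non-limiting illustration of the second variant (a reflective fiducial at the needle hub), the following Python sketch (hypothetical names and units) estimates the inserted depth from the tracked fiducial position, the detected skin entry point, and the known needle length:

```python
import numpy as np

def insertion_depth(fiducial_pos, entry_point, needle_length):
    """Estimate how deep the needle tip sits below the skin.

    fiducial_pos  -- 3D position of a fiducial at the needle hub (tracked by the cameras)
    entry_point   -- 3D position where the needle crosses the patient surface
    needle_length -- known distance from the hub fiducial to the needle tip
    """
    outside = np.linalg.norm(np.asarray(fiducial_pos) - np.asarray(entry_point))
    return needle_length - outside   # length remaining inside the tissue

# Illustrative numbers (mm): 150 mm needle, 90 mm still visible outside the patient.
print(insertion_depth([0, 0, 90], [0, 0, 0], 150.0))   # -> 60.0
```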
[00187] It may also be possible to indicate depth of penetration to the user by projecting a fiducial (e.g. a bright point of light) onto the needle, indicating to what point the needle should be inserted to be at the correct depth.
[00188] Additionally, the display of the system may passively indicate the number of fiducial rings that should remain outside the patient at the correct depth for the current system pose, providing the user with a perceptual cue that they can use to determine manually if they are at the correct depth.
[00189] When using the projector for needle guidance, the system may make use of the projected insertion point as "capture range" for possible needle poses, discard candidates outside that range, or detect when computed 3D poses violate the expected targeting behavior, see, e.g., Figure 30.
[00190] For imaging, the PA laser can fire directly and diffusely at the tissue wall, exciting a PA sound wave emanating from there that is received with the mentioned passive US array and can be used for diagnostic purposes. Ideally, using a combination of the mentioned tracking methods, the diagnostic outcome can be linked to a particular location along the GI tract.
[00191] Some embodiments of the current invention can allow reconstructing a 2D ultrasound probe's 6-DoF ("degrees of freedom") trajectory robustly, without the need for an external tracking device. The same mechanism can be e.g. applied to (wireless) capsule endoscopes as well. This can be achieved by cooperative sets of local sensors that incrementally track a probe's location through its sequence of motions. Some aspects of the current invention can be summarized, as follows.
[00192] First, an (ultrasound-image-based) speckle decorrelation analysis (SDA) algorithm provides very high-precision 1-DoF translation (distance) information for image patch pairs by decorrelation, and 6-DoF information for the complete ultrasound image when combined with planar 2D-2D registration techniques. Precision of distance estimation is improved by basing the statistics on a larger set of input pairs. (The parallelized approach with a larger input image set can significantly increase speed and reliability.)
[00193] Additionally, or alternatively, instead of using a full transmit/receive ultrasound transceiver (e.g. because of space or energy constraints, as in a wireless capsule endoscope), only an ultrasound receiver can be used according to some embodiments of the current invention. The activation energy in this case comes from an embedded laser. Regular laser discharges excite irregularities in the surrounding tissue and generate photoacoustic impulses that can be picked up with the receiver. This can help to track surfaces and subsurface features using ultrasound and thus provide additional information for probe localization.
[00194] Second, a component, bracket, or holder housing a set of optical, inertial, and/or capacitive (OIC) sensors represents an independent source of (ultrasound-image-free) motion information. Optical displacement trackers (e.g. from optical mice or cameras) generate local translation data across the scan surface (e.g. skin or intestinal wall), while accelerometers and/or gyroscopes provide absolute orientation and/or rotation motion data. Capacitive sensors can estimate the distance to tissue when the optical sensors lose surface contact or otherwise suffer tracking loss. Their streams of local data are combined over time to reconstruct an n-DoF probe trajectory with n = 2...6, depending on the actual OIC sensor combination and the current pose/motion of the probe.
[00195] Third, two or more optical video cameras are attached to the ultrasound probe, possibly in stereo fashion, at vantage points that let them view the surrounding environment, including any or all of the patient skin surface, possible tools and/or needles, possible additional markers, and parts of the operation room environment. This way, they serve to provide calibration, image data registration support, additional tracking input data, additional input data supporting ultrasound elasticity imaging, needle bending detection input, and/or textured 3D environment model data for enhanced visualization.
[00196] When used medically, it may be necessary for the camera-projector device to be maintained in a sterile environment. This may be accomplished in a number of ways. The housing may be resistant to sterilizing agents, and perhaps be cleaned by wiping. It may also be placed in a sterile bag cover. In this case, it may be advantageous to create a "window" of solid plastic in the cover that attaches to the cameras and projector. This window may be attached mechanically, magnetically, or by static electric attraction ("static cling"). Another way of maintaining sterility is to produce a sterile (possibly disposable) housing that the projector-camera device mounts into.
[00197] One embodiment includes a display system that maintains registration with the probe and which can be used for both visualization and guidance. For example, the probe may have an associated display that can be detached and which shows relevant preoperative CT information based on its position in space. It may also overlay targeting information. One example would include a pair of glasses that are registered to the probe and are able to provide a "see through" or "heads up" display to the user.
[00198] Cameras associated with the augmentation system can be used to perform
"quality control" on the overall performance of the system. For example, the trajectory of a needle can be calculated by visual tracking and thence projected into the ultrasound image. If the needle in the image is inconsistent with this projection, it is a cue that there is a system discrepancy. Conversely, if the needle is detected in the ultrasound image, it can be projected back into the video image to confirm that the external pose of the needle is consistent with that tracked image.
[00199] According to a further embodiment, the system may simultaneously track the needle in both ultrasound and video images, and to use those computed values to detect needle bending and to either update the likely trajectory of the needle, or to alert the user that they are putting pressure on the needle, or both.
[00200] Quality control can also be performed by processing the ultrasound image to determine that it has the expected structure. For example, if the depth setting of the ultrasound machine differs from that expected by the probe, the structure of the image will differ in detectable ways from that expected in this case - for example the wrong amount of "black space" on the image, or wrong annotations on the screen.
[00201] There are a variety of geometries that can be used to provide guidance. In one embodiment, the projection center may lie on or near the plane of the ultrasound system. In this case, the projector can project a single line or shadow that indicates where this plane is. A needle or similar tool placed in the correct plane will become bright or dark, respectively. A video camera outside this plane can view the scene, and this image can be displayed on a screen. Indeed, it may be included with the ultrasound view. In this case, the clinician can view both the external and internal guidance of the needle simultaneously on the same screen. Guidance to achieve a particular angle can be superimposed on the camera image, so that the intersection of the ultrasound plane and the plane formed by the superimposed guidance forms a line that is the desired trajectory of the needle, see, e.g., Figure 31.
[00202] According to another embodiment, a camera may be located along the ultrasound plane, and the projector is located off-plane. The geometry is similar, but according to this embodiment, the camera superimposed image is used to define the plane, and a line is projected by the projector to define the needle trajectory.
[00203] Further variations include combinations of single or multiple cameras or projectors, where at least one of either is mounted on the mobile device itself as well as mounted statically in the environment, with registration between the mobile and fixed components maintained at all times to make guidance possible. This registration maintenance can be achieved e.g. by detecting and tracking known features present in the environment and/or projected into the common field of interest.
[00204] The registration component of the system may take advantage of its ability to
"gate" in real time based on patient breathing or heart motion. Indeed, the ability of the probe to monitor surface and subsurface change in real time also means that it could register to "cine" (time-series) MR or CT image, and show that in synchrony with patient motion.
[00205] Furthermore, by incorporating additional local sensors (like the OIC sensor bracket) beyond using the ultrasound RF data for the speckle decorrelation analysis (SDA), it is possible to simplify algorithmic complexity and improve robustness by dropping the detection of fully developed speckle (FDS) patches before displacement estimation. While this FDS patch detection is traditionally necessary for SDA, using OIC will provide constraints for the selection of valid patches by limiting the space of possible patches, thus increasing robustness e.g. in combination with RANSAC subset selection algorithms.
[00206] Finally, a micro-projection device (laser- or image-projection-based) integrated into the ultrasound probe bracket can provide the operator with an interactive, realtime visualization modality, displaying relevant data like needle intersection points, optimal entry points, and other supporting data directly in the intervention location by projecting these onto the patient skin surface near the probe.
[00207] The combination of the camera and projector can be used to construct intuitive and sterile user interfaces on the patient surface, or on any other projectable surface. For example, standard icons and buttons can be projected onto the patient, and a finger or needle can be tracked and used to activate these buttons. This tracking can also be used in non- visual user interfaces, e.g. for gesture tracking without projected visual feedback.
[00208] It is another object of the invention to guide the placement of the imaging device on a surface. For example, by making use of the ability of an ultrasound probe or similar imaging device to acquire images from within the body while the video imaging system captures images from outside the body, the probe may be registered in body coordinates. The system may then project guidance as to how to move the probe to visualize a given target. For example, suppose that a tumor is identified in a diagnostic image, or in a previous scan. After registration, the projection system can project an arrow on the patient showing in which direction the probe should move. One of ordinary skill will realize that this method can be used to guide a user to visualize a particular organ based on a prior model of the patient or a patient-specific scan, or could be used to aid in tracking or orienting relative to a given target. For example, it may be desirable to place a gating window (e.g. for Doppler ultrasound) on a particular target or to maintain it therein.
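A minimal sketch of such placement guidance, assuming the target has already been registered into the probe coordinate frame and that the local skin normal is known (all names hypothetical), computes the in-plane direction and distance that the projected arrow could encode:

```python
import numpy as np

def probe_guidance_arrow(target_in_probe, skin_normal_in_probe=(0.0, 0.0, 1.0)):
    """Direction (in probe coordinates) to slide the probe so a registered target
    moves under its imaging plane.  The arrow is the target offset projected onto
    the local skin plane; its length can encode how far to move."""
    t = np.asarray(target_in_probe, float)
    n = np.asarray(skin_normal_in_probe, float)
    n = n / np.linalg.norm(n)
    in_plane = t - np.dot(t, n) * n          # drop the component pointing into the body
    distance = np.linalg.norm(in_plane)
    direction = in_plane / distance if distance > 0 else in_plane
    return direction, distance

# Illustrative: target sits 30 mm to the probe's right and 50 mm deep.
print(probe_guidance_arrow([30.0, 0.0, -50.0]))   # -> arrow along +x, 30 mm to go
```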
[00209] The augmentation system may use multi-band projection with both visible and invisible bands (such as with IR in various ways), simultaneously or time-multiplexed. As noted above, the invention may use multi-projector setups for shadow reduction, intensity enhancement, or passive stereo guidance.
[00210] The projection image may be time-multiplexed in synchrony with the camera or cameras to alternatively optimize projection for tracking (maximize needle presence), guidance (overlay clues), surfaces (optimize stereo reconstruction). The projection pattern may also be spatially modulated or multiplexed for different purposes, e.g. projecting a pattern in one area and guidance in other areas.
[00211] In order to create a stereo projection, the projection system may make use of mirrors for making one projector two (or more) by using "arms" etc. to split the image or to accomplish omnidirectional projection, see, e.g., Figure 32.
[00212] The projection system may make use of polarization for 3D guidance or use dual-arm or dual-device projection with polarized light and (passive) glasses for 3D in-situ ultrasound guidance display. The projection may project onto a screen, including a fog screen, switchable film, and UV-fluorescent glass, as almost-in-situ projection surfaces.
[00213] The projection system may make use of the geometry computed by the stereo system to correct for the curvature of the body when projecting information onto it.
[00214] The projection system may include outward-looking cameras to track the user to help correct visualization from geometric distortion or probe motion. This may also be used to solve the parallax problem when projecting in 3D.
[00215] The projection system may project a fixed pattern upwards onto the environment to support tracking with stereo cameras (limited degrees of freedom, depending on environment structure). The system may make use of 3D information that is computed from the projected pattern, it may make use of image appearance information that comes from objects in the world, or it may use both appearance and depth information. It may be useful to synchronize the projection in such a way that images with the pattern and without are obtained. Methods for performing 3D reference positioning using depth and intensity information are well known in the art.
[00216] The projector may make use of light-activated dyes that have been "printed on patient" or may contain an auxiliary controlled laser for this purpose.
[00217] Rather than relying on the patient surface as a projection surface, the projector might instead project onto other rigid or deformable objects in the workspace. For example, the camera may reconstruct a sheet of paper in space, and the projector could project the CT data of a preoperative scan onto the paper. As the paper is deformed the CT data would be altered to reflect the data that it would "slice through" if it were inside the body. This would allow the visualization of curved surfaces or curvilinear structures.
[00218] It is often the case that a patient is imaged multiple times, for example to provide guidance for radiative cancer therapy. In this case, the images around the target could be recorded, and, upon subsequent imaging, these images would be used to provide guidance on how to move the probe toward a desired target, and an indication when the previous imaging position is reached.
[00219] In order to improve the usability of these methods, the system may have an electronic or printable signature that records the essential targeting information in an easy-to- use way. This information may be loaded or scanned visually by the device itself when the patient is re-imaged.
[00220] An interesting use of the above method of probe and needle guidance is to make ultrasound treatment accessible for non-experts. This may include providing training for those learning about diagnostic or interventional ultrasound, or to make it possible for the general population to make use of ultrasound-based treatments for illness. These methods could also monitor the use of an imaging probe and/or needles etc. and indicate when the user is poorly trained.
[00221] An example of the application of the above would be to have an ultrasound system installed at a pharmacy, and to perform automated carotid artery examination by an unskilled user.
[00222] There are many other applications for these ideas that extend beyond ultrasound and medicine. For example, nondestructive inspection of a plane wing may use ultrasound or x-ray, but in either case requires exact guidance to the inspection location (e.g. a wing attachment) in question. The methods described above can provide this guidance. In a more common setting, the system could provide guidance for e.g. throwing darts, hitting a pool ball, or a similar game.
[00223] The embodiments illustrated and discussed in this specification are intended only to teach those skilled in the art the best way known to the inventors to make and use the invention. In describing embodiments of the invention, specific terminology is employed for the sake of clarity. However, the invention is not intended to be limited to the specific terminology so selected. The above-described embodiments of the invention may be modified or varied, without departing from the invention, as appreciated by those skilled in the art in light of the above teachings. It is therefore to be understood that, within the scope of the claims and their equivalents, the invention may be practiced otherwise than as specifically described.
[00224] Example 1: Ultrasound-guided Liver Ablation Therapy.
[00225] Recent evidence suggests thermal ablation in some cases can achieve results comparable to those of resection. Specifically, a recent randomized clinical trial comparing resection to RFA for small HCC found equivalent long-term outcomes with lower morbidity in the ablation arm [Chen-2006]. Importantly, most studies suggest that efficacy of RFA is highly dependent on the experience and diligence of the treating physician, often associated with a steep learning curve [Poon-2004]. Moreover, the apparent efficacy of open operative RFA over a percutaneous approach reported by some studies suggests that difficulty with targeting and imaging may be a contributing factor [Mulier-2005]. Studies of the failure patterns following RFA similarly suggest that limitations in real-time imaging, targeting, and monitoring of ablative therapy are likely contributing to increased risk of local recurrence [Mulier-2005].
[00226] One of the most useful features of ablative approaches such as RFA is that they can be applied using minimally invasive techniques. Length of hospital stay, costs, and morbidity may be reduced using this technique [Berber-2008]. These benefits add to the appeal of widening the application of local therapy for liver tumors to other tumor types, perhaps in combination with more effective systemic therapies for minimal residual disease. Improvements in the control, size, and speed of tumor destruction with RFA will allow treatment options to be reconsidered for such patients with liver tumors as well. However, the clinical outcomes data are clear - complete tumor destruction with adequate margins is imperative in order to achieve durable local control and survival benefit, and this should be the goal of any local therapy. Partial, incomplete, or palliative local therapy is rarely indicated. One study even suggested that incomplete destruction with residual disease may in fact be detrimental, stimulating tumor growth of locally residual tumor cells [Koichi-2008]. This concept is often underappreciated when considering tumor ablation, leading to a lack of recognition by some of the importance of precise and complete tumor destruction. Improved targeting, monitoring, and documentation of adequate ablation are critical to achieve this goal. Goldberg et al., in the most cited work on this subject [Goldberg-2000], describe an ablative therapy framework in which the key areas in advancing this technology include improving (1) image guidance, (2) intra-operative monitoring, as well as (3) ablation technology itself.
[00227] In spite of promising results of ablative therapies, significant technical barriers exist with regard to its efficacy, safety, and applicability to many patients. Specifically, these limitations include: (1) localization/targeting of the tumor and (2) monitoring of the ablation zone.
[00228] Targeting Limitations: One common feature of current ablative methodology is the necessity for precise placement of the end-effector tip in specific locations, typically within the volumetric center of the tumor, in order to achieve adequate destruction. The tumor and zone of surrounding normal parenchyma can then be ablated. Tumors are identified by preoperative imaging, primarily CT and MR, and then operatively (or laparoscopically) localized by intra-operative ultrasonography (IOUS). When performed percutaneously, trans-abdominal ultrasonography is most commonly used. Current methodology requires visual comparison of preoperative diagnostic imaging with real-time procedural imaging, often requiring subjective comparison of cross-sectional imaging to IOUS. Then, manual free-hand IOUS is employed in conjunction with free-hand positioning of the tissue ablator under ultrasound guidance. Target motion upon insertion of the ablation probe makes it difficult to localize appropriate placement of the therapy device with simultaneous target imaging. The major limitation of ablative approaches is the lack of accuracy in probe localization within the center of the tumor. This is particularly important, as histological margins cannot be assessed after ablation, as opposed to hepatic resection approaches [Koniaris-2000] [Scott-2001]. In addition, manual guidance often requires multiple passes and repositioning of the ablator tip, further increasing the risk of bleeding and tumor dissemination. In situations when the desired target zone is larger than the single ablation size (e.g. 5-cm tumor and 4-cm ablation device), multiple overlapping spheres are required in order to achieve complete tumor destruction. In such cases, the capacity to accurately plan multiple manual ablations is significantly impaired by the geometrically complex 3D planning required as well as image distortion artifacts from the first ablation, further reducing the targeting confidence and potential efficacy of the therapy. IOUS often provides excellent visualization of tumors and guidance for probe placement, but its 2D nature and dependence on the sonographer's skills limit its effectiveness [Wood-2000].
[00229] Improved real-time guidance for planning, delivery and monitoring of the ablative therapy would provide the missing tool needed to enable accurate and effective application of this promising therapy. Recent studies are beginning to identify reasons for diminished efficacy of ablative approaches, including size, location, operator experience, and technical approach [Mulier-2005] [van Duijnhoven-2006]. These studies suggest that device targeting and ablation monitoring are likely the key reasons for local failure. Also, due to gas bubbles, bleeding, or edema, IOUS images provide limited visualization of tumor margins or even the applicator electrode position during RFA [Hinshaw-2007].
[00230] The impact of radiological complete response on tumor targeting is an important emerging problem in liver-directed therapy. Specifically, this problem relates to the inability to identify the target tumor at the time of therapy. Effective combination systemic chemotherapeutic regimens are being used with increasing frequency prior to liver-directed therapy to treat potential micro-metastatic disease as a neo-adjuvant approach, particularly for colorectal metastases [Gruenberger-2008]. This allows the opportunity to use the liver tumor as a gauge to determine chemo-responsiveness as an aid to planning subsequent post-procedural chemotherapy. However, in such an approach, the target lesion often cannot be identified during the subsequent resection or ablation. We know that even when the index liver lesion is no longer visible, microscopic tumors are still present in more than 80% of cases [Benoist-2006]. Any potentially curative approach, therefore, still requires complete resection or local destruction of all original sites of disease. In such cases, the interventionalist can face the situation of contemplating a "blind" ablation in a region of the liver in which no imageable tumor can be detected. Therefore, without an ability to identify original sites of disease, preoperative systemic therapies may actually hinder the ability to achieve curative local targeting, paradoxically potentially worsening long-term survival. As proposed in this project, integrating a strategy for registration of the pre-chemotherapy cross-sectional imaging (CT) with the procedure-based imaging (IOUS) would provide invaluable information for ablation guidance.
[00231] Our system embodiments described in both Figure 1 and Figure 2 can be utilized in the above-mentioned application. With structured light attached to the ultrasound probe, the patient surface can be captured and digitized in real-time. Then, the doctor will select an area of interest to scan where he/she can observe a lesion either directly from the ultrasound images or indirectly from the fused pre-operative data. The fusion is performed by integrating both surface data from structured light and a few ultrasound images and can be updated in real-time without manual input from the user. Once the lesion is identified in the US probe space, the doctor can introduce the ablation probe, where the SLS system can easily segment, track, and localize the tool before it is inserted into the patient (Figure 9). The projector can be used to overlay real-time guidance information to help orient the tool and provide feedback about the needed insertion depth.
[00232] The abovementioned is the embodiment described in Figure 1. However, our invention includes many alternates, for example: 1) A time-of-flight camera can replace the SLS configuration to provide the surface data [Billings-2011] (Figure 10). In this embodiment, the ToF camera is not attached to the ultrasound probe, and an external tracker is used to track both components. A projector can still be attached to the ultrasound probe. 2) Another embodiment consists of an SLS or ToF camera to provide surface information and a projector attached to the ultrasound probe. The camera configuration, i.e. the SLS, should be able to extract surface data and track both the intervention tool and the probe surface, and hence can locate the needle in the US image coordinate frame. This embodiment requires offline calibration to estimate the transformation between the probe surface shape and the actual location of the ultrasound image. A projector can still be used to overlay needle location and visualize guidance information. 3) Furthermore, an embodiment can consist only of projectors and local sensors. Figure 7 describes a system composed of a pulsed laser projector to track an interventional tool in air and in tissue using the photoacoustic (PA) phenomenon [Boctor-2010]. Interventional tools can convert pulsed light energy into an acoustic wave that can be picked up by multiple acoustic sensors placed on the probe surface, to which we can then apply known triangulation algorithms to locate the needle. It is important to note that one can apply laser light directly to the needle, i.e. attach a fiber optic configuration to the needle end; the needle can also conduct the generated acoustic wave (i.e. acting like a wave-guide), and a fraction of this acoustic wave can propagate from the needle shaft and tip, and the PA signals, i.e. the acoustic signals generated, can be picked up by both sensors attached to the surface as well as the ultrasound array elements. In addition to the laser light projecting directly onto the needle, we can extend a few fibers to deposit light energy underneath the probe, and hence can track the needle inside the tissue (Figure 7).
[00233] One possible embodiment integrates an ultrasound probe with an endoscopic camera held in one endoscopic channel and the projector component connected in a separate channel. This projector can enable structured light, and the endoscopic camera performs surface estimation to help perform hybrid surface/ultrasound registration with a pre-operative modality. Optionally, the projector can be a pulsed laser projector that enables PA effects, and the ultrasound probe attached to the camera can generate PA images of the region of interest.
References
[00234] [Benoist-2006] Benoist S, Brouquet A, Penna C, Julie C, El Hajjam M, Chagnon S, Mitry E, Rougier P, Nordlinger B, "Complete response of colorectal liver metastases after chemotherapy: does it mean cure?" J Clin Oncol. 2006 Aug 20;24(24):3939-45.
[00235] [Berber-2008] Berber E, Tsinberg M, Tellioglu G, Simpfendorfer CH, Siperstein AE. Resection versus laparoscopic radiofrequency thermal ablation of solitary colorectal liver metastasis. J Gastrointest Surg. 2008 Nov;12(11):1967-72.
[00236] [Billings-2011] Billings S, Kapoor A, Wood BJ, Boctor EM, "A hybrid surface/image based approach to facilitate ultrasound/CT registration," accepted, SPIE Medical Imaging 2011.
[00237] [Boctor-2010] E. Boctor, S. Verma et al., "Prostate brachytherapy seed localization using combined photoacoustic and ultrasound imaging," SPIE Medical Imaging 2010.
[00238] [Chen-2006] Chen MS, Li JQ, Zheng Y, Guo RP, Liang HH, Zhang YQ, Lin XJ, Lau WY. A prospective randomized trial comparing percutaneous local ablative therapy and partial hepatectomy for small hepatocellular carcinoma. Ann Surg. 2006 Mar;243(3):321-8.
[00239] [Goldberg-2000] Goldberg SN, Gazelle GS, Mueller PR. Thermal ablation therapy for focal malignancy: a unified approach to underlying principles, techniques, and diagnostic imaging guidance. AJR Am J Roentgenol. 2000 Feb;174(2):323-31.
[00240] [Gruenberger-2008] Gruenberger B, Scheithauer W, Punzengruber R, Zielinski C, Tamandl D, Gruenberger T. Importance of response to neoadjuvant chemotherapy in potentially curable colorectal cancer liver metastases. BMC Cancer. 2008 Apr 25;8:120.
[00241] [Hinshaw-2007] Hinshaw JL, et al., Multiple-Electrode Radiofrequency Ablation of Symptomatic Hepatic Cavernous Hemangioma, Am. J. Roentgenol., Vol. 189, Issue 3, W-149, September 1, 2007.
[00242] [Koichi-2008] Koichi O, Nobuyuki M, Masaru O, et al., "Insufficient radiofrequency ablation therapy may induce further malignant transformation of hepatocellular carcinoma," Journal of Hepatology International, Volume 2, Number 1, March 2008, pp 116-123.
[00243] [Koniaris-2000] Koniaris LG, Chan DY, Magee C, Solomon SB, Anderson JH, Smith DO, DeWeese T, Kavoussi LR, Choti MA, "Focal hepatic ablation using interstitial photon radiation energy," J Am Coll Surg. 2000 Aug;191(2):164-74.
[00244] [Mulier-2005] Mulier S, Ni Y, Jamart J, Ruers T, Marchal G, Michel L. Local recurrence after hepatic radiofrequency coagulation: multivariate meta-analysis and review of contributing factors. Ann Surg. 2005 Aug;242(2):158-71.
[00245] [Poon-2004] Poon RT, Ng KK, Lam CM, Ai V, Yuen J, Fan ST, Wong J. Learning curve for radiofrequency ablation of liver tumors: prospective analysis of initial 100 patients in a tertiary institution. Ann Surg. 2004 Apr;239(4):441-9.
[00246] [Scott-2001] Scott DJ, Young WN, Watumull LM, Lindberg G, Fleming JB, Huth JF, Rege RV, Jeyarajah DR, Jones DB, "Accuracy and effectiveness of laparoscopic vs open hepatic radiofrequency ablation," Surg Endosc. 2001 Feb;15(2):135-40.
[00247] [van Duijnhoven-2006] van Duijnhoven FH, Jansen MC, Junggeburt JM, van Hillegersberg R, Rijken AM, van Coevorden F, van der Sijp JR, van Gulik TM, Slooter GD, Klaase JM, Putter H, Tollenaar RA, "Factors influencing the local failure rate of radiofrequency ablation of colorectal liver metastases," Ann Surg Oncol. 2006 May;13(5):651-8. Epub 2006 Mar 17.
[00248] [Wood-2000] Wood TF, Rose DM, Chung M, Allegra DP, Foshag LJ, Bilchik AJ, "Radiofrequency ablation of 231 unresectable hepatic tumors: indications, limitations, and complications," Ann Surg Oncol. 2000 Sep;7(8):593-600.
[00249] Example 2: Monitoring Neo-adjuvant chemotherapy using Advanced Ultrasound Imaging
[00250] Of the more than two hundred thousand women diagnosed with breast cancer every year, about 10% will present with locally advanced disease [Valero-1996]. Primary chemotherapy (a.k.a. neo-adjuvant chemotherapy, NAC) is quickly replacing adjuvant (postoperative) chemotherapy as the standard in the management of these patients. In addition, NAC is often administered to women with operable stage II or III breast cancer [Kaufmann-2006]. The benefit of NAC is twofold. First, NAC has the ability to increase the rate of breast-conserving therapy. Studies have shown that more than fifty percent of women who would otherwise be candidates for mastectomy only become eligible for breast-conserving therapy because of NAC-induced tumor shrinkage [Hortabagyi-1988, Bonadonna-1998]. Second, NAC allows in vivo chemo-sensitivity assessment. The ability to detect early drug resistance will prompt a change from an ineffective to an effective regimen. Consequently, physicians may decrease toxicity and perhaps improve outcome. The metric most commonly used to determine in-vivo efficacy is the change in tumor size during NAC.
[00251] Unfortunately, the clinical tools used to measure tumor size during NAC, such as physical exam, mammography, and B-mode ultrasound, have been shown to be less than ideal. Researchers have shown that post-NAC tumor size estimates by physical exam, ultrasound, and mammography, when compared to pathologic measurements, have correlation coefficients of 0.42, 0.42, and 0.41, respectively [Chagpar-2006]. MRI and PET appear to be more predictive of response to NAC; however, these modalities are expensive, inconvenient and, with respect to PET, impractical for serial use due to excessive radiation exposure [Smith-2000, Rosen-2003, Partridge-2002]. What is needed is an inexpensive, convenient, and safe technique capable of accurately measuring tumor response repeatedly during NAC.
[00252] Ultrasound is a safe modality that easily lends itself to serial use. However, the system most commonly in medical use, B-mode ultrasound, does not appear to be sensitive enough to determine subtle changes in tumor size. Accordingly, USEI has emerged as a potentially useful augmentation to conventional ultrasound imaging. USEI has been made possible by two discoveries: (1) different tissues may have significant differences in their mechanical properties, and (2) the information encoded in the coherent scattering (a.k.a. speckle) may be sufficient to calculate these differences following a mechanical stimulus [Ophir-1991]. An array of parameters, such as velocity of vibration, displacement, strain, velocity of wave propagation, and elastic modulus, have been successfully estimated [Konofagou-2004, Greenleaf-2003], which has made it possible to delineate stiffer tissue masses such as tumors [Hall-2002, Lyshchik-2005, Purohit-2003] and ablated lesions [Varghese-2004, Boctor-2005]. Breast cancer detection is the first [Garra-1997] and most promising [Hall-2003] application of USEI.
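A minimal, illustrative sketch of the underlying strain estimation (not the specific algorithms of the cited works) is given below: axial displacement between pre- and post-compression RF lines is estimated by windowed normalized cross-correlation, and strain is taken as the gradient of the displacement estimate. Window and search sizes are placeholder values.

```python
import numpy as np

def axial_strain(rf_pre, rf_post, win=64, step=32, search=16):
    """Estimate axial strain along one RF line.

    rf_pre, rf_post : 1-D RF signals of equal length (pre/post compression)
    Returns (window_centers, strain) as NumPy arrays.
    """
    disp, centers = [], []
    for start in range(search, len(rf_pre) - win - search, step):
        ref = rf_pre[start:start + win]
        best_lag, best_cc = 0, -np.inf
        for lag in range(-search, search + 1):            # exhaustive lag search
            seg = rf_post[start + lag:start + lag + win]
            a, b = ref - ref.mean(), seg - seg.mean()
            cc = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
            if cc > best_cc:
                best_cc, best_lag = cc, lag
        disp.append(best_lag)                             # displacement in samples
        centers.append(start + win // 2)
    disp = np.asarray(disp, float)
    centers = np.asarray(centers, float)
    strain = np.gradient(disp, centers)                   # d(displacement)/d(depth)
    return centers, strain
```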
[00253] An embodiment for this application uses an ultrasound probe and an SLS configuration attached to an external passive arm. We can track both the SLS and the ultrasound probe using an external tracking device, or simply use the SLS configuration to track the probe with respect to the SLS's own reference frame. On day one, we place the probe on the region of interest; the SLS configuration captures the breast surface and the ultrasound probe surface and provides substantial input for the following tasks: 1) the US probe can be tracked, and hence a 3D US volume can be reconstructed from 2D images (when the US probe is a 2D probe), or the resulting small volumes from a 3D probe can be stitched together to form a panoramic volume; 2) the US probe can be tracked during an elastography scan, and this tracking information can be integrated into the EI algorithm to enhance image quality [Foroughi-2010] (Figure 11); and 3) registration between the ultrasound probe's location in the first treatment session and subsequent sessions can be easily recovered using the SLS surface information for both the US probe and the breast (as shown in Figure 12).
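A minimal sketch of task 1), compounding tracked 2D frames into a 3D volume, is given below for illustration only; the pose format (a 4x4 image-to-world transform per frame, e.g., obtained from the SLS tracker after probe calibration), pixel spacing, and volume geometry are assumptions.

```python
import numpy as np

def compound_volume(frames, poses, px_size_mm, vol_shape, vox_mm, vol_origin):
    """Mean-compound tracked 2-D ultrasound frames into a 3-D volume.

    frames  : list of (H,W) images
    poses   : list of 4x4 transforms mapping image coords (mm, z=0) to world (mm)
    """
    vol = np.zeros(vol_shape, np.float32)
    cnt = np.zeros(vol_shape, np.uint32)
    for img, T in zip(frames, poses):
        h, w = img.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        pts = np.stack([u.ravel() * px_size_mm, v.ravel() * px_size_mm,
                        np.zeros(u.size), np.ones(u.size)])        # homogeneous pixels
        world = (T @ pts)[:3]                                       # 3xN world coords [mm]
        idx = np.round((world.T - vol_origin) / vox_mm).astype(int) # voxel indices
        ok = np.all((idx >= 0) & (idx < np.array(vol_shape)), axis=1)
        ix, iy, iz = idx[ok].T
        np.add.at(vol, (ix, iy, iz), img.ravel()[ok])               # accumulate intensities
        np.add.at(cnt, (ix, iy, iz), 1)                             # and hit counts
    return vol / np.maximum(cnt, 1)                                 # mean compounding
```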
References
[00254] [Boctor-2005] Boctor EM, DeOliviera M, Awad M, Taylor RH, Fichtinger G, Choti MA, Robot-assisted 3D strain imaging for monitoring thermal ablation of liver, Annual congress of the Society of American Gastrointestinal Endoscopic Surgeons, pp 240-241, 2005.
[00255] [Bonadonna-1998] Bonadonna G, Valagussa P, Brambilla C, Ferrari L, Moliterni A, Terenziani M, Zambetti M, "Primary chemotherapy in operable breast cancer: eight-year experience at the Milan Cancer Institute," J Clin Oncol 1998 Jan;16(1):93-100.
[00256] [Chagpar-2006] Chagpar A, et al., "Accuracy of Physical Examination, Ultrasonography and Mammography in Predicting Residual Pathologic Tumor Size in Patients Treated with Neoadjuvant Chemotherapy," Annals of Surgery, Vol. 243, Number 2, February 2006.
[00257] [Greenleaf-2003] Greenleaf JF, Fatemi M, Insana M. Selected methods for imaging elastic properties of biological tissues. Annu Rev Biomed Eng. 2003;5:57-78.
[00258] [Hall-2002] Hall TJ, Yanning Zhu, Spalding CS, "In vivo real-time freehand palpation imaging," Ultrasound Med Biol. 2003 Mar;29(3):427-35.
[00259] [Konofagou-2004] Konofagou EE. Quo vadis elasticity imaging? Ultrasonics. 2004 Apr;42(1-9):331-6.
[00260] [Lyshchik-2005] Lyshchik A, Higashi T, Asato R, Tanaka S, Ito J, Mai JJ, Pellot-Barakat C, Insana MF, Brill AB, Saga T, Hiraoka M, Togashi K. Thyroid gland tumor diagnosis at US elastography. Radiology. 2005 Oct;237(1):202-11.
[00261] [Ophir-1991] Ophir J, Cespedes EI, Ponnekanti H, Yazdi Y, Li X: Elastography: a quantitative method for imaging the elasticity of biological tissues. Ultrasonic Imag., 13:111-134, 1991.
[00262] [Partridge-2002] Partridge SC, Gibbs JE, Lu Y, Esserman LJ, Sudilovsky D, Hylton NM, "Accuracy of MR imaging for revealing residual breast cancer in patients who have undergone neoadjuvant chemotherapy," AJR Am J Roentgenol. 2002 Nov;179(5):1193-9.
[00263] [Purohit-2003] Purohit RS, Shinohara K, Meng MV, Carroll PR. Imaging clinically localized prostate cancer. Urol Clin North Am. 2003 May;30(2):279-93.
[00264] [Rosen-2003] Rosen EL, Blackwell KL, Baker JA, Soo MS, Bentley RC, Yu D, Samulski TV, Dewhirst MW, "Accuracy of MRI in the detection of residual breast cancer after neoadjuvant chemotherapy," AJR Am J Roentgenol. 2003 Nov;181(5):1275-82.
[00265] [Smith-2000] Smith IC, Welch AE, Hutcheon AW, Miller ID, Payne S, Chilcott F, Waikar S, Whitaker T, Ah-See AK, Eremin O, Heys SD, Gilbert FJ, Sharp PF, "Positron emission tomography using [(18)F]-fluorodeoxy-D-glucose to predict the pathologic response of breast cancer to primary chemotherapy," J Clin Oncol. 2000 Apr;18(8):1676-88.
[00266] [Valero-1996] Valero V, Buzdar AU, Hortobagyi GN, "Locally Advanced Breast Cancer," Oncologist. 1996;1(1 & 2):8-17.
[00267] [Varghese-2004] Varghese T, Shi H. Elastographic imaging of thermal lesions in liver in-vivo using diaphragmatic stimuli. Ultrason Imaging. 2004 Jan;26(1):18-28.
[00268] [Foroughi-2010] P. Foroughi, H. Rivaz, I. N. Fleming, G. D. Hager, and E. Boctor, "Tracked Ultrasound Elastography (TrUE)," in Medical Image Computing and Computer Integrated Surgery, 2010.
[00269] Example 3: Ultrasound Imaging Guidance for Laparoscopic Partial Nephrectomy
[00270] Kidney cancer is the most lethal of all genitourinary tumors, resulting in greater than 13,000 deaths in 2008 out of 55,000 new cases diagnosed [61]. Further, the rate at which kidney cancer is diagnosed is increasing [1,2,62]. "Small" localized tumors currently represent approximately 66% of new diagnoses of renal cell carcinoma [63].
[00271] Surgery remains the current gold standard for treatment of localized kidney tumors, although alternative therapeutic approaches including active surveillance and emerging ablative technologies [5] exist. Five year cancer-specific survival for small renal tumors treated surgically is greater than 95% [3,4]. Surgical treatments include simple nephrectomy (removal of the kidney), radical nephrectomy (removal of the kidney, adrenal gland, and some surrounding tissue) and partial nephrectomy (removal of the tumor and a small margin of surrounding tissue, but leaving the rest of the kidney intact). More recently, a laparoscopic option for partial nephrectomy (LPN) has been developed with apparently equivalent cancer control results compared to the open approach [9,10]. The benefits of the laparoscopic approach are improved cosmesis, decreased pain, and improved convalescence relative to the open approach.
[00272] Although a total nephrectomy will remove the tumor, it can have serious consequences for patients whose other kidney is damaged or missing, or who are otherwise at risk of developing severely compromised kidney function. This is significant given the prevalence of risk factors for chronic renal failure, such as diabetes and hypertension, in the general population [7,8]. Partial nephrectomy has been shown to be oncologically equivalent to total nephrectomy for treatment of renal tumors less than 4 cm in size (e.g., [3,6]). Further, data suggest that patients undergoing partial nephrectomy for treatment of their small renal tumor enjoy a survival benefit compared to those undergoing radical nephrectomy [12-14]. A recent study utilizing the Surveillance, Epidemiology and End Results cancer registry identified 2,991 patients older than 66 years who were treated with either radical or partial nephrectomy for renal tumors <4 cm [12]. Radical nephrectomy was associated with an increased risk of overall mortality (HR 1.38, p <0.01) and a 1.4 times greater number of cardiovascular events after surgery compared to partial nephrectomy.
[00273] Despite the advantages in outcomes, partial nephrectomies are performed in only 7.5% of cases [11]. One key reason for this disparity is the technical difficulty of the procedure. The surgeon must work very quickly to complete the resection, perform the necessary anastomoses, and restore circulation before the kidney is damaged. Further, the surgeon must know where to cut to ensure cancer-free resection margins while still preserving as much good kidney tissue as possible. In performing the resection, the surgeon must rely on memory and visual judgment to relate preoperative CT and other information to the physical reality of the patient's kidney. These difficulties are greatly magnified when the procedure is performed laparoscopically, due to the reduced dexterity associated with the instruments and the reduced visualization from the laparoscope.
[00274] We devised two embodiments to address this technically challenging intervention. Figure 13 shows the first system, in which an SLS component is held on a laparoscopic arm, together with a laparoscopic ultrasound probe and an external tracking device that tracks both the US probe and the SLS [Stolka-2010]. However, we do not need to rely on an external tracking device, since we have access to an SLS configuration: the SLS can scan the kidney surface and the probe surface and track both the kidney and the US probe. Furthermore, our invention is concerned with hybrid surface/ultrasound registration. In this embodiment, the SLS scans the kidney surface, and together with a few ultrasound images a reliable registration with preoperative data can be performed; an augmented visualization, similar to the one shown in Figure 13, can then be displayed using the attached projector.
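For illustration only, the hybrid surface/ultrasound registration mentioned above could be seeded with a rigid iterative closest point (ICP) alignment between the intraoperative SLS surface points and a preoperative (e.g., CT-derived) surface model; in this sketch the names are hypothetical, and a few ultrasound-derived points could be appended to the source cloud to constrain depth.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid(src, dst):
    """Closed-form least-squares rigid transform (R, t) such that dst ~= src @ R.T + t."""
    cs, cd = src.mean(0), dst.mean(0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src_pts, model_pts, iters=30):
    """Align SLS surface points (src_pts) to a preoperative surface model (model_pts)."""
    tree = cKDTree(model_pts)
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        moved = src_pts @ R.T + t
        _, nn = tree.query(moved)            # nearest model point for each source point
        dR, dt = best_rigid(moved, model_pts[nn])
        R, t = dR @ R, dR @ t + dt           # accumulate the incremental transform
    return R, t
```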
[00275] The second embodiment is shown in Figure 14, where an ultrasound probe is located outside the patient and faces directly toward the superficial side of the kidney. Internally, a laparoscopic tool holds an SLS configuration. The SLS system provides kidney surface information in real time, and the 3DUS also images the same surface (the tissue-air interface). By applying surface-to-surface registration, the ultrasound volume can be easily registered to the SLS reference frame. In a different embodiment, registration can also be performed using the photoacoustic effect (Figure 15). Typically, the projector in the SLS configuration can be a pulsed laser projector with a fixed pattern. Photoacoustic signals will be generated at specified points, which form a known, calibrated pattern. The ultrasound imager can detect the PA signals from these points. Then a straightforward point-to-point registration can be performed to establish real-time registration between the camera/projector space and the ultrasound space.
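The point-to-point registration between the calibrated laser-spot pattern (camera/projector space) and the photoacoustically detected points (ultrasound space) can be computed in closed form; the following illustrative sketch uses the standard SVD-based least-squares fit and also reports a residual error that could serve as a real-time registration check. Names and interfaces are assumptions.

```python
import numpy as np

def register_points(pattern_cam, pa_us):
    """Matched (N,3) point sets; returns (R, t, rms) with pa_us ~= pattern_cam @ R.T + t."""
    cs, cd = pattern_cam.mean(0), pa_us.mean(0)
    U, _, Vt = np.linalg.svd((pattern_cam - cs).T @ (pa_us - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    resid = pattern_cam @ R.T + t - pa_us    # per-point registration residuals
    return R, t, float(np.sqrt((resid ** 2).sum(1).mean()))
```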
[00276] C-arm-guided Interventional Application
[00277] The projection data truncation problem is a common issue with reconstructed CT and C-arm images. This problem appears clearly near the image boundaries. Truncation is a result of the incomplete data set obtained from the CT/C-arm modality. An algorithm to overcome this truncation error has been developed [Xu-2010]. In addition to the projection data, this algorithm requires the patient contour in 3D space with respect to the X-ray detector. This contour is used to generate the trust region required to guide the reconstruction method. A simulation study on a digital phantom was performed [Xu-2010] to reveal the enhancement achieved by the new method. However, a practical way to obtain the trust region has to be developed. Figure 3 and Figure 4 present novel practical embodiments to track and to obtain the patient contour information, and consequently the trust region, at each view angle of the scan. The trust region is used to guide the reconstruction method [Ismail-2011].
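As an illustrative sketch only (not the cited reconstruction algorithm itself), the trust region for one view angle could be obtained by projecting the 3D patient contour, e.g. from an SLS or ToF surface scan, through the C-arm projection matrix and filling the resulting outline on the detector; the projection-matrix format and detector size are assumptions.

```python
import numpy as np
from scipy.spatial import ConvexHull
from matplotlib.path import Path

def trust_region_mask(contour_xyz, P, det_shape):
    """contour_xyz: (N,3) patient contour points; P: (3,4) projection matrix;
    det_shape: (rows, cols) of the detector. Returns a boolean trust-region mask."""
    X = np.c_[contour_xyz, np.ones(len(contour_xyz))]           # homogeneous 3D points
    uvw = X @ P.T
    uv = uvw[:, :2] / uvw[:, 2:3]                                # detector pixel coordinates
    hull = uv[ConvexHull(uv).vertices]                           # outline of projected contour
    h, w = det_shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    inside = Path(hull).contains_points(np.c_[u.ravel(), v.ravel()])
    return inside.reshape(h, w)                                  # True inside the trust region
```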
[00278] It is known that X-ray is not an ideal modality for soft-tissue imaging. Recent C-arm interventional systems are equipped with flat-panel detectors and can perform cone-beam reconstruction. The reconstruction volume can be used to register intraoperative X-ray data to pre-operative MRI. Typically, a couple of hundred X-ray shots need to be taken in order to perform the reconstruction task. Our novel embodiments are capable of performing surface-to-surface registration by utilizing real-time, intraoperative surfaces from SLS, ToF, or similar surface scanner sensors; hence, a reduction in X-ray dosage is achieved. Nevertheless, if there is a need to fine-tune the registration, a few X-ray images can be integrated into the overall framework.
[00279] It is obvious that, similar to the US navigation examples and methods described before, the SLS component configured and calibrated to a C-arm can also track interventional tools, and the attached projector can provide real-time visualization.
[00280] Furthermore, an ultrasound probe can be easily introduced into the C-arm scene without adding to or changing the current setup. The SLS configuration is capable of tracking the US probe. It is important to note that in many pediatric interventional applications there is a need to integrate an ultrasound imager into the C-arm suite. In these scenarios, the SLS configuration can be attached to the C-arm, to the ultrasound probe, or separately to an arm. This ultrasound/C-arm system can consist of more than one SLS configuration, or a combination of these sensors. For example, one or more cameras can be fixed to the C-arm while the projector is attached to the US probe.
[00281] Finally, our novel embodiment can provide quality control for the C-arm calibration. A C-arm is moving equipment and cannot be considered a rigid body; i.e., there is a small rocking/vibrating motion that needs to be measured/calibrated at the manufacturing site, and these numbers are used for compensation during reconstruction. If a faulty condition occurs that alters this calibration, the manufacturer needs to be informed to re-calibrate the system. Such faulty conditions are hard to detect, and repeated QC calibration is also infeasible and expensive. Our accurate surface tracker should be able to determine the motion of the C-arm and continuously, in the background, compare it to the factory calibration. Once a faulty condition happens, our system should be able to discover it and possibly correct it.
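A minimal illustrative sketch of such a background check is given below: the pose of the C-arm observed by the surface tracker at each gantry angle is compared against the factory calibration table, and angles exceeding placeholder tolerances are flagged. Pose formats and thresholds are assumptions.

```python
import numpy as np

def calibration_drift(observed, factory, trans_tol_mm=1.0, rot_tol_deg=0.5):
    """observed, factory: dicts {angle_deg: 4x4 pose matrix}.
    Returns a list of (angle, translation_error_mm, rotation_error_deg) that exceed tolerance."""
    flagged = []
    for ang, T_obs in observed.items():
        T_cal = factory[ang]
        D = np.linalg.inv(T_cal) @ T_obs                  # residual pose error
        d_trans = np.linalg.norm(D[:3, 3])                # translational deviation
        cos_theta = np.clip((np.trace(D[:3, :3]) - 1) / 2, -1.0, 1.0)
        d_rot = np.degrees(np.arccos(cos_theta))          # rotational deviation
        if d_trans > trans_tol_mm or d_rot > rot_tol_deg:
            flagged.append((ang, d_trans, d_rot))
    return flagged
```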
References
[00282] [Jemal-2007] Jemal A, Siegel R, Ward E, Murray T, Xu J, Thun MJ. Cancer statistics, 2007. CA Cancer J Clin 2007 Jan-Feb;57(1):43-66.
[00283] 2. [Volpe-2004] Volpe A, Panzarella T, Rendon RA, Haider MA, Kondylis FI, Jewett MA. The natural history of incidentally detected small renal masses. Cancer 2004 Feb 15;100(4):738-45.
[00284] 3. [Fergany-2000] Fergany AF, Hafez KS, Novick AC. Long-term results of nephron sparing surgery for localized renal cell carcinoma: 10-year followup. J Urol 2000 Feb;163(2):442-5.
[00285] 4. [Hafez-1999] Hafez KS, Fergany AF, Novick AC. Nephron sparing surgery for localized renal cell carcinoma: impact of tumor size on patient survival, tumor recurrence and TNM staging. J Urol 1999 Dec;162(6):1930-3.
[00286] 5. [Kunkle-2008] Kunkle DA, Egleston BL, Uzzo RG. Excise, ablate or observe: the small renal mass dilemma—a meta-analysis and review. J Urol 2008 Apr;179(4):1227-33; discussion 33-4.
[00287] 6. [Leibovich-2004] Leibovich BC, Blute ML, Cheville JC, Lohse CM, Weaver AL, Zincke H. Nephron sparing surgery for appropriately selected renal cell carcinoma between 4 and 7 cm results in outcome similar to radical nephrectomy. J Urol 2004 Mar;171(3):1066-70.
[00288] 7. [Coresh-2007] Coresh J, Selvin E, Stevens LA, Manzi J, Kusek JW, Eggers P, et al. Prevalence of chronic kidney disease in the United States. JAMA 2007 Nov 7;298(17):2038-47.
[00289] 8. [Bijol-2006] Bijol V, Mendez GP, Hurwitz S, Rennke HG, Nose V. Evaluation of the nonneoplastic pathology in tumor nephrectomy specimens: predicting the risk of progressive renal failure. Am J Surg Pathol 2006 May;30(5):575-84.
[00290] 9. [Allaf-2004] Allaf ME, Bhayani SB, Rogers C, Varkarakis I, Link RE, Inagaki T, et al. Laparoscopic partial nephrectomy: evaluation of long-term oncological outcome. J Urol 2004 Sep;172(3):871-3.
[00291] 10. [Moinzadeh-2006] Moinzadeh A, Gill IS, Finelli A, Kaouk J, Desai M. Laparoscopic partial nephrectomy: 3-year followup. J Urol 2006 Feb;175(2):459-62.
[00292] 11. [Hollenbeck-2006] Hollenbeck BK, Taub DA, Miller DC, Dunn RL, Wei JT. National utilization trends of partial nephrectomy for renal cell carcinoma: a case of underutilization? Urology 2006 Feb;67(2):254-9.
[00293] 12. [Huang-2009] Huang WC, Elkin EB, Levey AS, Jang TL, Russo P. Partial nephrectomy versus radical nephrectomy in patients with small renal tumors—is there a difference in mortality and cardiovascular outcomes? J Urol 2009 Jan;181(1):55-61; discussion 61-2.
[00294] 13. [Thompson-2008] Thompson RH, Boorjian SA, Lohse CM, Leibovich BC, Kwon ED, Cheville JC, et al. Radical nephrectomy for pT1a renal masses may be associated with decreased overall survival compared with partial nephrectomy. J Urol 2008 Feb;179(2):468-71; discussion 72-3.
[00295] 14. [Zini-2009] Zini L, Perrotte P, Capitanio U, Jeldres C, Shariat SF, Antebi E, et al. Radical versus partial nephrectomy: effect on overall and noncancer mortality. Cancer 2009 Apr 1;115(7):1465-71.
[00296] 15. [Stolka-2010] Stolka PJ, Keil M, Sakas G, McVeigh ER, Taylor RH, Boctor EM, "A 3D-elastography-guided system for laparoscopic partial nephrectomies," SPIE Medical Imaging 2010 (San Diego, CA/USA).
[00297] 61. [Jemal-2008] Jemal A, Siegel R, Ward E, et al. Cancer statistics, 2008. CA Cancer J Clin 2008;58:71-96.
[00298] 62. [Hock-2002] Hock L, Lynch J, Balaji K. Increasing incidence of all stages of kidney cancer in the last 2 decades in the United States: an analysis of surveillance, epidemiology and end results program data. J Urol 2002;167:57-60.
[00299] 63. [Volpe-2005] Volpe A, Jewett M. The natural history of small renal masses. Nat Clin Pract Urol 2005;2:384-390.
[00300] [Ismail-2011] Ismail MM, Taguchi K, Xu J, Tsui BM, Boctor E, "3D-guided CT reconstruction using time-of-flight camera," accepted in SPIE Medical Imaging 2011.
[00301] [Xu-2010] Xu J, Taguchi K, Tsui BMW, "Statistical Projection Completion in X-ray CT Using Consistency Conditions," IEEE Transactions on Medical Imaging, vol. 29, no. 8, pp. 1528-1540, Aug. 2010.

Claims

We Claim:
1. A system for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
a body imaging system configured to image a body along an image plane, a camera configured to observe a region of imaging during operation of said imaging system, and
a projector aligned with said image plane,
wherein said camera is aligned off-axis relative to said image plane;
wherein said projector is configured to project an image that indicates a location of the image plane;
wherein an image recorded by said camera is displayed on a screen, and
wherein the system is configured to superimpose guidance information on the camera display such that intersection of the image plane and the plane formed by the superimposed guidance forms a line that corresponds to a desired trajectory of the instrument.
2. A system according to claim 1, wherein the display screen is configured to display the image produced by the imaging system together with the image recorded by said camera.
3. A method for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
imaging a body along an image plane;
monitoring a region of said imaging using a camera;
wherein said projector is aligned with said image plane;
wherein said camera is aligned off-axis relative to said image plane;
wherein the projector projects an image that indicates a location of the image plane; wherein an image recorded by said camera is displayed on a screen, and
wherein guidance information is superimposed on the camera display such that intersection of the image plane and the plane formed by the superimposed guidance information forms a line that corresponds to a desired trajectory of the instrument.
4. A system for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
a body imaging system configured to image a body along an image plane, a camera configured to observe a region of imaging during operation of said imaging system, and aligned with said image plane, and
a projector aligned off-axis relative to said image plane;
wherein an image recorded by said camera is displayed on a screen, and
wherein information indicating a location of the image plane is superimposed on the displayed camera image;
wherein the projection system projects a line corresponding to a desired instrument trajectory.
5. A method for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
imaging a body along an image plane;
monitoring a region of said imaging using a camera;
wherein said camera is aligned with said image plane;
displaying on a screen an image recorded by said camera, and
superimposing on said displayed camera image information indicating a location of the image plane;
projecting onto said projected image a line corresponding to a desired instrument trajectory;
wherein said projected image is projected from a projector that is aligned off-axis relative to said image plane.
6. A system for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
a body imaging system configured to image a body along an image plane, a camera configured to observe a region of imaging during operation of said imaging system, and
a projector configured to project an image, wherein one of said camera and said projector is aligned with said image plane, wherein another of said camera and said projector is aligned off-axis relative to said image plane;
wherein said projector is configured to project an image corresponding to a calculated location of a shadow cast by said instrument when said instrument is located in a proper location and orientation for said procedure.
7. A method for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
imaging a body along an image plane;
monitoring a region of said imaging using a camera;
wherein one of said camera and said projector is aligned with said image plane;
wherein another of said camera and said projector is aligned off-axis relative to said image plane;
calculating a location of a shadow that would be cast by an instrument located in a proper location and orientation for said procedure; and
projecting an image corresponding to said calculated location.
8. A method according to claim 7, wherein said projected image corresponding to said calculated location is a line.
9. A method according to claim 7 wherein said projected image corresponding to said calculated location is a line of varying thickness.
10. A method according to claim 7 wherein said projected image corresponding to said calculated location can comprise colors, with different colors indicating degrees of more and less precision relative to a most preferred location for the instrument.
11. A method according to claim 7 wherein said projected image corresponding to said calculated location is part of a structured light pattern.
12. A system for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
a body imaging system configured to image a body along an image plane, a first camera configured to observe a region of imaging during operation of said imaging system, and
a second camera configured to observe a region of said imaging during operation of said imaging system, said first and second cameras configured and located for stereoscopic observation of said region of imaging,
said first and second cameras configured to actively track the location and movement of said instrument, and
said system configured to display guidance information concerning the location and orientation of the instrument relative to a desired location and orientation of the instrument for said procedure.
13. A system according to claim 12, wherein said guidance information is displayed on a display screen.
14. A system according to claim 12, wherein said guidance information is projected onto the patient.
15. A system according to claim 15, wherein said guidance information is displayed on a display screen and projected onto the patient.
16. A system according to claim 12, wherein said guidance information is projected as an overlay on a projection of the image captured by the imaging system.
17. A system according to claim 12, wherein the guidance information is selected from the group consisting of proximity markers, target markers, alignment markers, and area demarcations.
18. A system according to claim 12, wherein the guidance information is registered to the projected image or to the subject.
19. A system according to claim 12, wherein the guidance information is location independent.
20. A system according to claim 12, further configured to provide auditory cues to the user relating to instrument location and orientation.
21. A system according to claim 12, further configured to project guidance concerning placement of the imaging system.
22. A system according to claim 12, wherein images acquired by the imaging system are registered with images recorded with the camera, and the projector uses said registered image to project guidance information for improved visualization of a selected target.
23. A system according to any one of claims 1, 4, 6 and 12, configured to reference stored prior images of said body to provide guidance concerning placement of the imaging system for optimal imaging of a desired target.
24. A system according to any one of claims 1, 4, 6 and 12, configured to display said guidance information superimposed over live images from said imaging system.
25. A system according to any one of claims 1, 4, 6 and 12, wherein said guidance information is selected from the group consisting of cross hairs, extrapolated needle location and orientation lines, and paired symbols that change size, color and/or relative position based on a targeting error vector.
26. A system according to any one of claims 1, 4, 6 and 12, configured to superimpose alignment lines corresponding to desired instrument location and orientation over single, stereo or multiple camera views, and configured to project instrument alignment lines denoting current instrument location and orientation.
27. A system according to any one of claims 1, 4, 6 and 12, wherein said projected guidance information is configured to improve the ease of instrument alignment by the user.
28. A system according to any one of claims 1, 4, 6 and 12, wherein guidance information takes the form of a line punctuated by a series of circles, discs or ellipses centered over said line.
29. A system according to any one of claims 1, 4, 6 and 12, wherein a thickness of said guidelines may be varied based on detected instrument dimensions, distance to the projector, or distance to the body.
30. A system for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
a body imaging system configured to image a body along an image plane, a first camera configured to observe a region of imaging during operation of said imaging system,
a second camera configured to observe a region of said imaging during operation of said imaging system, said first and second cameras configured and located for stereoscopic observation of said region of imaging, and
a projector assembly configured to project, from two different angles, an image from said imaging system,
said first and second cameras configured to actively track the location and movement of said instrument, and
said system configured to display guidance information concerning the location and orientation of the instrument relative to a desired location and orientation of the instrument for said procedure using intersecting shadows.
31. A system according to claim 30, wherein said projector assembly comprises two projectors.
32. A system according to claim 30, wherein said projector assembly comprises a single projector, a beam splitter, and a plurality of mirrors to generate multiple projections of the same image.
33. A method for calibrating an imaging system, a projector, and a pair of cameras, comprising:
projecting a pattern onto a planar target,
observing the planar target by the cameras,
simultaneously generating a plurality of images of the planar target with the imaging system;
using features on the planar target to produce a calibration for the cameras, using the calibration to calculate the position of the target plane in space, and computing the relative position of the cameras and the imaging system by processing the multiple lines representing the intersection of the plurality of image planes with the target plane.
34. A method for calibrating an imaging system, and a camera, comprising:
observing the planar target by the camera,
imaging a known geometry with said imaging system;
using features of the planar target and the known geometry to compute the relative positions of the camera and the imaging system.
35. A method according to claim 34, wherein the relative positions of said planar target and said known geometry are known.
36. A method according to claim 34, wherein the known geometry is a double wedge, and the planar target is connected, directly or indirectly, to the double wedge.
37. A method for calibrating an imaging system with a camera system comprising:
imaging a complex volume while simultaneously recording the volume with the camera system,
generating surface models of the volume from each of the imaging system and the camera system,
registering said surface models to a computational model of the volume, and computing the relative position of the cameras and the imaging system.
38. A method for calibrating an imaging system with a camera system comprising:
simultaneously recording the rupture of microcapsules using said imaging system and said camera system,
registering the data from said rupture recorded by said systems, and
computing the relative position of the cameras and the imaging system based on said registration.
39. A method for calibrating an imaging system, a stereo camera system, and a projector, comprising,
observing a plurality of surfaces on which fall visible rays projected from said camera system and which intersect the optical center of the projector;
using said observations to calculate a series of points in space which can be extrapolated to compute the center of projection.
40. A method for temporal calibration of a camera system with an imaging system comprising
generating a trigger signal by the imaging system, and
using said trigger signal to trigger camera acquisition.
41. A method for temporal calibration of a camera system and an imaging system comprising:
moving said camera system and said imaging system periodically above a target, and computing the temporal difference by matching the respective trajectories of the camera system and the imaging system.
42. A method for checking the integrity of a projection guidance system for use with a body imaging system, comprising:
recording the location and orientation of an instrument using a camera system, projecting an image of the instrument over an image produced by an imaging system; and
providing a discrepancy cue to the user if the observed location of the instrument differs from the projected image of the instrument.
43. A method for checking the integrity of a projection guidance system for use with a body imaging system, comprising:
recording the location and/or orientation of an instrument using an imaging system; displaying said recorded location and/or orientation over a camera recorded image of said instrument, and
providing a discrepancy cue to the user if the observed location of the instrument differs from the displayed location of the instrument.
44. A method for checking the integrity of a projection guidance system for use with a body imaging system, comprising:
actively tracking the location and orientation of an instrument using both a body imaging system and a camera assembly, and
using information from said active tracking to provide information to the user concerning instrument location, orientation and trajectory.
45. A method according to any one of claims 42-44, comprising using information from said active tracking to detect instrument bending.
46. A method for optimizing use of an imaging system to track the location of an instrument in a body comprising using said imaging system to detect the location and orientation of said instrument outside of the body and using the detected location and orientation together with the target location and orientation to determine and project trajectory guidance to the user.
47. A method of using a projector for instrument guidance in conjunction with a body imaging system, comprising calculating and projecting body insertion point for the instrument and using a region proximate to said projected body insertion point as a target area for initial instrument image capture.
48. A method of using a projector for instrument guidance in conjunction with a body imaging system, comprising calculating and projecting a body insertion point for the instrument and discarding instrument image capture data falling outside of an area proximate to said projected body insertion point.
49. A method of using a projector for instrument guidance in conjunction with a body imaging system, comprising calculating and projecting a body insertion point for the instrument, and detecting when computed location and orientation of said instrument violate anticipated instrument behavior.
50. A method for indicating the depth of penetration of an instrument in a body, comprising detecting fiducials on the instrument using a device selected from an imaging system, a camera system, and combinations thereof, and tracking the location of the fiducials over time.
51. A method according to claim 50, wherein the fiducials comprise reflective elements attached to the instrument.
52. A method according to claim 50, wherein the fiducials are dark rings on the instrument.
53. A method according to claim 50, wherein the depth of penetration is computed by subtracting the location of the fiducial in space from the patient surface, then subtracting that result from the length of the needle.
54. A method according to claim 50, wherein a fiducial landmark is projected onto said instrument indicating the location on the instrument corresponding to a body surface when the instrument is inserted in the body.
55. A method according to claim 50, further comprising displaying a number of fiducial marks on the instrument that should remain outside of a body to provide a user with a reminder concerning a desired depth of the instrument in the body.
56. A system according to any one of claims 1, 4, 6 and 12, wherein the projector is configured to project a visual interface on a surface, and wherein the camera system is configured to track user interaction with the visual interface, according to which the user may provide instructions to the system by interacting with the visual interface.
57. A system according to any one of claims 1, 4, 6 and 12, wherein the projector accounts for the three-dimensional structure of a projection surface when projecting a visual interface on said surface.
58. A system according to any one of claims 1, 4, 6 and 12, wherein the camera system is configured to track user gestures for system instruction.
59. A system according to any one of claims 1, 4, 6 and 12, further comprising a hand held device configured for remote control of the imaging system.
60. A system according to any one of claims 1, 4, 6 and 12, further comprising a hand held device configured for display of images recorded by said camera and/or by said imaging system.
61. A system according to any one of claims 1, 4, 6 and 12, further comprising a hand held device configured for registration to a patient for transparent information overlay.
62. A system according to any one of claims 1, 4, 6 and 12, further comprising a display system that maintains registration with the imaging system and which is configured for both visualization and guidance.
63. A system according to any one of claims 1, 4, 6 and 12, wherein said imaging system has a detachable display configured to show pre-operative information based on its position in space.
64. A system according to any one of claims 1, 4, 6 and 12, further comprising a network connection, wherein system computation and/or data storage takes place remotely over said network connection.
65. A system according to any one of claims 1, 4, 6 and 12, further comprising a mounting bracket configured to receive additional cameras or projectors for integration with said system.
66. A system according to any one of claims 1, 4, 6 and 12, further comprising a housing, containing at least portions of said camera and projector, that is resistant to sterilizing agents.
67. A system according to any one of claims 1, 4, 6 and 12, further comprising a sterile sheath for at least one of said camera and projector, said sheath containing a transparent solid plastic window to permit projection and camera recording.
68. A method according to any one of claims 3, 5, 7 and 79, comprising projecting said image onto a drape covering at least a portion of said subject.
69. A method according to claim 68, wherein said drape is selected from the group consisting of: transparent to structured light, IR transparent, and wavelength-specific.
70. A method according to claim 68, wherein said drape comprises a detectible reference frame sufficient to allow direct surface tracking and registration.
71. A method according to claim 68, wherein said drape comprises light sensitive materials having fluorescent or phosphorescent effect.
72. A method according to any one of claims 3, 5, 7 and 79, comprising printing information on said subject using a light-activated dye.
73. A method according to any one of claims 3, 5, 7 and 79, further comprising time multiplexing the projection image with the camera to optimize projection for tracking, guidance, and surface.
74. A method according to any one of claims 3, 5, 7 and 79, further comprising time multiplexing or spatially modulating the projection pattern.
75. A method according to any one of claims 3, 5, 7 and 79, further comprising projecting an adaptive pattern according to space and/or time, wherein said adaptive pattern is selected from the group consisting of: changing spatial frequencies of the pattern according to surface distance, structure size and/or camera resolution, changing pattern color to adapt to surface properties, and randomizing patterns over time.
76. A method according to any one of claims 3, 5, 7 and 79, wherein both projected pattern and guidance information are integrated and optimized to reconstruct body surfaces.
77. A method according to any one of claims 3, 5, 7 and 79, further comprising synchronizing projection output to allow time for space multiplexing patterns for both guidance and stereo structures.
78. A method according to any one of claims 3, 5, 7 and 79, further comprising using multi-band projection with both visible light and invisible light bands, simultaneously or time-multiplexed.
79. A method for providing visual information for use in guiding the use of an instrument relative to a body during a procedure, comprising:
imaging a body along an image plane,
observing, with a first camera, a region of imaging during operation of said imaging system, observing, with a second camera, a region of said imaging during operation of said imaging system,
said first and second cameras configured and located for stereoscopic observation of said region of imaging,
actively tracking, with said first and second cameras, the location and movement of said instrument, and
displaying guidance information concerning the location and orientation of the instrument relative to a desired location and orientation of the instrument for said procedure.
PCT/US2012/059406 2011-10-09 2012-10-09 Interventional in-situ image-guidance by fusing ultrasound video WO2013055707A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2014535792A JP2015505679A (en) 2011-10-09 2012-10-09 Intervention image guidance by fusing ultrasound images
EP12840772.3A EP2763591A4 (en) 2011-10-09 2012-10-09 Interventional in-situ image-guidance by fusing ultrasound video
CA2851659A CA2851659A1 (en) 2011-10-09 2012-10-09 Interventional in-situ image guidance by fusing ultrasound and video
IL232026A IL232026A0 (en) 2011-10-09 2014-04-09 Interventional in-situ image-guidance by fusing ultrasound video

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201161545186P 2011-10-09 2011-10-09
US61/545,186 2011-10-09
US201261603625P 2012-02-27 2012-02-27
US61/603,625 2012-02-27
US201261657441P 2012-06-08 2012-06-08
US61/657,441 2012-06-08

Publications (1)

Publication Number Publication Date
WO2013055707A1 true WO2013055707A1 (en) 2013-04-18

Family

ID=48082353

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/059406 WO2013055707A1 (en) 2011-10-09 2012-10-09 Interventional in-situ image-guidance by fusing ultrasound video

Country Status (6)

Country Link
US (1) US20130218024A1 (en)
EP (1) EP2763591A4 (en)
JP (1) JP2015505679A (en)
CA (1) CA2851659A1 (en)
IL (1) IL232026A0 (en)
WO (1) WO2013055707A1 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2878325A1 (en) * 2013-11-27 2015-06-03 Clear Guide Medical, LLC Surgical needle for a surgical system with optical recognition
EP3009095A1 (en) * 2014-10-17 2016-04-20 Imactis Method for planning the introduction of a needle in a patient's body
WO2016139149A1 (en) * 2015-03-02 2016-09-09 Navigate Surgical Technologies, Inc. Surgical location monitoring system and method with surgical guidance graphic user interface
WO2016146173A1 (en) * 2015-03-17 2016-09-22 Brainlab Ag Surgical drape for patient registration and a registration method utilizing such surgical drape
US9585721B2 (en) 2011-10-28 2017-03-07 Navigate Surgical Technologies, Inc. System and method for real time tracking and modeling of surgical site
WO2017109130A1 (en) * 2015-12-22 2017-06-29 Koninklijke Philips N.V. Providing a projection data set
JP2017521112A (en) * 2014-05-23 2017-08-03 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Imaging device for imaging a first object in a second object
WO2017157715A1 (en) * 2016-03-16 2017-09-21 Koninklijke Philips N.V. Optical camera selection in multi-modal x-ray imaging
WO2017172393A1 (en) * 2016-03-26 2017-10-05 Mederi Therapeutics, Inc. Systems and methods for treating tissue with radiofrequency energy
EP3076875A4 (en) * 2013-11-27 2017-11-01 Clear Guide Medical, Inc. An ultrasound system with stereo image guidance or tracking
WO2017189719A1 (en) * 2016-04-27 2017-11-02 Biomet Manufacturing, Llc Surgical system having assisted navigation
CN107749056A (en) * 2017-11-30 2018-03-02 苏州大学 To radioactive substance three-dimensional positioning tracking method and device
US10386990B2 (en) 2009-09-22 2019-08-20 Mederi Rf, Llc Systems and methods for treating tissue with radiofrequency energy
US10624690B2 (en) 2009-09-22 2020-04-21 Mederi Rf, Llc Systems and methods for controlling use and operation of a family of different treatment devices
US10667790B2 (en) 2012-03-26 2020-06-02 Teratech Corporation Tablet ultrasound system
DE102019211870A1 (en) * 2019-08-07 2020-09-03 Siemens Healthcare Gmbh Projection device for generating a light distribution on a surface of an examination object for aligning a medical object and method for projecting a light distribution onto a surface of an examination object
WO2021137108A1 (en) * 2019-12-31 2021-07-08 Auris Health, Inc. Alignment interfaces for percutaneous access
US11179138B2 (en) 2012-03-26 2021-11-23 Teratech Corporation Tablet ultrasound system
CN114298934A (en) * 2021-12-24 2022-04-08 北京朗视仪器股份有限公司 Cheek clamp developing weakening method and device based on pixel adjustment
US11304777B2 (en) 2011-10-28 2022-04-19 Navigate Surgical Technologies, Inc System and method for determining the three-dimensional location and orientation of identification markers
WO2022264125A1 (en) * 2021-06-14 2022-12-22 Mazor Robotics Ltd. Systems and methods for detecting and monitoring a drape configuration
US11660147B2 (en) 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access

Families Citing this family (131)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7728868B2 (en) 2006-08-02 2010-06-01 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
DE102010020925B4 (en) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US10343283B2 (en) * 2010-05-24 2019-07-09 Intouch Technologies, Inc. Telepresence robot system that can be accessed by a cellular phone
US9295449B2 (en) * 2012-01-23 2016-03-29 Ultrasonix Medical Corporation Landmarks for ultrasound imaging
CN103544688B (en) * 2012-07-11 2018-06-29 东芝医疗系统株式会社 Medical imaging fusing device and method
US9375196B2 (en) 2012-07-12 2016-06-28 Covidien Lp System and method for detecting critical structures using ultrasound
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
DE102012109481A1 (en) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
GB201222361D0 (en) * 2012-12-12 2013-01-23 Univ Birmingham Surface geometry imaging
US9947112B2 (en) * 2012-12-18 2018-04-17 Koninklijke Philips N.V. Scanning device and method for positioning a scanning device
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
ITGE20130032A1 (en) * 2013-03-19 2014-09-20 Esaote Spa METHOD AND IMAGING DEVICE OF THE CARDIOVASCULAR SYSTEM
JP6238550B2 (en) * 2013-04-17 2017-11-29 キヤノン株式会社 SUBJECT INFORMATION ACQUISITION DEVICE AND METHOD FOR CONTROLLING SUBJECT INFORMATION ACQUISITION DEVICE
KR102149322B1 (en) * 2013-05-20 2020-08-28 삼성메디슨 주식회사 Photoacoustic bracket, photoacoustic probe assembly and photoacoustic image apparatus having the same
KR20150005052A (en) * 2013-07-04 2015-01-14 삼성메디슨 주식회사 Ultrasound system and method for providing target object information
US9390312B2 (en) 2013-08-23 2016-07-12 Elwha Llc Systems, methods, and devices for assessing microbiota of skin
US9456777B2 (en) 2013-08-23 2016-10-04 Elwha Llc Systems, methods, and devices for assessing microbiota of skin
US9549703B2 (en) * 2013-11-27 2017-01-24 Elwha Llc Devices and methods for sampling and profiling microbiota of skin
US10152529B2 (en) 2013-08-23 2018-12-11 Elwha Llc Systems and methods for generating a treatment map
US10010704B2 (en) 2013-08-23 2018-07-03 Elwha Llc Systems, methods, and devices for delivering treatment to a skin surface
US9811641B2 (en) 2013-08-23 2017-11-07 Elwha Llc Modifying a cosmetic product based on a microbe profile
US9557331B2 (en) 2013-08-23 2017-01-31 Elwha Llc Systems, methods, and devices for assessing microbiota of skin
US9526480B2 (en) 2013-11-27 2016-12-27 Elwha Llc Devices and methods for profiling microbiota of skin
US9805171B2 (en) 2013-08-23 2017-10-31 Elwha Llc Modifying a cosmetic product based on a microbe profile
DE102013217476A1 (en) * 2013-09-03 2015-03-05 Siemens Aktiengesellschaft Method for repositioning a mobile imaging device
US9295372B2 (en) * 2013-09-18 2016-03-29 Cerner Innovation, Inc. Marking and tracking an area of interest during endoscopy
EP3049152B1 (en) 2013-09-19 2017-05-17 Koninklijke Philips N.V. High-dose rate brachytherapy system
US9610037B2 (en) 2013-11-27 2017-04-04 Elwha Llc Systems and devices for profiling microbiota of skin
US9526450B2 (en) * 2013-11-27 2016-12-27 Elwha Llc Devices and methods for profiling microbiota of skin
US9186278B2 (en) 2013-11-27 2015-11-17 Elwha Llc Systems and devices for sampling and profiling microbiota of skin
RU2687883C2 (en) * 2013-12-19 2019-05-16 Конинклейке Филипс Н.В. Object tracking device
CN106061424B (en) 2013-12-20 2019-04-30 皇家飞利浦有限公司 System and method for tracking puncture instrument
CN105916461B (en) 2014-01-31 2020-02-18 柯惠Lp公司 Interface for surgical system
KR101654675B1 (en) * 2014-02-03 2016-09-06 삼성메디슨 주식회사 Method, apparatus and system for generating diagnostic image using photoacoustic material
JP6740131B2 (en) 2014-02-21 2020-08-12 スリーディインテグレイテッド アーペーエス3Dintegrated Aps Set with surgical instrument, surgical system, and training method
JP6615110B2 (en) * 2014-03-04 2019-12-04 ザクト ロボティクス リミテッド Method and system for pre-planning an image guided needle insertion procedure in a region of interest of interest
JP6385079B2 (en) * 2014-03-05 2018-09-05 株式会社根本杏林堂 Medical system and computer program
NL2012416B1 (en) * 2014-03-12 2015-11-26 Stichting Katholieke Univ Anatomical Image Projection System.
JP6327900B2 (en) * 2014-03-24 2018-05-23 キヤノン株式会社 Subject information acquisition apparatus, breast examination apparatus and apparatus
DE102014007909A1 (en) 2014-05-27 2015-12-03 Carl Zeiss Meditec Ag Surgical microscope
US20150366628A1 (en) * 2014-06-18 2015-12-24 Covidien Lp Augmented surgical reality environment system
DE102014110570A1 (en) * 2014-07-25 2016-01-28 Surgiceye Gmbh An imaging apparatus and method combining functional imaging and ultrasound imaging
TWI605795B (en) * 2014-08-19 2017-11-21 鈦隼生物科技股份有限公司 Method and system of determining probe position in surgical site
GB2545603B (en) * 2014-09-10 2020-04-15 Faro Tech Inc A portable device for optically measuring three-dimensional coordinates
US9602811B2 (en) 2014-09-10 2017-03-21 Faro Technologies, Inc. Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
US9671221B2 (en) 2014-09-10 2017-06-06 Faro Technologies, Inc. Portable device for optically measuring three-dimensional coordinates
DE102014013678B3 (en) 2014-09-10 2015-12-03 Faro Technologies, Inc. Method for optically sensing and measuring an environment with a handheld scanner and gesture control
DE102014013677B4 (en) 2014-09-10 2017-06-22 Faro Technologies, Inc. Method for optically scanning and measuring an environment with a handheld scanner and subdivided display
US9693040B2 (en) 2014-09-10 2017-06-27 Faro Technologies, Inc. Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
CN106687049B (en) * 2014-09-19 2019-09-17 Fujifilm Corporation Photoacoustic image generation method and device
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US10284762B2 (en) * 2014-10-27 2019-05-07 Clear Guide Medical, Inc. System and method for targeting feedback
US10639104B1 (en) 2014-11-07 2020-05-05 Verily Life Sciences Llc Surgery guidance system
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
EP3047809B1 (en) * 2015-01-23 2022-04-13 Storz Medical Ag Extracorporeal shock wave lithotripsy system having off-line ultrasound localization
US10285760B2 (en) * 2015-02-04 2019-05-14 Queen's University At Kingston Methods and apparatus for improved electromagnetic tracking and localization
US11576645B2 (en) * 2015-03-02 2023-02-14 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for scanning a patient in an imaging system
US11576578B2 (en) * 2015-03-02 2023-02-14 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for scanning a patient in an imaging system
CN104644205A (en) 2015-03-02 2015-05-27 Shanghai United Imaging Healthcare Co., Ltd. Method and system for positioning patient during diagnostic imaging
CN106033418B (en) 2015-03-10 2020-01-31 Alibaba Group Holding Limited Voice adding and playing method and device, and picture classifying and retrieving method and device
DE102015207119A1 (en) * 2015-04-20 2016-10-20 Kuka Roboter Gmbh Interventional positioning kinematics
US10682156B2 (en) * 2015-05-28 2020-06-16 Akm A. Rahman Angle-guidance device and method for CT guided drainage and biopsy procedures
US10824315B2 (en) * 2015-05-29 2020-11-03 Canon Medical Systems Corporation Medical image processing apparatus, magnetic resonance imaging apparatus and medical image processing method
WO2016192671A1 (en) * 2015-06-05 2016-12-08 Chen Chieh Hsiao Intra operative tracking method
US10512508B2 (en) 2015-06-15 2019-12-24 The University Of British Columbia Imagery system
CN108024806B (en) 2015-07-21 2022-07-01 3Dintegrated Aps Cannula assembly kit, trocar assembly kit, sleeve assembly, minimally invasive surgical system and method thereof
US11020144B2 (en) 2015-07-21 2021-06-01 3Dintegrated Aps Minimally invasive surgery system
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
DE102015213935B4 (en) 2015-07-23 2019-02-14 Siemens Healthcare Gmbh A medical imaging device having a positioning unit and a method for determining a position on a positioning surface
EP3791929A1 (en) * 2015-08-10 2021-03-17 Fusmobile Inc. Image guided focused ultrasound treatment device and aiming apparatus
EP3344148B1 (en) * 2015-09-03 2021-02-17 Siemens Healthcare GmbH Multi-view, multi-source registration of moving anatomies and devices
RU2607948C2 (en) * 2015-09-21 2017-01-11 Biotok Medical Electronics Laboratory LLC Method and device for visualization in cardiac surgery
DK178899B1 (en) 2015-10-09 2017-05-08 3Dintegrated Aps A depiction system
WO2017075085A1 (en) * 2015-10-28 2017-05-04 Endochoice, Inc. Device and method for tracking the position of an endoscope within a patient's body
US11452495B2 (en) 2015-12-07 2022-09-27 Koninklijke Philips N.V. Apparatus and method for detecting a tool
US10178358B2 (en) * 2016-01-14 2019-01-08 Wipro Limited Method for surveillance of an area of interest and a surveillance device thereof
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US10413272B2 (en) 2016-03-08 2019-09-17 Covidien Lp Surgical tool with flex circuit ultrasound sensor
RU2018138979A (en) * 2016-04-06 2020-05-12 Koninklijke Philips N.V. Method, device and system for enabling analysis of a property of a vital sign detector
US10631838B2 (en) 2016-05-03 2020-04-28 Covidien Lp Devices, systems, and methods for locating pressure sensitive critical structures
US10736219B2 (en) 2016-05-26 2020-08-04 Covidien Lp Instrument drive units
EP3463147A4 (en) 2016-05-26 2020-01-22 Covidien LP Robotic surgical assemblies and instrument drive units thereof
US11272992B2 (en) 2016-06-03 2022-03-15 Covidien Lp Robotic surgical assemblies and instrument drive units thereof
US11576746B2 (en) * 2016-09-20 2023-02-14 Kornerstone Devices Pvt. Ltd. Light and shadow guided needle positioning system and method
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
TWI616190B (en) * 2016-11-18 2018-03-01 Chang Gung University System and operation method of acoustic-actuated optical coherence enhanced imaging lens
US10524865B2 (en) * 2016-12-16 2020-01-07 General Electric Company Combination of 3D ultrasound and computed tomography for guidance in interventional medical procedures
US11571180B2 (en) 2016-12-16 2023-02-07 Koninklijke Philips N.V. Systems providing images guiding surgery
US10376235B2 (en) 2016-12-21 2019-08-13 Industrial Technology Research Institute Needle guide system and medical intervention system
WO2018119766A1 (en) * 2016-12-28 2018-07-05 Shanghai United Imaging Healthcare Co., Ltd. Multi-modal image processing system and method
JP2018126389A (en) * 2017-02-09 2018-08-16 Canon Inc. Information processing apparatus, information processing method, and program
WO2018187626A1 (en) * 2017-04-05 2018-10-11 Sensus Healthcare, Inc. Augmented reality glasses to help doctors visualize radiation patterns and overall tumor shape/size
US10621720B2 (en) * 2017-04-27 2020-04-14 Siemens Healthcare Gmbh Deformable registration of magnetic resonance and ultrasound images using biomechanical models
EP3621548A1 (en) * 2017-06-08 2020-03-18 Medos International Sàrl User interface systems for sterile fields and other working environments
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
CN107736897A (en) * 2017-09-04 2018-02-27 Beihang University Ultrasound registration and long bone reduction device and method based on a six-degree-of-freedom parallel platform
US10667789B2 (en) * 2017-10-11 2020-06-02 Geoffrey Steven Hastings Laser assisted ultrasound guidance
US10835344B2 (en) 2017-10-17 2020-11-17 Verily Life Sciences Llc Display of preoperative and intraoperative images
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
US11771399B2 (en) 2018-02-07 2023-10-03 Atherosys, Inc. Apparatus and method to guide ultrasound acquisition of the peripheral arteries in the transverse plane
EP3528210A1 (en) * 2018-02-14 2019-08-21 Koninklijke Philips N.V. An imaging system and method with stitching of multiple images
WO2019168935A1 (en) * 2018-02-27 2019-09-06 Steven Aaron Ross Video patient tracking for medical imaging guidance
US20190282300A1 (en) * 2018-03-13 2019-09-19 The Regents Of The University Of California Projected flap design
EP3787480A4 (en) * 2018-04-30 2022-01-26 Atherosys, Inc. Method and apparatus for the automatic detection of atheromas in peripheral arteries
CN108760893B (en) * 2018-06-15 2020-07-24 Electric Power Research Institute of Guangxi Power Grid Co., Ltd. Guided wave track visualization auxiliary system in ultrasonic damage detection
US20200014909A1 (en) 2018-07-03 2020-01-09 Faro Technologies, Inc. Handheld three dimensional scanner with autofocus or autoaperture
EP3598948B1 (en) * 2018-07-27 2022-03-16 Siemens Healthcare GmbH Imaging system and method for generating a stereoscopic representation, computer program and data memory
US20210267710A1 (en) * 2018-08-16 2021-09-02 Cartosense Private Limited Visual guidance for aligning a physical object with a reference location
CN112584756A (en) * 2018-08-22 2021-03-30 Bard Access Systems, Inc. System and method for infrared enhanced ultrasound visualization
EP3866697B1 (en) * 2018-10-16 2024-03-20 Koninklijke Philips N.V. Deep learning-based ultrasound imaging guidance and associated devices, systems, and methods
JP2022529110A (en) * 2019-04-15 2022-06-17 Covidien Lp Systems and methods for aligning surgical robot arms
US20220313363A1 (en) * 2019-06-24 2022-10-06 Dm1 Llc Optical System And Apparatus For Instrument Projection And Tracking
JP2022546575A (en) * 2019-09-04 2022-11-04 Bard Access Systems, Inc. System and method for ultrasound probe needle tracking status indicator
CN112535499A (en) 2019-09-20 2021-03-23 Bard Access Systems, Inc. Automated vessel detection tool and method
EP3847990B1 (en) * 2020-01-13 2022-04-06 Stryker European Operations Limited Technique of controlling display of a navigation view indicating an instantaneously changing recommended entry point
US11711596B2 (en) 2020-01-23 2023-07-25 Covidien Lp System and methods for determining proximity relative to an anatomical structure
WO2022020351A1 (en) 2020-07-21 2022-01-27 Bard Access Systems, Inc. System, method and apparatus for magnetic tracking of ultrasound probe and generation of 3d visualization thereof
WO2022051657A1 (en) 2020-09-03 2022-03-10 Bard Access Systems, Inc. Portable ultrasound systems and methods
CN114246614A (en) 2020-09-25 2022-03-29 Bard Access Systems, Inc. Ultrasound imaging system and minimum catheter length tool
EP3973885A1 (en) * 2020-09-29 2022-03-30 Koninklijke Philips N.V. Methods and systems for tool tracking
CN114376613A (en) * 2020-10-02 2022-04-22 Bard Access Systems, Inc. Ultrasound probe, ultrasound system and method thereof
US20240008895A1 (en) * 2020-12-08 2024-01-11 The Regents Of The University Of Colorado, A Body Corporate Needle guidance system
DE102021202997A1 (en) 2021-03-26 2022-05-12 Siemens Healthcare Gmbh Method to support the implementation of a minimally invasive procedure, magnetic resonance device, computer program and electronically readable data carrier
WO2023031688A1 (en) * 2021-09-01 2023-03-09 Rsip Neph Ltd. Combined multi-imaging modalities in surgical procedures
WO2023121755A2 (en) * 2021-10-21 2023-06-29 Massachusetts Institute Of Technology Systems and methods for guided intervention
CN114271856B (en) * 2021-12-27 2022-10-11 Kaipuyun Information Technology Co., Ltd. Three-dimensional ultrasonic image generation method and device, storage medium and equipment
CN114339183A (en) * 2021-12-30 2022-04-12 Shenzhen Mindray Animal Medical Technology Co., Ltd. Endoscope system and screen projection method thereof
WO2023192395A1 (en) * 2022-03-29 2023-10-05 Project Moray, Inc. Registration of medical robot and/or image data for robotic catheters and other uses

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE34002E (en) * 1989-02-03 1992-07-21 Sterilizable video camera cover
US6556858B1 (en) * 2000-01-19 2003-04-29 Herbert D. Zeman Diffuse infrared light imaging system
DE10033723C1 (en) * 2000-07-12 2002-02-21 Siemens Ag Surgical instrument position and orientation visualization device for surgical operation has data representing instrument position and orientation projected onto surface of patient's body
US6612991B2 (en) * 2001-08-16 2003-09-02 Siemens Corporate Research, Inc. Video-assistance for ultrasound guided needle biopsy
US20030187458A1 (en) * 2002-03-28 2003-10-02 Kimberly-Clark Worldwide, Inc. Correct surgical site marking system with draping key
WO2011100753A2 (en) * 2010-02-15 2011-08-18 The Johns Hopkins University Interventional photoacoustic imaging system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5868732A (en) * 1996-05-12 1999-02-09 Esc Medical Systems, Ltd. Cooling apparatus for cutaneous treatment employing a laser and method for operating same
US6317616B1 (en) * 1999-09-15 2001-11-13 Neil David Glossop Method and system to facilitate image guided surgery
US20050216032A1 (en) * 2004-03-26 2005-09-29 Hayden Adam I Navigated pin placement for orthopaedic procedures
US20110098553A1 (en) * 2009-10-28 2011-04-28 Steven Robbins Automatic registration of images for image guided surgery
WO2011063266A2 (en) * 2009-11-19 2011-05-26 The Johns Hopkins University Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2763591A4 *

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10639090B2 (en) 2009-09-22 2020-05-05 Mederi Rf, Llc Systems and methods for controlling use and operation of a treatment device
US10386990B2 (en) 2009-09-22 2019-08-20 Mederi Rf, Llc Systems and methods for treating tissue with radiofrequency energy
US10624690B2 (en) 2009-09-22 2020-04-21 Mederi Rf, Llc Systems and methods for controlling use and operation of a family of different treatment devices
US11507247B2 (en) 2009-09-22 2022-11-22 Mederi Rf, Llc Systems and methods for treating tissue with radiofrequency energy
US9585721B2 (en) 2011-10-28 2017-03-07 Navigate Surgical Technologies, Inc. System and method for real time tracking and modeling of surgical site
US11304777B2 (en) 2011-10-28 2022-04-19 Navigate Surgical Technologies, Inc System and method for determining the three-dimensional location and orientation of identification markers
US11179138B2 (en) 2012-03-26 2021-11-23 Teratech Corporation Tablet ultrasound system
US10667790B2 (en) 2012-03-26 2020-06-02 Teratech Corporation Tablet ultrasound system
US11857363B2 (en) 2012-03-26 2024-01-02 Teratech Corporation Tablet ultrasound system
EP3076875A4 (en) * 2013-11-27 2017-11-01 Clear Guide Medical, Inc. An ultrasound system with stereo image guidance or tracking
US9668819B2 (en) 2013-11-27 2017-06-06 Clear Guide Medical, Inc. Surgical needle for a surgical system with optical recognition
EP2878325A1 (en) * 2013-11-27 2015-06-03 Clear Guide Medical, LLC Surgical needle for a surgical system with optical recognition
CN104665933A (en) * 2013-11-27 2015-06-03 Clear Guide Medical, LLC Surgical needle for a surgical system with optical recognition
JP2017521112A (en) * 2014-05-23 2017-08-03 Koninklijke Philips N.V. Imaging device for imaging a first object in a second object
EP3009095A1 (en) * 2014-10-17 2016-04-20 Imactis Method for planning the introduction of a needle in a patient's body
WO2016059255A1 (en) * 2014-10-17 2016-04-21 Imactis System for planning the introduction of a needle in a patient's body
CN107106238B (en) * 2014-10-17 2020-04-17 Imactis System for planning the introduction of a needle into a patient
CN107106238A (en) * 2014-10-17 2017-08-29 Imactis A system for planning the introduction of a needle into a patient's body
WO2016139149A1 (en) * 2015-03-02 2016-09-09 Navigate Surgical Technologies, Inc. Surgical location monitoring system and method with surgical guidance graphic user interface
WO2016146173A1 (en) * 2015-03-17 2016-09-22 Brainlab Ag Surgical drape for patient registration and a registration method utilizing such surgical drape
WO2017109130A1 (en) * 2015-12-22 2017-06-29 Koninklijke Philips N.V. Providing a projection data set
CN108430376A (en) * 2015-12-22 2018-08-21 Koninklijke Philips N.V. Providing a projection data set
US10769787B2 (en) 2015-12-22 2020-09-08 Koninklijke Philips N.V. Device for projecting a guidance image on a subject
US10806468B2 (en) 2016-03-16 2020-10-20 Koninklijke Philips N.V. Optical camera selection in multi-modal X-ray imaging
CN108778135A (en) * 2016-03-16 2018-11-09 Koninklijke Philips N.V. Optical camera selection in multi-modal X-ray imaging
WO2017157715A1 (en) * 2016-03-16 2017-09-21 Koninklijke Philips N.V. Optical camera selection in multi-modal x-ray imaging
WO2017172393A1 (en) * 2016-03-26 2017-10-05 Mederi Therapeutics, Inc. Systems and methods for treating tissue with radiofrequency energy
WO2017189719A1 (en) * 2016-04-27 2017-11-02 Biomet Manufacturing, Llc Surgical system having assisted navigation
US11058495B2 (en) 2016-04-27 2021-07-13 Biomet Manufacturing, Llc Surgical system having assisted optical navigation with dual projection system
AU2017257887B2 (en) * 2016-04-27 2019-12-19 Biomet Manufacturing, Llc. Surgical system having assisted navigation
CN107749056A (en) * 2017-11-30 2018-03-02 Soochow University Three-dimensional positioning and tracking method and device for radioactive substances
DE102019211870A1 (en) * 2019-08-07 2020-09-03 Siemens Healthcare Gmbh Projection device for generating a light distribution on a surface of an examination object for aligning a medical object and method for projecting a light distribution onto a surface of an examination object
WO2021137108A1 (en) * 2019-12-31 2021-07-08 Auris Health, Inc. Alignment interfaces for percutaneous access
CN114929148A (en) * 2019-12-31 2022-08-19 Auris Health, Inc. Alignment interface for percutaneous access
US11602372B2 (en) 2019-12-31 2023-03-14 Auris Health, Inc. Alignment interfaces for percutaneous access
US11660147B2 (en) 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access
WO2022264125A1 (en) * 2021-06-14 2022-12-22 Mazor Robotics Ltd. Systems and methods for detecting and monitoring a drape configuration
CN114298934B (en) * 2021-12-24 2022-12-09 Beijing Langshi Instrument Co., Ltd. Cheek clamp developing weakening method and device based on pixel adjustment
CN114298934A (en) * 2021-12-24 2022-04-08 Beijing Langshi Instrument Co., Ltd. Cheek clamp developing weakening method and device based on pixel adjustment

Also Published As

Publication number Publication date
EP2763591A1 (en) 2014-08-13
JP2015505679A (en) 2015-02-26
IL232026A0 (en) 2014-05-28
EP2763591A4 (en) 2015-05-06
CA2851659A1 (en) 2013-04-18
US20130218024A1 (en) 2013-08-22

Similar Documents

Publication Publication Date Title
US20130218024A1 (en) Interventional In-Situ Image-Guidance by Fusing Ultrasound and Video
US20120253200A1 (en) Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors
US11754971B2 (en) Method and system for displaying holographic images within a real object
Hughes-Hallett et al. Augmented reality partial nephrectomy: examining the current status and future perspectives
JP7443353B2 (en) Correction of computed tomography (CT) images using position and orientation (P&amp;D) tracking-assisted optical visualization
US11730562B2 (en) Systems and methods for imaging a patient
US10758209B2 (en) Photoacoustic tracking and registration in interventional ultrasound
JP6404713B2 (en) System and method for guided injection in endoscopic surgery
JP6905535B2 (en) Guidance, tracking and guidance system for positioning surgical instruments within the patient&#39;s body
US20110105895A1 (en) Guided surgery
US10543045B2 (en) System and method for providing a contour video with a 3D surface in a medical navigation system
US20150049174A1 (en) System and method for non-invasive patient-image registration
JP2017534389A (en) Computerized tomography extended fluoroscopy system, apparatus, and method of use
JP2020522827A (en) Use of augmented reality in surgical navigation
Stolka et al. Needle guidance using handheld stereo vision and projection for ultrasound-based interventions
WO2007115825A1 (en) Registration-free augmentation device and method
Kanithi et al. Immersive augmented reality system for assisting needle positioning during ultrasound guided intervention
Jackson et al. Surgical tracking, registration, and navigation characterization for image-guided renal interventions
Yaniv et al. Applications of augmented reality in the operating room
De Paolis et al. Augmented reality in minimally invasive surgery
Sauer et al. Augmented reality system for ct-guided interventions: System description and initial phantom trials
Dewi et al. Position tracking systems for ultrasound imaging: A survey
Cheung et al. Fusion of stereoscopic video and laparoscopic ultrasound for minimally invasive partial nephrectomy
Lu et al. Multimodality image-guided lung intervention systems
Octorina Dewi et al. Position tracking systems for ultrasound imaging: a survey

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 12840772; Country of ref document: EP; Kind code of ref document: A1)

ENP Entry into the national phase (Ref document number: 2014535792; Country of ref document: JP; Kind code of ref document: A; Ref document number: 2851659; Country of ref document: CA)

NENP Non-entry into the national phase (Ref country code: DE)

WWE Wipo information: entry into national phase (Ref document number: 232026; Country of ref document: IL)

WWE Wipo information: entry into national phase (Ref document number: 2012840772; Country of ref document: EP)