WO1999016352A1 - Interventional radiology guidance system - Google Patents

Interventional radiology guidance system

Info

Publication number
WO1999016352A1
Authority
WO
WIPO (PCT)
Prior art keywords
instrument
structures
target
self-referential
Application number
PCT/US1998/019124
Other languages
French (fr)
Inventor
Sima Nadler
Dror Aiger
Daniel Cohen-Or
Original Assignee
Medsim Advanced Radiology Medical Simulation Ltd.
Friedman, Mark, M.
Application filed by Medsim Advanced Radiology Medical Simulation Ltd. and Friedman, Mark, M.
Priority to AU94826/98A
Publication of WO1999016352A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/367 Correlation of different images or relation of image positions in respect to the body creating a 3D dataset from 2D images using position information

Definitions

  • FIG. 5 is a diagram illustrating the functioning of slice generator 6.
  • Each slice is a virtual image frame buffer defined in a world coordinate system.
  • the voxels pierced by the virtual frame are sampled, mapped and displayed in their image coordinate system after the frame is clipped against the volume buffer.
  • P stands for "point", with P1 being point #1.
  • B represents the 3D point mapped to 2D.
  • the algorithm is basically an extension of the widely known 2D scan-line algorithm (J.D. Foley, A. van Dam, S.K. Feiner and J.F. Hughes, Computer Graphics: Principles and Practice, Addison-Wesley, 1990), where at each scan line the third dimension is also interpolated.
  • the sweeping technique described by Cohen, Kaufman and Kong and incorporated herein by reference, (D. Cohen-Or, A. Kaufman and T.Y. Kong, On the Soundness of Surface Voxelizations, in Topological Algorithms for Digital Image Processing, T. Yung Kong and A. Rosenfeld (eds.), North-Holland, Amsterdam, 1995, pages 181-204.) can be used.
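To make the scan-line slice extraction concrete, the sketch below samples a 2D slice from a 3D volume along an operator-chosen plane. It is a minimal reconstruction under stated assumptions, not code from the patent: the [z, y, x] array indexing, nearest-neighbor sampling in place of true interpolation, and all names are illustrative.

```python
import numpy as np

def extract_slice(volume, origin, u_dir, v_dir, width, height, spacing=1.0):
    """Sample a (height x width) 2D slice from a 3D volume.

    The virtual frame is swept one scan line (row) at a time; each pixel's
    3D position is computed along the line and clipped against the volume
    buffer, echoing the scan-line extension described above.
    """
    u, v = np.asarray(u_dir, float), np.asarray(v_dir, float)
    origin = np.asarray(origin, float)
    out = np.zeros((height, width), dtype=volume.dtype)
    for row in range(height):                       # one scan line per row
        line_start = origin + row * spacing * v
        for col in range(width):
            p = line_start + col * spacing * u      # 3D point for this pixel
            z, y, x = np.round(p).astype(int)       # nearest-neighbor sample
            if (0 <= z < volume.shape[0] and 0 <= y < volume.shape[1]
                    and 0 <= x < volume.shape[2]):  # clip to the volume
                out[row, col] = volume[z, y, x]
    return out
```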
  • a 3D ultrasound image could be rendered from the 3D volume, for viewing by the operator, instead of individual 2D ultrasound images being rendered by slice generator 6.
  • Figure 6 shows the flow of control of the graphic aids generator 9 module. This component works iteratively. For each slice received 34 the following steps are performed. First the 3D representation of the instrument is mapped 35 to a 2D representation and the instrument is drawn 36. Then, if there is a target 37 the 3D representation of the target is mapped to a 2D representation. If the instrument has hit the target 39 then the target is drawn 40 as a flashing green object. If the target hasn't been hit then the target is drawn 41 without flashing. At this point a loop 42 begins in which each avoid object is processed. For each avoid object, the 3D representation thereof is mapped 43 to a corresponding 2D representation and is drawn in one of two ways. If the instrument hits the avoid object then the avoid object is drawn 44 as a flashing red object. Otherwise the avoid object is drawn normally 46.
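One iteration of this drawing loop can be sketched as follows. The project callable (mapping a 3D world point to 2D slice coordinates), the Sphere container, and the representation of output as simple draw-command tuples are assumptions made for illustration; the real module draws directly onto the 2D slices.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Sphere:                 # a marked target or avoidance target
    center: np.ndarray        # 3D center point
    radius: float

def draw_aids(project, instrument_pts, target, avoids, tolerance):
    """Return draw commands (kind, 2D geometry, color, flashing) for one slice."""
    cmds = [("instrument", [project(p) for p in instrument_pts], "white", False)]
    tip = np.asarray(instrument_pts[-1], float)
    if target is not None:                          # map target 3D -> 2D
        hit = np.linalg.norm(tip - target.center) <= target.radius
        cmds.append(("target", project(target.center),
                     "green" if hit else "yellow", hit))  # flash green on hit
    for avoid in avoids:                            # process each avoid object
        close = np.linalg.norm(tip - avoid.center) <= avoid.radius + tolerance
        cmds.append(("avoid", project(avoid.center),
                     "red" if close else "blue", close))  # flash red if hit
    return cmds
```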
  • Figures 7a, 7b, 8a, and 8b show examples of graphic aids generated by graphic aids generator module 9 to aid the operator in orienting the instrument relative to the target.
  • the imaging slice has been chosen (by the operator) to constantly be in the long axis of the surgical instrument. As such, the full length of the instrument, including the tip of the instrument, is seen.
  • the circle represents the target, the arrow represents the instrument, and the dotted line represents a projection from the target to the instrument. When the dotted line and the instrument form a straight line, as in figure 7b, the instrument orientation is optimal.
  • the imaging plane is in the short axis of the surgical instrument.
  • the light circle represents the target and the dark circle represents the instrument.
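The long-axis aid of Figs. 7a and 7b reduces to checking how close the instrument axis is to being collinear with the line joining the instrument tip to the target. A small sketch of that computation, with assumed names:

```python
import numpy as np

def orientation_error_deg(tip, tail, target_center):
    """Angle between the instrument axis and the tip-to-target guide line.

    0 degrees corresponds to the optimal, collinear orientation of Fig. 7b.
    """
    axis = np.asarray(tip, float) - np.asarray(tail, float)
    guide = np.asarray(target_center, float) - np.asarray(tip, float)
    c = np.dot(axis, guide) / (np.linalg.norm(axis) * np.linalg.norm(guide))
    return float(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))
```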
  • if a two-dimensional ultrasound machine were used to obtain the images, IRGS 12 would describe only the 2D relationships between the target, instrument and avoidance targets.
  • additional 2D or 3D digital image data sources, such as CT, MRI, MRT, and the like, can be used to perform the above described image processing functions (marking and tracking of the instrument, targets and avoidance targets, and description of the relationships between them), by using the same software components, provided that the image data source generates chronologically sequential images allowing for the depiction of dynamic events.
  • the phrase "mechanism for obtaining chronologically sequential images" hereinafter refers to all such image data sources.

Abstract

A system and method for guiding interventional radiological procedures. A constantly updated 3D ultrasound data volume of the area of the body in which the procedure is being performed is obtained. 2D ultrasound images in multiple desired orientations are generated from the 3D volume and displayed. Software-based image processing of the 3D volume is performed to identify, graphically mark, and track the movements, on the 2D ultrasound images, of the surgical instrument being used, the target of the procedure and the structures which it is desired to avoid. Visual and audible signals are generated to aid the operator in optimizing the 3D orientation of the surgical instrument relative to the target, to inform the operator when the target has been reached, and to warn the operator when the surgical instrument is approaching a structure which it is desired to avoid.

Description

INTERVENTIONAL RADIOLOGY GUIDANCE SYSTEM
FIELD AND BACKGROUND OF THE INVENTION
The present invention relates to medical imaging technology, and, in particular, it concerns an interventional radiology guidance system and method. It is known that needles, biopsy instruments and other surgical instruments are used in many different kinds of interventional, radiologically guided, medical procedures. For example, tissue biopsy and fine needle aspiration are two methods often used to determine the nature of a tissue within the human body, one or both of these procedures usually being performed as a first step in determining the nature of a potentially malignant growth. Fluid drainage and tissue ablation are other procedures in which a needle is used; for example, when draining fluid from around the lungs, heart, or joints, or when eradicating a growth in the liver by injecting alcohol into the growth. When performing such procedures, radiological guidance is required so as to successfully navigate the instrument to the desired organ or tissue (hereinafter referred to as the target), while avoiding other organs, tissues or the like (hereinafter referred to as "avoidance targets"). Radiological guidance is also used during intraoperative procedures.
All of these procedures are currently performed using one of several different radiological imaging modalities: two dimensional (2D) ultrasound, computerized tomography (CT), magnetic resonance imaging (MRI), or magnetic resonance therapeutics (MRT). Of these, 2D ultrasound is the most widely used guidance modality, due to its widespread availability, low cost, lack of radiation exposure, and ability to generate images in real time. 2D ultrasound, however, suffers from several limitations:
1. It is often difficult for the operator to see the instrument in the ultrasound image, as the acoustic contrast between the instrument and the surrounding tissue may be weak or variable.
2. Usually the instrument, such as a needle, cannot be perfectly oriented within and parallel to the plane of the 2D ultrasound image, because the physical presence of the ultrasound transducer of necessity displaces the surgical instrument, and vice versa. The surgical instrument thus often passes through the plane of the ultrasound image at an angle. As such, it is difficult for the operator to be sure that the 2D ultrasound image of the end of the instrument truly represents the tip of the instrument, rather than merely the shaft of the instrument where it leaves the plane of the ultrasound image. This difficulty in identifying the tip of the instrument hinders the operator in appropriately orienting the instrument relative to the targets or avoidance targets. For similar reasons, when viewing a single 2D image, the operator cannot know with certainty that the instrument tip has entered the target, or has avoided an avoidance target.
3. It may be difficult for the operator to orient the ultrasound probe such that both the instrument and the target (or avoidance target) are visualized in a single 2D imaging plane.
4. Even if the operator can see both the instrument and the target (or avoidance target) in a single 2D image, it is difficult to determine if they are on the same 3 dimensional (3D) plane, both because of the inherent difficulty in extrapolating 3D conclusions from a single 2D image, and because the 2D ultrasound image is derived from an ultrasound beam which itself is of a certain thickness, such thickness not being depicted in the viewed 2D image. As such, it is difficult for the operator to determine the appropriate 3D angle at which to insert and advance the instrument, so that the tip will enter the target, or avoid the avoidance target.
5. It is difficult for the operator to track the target (or the avoidance target) by eye, as it often moves along with movement of the instrument.
6. If the instrument tip does enter the target, this fact cannot be graphically and definitively documented for future reference.
As CT provides better image resolution than does 2D ultrasound, it is often preferred as an imaging modality for radiological guidance of interventional procedures. CT, however, still suffers from the same 3D orientational drawbacks mentioned above with regard to 2D ultrasound. In addition, CT entails exposure to radiation and does not readily provide real time imaging. MRI and MRT are rarely used due to their prohibitive expense and lack of availability.
An interventional radiological guidance system using mechanical, externally mounted, 6-degree-of-freedom (6 DOF) receivers to track the spatial location of sensors mounted on the external portion of the surgical instrument, and then using this spatial location data to display a graphic, on an ultrasound screen, depicting the 3D spatial location of the surgical instrument within the body, has been described by King (US patent # 5,608,849; issued March 4, 1997). This system, however, does not identify and track the targets and avoidance targets of the interventional medical procedure. Furthermore, several deficiencies in the use of mechanical sensors to identify and track a surgical instrument within a body make this system inaccurate:
1. Sensor-based spatial location data needs to be accurately aligned with external and internal reference coordinates. This process of mechanical alignment is a significant source of data inaccuracy.
2. Once the spatial location data for the surgical instrument has been determined, a graphic, depicting the instrument, needs to be accurately superimposed on the appropriate pixels in the 2D ultrasound image being viewed by the operator. As this process cannot be 100% precise, a further source of inaccuracy is introduced.
3. The instrument's resiliency, that is, the degree to which its shape may bend or become deformed once inserted into a body, cannot be appreciated by externally mounted sensors. As such, this method results in the instrument being depicted as if it were completely rigid, an often invalid assumption.
Other mechanically-based tracking systems would suffer from the above deficiencies, as well as additional sources of inaccuracy specific to the mechanical technology being used: for example, the influence of external magnetic fields (if a magnetic tracking system is used), or interference with line of sight (if an optical system is used).
There is therefore a need for an accurate, ultrasound based, radiological guidance system which aids the operator performing an interventional medical procedure to identify the instrument, the target, and the avoidance targets, to appropriately orient the instrument in three dimensions, and to confirm and document successful placement of the instrument relative to the target.
SUMMARY OF THE INVENTION
The present invention is a software based method and system for graphically marking the targets and avoidance targets of an interventional procedure, for tracking their movements and the movements of the instrument being used to perform the procedure, and for depicting desired orientational and spatial relationships between them. As opposed to mechanical tracking systems, such as those utilizing sensors (in which the spatial coordinates of the structure being tracked are defined in part by a mechanical process located externally to the body of the patient), the software based tracking system of the present invention defines the spatial coordinates of the structure being tracked with reference only to the software generated image of the patient's body, and without reference to external mechanical processes. As such, this system can be said to be "self-referential" rather than "externally-referential". Hereinafter, therefore, the phrase "self-referential" means "not defined by or involving the use of a mechanical sensor of any sort, be it electrical, optical, magnetic or the like".
According to the teachings of the present invention there is provided an interventional radiological guidance system, including a mechanism for obtaining a plurality of chronologically sequential images of a body; an image identification software module operable to designate at least one self-referential spatial location, defined by a first set of spatial coordinates, within at least one of the chronologically sequential images, the desired self-referential spatial location corresponding to at least one structure in the body; an image tracking software module operable to self-referentially locate a second set of spatial coordinates in a chronologically subsequent image of the body, the second set of spatial coordinates corresponding to the first set of spatial coordinates in a chronologically precedent image of the body, such that the second set of spatial coordinates correspond to the at least one structure; and a graphic aids generator software module operable to generate at least one signal marking a location of the at least one structure in the chronologically sequential images, and operable to describe at least one relationship between a plurality of the at least one structures in the chronologically sequential images. The interventional radiological guidance system may further include a display unit operable to display the signal and the relationship, wherein the signal is displayed as a visual graphic. The interventional radiological guidance system may also further include a sound production unit operable to produce an audible signal corresponding to the relationship. The mechanism for obtaining chronologically sequential images of a body, which may be a human body, may include a three- dimensional ultrasound machine, a two-dimensional ultrasound machine, a computerized tomography scanner, a magnetic resonance imaging scanner, and/or a magnetic resonance therapeutics scanner. The at least one self- referential spatial location may be manually designated, and/or may be designated by an image processing software algorithm, which may be a motion tracking software algorithm. The structure to which the self-referential spatial location corresponds may include a medical instrument, such as a surgical instrument being used in an interventional radiological procedure, and/or an organic structure, such as a target of a radiologically guided interventional medical procedure and/or a structure which it is desired to avoid in a radiologically guided interventional medical procedure. The relationship described by the graphic aids generator software module may include a desired orientation of one of the structures relative to a second of the structures, a desired minimum distance between one of the structures and a second of the structures, and/or a desired maximum distance between one of the structures and a second of the structures. This relationship may be displayed as a visual graphic.
There is also provided an interventional radiological guidance method, including the steps of obtaining a plurality of chronologically sequential images of a body; designating at least one self-referential spatial location, defined by a first set of self-referential spatial coordinates within at least one of the chronologically sequential images, the desired self-referential spatial location corresponding to at least one structure in the body; self-referentially locating a second set of spatial coordinates in a chronologically subsequent image of the body, the second set of spatial coordinates corresponding to the first set of spatial coordinates in a chronologically precedent image of the body, such that the second set of spatial coordinates correspond to the at least one structure; and generating at least one signal marking a location of the at least one structure in the chronologically sequential images. The method may further include the step of displaying the signal as a visual graphic on a display. The images may be obtained using a three-dimensional ultrasound machine, a two- dimensional ultrasound machine, a computerized tomography scanner, a magnetic resonance imaging scanner, and/or a magnetic resonance therapeutics scanner. The body being imaged may be a human body. The self-referential spatial location may be designated by an image processing software algorithm, which may be a motion tracking software algorithm, and/or may be manually designated. The structure to which the self-referential spatial location corresponds may be a medical instrument, such as a surgical instrument being used in an interventional radiological procedure, and/or an organic structure, such as a target of a radiologically guided interventional medical procedure, or a structure which it is desired to avoid in a radiologically guided interventional medical procedure. The method may also further include the steps of describing at least one relationship between a plurality of the structures in at least one of the chronologically sequential images, and displaying the relationship as a visual graphic on a display. The relationship may include a desired orientation of one of the structures relative to a second of the structures, a desired minimum distance between one of the structures and a second of the structures, and/or a desired maximum distance between one of the structures and a second of the structures. The method may further include the step of producing an audible signal corresponding to the relationship.
The interventional radiology guidance system (IRGS) is thus based on 3D ultrasound technology, real-time imaging software, and software based tracking algorithms. As opposed to current radiological guidance systems, the interventional radiology guidance system uses image processing software alone, without employing mechanical sensors, to identify and track the surgical instrument.
In the preferred embodiment, the general functioning of the system is as follows:
A standard 3D ultrasound machine is used to continuously acquire 3D ultrasound volumes of the area being imaged. Based on this data, real time 2D ultrasound images are depicted on the display of the 3D ultrasound machine, for viewing by the operator. By extracting data from the 3D volumes, the system is able to provide the operator with an unlimited number of image orientations. For example, the ultrasound data can be viewed from the point of view of the instrument, in relation to a particular plane of the body (coronal, sagittal, transverse or oblique), or any other chosen angle. As multiple images can be viewed side-by-side simultaneously, two orthogonally oriented images can be displayed at the same time, thus facilitating 3D conceptualization, by the operator, of the spatial orientations of the structures being depicted in the 2D images. By "structure" is meant both organic structures (such as the target of an interventional radiological procedure or a structure which it is desired to avoid during an interventional radiological procedure) and non-organic structures such as medical instruments. "Medical instruments" refers to aspiration needles, injection needles, biopsy instruments, scalpels, ablation instruments, biomedical devices such as stents, biomedical prostheses and the like.
When the surgical instrument being used to perform the procedure is introduced into the 3D volume, an image processing software-based tracking system is used to identify the instrument and track its spatial location. In order to improve the visibility of the instrument for the operator, a graphical representation of the instrument is superimposed on the ultrasound image of the instrument displayed on the screen.
The operator then manually designates the target on the image, and marks its center and diameter by using a standard pointing device, such as the electronic calipers incorporated in the imaging system. By "manually" is meant that the target is designated by the operator by hand, rather than automatically by means of a software algorithm. A circle, or other graphical symbol, appears on or around the target in all subsequent ultrasound images, making the target easier to see. The system then tracks the position of the target as the soft tissues, transducer, and instrument move. If the target moves to the extent that it is no longer within the 3D volume, upon relocation the operator is required to re-mark it. If the instrument enters the defined radius of the target, the operator is informed that the instrument is located at or within the target by means of graphic and/or audible signals; for example, the structure's graphical symbol may change color and flash, a beep may be heard, and/or a textual message may appear on the screen. A hard copy video or print picture documenting that the instrument has reached the target can be generated.
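The "at or within the target" test described here amounts to a point-in-sphere check against the marked center and radius. A minimal sketch, with assumed names and an illustrative way of raising the signals:

```python
import numpy as np

def instrument_in_target(tip, center, radius):
    """True when the instrument tip lies within the marked target sphere."""
    return np.linalg.norm(np.asarray(tip, float) - np.asarray(center, float)) <= radius

# Illustrative signalling, as described above:
if instrument_in_target((10.0, 22.5, 31.0), (10.5, 22.0, 31.2), 2.0):
    print("\a", end="")                         # audible beep
    print("Instrument has reached the target")  # textual message
```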
In a similar manner, other structures in the body can be marked and tracked, so as to enable the operator to avoid them during the course of the procedure. In this circumstance, once the operator has designated the areas to be avoided, a graphical symbol appears on or around the structures (making them easier to see), and the system tracks their positions. If the instrument approaches to within a defined distance from the structure (either a default tolerance, or a distance defined by the operator) a warning is provided, for example, the structure's graphical symbol may change color and flash, a beep may be heard, and/or a textual message may appear on the screen.
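This proximity warning can be sketched as a distance test between every tracked instrument point and the surface of each avoidance sphere. The default tolerance value below is an assumption, since the text leaves it as either a system default or an operator-defined distance:

```python
import numpy as np

DEFAULT_TOLERANCE = 5.0   # assumed default distance tolerance (e.g. in mm)

def too_close(instrument_pts, avoid_center, avoid_radius, tol=DEFAULT_TOLERANCE):
    """True when any instrument point comes within tol of the sphere surface."""
    pts = np.asarray(instrument_pts, dtype=float)
    dist = np.linalg.norm(pts - np.asarray(avoid_center, float), axis=1)
    return bool((dist - avoid_radius < tol).any())
```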
So as to facilitate optimization of the trajectory of the instrument with regard to the target, a graphical aid is displayed on the ultrasound screen. By reorienting the instrument in accordance with the orientation indicated by the graphic aid, the operator is able to more rapidly and accurately complete the procedure.
In an additional embodiment, 6 DOF physical sensors are mounted on an instrument holder, and can be used to determine the spatial location of the instrument when the instrument is outside of the 3D volume. A graphical representation of the instrument, indicating its location and orientation, can be depicted alongside the ultrasound image, so as to enable the operator to optimize the orientation and trajectory of the instrument before it enters the body and the 3D ultrasound volume.
The interventional radiology guidance system therefore increases the accuracy of interventional procedures by providing a clear image of the instrument in all planes desired by the physician, and by providing graphical (and other) aids which aid the physician in placing the instrument in the target, while avoiding critical structures in the body. When used in conjunction with 6DOF sensors mounted on the surgical instrument, the interventional radiology guidance system can additionally provide information about the instrument's location and orientation even before it has entered the 3D ultrasound image volume.
With increased accuracy fewer cases will need to be repeated or transferred to other imaging modalities, resulting in lower cost. In addition to the advantages provided by the guidance aspects of the system, being able to verify in real time on intersecting, orthogonal planes that the instrument has successfully entered the target will greatly increase the confidence of the physician doing the procedure.
In summary, this software based tracking system is more reliable than a mechanical sensor based system because:
1. There is no need to perform alignment of the spatial location data with reference coordinate systems.
2. Superimposition of graphic elements on the ultrasound image is 100% precise because the spatial location of the graphic is defined by the voxels of the ultrasound image itself.
3. Appropriate algorithm manipulation can be performed so as to evaluate the resiliency of the surgical instrument within the body in real time.
4. External influences such as magnetic fields or line-of-sight interferences do not affect the system.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:
FIG. 1 is a block diagram of the software components of the IRGS;
FIG. 2 is a block diagram of the general flow of control of the IRGS;
FIG. 3 is a flow diagram describing the functioning of the algorithm of an instrument tracker;
FIG. 4 is a flow diagram describing the functioning of a target, or avoidance target, tracker;
FIG. 5 is a diagram illustrating the functioning of a slice generator;
FIG. 6 shows the flow of control of a graphic aids generator;
FIG. 7a and FIG. 7b show examples of graphic aids generated by an IRGS when oriented in the long axis of a surgical instrument; and
FIG. 8a and FIG. 8b show examples of graphic aids generated by an IRGS when oriented in the short axis of a surgical instrument.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
The present invention is an interventional radiology guidance system (IRGS). The IRGS enables ultrasound guided procedures to be performed with increased speed and accuracy by tracking the location of the medical instrument (by which is meant surgical instruments as well as other instruments used in interventional medical procedures, as defined above), the target, and the avoidance targets, and providing the operator with a graphical representation of the instrument's trajectory with relation to the target.
The principles and operation of an interventional radiology guidance system, according to the present invention, may be better understood with reference to the drawings and the accompanying description.
The hardware component of the present invention is a 3D ultrasound machine, including a processing unit, a storage device, and a display unit. In an additional embodiment, 6 DOF physical sensors are mounted on an instrument holder. The 3D ultrasound machine automatically produces one volume after another as the operator scans the subject. The Kretz (Kretztechnik, Austria) and 3D Ultrasound Inc (Durham, NC, USA) systems are examples of 3D ultrasound machines capable of supplying consecutive 3D volumes, and suitable for use in the IRGS. In the preferred embodiment, the processing unit and storage device are those present in the 3D ultrasound unit. In an alternative embodiment, the processing unit and storage device are additional components added to the 3D ultrasound unit. A suitable processing unit would include 128MB of RAM and a 166MHz processor. The display unit is that of the 3D ultrasound machine. In a preferred embodiment the system is fully integrated into an existing 3D ultrasound machine. In an alternative embodiment the system is a stand-alone unit, which can be connected to a 3D ultrasound machine.
The software components of the IRGS are located in the processing unit of the 3D ultrasound machine.
Referring now to the drawings, Figure 1 is a block diagram of the software components of an IRGS 12, and their interrelationship with some hardware components of the system. The software components are: a 3D volume generator 1, an instrument tracker 5, a target marker 7, a target tracker 8, an avoid marker 4, an avoid tracker 3, a slice generator 6, and a graphic aids generator 9. 3D Volume Generator 1 provides 3D ultrasound volumes to IRGS 12.
3D Volume Generator 1 is an external component located in the 3D ultrasound machine. Output from this component is a 3D volume.
Target Marker 7 provides a user interface which enables the user to designate targets. For example, the calipers used to make measurements on an ultrasound screen can be used to indicate the location of a target in a frozen ultrasound image, by marking a circle on the image in one of the views displayed. Target Marker 7 then stores the center point and radius of the designated target. In this way a sphere representing the target in three dimensions is defined. Avoid marker 4 provides a user interface which enables the user to indicate structures to be avoided (avoidance targets). As with target marking, calipers can be used to indicate the location of structures in a frozen ultrasound image, by marking a circle on the image in one of the views displayed. Avoid marker 4 stores the center point and radius of the structures to be avoided. In this way, a sphere representing the structure to be avoided in three dimensions is defined. The operator may mark multiple structures for avoidance, with avoid marker 4 being called each time a structure is marked for avoidance.
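The stored center-and-radius representation can be sketched as below; the assumption that the operator's two caliper points mark opposite ends of the circle's diameter is illustrative:

```python
import numpy as np

def sphere_from_calipers(p1, p2):
    """Two caliper points marking a diameter on the frozen image define the
    (center, radius) pair stored by Target Marker 7 or Avoid marker 4."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    return (p1 + p2) / 2.0, np.linalg.norm(p2 - p1) / 2.0
```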
Instrument tracker 5 is a software algorithm that locates the instrument within the volume provided by 3D Volume Generator 1. The output provided by this component is a list of x,y, and z coordinates describing the location of the instrument.
Instrument tracker 5, Avoid marker 4, and Target Marker 7 are hereinafter collectively referred to as image identification software modules.
The avoid tracker 3 component tracks structures that the operator wishes to avoid touching with the instrument (avoidance targets). It receives as input the position of the instrument, a 3D volume, and the details of the structures to be avoided. It then calculates the distance of the instrument from each of the structures that have been marked for avoidance. If the instrument is "too close"
(according to a pre-defined tolerance) a message is sent to Graphics Aids Generator 9 to change the structure's graphical symbol color and cause it to flash. A textual warning message is also displayed. Once the instrument is moved away from the avoidance target, another message is sent to Graphics
Aids Generator 9 to return the graphical symbol to its original color, stop the flashing, and remove the textual warning message. The algorithm used to track the structures to be avoided is identical to that used by instrument tracker 5.
Target tracker 8 is responsible for tracking the target in the 3D volume.
The details, center point and radius, of the target are provided by Target Marker
7 and the 3D volume is supplied by 3D Volume Generator 1. The output provided by this component is a set of coordinates representing the location and shape of the target.
Slice generator 6 generates specific 2D slices from within the 3D volume, according to the angles and orientation designated by the operator during setup of the system. Graphic Aids Generator 9 is responsible for graphically displaying the instrument, the target, the avoidance targets, and their relationships on each sequential ultrasound image. It receives as input the position and dimensions of the target, the avoidance targets and the instrument, as well as the 2D ultrasound images generated by slice generator 6. If the distance of the instrument from an avoidance target is "too close" (less than a pre-defined tolerance) then the avoidance target's graphical symbol color is changed (to red for example) and the symbol begins to flash. A textual warning is also displayed. Once the instrument is moved away from the structure, its graphical symbol returns to its original color, stops flashing, and the textual message disappears. When the instrument is inside the target, the target symbol's color changes (to green for example) and the target symbol begins to flash, showing that the user has succeeded in navigating the instrument to the desired location.
If a 6DOF sensor system is used then it is possible for IRGS 12 to receive an instrument location which is outside of the 3D volume. In such a case, graphics aids generator 9 draws the instrument outside of the 2D ultrasound image, in a position representative of the instrument's location with respect to the 3D volume.
A shared memory unit 2 stores 3D volume data generated by 3D Volume Generator 1 of the 3D ultrasound system.
A video card 10, which receives output from all of the software components, and a display unit 11, which displays input received from video card 10, are also shown.
A block diagram of the general flow of control of IRGS 12 is depicted in Figure 2. The first step in the iterative process is to access the 3D volume placed in shared memory 2 by 3D volume generator 1 of the 3D ultrasound system. Next a check 13 is performed to see if there are any requests to mark either a target or an avoidance target. If there is a request to mark a target, target marker 7 is called. If there is a request to mark an avoidance target, avoid marker 4 is called. Thereafter, instrument tracker 5 is called, in order to identify and find the location of the instrument. If there is a marked target 14, Target Tracker 8 is then called in order to find the target's location in the current 3D volume. Following that, avoid tracker 3 is called for each of the marked avoidance targets (if any) 15. Then, slice generator 6 is called to produce the 2D slices corresponding to each of the views. Finally, graphic aids generator 9, which draws the appropriate graphic aids onto the 2D slices, is called. Upon completion, the slices are sent to video card 10 for display on display unit 11, and the flow is repeated.
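The Figure 2 control flow can be summarized in schematic Python. The component objects and every method name here are stand-ins invented for this sketch, not an API defined by the patent:

```python
def irgs_control_loop(shared_memory, target_marker, avoid_marker,
                      instrument_tracker, target_tracker, avoid_tracker,
                      slice_generator, graphic_aids_generator, video_card):
    while True:
        volume = shared_memory.read_volume()            # from 3D volume generator 1
        if target_marker.request_pending():             # check 13
            target_marker.mark(volume)                  # target marker 7
        if avoid_marker.request_pending():
            avoid_marker.mark(volume)                   # avoid marker 4
        instrument = instrument_tracker.locate(volume)  # instrument tracker 5
        target = (target_tracker.locate(volume)         # target tracker 8,
                  if target_marker.has_target() else None)  # if marked 14
        for avoid in avoid_marker.marked():             # avoid tracker 3, 15
            avoid_tracker.check(volume, instrument, avoid)
        slices = slice_generator.make_slices(volume)    # slice generator 6
        frames = [graphic_aids_generator.draw(s, instrument, target)
                  for s in slices]                      # graphic aids generator 9
        video_card.display(frames)                      # video card 10 / display 11
```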
Figure 3 is a flow diagram describing the functioning of the algorithm of instrument tracker 5. The system works in an iterative fashion, identifying and locating the instrument in each consecutive 3D volume it receives. An N-depth FIFO queue is used to store sub-volumes. The first step is to read the 3D volume Tn from shared memory 2. The second step is to define a sub-volume of Tn, Ts, surrounding the instrument. If this is one of the first seven volumes received 16, the sub-volume is determined by a predefined location 17 (usually the surface of the body) or, in an alternative embodiment, by information received from a 6DOF sensor system. Otherwise the sub-volume is determined 18 by the location of the instrument in volume Tn-1. The sub-volume Ts is placed 19 in the FIFO and Ts-N is removed 20 from the FIFO (assuming that this is the 9th iteration or more). Ts is then registered 21 to Ts-1 using a standard registration algorithm such as that described by Barber (Barber D.C., Registration of low resolution medical images. Phys. Med. Biol., 27(3), pp. 87-96, 1992), which is incorporated herein by reference. Following this, the variances are calculated 22 on all of the voxels of all of the sub-volumes in the FIFO and normalized to grey level values. By calculating the variance of each voxel as it changes from frame to frame, the instrument is identified by identifying its motion. (Because the instrument is typically difficult to see in ultrasound images, standard edge detection or feature detection alone is often inadequate to identify the instrument, whereas the above described motion tracking software algorithm is more consistently successful at identifying the instrument.)
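A minimal sketch of the FIFO variance step follows, assuming the sub-volumes have already been registered to one another; the FIFO depth N and all names are illustrative choices:

```python
import numpy as np
from collections import deque

N = 8                         # FIFO depth; the patent leaves N unspecified
fifo = deque(maxlen=N)        # the oldest sub-volume is dropped automatically

def motion_map(registered_sub_volume):
    """Per-voxel variance over the FIFO, normalized to grey levels.

    High-variance voxels flag the moving instrument, which plain edge or
    feature detection often misses in ultrasound data.
    """
    fifo.append(np.asarray(registered_sub_volume, dtype=float))
    var = np.stack(list(fifo)).var(axis=0)   # variance of each voxel over time
    if var.max() > 0:
        var *= 255.0 / var.max()             # normalize to grey-level range
    return var
```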
Next, edge detection is performed 23 on the sub-volume Ts. A standard edge detection algorithm such as that described by Canny (J.F. Canny, A computational approach to edge detection, IEEE Trans. on Pattern Analysis and Machine Intelligence, 8(6):679-698, November 1986), which is incorporated herein by reference, is used. Using the edges produced by the edge detection algorithm, a 3D Hough transform 24 (R.O. Duda and P.E. Hart, "Use of the Hough Transform to Detect Lines and Curves in Pictures", Communications of the ACM, vol. 15, no. 1, pp. 11-15, 1972), which is incorporated herein by reference, is performed. The result is a list of lines (Ls). If this is the first iteration 25, then Ls is stored 29 as Ls-1 and the iteration is completed. If this is not the first iteration 25, then Ls and Ls-1 are matched 26 into pairs of lines. These pairs of lines are then used to calculate 27 a flow list, as described by Adiv (G. Adiv, Determining three-dimensional motion and structure from optical flow generated by several moving objects, IEEE Trans. on Pattern Analysis and Machine Intelligence, 7(4):384-401, 1985), incorporated herein by reference, the output of the flow list being a list of vectors. Segmentation is then performed 28 on this vector list, as described by Adiv, 1985, and incorporated herein by reference. The purpose of the segmentation is to identify which of the lines is most likely to represent the instrument; the output of the segmentation is the identified instrument. The x, y, and z points comprising the instrument are then sent 50 to graphic aids generator 9. The final step is to store 29 Ls as Ls-1, and then the next iteration begins. If a 6DOF sensor system, mounted on the instrument (or its holder) and the transducer, is being used, then instrument tracker 5 first calls the 6DOF sensor system to determine if the instrument is within the 3D volume. If it is not within the 3D volume, position data supplied by the sensor to instrument tracker 5 is output directly by instrument tracker 5, without using the algorithm described above. If the instrument is within the 3D volume, the above-described algorithm is implemented. The advantage of using the 6DOF sensors is that when the instrument is inside the 3D volume, the initial sub-volume (Ts) searched for the instrument is defined by position data received from the sensor and is thus much smaller. This improves the performance of the instrument tracking algorithm.
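The matching and flow steps can be illustrated with a deliberately simplified stand-in for the Adiv formulation (not a transcription of it): lines from consecutive iterations are paired greedily by endpoint proximity, a displacement ("flow") vector is computed for each pair, and the most-moved line is taken to be the instrument.

```python
import numpy as np

def match_lines(lines_prev, lines_cur):
    """Greedily pair 3D line segments (each a (2, 3) array of endpoints)
    from consecutive iterations by summed endpoint distance."""
    pairs, used = [], set()
    for lp in lines_prev:
        best_j, best_d = None, np.inf
        for j, lc in enumerate(lines_cur):
            if j in used:
                continue
            d = np.linalg.norm(lp - lc)   # distance between endpoint pairs
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            used.add(best_j)
            pairs.append((lp, lines_cur[best_j]))
    return pairs

def pick_instrument(pairs):
    """Compute a midpoint displacement ('flow') vector for every pair and
    return the current-frame line that moved the most."""
    flows = [(((lc[0] + lc[1]) - (lp[0] + lp[1])) / 2.0, lc) for lp, lc in pairs]
    motion, line = max(flows, key=lambda f: np.linalg.norm(f[0]))
    return line, motion
```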
Figure 4 is a flow diagram describing the functioning of the algorithm of target tracker 8. The system works in an iterative fashion, locating the target in each consecutive 3D volume it receives. Two 3D volumes are required to find the target. On the first iteration, the previous volume, Tn-1, is the frozen volume used by target marker 7, and prev_target is that supplied by target marker 7. If it is not the first iteration, then Tn-1 and prev_target are from the previous iteration. Tn, the current volume, is read from shared memory 2. Then, using the two volumes and the details of the previous target, registration 30 is performed as described by Barber, 1992, and incorporated herein by reference. Registration identifies an area in the volume corresponding to the new target. The target's position, defined as its center point, is calculated 31 and passed to graphic aids generator 9. Finally, before continuing on to the next iteration, the target is stored 32 as prev_target and Tn is stored 33 as Tn-1. The next iteration begins by once again reading a 3D volume from shared memory 2.
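As a simplified stand-in for the Barber registration step (named plainly: this sketch substitutes an exhaustive normalized cross-correlation search for true volume registration), the previous target's neighbourhood can be relocated in the new volume as follows.

```python
import numpy as np

def relocate_target(vol_cur, prev_center, patch, search=10):
    """Exhaustively search a +/- `search` voxel neighbourhood of the previous
    center for the best normalized cross-correlation with `patch` (the
    target's previous surroundings). Returns the new center estimate."""
    pz, py, px = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-6)
    cz, cy, cx = prev_center
    best_score, best_center = -np.inf, tuple(prev_center)
    for dz in range(-search, search + 1):
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                z0 = cz + dz - pz // 2
                y0 = cy + dy - py // 2
                x0 = cx + dx - px // 2
                if min(z0, y0, x0) < 0:
                    continue                      # window would leave the volume
                win = vol_cur[z0:z0 + pz, y0:y0 + py, x0:x0 + px]
                if win.shape != patch.shape:
                    continue                      # clipped at the far boundary
                w = (win - win.mean()) / (win.std() + 1e-6)
                score = float((p * w).mean())
                if score > best_score:
                    best_score = score
                    best_center = (cz + dz, cy + dy, cx + dx)
    return best_center
```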
Figure 5 is a diagram illustrating the functioning of slice generator 6. Each slice is a virtual image frame buffer defined in a world coordinate system. The voxels pierced by the virtual frame are sampled, mapped and displayed in their image coordinate system after the frame is clipped against the volume buffer. In the figure, "P" stands for "point", with P1 being point #1, and B represents the 3D point mapped to 2D. A voxelization algorithm for planar polygons, as described by D. Cohen and A. Kaufman (Scan conversion algorithms for linear and quadratic objects, in: A. Kaufman, Ed., Volume Visualization, IEEE Computer Society Press, Los Alamitos, CA, 1990, pages 280-301), incorporated herein by reference, is used. The algorithm is basically an extension of the widely known 2D scan-line algorithm (J.D. Foley, A. van Dam, S.K. Feiner and J.F. Hughes, Computer Graphics: Principles and Practice, Addison-Wesley, 1990), in which the third dimension is also interpolated at each scan-line. Alternatively, the sweeping technique described by Cohen, Kaufman and Kong (D. Cohen-Or, A. Kaufman and T.Y. Kong, On the Soundness of Surface Voxelizations, in: Topological Algorithms for Digital Image Processing, T. Yung Kong and A. Rosenfeld (eds.), North-Holland, Amsterdam, 1995, pages 181-204), incorporated herein by reference, can be used. In this technique a polygon is generated by replicating one discrete line over the other, thus saving most of the computations involved in the discretization of the plane. The output generated by slice generator 6 is one 2D ultrasound image for each view chosen by the operator. Because the 3D coordinates describing the orientation and tip of the instrument within the 3D volume are known, slice generator 6 is able to calculate the four points which define a plane oriented along the long or short axis of the instrument. It is thus a feature of the IRGS that the operator can choose to display a 2D ultrasound plane which is consistently oriented along the long or short axis of the instrument, regardless of the motion of the instrument during the procedure. It will be appreciated that multiple other orientations of the 2D imaging plane can be chosen by the operator. In a further embodiment of the current invention, a 3D ultrasound image could be rendered from the 3D volume for viewing by the operator, instead of individual 2D ultrasound images being rendered by slice generator 6.

Figure 6 shows the flow of control of the graphic aids generator 9 module. This component works iteratively; for each slice received 34, the following steps are performed. First, the 3D representation of the instrument is mapped 35 to a 2D representation and the instrument is drawn 36. Then, if there is a target 37, the 3D representation of the target is mapped to a 2D representation. If the instrument has hit the target 39, the target is drawn 40 as a flashing green object; if the target has not been hit, the target is drawn 41 without flashing. At this point a loop 42 begins in which each avoid object is processed. For each avoid object, the 3D representation thereof is mapped 43 to a corresponding 2D representation and is drawn in one of two ways: if the instrument hits the avoid object, the avoid object is drawn 44 as a flashing red object; otherwise the avoid object is drawn normally 46.
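Returning to slice generator 6, its core operation, sampling an arbitrarily oriented plane out of the volume buffer, can be sketched as follows. Nearest-neighbour sampling is used for brevity; the voxelization schemes cited above are more efficient and more careful about coverage. All names here are illustrative assumptions.

```python
import numpy as np

def extract_slice(volume, origin, u, v, width, height):
    """Sample a 2D slice (the virtual image frame buffer) out of a 3D volume.
    `origin` is the slice corner in voxel coordinates; `u` and `v` are
    orthogonal unit vectors spanning the plane. Samples falling outside
    the volume buffer are clipped to zero."""
    origin = np.asarray(origin, float)
    u, v = np.asarray(u, float), np.asarray(v, float)
    ii, jj = np.meshgrid(np.arange(width), np.arange(height))  # pixel grid
    pts = origin + jj[..., None] * v + ii[..., None] * u       # world coordinates
    idx = np.rint(pts).astype(int)                 # nearest-neighbour sampling
    out = np.zeros((height, width), dtype=volume.dtype)
    inside = np.all((idx >= 0) & (idx < np.array(volume.shape)), axis=-1)
    z, y, x = idx[inside, 0], idx[inside, 1], idx[inside, 2]
    out[inside] = volume[z, y, x]
    return out
```

To obtain the long-axis view described above, u would be chosen parallel to the instrument's direction and v perpendicular to it, with the origin placed so that the plane passes through the instrument tip.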
Figures 7a, 7b, 8a, and 8b show examples of graphic aids generated by graphic aids generator 9 to aid the operator in orienting the instrument relative to the target. In figures 7a and 7b, the imaging slice has been chosen (by the operator) to lie constantly in the long axis of the surgical instrument, so the full length of the instrument, including its tip, is seen. The circle represents the target, the arrow represents the instrument, and the dotted line represents a projection from the target to the instrument. When the dotted line and the instrument form a straight line, as in figure 7b, the instrument orientation is optimal. In figures 8a and 8b, the imaging plane is in the short axis of the surgical instrument. The light circle represents the target and the dark circle represents the instrument. Only when the dark circle is located inside the light circle, as seen in figure 8b, will the instrument penetrate the target if inserted further.

The above description relates to image processing of ultrasound images extracted from a 3D volume of ultrasound image data. It will be understood that, by using the same software components (with the exception of slice generator 6, which would not be utilized), the image processing functions of marking and tracking the instrument, targets and avoidance targets could be performed by IRGS 12 using 2D ultrasound image data instead of 3D data. In this instance, however, the operator would be unable to freely select a desired angle at which to generate the ultrasound image, and could not view more than one image simultaneously. In addition, use of 2D ultrasound data rather than 3D data would preclude a 3D analysis of the relationships between the viewed structures; IRGS 12 would describe only the 2D relationships between the target, instrument and avoidance targets. Similarly, it will be further understood that additional 2D or 3D digital image data sources (other than ultrasound), such as CT, MRI, MRT and the like, can be used to perform the above-described image processing functions (marking and tracking of the instrument, targets and avoidance targets, and description of the relationships between them) by using the same software components, provided that the image data source generates chronologically sequential images allowing for the depiction of dynamic events. The phrase "mechanism for obtaining chronologically sequential images" hereinafter refers to all such image data sources.
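The alignment aid of figures 7a and 7b reduces to a collinearity test: the orientation is optimal when the instrument's direction vector and the vector from its tip to the target center are parallel. A minimal sketch, assuming a rigid, straight instrument:

```python
import numpy as np

def alignment_error_deg(tip, direction, target_center):
    """Angle, in degrees, between the instrument's pointing direction and
    the straight line from its tip to the target center. Zero means the
    dotted projection line and the instrument are collinear, as in 7b."""
    d = np.asarray(direction, float)
    to_target = np.asarray(target_center, float) - np.asarray(tip, float)
    cosang = np.dot(d, to_target) / (np.linalg.norm(d) * np.linalg.norm(to_target))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
```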

Claims

WHAT IS CLAIMED IS:
1. An interventional radiological guidance system, comprising: a) a mechanism for obtaining a plurality of chronologically sequential images of a body; b) an image identification software module operable to designate at least one self-referential spatial location, defined by a first set of spatial coordinates, within at least one of said chronologically sequential images, said self-referential spatial location corresponding to at least one structure in said body; c) an image tracking software module operable to self-referentially locate a second set of spatial coordinates in a chronologically subsequent image of said body, said second set of spatial coordinates corresponding to said first set of spatial coordinates in a chronologically precedent image of said body, such that said second set of spatial coordinates correspond to said at least one structure; and d) a graphic aids generator software module operable to generate at least one signal marking a location of said at least one structure in said chronologically sequential images, and operable to describe at least one relationship between a plurality of said at least one structures in said chronologically sequential images.
2. The system of claim 1, wherein said mechanism is selected from the group consisting of three-dimensional ultrasound machines, two-dimensional ultrasound machines, computerized tomography scanners, magnetic resonance imaging scanners, and magnetic resonance therapeutics scanners.
3. The system of claim 1, wherein said body is a human body.
4. The system of claim 1, wherein said at least one self-referential spatial location is designated by an image processing software algorithm.
5. The system of claim 4, wherein said image processing software algorithm is a motion tracking software algorithm.
6. The system of claim 1, wherein said at least one self-referential spatial location is manually designated.
7. The system of claim 1, wherein said structure is selected from the group consisting of medical instruments and organic structures.
8. The system of claim 7, wherein said medical instrument is a surgical instrument being used in an interventional radiological procedure.
9. The system of claim 7, wherein said organic structures are selected from the group consisting of a target of a radiologically guided interventional medical procedure, and a structure which it is desired to avoid in a radiologically guided interventional medical procedure.
10. The system of claim 1, further comprising e) a display unit operable to display said at least one signal and said at least one relationship, wherein said at least one signal is displayed as a visual graphic.
11. The system of claim 1, wherein said at least one relationship is selected from the group consisting of a desired orientation of one of said at least one structure relative to a second of said at least one structure, a desired minimum distance between one of said at least one structure and a second of said at least one structure, and a desired maximum distance between one of said at least one structure and a second of said at least one structure.
12. The system of claim 11, wherein said relationship is displayed as a visual graphic.
13. The system of claim 1, further comprising e) a sound production unit operable to produce an audible signal corresponding to said at least one relationship.
14. An interventional radiological guidance method, comprising the steps of: a) obtaining a plurality of chronologically sequential images of a body; b) designating at least one self-referential spatial location, defined by a first set of self-referential spatial coordinates, within at least one of said chronologically sequential images, said self-referential spatial location corresponding to at least one structure in said body; c) self-referentially locating a second set of spatial coordinates in a chronologically subsequent image of said body, said second set of spatial coordinates corresponding to said first set of spatial coordinates in a chronologically precedent image of said body, such that said second set of spatial coordinates correspond to said at least one structure; and d) generating at least one signal marking a location of said at least one structure in said chronologically sequential images.
15. The method of claim 14, wherein said images are obtained using a mechanism selected from the group consisting of three-dimensional ultrasound machines, two-dimensional ultrasound machines, computerized tomography scanners, magnetic resonance imaging scanners, and magnetic resonance therapeutics scanners.
16. The method of claim 14, wherein said body is a human body.
17. The method of claim 14, wherein said at least one self-referential spatial location is designated by an image processing software algorithm.
18. The method of claim 17, wherein said image processing software algorithm is a motion tracking software algorithm.
19. The method of claim 14, wherein said at least one self-referential spatial location is manually designated.
20. The method of claim 14, wherein said structure is selected from the group consisting of medical instruments and organic structures.
21. The method of claim 20, wherein said medical instrument is a surgical instrument being used in an interventional radiological procedure.
22. The method of claim 20, wherein said organic structures are selected from the group consisting of a target of a radiologically guided interventional medical procedure, and a structure which it is desired to avoid in a radiologically guided interventional medical procedure.
23. The method of claim 14, further comprising the step of e) displaying said at least one signal as a visual graphic on a display.
24. The method of claim 23, further comprising the steps of f) describing at least one relationship between a plurality of said structures in at least one of said chronologically sequential images, and g) displaying said relationship as a visual graphic on a display.
25. The method of claim 24, wherein said relationship is selected from the group consisting of a desired orientation of one of said structures relative to a second of said structures, a desired minimum distance between one of said structures and a second of said structures, and a desired maximum distance between one of said structures and a second of said structures.
26. The method of claim 24, further comprising the step of producing an audible signal corresponding to said relationship.
PCT/US1998/019124 1997-09-29 1998-09-16 Interventional radiology guidance system WO1999016352A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU94826/98A AU9482698A (en) 1997-09-29 1998-09-16 Interventional radiology guidance system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US93966297A 1997-09-29 1997-09-29
US08/939,662 1997-09-29

Publications (1)

Publication Number Publication Date
WO1999016352A1 true WO1999016352A1 (en) 1999-04-08

Family

ID=25473544

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/019124 WO1999016352A1 (en) 1997-09-29 1998-09-16 Interventional radiology guidance system

Country Status (2)

Country Link
AU (1) AU9482698A (en)
WO (1) WO1999016352A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5383454A (en) * 1990-10-19 1995-01-24 St. Louis University System for indicating the position of a surgical probe within a head on an image of the head
US5383454B1 (en) * 1990-10-19 1996-12-31 Univ St Louis System for indicating the position of a surgical probe within a head on an image of the head
US5447154A (en) * 1992-07-31 1995-09-05 Universite Joseph Fourier Method for determining the position of an organ
US5526812A (en) * 1993-06-21 1996-06-18 General Electric Company Display system for enhancing visualization of body structures during medical procedures
US5765561A (en) * 1994-10-07 1998-06-16 Medical Media Systems Video-based surgical targeting system
US5671739A (en) * 1995-04-03 1997-09-30 General Electric Company Imaging of interventional devices during medical procedures
US5638819A (en) * 1995-08-29 1997-06-17 Manwaring; Kim H. Method and apparatus for guiding an instrument to a target

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7769429B2 (en) 2000-07-06 2010-08-03 Ao Technology Ag Method and device for impingement detection
WO2002002028A1 (en) * 2000-07-06 2002-01-10 Synthes Ag Chur Method and device for impingement detection
US7371067B2 (en) 2001-03-06 2008-05-13 The Johns Hopkins University School Of Medicine Simulation method for designing customized medical devices
EP1323380A3 (en) * 2001-12-31 2003-08-27 Medison Co., Ltd. Method and apparatus for ultrasound imaging of a biopsy needle
JP2003284717A (en) * 2001-12-31 2003-10-07 Medison Co Ltd Biopsy needle observation device, and method for the same
US6764449B2 (en) 2001-12-31 2004-07-20 Medison Co., Ltd. Method and apparatus for enabling a biopsy needle to be observed
US7720522B2 (en) 2003-02-25 2010-05-18 Medtronic, Inc. Fiducial marker devices, tools, and methods
US7270634B2 (en) 2003-03-27 2007-09-18 Koninklijke Philips Electronics N.V. Guidance of invasive medical devices by high resolution three dimensional ultrasonic imaging
WO2004084736A1 (en) * 2003-03-27 2004-10-07 Koninklijke Philips Electronics N.V. Guidance of invasive medical devices by three dimensional ultrasonic imaging
US10470725B2 (en) 2003-08-11 2019-11-12 Veran Medical Technologies, Inc. Method, apparatuses, and systems useful in conducting image guided interventions
US8150495B2 (en) 2003-08-11 2012-04-03 Veran Medical Technologies, Inc. Bodily sealants and methods and apparatus for image-guided delivery of same
US11426134B2 (en) 2003-08-11 2022-08-30 Veran Medical Technologies, Inc. Methods, apparatuses and systems useful in conducting image guided interventions
US11154283B2 (en) 2003-08-11 2021-10-26 Veran Medical Technologies, Inc. Bodily sealants and methods and apparatus for image-guided delivery of same
US7742639B2 (en) 2004-04-16 2010-06-22 Koninklijke Philips Electronics N.V. Data set visualization
US11304629B2 (en) 2005-09-13 2022-04-19 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US11304630B2 (en) 2005-09-13 2022-04-19 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US9218664B2 (en) 2005-09-13 2015-12-22 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US9218663B2 (en) 2005-09-13 2015-12-22 Veran Medical Technologies, Inc. Apparatus and method for automatic image guided accuracy verification
US10617332B2 (en) 2005-09-13 2020-04-14 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
EP1795130A1 (en) * 2005-12-05 2007-06-13 Medison Co., Ltd. Ultrasound system for interventional treatment
US8619862B2 (en) 2008-03-18 2013-12-31 Thomson Licensing Method and device for generating an image data stream, method and device for reconstructing a current image from an image data stream, image data stream and storage medium carrying an image data stream
US9439735B2 (en) 2009-06-08 2016-09-13 MRI Interventions, Inc. MRI-guided interventional systems that can track and generate dynamic visualizations of flexible intrabody devices in near real time
US9259290B2 (en) 2009-06-08 2016-02-16 MRI Interventions, Inc. MRI-guided surgical systems with proximity alerts
US8886288B2 (en) 2009-06-16 2014-11-11 MRI Interventions, Inc. MRI-guided devices and MRI-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time
US8663110B2 (en) 2009-11-17 2014-03-04 Samsung Medison Co., Ltd. Providing an optimal ultrasound image for interventional treatment in a medical system
US10898057B2 (en) 2010-08-20 2021-01-26 Veran Medical Technologies, Inc. Apparatus and method for airway registration and navigation
US10165928B2 (en) 2010-08-20 2019-01-01 Mark Hunter Systems, instruments, and methods for four dimensional soft tissue navigation
US11690527B2 (en) 2010-08-20 2023-07-04 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US10264947B2 (en) 2010-08-20 2019-04-23 Veran Medical Technologies, Inc. Apparatus and method for airway registration and navigation
US11109740B2 (en) 2010-08-20 2021-09-07 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
EP2454996A1 (en) * 2010-11-17 2012-05-23 Samsung Medison Co., Ltd. Providing an optimal ultrasound image for interventional treatment in a medical system
US10977789B2 (en) 2012-02-22 2021-04-13 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US10140704B2 (en) 2012-02-22 2018-11-27 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US9138165B2 (en) 2012-02-22 2015-09-22 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US9972082B2 (en) 2012-02-22 2018-05-15 Veran Medical Technologies, Inc. Steerable surgical catheter having biopsy devices and related systems and methods for four dimensional soft tissue navigation
US11403753B2 (en) 2012-02-22 2022-08-02 Veran Medical Technologies, Inc. Surgical catheter having side exiting medical instrument and related systems and methods for four dimensional soft tissue navigation
US10460437B2 (en) 2012-02-22 2019-10-29 Veran Medical Technologies, Inc. Method for placing a localization element in an organ of a patient for four dimensional soft tissue navigation
US11551359B2 (en) 2012-02-22 2023-01-10 Veran Medical Technologies, Inc Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US10249036B2 (en) 2012-02-22 2019-04-02 Veran Medical Technologies, Inc. Surgical catheter having side exiting medical instrument and related systems and methods for four dimensional soft tissue navigation
US11830198B2 (en) 2012-02-22 2023-11-28 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US10624701B2 (en) 2014-04-23 2020-04-21 Veran Medical Technologies, Inc. Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US10617324B2 (en) 2014-04-23 2020-04-14 Veran Medical Technologies, Inc Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue
US11553968B2 (en) 2014-04-23 2023-01-17 Veran Medical Technologies, Inc. Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter

Also Published As

Publication number Publication date
AU9482698A (en) 1999-04-23

Similar Documents

Publication Publication Date Title
US20220291741A1 (en) Using Optical Codes with Augmented Reality Displays
WO1999016352A1 (en) Interventional radiology guidance system
US6669635B2 (en) Navigation information overlay onto ultrasound imagery
EP1103229B1 (en) System and method for use with imaging devices to facilitate planning of interventional procedures
US7085400B1 (en) System and method for image based sensor calibration
US6690960B2 (en) Video-based surgical targeting system
US5608849A (en) Method of visual guidance for positioning images or data in three-dimensional space
US6675032B2 (en) Video-based surgical targeting system
US8248414B2 (en) Multi-dimensional navigation of endoscopic video
US8831310B2 (en) Systems and methods for displaying guidance data based on updated deformable imaging data
US8248413B2 (en) Visual navigation system for endoscopic surgery
JP3589505B2 (en) 3D image processing and display device
US6049622A (en) Graphic navigational guides for accurate image orientation and navigation
JP2003531516A (en) Enhanced visualization of live breast biopsy locations for medical documentation
WO2001058359A1 (en) Ultrasonic imager
JPH09508994A (en) Image forming apparatus and method
Welch et al. A real-time freehand 3D ultrasound system for image-guided surgery
US20220202493A1 (en) Alignment of Medical Images in Augmented Reality Displays
US7376254B2 (en) Method for surface-contouring of a three-dimensional image
CN114845655A (en) 3D path detection visualization
Shahidi et al. Volumetric image guidance via a stereotactic endoscope
US20230248441A1 (en) Extended-reality visualization of endovascular navigation
US20220211440A1 (en) Camera-Assisted Image-Guided Medical Intervention
Juszczyk et al. Time Regarded Method of 3D Ultrasound Reconstruction
Maitland et al. A Video Based Tracker for use in Computer Aided Surgery.

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

122 Ep: pct application non-entry in european phase