CA2240651A1 - Intravascular ultrasound enhanced image and signal processing - Google Patents

Intravascular ultrasound enhanced image and signal processing

Info

Publication number
CA2240651A1
CA2240651A1
Authority
CA
Canada
Prior art keywords
image
detector
images
ultrasound
ultrasound signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002240651A
Other languages
French (fr)
Inventor
Ehud Nachtomy
Jacob Richter
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medinol Ltd
Original Assignee
Medinol Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Medinol Ltd
Publication of CA2240651A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52023 Details of receivers
    • G01S 7/52034 Data rate converters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52023 Details of receivers
    • G01S 7/52036 Details of receivers using analysis of echo signal for target characterisation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52046 Techniques for image enhancement involving transmitter or receiver
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/06 Measuring blood flow
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8934 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration
    • G01S 15/8938 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration using transducers mounted for mechanical movement in two dimensions
    • G01S 15/894 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration using transducers mounted for mechanical movement in two dimensions by rotation about a single axis
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8979 Combined Doppler and pulse-echo imaging systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52023 Details of receivers
    • G01S 7/52025 Details of receivers for pulse systems

Abstract

A device and method for intravascular ultrasound imaging. A catheter including ultrasonic apparatus is introduced into and may be moved through a bodily lumen. The apparatus transmits ultrasonic signals and detects reflected ultrasound signals which contain information relating to the bodily lumen. A processor coupled to the catheter is programmed to derive a first image or series of images and a second image or series of images from the detected ultrasound signals. The processor is also programmed to compare the second image or series of images to the first image or series of images, respectively. The processor may be programmed to stabilize the second image in relation to the first image and to limit drift. The processor may also be programmed to monitor the first and second images for cardiovascular periodicity, image quality, temporal change and vasomotion. It can also match the first series of images and the second series of images.

Description


INTRAVASCULAR ULTRASOUND ENHANCED
IMAGE AND SIGNAL PROCESSING

Field Of The Invention

The present invention relates to a device and method for enhanced image and signal processing for Intravascular Ultrasound ("IVUS"), and more specifically, to a device and method for processing IVUS image and signal information which will enhance the quality and utility of IVUS images.

Background Information

IVUS images are derived from a beam of ultrasonic energy projected by apparatus such as a transducer or transducer array located around, along or at the tip of a catheter inserted within a blood vessel. An ultrasound beam from the apparatus is continuously rotated within the blood vessel forming a 360° internal cross sectional image, i.e., the image is formed in a transverse (X-Y) plane. Depending on the specific apparatus configuration, the image may be derived either from the same transverse plane of the apparatus or from a transverse plane found slightly forward (i.e., distal) of the transverse plane of the apparatus. If the catheter is moved inside and along the blood vessel (i.e., along the Z-axis), images of various segments (series of consecutive cross sections) of the vessel may be formed and displayed.

IVUS may be used in all types of blood vessels, including but not limited to arteries, veins and other peripheral vessels, and in all parts of a body.
The ultrasonic signal that is received (detected) is originally an analog signal. This signal is processed using analog and digital methods so as to eventually form a set of vectors comprising digitized data. Each vector represents the ultrasonic response of a different angular sector of the vessel, i.e., a section of the blood vessel. The number of data elements in each vector (axial sampling resolution) and the number of vectors used to scan a complete cross section (lateral sampling resolution) of the vessel may vary depending on the type of system used.

The digitized vectors may initially be placed into a two-dimensional array or matrix having Polar coordinates, i.e., A(r, θ). In this Polar matrix, for example, the X axis corresponds to the r coordinate and the Y axis corresponds to the θ coordinate. Each value of the matrix is a value (ranging from 0-255 if the system is 8 bit) representing the strength of the ultrasonic response at that location.

This Polar matrix is not usually transferred to a display because the resultant image will not be easily interpreted by a physician. The information stored in the Polar matrix A(r, θ) usually undergoes several processing stages and is interpolated into Cartesian coordinates, e.g., X and Y coordinates (A(X, Y)), that are more easily interpreted by a physician. Thus, the X and Y axes of matrix A(X, Y) will correspond to the Cartesian representation of the vessel's cross-section. The information in the Cartesian matrix possibly undergoes further processing and is eventually displayed for analysis by a physician. Images are acquired and displayed at a variable rate, depending on the system. Some systems can acquire and display images at video-display rate, e.g., up to about 30 images per second.
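
The Polar-to-Cartesian interpolation described above can be illustrated with a short sketch. This is not the patent's implementation; it assumes a NumPy array a_polar indexed as a_polar[theta, r] and uses simple nearest-neighbour lookup, with the output size chosen arbitrarily.

```python
import numpy as np

def polar_to_cartesian(a_polar, out_size=256):
    """Nearest-neighbour scan conversion of a Polar matrix A(r, theta) to Cartesian."""
    n_theta, n_r = a_polar.shape
    cart = np.zeros((out_size, out_size), dtype=a_polar.dtype)
    center = (out_size - 1) / 2.0
    y, x = np.mgrid[0:out_size, 0:out_size]
    dx, dy = x - center, y - center
    r = np.hypot(dx, dy)                                   # radial distance in display pixels
    theta = np.mod(np.arctan2(dy, dx), 2.0 * np.pi)        # angle in [0, 2*pi)
    r_idx = np.round(r / center * (n_r - 1)).astype(int)   # nearest sample along r
    t_idx = np.mod(np.round(theta / (2.0 * np.pi) * n_theta).astype(int), n_theta)
    inside = r_idx < n_r                                   # pixels covered by the scan
    cart[inside] = a_polar[t_idx[inside], r_idx[inside]]
    return cart
```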

IVUS examination of a segment of a bodily lumen, i.e., a vessel, is generally performed by situating the catheter distal (i.e., downstream) to the segment to be reviewed, and then the catheter is pulled back (pullback) slowly along the bodily lumen (Z-axis) so that successive images that form the segment are continuously displayed. In many cases the catheter is connected to a mechanical pulling device which pulls the catheter at a constant speed (a typical speed is approximately 0.5 - 1 mm/sec.).

In IVUS imaging systems today the technique described above for displaying an image of a cross section of a bodily lumen, e.g., blood vessel, is generally used. These systems are deficient, however, because they do not include any form of stabilization of the images to compensate for movements of the catheter and/or bodily lumen, e.g., blood vessel. It is well known that during IVUS imaging of a bodily lumen, there is always motion exhibited by the catheter and/or the bodily lumen. This motion might be exhibited in the transverse (X-Y) plane, along the vessel axis (Z-axis) or a combination of those movements. The imaging catheter can also be tilted in relation to the vessel so that the imaging plane is not perpendicular to the Z-axis (this movement shall be termed angulation). These movements are caused by, among other things, beating of the heart, blood and/or other fluid flow through the lumen, vasomotion, forces applied by the physician, and other forces caused by the physiology of the patient.

In IVUS systems today, when the imaging catheter is stationary or when performing slow manual or mechanical pullback, relative movement between the catheter and the lumen is the primary factor for the change in appearance between successive images, i.e., as seen on the display and/or on film or video. This change in appearance occurs because the rate of change of an image due to movements is much greater than the rate of change in the real morphology due to pullback.

Stabilization occurs when the images include compensation for the relative movement between the catheter and the lumen in successive images. Because none of the IVUS systems used today perform stabilization, there is no compensation for or correction of relative movements between the catheter and the lumen. As a result, morphological features are constantly moving or rotating, i.e., on the display and/or film or video. This makes it difficult for the physician to accurately interpret morphology in an IVUS dynamic display. Furthermore, when non-stabilized IVUS images are fed as an input to a processing algorithm such as 3D reconstruction or different types of filters that process a set of successive images, this can lead to degraded performance and misdiagnosis or inaccurate determinations.

Current IVUS imaging apparatus or catheters may have occasional malfunctions of an electronic or mechanical origin. These can cause displayed images to exhibit both recognized and unrecognized artifacts and obscure the real morphology. Currently there are no automatic methods to determine whether images possess these types of artifacts, which hamper the analysis of the images of the vessel or bodily lumen.

The behavior of cardiovascular function is generally periodic. The detection of this periodicity and the ability to establish correlation between an image and the temporal phase in the cardiac cycle to which it belongs is referred to as cardiac gating.

Currently, cardiac gating is performed by using an external signal, usually an ECG (Electro-Cardiogram). However, ECG gating requires both the acquisition of the ECG signal and its interleaving (or synchronization) with the IVUS image. This requires additional hardware/software.

Morphological features in IVUS images of blood vessels can be broken into three general categories: the lumen, i.e., the area through which the blood or other bodily fluid flows; the vessel layers; and the exterior, i.e., the tissue or morphology outside of the vessel. Blood in most IVUS films (images) is characterized by a rapidly changing speckular pattern. The exterior of the vessel also alternates with high temporal frequency. Currently, the temporal behavior of pixels and their textural attributes is not monitored automatically.

Vasomotion, in the context of bodily lumens, e.g., blood vessels, is defined as the change in the caliber of the lumen, e.g., vessel. This change can be brought about by natural circumstances or under induced conditions. Vasomotion can have a dynamic component, i.e., a dynamic change of the lumen's dimensions, e.g., the vessel's caliber (contraction and dilation), during the cardiovascular cycle, and a baseline static component, i.e., a change in the baseline caliber of the lumen, e.g., vessel.

Vasomotion can be expressed as quantitative physiological parameters indicating the ability of the lumen, e.g., vessel, to change its caliber under certain conditions. These types of parameters have current and possibly future medical and diagnostic importance in providing information regarding the state of the lumen, e.g., vessel, and the effect of the therapy performed.

IVUS can be used to monitor vasomotion because it provides an image of the lumen's baseline caliber and its dynamic changes. Additionally, IVUS can be used to monitor whether the vasomotion is global (uniform), i.e., where the entire cross-section of the lumen contracts/dilates in the same magnitude and direction. IVUS can also be used to determine whether the vasomotion is non-uniform, which leads to local changes in the caliber of the lumen, i.e., different parts of the lumen cross-section behave differently.

Currently, all types of vasomotion monitoring by IVUS are performed manually. This is tedious, time consuming and prevents monitoring of the vasomotion in real time.

Interpretation of IVUS images is achieved through analysis of the composition of the static images and monitoring of their temporal behavior. Most IVUS images can be divided into three basic parts. The innermost section is the flow passage of the lumen, i.e., the cavity through which matter, i.e., blood, flows. Around the flow passage is the actual vessel, which may include blood vessels and any other bodily vessels, and which is composed of multiple layers of tissue (and plaque, if diseased). Outside the vessel is other tissue which may belong to the surrounding morphology, for example, the heart in a coronary vessel image.

When the IVUS film is viewed dynamically, i.e., in film format, the pixels corresponding to matter flowing through the vessel and to the morphology exterior to the vessel exhibit a different temporal behavior than the vessel itself. For example, in most IVUS films, blood flowing through the vessel is characterized by a frequently alternating speckular pattern. The morphology exterior to the vessel also exhibits frequent alternation. Currently the temporal behavior of pixels in dynamic IVUS images is not monitored automatically.

In current IVUS displays, if designed into the system, high frequency temporal changes are suppressed by means such as averaging over a number of images. However, this sometimes fails to suppress the appearance of features with high amplitudes, i.e., bright gray values, and it also has a blurring effect.

The size of the flow passage of the lumen is a very important diagnostic parameter. When required for diagnosis, it is manually determined by, for example, a physician. This is accomplished by drawing the contour of the flow passage borders superimposed on a static image, e.g., frozen on video or on a machine display. This method of manual extraction is time consuming, inaccurate and subject to bias.

Currently, there is commercial image processing software for the automatic extraction of the flow passage. However, these tools are based on the gray value composition of static images and do not take into account the different temporal behavior exhibited by the material, e.g., blood, flowing through the passage as opposed to the vessel layers.

During treatment of vessels, it is common practice to repeat IVUS pullback examinations in the same vessel segments. For example, a typical situation is first to review the segment in question, evaluate the disease (if any), remove the IVUS catheter, consider therapy options, perform therapy, e.g., PTCA "balloon" or stenting, and then immediately thereafter reexamine the treated segment using IVUS in order to assess the results of the therapy. To properly evaluate the results and fully appreciate the effect of the therapy performed, it is desirable that the images of the pre-treated and post-treated segments, which reflect cross sections of the vessel lying at the same locations along the vessel's Z-axis (i.e., corresponding segments), be compared. To accomplish this comparison it must be determined which locations in the films of the pre-treatment IVUS images and post-treatment IVUS images correspond to one another. This procedure, called matching (registration), allows an accurate comparison of pre- and post-treatment IVUS images.

Currently, matching is usually performed by viewing the IVUS pullback films of pre- and post-treatment segments, one after the other or side by side, using identifiable anatomical landmarks to locate the sequences that correspond visually to one another. This method is extremely imprecise and difficult to achieve considering that the images are unstable and often rotate and/or move around on the display due to the absence of stabilization, and because many of the anatomical landmarks found in the IVUS pullback film of the pre-treatment segment may be disturbed or changed as a result of the therapy performed on the vessel. Furthermore, the orientation and appearance of the vessel is likely to change as a result of different orientations and relative positions of the IVUS catheter in relation to the vessel due to its removal and reinsertion after therapy is completed. The matching that is performed is manual and relies primarily on manual visual identification, which can be extremely time consuming and inaccurate.

Summary Of The Invention

The present invention solves the problems associated with IVUS imaging systems currently on the market and with the prior art by providing physicians with accurate IVUS images and image sequences of the morphology being assessed, thereby enabling more accurate diagnosis and evaluation.

The present invention processes IVUS image and signal information to remove distortions and inaccuracies caused by various types of motion in both the catheter and the bodily lumen. This results in both enhanced quality and utility of the IVUS images. An advantage provided by the present invention is that individual IVUS images are stabilized with respect to prior image(s), thereby removing negative effects on any later processing of multiple images. If the movements in each image are of the transverse type, then it is possible for the motion to be completely compensated for in each acquired image.

The present invention also allows volume reconstruction algorithms to accurately reproduce the morphology since movement of the bodily lumen is stabilized. The present invention is applicable to and useful in any type of system where there is a need to stabilize images (IVUS or other) because a probe (e.g., ultrasonic or other) moving through a lumen experiences relative motion (i.e., of the probe and/or of the lumen).

The present invention provides for detection of an ultrasonic signal emitted by ultrasonic apparatus in a bodily lumen, conversion of the received analog signal into Polar coordinates (A(r, θ)), stabilization in the Polar field, conversion of the stabilized Polar coordinates into Cartesian coordinates (A(X, Y)), stabilization in the Cartesian field, and then transfer of the stabilized image as Cartesian coordinates to a display. Stabilized images, either in Polar or Cartesian coordinates, may be further processed prior to display or they might not be displayed. Conversion into Cartesian coordinates and/or stabilization in the Cartesian field may be done at any point, either before or after stabilization in the Polar field. Additionally, either Polar or Cartesian stabilization may be omitted, depending on the detected shift in the image and/or other factors. Furthermore, additional forms of stabilization may be included or omitted depending on the detected shift and/or other factors.

For example, stabilization of rigid motion may be introduced to compensate for rotational motion (angular) or global vasomotion (expansion or contraction in the r direction) in the Polar field and/or for Cartesian displacement (X and/or Y direction) in the Cartesian field.

Transverse rigid motion between the representations of successive images is called a "shift," i.e., a uniform motion of all morphological features in the plane of the image. To stabilize IVUS images, the first step that is performed is "shift evaluation and detection." This is where the shift (if any) between each pair of successive images is evaluated and detected. The system may utilize a processor to perform an operation on a pair of successive IVUS images to determine whether there has been a shift between such images. The processor may utilize a single algorithm or may select from a number of algorithms to be used in making this determination.

The system utilizes the algorithm(s) to simulate a shift in an image and then compares this shifted image to its predecessor image. The comparisons between images are known as closeness operations, which may also be known in the prior art as matching. The system performs a single closeness operation for each shift. The results of the series of closeness operations are evaluated to determine the location (direction and magnitude) of the shifted image that bears the closest resemblance to the predecessor unshifted image. An image can of course be compared in the same manner to its successor image. After the actual shift is determined, the current image becomes the predecessor image, the next image becomes the current image and the above operation is repeated.

Using shift evaluation and detection, the system determines the type of transverse shift, e.g., rotational, expansion, contraction, displacement (Cartesian), etc., along with the direction and magnitude of the shift. The next step is "shift implementation." This is where the system performs an operation or a series of operations on successive IVUS images to stabilize each of the images with respect to its adjacent predecessor image. This stabilization utilizes one or multiple "reverse shifts" which are aimed at canceling the detected shift. The system may include an algorithm or may select from a number of algorithms to be used to implement each "reverse shift." The logic which decides what reverse shift will actually be implemented on an image, prior to its feeding to further processing or display, is referred to as "shift logic." Once the IVUS images are stabilized for the desired types of detected motion, the system may then transfer the Cartesian (or Polar) image information for further processing and finally for display, where the results of stabilization may be viewed, for example, by a physician. Alternatively, stabilization can be invisible to the user in the sense that stabilization can be used prior to some other processing steps, after which the resulting images are projected to the display in their original non-stabilized posture or orientation.

It is possible that the transverse motion between images will not be rigid but rather of a local nature, i.e., different portions of the image will exhibit motion in different directions and magnitudes. In that case the stabilization methods described above or other types of methods can be implemented on a local basis to compensate for such motion.

The present invention provides for detection of the cardiac periodicity by using information derived only from IVUS images, without the need for an external signal such as the ECG. This process involves closeness operations which are also partly used in the stabilization process. One important function of detecting periodicity (i.e., cardiac gating), when the catheter is stationary or when performing controlled IVUS pullback, is that it allows the selection of images belonging to the same phase in successive cardiac cycles. Selecting images based on cardiac gating will allow stabilization of all types of periodic motion (including transverse, Z-axis and angulations) in the sense that images are selected from the same phase in successive heartbeats. These IVUS images, for example, can be displayed and any gaps created between them may be compensated for by filling in and displaying interpolated images. The IVUS images selected by this operation can also be sent onward for further processing.
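
One hedged sketch of image-based cardiac gating along these lines is shown below: a closeness value (here a normalized correlation coefficient) is computed between each frame and its predecessor, the dominant period of that one-dimensional signal is estimated by autocorrelation, and frames belonging to the same phase are selected. The helper names, the correlation measure and the lag limits are illustrative assumptions, not the patent's exact procedure.

```python
import numpy as np

def frame_closeness(frames):
    """Normalized correlation coefficient between each frame and its predecessor."""
    vals = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        a = prev.astype(float).ravel() - prev.mean()
        b = curr.astype(float).ravel() - curr.mean()
        vals.append(float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)))
    return np.array(vals)

def estimate_period(closeness, min_lag=5, max_lag=60):
    """Dominant lag (in frames) of the closeness signal, found via autocorrelation."""
    sig = closeness - closeness.mean()
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    max_lag = min(max_lag, len(ac) - 1)
    return int(np.argmax(ac[min_lag:max_lag]) + min_lag)

def gate_frames(frames, phase=0):
    """Indices of frames belonging to (approximately) the same cardiac phase."""
    period = estimate_period(frame_closeness(frames))
    return list(range(phase, len(frames), period))
```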

The closeness operations used for periodicity detection can also be utilized for monitoring image quality and can indicate artifacts associated with malfunction of the imaging and processing apparatus.

Operations used for shift evaluation can automatically indicate vasomotion. This can serve the stabilization process, as vasomotion causes successive images to differ because of the change in the vessel's caliber. If images are stabilized for vasomotion, then this change is compensated for. Alternatively, the information regarding the change in caliber may be displayed since it might have physiological significance. Monitoring of vasomotion is accomplished by applying closeness operations to successive images using their Polar representations, i.e., A(r, θ). These operations can be applied between whole images or between corresponding individual Polar vectors (from successive images), depending on the type of information desired. Since global vasomotion is expressed as a uniform change in the lumen's caliber, it can be assessed by a closeness operation which takes into account the whole Polar image. In general, any operation suitable for global stabilization in the Polar representation can be used to assess global vasomotion.

Under certain conditions during IVUS imaging there may be non-uniform vasomotion, i.e., movement only in certain sections of the IVUS image corresponding to specific locations in the bodily lumen. This may occur, for example, where an artery has a buildup of plaque in a certain location, thereby allowing expansion or contraction of the artery only in areas free of the plaque buildup. When such movement is detected the system is able to divide the ultrasound signals representing cross sections of the bodily lumen into multiple segments, which are then each processed individually with respect to a corresponding segment in the adjacent image using certain algorithm(s). The resulting IVUS images may then be displayed. This form of stabilization may be used individually or in conjunction with the previously discussed stabilization techniques. Alternatively, the information regarding the local change in vessel caliber can be displayed since it might have physiological significance.

The temporal behavior of pixels and their textural attributes could serve for: enhancement of the display; and automatic segmentation (lumen extraction). If monitored in a stabilized image environment, the performance of the display enhancement and segmentation processes may be improved.

According to the present invention, the temporal behavior of IVUS images may be automatically monitored. The information extracted by such monitoring can be used to improve the accuracy of IVUS image interpretation. By filtering and suppressing the fast changing features, such as the matter, e.g., blood, flowing through the vessel and the morphology exterior to the vessel, on the basis of their temporal behavior, human perception of the vessel on both static images and dynamic images, e.g., images played in cine form, may be enhanced.

Automatic segmentation, i.e., identification of the vessel and the matter, e.g., blood, flowing through the vessel, may be performed by using an algorithm which automatically identifies the matter, e.g., blood, based on the temporal behavior of textural attributes formed by its comprising pixels. The temporal behavior that is extracted from the images can be used for several purposes. For example, temporal filtering may be performed for image enhancement, and detection of the changes in pixel texture may be used for automatic identification of the lumen and its circumference.
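
As a rough, assumption-laden sketch of how pixel temporal behavior could drive both enhancement and lumen extraction in a stabilized image stack: blood speckle and exterior tissue alternate rapidly from frame to frame, so a temporal median suppresses them, while a high temporal standard deviation flags flow-passage pixels. The threshold and the statistics chosen here are illustrative only.

```python
import numpy as np

def temporal_filter(stack):
    """Suppress rapidly alternating features via the temporal median of a stabilized image stack."""
    return np.median(stack.astype(float), axis=0)

def lumen_mask(stack, rel_threshold=1.0):
    """Flag pixels whose temporal standard deviation is high (speckle from flowing blood)."""
    std_map = stack.astype(float).std(axis=0)
    return std_map > rel_threshold * std_map.mean()
```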

In all IVUS images, it is best for the catheter itself (and imaging apparatus) to be eliminated from the image prior to performing stabilization or monitoring. Failure to eliminate the catheter might impair stabilization techniques and monitoring. Elimination of the catheter may be performed automatically since its dimensions are known.

The present invention also provides for automatic identification (i.e., matching or registration) of corresponding frames of two different IVUS pullback films of the same segment of a vessel, e.g., pre-treatment and post-treatment. To compare a first IVUS pullback film, i.e., a first IVUS imaging sequence, with a second IVUS pullback film, i.e., a second IVUS imaging sequence, of the same segment of a bodily lumen, for example, captured on video, film or in digitized form, the imaging sequences must be synchronized. Matching, which will achieve this synchronization, involves performing closeness operations between groups of consecutive images belonging to the two sets of IVUS imaging sequences.

Out of one imaging sequence a group of consecutive images, termed the reference group, is selected. This group should be selected from a portion of the vessel displayed in both imaging sequences, and it should be a portion on which therapy will not be performed since the morphology of the vessel is likely to change due to therapy. Another condition for this matching process is that the two imaging sequences are acquired at a known, constant and preferably the same pullback rate.

Closeness operations are performed between the images of the reference group and the images of a second group, which has the same number of successive images, extracted from the second imaging sequence. This second group of images is then shifted by a single frame with respect to the reference group and the closeness operations are repeated. This may be repeated for a predetermined number of times, and the closeness results of each frame shift are compared to determine maximal closeness. Maximal closeness will determine the frame displacement between the images of the two imaging sequences. This displacement can be reversed in the first or second film so that corresponding images may be automatically identified and/or viewed simultaneously.
Thus, corresponding images may be viewed, for example, to determine the effectiveness of any therapy performed or a change in the morphology over time.
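
The matching step can be sketched as follows, under the stated assumption of equal, constant pullback rates: a reference group of consecutive frames is slid over the second imaging sequence one frame at a time, a summed closeness score is computed per displacement, and the displacement with maximal closeness registers the two films. The group size, the correlation score and the search range are illustrative assumptions.

```python
import numpy as np

def group_closeness(group_a, group_b):
    """Summed normalized correlation between corresponding frames of two groups."""
    score = 0.0
    for a, b in zip(group_a, group_b):
        fa = a.astype(float).ravel() - a.mean()
        fb = b.astype(float).ravel() - b.mean()
        score += fa @ fb / (np.linalg.norm(fa) * np.linalg.norm(fb) + 1e-12)
    return score

def match_sequences(reference_group, second_sequence, max_shift=100):
    """Frame displacement of the second sequence that best matches the reference group."""
    n = len(reference_group)
    best_shift, best_score = 0, -np.inf
    for shift in range(0, min(max_shift, len(second_sequence) - n) + 1):
        score = group_closeness(reference_group, second_sequence[shift:shift + n])
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift
```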

Additionally, the various types of stabilization discussed above may be implemented within or between the images in the two sequences, either before, during or after this matching operation. Thus, the two films can be displayed not only in a synchronized fashion, but also in the same orientation and posture with respect to one another.
Brief Description Of The Drawings

Figures 1(a) and (b) show a two-dimensional array or matrix of an image arranged in digitized vectors in Polar and Cartesian coordinates, respectively.

Figure 2 illustrates the results of a shift evaluation between two successive images in Cartesian coordinates.

Figure 3 shows images illustrating the occurrence of drift phenomena in Polar and Cartesian coordinates.
Figure 4 illustrates the effect of performing stabilization operations (rotational and Cartesian shifts) on an image.

Figure 5 illustrates global contraction or dilation of a bodily lumen expressed in the Polar representation of the image and in the Cartesian representation of the image.

Figure 6 shows an image divided into four sections for processing according to the present invention.

Figure 7 shows a vessel, in both Cartesian and Polar coordinates, in which local vasomotion has been detected.

Figure 8 illustrates the results of local vasomotion monitoring in a real coronary vessel in graphical form.
Figure 9 shows an ECG and cross-correlation coefficient plotted graphically in synchronous fashion.

Figure 10 shows a table of a group of cross-correlation coefficient values (middle row) belonging to successive images (numbers 1 through 10 shown in the top row) and the results of internal cross-correlations (bottom row).

Figure 11 shows a plot of a cross-correlation coefficient indicating an artifact in IVUS images.

Figure 12 shows an IVUS image divided into three basic parts: the lumen through which fluid flows; the actual vessel; and the surrounding tissue.

Figure 13 illustrates the results of temporal filtering.

Figure 14 shows an image of the results of the algorithm for automatic extraction of the lumen.
Figure 15 illustrates the time sequence of a first film (left column), the reference segment from the second film (middle column) and the images from the first film which correspond to (or match) the images of the reference segment (right column).

Detailed Description

In intravascular ultrasound (IVUS) imaging systems the ultrasonic signals are emitted and received by the ultrasonic apparatus, for example, a transducer or transducer array, processed, and eventually arranged as vectors comprising digitized data. Each vector represents the ultrasonic response of a different angular sector of the bodily lumen. The number of data elements in each vector (axial sampling resolution) and the number of vectors used to scan the complete cross-section (lateral sampling resolution) of the bodily lumen depend on the specific IVUS system used.

The digitized vectors are initially packed into a two-dimensional array or matrix, which is illustrated in Figure 1(a). Generally, this matrix has what are known as Polar coordinates, i.e., coordinates A(r, θ). The X-axis of the matrix shown in Figure 1(a) corresponds to the r coordinate while the Y-axis of the matrix corresponds to the θ coordinate. Each value of the matrix is generally a gray value, for example, ranging from 0-255 if it is 8 bit, representing the strength of the ultrasonic signal at that corresponding location in the bodily lumen. This Polar matrix may then be converted into a Cartesian matrix as shown in Figure 1(b), having an X-axis and Y-axis which correspond to the Cartesian representation of the vessel's cross-section. This image may then be further processed and transferred to a display. The initial array and the display may each utilize either Polar or Cartesian coordinates. The values of the matrix may be other than gray values, for example, they may be color values or other values, and may be less than or more than 8 bits.

During an IVUS imaging pullback procedure the bodily lumen, hereinafter referred to as a vessel, and/or the imaging catheter may experience several modes of relative motion. These types of motion include: (1) rotation in the plane of the image, i.e., a shift in the θ-coordinate of the Polar image; (2) Cartesian displacement, i.e., a shift in the X and/or Y coordinate of the Cartesian image; (3) global vasomotion, characterized by a radial contraction and expansion of the entire vessel, i.e., a uniform shift in the r-coordinate of the Polar image; (4) local vasomotion, characterized by a radial contraction and expansion of different parts of the vessel with different magnitudes and directions, i.e., local shifts in the r-coordinate of the Polar image; (5) local motion, characterized by different tissue motions which vary depending on the exact location within the image; and (6) through-plane motion, i.e., movements which are perpendicular or nearly perpendicular (angulation) to the plane of the image.

Stabilization of successive raw images is applicable to the first five types of motion described above because the motion is confined to the transverse plane. These types of motion can be compensated for, and stabilization achieved, by transforming each current image so that its resemblance to its predecessor image is maximized. The first three types of motion can be stabilized using closeness operations which compare whole or large parts of the images to one another. This is because the motion is global or rigid in its nature. The fourth and fifth types of motion are stabilized by applying closeness operations on a localized basis because different parts of the image exhibit different motion. The sixth type of motion can be only partly stabilized by applying closeness operations on a localized basis. This is because the motion is not confined to the transverse plane. This type of motion can be stabilized using cardiovascular periodicity detection.

The next sections shall describe methods for global stabilization, followed by a description of methods for local stabilization. Stabilization using cardiovascular periodicity detection shall be described in the sections discussing periodicity detection.

To achieve global stabilization, shift evaluation is performed using some type of closeness operation. The closeness operation measures the similarity between two images. Shift evaluation is accomplished by transforming a first image and measuring its closeness, i.e., similarity, to its predecessor second image. The transformation may be accomplished, for example, by shifting the entire first image along an axis or a combination of axes (X and/or Y in Cartesian coordinates or r and/or θ in Polar coordinates) by a single pixel (or more). Once the transformation, i.e., shift, is completed, the transformed first image is compared to the predecessor second image using a predefined function. This transformation is repeated, each time by shifting the first image an additional pixel (or more) along the same and/or other axis and comparing the transformed first image to the predecessor second image using the predefined function. After all of the shifts are evaluated, the location of the global extremum of the comparisons using the predefined function will indicate the direction and magnitude of the movement between the first image and its predecessor second image.
For example, Figure 2 illustrates the results of a shift evaluation between two successive images in Cartesian coordinates. Image A is a predecessor image showing a pattern, e.g., a cross-section of a vessel, whose center is situated in the bottom right quadrant of the matrix. Image B is a current image showing the same pattern but moved in an upward and left direction and situated in the upper left quadrant of the matrix. The magnitude and direction of the movement of the vessel's center is indicated by the arrow. The bottom matrix is the C(shiftX, shiftY) matrix, which is the resulting matrix after performing shift evaluations using some type of closeness operation.

There are many different algorithms or mathematical functions that can be used to perform the closeness operations. One of these is cross-correlation, possibly using the Fourier transform. This is where the current and predecessor images, each consisting of, for example, 256 x 256 pixels, are each Fourier transformed using the FFT algorithm. The conjugate of the FFT of the current image is multiplied with the FFT of the predecessor image. The result is inversely Fourier transformed using the IFFT algorithm. The formula for cross-correlation using the Fourier transform can be shown as follows:

C = real(ifft2(fft2(A) * conj(fft2(B))))

where:
A = predecessor image matrix (e.g., 256 x 256);
B = current image matrix (e.g., 256 x 256);
fft2 = two dimensional FFT;
ifft2 = two dimensional inverse FFT;
conj = conjugate;
real = the real part of the complex expression;
* = multiplication element by element; and
C = cross-correlation matrix.

Evaluating closeness using cross-correlation implemented by the Fourier transform is actually an approximation. This is because the mathematical formula for the Fourier transform relates to infinite or periodic functions or matrices, while in real life the matrices (or images) are of a finite size and not necessarily periodic. When implementing cross-correlation using the FFT, the method assumes periodicity in both axes.

As a result, this formula is a good approximation, and it reflects the actual situation in the θ-axis of the Polar representation of the image; however, it does not reflect the actual situation in the r-axis of the Polar representation or in the X- or Y-axis of the Cartesian representation of the image.

There are a number of advantages to cross-correlation utilizing the FFT. First, all values of the cross-correlation matrix C(shiftX, shiftY) are calculated by this basic operation. Furthermore, there is dedicated hardware for the efficient implementation of the FFT operation, i.e., Fourier transform chips or DSP boards.
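
For illustration, the FFT-based cross-correlation defined above can be written in a few lines of NumPy (standing in for dedicated FFT hardware). The fftshift, so that zero shift maps to the matrix center, and the sign convention of the returned shift are implementation choices of this sketch rather than part of the formula.

```python
import numpy as np

def cross_correlate_fft(prev_img, curr_img):
    """C = real(ifft2(fft2(A) * conj(fft2(B)))), shifted so zero displacement sits at the center."""
    A = np.fft.fft2(prev_img.astype(float))
    B = np.fft.fft2(curr_img.astype(float))
    C = np.real(np.fft.ifft2(A * np.conj(B)))
    return np.fft.fftshift(C)

def detect_shift(prev_img, curr_img):
    """Location of the global maximum of C encodes the inter-image shift as (shiftY, shiftX)."""
    C = cross_correlate_fft(prev_img, curr_img)
    peak = np.unravel_index(np.argmax(C), C.shape)
    return np.array(peak) - np.array(C.shape) // 2
```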

Another algorithm that can be used to perform closeness operations is direct cross-correlation, either normalized or not. This is achieved by multiplying each pixel in the current shifted image by its corresponding pixel in the predecessor image, summing up all of the results, and normalizing in the case of normalized cross-correlation. Each shift results in a sum, and the actual shift will be indicated by the largest sum out of the evaluated shifts. The formula for cross-correlation can be shown as follows:

C(shiftX, shiftY) = Σ(x,y) B(x - shiftX, y - shiftY) * A(x, y)

The formula for normalized cross-correlation is:

C(shiftX, shiftY) = [Σ(x,y) B(x - shiftX, y - shiftY) * A(x, y)] / sqrt{ [Σ(x,y) B(x - shiftX, y - shiftY) * B(x - shiftX, y - shiftY)] * [Σ(x,y) A(x, y) * A(x, y)] }

where:
A = predecessor image matrix;
B = current image matrix;
* = multiplication of pixel by corresponding pixel;
Σ(x,y) = sum over all pixels in the matrix;
C = matrix holding results for all performed shifts.

Using this direct method of cross-correlation, C(shiftX, shiftY) can be evaluated for all possible values of shiftX and shiftY. For example, if the original matrices, A and B, have 256 x 256 pixels each, then shiftX and shiftY values, each ranging from -127 to +128, would have to be evaluated, making a total of 256 x 256 = 65,536 shift evaluations in order for C(shiftX, shiftY) to be calculated for all possible values of shiftX and shiftY. Upon completion of these evaluations the global maximum of the matrix is determined.

Direct cross-correlation can be implemented more efficiently by lowering the number of required arithmetic operations. In order to detect the actual shift between images, evaluation of every possible shiftX and shiftY is not necessary. It is sufficient to find the location of the largest C(shiftX, shiftY) of all possible shiftX and shiftY.

A third algorithm that can be used to perform closeness operations is the sum of absolute differences (SAD). This is achieved by subtracting each pixel in one image from its corresponding pixel in the other image, taking the absolute values and summing up all of the results. Each shift will result in a sum, and the actual shift will be indicated by the lowest sum. The formula for the sum of absolute differences (SAD) can be shown as follows:

SAD = absolute(A - B)

This formula can also be shown as follows:

C(shiftX, shiftY) = Σ(x,y) abs(B(x - shiftX, y - shiftY) - A(x, y))

where:
A = predecessor image matrix;
B = current image matrix;
abs = absolute value;
- = subtraction element by element; and
Σ(x,y) = sum of all differences.
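
Both direct closeness functions defined above can be evaluated with a brute-force sweep over candidate shifts, as sketched below. Here np.roll stands in for the shifted image B(x - shiftX, y - shiftY), and the +/- max_shift search window is an arbitrary illustrative choice rather than part of the formulas.

```python
import numpy as np

def closeness(prev_img, curr_img, sx, sy, method="ncc"):
    """Normalized cross-correlation ("ncc", higher is closer) or SAD (lower is closer) for one shift."""
    shifted = np.roll(np.roll(curr_img.astype(float), sy, axis=0), sx, axis=1)
    a = prev_img.astype(float)
    if method == "ncc":
        return np.sum(shifted * a) / np.sqrt(np.sum(shifted * shifted) * np.sum(a * a))
    return np.sum(np.abs(shifted - a))

def best_shift(prev_img, curr_img, max_shift=5, method="ncc"):
    """Exhaustively evaluate C(shiftX, shiftY) in a small window and return the extremum location."""
    best, best_val = (0, 0), -np.inf if method == "ncc" else np.inf
    for sy in range(-max_shift, max_shift + 1):
        for sx in range(-max_shift, max_shift + 1):
            val = closeness(prev_img, curr_img, sx, sy, method)
            better = val > best_val if method == "ncc" else val < best_val
            if better:
                best, best_val = (sx, sy), val
    return best    # (shiftX, shiftY) of maximal closeness
```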

While the accuracy of each of these algorithms/formulas may vary slightly depending on the specific type of motion encountered and system settings, it is to be understood that no single formula can, a priori, be classified as providing the best or most accurate results. Additionally, there are numerous variations on the formulas described above, and other algorithms/formulas may be utilized for performing shift evaluation and may be substituted for the algorithms/formulas described above. These algorithms/formulas also include those operations known in the prior art for use as matching operations.

Referring again to Figure 2, if the closeness operation performed is cross-correlation, then C(shiftX, shiftY) is called the cross-correlation matrix and its global maximum (indicated by the black dot in the upper left quadrant) will be located at a distance and direction from the center of the matrix (arrow in matrix C) which is the same as that of the center of the vessel in Image B relative to the center of the vessel in Image A (arrow in Image B).

If the closeness operation performed is SAD, then the black dot would indicate the global minimum, which will be located at a distance and direction from the center of the matrix (arrow in matrix C) which is the same as that of the center of the vessel in Image B relative to the center of the vessel in Image A (arrow in Image B).

Rotational motion is expressed as a shift of the current Polar image along the θ-coordinate relative to its predecessor. The rotational shift in a current image is detected by maximizing the closeness between the current Polar image and its predecessor. Maximum closeness will be obtained when the current image is reversibly shifted by the exact magnitude of the actual shift. In, for example, a 256 x 256 pixel image, the value of the difference (in pixels) between 128 and the θ-coordinate of the maximum in the cross-correlation image (minimum in the SAD image) will indicate the direction (positive or negative) and the magnitude of the rotation.
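
Rotational stabilization in the Polar representation can be sketched as a one-dimensional search over cyclic θ shifts, as below. The a_polar[theta, r] indexing and the use of a plain product sum as the closeness measure are assumptions of this sketch.

```python
import numpy as np

def corrective_theta_roll(prev_polar, curr_polar):
    """Cyclic theta roll that maximizes closeness of the current Polar image to its predecessor."""
    n_theta = prev_polar.shape[0]
    a = prev_polar.astype(float)
    scores = [np.sum(np.roll(curr_polar.astype(float), s, axis=0) * a) for s in range(n_theta)]
    s = int(np.argmax(scores))
    return s - n_theta if s > n_theta // 2 else s    # signed roll with the smallest magnitude

def stabilize_rotation(prev_polar, curr_polar):
    """Reverse the detected angular shift by rolling the current image along the theta axis."""
    return np.roll(curr_polar, corrective_theta_roll(prev_polar, curr_polar), axis=0)
```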

Global vasomotion is characterized by expansion and contraction of the entire cross section of the vessel. In the Polar image this type of motion is expressed as movement of the vessel inwards and outwards along the r-axis. Vasomotion can be compensated for by performing the opposite vasomotion action on a current Polar image in relation to its predecessor Polar image, using one of the formulas discussed above or some other formula. In contrast to angular stabilization, vasomotion stabilization does not change the orientation of the image but actually transforms the image by stretching or compressing it.

Cartesian displacement is expressed as a shift along the X-axis and/or Y-axis of the Cartesian image relative to its predecessor. This type of motion is eliminated by shifting the Cartesian image in a direction opposite to the actual shift. Thus, stabilization of Cartesian displacement, in the Cartesian representation, can be achieved by essentially the same arithmetic operations used for rotational and vasomotion stabilization in the Polar representation.

The number of shift evaluations necessary to locate the global extremum (maximum or minimum, depending on the closeness function) of C(shiftX, shiftY) may be reduced using various computational techniques. One technique, for example, takes advantage of the fact that motion between successive IVUS images is, in general, relatively small in relation to the full dimensions of the Polar and/or Cartesian matrices. This means that C(shiftX, shiftY) can be evaluated only in a relatively small portion around the center of the matrix, i.e., around shiftX = 0, shiftY = 0. The extremum of that portion is assured to be the global extremum of matrix C(shiftX, shiftY), including for larger values of shiftX and shiftY. The size of the minimal portion which will assure that the extremum detected within it is indeed a global extremum varies depending on the system settings. The number of necessary evaluation operations may be further reduced by relying on the smoothness and monotonic behavior expected of the C matrix (especially in the neighborhood of the global extremum). Therefore, if the value in the C(shiftX, shiftY) matrix at a certain location is a local extremum (e.g., in a 5 x 5 pixel neighborhood), then it is probably the global extremum of all of matrix C(shiftX, shiftY).

Implementing this reduction of the number of necessary evaluations can be accomplished by first searching from the center of the matrix (shiftX = 0, shiftY = 0) and checking a small neighborhood, e.g., 5 x 5 pixels around the center. If the local extremum is found inside this neighborhood then it is probably the global extremum of the whole matrix C(shiftX, shiftY) and the search may be terminated. If, however, the local extremum is found on the edges of this neighborhood, e.g., shiftX = -2, shiftX = 2, shiftY = -2 or shiftY = 2, then the search is repeated around this pixel until a C(shiftX, shiftY) value is found that is bigger (smaller) than all of its close neighbors. Because in a large number of images there is no inter-image motion, the number of evaluations needed to locate the global extremum in those cases will be approximately 5 x 5 = 25, instead of the original 65,536 evaluations.

The number of necessary evaluation operations may also be reduced by sampling the images. For example, if 256 x 256 sized images are sampled at every second pixel then they are reduced to 128 x 128 sized matrices. In this case, direct cross-correlation or SAD between such matrices involves 128 x 128 operations instead of 256 x 256 operations each time the images are shifted one in relation to the other. Sampling, as a reduction method for shift evaluation operations, can be interleaved with the other reduction methods described above.
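
The reduced search described above, starting at zero shift and re-centering a small window until an interior extremum is found, might look like the sketch below. It assumes a maximum-type closeness function with the signature used in the earlier sketches (for SAD the comparisons would be inverted), and the window size and step limit are illustrative.

```python
import numpy as np

def hill_climb_shift(prev_img, curr_img, closeness_fn, half_window=2, max_steps=20):
    """Locate the extremum of C(shiftX, shiftY) by searching outward from shiftX = 0, shiftY = 0."""
    cx, cy = 0, 0
    cache = {}
    def value(sx, sy):
        if (sx, sy) not in cache:
            cache[(sx, sy)] = closeness_fn(prev_img, curr_img, sx, sy)
        return cache[(sx, sy)]
    for _ in range(max_steps):
        candidates = [(value(cx + dx, cy + dy), cx + dx, cy + dy)
                      for dy in range(-half_window, half_window + 1)
                      for dx in range(-half_window, half_window + 1)]
        _, bx, by = max(candidates)
        if (bx, by) == (cx, cy):          # extremum inside the window: assume it is global
            return cx, cy
        cx, cy = bx, by                   # extremum on the window edge: re-center and continue
    return cx, cy

# Optional sampling: evaluating closeness on every second pixel cuts the per-shift cost roughly fourfold.
def downsample(img, step=2):
    return img[::step, ::step]
```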

Referring again to Figure 2, as a result of the closeness operation the indicated shiftX will have a positive value and shiftY a negative value. In order to stabilize Image B, i.e., compensate for the shifts in the X and Y directions, shift logic will reverse the shifts, i.e., change their sign but not their magnitude, and implement these shifts on the matrix corresponding to Image B. This will artificially reverse the shift in Image B and cause Image B to be unshifted with respect to Image A.

The actual values used in the closeness calculations need not necessarily be the original values of the matrix as supplied by the imaging system. For example, improved results may be achieved when the original values are raised to the power of 2, 3 or 4 or processed by some other method.
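
A minimal sketch of shift implementation and of the optional value preprocessing mentioned above: the evaluated shift is sign-reversed and applied to the current image, and gray values may be raised to a power before evaluation. The exponent and the use of a cyclic roll (rather than padding) are assumptions of this sketch.

```python
import numpy as np

def preprocess(img, power=2):
    """Optionally emphasize strong echoes before closeness evaluation."""
    return img.astype(float) ** power

def implement_reverse_shift(curr_img, shift_x, shift_y):
    """Apply the reverse shift (same magnitude, opposite sign) so the image is unshifted w.r.t. its predecessor."""
    return np.roll(np.roll(curr_img, -shift_y, axis=0), -shift_x, axis=1)
```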

The imaging catheter and the enclosing sheath appear as constant artifacts in all IVUS images. This feature obscures closeness operations performed between images since it is not part of the morphology of the vessel. It is, therefore, necessary to eliminate the catheter and associated objects from each image prior to performing closeness operations, i.e., their pixels are assigned a value of zero. The elimination of these objects from the image may be performed automatically since the catheter's dimensions are known.
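
Since the catheter and sheath dimensions are known, their elimination can be automated along the lines of the sketch below, which simply zeroes all Polar samples within a given radius. The radius-in-pixels parameter and the a_polar[theta, r] layout are assumptions.

```python
import numpy as np

def remove_catheter(a_polar, catheter_radius_px):
    """Zero the Polar samples occupied by the catheter and sheath before closeness operations."""
    cleaned = a_polar.copy()
    cleaned[:, :catheter_radius_px] = 0     # rows are theta vectors, columns are r samples
    return cleaned
```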

Shift evaluation and implementation may be modular. Thus, shift evaluation and implementation may be limited to either Polar coordinates or Cartesian coordinates individually, or shift evaluation and implementation may be applied sequentially for Polar and Cartesian coordinates. Presently, because imaging in IVUS systems is generally organized by first utilizing Polar coordinates and then converting into Cartesian coordinates, it is most convenient to perform shift evaluation and implementation in the same sequence. However, the sequence may be modified or changed without any negative effects or results.

The shift evaluation process can be performed along one or two axes. In general, two dimensional shift evaluation is preferred even when motion is directed along one axis. Shift implementation may be applied to both axes, one axis or neither axis.

There is not a necessary identity between the area in the image used for shift evaluation and the area on which shift implementation is performed. For example, shift evaluation may be performed using a relatively small area in the image, while shift implementation will shift the whole image according to the shift indicated by this area.

A trivial shift logic is one in which the shift implemented on each image (thereby forming a stabilized image) has a magnitude equal, and in opposite direction, to the evaluated shift. However, such logic can result in a process defined as drift. Drift is a process in which implemented shifts accumulate and produce a growing shift whose dimensions are significant in relation to the entire image or display. Drift may be a result of inaccurate shift evaluation or of non-transverse inter-image motion at some part of the cardiovascular cycle. When Cartesian stabilization is implemented, drift can cause, for example, the shifting of a relatively large part of the image out of the display. When rotational stabilization is implemented, drift can cause the increasing rotation of the image in a certain direction.

Figure 3 is an image illustrating the occurrence of drift in Polar and Cartesian coordinates. The left image is the original display of the image while the right image is the same image after Polar and Cartesian stabilization has been performed. Note how the right image is rotated counter-clockwise by a large angle and shifted downward in relation to the left image. In this case, rotational and Cartesian shift implementation do not compensate for actual shifts in the image, but rather arise from inaccurate shift evaluation.

The shift logic must be able to deal with this drift so that there will be a minimal implementation of mistakenly evaluated shifts. One method of preventing, or at least limiting, drift is to set a limit on the magnitude of allowable shifts, as sketched below. This will minimize the drift, but at the cost of not compensating for some actual shift.
Additional methods can be used to prevent or minimize drift. These may possibly be interleaved with the cardiovascular periodicity detection methods discussed later.
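
A minimal sketch of such a limit, assuming a hypothetical clamp of 5 pixels (the helper name and value are illustrative only):

```python
def limited_shift(evaluated_shift, max_shift=5):
    # Clamp an evaluated shift so that no single implemented shift,
    # and hence the accumulated drift, can grow without bound.
    return max(-max_shift, min(max_shift, evaluated_shift))

# The implemented shift reverses the sign of the (clamped) evaluated shift.
implemented_x = -limited_shift(7)    # -> -5 (capped)
implemented_y = -limited_shift(-3)   # -> +3
```
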

The images shown in Figure 4 illustrate the effect of performing stabilization operations (rotational and Cartesian shifts) on an image. The left image is an IVUS image from a coronary artery as it would look on a large portion of a regular display (with catheter deleted) while the right image shows how the left image would be displayed after stabilization operations are implemented.

Taking a close look at the left and right images in Figure 4, certain differences can be observed. First, the right image is slightly rotated in a clockwise direction (i.e., by a few degrees) in relation to the left image. This is the result of rotational stabilization. Next, the right image is translated in a general left direction in relation to the left image. This can be detected by noting the distance of the lumen (cavity) from the edges of the picture in each image. This is a result of Cartesian shift stabilization operations.

The advantages of stabilization of the displayed image cannot be appreciated by viewing single images as shown in Figure 4. However, viewing a film of such images would readily illustrate the advantages. In a display which does not include stabilization, the location of the catheter would always be situated in the center of the display and the morphological features would move around and rotate on the display.
In contrast, in a stabilized display, the location of the catheter would move around while the morphological features would remain basically stationary. Stabilization does not necessarily have to be exhibited on an actual display. It can be invisible to the user in the sense that stabilization will enhance subsequent processing steps, but the actual display will exhibit the resultant processed images in their original (non-stabilized) posture and orientation.

Figure 5 illustrates global contraction or dilation of a vessel, expressed in the Polar representation of the image as a movement of the features along the r-coordinates, i.e., movement along the Polar vectors. Figure 5 also shows the same global contraction or dilation expressed in the Cartesian representation of the image. Figure 5(a) shows the baseline appearance of the cross section of a vessel in both the Polar and Cartesian representations. Figure 5(b) shows a contraction of the vessel relative to baseline. Figure 5(c) shows a uniform dilation of the vessel relative to baseline.

Since global vasomotion is expressed as a uniform change in the vessel's caliber, any operation suitable for stabilization in the Polar representation can be used to assess global vasomotion, e.g., it can be assessed by a closeness operation utilizing the entire Polar image.

After two dimensional shift evaluation is performed, as discussed above, the location of the maximum in matrix C(shiftX, shiftY) on the θ-axis is utilized for rotational stabilization. This leaves the location of the extremum on the r-axis, which can be used as an indication of global vasomotion. Thus, global vasomotion monitoring is a by-product of two dimensional shift evaluation in the Polar image.

Each pair of successive images produces a value indicative of the vasomotion. Both the magnitude and the sign of the resulting shift between images characterize the change in the vessel, i.e., vasomotion. Negative shifts indicate dilation, and positive shifts indicate contraction. The magnitude of the value indicates the magnitude of the vasomotion change.

Under certain circumstances motion or vasomotion may not be uniform/rigid although confined to the plane of the image, i.e., transverse. To determine the type of motion or vasomotion, the image may be divided into sections and global stabilization evaluation performed on each of these sections. By examining the indicated shifts of these sections relative to the corresponding sections in the predecessor image, a determination can be made as to the type of motion. For example, as shown in Figure 6, the image in Figure 6(a) can be divided into four sections as shown in Figure 6(b).
Shift evaluation can be performed separately on each of the four sections.
Comparison between the results of the shift evaluation for each of the four sections can possibly identify the type of actual motion. Thus, the type of stabilization applied can be varied depending on the type of motion detected.

Stabilization for local motion is achieved by performing closeness operations on a localized basis. Small portions of the predecessor image A ("template" regions) and small portions of the current image B ("search" regions) participate in the local stabilization process. Sometimes, it is best to perform local stabilization after global stabilization has been performed.

During local stabilization, template regions in the predecessor image (A) are shifted within search regions and compared, using closeness operations, to template-sized regions in the current image (B). Each pixel in the (newly) formed stabilized image (B') will be assigned a new value based on the results of the search and closeness evaluation performed.

Local stabilization is illustrated by the following example in which the template region is a 1 x 1 pixel region, i.e., a single pixel, the search region is a 3 x 3 pixel region and the closeness operation is SAD. In the following diagram, the pixel valued 3 in A and the pixel valued 9 in B are corresponding pixels. The 3 x 3 pixel neighborhood of the pixel valued 9 is also illustrated.

[Diagram: the pixel in A (the 1 x 1 "template" region), the pixels in B (the 3 x 3 "search" region) and the resulting B'.]

In this example, according to the conditions described above, the "template" pixel valued 3 is compared using SAD to all pixels found in the 3 x 3 search region around the pixel valued 9. The pixel valued 1 at the top left corner of the search region will achieve the minimal SAD value (|1-3| = 2) out of all the possibilities in the search region. As a result, in the newly formed stabilized image (B'), the pixel corresponding in location to the pixels valued 3 and 9 will be assigned the value of 1.
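
The following Python sketch reproduces this local stabilization in general form (the function name and the handling of image borders are assumptions made for illustration; a 1 x 1 template and a 3 x 3 search region correspond to search_radius=1):

```python
import numpy as np

def local_stabilize(a, b, search_radius=1):
    # For every pixel, search the (2*search_radius+1)^2 neighborhood in B
    # for the value with the minimal absolute difference (SAD for a 1 x 1
    # template) from the corresponding pixel in A, and place it in B'.
    rows, cols = a.shape
    b_out = b.copy()
    for y in range(rows):
        for x in range(cols):
            y0, y1 = max(0, y - search_radius), min(rows, y + search_radius + 1)
            x0, x1 = max(0, x - search_radius), min(cols, x + search_radius + 1)
            window = b[y0:y1, x0:x1]
            diff = np.abs(window.astype(int) - int(a[y, x]))
            iy, ix = np.unravel_index(np.argmin(diff), diff.shape)
            b_out[y, x] = window[iy, ix]
    return b_out
```

With the values of the example above, the pixel valued 1 in the search region of B has the minimal absolute difference from the template pixel valued 3 in A, so the corresponding pixel of B' receives the value 1.
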

In general, the dimensions of the template and search region can be varied along with the closeness operations used. The actual value which is assigned to the pixel of the newly formed stabilized image (B') need not necessarily be an actual pixel value from the current image B (as illustrated in the example) but some function of pixel values.
It is important to note that as a result of local stabilization, as opposed to the global/rigid methods, the "composition" of the image, i.e., the internal relationship between pixels and their distribution in the stabilized image, changes in relation to the original image. Local stabilization can be implemented on both the Polar and Cartesian representations of the image.

Figure 7 shows a vessel, in both Cartesian and Polar coordinates, in which local vasomotion has been detected. When local vasomotion is detected, it is an indication that some parts of the cross-section of the vessel are behaving differently than other parts of the cross-section.

Figure 7(a) shows a baseline figure of the vessel prior to local vasomotion. Figure 7(b) shows an example of local vasomotion. As indicated in both the Cartesian and Polar representations, four distinct parts of the vessel behave differently: two segments of the vessel do not change caliber, or do not move relative to their corresponding segments in the predecessor image; one segment contracts, or moves up; and one segment dilates, or moves down.

As can be observed, global vasomotion evaluation methods are not appropriate for evaluating local vasomotion because the vessel does not behave in a uniform manner.
If global vasomotion evaluation were applied, for example, to the example shown in Figure 7, it might detect overall zero vasomotion, i.e., the contraction and dilation would cancel each other.

Therefore, local vasomotion evaluation methods must be utilized. This may be achieved by separately evaluating vasomotion in each Polar vector, i.e., in each θ (or Y) vector. Closeness operations are applied using one dimensional shifts in corresponding Polar vectors. For example, if closeness is evaluated with cross-correlation, then the following operation illustrates how this is accomplished using one dimensional shifts.

C(shiftX, Y) = Σx B(x - shiftX, Y) * A(x, Y)

where:
A = predecessor image matrix;
B = current image matrix;
* = multiplication of pixel by corresponding pixel;
Σx = sum over the pixels of the Polar vector;
C = two dimensional matrix of correlation coefficients.

As can be seen, shifting is performed along one axis (X or r-axis) for each and every Polar vector (θ or Y vector). The values assigned in each vector for shift evaluation may not be the actual values of the images but, for example, each pixel in the vector can be assigned the average of its lateral neighbors, i.e., A(X, Y) will be assigned, for example, the average of A(X, Y-1), A(X, Y) and A(X, Y+1). The same goes for B(shiftX, Y). This can make the cross-correlation process more robust to noise.

A two dimensional matrix (C(shiftX, Y)) is formed. Each column in the matrix stores the results of closeness/similarity operations performed between corresponding Polar vectors from the current image and the predecessor image. This operation could also have been implemented using FFT.

After formation of the matrix, the location of the extremum (maximum in the cross-correlation operation) in each column is detected. This extremum location indicates the match between the current Polar vector and its predecessor. Thus, the vasomotion in each vector can be characterized, i.e., the radial movement in each specific angular sector of the vessel.
This information can be used to display the local vasomotion; it can be added up from some or all Polar vectors and averaged to determine an average value for the vasomotion; or it can be used for other purposes. Therefore, by evaluating local vasomotion, both local and global vasomotion can be evaluated.
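
A minimal sketch of this per-vector evaluation, assuming Polar images stored as arrays with one row per Polar (θ) vector; the function name, the shift range and the use of a circular shift are illustrative assumptions:

```python
import numpy as np

def local_vasomotion(polar_a, polar_b, max_shift=10):
    # For each Polar vector (row), cross-correlate the predecessor vector A
    # with radially shifted versions of the current vector B and take the
    # shift with the maximal correlation as the local vasomotion there.
    n_vectors = polar_a.shape[0]
    shifts = np.arange(-max_shift, max_shift + 1)
    per_vector_shift = np.zeros(n_vectors, dtype=int)
    for y in range(n_vectors):
        scores = [np.sum(np.roll(polar_b[y], s) * polar_a[y]) for s in shifts]
        per_vector_shift[y] = shifts[int(np.argmax(scores))]
    # Averaging the per-vector shifts yields a single global vasomotion value.
    return per_vector_shift, per_vector_shift.mean()
```
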
To be effectively used and/or expressed as quantitative physiological parameters, the magnitude of vasomotion must relate in some fashion to the vessel's actual caliber. Thus, measurements of vasomotion monitoring should generally be used in conjunction with automatic or manual measurements of the vessel's caliber.

Besides true vasomotion, Cartesian displacement may also be detected as vasomotion. This is because Cartesian displacement, when expressed in Polar coordinates, results in shifts along both the r and θ axes. To distinguish true vasomotion from Cartesian displacement, shift evaluation in the Cartesian image must indicate no, or little, motion. If Cartesian displacement is detected, then it must first be stabilized. Thereafter, the Cartesian coordinates may be converted back into Polar coordinates for vasomotion evaluation. This will allow greater success and provide more accurate results when determining actual vasomotion.

The graphs in Figure 8 illustrate the results of local vasomotion monitoring in a human coronary vessel in vivo. Local vasomotion monitoring was performed twice in approximately the same segment of the vessel, and consisted of 190 successive images as shown (X-axis) in Figures 8(a) and 8(b). The difference between the two graphs is that the vasomotion evaluation shown in Figure 8(a) was performed prior to treatment of the artery, i.e., pre-intervention, while the vasomotion evaluation shown in Figure 8(b) was performed after treatment of the artery, i.e., post-intervention.
In every image, vasomotion was assessed locally in every Polar vector and then all detected individual shifts were added and averaged to produce a single global vasomotion indication (Y-axis) for each image, i.e., an indication of vasomotion activity.

The units on the Y-axis do not have a direct physiological meaning because the actual caliber of the vessel was not calculated, but the relationship between the values in Figures 8(a) and 8(b) has a meaning because they were extracted from the same vessel. Thus, important information may be derived from these figures. Note how the vasomotion increased after treatment (maximal vasomotion from approximately 40 to approximately 150). Therefore, even though vasomotion was not fully quantified, a change in physiology (probably linked to the treatment) has been demonstrated.

Cardiovascular periodicity may be monitored solely based on information stored in IVUS images, thereby eliminating the need for an ECG or any other external signal.
This means that a link can be established between every image and its respective temporal phase in the cardiovascular cycle without need for an external signal. Once this linkage is established, then monitoring can substitute for the ECG signal in a large number of utilities which require cardiac gating. This monitoring may be accomplished using closeness operations between successive images. Moreover, the same closeness operations can produce information regarding the quality of IVUS images and their behavior.

The cardiac cycle manifests itself in the cyclic behavior of certain parameters that are extracted from IVUS images. If the behavior of these parameters is monitored, then the periodicity of the cardiac cycle can be determined. Knowing the frame acquisition rate will also allow the determination of the cardiovascular cycle as a temporal quantity.

The closeness between successive IVUS images is a parameter which clearly behaves in a periodic pattern. This is a result of the periodicity of most types of inter-image motion that are present. A closeness function may be formed in which each value results from a closeness operation between a pair of successive images. For example, a set of ten images will produce nine successive closeness values.

The closeness function can be derived from a cross-correlation type operation, SAD
operation or any other type of operation that produces a closeness type of function.
Normalized cross-correlation produces very good results when used for monitoring periodicity.

The following formula gives the cross-correlation coefficient (as a function of the Nth image) used for calculating the closeness function:

Correlation_function(N) = Σx,y B(x, y) * A(x, y) / sqrt( Σx,y A(x, y)^2 * Σx,y B(x, y)^2 )

where:
Correlation_function(N) = one dimensional function producing one value for every pair of images;
A = predecessor image matrix (the Nth image);
B = current image matrix (the (N+1)th image);
* = multiplication of pixel by corresponding pixel;
Σ = sum over all pixels in the matrix.
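
The following sketch computes this closeness function for a sequence of images using normalized cross-correlation, as in the formula above (the function name is illustrative):

```python
import numpy as np

def closeness_function(images):
    # Normalized cross-correlation coefficient between each pair of
    # successive images; N images yield N-1 closeness values.
    values = []
    for a, b in zip(images[:-1], images[1:]):
        a = a.astype(float)
        b = b.astype(float)
        den = np.sqrt(np.sum(a * a) * np.sum(b * b))
        values.append(np.sum(a * b) / den if den else 0.0)
    return np.array(values)
```
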

The correlation coefficient is a byproduct of the stabilization process, because the central value (shiftX = 0, shiftY = 0) of the normalized cross-correlation matrix (C(shiftX, shiftY)) is always computed. This holds true for all types of closeness functions used for stabilization. The central value of the closeness matrix (C(shiftX = 0, shiftY = 0)), whether cross-correlation or another type of operation used for stabilization, can always be used for producing a closeness function.

The closeness function can also be computed from images which are shifted one in relation to another, i.e., the value used to form the function is C(shiftX, shiftY) where shiftX and shiftY are not equal to zero. The closeness function need not necessarily be formed from whole images but can also be calculated from parts of images, either corresponding or shifted in relation to one another.

Figure 9 shows an ECG and cross-correlation coefficient plotted graphically in synchronous fashion. Both curves are related to the same set of images. Figure 9(a) shows a graph of the ECG signal and Figure 9(b) shows a graph of the cross-correlation coefficient derived from successive IVUS images. The horizontal axis displays the image number (a total of 190 successive images). As can be observed, the cross-correlation coefficient function in Figure 9(b) shows a periodic pattern, and its periodicity is the same as that displayed by the ECG signal in Figure 9(a) (both show approximately six heart beats).

Monitoring the periodicity of the closeness function may be complicated because the closeness function does not have a typical shape, it may vary in time, it depends on the type of closeness function used, and it may vary from vessel segment to vessel segment and from subject to subject.

To monitor the periodicity of the closeness function continuously and automatically, a variety of methods may be employed. One method, for example, is a threshold-type method. This method monitors for a value of the closeness function over a certain value known as a threshold. Once this value is detected, the method monitors for when the threshold is again crossed. The period is determined as the difference in time between the crossings of the threshold. An example of this method is shown in Figure 10 as a table. The table shows a group of cross-correlation coefficient values (middle row) belonging to successive images (numbers 1 through 10 shown in the top row). If the threshold, for example, is set to the value of 0.885, then this threshold is first crossed in the passage from image #2 to image #3. The threshold is crossed a second time in the passage from image #6 to image #7. Thus, the time period of the periodicity is the time taken to acquire 7 - 3 = 4 images.
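
A minimal sketch of the threshold-type method (the function name is illustrative; upward crossings of the threshold are assumed):

```python
def threshold_period(closeness, threshold=0.885):
    # Period (in images) as the spacing between successive upward
    # crossings of the threshold by the closeness function.
    crossings = [i for i in range(1, len(closeness))
                 if closeness[i - 1] < threshold <= closeness[i]]
    if len(crossings) < 2:
        return None
    return crossings[1] - crossings[0]
```
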

Another method that can be used to extract the cardiac periodicity from the closeness curve is internal cross-correlation. This method utilizes a segment of the closeness function, i.e., a group of successive values. For example, in the table shown in Figure 10, the segment may be comprised of the first four successive images, i.e., images #1 through #4. Once a segment is chosen, it is cross-correlated with itself, producing a cross-correlation value of 1. Next, this segment is cross-correlated with a segment of the same size extracted from the closeness function, but shifted one image forward.
This is repeated, with the segment shifted two images forward, and so on. In the example shown in Figure 10, the segment {0.8, 0.83, 0.89, 0.85} would be cross-correlated with a segment shifted by one image {0.83, 0.89, 0.85, 0.82}, then the segment {0.8, 0.83, 0.89, 0.85} would be cross-correlated with a segment shifted by two images {0.89, 0.85, 0.82, 0.87}, and so on. The bottom row of the table in Figure 10 shows the results of these internal cross-correlations. The first value of 1 is a result of the cross-correlation of the segment with itself. These cross-correlation values are examined to determine the location of the local maxima. In this example, they are located in image #1 and image #5 (their values are displayed in bold). The resulting periodicity is the difference between the location of the local maxima and the location from which the search was initiated (i.e., image #1). In this example, the periodicity is the time that elapsed from the acquisition of image #1 to image #5, which is 5 - 1 = 4 images. Once a period has been detected, the search begins anew using a segment surrounding the local maximum, e.g., image #5. In this example, the new segment could be the group of closeness values belonging to images #4 through #7.
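
A minimal sketch of the internal cross-correlation method, assuming a fixed segment length and search range (the names and defaults are illustrative):

```python
import numpy as np

def internal_period(closeness, start=0, seg_len=4, max_lag=8):
    # Cross-correlate a reference segment of the closeness function with
    # itself shifted by 0, 1, 2, ... images; the lag of the first local
    # maximum (after lag 0, which is always 1) is the period in images.
    ref = np.asarray(closeness[start:start + seg_len], dtype=float)
    corr = []
    for lag in range(max_lag + 1):
        seg = np.asarray(closeness[start + lag:start + lag + seg_len], dtype=float)
        if len(seg) < seg_len:
            break
        corr.append(np.sum(ref * seg) /
                    np.sqrt(np.sum(ref * ref) * np.sum(seg * seg)))
    for lag in range(1, len(corr) - 1):
        if corr[lag] >= corr[lag - 1] and corr[lag] >= corr[lag + 1]:
            return lag
    return None
```
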

Due to the nature of the type of calculation involved, the internal cross-correlation operation at a certain point in time requires the closeness values of images acquired at a future time. Thus, unlike the threshold method, this method requires the storage of images (in memory) and the periodicity detection is done retrospectively.
The cardiac periodicity can also be monitored by transforming the closeness curve into the temporal frequency domain by the Fourier transform. In the frequency domain the periodicity should appear as a peak at the corresponding frequency.
This peak can be detected using spectral analysis.
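
A minimal sketch of this frequency-domain approach, assuming the frame acquisition rate is known (the names are illustrative):

```python
import numpy as np

def spectral_period(closeness, frame_rate):
    # Locate the dominant peak of the closeness function in the temporal
    # frequency domain and convert it to a period in seconds.
    x = np.asarray(closeness, dtype=float)
    x = x - x.mean()                          # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / frame_rate)
    peak = int(np.argmax(spectrum[1:])) + 1   # skip the zero-frequency bin
    return 1.0 / freqs[peak]
```
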

The closeness function can provide additional important information about IVUS
images which cannot be extracted from external signals, such as ECG, that are not derived from the actual images. The behavior of this function can indicate certain states in the IVUS images or image parts used to form the closeness function.
Important features in the closeness function which are indicative of the state of the IVUS images are the presence of periodicity and the "roughness" of the closeness function. Normal IVUS images should exhibit a relatively smooth and periodic closeness function as displayed, for example, in Figure 9(b).

However, if "rol-ghness" and/or periodicity are not present then this could indicate some problem in the formation of IVUS images, i.e., the presence of an artifact in the ~5 image formation caused by, for example, either a mechanical or electronicmalfunction. The following figure helps to illustrate this. Figure 11 shows a graph of the cross-correlation coefficient derived from successive IVUS images. This graph is analogues, in its formation, to the cross-correlation plot in Figure 9(b), but in this example it is formed by a different im~ging catheter used in a different subject.
In this example, it is clear that the closeness function does not exhibit clear periodicity nor does it have a smooth appearance but rather a rough or spiky appea,~lce. In this CA 022406=,1 1998-06-16 case the behavior of the closeness graph was caused by the non-uniformity of therotation of the IVUS transducer responsible for emitting/collecting the ultrasonic signals displayed in the image. This type of artifact sometimes appears in IVUS
catheter-transducer assemblies in which there are moving mechanical parts.

The closeness function, when considered to reflect normal imaging conditions, can serve a further purpose. This is linked with the location of the maxima in each cycle of the closeness function. Locating these maxima may be important for image processing algorithms which process several successive images together. Images found near the maxima tend to have high closeness and little inter-image motion, one in relation to the other. Additionally, if images belonging to the same phase of successive cardiac cycles are required to be selected, it is usually best to select them using the maxima (of the closeness function) in each cycle.

In one display method, for example, these images are projected onto the display and the gaps are filled in by interpolated images. By this display method all types of periodic motion can be stabilized.

The shift logic stage in the stabilization process can also make use of cardiovascular periodicity monitoring. If drift is to be avoided, the accumulated shift after each (single) cardiac cycle should be small or zero, i.e., the sum of all shifts over the period of a cycle should result in zero or near zero. This means that the drift phenomenon can be limited by utilizing shift logic which is coupled to the periodicity monitoring.

Referring now to Figure 12, most IVUS images can be divided into three basic parts.
The central area (around the catheter), labeled as Lumen in Figure 12, is the actual lumen or interior passageway (cavity) through which fluid, e.g., blood, flows. Around the lumen is the actual vessel, labeled Vessel in Figure 12, composed of several layers of tissue and plaque (if diseased). Surrounding the vessel is other tissue, labeled Exterior in Figure 12, i.e., muscle or organ tissue, for example, the heart in the coronary vessel image.

When IVUS images are viewed dynamically (i.e., in film format), the display of the interior, where the blood flows, and of the exterior surrounding the vessel, usually shows a different temporal behavior than the vessel itself.

Automatically monitoring the temporal behavior of pixels in the dynamic IVUS image would allow use of the information extracted by the process to aid in interpretation of IVUS images. This information can be used to enhance IVUS displays by filtering and suppressing the appearance of fast changing features, such as fluid, e.g., blood, and the surrounding tissue, on account of their temporal behavior. This information can also be used for automatic segmentation, to determine the size of the lumen automatically by identifying the fluid, e.g., blood, and the surrounding tissue based on the temporal behavior of textural attributes formed by their composing pixels.

To accomplish automatic monitoring of temporal behavior there must be an evaluation of the relationship between attributes formed by corresponding pixels belonging to successive images. Extraction of temporal behavior bears resemblance to the methods used for closeness operations on a localized basis, as described previously.

High temporal changes are characterized by relatively large gray value changes of corresponding pixels when passing from one image to the next. These fast temporal changes may be suppressed in the display by expressing these changes through the formation of a mask which multiplies the original image. This mask reflects temporal changes in pixel values. A problem that arises in this evaluation is determining whether gray value changes in corresponding pixel values are due to flow or change in matter, or to movements of the vessel/catheter. Performing this evaluation on stabilized images overcomes, or at least minimizes, this problem.

The following definitions apply:
B = current (stabilized or non-stabilized) image;
A = predecessor (stabilized or non-stabilized) image;
C = successor (stabilized or non-stabilized) image;

abs = absolute value.

The matrices used can be either in Cartesian or Polar form.

The following operation, resulting in a matrix D1, shall be defined as follows: D1 is a matrix in which each pixel with coordinates X, Y is the sum of the absolute differences of its small surrounding neighborhood, e.g., 9 elements (X-1:X+1, Y-1:Y+1 - a 3 x 3 square), extracted from images A and B, respectively.

For example, the following illustration shows corresponding pixels (in bold) and their close neighborhood in matrices A and B.

[Illustration: matrices A and B with the corresponding pixels (7 in A, 4 in B) in bold, and the resulting matrix D1.]

The pixel in matrix D1, with the location corresponding to the pixels with value 4 (in B) and 7 (in A), will be assigned the following value:

abs(1-3) + abs(4-6) + abs(51-8) + abs(6-3) + abs(7-4) + abs(15-70) + abs(3-2) + abs(5-1) + abs(83-6) = 190

D2 is defined similarly, but for matrices B and C. D1 and D2 are, in effect, difference matrices which are averaged by using the 3 x 3 neighborhood in order to diminish local fluctuations or noise. Large gray value changes between images A and B or between B and C will be expressed as relatively high values in matrices D1 and D2 respectively.

A new matrix, Dmax, is next formed, in which every pixel is the maximum of the corresponding pixels in matrices D1 and D2:

Dmax = max(D1, D2)

where:
max(D1, D2) = each pixel in Dmax holds the higher of the two corresponding pixels in D1 and D2.

Thus, the single matrix Dmax particularly enhances large pixel changes between matrices A, B and C. A mask matrix (MD) is then formed from Dmax by normalization, i.e., each pixel in Dmax is divided by the maximal value of Dmax. Therefore, the pixel values of the mask MD range from zero to one.

The role of the mask is to multiply the current image B in the following manner, forming a new matrix or image defined as BOUT:

BOUT = (1 - MD^n) * B

where:
B = original current image;
BOUT = the new image;
n = the power to which each pixel in the matrix MD is raised; n is generally a number with a value, for example, of 2-10;
1 - MD^n = a matrix in which each pixel's value is one minus the value of the corresponding pixel of MD raised to the power n.

By performing the subtraction 1 - MD^n, small values of MD, which reflect slow changing features, become high values in 1 - MD^n. Moreover, the chance that only slow changing features will have high values is increased because of the prior enhancement of high MD values (by forming MD from the maximum of matrices D1 and D2).

The multiplication of the mask (1 - MD^n) by the current image B forms a new image BOUT in which the appearance of slow changing pixels is enhanced while fast changing pixels' values are decreased. The number n determines how strong the suppression of fast changing features will look on the display.
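
The following sketch implements the temporal filtering described above (the function names and the border handling are illustrative assumptions):

```python
import numpy as np

def neighborhood_sad(a, b, radius=1):
    # Difference matrix: each pixel is the sum of absolute differences
    # over a (2*radius+1)^2 neighborhood of A and B (a 3 x 3 square here).
    diff = np.abs(a.astype(float) - b.astype(float))
    rows, cols = diff.shape
    out = np.zeros_like(diff)
    for y in range(rows):
        for x in range(cols):
            out[y, x] = diff[max(0, y - radius):y + radius + 1,
                             max(0, x - radius):x + radius + 1].sum()
    return out

def temporal_filter(a, b, c, n=4):
    # BOUT = (1 - MD^n) * B, where MD is the normalized maximum of the
    # difference matrices D1 (A vs. B) and D2 (B vs. C).
    d1 = neighborhood_sad(a, b)
    d2 = neighborhood_sad(b, c)
    dmax = np.maximum(d1, d2)
    md = dmax / dmax.max() if dmax.max() > 0 else dmax
    return (1.0 - md ** n) * b.astype(float)
```
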

Figure 13 illustrates the results of temporal filtering. The left image is an original IVUS image (i.e., matrix B) from a coronary vessel, as it would look on the current display. The right image has undergone the processing steps described above, i.e., temporal filtering (matrix BOUT). Note that in the right image, blood and the surrounding tissue are filtered (suppressed) and lumen and vessel borders are much easier to identify.

Automatic segmentation differentiates fluid, e.g., blood, and exterior from the vessel wall based on the differences in the temporal behavior of a textural quality. As in the case of temporal filtering, this method is derived from the relationship between corresponding pixels from a number of successive images. If pixel values change because of inter-image motion, then performance of the algorithm will be degraded.
Performing stabilization prior to automatic segmentation will overcome, or at least minimize, this problem.

As in the case of temporal filtering, the following definitions shall apply:

B = current (stabilized or non-stabilized) image;
A = predecessor (stabilized or non-stabilized) image;
C = successor (stabilized or non-stabilized) image.

The matrices can be either in Cartesian or Polar form.
The textural quality can be defined as follows: Suppose the four nearest neighbors of a pixel with value "a" are "b," "c," "d" and "e"; then the classification of "a" will depend on its relations with "b," "c," "d" and "e." This can be shown with the following illustration:
  b
c a d
  e

The following categories can now be formed:

In the vertical direction:
if a>b and a>e then "a" is classified as belonging to the category I;
if a>b and a<e then "a" is classified as belonging to the category II;
if a<b and a<e then "a" is classified as belonging to the category III;
if a<b and a>e then "a" is classified as belonging to the category IV;
if a=b or a=e then "a" is classified as belonging to the category V.

In the horizontal direction:

if a>c and a>d then "a" is classified as belonging to the category I;
if a>c and a<d then "a" is classified as belonging to the category II;
if a<c and a<d then "a" is classified as belonging to the category III;
if a<c and a>d then "a" is classified as belonging to the category IV;
if a=c or a=d then "a" is classified as belonging to the category V.

The vertical and horizontal categories are next combined to form a new category. As a result, pixel "a" can now belong to 5 x 5 = 25 possible categories. This means that the textural quality of "a" is characterized by its belonging to one of those (25) categories.

For example, in the following neighborhood:

   7
10 10 14
   3

pixel "a" = 10 is classified as belonging to the category which includes category I vertical (because 10>7 and 10>3) and category V horizontal (because 10=10).
However, if pixel "a" had been situated in the following neighborhood, with horizontal neighbors 11 and 14:

it would have been classified as belonging to a different category because its horizontal category is now category III (10<11 and 10<14).

By determining the relationship of each pixel to its close neighborhood, a textural quality has been formed which classifies each pixel into 25 possible categories. The number of categories may vary (increase or decrease), for example, by changing the categorizing conditions, as may the number of close neighbors used; for example, instead of four, eight close neighbors may be used.
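
The following sketch assigns the combined (vertical x horizontal) category to each interior pixel; the numeric encoding of the categories is an illustrative assumption:

```python
import numpy as np

def pixel_categories(img):
    # Combined category (0..24) for each interior pixel, from its relations
    # to the vertical neighbors (b above, e below) and horizontal neighbors
    # (c left, d right).
    def direction_category(a, first, second):
        if a > first and a > second:
            return 0          # category I
        if a > first and a < second:
            return 1          # category II
        if a < first and a < second:
            return 2          # category III
        if a < first and a > second:
            return 3          # category IV
        return 4              # category V (equal to either neighbor)

    rows, cols = img.shape
    cats = np.zeros((rows, cols), dtype=int)
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            a = img[y, x]
            vert = direction_category(a, img[y - 1, x], img[y + 1, x])
            horiz = direction_category(a, img[y, x - 1], img[y, x + 1])
            cats[y, x] = 5 * vert + horiz
    return cats
```
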

The basic concept by which the textural changes are used to differentiate fluid, e.g., blood, from the vessel is by monitoring the change in categories of corresponding pixels in successive images. To accomplish this, the category of each and every pixel in matrices A, B and C is determined. Next, corresponding pixels are each tested to see if this category has changed. If it has, the pixel is suspected of being a fluid, e.g., blood, or surrounding tissue pixel. If it has not changed, then the pixel is suspected of being a vessel pixel.

The following example shows three corresponding pixels (with values 8, 12 and 14) and their neighborhoods in successive matrices A, B and C.

[Illustration: the three neighborhoods in matrices A, B and C; the middle rows are 9 8 11 (A), 19 12 13 (B) and 21 14 17 (C), with the corresponding pixels 8, 12 and 14 in the center.]

In this example, the category of the pixel valued 12 (in B) is the same as in A and C, so it will be classified as a pixel with a higher chance of being a vessel wall pixel. If, however, the situation were as shown below (20 in C changes to 13):

[Illustration: the same neighborhoods in matrices A, B and C, with the value 20 in C changed to 13.]

then pixels 8 in A and 12 in B have the same categories, but 14 in C has a different category than in the prior example. As a result, pixel 12 in B will be classified as a pixel with a higher chance of being a fluid (lumen), i.e., blood, or exterior tissue pixel.

The classification method described so far monitors the change in the texture or pattern associated with the small neighborhood around each pixel. Once this change is determined as described above, each pixel can be assigned a binary value, for example, a value of 0 if it is suspected to be a vessel pixel, or a value of 1 if it is suspected to be a blood pixel or a pixel belonging to the vessel's exterior. The binary image serves as an input for the process of identification of the lumen, and the original pixel values cease to play a role in the segmentation process.
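
A minimal sketch of the resulting binary image, reusing the pixel_categories() sketch above (the function name is illustrative; 1 marks suspected blood/exterior pixels and 0 marks suspected vessel pixels, as described above):

```python
import numpy as np

def binary_change_map(a, b, c):
    # 1 where the combined texture category of a pixel differs between
    # A, B and C (suspected blood or exterior), 0 where it is unchanged
    # (suspected vessel wall). pixel_categories() is the earlier sketch.
    cat_a, cat_b, cat_c = pixel_categories(a), pixel_categories(b), pixel_categories(c)
    changed = (cat_a != cat_b) | (cat_b != cat_c)
    return changed.astype(np.uint8)
```
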

Identification of the lumen using the binary image is based on two assumptions which are generally valid in IVUS images processed in the manner described above. The first is that the areas in the image which contain blood or are found on the exterior of the vessel are characterized by a high density of pixels with a binary value of 1 (or a low density of pixels with a value of zero). The term density is needed because there are always pixels which are misclassified. The second assumption is that, from a morphological point of view, connected areas of high density of pixels with the value of 1 (lumen) should be found around the catheter and surrounded by connected areas of low density of pixels with the value of 1 (vessel), which are in turn surrounded again by connected areas of high density of pixels with the value of 1 (vessel's exterior). The reason for this assumption is the typical morphological arrangement expected from a blood vessel.

These two assumptions form the basis of the subsequent processing algorithm which extracts the actual area associated with the lumen out of the binary image. This algorithm can utilize known image processing techniques, such as thresholding the density feature in localized regions (to distinguish blood/exterior from vessel) and morphological operators such as dilation or linking to inter-connect and form a connected region which should represent the actual lumen found within the vessel wall limits.

Figure 14 shows an image of the results of the algorithm for automatic extraction of the lumen. The image is an original IVUS image (for example, as described above as image B) and the lumen borders are superimposed (by the algorithm) as a bright line.
The algorithm for the extraction of the lumen borders was based on the monitoring of the change in the textural quality described above, using three successive images.

The examples described above of temporal filtering and automatic segmentation include the use of two additional images (for example, as described above as images A and C) in addition to the current image (for example, as described above as image B). However, both of these methods could be modified to utilize fewer (i.e., only one additional image) or more additional images.

The performance of the two methods described above will be greatly enhanced if combined with cardiovascular periodicity monitoring. This applies, in particular, to successive images for which cardiovascular periodicity monitoring produces high inter-image closeness values. Those images usually have no inter-image motion.
Thus, the most reliable results can be expected when successive images with maximal inter-image closeness are fed as inputs to either temporal filtering or automatic segmentation.

During treatment of vessels using catheterization, it is a common practice to repeat IVUS pullback examinations in the same vessel segment. For example, a typical situation is first to review the segment in question, evaluate the disease (if any), remove the IVUS catheter, consider therapy options, perform therapy and then immediately after (during the same session) examine the treated segment again using IVUS in order to assess the results of therapy.

To properly assess the results of such therapy, the pre-treatment and post-treatment segments which lie at the same locations along the length of the vessel, i.e., corresponding segments, should be compared. The following method provides for matching, i.e., automatic identification (registration), of corresponding segments.

To accomplish matching of corresponding segments, closeness/similarity operations are applied between images belonging to a first group of successive images, i.e., a reference segment, of a first pullback film and images belonging to a second group of successive images of a second pullback film. Matching of the reference segment in the first film to its corresponding segment in the second film is obtained when some criteria function is maximized.
From either one of the two films a reference segment is chosen. The reference segment may be a group of successive images representing, for example, a few seconds of film of an IVUS image. It is important to select the reference segment from a location in the vessel which is present in the two films and has undergone no change as a result of any procedure, i.e., the reference segment is proximal or distal to the treated segment.

As an example, the table in Figure 15 will help clarify the method for matching of corresponding segments.
The left column shows the time sequence of the first film; in this case the film consists of twenty successive images. The middle column shows the reference segment which is selected from the second film and consists of 10 successive images. The right column lists the 10 successive images from the first film (#5 - #14) which actually correspond to (or match) the images of the reference segment from the second film (#1 - #10). The purpose of the matching process is to actually reveal this correspondence.

Once a reference segment is chosen, it is shifted along the other film, one image (or more) each time, and a set of stabilization and closeness operations is performed between the corresponding images in each segment. The direction of the shift depends on the relative location of the reference segment in the time sequence of the two films. However, in general, if this is not known, the shift can be performed in both directions.

For example, where:
r = reference segment; and
f = first film,
the first set of operations will take place between the images comprising the following pairs: r#1-f#1, r#2-f#2, r#3-f#3, . . ., r#10-f#10.

The second set of operations will take place between the images comprising the following pairs: r#1-f#2, r#2-f#3, r#3-f#4, . . ., r#10-f#11.

The third set of operations will take place between the images comprising the following pairs: r#1-f#3, r#2-f#4, r#3-f#5, . . ., r#10-f#12, and so on. As can be observed in this example, the shifting is performed by a single image each time and in one direction only.

For example, the following operations between the images in each pair may be performed. First, an image from the reference segment is stabilized for rotational and Cartesian motion in relation to its counterpart in the first film. Then closeness operations are performed between the images in each pair. This operation can be, for example, normalized cross-correlation (discussed above in relation to periodicity detection). Each such operation produces a closeness value, for example, a cross-correlation coefficient when normalized cross-correlation is used. A set of such operations will produce a number of cross-correlation values. In the example shown in the table of Figure 15, each time the reference segment is shifted, ten new cross-correlation coefficients will be produced.

The closeness values produced by a set of operations can then be mapped into some type of closeness function, for example, an average function. Using the above example, the cross-correlation coefficients are summed up and then divided by the number of pairs, i.e., ten. Each set of operations therefore results in a single value, i.e., an average closeness, which should represent the degree of closeness between the reference segment and its temporary counterpart in the first film. Thus, the result of the first set of operations will be a single value, the result of the second set of operations will be another value, etc.

We can expect that the maximal average closeness will occur as a result of the operations performed between segments which are very alike, i.e., corresponding or matching segments.

In the above example, these segments should be matched during the fifth set of operations, which take place between the images comprising the following pairs: r#1-f#5, r#2-f#6, r#3-f#7, . . ., r#10-f#14.

The maximal average closeness should, therefore, indicate corresponding segments, because each pair of images are, in fact, corresponding images, i.e., they show the same morphology. The criteria might not, however, follow this algorithm. It may, for example, take into account the form of the closeness function, derived from many shifted segment positions, instead of using only one of its values, which turns out to be the maximum.
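
The following sketch illustrates the matching of a reference segment by average closeness (the function names are illustrative; the per-pair stabilization step described above is omitted for brevity):

```python
import numpy as np

def match_reference_segment(reference, film):
    # Shift the reference segment along the film one image at a time,
    # average the pairwise normalized cross-correlation values for each
    # position, and return the offset with the maximal average closeness.
    def closeness(a, b):
        a, b = a.astype(float), b.astype(float)
        den = np.sqrt(np.sum(a * a) * np.sum(b * b))
        return np.sum(a * b) / den if den else 0.0

    n_ref = len(reference)
    averages = []
    for offset in range(len(film) - n_ref + 1):
        pairs = zip(reference, film[offset:offset + n_ref])
        averages.append(np.mean([closeness(r, f) for r, f in pairs]))
    best = int(np.argmax(averages))
    return best, averages   # best offset and the full average-closeness curve
```
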

Once corresponding segments are identified, the complete first and second films may be synchronized one in relation to the other. This will be a result of an appropriate frame shift, revealed by the matching process, implemented in one film in relation to the other. Thus, when watching the two films side by side, the pre-treated segment will appear concurrently with the post-treated segment.
Besides synchronizing the corresponding segments, the above operation also stabilizes the corresponding segments one in relation to the other. This further enhances the ability to understand the changes in morphology. Thus, even though the catheter's position and orientation are likely to have changed when it is reinserted in the vessel, the images in the pre-treatment and post-treatment films will nevertheless be stabilized in relation to each other.
The number of images used for the reference segment may vary. The more images used in the matching process, the more robust and less prone to local errors it will be.
However, the tradeoff is more computational time required for the calculations in each matching process as the number of pairs increases.

It is important in acquiring the pullback films that the pullback rate remains stable and is known. It is preferred that the pullback rate be identical in the two acquisitions.

Many different variations of the present invention are possible. The various features described above may be incorporated individually and independently of one another.
These features may also be combined in various groupings.

Claims (144)

What is claimed is:
1. An intravascular ultrasound imaging device, comprising:
an ultrasound signal transmitter and detector, located within a bodily lumen;
and a processor coupled to the ultrasound signal transmitter and detector, the processor programmed to:
a. derive a first image from detected ultrasound signals,
b. derive a second image from the detected ultrasound signals,
c. compare the second image to the first image, and
d. process the first and second images.
2. The device according to claim 1, wherein comparing the second image to the first image includes evaluating the second image in relation to the first image.
3. The device according to claim 1, wherein the processor programmed to derive includes at least one of processing and digitizing.
4. The device according to claim 1, further comprising a display coupled to the processor.
5. The device according to claim 1, wherein the deriving includes configuring a two-dimensional array.
6. The device according to claim 5, wherein the two-dimensional array is configured in at least one of Polar coordinates and Cartesian coordinates.
7. The device according to claim 5, wherein the two-dimensional array is configured in Polar coordinates and Cartesian coordinates.
8. The device according to claim 5, wherein the two-dimensional array having a plurality of elements, each of the plurality of elements representing a detected ultrasound signal from a predetermined spatial location.
9. The device according to claim 2, wherein evaluating the second image in relation to the first image includes shift evaluation.
10. The device according to claim 2, wherein evaluating the second image in relation to the first image includes at least one closeness operation.
11. The device according to claim 10, wherein the at least one closeness operation including at least one of cross-correlation, normalized cross-correlation and SAD.
12. The device according to claim 11, wherein the cross-correlation including at least one of direct cross-correlation and Fourier transform.
13. The device according to claim 2, wherein evaluating the second image in relation to the first image is accomplished using at least one of Cartesian coordinates and Polar coordinates.
14. The device according to claim 2, wherein evaluating the second image in relation to the first image is accomplished in at least one dimension.
15. The device according to claim 1, wherein the processor being further programmed to detect at least one of Cartesian displacement, rotational movement and vasomotion.
16. The device according to claim 15, wherein at least one of the Cartesian displacement and the rotational movement is rigid.
17. The device according to claim 15, wherein at least one of the Cartesian displacement and the rotational movement is local.
18. The device according to claim 15, wherein the vasomotion is global.
19. The device according to claim 15, wherein the vasomotion is local.
20. The device according to claim 1, wherein the processor being further programmed to automatically monitor change in detected ultrasound signals for at least one of image enhancement and lumen identification.
21. The device according to claim 20, wherein the processing includes at least one of classification of temporal change in texture and temporal filtering.
22. The device according to claim 1, wherein the processor being further programmed to automatically monitor cardiovascular periodicity.
23. The device according to claim 1, wherein the processor being further programmed to automatically monitor image quality.
24. An intravascular ultrasound imaging device, comprising:
an ultrasound signal transmitter and detector located within a bodily lumen;
and a processor coupled to the ultrasound signal transmitter and detector, the processor programmed to:
a. derive a first image from a first set of detected ultrasound signals,
b. derive a second image from a second set of detected ultrasound signals,
c. compare the second image to the first image,
d. automatically monitor change in detected ultrasound signals,
e. automatically monitor cardiovascular periodicity, and
f. stabilize the second image in relation to the first image.
25. An intravascular ultrasound imaging device, comprising:
an ultrasound signal transmitter and detector located within a bodily lumen and moving through a section of the bodily lumen;
a processor coupled to the ultrasound signal transmitter and detector, the processor programmed to:
a. derive a first image from ultrasound signals detected during a first movement of the ultrasound signal transmitter and detector through the section,
b. derive a second image from ultrasound signals detected during a second movement of the ultrasound signal transmitter and detector through the section,
c. compare the second image to the first image, and
d. process the first and second images; and
a display coupled to the processor, wherein the processor adjusting a display of the second image based on the comparison.
26. An intravascular ultrasound imaging device, comprising:
an ultrasound signal transmitter and detector located within a bodily lumen and moving through a section of the bodily lumen;
a processor coupled to the ultrasound signal transmitter and detector, the processor programmed to:
a. derive a first image from ultrasound signals detected from a first portion of the section,
b. derive a second image from ultrasound signals detected from a second portion of the section,
c. compare the second image to the first image, and
d. process the first and second images; and
a display coupled to the processor, wherein the processor adjusting a display of the second image based on the comparison.
27. An intravascular ultrasound imaging device, comprising:
an ultrasound signal transmitter and detector located within a bodily lumen;
and a processor coupled to the ultrasound signal transmitter and detector, the processor programmed to:
a. derive a first image from a first set of detected ultrasound signals,
b. derive a second image from a second set of detected ultrasound signals,
c. perform automatic monitoring, and
d. evaluate the second image in relation to the first image.
28. The device according to claim 27, wherein the processor automatically monitors the first image and the second image for vasomotion.
29. The device according to claim 28, wherein the vasomotion is at least one of local vasomotion and global vasomotion.
30. The device according to claim 27, wherein the processor is further programmed to form a closeness function.
31. The device according to claim 30, wherein the closeness function is formed using at least one of cross-correlation, normalized cross-correlation and SAD.
32. The device according to claim 30, wherein the processor automatically monitors the closeness function for cardiovascular periodicity.
33. The device according to claim 32, wherein the processor automatically monitors the closeness function for cardiovascular periodicity using at least one of threshold crossing, internal closeness, Fourier transform and spectral analysis.
34. The device according to claim 30, wherein the closeness function is analyzed for image quality.
35. The device according to claim 30, wherein the evaluating includes shift evaluation.
36. An intravascular ultrasound imaging device, comprising:
an ultrasound signal transmitter and detector located within a bodily lumen;
and a processor coupled to the ultrasound signal transmitter and detector, the processor programmed to:
a. derive a first image from a first set of detected ultrasound signals,
b. derive a second image from a second set of detected ultrasound signals,
c. evaluate the second image in relation to the first image, and
d. stabilize the second image in relation to the first image.
37. The device according to claim 36, further comprising a display coupled to the processor for displaying the first image and the stabilized second image.
38. The device according to claim 36, wherein stabilizing the second image in relation to the first image is accomplished using at least one of Cartesian coordinates and Polar coordinates.
39. The device according to claim 36, wherein stabilizing the second image in relation to the first image is accomplished in at least one dimension.
40. The device according to claim 36, wherein stabilizing includes stabilizing for at least one of Cartesian displacement, rotational movement and vasomotion.
41. The device according to claim 40, wherein stabilizing includes stabilizing for at least one of global, local and rigid movement.
42. The device according to claim 36, wherein stabilizing includes stabilizing each of a plurality of locations in the second image.
43. The device according to claim 36, wherein stabilizing includes shifting the second image.
44. The device according to claim 36, wherein stabilizing includes adjusting the second image based on the evaluation.
45. The device according to claim 36, wherein the processor being further programmed to limit drift.
46. The device according to claim 43, wherein the processor being further programmed to limit drift by adjusting the shifting of the second image using information derived from cardiovascular periodicity monitoring.
47. A method for intravascular ultrasound imaging, comprising the steps of:
placing an ultrasound signal transmitter and detector, within a bodily lumen;
detecting ultrasound signals;
deriving a first image from detected ultrasound signals;
deriving a second image from the detected ultrasound signals;
comparing the second image to the first image; and processing the first image and the second image.
48. The method according to claim 47, further comprising the step of displaying the first image and the second image.
49. The method according to claim 47, wherein the comparing includes evaluating the second image in relation to the first image.
50. The method according to claim 47, wherein the deriving includes at least one of processing and digitizing.
51. The method according to claim 47, wherein the deriving includes configuring a two-dimensional array.
52. The method according to claim 51, wherein the two-dimensional array is configured in at least one of Polar coordinates and Cartesian coordinates.
53. The method according to claim 51, wherein the two-dimensional array having a plurality of elements, each of the plurality of elements representing a detected ultrasound signal from a predetermined spatial location.
54. The method according to claim 49, wherein the evaluating includes shift evaluation.
55. The method according to claim 49, wherein the evaluating includes at least one closeness operation.
56. The method according to claim 55, wherein at least one closeness operation includes at least one of cross-correlation, normalized cross-correlation and SAD.
57. The method according to claim 56, wherein the cross-correlation includes at least one of direct cross-correlation and Fourier transform.
58. The method according to claim 49, wherein the evaluating is accomplished using at least one of Cartesian coordinates and Polar coordinates.
59. The method according to claim 49, wherein the evaluating is accomplished in at least one dimension.
60. The method according to claim 47, further comprising the step of detecting at least one of Cartesian displacement, rotational movement and vasomotion.
61. The method according to claim 60, wherein at least one of the Cartesian displacement and the rotational movement is rigid.
62. The method according to claim 60, wherein at least one of the Cartesian displacement and the rotational movement is local.
63. The method according to claim 60, wherein the vasomotion is global.
64. The method according to claim 60, wherein the vasomotion is local.
65. The method according to claim 47, further comprising the step of automatically monitoring change in detected ultrasound signals.
66. The method according to claim 65, further comprising the step of image enhancement.
67. The method according to claim 65, further comprising the step of lumen identification.
68. The method according to claim 47, further comprising the step of automatically monitoring cardiovascular periodicity.
69. The method according to claim 47, further comprising the step of automatically monitoring image quality.
70. A method for intravascular ultrasound imaging, comprising the steps of:
placing an ultrasound signal transmitter and detector within a bodily lumen;
detecting ultrasound signals;
deriving a first image from a first set of detected ultrasound signals;
deriving a second image from a second set of detected ultrasound signals;
automatic monitoring;
evaluating the second image in relation to the first image; and processing the first image and the second image.
71. The method according to claim 70, further comprising the step of forming a closeness function.
72. The method according to claim 71, wherein the closeness function is formed using at least one of cross-correlation, normalized cross-correlation and SAD.
73. The method according to claim 70, wherein the automatic monitoring monitors the first image and the second image for vasomotion.
74. The method according to claim 73, wherein the vasomotion is at least one of local vasomotion and global vasomotion.
75. The method according to claim 71, wherein the automatic monitoring monitors the closeness function for cardiovascular periodicity.
76. The method according to claim 75, wherein the automatic monitoring includes at least one of threshold crossing, internal closeness, Fourier transform and spectral analysis.
77. The method according to claim 71, wherein the closeness function is analyzed for image quality.
78. The method according to claim 70, wherein the evaluating includes shift evaluation.
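Claim 76 lists threshold crossing, Fourier transform and spectral analysis as ways of monitoring a closeness function for cardiovascular periodicity. The Python sketch below follows the spectral route under the assumption that one closeness value is available per frame at a known frame rate; the strongest non-DC spectral peak is read as the dominant cardiac frequency. The function name and return convention are illustrative assumptions.

import numpy as np

def cardiac_periodicity(closeness, frame_rate_hz):
    """Estimate cardiovascular periodicity from a per-frame closeness trace.

    closeness     : 1-D sequence, one closeness value per consecutive frame pair.
    frame_rate_hz : imaging frame rate in Hz.
    Returns (period_in_frames, frequency_in_hz) of the strongest non-DC peak.
    """
    x = np.asarray(closeness, dtype=float)
    x -= x.mean()                                  # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / frame_rate_hz)
    k = 1 + int(np.argmax(spectrum[1:]))           # skip the zero-frequency bin
    return frame_rate_hz / freqs[k], freqs[k]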
79. A method for intravascular ultrasound imaging, comprising the steps of:
placing an ultrasound signal transmitter and detector within a bodily lumen;
detecting ultrasound signals;
deriving a first image from a first set of detected ultrasound signals;
deriving a second image from a second set of detected ultrasound signals;
evaluating the second image in relation to the first image; and stabilizing the second image in relation to the first image.
80. The method according to claim 79, further comprising the step of displaying the first image and the stabilized second image.
81. The method according to claim 79, wherein the stabilizing is accomplished using at least one of Cartesian coordinates and Polar coordinates.
82. The method according to claim 79, wherein the stabilizing is accomplished in at least one dimension.
83. The method according to claim 79, wherein the stabilizing includes stabilizing for at least one of Cartesian displacement, rotational movement and vasomotion.
84. The method according to claim 83, wherein the stabilizing includes stabilizing for at least one of global, local and rigid movement.
85. The method according to claim 79, wherein the stabilizing includes stabilizing each of a plurality of locations in the second image.
86. The method according to claim 79, wherein the stabilizing includes shifting the second image.
87. The method according to claim 79, wherein the stabilizing includes adjusting the second image based on the evaluating.
88. The method according to claim 79, further comprising the step of limiting drift.
89. The method according to claim 86, further comprising the step of limiting drift, and wherein the limiting includes adjusting the shifting of the second image using information derived from cardiovascular periodicity monitoring.
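Claims 83 to 89 recite stabilizing the second image by shifting it according to the evaluation and limiting drift using information from cardiovascular periodicity monitoring. The sketch below is one illustrative realisation: the inter-frame shift is taken from the peak of an FFT cross-correlation surface, each frame is rolled by the accumulated shift, and the accumulation is reset at cardiac cycle boundaries so that small estimation errors cannot drift without bound. The cycle-boundary input and helper names are assumptions, not the claimed method.

import numpy as np

def estimate_shift(reference, moving):
    """Return the (row, column) roll that best aligns `moving` to `reference`,
    read from the peak of the cyclic cross-correlation surface."""
    surface = np.real(np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(moving))))
    peak = np.unravel_index(np.argmax(surface), surface.shape)
    # map peaks in the upper half of each axis to negative shifts
    return tuple(int(p - s) if p > s // 2 else int(p) for p, s in zip(peak, surface.shape))

def stabilize_sequence(frames, cycle_starts):
    """Shift each frame toward its predecessor; reset the accumulated shift at
    every cardiac cycle start to limit drift."""
    out, accumulated = [frames[0]], np.zeros(2, dtype=int)
    for i in range(1, len(frames)):
        if i in cycle_starts:
            accumulated[:] = 0                     # drift limiting at a cycle boundary
        accumulated += np.array(estimate_shift(frames[i - 1], frames[i]))
        out.append(np.roll(frames[i], shift=tuple(accumulated), axis=(0, 1)))
    return out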
90. A method for intravascular ultrasound imaging, comprising the steps of:
placing an ultrasound signal transmitter and detector within a bodily lumen;
detecting ultrasound signals;
deriving a first image from a first set of detected ultrasound signals;
deriving a second image from a second set of detected ultrasound signals;
comparing the second image to the first image;
automatically monitoring change in detected ultrasound signals;
automatically monitoring cardiovascular periodicity; and stabilizing the second image in relation to the first image.
91. A method for intravascular ultrasound imaging, comprising the steps of:
placing an ultrasound signal transmitter and detector within a bodily lumen;
the ultrasound signal transmitter and detector moving through a section of the bodily lumen;
detecting ultrasound signals;
deriving a first image from ultrasound signals detected during a first movement of the ultrasound signal transmitter and detector through the section;
deriving a second image from ultrasound signals detected during a second movement of the ultrasound signal transmitter and detector through the section;
comparing the second image to the first image;
adjusting the second image; and displaying an adjusted second image.
92. A method for intravascular ultrasound imaging, comprising the steps of:
placing an ultrasound signal transmitter and detector within a bodily lumen, the ultrasound signal transmitter and detector moving through a section of the bodily lumen;
detecting ultrasound signals;
deriving a first image from ultrasound signals detected from a first portion of the section;
deriving a second image from ultrasound signals detected from a second portion of the section;
comparing the second image to the first image;
adjusting the second image; and displaying an adjusted second image.
93. A method for intravascular ultrasound imaging, comprising the steps of:
placing an ultrasound signal transmitter and detector within a bodily lumen;
detecting ultrasound signals;
deriving a first series of images from a first set of detected ultrasound signals;
deriving a second series of images from a second set of detected ultrasound signals;
comparing the first series of images to the second series of images; and automatically matching the first series of images and the second series of images.
94. The method according to claim 93, wherein the matching includes identification of corresponding images.
95. The method according to claim 93, wherein at least a portion of the first series of images is a reference segment, and wherein at least a portion of the second series of images is a non-reference segment.
96. The method according to claim 95, wherein the matching includes the non-reference segment being shifted by an image in relation to the reference segment.
97. The method according to claim 95, wherein the matching includes the non-reference segment being stabilized in relation to the reference segment.
98. The method according to claim 97, wherein the stabilization is individually performed on each of the corresponding images from the reference and non-reference segments.
99. The method according to claim 97, wherein the stabilization is individually performed on each of the corresponding images from the first series and the second series of images.
100. The method according to claim 93, wherein the matching includes a closeness operation.
101. The method according to claim 100, wherein the closeness operation includes one of cross-correlation and normalized cross-correlation.
102. The method according to claim 93, wherein the first series of images is derived from a first movement of the ultrasound signal transmitter and detector along a first section of the bodily lumen, and wherein the second series of images is derived from a second movement of the ultrasound signal transmitter and detector along a second section of the bodily lumen.
103. The method according to claim 102, wherein the first section and the second section of the bodily lumen are approximately coextensive.
104. The method according to claim 93, wherein the comparing includes evaluating the second image in relation to the first image.
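Claims 93 to 104 recite automatically matching a first and a second series of images, for example images acquired on two passes over approximately the same section of the lumen, by identifying corresponding images and the offset of a non-reference segment relative to a reference segment using a closeness operation. A minimal sketch follows, assuming normalized cross-correlation as the closeness measure and a small search window of whole-series frame offsets; the helper names and mean-closeness scoring are illustrative choices.

import numpy as np

def ncc(a, b):
    # zero-mean normalized cross-correlation between two equally sized frames
    a0, b0 = a - a.mean(), b - b.mean()
    denom = np.sqrt((a0 ** 2).sum() * (b0 ** 2).sum())
    return float((a0 * b0).sum() / denom) if denom else 0.0

def match_series(reference, non_reference, max_offset=10):
    """Find the frame offset of the non-reference series that maximizes the
    mean closeness to the reference series, and pair the corresponding images."""
    best_offset, best_score = 0, -np.inf
    for offset in range(-max_offset, max_offset + 1):
        pairs = [(i, i + offset) for i in range(len(reference))
                 if 0 <= i + offset < len(non_reference)]
        if not pairs:
            continue
        score = np.mean([ncc(reference[i], non_reference[j]) for i, j in pairs])
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, [(i, i + best_offset) for i in range(len(reference))
                         if 0 <= i + best_offset < len(non_reference)]

Each resulting pair of corresponding images can then be stabilized individually, in the manner recited in claims 97 to 99.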
105. An intravascular ultrasound imaging device, comprising:
an ultrasound signal transmitter and detector located within a bodily lumen;
and a processor coupled to the ultrasound signal transmitter and detector, the processor programmed to:

a. derive a first series of images from a first set of detected ultrasound signals,
b. derive a second series of images from a second set of detected ultrasound signals,
c. compare the first series of images to the second series of images, and
d. automatically match the first series of images and the second series of images.
106. The device according to claim 105, wherein comparing the second image to the first image includes evaluating the second image in relation to the first image.
107. The device according to claim 105, wherein the matching includes identification of corresponding images.
108. The device according to claim 105, wherein at least a portion of the first series of images is a reference segment, and wherein at least a portion of the second series of images is a non-reference segment.
109. The device according to claim 108, wherein the matching includes the non-reference segment being shifted by an image in relation to the reference segment.
110. The device according to claim 108, wherein the matching includes the non-reference segment being stabilized in relation to the reference segment.
111. The device according to claim 110, wherein the stabilization is individually performed on each of the corresponding images from the reference and non-reference segments.
112. The device according to claim 110, wherein the stabilization is individually performed on each of the corresponding images from the first series and the second series of images.
113. The device according to claim 105, wherein the matching includes a closeness operation.
114. The device according to claim 113, wherein the closeness operation includes one of cross-correlation and normalized cross-correlation.
115. The device according to claim 105, wherein the first series of images is derived from a first movement of the ultrasound signal transmitter and detector along a first section of the bodily lumen, and wherein the second series of images is derived from a second movement of the ultrasound signal transmitter and detector along a second section of the bodily lumen.
116. The device according to claim 115, wherein the first section and the second section of the bodily lumen are approximately coextensive.
117. The device according to claim 1, further comprising a probe coupled to the ultrasound signal transmitter and detector.
118. The device according to claim 117, wherein the probe is at least one of a catheter and a guide wire.
119. The device according to claim 1, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
120. The device according to claim 25, further comprising a probe coupled to the ultrasound signal transmitter and detector and moving the ultrasound signal transmitter and detector through the section.
121. The device according to claim 120, wherein the probe is at least one of a catheter and a guide wire.
122. The device according to claim 25, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
123. The device according to claim 26, further comprising a probe coupled to the ultrasound signal transmitter and detector and moving the ultrasound signal transmitter and detector through the section.
124. The device according to claim 123, wherein the probe is at least one of a catheter and a guide wire.
125. The device according to claim 26, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
126. The device according to claim 27, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
127. The device according to claim 36, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
128. The method according to claim 47, wherein the ultrasound signal transmitter and detector is coupled to a probe.
129. The method according to claim 128, wherein the probe is at least one of a catheter and a guide wire.
130. The method according to claim 47, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
131. The method according to claim 70, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
132. The method according to claim 79, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
133. The method according to claim 91, wherein the ultrasound signal transmitter and detector is coupled to a probe, the probe moving the ultrasound signal transmitter and detector.
134. The method according to claim 133, wherein the probe is at least one of a catheter and a guide wire.
135. The method according to claim 91, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
136. The method according to claim 92, wherein the ultrasound signal transmitter and detector is coupled to a probe, the probe moving the ultrasound signal transmitter and detector.
137. The method according to claim 136, wherein the probe is at least one of a catheter and a guide wire.
138. The method according to claim 92, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
139. The method according to claim 102, wherein the ultrasound signal transmitter and detector is coupled to a probe, the probe moving the ultrasound signal transmitter and detector.
140. The method according to claim 139, wherein the probe is at least one of a catheter and a guide wire.
141. The method according to claim 93, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
142. The device according to claim 115, further comprising a probe coupled to the ultrasound signal transmitter and detector and moving the ultrasound signal transmitter and detector through the section.
143. The device according to claim 142, wherein the probe is at least one of a catheter and a guide wire.
144. The device according to claim 105, wherein the ultrasound signal transmitter and detector includes an independent transmitter and an independent detector.
CA002240651A 1997-06-19 1998-06-16 Intravascular ultrasound enhanced image and signal processing Abandoned CA2240651A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US87912597A 1997-06-19 1997-06-19
US08/879,125 1997-06-19

Publications (1)

Publication Number Publication Date
CA2240651A1 true CA2240651A1 (en) 1998-12-19

Family

ID=25373482

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002240651A Abandoned CA2240651A1 (en) 1997-06-19 1998-06-16 Intravascular ultrasound enhanced image and signal processing

Country Status (25)

Country Link
US (2) US6095976A (en)
EP (2) EP0885594B1 (en)
JP (1) JPH11151246A (en)
KR (1) KR19990007305A (en)
AR (1) AR015123A1 (en)
AT (2) ATE283497T1 (en)
AU (2) AU8032298A (en)
BR (1) BR9814778A (en)
CA (1) CA2240651A1 (en)
CZ (1) CZ190198A3 (en)
DE (3) DE69827857T2 (en)
DK (1) DK0885594T3 (en)
EE (1) EE04166B1 (en)
ES (1) ES2192290T3 (en)
GB (1) GB2326479B (en)
HK (1) HK1050242A1 (en)
IL (1) IL125000A (en)
NO (1) NO982817L (en)
PL (1) PL326831A1 (en)
PT (1) PT885594E (en)
RU (1) RU2238041C2 (en)
SG (1) SG68672A1 (en)
SK (1) SK87098A3 (en)
UA (1) UA57011C2 (en)
WO (1) WO1998057580A1 (en)

Families Citing this family (212)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6240307B1 (en) 1993-09-23 2001-05-29 Endocardial Solutions, Inc. Endocardial mapping system
US7930012B2 (en) * 1992-09-23 2011-04-19 St. Jude Medical, Atrial Fibrillation Division, Inc. Chamber location method
WO1994006349A1 (en) * 1992-09-23 1994-03-31 Endocardial Therapeutics, Inc. Endocardial mapping system
US7189208B1 (en) * 1992-09-23 2007-03-13 Endocardial Solutions, Inc. Method for measuring heart electrophysiology
US6027451A (en) * 1997-09-26 2000-02-22 Ep Technologies, Inc. Method and apparatus for fixing the anatomical orientation of a displayed ultrasound generated image
US5885218A (en) * 1997-11-07 1999-03-23 Scimed Life Systems, Inc. Method and apparatus for spatial filtering in an intravascular ultrasound imaging system
US7806829B2 (en) * 1998-06-30 2010-10-05 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for navigating an ultrasound catheter to image a beating heart
US7670297B1 (en) 1998-06-30 2010-03-02 St. Jude Medical, Atrial Fibrillation Division, Inc. Chamber mapping system
US7263397B2 (en) 1998-06-30 2007-08-28 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for catheter navigation and location and mapping in the heart
US6120445A (en) * 1998-10-02 2000-09-19 Scimed Life Systems, Inc. Method and apparatus for adaptive cross-sectional area computation of IVUS objects using their statistical signatures
US7837624B1 (en) 1998-11-20 2010-11-23 Siemens Medical Solutions Usa, Inc. Medical diagnostic ultrasound imaging methods for extended field of view
US6352509B1 (en) * 1998-11-16 2002-03-05 Kabushiki Kaisha Toshiba Three-dimensional ultrasonic diagnosis apparatus
US6352508B1 (en) * 1998-11-20 2002-03-05 Acuson Corporation Transducer motion compensation in medical diagnostic ultrasound extended field of view imaging
US9572519B2 (en) 1999-05-18 2017-02-21 Mediguide Ltd. Method and apparatus for invasive device tracking using organ timing signal generated from MPS sensors
US7386339B2 (en) * 1999-05-18 2008-06-10 Mediguide Ltd. Medical imaging and navigation system
US7343195B2 (en) * 1999-05-18 2008-03-11 Mediguide Ltd. Method and apparatus for real time quantitative three-dimensional image reconstruction of a moving organ and intra-body navigation
US6381350B1 (en) * 1999-07-02 2002-04-30 The Cleveland Clinic Foundation Intravascular ultrasonic analysis using active contour method and system
US6586411B1 (en) 2000-08-16 2003-07-01 Mayo Foundation For Medical Education And Research System for monitoring the location of transgenes
US6200268B1 (en) * 1999-09-10 2001-03-13 The Cleveland Clinic Foundation Vascular plaque characterization
US6896881B1 (en) * 1999-09-24 2005-05-24 Mayo Foundation For Medical Education And Research Therapeutic methods and compositions using viruses of the recombinant paramyxoviridae family
JP2001184492A (en) * 1999-12-27 2001-07-06 Fuji Photo Film Co Ltd Method and device for displaying image
US6402693B1 (en) * 2000-01-13 2002-06-11 Siemens Medical Solutions Usa, Inc. Ultrasonic transducer aligning system to replicate a previously obtained image
US6368277B1 (en) * 2000-04-05 2002-04-09 Siemens Medical Solutions Usa, Inc. Dynamic measurement of parameters within a sequence of images
JP3854062B2 (en) * 2000-04-28 2006-12-06 株式会社モリタ製作所 Tomographic image display method, display device, and recording medium storing program for realizing the display method
US6517488B1 (en) * 2000-06-29 2003-02-11 Acuson Corporation Medical diagnostic ultrasound system and method for identifying constrictions
CA2314794A1 (en) * 2000-08-01 2002-02-01 Dimitre Hristov Apparatus for lesion or organ localization
US6716166B2 (en) * 2000-08-18 2004-04-06 Biosense, Inc. Three-dimensional reconstruction using ultrasound
US7118740B1 (en) * 2000-09-22 2006-10-10 Mayo Foundation For Medical Education And Research Method for limiting the growth of cancer cells using an attenuated measles virus
EP1341443B1 (en) 2000-10-18 2010-12-29 Paieon Inc. System for positioning a device in a tubular organ
US6508768B1 (en) * 2000-11-22 2003-01-21 University Of Kansas Medical Center Ultrasonic elasticity imaging
US6537221B2 (en) * 2000-12-07 2003-03-25 Koninklijke Philips Electronics, N.V. Strain rate analysis in ultrasonic diagnostic images
US6773402B2 (en) 2001-07-10 2004-08-10 Biosense, Inc. Location sensing with real-time ultrasound imaging
NL1018864C2 (en) 2001-08-31 2003-03-03 Technologiestichting Stw Device and method for generating three-dimensional images with tissue hardness information.
WO2006103644A1 (en) 2005-03-31 2006-10-05 Paieon Inc. Method and apparatus for positioning a device in a tubular organ
US6782284B1 (en) * 2001-11-21 2004-08-24 Koninklijke Philips Electronics, N.V. Method and apparatus for semi-automatic aneurysm measurement and stent planning using volume image data
US6589176B2 (en) * 2001-12-05 2003-07-08 Koninklijke Philips Electronics N.V. Ultrasonic image stabilization system and method
NL1019612C2 (en) * 2001-12-19 2003-06-20 Gemeente Amsterdam Steam superheater.
US7024025B2 (en) * 2002-02-05 2006-04-04 Scimed Life Systems, Inc. Nonuniform Rotational Distortion (NURD) reduction
US6961468B2 (en) * 2002-03-29 2005-11-01 Sun Microsystems, Inc. Method and apparatus for adaptive local image quantification verification
US7092571B2 (en) 2002-03-29 2006-08-15 Sun Microsystems, Inc. Method and apparatus for regional image quantification verification
US7092572B2 (en) * 2002-03-29 2006-08-15 Sun Microsystems, Inc. Method and apparatus for global image quantification verification
JP4088104B2 (en) 2002-06-12 2008-05-21 株式会社東芝 Ultrasonic diagnostic equipment
US7359554B2 (en) * 2002-08-26 2008-04-15 Cleveland Clinic Foundation System and method for identifying a vascular border
US7927275B2 (en) * 2002-08-26 2011-04-19 The Cleveland Clinic Foundation System and method of aquiring blood-vessel data
US7074188B2 (en) * 2002-08-26 2006-07-11 The Cleveland Clinic Foundation System and method of characterizing vascular tissue
TR201902962T4 (en) * 2002-08-26 2019-03-21 Cleveland Clinic Found System and method for characterizing vascular tissue.
US7356172B2 (en) * 2002-09-26 2008-04-08 Siemens Medical Solutions Usa, Inc. Methods and systems for motion tracking
CN1720464A (en) 2002-12-02 2006-01-11 皇家飞利浦电子股份有限公司 Segmentation tool for identifying flow regions in an imaging system
WO2004051579A2 (en) * 2002-12-04 2004-06-17 Philips Intellectual Property & Standards Gmbh Apparatus and method for assisting the navigation of a catheter in a vessel
US7927278B2 (en) * 2002-12-13 2011-04-19 California Institute Of Technology Split-screen display system and standardized methods for ultrasound image acquisition and multi-frame data processing
AU2003279484A1 (en) * 2002-12-13 2004-07-09 Koninklijke Philips Electronics N.V. System and method for processing a series of image frames representing a cardiac cycle
ATE441360T1 (en) * 2003-02-25 2009-09-15 Koninkl Philips Electronics Nv INTRAVASCULAR IMAGING
US7314448B2 (en) * 2003-03-28 2008-01-01 Scimed Life Systems, Inc. Imaging transducer assembly
JP4468677B2 (en) * 2003-05-19 2010-05-26 オリンパス株式会社 Ultrasonic image generation method and ultrasonic image generation program
US6942618B2 (en) * 2003-06-19 2005-09-13 Siemens Medical Solutions U.S.A., Inc. Change detection for optimized medical imaging
JP4799833B2 (en) * 2003-06-19 2011-10-26 サラヤ株式会社 Method and apparatus for measuring blood vessel diameter using echo
CA2533538A1 (en) * 2003-07-21 2005-01-27 Paieon Inc. Method and system for identifying an optimal image within a series of images that depict a moving organ
WO2005024729A1 (en) * 2003-09-04 2005-03-17 Philips Intellectual Property & Standards Gmbh Device and method for displaying ultrasound images of a vessel
DE10343808B4 (en) 2003-09-22 2017-06-01 Siemens Healthcare Gmbh Medical examination and / or treatment system
JP5129480B2 (en) 2003-09-25 2013-01-30 パイエオン インコーポレイテッド System for performing three-dimensional reconstruction of tubular organ and method for operating blood vessel imaging device
CA2449080A1 (en) 2003-11-13 2005-05-13 Centre Hospitalier De L'universite De Montreal - Chum Apparatus and method for intravascular ultrasound image segmentation: a fast-marching method
DE10354496B4 (en) 2003-11-21 2011-03-31 Siemens Ag Medical examination and / or treatment system
EP1697903B1 (en) * 2003-12-19 2010-08-04 Philips Intellectual Property & Standards GmbH Method for the computer-assisted visualization of diagnostic image data
US7542544B2 (en) * 2004-01-06 2009-06-02 The Regents Of The University Of Michigan Ultrasound gating of cardiac CT scans
US7874990B2 (en) * 2004-01-14 2011-01-25 The Cleveland Clinic Foundation System and method for determining a transfer function
US20080051660A1 (en) * 2004-01-16 2008-02-28 The University Of Houston System Methods and apparatuses for medical imaging
WO2005070299A1 (en) * 2004-01-16 2005-08-04 The University Of Houston System Methods and apparatus for medical imaging
DE102004008366B3 (en) 2004-02-20 2005-09-15 Siemens Ag Apparatus for performing laser angioplasty with OCT monitoring
DE102004008373B3 (en) 2004-02-20 2005-09-29 Siemens Ag Apparatus for performing and monitoring endovascular brachytherapy
DE102004008368B4 (en) * 2004-02-20 2006-05-24 Siemens Ag Catheter for performing and monitoring rotablation
DE102004008371B4 (en) * 2004-02-20 2006-05-24 Siemens Ag atherectomy
DE102004008370B4 (en) * 2004-02-20 2006-06-01 Siemens Ag Catheter for performing and monitoring rotablation
US7215802B2 (en) * 2004-03-04 2007-05-08 The Cleveland Clinic Foundation System and method for vascular border detection
DE102004011156A1 (en) * 2004-03-08 2005-10-06 Siemens Ag Method for endoluminal imaging with movement correction
DE102004015639B4 (en) 2004-03-31 2007-05-03 Siemens Ag Apparatus for performing cutting-balloon intervention with IVUS monitoring
DE102004015642B3 (en) 2004-03-31 2006-02-02 Siemens Ag Device for elimination of complete occlusion with OCT monitoring
DE102004015640B4 (en) 2004-03-31 2007-05-03 Siemens Ag Apparatus for performing a cutting-balloon intervention with OCT monitoring
US7654958B2 (en) * 2004-04-20 2010-02-02 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for ultrasound imaging with autofrequency selection
US7397935B2 (en) 2004-05-10 2008-07-08 Mediguide Ltd. Method for segmentation of IVUS image sequences
US7578790B2 (en) * 2004-07-20 2009-08-25 Boston Scientific Scimed, Inc. Systems and methods for detecting and presenting textural information from medical images
US20060173318A1 (en) * 2004-07-20 2006-08-03 Scimed Life Systems Inc. Systems and methods for detecting and presenting textural information from medical images
US20060036147A1 (en) * 2004-07-20 2006-02-16 Scimed Life Systems, Inc. Systems and methods for detecting and presenting textural information from medical images
DE102005045071A1 (en) * 2005-09-21 2007-04-12 Siemens Ag Catheter device with a position sensor system for the treatment of a partial and / or complete vascular occlusion under image monitoring
DE102004062395B4 (en) 2004-12-23 2008-10-02 Siemens Ag Intravenous pacemaker electrode
WO2006076409A2 (en) 2005-01-11 2006-07-20 Volcano Corporation Vascular image co-registration
US8295577B2 (en) 2005-03-31 2012-10-23 Michael Zarkh Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ
US8858441B2 (en) * 2005-05-12 2014-10-14 The Trustees Of Columbia University In The City Of New York System and method for electromechanical wave imaging of body structures
DE102005022120B4 (en) * 2005-05-12 2009-04-09 Siemens Ag Catheter, catheter device and diagnostic imaging device
US10687785B2 (en) 2005-05-12 2020-06-23 The Trustees Of Columbia Univeristy In The City Of New York System and method for electromechanical activation of arrhythmias
DE102005027951A1 (en) 2005-06-16 2007-01-04 Siemens Ag Medical system for introducing a catheter into a vessel
DE102005029476A1 (en) * 2005-06-24 2007-02-08 Siemens Ag Device for carrying out intravascular examinations
JP2008543511A (en) 2005-06-24 2008-12-04 ヴォルケイノウ・コーポレーション Vascular image preparation method
DE102005032755B4 (en) 2005-07-13 2014-09-04 Siemens Aktiengesellschaft System for performing and monitoring minimally invasive procedures
DE102005034167B4 (en) * 2005-07-21 2012-01-26 Siemens Ag Device and method for determining a position of an implant in a body
DE102005045373A1 (en) 2005-09-22 2007-04-05 Siemens Ag catheter device
DE102005045362B4 (en) * 2005-09-22 2012-03-22 Siemens Ag Device for determining the position of a medical instrument, associated imaging examination device and associated method
DE102005045600B4 (en) 2005-09-23 2008-01-17 Siemens Ag An injector and method for assisting imaging on a patient
US7988633B2 (en) * 2005-10-12 2011-08-02 Volcano Corporation Apparatus and method for use of RFID catheter intelligence
DE102005050344A1 (en) 2005-10-20 2007-05-03 Siemens Ag Cryocatheter for medical investigation and treatment equipment for e.g. diagnosis and treatment of heart infarcts, has image capture device that maps region of vessel around balloon arranged near catheter tip
US8303505B2 (en) 2005-12-02 2012-11-06 Abbott Cardiovascular Systems Inc. Methods and apparatuses for image guided medical procedures
US20090221916A1 (en) * 2005-12-09 2009-09-03 The Trustees Of Columbia University In The City Of New York Systems and Methods for Elastography Imaging
DE102005059261B4 (en) * 2005-12-12 2013-09-05 Siemens Aktiengesellschaft Catheter device for the treatment of a partial and / or complete vascular occlusion and X-ray device
DE102006015013B4 (en) 2006-03-31 2010-06-02 Siemens Ag Implantable pacemaker
WO2007124953A1 (en) * 2006-05-02 2007-11-08 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for space-resolved, nondestructive analysis of work pieces
US8660325B2 (en) * 2006-07-17 2014-02-25 Koninklijke Philips N.V. Efficient user interaction with polygonal meshes for medical image segmentation
US9867530B2 (en) 2006-08-14 2018-01-16 Volcano Corporation Telescopic side port catheter device with imaging system and method for accessing side branch occlusions
DE102006061178A1 (en) 2006-12-22 2008-06-26 Siemens Ag Medical system for carrying out and monitoring a minimal invasive intrusion, especially for treating electro-physiological diseases, has X-ray equipment and a control/evaluation unit
US10716528B2 (en) 2007-03-08 2020-07-21 Sync-Rx, Ltd. Automatic display of previously-acquired endoluminal images
US9305334B2 (en) 2007-03-08 2016-04-05 Sync-Rx, Ltd. Luminal background cleaning
US11197651B2 (en) 2007-03-08 2021-12-14 Sync-Rx, Ltd. Identification and presentation of device-to-vessel relative motion
US8781193B2 (en) 2007-03-08 2014-07-15 Sync-Rx, Ltd. Automatic quantitative vessel analysis
US9375164B2 (en) 2007-03-08 2016-06-28 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US11064964B2 (en) 2007-03-08 2021-07-20 Sync-Rx, Ltd Determining a characteristic of a lumen by measuring velocity of a contrast agent
US8700130B2 (en) 2007-03-08 2014-04-15 Sync-Rx, Ltd. Stepwise advancement of a medical tool
US9968256B2 (en) 2007-03-08 2018-05-15 Sync-Rx Ltd. Automatic identification of a tool
US9629571B2 (en) 2007-03-08 2017-04-25 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
EP2129284A4 (en) 2007-03-08 2012-11-28 Sync Rx Ltd Imaging and tools for use with moving organs
WO2008151676A1 (en) 2007-06-15 2008-12-18 Ge Inspection Technologies Gmbh Sensor array for navigation on surfaces
US9596993B2 (en) 2007-07-12 2017-03-21 Volcano Corporation Automatic calibration systems and methods of use
WO2009009802A1 (en) 2007-07-12 2009-01-15 Volcano Corporation Oct-ivus catheter for concurrent luminal imaging
WO2009009799A1 (en) 2007-07-12 2009-01-15 Volcano Corporation Catheter for in vivo imaging
CA2696190C (en) * 2007-08-17 2014-02-18 Bell Helicopter Textron Inc. System for optical recognition, interpretation, and digitization of human readable instruments, annunciators, and controls
JP5134932B2 (en) * 2007-12-03 2013-01-30 株式会社東芝 Ultrasonic diagnostic apparatus and control program for ultrasonic diagnostic apparatus
WO2011035312A1 (en) 2009-09-21 2011-03-24 The Trustees Of Culumbia University In The City Of New York Systems and methods for opening of a tissue barrier
US20090259099A1 (en) * 2008-04-10 2009-10-15 Georgia Tech Research Corporation Image-based control systems
US9451929B2 (en) 2008-04-17 2016-09-27 Boston Scientific Scimed, Inc. Degassing intravascular ultrasound imaging systems with sealed catheters filled with an acoustically-favorable medium and methods of making and using
RU2520369C2 (en) * 2008-06-25 2014-06-27 Конинклейке Филипс Электроникс Н.В. Device and method for localising object of interest in subject
WO2010014977A1 (en) 2008-08-01 2010-02-04 The Trustees Of Columbia University In The City Of New York Systems and methods for matching and imaging tissue characteristics
WO2010030819A1 (en) 2008-09-10 2010-03-18 The Trustees Of Columbia University In The City Of New York Systems and methods for opening a tissue
US9101286B2 (en) 2008-11-18 2015-08-11 Sync-Rx, Ltd. Apparatus and methods for determining a dimension of a portion of a stack of endoluminal data points
US9144394B2 (en) 2008-11-18 2015-09-29 Sync-Rx, Ltd. Apparatus and methods for determining a plurality of local calibration factors for an image
US9095313B2 (en) 2008-11-18 2015-08-04 Sync-Rx, Ltd. Accounting for non-uniform longitudinal motion during movement of an endoluminal imaging probe
US8855744B2 (en) 2008-11-18 2014-10-07 Sync-Rx, Ltd. Displaying a device within an endoluminal image stack
US11064903B2 (en) 2008-11-18 2021-07-20 Sync-Rx, Ltd Apparatus and methods for mapping a sequence of images to a roadmap image
US9974509B2 (en) 2008-11-18 2018-05-22 Sync-Rx Ltd. Image super enhancement
US10362962B2 (en) 2008-11-18 2019-07-30 Synx-Rx, Ltd. Accounting for skipped imaging locations during movement of an endoluminal imaging probe
BRPI0917609A2 (en) * 2008-12-10 2019-10-15 Koninklijke Philips Electrnics N. V. '' system for performing vessel analysis, medical imaging workstation, method for performing vessel analysis, and, computer program product ''
US8545412B2 (en) * 2009-05-29 2013-10-01 Boston Scientific Scimed, Inc. Systems and methods for making and using image-guided intravascular and endocardial therapy systems
WO2011015952A1 (en) * 2009-08-07 2011-02-10 Medinol Ltd. Method and system for stabilizing a series of intravascular ultrasound images and extracting vessel lumen from the images
KR101812492B1 (en) * 2009-10-12 2017-12-27 실리콘 밸리 메디컬 인스트루먼츠, 인코포레이티드 Intravascular Ultrasound System for Co-Registered Imaging
US20110118590A1 (en) * 2009-11-18 2011-05-19 Siemens Medical Solutions Usa, Inc. System For Continuous Cardiac Imaging And Mapping
FR2955007B1 (en) * 2010-01-04 2012-02-17 Sagem Defense Securite ESTIMATION OF GLOBAL MOVEMENT AND DENSE
WO2011122200A1 (en) 2010-03-29 2011-10-06 ソニー株式会社 Data processing device, data processing method, image processing device, image processing method, and program
DE102010019421A1 (en) 2010-05-05 2011-11-10 Siemens Aktiengesellschaft Imaging method for displaying results of intravascular imaging and CFD results and medical system for performing the method
AU2011272764B2 (en) * 2010-06-30 2015-11-19 Muffin Incorporated Percutaneous, ultrasound-guided introduction of medical devices
EP2611920B1 (en) 2010-09-02 2015-05-13 Mayo Foundation For Medical Education And Research Vesicular stomatitis viruses
US9951117B2 (en) 2010-09-02 2018-04-24 Mayo Foundation For Medical Education And Research Vesicular stomatitis viruses
JP5944913B2 (en) * 2010-10-28 2016-07-05 ボストン サイエンティフィック サイムド,インコーポレイテッドBoston Scientific Scimed,Inc. Computer readable medium for reducing non-uniform rotational distortion in ultrasound images and system including the same
GB2485390A (en) * 2010-11-12 2012-05-16 Sony Corp Video Surveillance System that Detects Changes by Comparing a Current Image with a Reference Image
WO2012066446A1 (en) * 2010-11-18 2012-05-24 Koninklijke Philips Electronics N.V. Sensing apparatus for sensing an object
US11141063B2 (en) 2010-12-23 2021-10-12 Philips Image Guided Therapy Corporation Integrated system architectures and methods of use
US20120330145A1 (en) * 2010-12-31 2012-12-27 Volcano Corporation Pulmonary Embolism Therapeutic Methods and Associated Devices and Systems
US11040140B2 (en) 2010-12-31 2021-06-22 Philips Image Guided Therapy Corporation Deep vein thrombosis therapeutic methods
US9320491B2 (en) 2011-04-18 2016-04-26 The Trustees Of Columbia University In The City Of New York Ultrasound devices methods and systems
WO2012162664A1 (en) 2011-05-26 2012-11-29 The Trustees Of Columbia University In The City Of New York Systems and methods for opening of a tissue barrier in primates
WO2013033489A1 (en) 2011-08-31 2013-03-07 Volcano Corporation Optical rotary joint and methods of use
JP6106190B2 (en) * 2011-12-21 2017-03-29 ボルケーノ コーポレイション Visualization method of blood and blood likelihood in blood vessel image
CA2875346A1 (en) 2012-06-26 2014-01-03 Sync-Rx, Ltd. Flow-related image processing in luminal organs
US9292918B2 (en) 2012-10-05 2016-03-22 Volcano Corporation Methods and systems for transforming luminal images
US9286673B2 (en) 2012-10-05 2016-03-15 Volcano Corporation Systems for correcting distortions in a medical image and methods of use thereof
US10568586B2 (en) 2012-10-05 2020-02-25 Volcano Corporation Systems for indicating parameters in an imaging data set and methods of use
JP2015532536A (en) 2012-10-05 2015-11-09 デイビッド ウェルフォード, System and method for amplifying light
US10070827B2 (en) 2012-10-05 2018-09-11 Volcano Corporation Automatic image playback
US9324141B2 (en) 2012-10-05 2016-04-26 Volcano Corporation Removal of A-scan streaking artifact
US9367965B2 (en) 2012-10-05 2016-06-14 Volcano Corporation Systems and methods for generating images of tissue
US9307926B2 (en) 2012-10-05 2016-04-12 Volcano Corporation Automatic stent detection
US11272845B2 (en) 2012-10-05 2022-03-15 Philips Image Guided Therapy Corporation System and method for instant and automatic border detection
US9858668B2 (en) 2012-10-05 2018-01-02 Volcano Corporation Guidewire artifact removal in images
WO2014059170A1 (en) 2012-10-10 2014-04-17 The Trustees Of Columbia University In The City Of New York Systems and methods for mechanical mapping of cardiac rhythm
US9840734B2 (en) 2012-10-22 2017-12-12 Raindance Technologies, Inc. Methods for analyzing DNA
EP2931132B1 (en) 2012-12-13 2023-07-05 Philips Image Guided Therapy Corporation System for targeted cannulation
JP2016506276A (en) 2012-12-20 2016-03-03 ジェレミー スティガール, Locate the intravascular image
US11406498B2 (en) 2012-12-20 2022-08-09 Philips Image Guided Therapy Corporation Implant delivery system and implants
US10942022B2 (en) 2012-12-20 2021-03-09 Philips Image Guided Therapy Corporation Manual calibration of imaging system
WO2014099899A1 (en) 2012-12-20 2014-06-26 Jeremy Stigall Smooth transition catheters
US9709379B2 (en) 2012-12-20 2017-07-18 Volcano Corporation Optical coherence tomography system that is reconfigurable between different imaging modes
US10939826B2 (en) 2012-12-20 2021-03-09 Philips Image Guided Therapy Corporation Aspirating and removing biological material
US9486143B2 (en) 2012-12-21 2016-11-08 Volcano Corporation Intravascular forward imaging device
US10058284B2 (en) 2012-12-21 2018-08-28 Volcano Corporation Simultaneous imaging, monitoring, and therapy
US9383263B2 (en) 2012-12-21 2016-07-05 Volcano Corporation Systems and methods for narrowing a wavelength emission of light
US9612105B2 (en) 2012-12-21 2017-04-04 Volcano Corporation Polarization sensitive optical coherence tomography system
CA2895993A1 (en) 2012-12-21 2014-06-26 Jason Spencer System and method for graphical processing of medical data
WO2014099672A1 (en) 2012-12-21 2014-06-26 Andrew Hancock System and method for multipath processing of image signals
US10993694B2 (en) 2012-12-21 2021-05-04 Philips Image Guided Therapy Corporation Rotational ultrasound imaging catheter with extended catheter body telescope
US10413317B2 (en) 2012-12-21 2019-09-17 Volcano Corporation System and method for catheter steering and operation
US10166003B2 (en) 2012-12-21 2019-01-01 Volcano Corporation Ultrasound imaging with variable line density
US10191220B2 (en) 2012-12-21 2019-01-29 Volcano Corporation Power-efficient optical circuit
AU2014223201B2 (en) * 2013-03-01 2017-03-02 Boston Scientific Scimed, Inc. Systems and methods for lumen border detection in intravascular ultrasound sequences
JP6243453B2 (en) 2013-03-07 2017-12-06 ボルケーノ コーポレイション Multimodal segmentation in intravascular images
US10226597B2 (en) 2013-03-07 2019-03-12 Volcano Corporation Guidewire with centering mechanism
US20140276923A1 (en) 2013-03-12 2014-09-18 Volcano Corporation Vibrating catheter and methods of use
CN105228518B (en) 2013-03-12 2018-10-09 火山公司 System and method for diagnosing coronal microvascular diseases
US10758207B2 (en) 2013-03-13 2020-09-01 Philips Image Guided Therapy Corporation Systems and methods for producing an image from a rotational intravascular ultrasound device
US11026591B2 (en) 2013-03-13 2021-06-08 Philips Image Guided Therapy Corporation Intravascular pressure sensor calibration
US9301687B2 (en) 2013-03-13 2016-04-05 Volcano Corporation System and method for OCT depth calibration
US20160030151A1 (en) 2013-03-14 2016-02-04 Volcano Corporation Filters with echogenic characteristics
US10219887B2 (en) 2013-03-14 2019-03-05 Volcano Corporation Filters with echogenic characteristics
US10292677B2 (en) 2013-03-14 2019-05-21 Volcano Corporation Endoluminal filter having enhanced echogenic properties
US9247921B2 (en) 2013-06-07 2016-02-02 The Trustees Of Columbia University In The City Of New York Systems and methods of high frame rate streaming for treatment monitoring
US10322178B2 (en) 2013-08-09 2019-06-18 The Trustees Of Columbia University In The City Of New York Systems and methods for targeted drug delivery
KR102185723B1 (en) * 2013-08-26 2020-12-02 삼성메디슨 주식회사 Ultrasonic apparatus for measuring stiffness of carotid artery and measuring method for the same
US10028723B2 (en) 2013-09-03 2018-07-24 The Trustees Of Columbia University In The City Of New York Systems and methods for real-time, transcranial monitoring of blood-brain barrier opening
JP6353038B2 (en) * 2013-10-07 2018-07-04 アシスト・メディカル・システムズ,インコーポレイテッド Signal processing for intravascular imaging
BR112015029058A2 (en) * 2014-01-30 2017-07-25 Koninklijke Philips Nv ultrasound system to detect a gas pocket in a region of interest, and method of detecting a gas pocket in a region of interest
CN104537645A (en) * 2014-12-15 2015-04-22 北京工业大学 ROI mark point matching method based on intravascular unltrasound image
US11369337B2 (en) 2015-12-11 2022-06-28 Acist Medical Systems, Inc. Detection of disturbed blood flow
KR102182489B1 (en) * 2016-07-26 2020-11-24 지멘스 메디컬 솔루션즈 유에스에이, 인크. Method of generating ultrasound image, ultrasound system and storage medium
EP3569154A1 (en) 2018-05-15 2019-11-20 Koninklijke Philips N.V. Ultrasound processing unit and method, and imaging system
US11911213B2 (en) * 2019-06-03 2024-02-27 General Electric Company Techniques for determining ultrasound probe motion
US11024034B2 (en) 2019-07-02 2021-06-01 Acist Medical Systems, Inc. Image segmentation confidence determination
JP7157098B2 (en) * 2019-10-30 2022-10-19 i-PRO株式会社 Angioscope system and method for measuring blood vessel diameter
CN111856474B (en) * 2020-07-30 2023-07-25 重庆大学 Subarray-based space-time domain conditional coherence coefficient ultrasonic imaging method

Family Cites Families (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4063549A (en) * 1975-12-22 1977-12-20 Technicon Instruments Corporation Ultrasonic method and apparatus for imaging and characterization of bodies
GB1590950A (en) * 1976-12-11 1981-06-10 Emi Ltd System for comparing representations of a scene
US4581581A (en) * 1983-06-30 1986-04-08 General Electric Company Method of projection reconstruction imaging with reduced sensitivity to motion-related artifacts
US4803990A (en) * 1985-12-03 1989-02-14 U.S. Philips Corporation Examining moving objects by ultrasound echograpy
US4794931A (en) * 1986-02-28 1989-01-03 Cardiovascular Imaging Systems, Inc. Catheter apparatus, system and method for intravascular two-dimensional ultrasonography
US5000185A (en) * 1986-02-28 1991-03-19 Cardiovascular Imaging Systems, Inc. Method for intravascular two-dimensional ultrasonography and recanalization
US5582178A (en) * 1986-02-28 1996-12-10 Cardiovascular Imaging Systems, Inc. Method and apparatus for intravascular ultrasonography
US4841977A (en) * 1987-05-26 1989-06-27 Inter Therapy, Inc. Ultra-thin acoustic transducer and balloon catheter using same in imaging array subassembly
US5040225A (en) * 1987-12-07 1991-08-13 Gdp, Inc. Image analysis method
US4951677A (en) * 1988-03-21 1990-08-28 Prutech Research And Development Partnership Ii Acoustic imaging catheter and the like
GB2233094B (en) * 1989-05-26 1994-02-09 Circulation Res Ltd Methods and apparatus for the examination and treatment of internal organs
JPH04146737A (en) * 1990-10-11 1992-05-20 Toshiba Corp Ultrasonic diagnostic device
US5445155A (en) * 1991-03-13 1995-08-29 Scimed Life Systems Incorporated Intravascular imaging apparatus and methods for use and manufacture
US5353798A (en) * 1991-03-13 1994-10-11 Scimed Life Systems, Incorporated Intravascular imaging apparatus and methods for use and manufacture
US5608849A (en) * 1991-08-27 1997-03-04 King, Jr.; Donald Method of visual guidance for positioning images or data in three-dimensional space
JP3043873B2 (en) * 1991-11-29 2000-05-22 フクダ電子株式会社 Ultrasonic aperture synthesis system
US5217456A (en) * 1992-02-24 1993-06-08 Pdt Cardiovascular, Inc. Device and method for intra-vascular optical radial imaging
JP3213766B2 (en) * 1992-03-16 2001-10-02 株式会社日立製作所 Replicate file update system
US5453575A (en) * 1993-02-01 1995-09-26 Endosonics Corporation Apparatus and method for detecting blood flow in intravascular ultrasonic imaging
US5331964A (en) * 1993-05-14 1994-07-26 Duke University Ultrasonic phased array imaging system with high speed adaptive processing using selected elements
US5419328A (en) * 1993-08-09 1995-05-30 Hewlett-Packard Company Mean squared speed and myocardial performance
JP3045642B2 (en) * 1994-01-25 2000-05-29 アロカ株式会社 Ultrasound diagnostic equipment
US5363849A (en) * 1994-01-26 1994-11-15 Cardiovascular Imaging Systems, Inc. Enhancing intravascular ultrasonic blood vessel image
US5363850A (en) * 1994-01-26 1994-11-15 Cardiovascular Imaging Systems, Inc. Method for recognition and reduction of blood speckle in blood vessel imaging system
NO943269D0 (en) * 1994-09-02 1994-09-02 Vingmed Sound As Method for analyzing and measuring ultrasound signals
US5683451A (en) * 1994-06-08 1997-11-04 Cardiovascular Concepts, Inc. Apparatus and methods for deployment release of intraluminal prostheses
GB2296565B (en) * 1994-12-23 1999-06-16 Intravascular Res Ltd Ultrasound imaging
NO943696D0 (en) * 1994-10-04 1994-10-04 Vingmed Sound As Method of ultrasound imaging
US5538004A (en) * 1995-02-28 1996-07-23 Hewlett-Packard Company Method and apparatus for tissue-centered scan conversion in an ultrasound imaging system
US5575286A (en) * 1995-03-31 1996-11-19 Siemens Medical Systems, Inc. Method and apparatus for generating large compound ultrasound image
US5899861A (en) * 1995-03-31 1999-05-04 Siemens Medical Systems, Inc. 3-dimensional volume by aggregating ultrasound fields of view
US5655535A (en) * 1996-03-29 1997-08-12 Siemens Medical Systems, Inc. 3-Dimensional compound ultrasound field of view
JPH08280684A (en) * 1995-04-18 1996-10-29 Fujitsu Ltd Ultrasonic diagnostic apparatus
US5752522A (en) * 1995-05-04 1998-05-19 Cardiovascular Concepts, Inc. Lesion diameter measurement catheter and method
US5596990A (en) * 1995-06-06 1997-01-28 Yock; Paul Rotational correlation of intravascular ultrasound image with guide catheter position
US5566674A (en) * 1995-06-30 1996-10-22 Siemens Medical Systems, Inc. Method and apparatus for reducing ultrasound image shadowing and speckle
US5623929A (en) * 1995-06-30 1997-04-29 Siemens Medical Systems, Inc. Ultrasonic doppler flow imaging method for eliminating motion artifacts
US5503153A (en) * 1995-06-30 1996-04-02 Siemens Medical Systems, Inc. Noise suppression method utilizing motion compensation for ultrasound images
US5601085A (en) * 1995-10-02 1997-02-11 Nycomed Imaging As Ultrasound imaging
US5771895A (en) * 1996-02-12 1998-06-30 Slager; Cornelis J. Catheter for obtaining three-dimensional reconstruction of a vascular lumen and wall
US5830145A (en) * 1996-09-20 1998-11-03 Cardiovascular Imaging Systems, Inc. Enhanced accuracy of three-dimensional intraluminal ultrasound (ILUS) image reconstruction
US5724978A (en) * 1996-09-20 1998-03-10 Cardiovascular Imaging Systems, Inc. Enhanced accuracy of three-dimensional intraluminal ultrasound (ILUS) image reconstruction
US5876345A (en) * 1997-02-27 1999-03-02 Acuson Corporation Ultrasonic catheter, system and method for two dimensional imaging or three-dimensional reconstruction
US5971895A (en) * 1997-09-26 1999-10-26 Precor Incorporated Combined press and row exercise arm
US5885218A (en) * 1997-11-07 1999-03-23 Scimed Life Systems, Inc. Method and apparatus for spatial filtering in an intravascular ultrasound imaging system
US5921934A (en) * 1997-11-25 1999-07-13 Scimed Life Systems, Inc. Methods and apparatus for non-uniform rotation distortion detection in an intravascular ultrasound imaging system

Also Published As

Publication number Publication date
PL326831A1 (en) 1998-12-21
KR19990007305A (en) 1999-01-25
EP0885594B1 (en) 2003-04-09
EE9800196A (en) 1999-02-15
US6152878A (en) 2000-11-28
GB2326479B (en) 2002-04-10
EP0885594A2 (en) 1998-12-23
EP0885594A3 (en) 1999-10-27
AR015123A1 (en) 2001-04-18
SK87098A3 (en) 2000-02-14
DE69827857T2 (en) 2005-12-15
IL125000A (en) 2004-05-12
DE69813087D1 (en) 2003-05-15
DE19827460A1 (en) 1998-12-24
ATE236574T1 (en) 2003-04-15
HK1050242A1 (en) 2003-06-13
AU8032298A (en) 1999-01-04
GB2326479A (en) 1998-12-23
IL125000A0 (en) 1999-01-26
UA57011C2 (en) 2003-06-16
EP1227342A1 (en) 2002-07-31
US6095976A (en) 2000-08-01
EP1227342B1 (en) 2004-11-24
ATE283497T1 (en) 2004-12-15
AU757314B2 (en) 2003-02-13
NO982817D0 (en) 1998-06-18
BR9814778A (en) 2001-06-26
PT885594E (en) 2003-08-29
JPH11151246A (en) 1999-06-08
DE69813087T2 (en) 2004-02-05
GB9813217D0 (en) 1998-08-19
ES2192290T3 (en) 2003-10-01
AU7307498A (en) 1999-01-14
EE04166B1 (en) 2003-10-15
DK0885594T3 (en) 2003-07-14
RU2238041C2 (en) 2004-10-20
NO982817L (en) 1998-12-21
WO1998057580A1 (en) 1998-12-23
SG68672A1 (en) 1999-11-16
CZ190198A3 (en) 1999-10-13
DE69827857D1 (en) 2004-12-30

Similar Documents

Publication Publication Date Title
US6095976A (en) Method for enhancing an image derived from reflected ultrasound signals produced by an ultrasound transmitter and detector inserted in a bodily lumen
US5329929A (en) Ultrasonic diagnostic apparatus
US8233718B2 (en) System and method for identifying a vascular border
US8480582B2 (en) Image processing apparatus and ultrasonic diagnosis apparatus
NZ330692A (en) Intravascular ultrasound imaging, image enhancement to compensate for movement of ultrasound probe and lumen
US8094893B2 (en) Segmentation tool for identifying flow regions in an image system
JP4676334B2 (en) Biological signal monitoring device
Adam et al. Semiautomated Border Tracking of Cine Echocardiographic Ventricular Images
EP1189074A2 (en) Method and apparatus for locking sample volume onto moving vessel in pulsed doppler ultrasound imaging
EP2405818A1 (en) Automatic analysis of cardiac m-mode views
WO2004054447A1 (en) Ultrasonic apparatus for estimating artery parameters
JPH07178086A (en) Method for ultrasonic diagnosis and system therefor
JP2003517912A (en) Ultrasound image processing method and inspection system for displaying ultrasonic composite image sequence of artery
Hernàndez-Sabaté et al. Approaching artery rigid dynamics in IVUS
WO2000036433A1 (en) Ultrasound process and apparatus for the determination of the location of a parietal surface in a tissue and of the absolute radius of an artery
JP2004195253A (en) Ultrasonic video system
MXPA98005013A Improved processing of images and signals of ultrasound intravascular
Hernàndez-Sabaté et al. The Benefits of IVUS Dynamics for Retrieving Stable Models of Arteries
Norenberg Analysis of time variations of cardiac ultrasound image sequences
JPH04117951A (en) Ultrasonic diagnostic device
Klingensmith Assessment of coronary plaque volume using three-dimensional intravascular ultrasound

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued

Effective date: 20080114