WO2001016886A2 - Non-rigid motion image analysis - Google Patents

Non-rigid motion image analysis

Info

Publication number
WO2001016886A2
WO2001016886A2 PCT/GB2000/002767
Authority
WO
WIPO (PCT)
Prior art keywords
boundary
space
sequence
shape
spline curve
Prior art date
Application number
PCT/GB2000/002767
Other languages
French (fr)
Other versions
WO2001016886A3 (en)
Inventor
Julia Alison Noble
Gary Jacob
Original Assignee
Isis Innovation Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Isis Innovation Limited filed Critical Isis Innovation Limited
Priority to US10/069,291 priority Critical patent/US7043063B1/en
Priority to JP2001520357A priority patent/JP2003508139A/en
Priority to EP00946145A priority patent/EP1212729A2/en
Publication of WO2001016886A2 publication Critical patent/WO2001016886A2/en
Publication of WO2001016886A3 publication Critical patent/WO2001016886A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • G06T7/0014Biomedical image inspection using an image reference approach
    • G06T7/0016Biomedical image inspection using an image reference approach involving temporal comparison
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/251Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/50Clinical applications
    • A61B6/508Clinical applications for non-human patients
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30048Heart; Cardiac

Definitions

  • the present invention relates to a method of analysing images of a deformable object undergoing non-rigid motion.
  • a method of analysing the image so that desired image features can be detected and tracked through the sequence, and so that the motion of the features can be automatically quantified and analysed.
  • MUGA multi-gated acquisition scanning
  • CT fast computed tomography
  • PET positron emission tomography
  • MRI magnetic resonance imaging
  • echocardiography i.e. ultrasound imaging
  • Figures 1(a) to (d) show a typical set of echocardiographic images.
  • Figure 1(a) shows an image digitized from a video recording;
  • Figure 1(b) shows an image obtained from one of the latest ultrasound machines;
  • Figure 1(c) shows a stress echocardiography image of a patient at rest and
  • Figure 1(d) shows the same patient as in Figure 1(c) at a peak dose of dobutamine (a drug which mimics the effects of exercise). It will be appreciated that identifying the desired regions of the ventricle is difficult for a human, and that automatic analysis is even more difficult.
  • the position of the endocardial wall in each image is regarded as being composed of a time varying departure from a defined position e.g. the initial position, the departure being characterised as a time-varying weighted sum of certain basic types of motion of the contour.
  • a very simple shape-space would characterise the motion of an object as consisting of a certain proportion of rotation and a certain proportion of translation compared to a defined position. Then the only thing which varies with time is the relative amount of the rotation and translation. In analysing echocardiograms a more complicated shape space has been found to be necessary.
  • PCA principal component analysis
  • the clinician is required first to examine the frames of the image sequence and to manually locate and trace the endocardial wall in a few of the frames. For instance, in a sequence of 60-80 frames the clinician could manually "draw around" the endocardial boundary every fifth frame. A B-spline curve is then fitted to the manually traced contours to provide an approximation of them and a principal component analysis is performed to find the main components of the motion of the contour through the image sequence. Then the whole sequence is reprocessed so that, starting from a predefined initial position, the position of the endocardial wall in each frame is predicted based on the position in the preceding two frames and the PCA results.
  • the prediction is corrected in each frame by searching for image features (such as intensity edges) representing the actual position of the endocardial wall.
  • image features such as intensity edges
  • the B-spline curve for each frame can be displayed overlying the image on that frame so that when the sequence is displayed the contour appears to track the endocardial wall through the sequence.
  • the shape of a B-spline curve is determined by the position of its control points.
  • the movement of a spline curve fitted to the endocardial wall through the image sequence can be expressed entirely as a change from frame to frame of the position of the control points of the spline curve.
  • the x and y coordinates of the control points are conventionally written in a matrix known as a spline-vector Q and as discussed above, the position of the control points in any frame of the sequence can be expressed as an offset from a defined position Q 0 .
  • the offset, which is time-varying, can conveniently be separated into a time-varying part known as the shape-space vector X and a part representing the types of allowed motion (the main components of the motion), known as the shape matrix W (normally assumed to be constant).
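The relationship described above, Q = Q_0 + W X, can be sketched numerically. The following minimal illustration is a sketch only; the translation-only shape matrix here is an assumption for demonstration (in the patent, W comes from a principal component analysis of manually traced boundaries):

```python
import numpy as np

def to_shape_space(Q, Q0, W):
    """Recover the shape-space vector X from a spline-vector Q.

    Q, Q0: stacked (x, y) control-point coordinates, shape (2N,).
    W:     shape matrix whose columns are the allowed motions, shape (2N, k).
    Uses the Moore-Penrose pseudo-inverse so that Q ~= Q0 + W X.
    """
    return np.linalg.pinv(W) @ (Q - Q0)

def from_shape_space(X, Q0, W):
    """Reconstruct the spline-vector from a shape-space vector."""
    return Q0 + W @ X

# Toy example: 4 control points (2N = 8) and a 2-dimensional shape-space
# whose columns are x-translation and y-translation.
N = 4
Q0 = np.arange(2 * N, dtype=float)
W = np.zeros((2 * N, 2))
W[:N, 0] = 1.0   # move every x coordinate together
W[N:, 1] = 1.0   # move every y coordinate together

X_true = np.array([3.0, -1.5])
Q = from_shape_space(X_true, Q0, W)
assert np.allclose(to_shape_space(Q, Q0, W), X_true)
```

The pseudo-inverse used here is the same operation invoked later in the document to recover X from a tracked spline-vector.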
  • the first step is that the clinician manually draws around the boundary in several frames, for instance every fifth frame. Then a B-spline curve is fitted to the drawn boundary using a user-defined number of control points.
  • Figure 3(a) shows a schematic representation of a quadratic approximating B-spline with 24 control points used to model the endocardial boundary.
  • Figure 3(b) illustrates the curve superimposed on a frame of an ultrasound image.
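The dependence of the curve's shape entirely on its control points can be shown with a dependency-free sketch of a closed uniform quadratic B-spline, like the 24-point curve of Figure 3(a). The circular control polygon below is purely illustrative:

```python
import numpy as np

def closed_quadratic_bspline(ctrl, samples_per_span=16):
    """Evaluate a closed uniform quadratic B-spline from its control points.

    ctrl: (n, 2) control points. The curve's shape is determined entirely
    by these points; wrapping the index makes the curve closed.
    """
    P = np.asarray(ctrl, dtype=float)
    n = len(P)
    u = np.linspace(0.0, 1.0, samples_per_span, endpoint=False)
    # Uniform quadratic B-spline basis functions on each span.
    b0 = (1 - u) ** 2 / 2
    b1 = (-2 * u ** 2 + 2 * u + 1) / 2
    b2 = u ** 2 / 2
    pts = []
    for i in range(n):  # wrap around -> closed curve
        seg = (b0[:, None] * P[i]
               + b1[:, None] * P[(i + 1) % n]
               + b2[:, None] * P[(i + 2) % n])
        pts.append(seg)
    return np.vstack(pts)

# Control points on the unit circle give a closed, near-circular curve
# lying just inside its control polygon.
theta = np.linspace(0, 2 * np.pi, 24, endpoint=False)
ctrl = np.stack([np.cos(theta), np.sin(theta)], axis=1)
curve = closed_quadratic_bspline(ctrl)
radii = np.linalg.norm(curve, axis=1)
assert radii.max() <= 1.0 and radii.min() > 0.98
```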
  • a principal component analysis is then performed on the positions of the control points in each of the frames segmented by the clinician to calculate Q 0 and W.
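The PCA step can be sketched as follows: stack the spline-vectors of the manually segmented frames, take their mean as Q_0, and take the dominant principal directions of the centred data as the columns of W. This is a generic PCA sketch, not necessarily the patent's exact implementation:

```python
import numpy as np

def pca_shape_space(Q_frames, n_modes):
    """Estimate (Q0, W) from manually segmented frames.

    Q_frames: array of shape (n_frames, 2N), one spline-vector per frame.
    Returns the mean spline-vector Q0 and a (2N, n_modes) shape matrix W
    whose columns are the dominant principal components of the motion.
    """
    Q0 = Q_frames.mean(axis=0)
    # SVD of the centred data gives the principal directions directly.
    _, s, Vt = np.linalg.svd(Q_frames - Q0, full_matrices=False)
    return Q0, Vt[:n_modes].T

# Synthetic check: frames that only translate in x should yield a single
# dominant mode aligned with the x-translation direction.
rng = np.random.default_rng(0)
N = 6
base = rng.normal(size=2 * N)
tx = np.concatenate([np.ones(N), np.zeros(N)])  # x-translation direction
frames = np.stack([base + a * tx for a in np.linspace(-2, 2, 9)])
Q0, W = pca_shape_space(frames, n_modes=1)
assert np.allclose(np.abs(W[:, 0] @ tx) / np.linalg.norm(tx), 1.0, atol=1e-6)
```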
  • Figure 4 illustrates the process schematically.
  • the process involves predicting the position of the boundary each frame based on the boundary in the preceding frames.
  • the value of X the shape-vector
  • a search is performed around the predicted position to find image features representative of the actual position of the endocardial boundary.
  • the searches are performed along a plurality of normals spaced along the predicted curve and the image features are identified through known image processing operations, such as looking at the intensity variation along the search line.
  • the predicted position can be updated and the actual position of the contour (expressed through the position of the control points of the B-spline) is established.
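A minimal sketch of this predict-and-correct step follows. Both the constant-velocity prediction and the gradient-based edge test are assumptions for illustration: the patent predicts from the two preceding frames and searches for intensity edges along normals without committing to these exact formulae.

```python
import numpy as np

def predict_shape(X_prev, X_prev2):
    """Predict the shape-vector from the two preceding frames using a
    constant-velocity assumption (an illustrative choice only)."""
    return 2.0 * X_prev - X_prev2

def search_along_normal(image, point, normal, half_len=10):
    """Sample image intensity along a normal through `point` and return
    the pixel offset of the strongest intensity step, taken here as the
    corrected boundary position on that search line."""
    offsets = np.arange(-half_len, half_len + 1)
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    samples = []
    for o in offsets:
        x = int(round(np.clip(point[0] + o * n[0], 0, image.shape[1] - 1)))
        y = int(round(np.clip(point[1] + o * n[1], 0, image.shape[0] - 1)))
        samples.append(image[y, x])
    grad = np.abs(np.diff(np.asarray(samples, dtype=float)))
    return int(offsets[np.argmax(grad)])

# Toy frame: dark "blood pool" left of column 32, brighter "muscle" right,
# so the edge is found just right of the predicted point at x = 30.
img = np.zeros((64, 64))
img[:, 32:] = 100.0
off = search_along_normal(img, point=(30.0, 30.0), normal=(1.0, 0.0))
assert off == 1
```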
  • FIG. 5 illustrates an example of a principal component analysis performed on four cardiac cycles of an ultrasound image sequence using a B-spline with 14 control points. The six most dominant modes are shown, each as an initial template (thick solid line) with the change in shape represented by that component of the motion indicated by the thin solid lines. The diagram in the top left of Figure 5 is the dominant mode and that at the bottom right is the least dominant.
  • the first deformation mode in Figure 7(a) appears to be a scaling of the inferior part of the left ventricular boundary.
  • the corresponding plot of the weight of that component illustrates that this motion is basically periodic. But because this component of the motion affects more than just a single part of the boundary (all parts of the boundary move) it does not give a good idea of how any particular region of the wall is moving. Also, some of the information about the motion of that part of the boundary is "encoded" in the other components.
  • the endocardial boundary (the inner boundary) is a boundary between muscle and blood which have quite different acoustic impedances. In general this means that the endocardial boundary shows up well on an echocardiogram.
  • the epicardial boundary on the other hand is a tissue-tissue interface and so it is very difficult to trace on the image. Thus even having tracked the endocardial boundary, it is difficult to detect and quantitatively analyse the movement of the epicardial boundary.
  • the present invention provides techniques which are useful in solving these two problems. Although illustrated in use in analysing echocardiograms of the left ventricle, the techniques are not limited to this. They are applicable to analysis of non-rigid motion in two or three dimensions of other deformable objects. Thus they have other medical and veterinary applications, as well as being applicable to imaging deformable objects in general, and are also applicable to image modalities other than ultrasound.
  • the first aspect of the present invention provides a method of analysing a sequence of images of an internal body organ in non-rigid motion, comprising the steps of: detecting the boundary of the organ in each image of the sequence; and automatically calculating the amount of movement through the sequence of each of a plurality of clinically significant segments of the detected boundary.
  • the amount of movement of each of the clinically significant segments can be displayed graphically, for instance as a graph. Further, an average of the amount of movement of that segment can be calculated as a single number representative of the amount of movement of that segment. It is also possible to calculate the variation in the amount of movement in a segment, the greater the variation, the more likely it is that only a part of that segment is normal. It is also possible to calculate and output the maximal excursion of the detected boundary during the motion, for each segment.
  • the boundary is detected and tracked by the technique of principal component analysis and fitting of a spline curve as described above.
  • the amount of movement of the segments can conveniently be found by calculating and outputting for each segment a measure of the amount of movement of the control points controlling the curve within that segment. This measure may be a simple average, or can be weighted in favour of the control points in the middle of each segment.
  • the variation in the amount of movement within the segment is conveniently found by comparing the amount of movement of the different spline curve control points for that segment.
  • the invention also contemplates the interpretation of a moving spline curve tracked in one shape-space by using a different shape-space.
  • a method of analysing a sequence of images of a deformable object in non-rigid motion comprising the steps of: detecting a boundary of the object in each of a plurality of frames of the sequence; fitting a spline curve to the boundary; constructing a shape-space representation of the movement of the spline curve using a first shape-space so that the spline curve tracks the boundary; and decomposing the tracking spline curve using a second, different shape-space.
  • the different shape-space can be chosen to select a particular attribute of the motion. In other words, a motion tracked using one shape-space need not be interpreted in the same shape-space: a different one, an interpretational shape-space, can be used.
  • Another aspect of the invention involves the modelling of the configuration of the wall of an object having two boundaries as seen in a sequence of images by developing a model of the distance between the two boundaries.
  • a model is constructed for the distance between them.
  • this aspect of the invention provides a method of analysing a sequence of images of a deformable object in non-rigid motion comprising detecting first and second boundaries of the object in a plurality of frames of the sequence and constructing a shape space representation of the variation through the sequence of the distance between the two boundaries.
  • the model can be a shape-space of the change in distance between the boundaries, and can be based on a principal component analysis of the way that distance changes.
  • the model can be improved by searching the images to find image features representative of the outer boundary.
  • Another aspect of the invention provides a method of analysing a sequence of images of a deformable object in non-rigid motion to detect inner and outer boundaries of a wall of the object, comprising the steps of: detecting the inner boundary; and searching outside the inner boundary for image features representing the outer boundary.
  • a spline curve is fitted to the detected image features representing the outer boundary, e.g. by: manually locating the inner and outer boundaries in only some images of the sequence, calculating a shape-space for the change through the sequence of the distance between the two boundaries, detecting the inner boundary and performing said search outside the inner boundary for image features representing the outer boundary in other images of the sequence; and fitting a spline curve to the detected image features in said other images of the sequence by using said shape-space.
  • a shape-space representing the distance between the two boundaries is obtained.
  • the use of this shape-space helps to ensure that the calculated outer boundary always lies outside the inner boundary.
  • the distance between the two boundaries in the manually traced frames can be subjected to a principal component analysis which is used as a basis for the shape-space. This then provides a convenient model of the deformation of the object wall.
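A sketch of this distance model, under the assumption that wall thickness is measured along M fixed normals in each manually traced frame (the normal count and the PCA-by-SVD formulation are illustrative choices):

```python
import numpy as np

def thickness_shape_space(D_frames, n_modes):
    """PCA model of the inner-to-outer wall distance.

    D_frames: (n_frames, M) thicknesses measured along M normals in each
    manually traced frame. Returns (d0, Wd) so that a thickness profile
    is modelled as d ~= d0 + Wd @ x.
    """
    D = np.asarray(D_frames, dtype=float)
    d0 = D.mean(axis=0)
    _, _, Vt = np.linalg.svd(D - d0, full_matrices=False)
    return d0, Vt[:n_modes].T

def outer_from_inner(inner_pts, outward_normals, d):
    """Place the outer boundary a modelled distance d outside the inner
    boundary along the outward normals; positive d keeps it outside."""
    n = outward_normals / np.linalg.norm(outward_normals, axis=1, keepdims=True)
    return inner_pts + d[:, None] * n

# Toy check: a circular inner boundary of radius 10 plus a constant
# thickness of 2 gives a concentric outer boundary of radius 12.
theta = np.linspace(0, 2 * np.pi, 32, endpoint=False)
inner = 10.0 * np.stack([np.cos(theta), np.sin(theta)], axis=1)
outer = outer_from_inner(inner, inner.copy(), np.full(32, 2.0))
assert np.allclose(np.linalg.norm(outer, axis=1), 12.0)
```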
  • the search for the image features representing the outer boundary can be, for instance, by analysing changes in the image intensity, such as a maximum in the intensity, along search lines projected outwards from the inner boundary.
  • the plot of the intensity can be extremely noisy and it can be rather difficult to detect a clear maximum.
  • a wavelet decomposition of the profile of image intensity can be performed to smooth the profile.
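A dependency-free sketch of this smoothing idea follows. A Haar decomposition is used here purely as a stand-in to keep the example self-contained; the patent's choice is a Coifman-series wavelet with a filter length of six.

```python
import numpy as np

def haar_smooth(profile, levels=2):
    """Smooth a 1-D intensity profile by a multi-level wavelet
    decomposition, keeping only the approximation (low-frequency)
    coefficients. Haar is an illustrative stand-in for the Coifman
    wavelet used in the patent."""
    x = np.asarray(profile, dtype=float)
    n = len(x)
    for _ in range(levels):
        if len(x) % 2:                    # pad to even length
            x = np.append(x, x[-1])
        x = 0.5 * (x[0::2] + x[1::2])     # approximation coefficients only
    # Upsample back to the original length so peaks can be located.
    return np.repeat(x, 2 ** levels)[:n]

# Noisy ridge-like profile: the smoothed profile should peak near the
# true ridge position despite the added noise.
rng = np.random.default_rng(1)
t = np.arange(128)
ridge = np.exp(-((t - 80) ** 2) / 50.0)
noisy = ridge + 0.2 * rng.normal(size=t.size)
smooth = haar_smooth(noisy, levels=3)
assert abs(int(np.argmax(smooth)) - 80) <= 8
```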
  • detected image features can be down weighted if they imply a curvature of the outer boundary which is too high.
  • features can be weighted down if they imply a difference between the inner and outer boundaries which lies outside the shape-space.
  • the ventricular wall can be segmented according to the Figure 8 model and the degree of thickening for each separate segment can be calculated and graphically displayed, as can the variation within each segment. It will be appreciated that the methods of the invention are conveniently embodied in a computer program, and thus the invention provides a computer program and computer system for performing the methods above.
  • Figures 1(a) to (d) show a typical set of echocardiographic images
  • Figures 2(a) to (d) illustrate echocardiographic images and respective AQ images
  • Figure 3(a) shows a schematic representation of a quadratic approximating B-spline
  • Figure 3(b) illustrates the B-spline of Figure 3(a) superimposed on an ultrasound image
  • Figure 4 schematically illustrates a tracking process
  • Figure 5 illustrates components of a PCA decomposition
  • Figure 6 illustrates the components of Figure 5 in a different way
  • Figure 7(a) illustrates PCA-based components for an image sequence
  • Figure 7(b) shows the variation with time of the components of Figure 7(a);
  • Figure 8 illustrates the sixteen-segment anatomical model of the heart
  • Figure 9 illustrates schematically the positioning of control points for a B-spline and the position of the segments of the boundary
  • Figure 10(a) illustrates the time variation of PCA components in an image sequence
  • Figure 10(b) illustrates the variation with time of the components in an interpretational shape-space
  • Figure 11(a) shows plots corresponding to those in Figure 10, but including the 95% confidence interval for the components;
  • Figure 11(b) shows how the plots of Figure 11(a) are displayed to the clinician;
  • Figure 12 illustrates calculation of the excursion of the contours tracking the endocardial boundary
  • Figure 13(a) shows the maximum wall excursion for each segment in pixels and Figure 13(b) shows these values normalised
  • Figure 14 shows the displayed results of tracking the endocardial wall and epicardial wall in an image sequence
  • Figure 15 illustrates a Coifman wavelet packet
  • Figure 16 illustrates the results of detecting the epicardial boundary by wavelet decomposition
  • Figure 17 illustrates the improved results using information assimilated from previous search lines
  • Figure 18(a) illustrates the results of tracking the endocardial and epicardial walls and Figure 18(b) illustrates the myocardium shaded;
  • Figure 19 illustrates the wall thickening for each segment through the image sequence
  • Figure 20 illustrates the variation in thickening within each segment
  • Figure 21 illustrates the thickening information as it might be displayed to a clinician
  • Figure 22(a) shows the normalised maximal endocardial wall excursion for the data of Figure 13 and the percentage myocardial thickening scores for the same data
  • Figure 22(b) illustrates the endocardial wall excursion and myocardial thickening scores in the form in which they are presented to the clinician.
  • the tracking is based on a principal component analysis (PCA).
  • the time-varying part is the shape-vector X, which can be recovered using a pseudo-inverse of the shape-space matrix W_PCA: X = W_PCA⁺ (Q − Q_0), where W_PCA⁺ represents the pseudo-inverse.
  • Figure 7 demonstrates that it is not easy to place any clinical significance on the time varying components of X.
  • the interpretational shape-space can be selected so that the components of X_CLIN represent the movement of only certain control points. For instance, if there are four control points for each segment as illustrated in Figure 9, then the interpretational shape-space can be defined so that the components of X resulting from the matrix multiplication are an average of the amount of movement of the control points in each segment. To achieve this, each row of W_CLIN⁺ has four non-zero weights which pick out the four x (and y) coordinates of the four control points in a segment, the rest of the row being zeros. Thus positions 1 to 4 of the first row of W_CLIN⁺ can each be 1/4, with the rest zero, to form, after multiplication, the average position of the first segment's control points.
  • W ⁇ UN can be as follows:
  • the above interpretational shape-space is based on the use of four control points in each segment weighted equally. However, a different number of control points for each segment can be used. Further, because the control points at the end of each segment actually affect the spline curve in the next segment, it is also possible to adjust the weights in the interpretational shape-space to favour the control points in the middle of each segment.
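The equally weighted interpretational matrix described above can be sketched directly. Only the x-coordinate block is shown (the y block is analogous), and the segment and point counts match the six-segment, four-points-per-segment example in the text:

```python
import numpy as np

def interpretational_matrix(n_segments=6, pts_per_segment=4):
    """Build the averaging block of the interpretational shape-space.

    Each row holds equal weights of 1/4 over the four control points of
    one segment, so multiplying a vector of control-point coordinates by
    it yields, per segment, the average position of that segment's
    control points.
    """
    n = n_segments * pts_per_segment
    Wc = np.zeros((n_segments, n))
    for s in range(n_segments):
        Wc[s, s * pts_per_segment:(s + 1) * pts_per_segment] = 1.0 / pts_per_segment
    return Wc

Wc = interpretational_matrix()
x_coords = np.arange(24, dtype=float)   # toy x coordinates of 24 points
seg_avg = Wc @ x_coords
assert np.isclose(seg_avg[0], (0 + 1 + 2 + 3) / 4)
```

Replacing the equal weights with larger values for the middle control points gives the middle-weighted variant mentioned above.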
  • Q and Q 0 are the average of Q and Q 0 respectively.
  • this interpretational shape-space means that the shape-space vector components are meaningful in clinical terms - they represent the average function of each segment of the left ventricular boundary.
  • Figure 10 compares plots of the time variation of the six components of a PCA shape-space, with the variation with time of the six components from the interpretational shape-space (i.e. the average position of the four control points for each segment).
  • Figure 10(a) illustrates the variation based on the PCA tracking shape-space
  • Figure 10(b) shows the variation based on the interpretational shape-space.
  • the plots of Figure 10(b) can be clearly related to the underlying local motion of the left ventricular boundary.
  • although Figure 10(a) demonstrates periodicity, principally in the first three of the components, it is very difficult to relate these to the underlying motion of the left ventricular boundary because each component represents motion of the whole boundary.
  • Figure 10(b) shows that all six of the components of the new shape-space are periodic. It can be seen that the basal anterior, mid-anterior, mid-inferior and basal inferior segments all move normally. The observed smaller movement in the basal inferior region is in accordance with normal heart function. However the apical inferior and apical anterior segments, although moving periodically, have a reduced endocardial wall excursion. This is in accordance with the diagnosis that this subject has a myocardial infarct in the apical region. Consequently it will be understood that illustrating the component plots from the interpretational shape-space to the clinician gives the clinician a valuable and immediately recognisable tool for assessing heart function.
  • an abnormal region of heart wall may not completely fill any single segment, but could be just one part of a segment, and possibly be a part of another.
  • a measure of this can easily be derived by using the technique of this embodiment of the present invention by determining the variation in the amount of movement within each segment. In this embodiment this can be done by calculating the standard deviation in the movement of the four control points in each segment. The standard deviation is, of course, a measure of the variation in degree of movement between the different control points. If all control points moved by the same amount then the standard deviation will be low. If some move considerably more than others, the standard deviation will be high.
  • Figure 11(a) illustrates plots corresponding to those in Figure 10 but including the 95% confidence interval for the interpretational shape-space vector components. From Figure 11(a) it can be seen that the standard deviation is approximately the same in the basal anterior, mid-inferior and basal inferior segments. However, the apical anterior and apical inferior segments show a very noticeable increase in variation, particularly during the systolic stage of the heart cycle. This implies that not all of the apical inferior and apical anterior segments are abnormal. Figure 11(b) shows how these plots are displayed to the clinician.
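The within-segment variation can be computed as sketched below, assuming the control-point displacements for one frame are ordered segment by segment:

```python
import numpy as np

def segment_variation(displacements, pts_per_segment=4):
    """Standard deviation of control-point movement within each segment.

    displacements: movement of each control point in a frame, ordered
    segment by segment. A low value means the whole segment moves
    uniformly; a high value suggests only part of the segment is moving
    normally.
    """
    d = np.asarray(displacements, dtype=float)
    return d.reshape(-1, pts_per_segment).std(axis=1)

# Uniform motion in segment 0, mixed motion in segment 1:
var = segment_variation([5, 5, 5, 5,  5, 5, 0, 0])
assert var[0] == 0.0 and var[1] > 0.0
```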
  • a scoring system is provided by calculating the maximum displacement of each boundary segment. This is then normalised with respect to the maximum movement over all six segments. The resulting number is representative of the relative maximum displacement of each segment of the endocardial wall. The wall excursion is calculated as the maximum (signed) excursion of the contour minus the minimum (signed) excursion of the contour, i.e. the peak-to-peak value of the component plot. This is illustrated in Figure 12. Figure 13 illustrates these values for the data from Figure 10.
  • Figure 13(a) shows the maximum wall excursion in pixels, and
  • Figure 13(b) shows the values normalised by dividing by the largest of the six values. The relative magnitude of these values is consistent with the diagnosis of a myocardial infarct in the apical inferior and apical anterior regions.
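The excursion scoring described above reduces to a peak-to-peak measurement per segment, normalised by the largest value over all segments; a sketch:

```python
import numpy as np

def excursion_scores(component_traces):
    """Peak-to-peak wall excursion per segment, plus the values
    normalised by the largest excursion over all segments.

    component_traces: (n_segments, n_frames) signed excursion of each
    segment's interpretational component through the sequence.
    """
    traces = np.asarray(component_traces, dtype=float)
    excursion = traces.max(axis=1) - traces.min(axis=1)
    return excursion, excursion / excursion.max()

# A strongly moving segment and a hypokinetic one:
t = np.linspace(0, 2 * np.pi, 100)
traces = np.stack([10 * np.sin(t), 2 * np.sin(t)])
exc, rel = excursion_scores(traces)
assert rel[0] == 1.0 and rel[1] < 0.25
```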
  • the system can further provide a way of tracking the outer boundary of the left ventricle, the epicardial wall, and of quantifying myocardial wall thickening by measuring the distance between the tracked endocardial and epicardial walls.
  • The fitting algorithm for step 3 (taken from Blake, A. and Isard, M. (1998) "Active Contours", Springer) is as follows:
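The algorithm itself is not reproduced here; as a hedged sketch of the general idea from Blake & Isard, a regularised least-squares fit of the shape-vector to feature points measured along the normals can be written as follows. This is the standard shape-space fitting formulation, not necessarily the book's exact recursive algorithm:

```python
import numpy as np

def fit_shape_space(B_list, r_list, Q0, W, alpha=1e-3):
    """Regularised least-squares fit of the shape-vector X to feature
    points measured along the normals.

    B_list: per-measurement (2, 2N) B-spline basis matrices mapping a
            spline-vector to a point on the curve.
    r_list: measured (x, y) feature positions.
    """
    A = np.vstack([B @ W for B in B_list])
    b = np.concatenate([r - B @ Q0 for B, r in zip(B_list, r_list)])
    # Tikhonov regularisation keeps the fitted shape close to the template.
    AtA = A.T @ A + alpha * np.eye(W.shape[1])
    return np.linalg.solve(AtA, A.T @ b)

# Toy check: a translation-only shape-space recovers the true offset.
N = 4
Q0 = np.zeros(2 * N)
W = np.zeros((2 * N, 2))
W[:N, 0] = 1.0   # x-translation mode
W[N:, 1] = 1.0   # y-translation mode
B_list = []
for i in range(N):            # basis matrix reading out control point i
    B = np.zeros((2, 2 * N))
    B[0, i], B[1, N + i] = 1.0, 1.0
    B_list.append(B)
r_list = [np.array([3.0, -1.5])] * N   # every point measured at (3, -1.5)
X = fit_shape_space(B_list, r_list, Q0, W)
assert np.allclose(X, [3.0, -1.5], atol=1e-2)
```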
  • Figure 14 illustrates the results of applying these techniques to an image sequence to detect the epicardium for frames 25, 96, 104, 109 and 114 of a sequence.
  • the wavelet chosen for this embodiment was the Coifman wavelet series. These orthogonal, compactly supported wavelets feature the highest number of vanishing moments for a given filter length.
  • several packets within the Coifman set have a spatial distribution that is particularly suitable for evaluating ridge-like structures (such as the epicardium) at low resolutions.
  • Each of the profiles (such as those shown in Figure 15) is initially dyadically decomposed, and then reconstructed using the best-basis algorithm.
  • Each packet within this optimised set is then compared with the total best-basis reconstruction to determine which decompositions contribute most strongly to the original profile signal.
  • the Coifman wavelet with a filter length of six was chosen as it gave accurate results without imposing too heavy a computational burden.
  • Figure 21 illustrates the data as it might be presented to a clinician.
  • the endocardial and epicardial walls are overlaid on the image and track movement of the ventricle.
  • Plots of endocardial wall excursion and myocardial thickening, including the variation within each segment are illustrated alongside.
  • the clinician can easily recognise from the plots abnormal areas of the left ventricular wall.
  • a single numerical score representative of wall thickening can also be calculated as follows :-

Abstract

A method of automatically detecting and tracking the endocardial and epicardial boundaries of the left ventricle in an echocardiographic image sequence. The endocardial boundary is manually located in some frames of the image sequence, a B-spline curve is fitted to the manually located boundary and a shape-space for the deformation of the boundary through the sequence is calculated by a principal component analysis (PCA) of the motion. The location of the endocardial boundary for all frames in the sequence is then predicted using the shape-space and this prediction is adjusted by searching for image features, such as sharp changes in intensity, in the vicinity of the prediction. The amount of movement of the endocardial boundary in each clinically significant segment of the ventricular wall is obtained by measuring the degree of movement of the control points for the spline in that segment, and also by monitoring the variation in the amount of movement between the control points for each segment.

Description

Non-Rigid Motion Image Analysis
The present invention relates to a method of analysing images of a deformable object undergoing non-rigid motion. In particular it relates to a method of analysing the image so that desired image features can be detected and tracked through the sequence, and so that the motion of the features can be automatically quantified and analysed.
Basic techniques for analysing images of objects in motion are relatively straightforward for rigid motion, i.e. where the object does not itself deform.
However, the analysis of non-rigid motion, where the object deforms in time, is more difficult. Such non-rigid motion occurs in many situations, but a typical one is in the medical or veterinary imaging field where organs of the human or animal body are imaged in real-time. As well as the problem created by the non-rigid motion of the organ being analysed, these imaging applications in particular have the additional problem that the images are very noisy. It is often difficult even for trained operators to find desired image features in the image, and thus reliable automatic detection of the features presents considerable difficulty.
In the field of cardiac imaging, various techniques are used such as multi-gated acquisition scanning (MUGA), fast computed tomography (CT), positron emission tomography (PET), magnetic resonance imaging (MRI) and echocardiography (i.e. ultrasound imaging). Of these, echocardiography is the most widely used because the imaging equipment is relatively cheap to obtain and maintain and the equipment is relatively portable. In assessing cardiac function the performance of the left or right ventricle of the heart is particularly significant and there has been an increasing interest in obtaining ventricular measurements, such as the chamber dimensions, area, volume and ejection fraction. To provide a more accurate picture of the ventricular function, and in particular to enable assessment of abnormalities which occur in only parts of the ventricular wall, two particular aspects of the motion of the ventricle have proved significant. These are endocardial wall motion, also referred to as wall excursion, and myocardial wall thickening. It has been determined that when the heart muscle becomes ischemic (i.e. deficient in blood), its motion is altered almost immediately. Because abnormalities can be confined to particular parts of the ventricular wall, a systematic method for the assessment of wall motion involves the segmentation of the surface into a number of different segments. The wall can be segmented in various ways, but a useful method is the sixteen-segment anatomical model of the heart proposed by the American Society of Echocardiography and illustrated in Figure 8 of the accompanying drawings. This is useful in assessing the images derived from two-dimensional echocardiography. Figure 8(d) shows an example of a view which is used for analysis in the techniques described below.
In assessing cardiac function, for instance of the left ventricle, clinicians examine the motion and thickening of each segment and try to assess visually the motion of each segment through the heart cycle. One scoring scheme requires the clinician to score each segment as follows:
Score  Grading      Characterized by
1      Normal       A uniform increase in wall excursion and thickening
2      Hypokinetic  A reduced (<5 mm) inward systolic wall motion
3      Akinetic     An absence of inward motion and thickening
4      Dyskinetic   Systolic thinning and outward systolic wall motion
However, this scoring scheme is highly subjective. Thus clinical reporting of echocardiography examination is highly operator-dependent and basically qualitative. Also each segment must be classified as normal or abnormal, so it is difficult for a clinician to indicate within the scoring system subtleties such as only part of a segment being abnormal.
While there is therefore a clear need for an automatic method of detecting and quantifying wall motion and wall thickening, the images are extremely difficult to assess automatically. In the accompanying drawings Figures 1(a) to (d) show a typical set of echocardiographic images. Figure 1(a) shows an image digitized from a video recording; Figure 1(b) shows an image obtained from one of the latest ultrasound machines; Figure 1(c) shows a stress echocardiography image of a patient at rest and Figure 1(d) shows the same patient as in Figure 1(c) at a peak dose of dobutamine (a drug which mimics the effects of exercise). It will be appreciated that identifying the desired regions of the ventricle is difficult for a human, and that automatic analysis is even more difficult.
Automatic boundary detection of regions in an ultrasound image is available on certain machines manufactured by Hewlett-Packard by a technique known as acoustic quantification (AQ). This technique discriminates boundaries prior to image formation by using discontinuities in the signal returning from the tissue. Pixels with an intensity gradient above a user-defined threshold are marked as boundary points. Pixels labelled as boundary points are then joined together to form connected boundaries. However, Figures 2(a)-(d) show that this technique is not always useful. Figures 2(a) and (c) show the basic echocardiographic image, and Figures 2(b) and 2(d) show the corresponding respective AQ images at different thresholds. It can be seen that the illustrated boundaries do not help assessment of the image at all because they do not accurately follow the real boundaries.
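The pixel-marking step of this thresholding approach can be sketched as follows. This is a toy image-domain illustration only (the real AQ technique discriminates boundaries in the acoustic signal before image formation), and the synthetic image, function name and threshold value are illustrative assumptions.

```python
import numpy as np

def mark_boundary_pixels(image, threshold):
    # Gradient-magnitude thresholding: pixels whose intensity gradient
    # exceeds the user-defined threshold are marked as boundary points.
    gy, gx = np.gradient(image.astype(float))
    return np.hypot(gx, gy) > threshold

# Synthetic frame: dark blood pool (0) inside bright myocardium (100).
img = np.full((8, 8), 100.0)
img[2:6, 2:6] = 0.0
mask = mark_boundary_pixels(img, threshold=20.0)
```

Joining the marked pixels into connected boundaries would then be a separate step, and, as the figures show, the connected result need not follow the true anatomical boundary.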
A better method for detecting the inner boundary of the left ventricle (the endocardium) is proposed in the paper "Evaluating A Robust Contour Tracker On Echocardiographic Sequences" by Jacob, Noble, Mulet-Parada and Blake, published in Medical Image Analysis (1997/8 volume 3, number 1, pp 63-75) which is hereby incorporated by reference. As proposed there the inner boundary, the endocardium, is modelled by a non-rigid contour (a B-spline) and the variation in the shape of this contour through the echocardiographic sequence (i.e. as the heart contracts and expands) is represented by using a shape-space. This means that the position of the endocardial wall in each image is regarded as being composed of a time varying departure from a defined position, e.g. the initial position, the departure being characterised as a time-varying weighted sum of certain basic types of motion of the contour. For instance, a very simple shape-space would characterise the motion of an object as consisting of a certain proportion of rotation and a certain proportion of translation compared to a defined position. Then the only thing which varies with time is the relative amount of the rotation and translation. In analysing echocardiograms a more complicated shape-space has been found to be necessary. The paper referred to above uses a principal component analysis (PCA) of the motion of the endocardial wall to find a set of defined motions which can efficiently be used as components to approximate the actual motion of the endocardial wall. Again, the only thing which varies through the sequence is the relative weight of the different defined motions in each image.
In this technique for detecting and tracking the endocardial wall, the clinician is required first to examine the frames of the image sequence and manually to locate and trace the endocardial wall in a few of the frames. For instance, in a sequence of 60-80 frames the clinician could manually "draw around" the endocardial boundary every fifth frame. A B-spline curve is then fitted to the manually traced contours to provide an approximation of them and a principal component analysis is performed to find the defined components of the motion of the contour through the image sequence. Then the whole sequence is reprocessed so that starting from a predefined initial position the position of the endocardial wall in each frame is predicted based on the position in the preceding two frames and the PCA results. The prediction is corrected in each frame by searching for image features (such as intensity edges) representing the actual position of the endocardial wall. When this process is complete, the B-spline curve for each frame can be displayed overlying the image on that frame so that when the sequence is displayed the contour appears to track the endocardial wall through the sequence.
Illustrating this in more detail, it will be recalled that the shape of a B-spline curve is determined by the position of its control points. Thus the movement of a spline curve fitted to the endocardial wall through the image sequence can be expressed entirely as a change from frame to frame of the position of the control points of the spline curve. The x and y coordinates of the control points are conventionally written in a matrix known as a spline-vector Q and as discussed above, the position of the control points in any frame of the sequence can be expressed as an offset from a defined position Q0. The offset, which is time-varying, can conveniently be separated into a time-varying part known as the shape-space vector X and a part representing the type of allowed motions (the main components of the motion), known as the shape matrix W (normally assumed to be constant). Thus, in matrix notation: -
Q = Q0+ WX
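As a minimal numerical sketch of this relation, the matrix form Q = Q0 + WX can be written out as below. The sizes (four control points) and the two motion modes are chosen purely for illustration; they are not the shape matrix used in the tracker.

```python
import numpy as np

# Four control points, coordinates stacked as (x1..x4, y1..y4);
# the columns of the shape matrix W are the allowed motions.
Q0 = np.zeros(8)                 # defined (template) position
W = np.zeros((8, 2))
W[:4, 0] = 1.0                   # mode 0: horizontal translation (all x coords)
W[4:, 1] = 1.0                   # mode 1: vertical translation (all y coords)
X = np.array([2.0, -1.0])        # time-varying shape-vector (mode weights)
Q = Q0 + W @ X                   # control-point positions in this frame
```

Only X changes from frame to frame; Q0 and W stay fixed, which is what makes the shape-space representation compact.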
In order to find the spline curve which fits the endocardial boundary in every frame of the sequence (which amounts to finding the position of the control points of the curve in every frame of the sequence) the first step is that the clinician manually draws around the boundary in several frames, for instance every fifth frame. Then a B-spline curve is fitted to the drawn boundary using a user-defined number of control points. Figure 3(a) shows a schematic representation of a quadratic approximating B-spline with 24 control points used to model the endocardial boundary. Figure 3(b) illustrates the curve superimposed on a frame of an ultrasound image. A principal component analysis is then performed on the positions of the control points in each of the frames segmented by the clinician to calculate Q0 and W. The aim then is to find the position of the endocardial boundary in all of the frames of the sequence automatically, i.e. without requiring the clinician manually to draw around them. Figure 4 illustrates the process schematically. The process involves predicting the position of the boundary in each frame based on the boundary in the preceding frames. In other words the value of X (the shape-vector), which is the only time varying part, is predicted based on the value in the preceding two frames. Then a search is performed around the predicted position to find image features representative of the actual position of the endocardial boundary. In this technique the searches are performed along a plurality of normals spaced along the predicted curve and the image features are identified through known image processing operations, such as looking at the intensity variation along the search line. When the image features corresponding to the boundary have been found the predicted position can be updated and the actual position of the contour (expressed through the position of the control points of the B-spline) is established.
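The structure of this predict/correct loop can be sketched as below. The tracker in the cited paper uses a learned second-order dynamical model and statistically weighted fusion of the measurements; this sketch substitutes a bare constant-velocity prediction and a fixed correction gain, and `measure` is a hypothetical stand-in for the search along normals.

```python
import numpy as np

def track(measure, X_init, n_frames, gain=0.5):
    # Keep the shape-vectors of the two preceding frames for prediction.
    history = [X_init.copy(), X_init.copy()]
    for _ in range(n_frames):
        X_pred = 2 * history[-1] - history[-2]      # predict from last two frames
        X_meas = measure(X_pred)                    # feature search near prediction
        X_new = X_pred + gain * (X_meas - X_pred)   # correct toward measurement
        history.append(X_new)
    return history[2:]

# Toy measurement: the "true" boundary sits at X = [1, 1] in every frame.
truth = np.array([1.0, 1.0])
frames = track(lambda Xp: truth, np.zeros(2), n_frames=20)
```

With a static target the loop settles onto the measured position; in the real tracker the measurement changes every frame as the heart moves.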
It will be understood that having represented the endocardial boundary as a
B-spline, the only time varying part through the sequence is the position of the control points and, in the shape-space representation, the shape-vector X. It will be recalled that the elements of X are the weights of the different types of motion (i.e. the different principal components of the motion) found in the principal component analysis. Figure 5 illustrates an example of a principal component analysis performed on four cardiac cycles of an ultrasound image sequence using a B-spline with 14 control points. The six most dominant modes are shown; each is shown as an initial template (thick solid line) and the change in shape represented by that component of the motion is indicated by the thin solid lines. The diagram in the top left of Figure 5 is the dominant mode and that in the bottom right is the least dominant. The deformation represented by each mode is shown in an alternative way in Figure 6 where a flow vector is centred at the start of each span of the spline curve and shows the movement of that part of the curve. Thus the motion of the contour (spline curve) through the image sequence which was analysed can be expressed as a sum of these motions. The position of the curve in any frame represents a certain weighted sum of these motions. The weights are the values of the components of X and thus the motion can be expressed entirely by looking at the time variation of the components of X. The variation of X with time is illustrated for an echocardiogram sequence in Figure 7. Figure 7(a) shows the PCA based components for this sequence and Figure 7(b) shows the values of the weights versus time of each of those components. It is, however, difficult to interpret clinically the significance of these components and weights. For example, the first deformation mode in Figure 7(a) (top left) appears to be a scaling of the inferior part of the left ventricular boundary.
The corresponding plot of the weight of that component illustrates that this motion is basically periodic. But because this component of the motion affects more than just a single part of the boundary (all parts of the boundary move) it does not give a good idea of how any particular region of the wall is moving. Also, some of the information about the motion of that part of the boundary is "encoded" in the other components.
Thus although the principal component analysis, which gives the component magnitudes of the shape-space vector X, is very useful in tracking, it does not provide a good basis for automatic interpretation of the results. It was mentioned above that wall-thickening, known as myocardial thickening, is also a clinically significant factor in assessing the condition of the heart. As the heart is beating the ventricle expands and contracts, predominantly by periodic thickening of the ventricular wall. If the wall fails to thicken then the ventricular volume will not change by so great an amount and the pumping of blood will be reduced. Thus it would be advantageous to be able to quantitatively analyse the degree of thickening of the ventricular wall. It may be thought that this could straightforwardly be done by detecting the outer (epicardial) boundary of the ventricular wall, in just the same way as the inner (endocardial) boundary is detected above. However, the endocardial boundary (the inner boundary) is a boundary between muscle and blood, which have quite different acoustic impedances. In general this means that the endocardial boundary shows up well on an echocardiogram. The epicardial boundary on the other hand is a tissue-tissue interface and so it is very difficult to trace on the image. Thus even having tracked the endocardial boundary, it is difficult to detect and quantitatively analyse the movement of the epicardial boundary.
The present invention provides techniques which are useful in solving these two problems. Although illustrated in use in analysing echocardiograms of the left ventricle the techniques are not limited to this. They are applicable to analysis of non-rigid motion in two or three dimensions of other deformable objects. Thus they have other medical and veterinary applications as well as being applicable to imaging deformable objects in general and are also applicable to image modalities other than ultrasound.
The first aspect of the present invention provides a method of analysing a sequence of images of an internal body organ in non-rigid motion, comprising the steps of: detecting the boundary of the organ in each image of the sequence; and automatically calculating the amount of movement through the sequence of each of a plurality of clinically significant segments of the detected boundary.
The amount of movement of each of the clinically significant segments, which can be the segments illustrated in Figure 8, preferably those in Figure 8(d), can be displayed graphically, for instance as a graph. Further, an average of the amount of movement of that segment can be calculated as a single number representative of the amount of movement of that segment. It is also possible to calculate the variation in the amount of movement in a segment, the greater the variation, the more likely it is that only a part of that segment is normal. It is also possible to calculate and output the maximal excursion of the detected boundary during the motion, for each segment.
Preferably the boundary is detected and tracked by the technique of principal component analysis and fitting of a spline curve as described above.
The amount of movement of the segments can conveniently be found by calculating and outputting for each segment a measure of the amount of movement of the control points controlling the curve within that segment. This measure may be a simple average, or can be weighted in favour of the control points in the middle of each segment.
The variation in the amount of movement within the segment is conveniently found by comparing the amount of movement of the different spline curve control points for that segment.
These measures can easily be obtained from the position of the control points in each frame of the sequence by defining a new shape-space, different from that used in the tracking process, and calculating from the control points the shape-vector corresponding to the different shape-space. The new shape-space can be selected to ensure that each component of the shape-vector represents the amount of movement of control points in a single clinically significant segment only. Then displaying graphically the time varying components of the new shape-vector gives a good indication of the motion of that segment. This aspect of the invention is particularly applicable to analysing ultrasound images of a heart, e.g. of the left or right ventricle.
The invention also contemplates the interpretation of a moving spline curve tracked in one shape-space by using a different shape-space. Thus another aspect of the invention provides a method of analysing a sequence of images of a deformable object in non-rigid motion comprising the steps of detecting a boundary of the object in each of a plurality of frames of the sequence, fitting a spline curve to the boundary, constructing a shape-space representation of the movement of the spline curve using a first shape-space so that the spline curve tracks the boundary, and decomposing the tracking spline curve using a second, different shape-space. The different shape-space can be chosen to select a particular attribute of the motion. In other words, a motion tracked using one shape-space need not be interpreted in the same shape-space: a different one, an interpretational shape-space, can be used.
Another aspect of the invention involves the modelling of the configuration of the wall of an object having two boundaries as seen in a sequence of images by developing a model of the distance between the two boundaries. Thus rather than modelling each boundary separately, a model is constructed for the distance between them. Thus this aspect of the invention provides a method of analysing a sequence of images of a deformable object in non-rigid motion comprising detecting first and second boundaries of the object in a plurality of frames of the sequence and constructing a shape space representation of the variation through the sequence of the distance between the two boundaries. The model can be a shape-space of the change in distance between the boundaries, and can be based on a principal component analysis of the way that distance changes. The model can be improved by searching the images to find image features representative of the outer boundary. The model avoids incorrect results such as the outer boundary crossing inside the inner boundary. Another aspect of the invention provides a method of analysing a sequence of images of a deformable object in non-rigid motion to detect inner and outer boundaries of a wall of the object, comprising the steps of: detecting the inner boundary; and searching outside the inner boundary for image features representing the outer boundary. Thus because it is known that the inner boundary will be inside the outer boundary, this provides a useful start point for the search of the image features representing the outer boundary.
Preferably a spline curve is fitted to the detected image features representing the outer boundary, e.g. by: manually locating the inner and outer boundaries in only some images of the sequence, calculating a shape-space for the change through the sequence of the distance between the two boundaries, detecting the inner boundary and performing said search outside the inner boundary for image features representing the outer boundary in other images of the sequence; and fitting a spline curve to the detected image features in said other images of the sequence by using said shape-space. Thus in this method a shape-space representing the distance between the two boundaries is obtained. The use of this shape-space helps to ensure that the calculated outer boundary always lies outside the inner boundary. The distance between the two boundaries in the manually traced frames can be subjected to a principal component analysis which is used as a basis for the shape-space. This then provides a convenient model of the deformation of the object wall.
The search for the image features representing the outer boundary can be, for instance, by analysing changes in the image intensity, such as a maximum in the intensity, along search lines projected outwards from the inner boundary. In certain images the plot of the intensity can be extremely noisy and it can be rather difficult to detect a clear maximum. In this case a wavelet decomposition of the profile of image intensity can be performed to smooth the profile.
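Such a wavelet smoothing of a noisy intensity profile can be sketched as below. A plain Haar decomposition is used here purely to keep the example self-contained (the embodiment described later uses a Coifman wavelet packet), and the profile length is assumed to be a multiple of 2**levels.

```python
import numpy as np

def haar_smooth(profile, levels=2):
    # Decompose with the Haar wavelet, discard the detail coefficients,
    # and reconstruct: a simple low-pass smoothing of the profile.
    x = profile.astype(float)
    for _ in range(levels):
        x = (x[0::2] + x[1::2]) / 2.0   # keep pairwise averages (approximation)
    for _ in range(levels):
        y = np.empty(2 * x.size)
        y[0::2] = x                      # reconstruct with details zeroed
        y[1::2] = x
        x = y
    return x

# Noisy step profile: weak speckle around 0.5 then a bright wall near 10.5.
profile = np.array([0.0, 1.0, 0.0, 1.0, 10.0, 11.0, 10.0, 11.0])
smoothed = haar_smooth(profile, levels=1)
```

After smoothing, a maximum (or maximum gradient) along the search line is much easier to locate than in the raw noisy profile.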
To obtain a better fit to the actual outer boundary, it is possible to apply extra conditions to the fitting. For instance, detected image features can be down-weighted if they imply a curvature of the outer boundary which is too high. Similarly, features can be down-weighted if they imply a difference between the inner and outer boundaries which lies outside the shape-space.
This technique is particularly useful for the analysis of ultrasound images, in particular of the heart, e.g. of the left or right ventricle. In that case the distance between the inner and outer boundaries represents the myocardial thickness and the change in that thickness through an image sequence is indicative of the condition of the heart.
Again, the ventricular wall can be segmented according to the Figure 8 model and the degree of thickening for each separate segment can be calculated and graphically displayed, as can the variation within each segment. It will be appreciated that the methods of the invention are conveniently embodied in a computer program, and thus the invention provides a computer program and computer system for performing the methods above.
The invention will be further described by way of a non-limitative example with reference to the accompanying drawings in which: -
Figures 1(a) to (d) show a typical set of echocardiographic images; Figures 2(a) to (d) illustrate echocardiographic images and respective
AQ images;
Figure 3(a) shows a schematic representation of a quadratic approximating B-spline;
Figure 3(b) illustrates the B-spline of Figure 3(a) superimposed on an ultrasound image;
Figure 4 schematically illustrates a tracking process;
Figure 5 illustrates components of a PCA decomposition;
Figure 6 illustrates the components of Figure 5 in a different way;
Figure 7(a) illustrates PCA-based components for an image sequence; Figure 7(b) shows the variation with time of the components of Figure 7(a);
Figure 8 illustrates the sixteen-segment anatomical model of the heart;
Figure 9 illustrates schematically the positioning of control points for a B-spline and the position of the segments of the boundary;
Figure 10(a) illustrates the time variation of PCA components in an image sequence;
Figure 10(b) illustrates the variation with time of the components in an interpretational shape-space;
Figure 11(a) shows plots corresponding to those in Figure 10, but including the 95% confidence interval for the components; Figure 11(b) shows how the plots of Figure 11(a) are displayed to the clinician;
Figure 12 illustrates calculation of the excursion of the contours tracking the endocardial boundary;
Figure 13(a) shows the maximum wall excursion for each segment in pixels and Figure 13(b) shows these values normalised;
Figure 14 shows the displayed results of tracking the endocardial wall and epicardial wall in an image sequence;
Figure 15 illustrates a Coifman wavelet packet; Figure 16 illustrates the results of detecting the epicardial boundary by wavelet decomposition; Figure 17 illustrates the improved results using information assimilated from previous search lines;
Figure 18(a) illustrates the results of tracking the endocardial and epicardial walls and Figure 18(b) illustrates the myocardium shaded;
Figure 19 illustrates the wall thickening for each segment through the image sequence;
Figure 20 illustrates the variation in thickening within each segment; Figure 21 illustrates the thickening information as it might be displayed to a clinician;
Figure 22(a) shows the normalised maximal endocardial wall excursion for the data of Figure 13 and the percentage myocardial thickening scores for the same data; and
Figure 22(b) illustrates the endocardial wall excursion and myocardial thickening scores in the form in which they are presented to the clinician.
First of all an embodiment of the invention will be described which is for providing clinically significant quantitative data from an echocardiographic image sequence in which the endocardial wall of the left ventricle has been tracked, for instance using the technique described in Jacob et al mentioned above. It will be recalled that the tracking of the endocardial wall can be defined in terms of the movement of the control points of the spline curve as:-
Q = Q0 + WPCA XPCA
Where the subscript "PCA" indicates that the tracking is based on a principal component analysis. The time varying part is the shape-vector X which can be recovered using a pseudo inverse of the shape-space WPCA:-
XPCA = W+PCA (Q - Q0)
where W+PCA represents the pseudo-inverse of WPCA. However, it will be recalled that Figure 7 demonstrates that it is not easy to place any clinical significance on the time varying components of X.
With this embodiment of the present invention, however, a new shape-space can be used for decomposing the results of tracking. This can be termed an "interpretational shape-space" and it can be selected to derive, from the positions of the control points through the sequence, time-varying values of clinical interest. In matrix notation this can be explained as:-
XCLIN = W+CLIN (Q - Q0)
Recalling that the components of Q are the coordinates (in 2D just the x and y coordinates) of the control points, the interpretational shape-space can be selected so that the components of XCLIN represent the movement of only certain control points. For instance, if there are four control points for each segment as illustrated in Figure 9, then the interpretational shape-space can be defined so that the components of X resulting from the matrix multiplication will be an average of the amount of movement of the control points in each segment. To achieve this each row of W+CLIN has four non-zero weights to pick out the x (or y) coordinates of the four control points in a segment, the rest of the row being zeros. Thus positions 1 to 4 of the first row of W+CLIN can each be 1/4, with the rest zero, to form, after multiplication, an average of the x-coordinates of the control points in the first segment. In the second row positions 5 to 8 are 1/4, with the rest zero, and so on. Thus, without writing the whole matrix out, each row of W+CLIN contains four entries of 1/4 in the positions selecting the coordinates of one segment's control points, and zeros elsewhere.
Thus, in 2D, if the positions of the first four control points (i.e. for the basal inferior segment) are written at time t0 as:-
(x1(0), y1(0)), (x2(0), y2(0)), (x3(0), y3(0)), (x4(0), y4(0))
and at time t as:-
(x1(t), y1(t)), (x2(t), y2(t)), (x3(t), y3(t)), (x4(t), y4(t))
then
XCLIN = W+CLIN (Q - Q0)
and it can be seen that the effect of multiplying Q - Q0 by the interpretational shape-space matrix W+CLIN means that the components of XCLIN are just the averages of the x and y displacements of the control points in each of the segments. Thus the first component of XCLIN is just:-
1/4 {(x1(t) - x1(0)) + (x2(t) - x2(0)) + (x3(t) - x3(0)) + (x4(t) - x4(0))}
The same is true for the y components, and for each of the other segments.
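The construction of this averaging matrix and the segment averages it produces can be sketched numerically as follows. The coordinate layout (all x-coordinates first, then all y-coordinates, with each segment's control points in a consecutive block) is an assumption made for the example, and the function name is illustrative.

```python
import numpy as np

def interpretational_matrix(n_points, n_segments):
    # Pseudo-inverse of an interpretational shape-space: each row holds
    # equal weights 1/per_seg over one segment's x (or y) coordinates.
    per_seg = n_points // n_segments
    W_inv = np.zeros((2 * n_segments, 2 * n_points))
    for s in range(n_segments):
        cols = np.arange(s * per_seg, (s + 1) * per_seg)
        W_inv[s, cols] = 1.0 / per_seg                         # mean x of segment s
        W_inv[n_segments + s, n_points + cols] = 1.0 / per_seg # mean y of segment s
    return W_inv

# 24 control points, 6 segments of 4 points each (as in Figure 9).
W_inv = interpretational_matrix(24, 6)
Q0 = np.zeros(48)
Q = np.zeros(48)
Q[0:4] = [1.0, 2.0, 3.0, 4.0]       # x-displacements of segment 1 only
X_clin = W_inv @ (Q - Q0)           # X_clin[0] = mean x-displacement of segment 1
```

Here X_clin[0] comes out as 2.5, the average of the four x-displacements, while all other components are zero.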
The above interpretational shape-space is based on the use of four control points in each segment weighted equally. However, a different number of control points for each segment can be used. Further, because the control points at the end of each segment actually affect the spline curve in the next segment, it is also possible to adjust the value of the components in the interpretational shape-space to favour the control points in the middle of each segment. For instance, instead of the components (1/4, 1/4, 1/4, 1/4) one could set the components as (1/8, 3/8, 3/8, 1/8) for each segment.
It is possible to further enhance the analysis by removing the heart's translational and rotational motion. This is because the excursion of the contour (representing the heart wall) is measured from a fixed external frame of reference.
The effect of translation and rotation can be removed by subtracting out the centroid of the spline curve, so that:-
XCLIN = W+CLIN ((Q - Q̄) - (Q0 - Q̄0))
where Q̄ and Q̄0 are the averages of Q and Q0 respectively.
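The centroid subtraction can be sketched as follows; only the translational part is removed here, and the stacked x-then-y coordinate layout is again an illustrative assumption.

```python
import numpy as np

def remove_translation(Q):
    # Subtract the contour centroid so that excursion is measured
    # relative to the heart itself, not a fixed external frame.
    n = Q.size // 2
    out = Q.copy()
    out[:n] = Q[:n] - Q[:n].mean()   # centre the x-coordinates
    out[n:] = Q[n:] - Q[n:].mean()   # centre the y-coordinates
    return out

Q = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])   # 3 points: x then y
Qc = remove_translation(Q)
```

The centred contour Qc then has zero mean in both coordinates, so a rigid drift of the whole heart no longer contributes to the per-segment excursion measures.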
Thus the use of this interpretational shape-space means that the shape-space vector components are meaningful in clinical terms - they represent the average function of each segment of the left ventricular boundary. To show the value of this Figure 10 compares plots of the time variation of the six components of a PCA shape-space with the variation with time of the six components from the interpretational shape-space (i.e. the average position of the four control points for each segment). Figure 10(a) illustrates the variation based on the PCA tracking shape-space and Figure 10(b) the interpretational shape-space. The plots of Figure 10(b) can be clearly related to the underlying local motion of the left ventricular boundary. Thus although Figure 10(a) demonstrates periodicity in principally the first three of the components, it is very difficult to relate these to the underlying motion of the left ventricular boundary because each component represents motion of the whole boundary. In comparison Figure 10(b) shows that all six of the components of the new shape-space are periodic. It can be seen that the basal anterior, mid-anterior, mid-inferior and basal inferior segments all move normally. The observed smaller movement in the basal inferior region is in accordance with normal heart function. However the apical inferior and apical anterior segments, although moving periodically, have a reduced endocardial wall excursion. This is in accordance with the diagnosis that this subject has a myocardial infarct in the apical region. Consequently it will be understood that illustrating the component plots from the interpretational shape-space to the clinician gives the clinician a valuable and immediately recognisable tool for assessing heart function.
It will be appreciated that an abnormal region of heart wall may not completely fill any single segment, but could be just one part of a segment, and possibly be a part of another. A measure of this can easily be derived by using the technique of this embodiment of the present invention by determining the variation in the amount of movement within each segment. In this embodiment this can be done by calculating the standard deviation in the movement of the four control points in each segment. The standard deviation is, of course, a measure of the variation in degree of movement between the different control points. If all control points moved by the same amount then the standard deviation will be low. If some move considerably more than others, the standard deviation will be high.
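The per-segment mean and standard deviation can be sketched as follows; scalar displacement magnitudes and consecutive blocks of four points per segment are illustrative assumptions.

```python
import numpy as np

def segment_motion_stats(displacements, per_seg=4):
    # Reshape the per-control-point displacements into one row per
    # segment, then take the mean and standard deviation of each row.
    d = np.asarray(displacements, float).reshape(-1, per_seg)
    return d.mean(axis=1), d.std(axis=1)

# Segment 1 moves uniformly; segment 2 contains one hypokinetic point.
means, stds = segment_motion_stats([5, 5, 5, 5, 5, 5, 5, 1])
```

A segment whose points move uniformly gives zero standard deviation, while a segment in which only part of the wall is abnormal shows a raised standard deviation even though its mean may look nearly normal.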
Thus, according to this embodiment, for each segment two measures are obtained, the mean and standard deviation of the endocardial wall excursion. Figure 11(a) illustrates plots corresponding to those in Figure 10 but including the 95% confidence interval for the interpretational shape-space vector components. From Figure 11(a) it can be seen that the standard deviation is approximately the same in the basal anterior, mid-inferior and basal inferior segments. However, the apical anterior and apical inferior segments show a very noticeable increase in variation, particularly during the systolic stage of the heart cycle. This implies that not all of the apical inferior and apical anterior segments are abnormal. Figure 11(b) shows how these plots are displayed to the clinician. It was mentioned in the introduction that currently clinicians qualitatively score the movement of each segment. It would be useful to provide a similar scoring scheme for each segment, but which is automated and thus less subjective. In this embodiment a scoring system is provided by calculating the maximum displacement of each boundary segment. This is then normalised with respect to the maximum movement over all the six segments. The resulting number is representative of the relative maximum displacement of each segment of the endocardial wall. The wall excursion is calculated as the maximum (signed) excursion of the contour minus the minimum (signed) excursion of the contour. This is illustrated in Figure 12. The peak-to-peak component plot is measured. Figure 13 illustrates these values for the data from Figure 10. Figure 13(a) shows the maximum wall excursion in pixels, and Figure 13(b) shows the values normalised by dividing by the largest of the six values. The relative magnitude of these values is consistent with the diagnosis of a myocardial infarct in the apical inferior and apical anterior regions.
It should be noted that there is one extreme case when this scoring system will not work well. This is when every segment is abnormal, in which case each maximal endocardial wall excursion score will be small and the normalised score would appear totally normal. This can be monitored by setting a minimal allowable amplitude of the endocardial wall excursion.
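The scoring computation, including the guard against the all-abnormal case, can be sketched as follows; the function name, threshold value, and the synthetic two-segment data are illustrative assumptions.

```python
import numpy as np

def excursion_scores(X_clin, min_amplitude=1.0):
    # Peak-to-peak excursion per segment over the cycle (max minus min
    # signed excursion), normalised by the largest segment's value.
    ptp = X_clin.max(axis=0) - X_clin.min(axis=0)
    if ptp.max() < min_amplitude:
        # Every segment barely moves: normalised scores would look
        # deceptively normal, so flag this degenerate case instead.
        raise ValueError("all excursions below minimal allowable amplitude")
    return ptp, ptp / ptp.max()

t = np.linspace(0, 2 * np.pi, 100)
X = np.column_stack([8 * np.sin(t), 2 * np.sin(t)])  # normal vs reduced segment
raw, norm = excursion_scores(X)
```

The second segment scores 0.25 relative to the first, mirroring the reduced excursion a clinician would grade as hypokinetic.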
It is possible to enhance the display of the results of tracking the endocardial wall by so-called "colour kinesis". In this case a number of contours are plotted together on the same image, for instance the previous 20 contours can be plotted on one image. Each contour is colour coded from the most recent in one colour, say green, to the oldest in another colour, say red. Then the wall motion is more easily recognised. Further, it is possible to calculate the velocity at a point on the contour between each frame and its predecessor. This velocity can then be colour coded, for instance so that low velocities are coded in blue and faster velocities in red, and these velocities displayed overlying the image.
The system can further provide a way of tracking the outer boundary of the left ventricle, the epicardial wall, and of quantifying myocardial wall thickening by measuring the distance between the tracked endocardial and epicardial walls.
It was noted in the introduction that the epicardial wall is quite difficult to track. For this reason, using a tracking strategy corresponding to that used for the endocardial wall is not possible. This embodiment of the present invention overcomes this difficulty by basing the prediction of the epicardial wall on a model of the distance between the endocardial and epicardial walls. Effectively a constraint is placed upon the distance between the two walls, and that is used to optimise the tracking of the epicardial wall.
Given an image sequence the technique involves the clinician initially manually locating and drawing around contours representing the endocardium and epicardium in a few of the image frames. Then, just as a PCA shape-space was constructed for the motions of the endocardial wall in the technique described above, a PCA shape-space is constructed in this case for the difference between the two.
This is known as the "difference shape-space". Thus the difference shape-space W_Diff is based on the PCA of the difference Q_Diff between the control points of the epicardial wall and endocardial wall, i.e.:

Q_Diff = Q_Ep - Q_En
The shape-space vector for W_Diff is X_Diff. The technique for finding the epicardial wall is then to use both the difference shape-space (which indicates what movement can be expected) and a search for image features representing the epicardium. This search is conducted by starting from the endocardium (whose position is easier to establish in the image) and then searching along normals from the curve representing the endocardium. The search is for an image feature, such as a change in image intensity, representative of the epicardial wall. Then, using the difference shape-space W_Diff, a contour is fitted to the measurement results.
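As an illustrative sketch (not the patent's implementation), the difference shape-space can be built by a straightforward PCA of the control-point differences; the array layout and component count here are assumptions:

```python
import numpy as np

def difference_shape_space(Q_en, Q_ep, n_components=3):
    """Build a PCA 'difference shape-space' from manually segmented
    endocardial and epicardial spline control points.

    Q_en, Q_ep: (M, 2K) arrays - M training frames, K control points
    flattened per frame. Returns the mean difference and the matrix
    W_diff whose columns are the principal components."""
    Q_diff = np.asarray(Q_ep, dtype=float) - np.asarray(Q_en, dtype=float)  # Q_Diff = Q_Ep - Q_En
    mean = Q_diff.mean(axis=0)
    # PCA via SVD of the centred differences
    _, _, Vt = np.linalg.svd(Q_diff - mean, full_matrices=False)
    W_diff = Vt[:n_components].T  # (2K, n_components), orthonormal columns
    return mean, W_diff
```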
In summary the algorithm for epicardium estimation is as follows :-
1. Obtain a shape-space, W_Diff, of the difference between the manually segmented endocardium and epicardium contours, Q_En,1, Q_En,2, ..., Q_En,M and Q_Ep,1, Q_Ep,2, ..., Q_Ep,M, respectively.

2. Search along normals to the estimated position of the endocardium, to find image measurements that represent the epicardium.

3. Using these measurements, obtain a best fitting curve, X_Diff (from the fitting algorithm below), in the difference shape-space, W_Diff. Call this contour Q_Diff.

4. The estimated epicardium position, Q_Ep, is then given by

Q_Ep = Q_En + Q_Diff
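Steps 3 and 4 above can be sketched in simplified form, with the curve fit reduced to a least-squares projection onto the difference shape-space (the text uses a regularised fitting algorithm instead); all names and shapes are illustrative:

```python
import numpy as np

def estimate_epicardium(Q_en, measured_diff, mean_diff, W_diff):
    """Simplified steps 3-4: project the measured endo-to-epi offsets into
    the difference shape-space, then add the best-fit difference back onto
    the endocardial estimate.

    Q_en, measured_diff, mean_diff: (2K,) flattened control-point vectors;
    W_diff: (2K, d) shape-space matrix with orthonormal columns."""
    x_diff = W_diff.T @ (measured_diff - mean_diff)  # shape-space vector X_Diff
    Q_diff = mean_diff + W_diff @ x_diff             # best-fit difference Q_Diff
    return Q_en + Q_diff                             # step 4: Q_Ep = Q_En + Q_Diff
```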
The fitting algorithm for step 3 (taken from Blake, A. and Isard, M. (1998) "Active Contours", Springer) is as follows:
Given an initial shape estimate r̄(s) (or X̄ in shape-space) with normals n̄(s), and a regularisation weight matrix S̄, minimise, e.g. by the algorithm below, the objective

T = (X - X̄)^T S̄ (X - X̄) + Σ_{i=1}^{N} (1/σ_i²) (ν_i - h(s_i)^T [X - X̄])²

Algorithm

1. Choose samples s_i, i = 1, ..., N, such that s_1 = 0, s_{i+1} = s_i + h, s_N = L.

2. For each i, apply some image-processing filter along a suitable line (e.g. the curve normal) passing through r̄(s_i), to establish the position of r_f(s_i).

3. Initialise

Z_0 = 0, S_0 = 0.

4. Iterate, for i = 1, ..., N:

ν_i = (r_f(s_i) - r̄(s_i))^T n̄(s_i),
S_i = S_{i-1} + (1/σ_i²) h(s_i) h(s_i)^T,
Z_i = Z_{i-1} + (1/σ_i²) h(s_i) ν_i.

5. The aggregated observation vector is

Z = Z_N

with associated statistical information

S = S_N.

6. Finally, the best fitting curve is given in the shape-space by:

X = X̄ + (S̄ + S)^(-1) Z
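The recursive steps 3-6 of this fitting algorithm translate directly into code; this sketch assumes the measurement-matrix rows h(s_i)^T have been precomputed from the spline basis and shape-space matrix:

```python
import numpy as np

def fit_curve(X_bar, r_bar, normals, measurements, h_rows, S_bar, sigma):
    """Recursive shape-space curve fit, steps 3-6 (after Blake & Isard,
    'Active Contours', 1998).

    X_bar:        (d,)   prior shape-space estimate
    r_bar:        (N, 2) prior curve points r_bar(s_i)
    normals:      (N, 2) unit normals n(s_i)
    measurements: (N, 2) feature positions r_f(s_i) found along the normals
    h_rows:       (N, d) rows h(s_i)^T (assumed precomputed)
    S_bar:        (d, d) regularisation weight matrix
    sigma:        (N,)   measurement standard deviations"""
    d = X_bar.shape[0]
    Z = np.zeros(d)        # Z_0 = 0
    S = np.zeros((d, d))   # S_0 = 0
    for i in range(len(measurements)):
        # nu_i: normal displacement of the measured feature from the prior curve
        nu_i = (measurements[i] - r_bar[i]) @ normals[i]
        h_i = h_rows[i]
        S = S + np.outer(h_i, h_i) / sigma[i] ** 2
        Z = Z + h_i * nu_i / sigma[i] ** 2
    # step 6: X = X_bar + (S_bar + S)^-1 Z
    return X_bar + np.linalg.solve(S_bar + S, Z)
```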
Figure 14 illustrates the results of applying these techniques to an image sequence to detect the epicardium for frames 25, 96, 104, 109 and 114 of a sequence.
The choice of search scale for the detection of image features affects the accuracy of tracking. For example, an empirically chosen constant scale of 1.14 cm (30 pixels) gave good results on an image of 720 x 512 pixels. It would, however, be possible to link the search scale for the epicardium to the stage of the cardiac cycle, so that the search scale would be longer in systole than in diastole.

The above technique still requires the detection of image features representing the epicardial wall, the difficulty of which has been mentioned several times. A particularly advantageous approach used in this embodiment is to plot the image intensity along the search lines (normals from the endocardial boundary) and to use a wavelet ridge detector to find the image feature corresponding to the epicardial boundary. The wavelet chosen for this embodiment was the Coiffman wavelet series. These orthogonal, compactly supported wavelets feature the highest number of vanishing moments for a given filter length. In addition, several packets within the Coiffman set have a spatial distribution that is particularly suitable for evaluating ridge-like structures (such as the epicardium) at low resolutions.

Each of the profiles (such as those shown in Figure 15) is initially dyadically decomposed, and then reconstructed using the best-basis algorithm. Each packet within this optimised set is then compared with the total best-basis reconstruction to determine which decompositions contribute most strongly to the original profile signal. These particular low-resolution decompositions are used in the next stage of the analysis to restrict the reconstruction of the filtered profile to the ridge characteristics of successive profiles. The first profile normal is chosen to be in an area of good contrast-to-noise ratio (i.e. a good profile), e.g. at the start of the basal anterior segment, with the profile normals incremented anti-clockwise from this point, the last being at the end of the basal inferior segment.
The Coiffman wavelet with a filter length of six was chosen as it gave accurate results without imposing too heavy a computational burden.
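The smooth-then-take-maximum ridge detection can be sketched as below. A simple moving-average low-pass filter stands in for the coiflet best-basis reconstruction, so this illustrates the idea rather than reproducing the embodiment's wavelet pipeline:

```python
import numpy as np

def ridge_position(profile, kernel_width=7):
    """Find the ridge (candidate epicardial feature) along one intensity
    profile: low-pass filter the profile, then take its maximum.

    The moving average is an assumed stand-in for the low-resolution
    coiflet best-basis reconstruction described in the text."""
    k = np.ones(kernel_width) / kernel_width
    smoothed = np.convolve(np.asarray(profile, dtype=float), k, mode="same")
    return int(np.argmax(smoothed)), smoothed
```

The smoothing suppresses narrow noise spikes that a raw maximum would latch onto, so the detected position favours broad, ridge-like features.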
The results of detecting image features using this wavelet set are shown in Figure 16. It is possible to further improve the detection by using information obtained along one search line in the search along the next search line.
To do this, peak or ridge-like features found along a search line are negatively weighted if: (1) they deviate outside the estimated image space for the myocardium (given by the tracking above); or (2) local maxima deviate too greatly from an allowable curvature of the epicardial boundary. Thus once the decomposed profile has been adjusted to reflect ridge localisation in the previous profiles, the low-resolution representation is adjointly convolved with the low and high-pass quadrature filters for reconstruction. The reconstructed profile has now been smoothed (due to the use of a limited number of packets for reconstruction), shifted and normalised (due to the low-resolution weighting function); a ridge detection that finds the maximum of the reconstructed profile is then applied to it. The net result is that the ridge detection is much more consistent with the visual location of the epicardium, as shown in Figure 17.
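A sketch of this negative weighting; the penalty value, band limits and jump limit are illustrative assumptions, and the jump test is a crude stand-in for the curvature constraint:

```python
import numpy as np

def select_ridge(candidates, scores, band, prev_pos, max_jump, penalty=0.5):
    """Pick the best ridge candidate along one search line, negatively
    weighting candidates that (1) fall outside the expected myocardial
    band or (2) jump too far from the ridge found on the previous
    search line."""
    s = np.asarray(scores, dtype=float).copy()
    for j, pos in enumerate(candidates):
        if not (band[0] <= pos <= band[1]):
            s[j] -= penalty  # outside estimated myocardium image space
        if prev_pos is not None and abs(pos - prev_pos) > max_jump:
            s[j] -= penalty  # deviates too much from the neighbouring profile
    return int(candidates[int(np.argmax(s))])
```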
It will be noted that in the basal anterior segment, localisation is poorer than on other parts of the epicardial boundary. The reason for this is that the mitral valve is positioned here and the curvature of the boundary deviates too much from the allowable curvature. However, the localisation is improved for the rest of the epicardial boundary.
An example of the results of this technique is illustrated in Figure 18. Figure 18(a) shows the endocardial and epicardial walls and Figure 18(b) illustrates the myocardial thickening which occurs through the sequence. This is illustrated by colouring the region between the tracked epicardial and endocardial walls.
Just as the regional wall excursion, i.e. the movement of the endocardial wall in the different segments, was quantified, it is useful also to quantify myocardial thickening for the segments. This can be done by calculating the average distance between the epicardial and endocardial boundaries. An alternative is to use the difference shape-space W_Diff and the associated shape-space vector X_Diff. The values of these can be integrated over the individual segments to provide a measure of the myocardial thickness in that segment. A plot of this value for each segment through an image sequence is shown in Figure 19. It can be seen that the basal inferior, mid-anterior and mid-inferior segments all move normally. The smaller thickening in the basal anterior region is also normal. However, the reduced change in thickness in the apical anterior and apical inferior segments is abnormal and agrees with the diagnosis that this patient has a myocardial infarct in the apical region.
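The first option, averaging the endo-epi distance within each segment, can be sketched as follows (the boundary sampling and segment index ranges are assumptions):

```python
import numpy as np

def segment_thickness(endo, epi, segment_slices):
    """Average distance between matched endocardial and epicardial boundary
    samples within each clinically significant segment.

    endo, epi: (K, 2) matched boundary points; segment_slices: one index
    slice per segment."""
    d = np.linalg.norm(np.asarray(epi, dtype=float) - np.asarray(endo, dtype=float), axis=1)
    return [float(d[sl].mean()) for sl in segment_slices]
```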
It is also useful to look at the variation in segment wall thickening in each of the six segments, as shown in Figure 20. The starting point for this data set is diastole, with the heart entering systole after around a tenth of a second. The basal anterior and basal inferior segments show a lot of variation in wall thickness. The lack of thickening and variation in the apical anterior segment is consistent, again, with the above diagnosis. However, there is more variation in the apical inferior than in the apical anterior segment. This implies that not all of the apical inferior segment is ischemic. Thus again the measurement of variation in individual segments gives an idea of whether it is the whole segment or only part of the segment which is abnormal.
Figure 21 illustrates the data as it might be presented to a clinician. Thus the endocardial and epicardial walls are overlaid on the image and track the movement of the ventricle. Plots of endocardial wall excursion and myocardial thickening, including the variation within each segment, are illustrated alongside. Thus the clinician can easily recognise from the plots abnormal areas of the left ventricular wall.
A single numerical score representative of wall thickening can also be calculated as follows :-
% Th = ((Th_ES - Th_ED) / Th_ED) x 100

where Th_ES is the thickness of the myocardial segment at end-systole, and Th_ED is the thickness of the myocardial segment at end-diastole (in cm). Scores of regional percentage wall thickening for the data in Figure 13 are shown in Figure 22. Figure 22(a) shows the normalised maximal endocardial wall excursion for Figure 13, Figure 22(b) shows the percentage myocardial thickening and Figure 22(c) illustrates the interface in the form presented to the clinician.
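The score is direct to compute:

```python
def percent_wall_thickening(th_es_cm, th_ed_cm):
    """Regional percentage wall thickening:
    %Th = (Th_ES - Th_ED) / Th_ED x 100."""
    return (th_es_cm - th_ed_cm) / th_ed_cm * 100.0
```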
To further enhance the display the technique of colour kinesis mentioned previously can also be used. In this case the change in thickness between a frame and its predecessor is calculated, and these cumulative distances are colour coded so that the most recent is in one colour, say blue, and the oldest is in another colour, say red. Thus a plot of the previous 20 frames can be made on the same axis with time being illustrated by changing colour.
While the above embodiment has been described with reference to the analysis and interpretation of echocardiograms, it will be appreciated that the techniques are applicable to any image sequence of the non-rigid motion of a deformable object. Thus this includes ultrasound images of other organs or other objects, as well as images obtained of human organs or other objects by other imaging modalities.

Claims

1. A method of analysing a sequence of images of an internal body organ in non-rigid motion, comprising the steps of:- detecting the boundary of the organ in each image of the sequence; and automatically calculating the amount of movement through the sequence of each of a plurality of clinically significant segments of the detected boundary.
2. A method according to claim 1, further comprising the step of displaying graphically the calculated amount of movement of each of the clinically significant segments of the detected boundary.
3. A method according to claim 1 or 2, further comprising the step of calculating and outputting for each of the clinically significant segments of the detected boundary an average of the amount of movement of that segment.
4. A method according to claim 1,2 or 3, further comprising the step of calculating for each of the clinically significant segments of the detected boundary the variation in the amount of movement within that segment.
5. A method according to claim 1, 2, 3 or 4, further comprising the step of calculating for each of the clinically significant segments the maximal excursion of the detected boundary during said non-rigid motion.
6. A method according to any one of the preceding claims wherein the organ is the human or animal heart.
7. A method according to any one of the preceding claims wherein the images are produced by ultrasound-based, MR-based or x-ray based, imaging or nuclear medicine.
8. A method according to any one of the preceding claims wherein a spline curve is fitted to the boundary.
9. A method according to claim 8 further comprising the step of visually locating the boundary in only some selected images in the sequence and fitting the spline curve to the visually located boundary in each selected image by calculation of the control points for the spline curve.
10. A method according to claim 9 further comprising the step of calculating a shape-space representation of the movement of the spline curve through the selected images.
11. A method according to claim 10 wherein the shape-space is calculated by performing a principal component analysis (PCA) of the movement of the spline curve through the selected images.
12. A method according to claim 9, 10 or 11 further comprising the steps of predicting the position of the boundary in each frame of the sequence based on the spline curve, detecting image features representative of the boundary in the vicinity of the predicted position of the boundary, and correcting the predicted position on the basis of the detected image features.
13. A method according to any one of claims 8 to 12 further comprising the step of displaying the spline curve overlying the image.
14. A method according to any one of claims 8 to 13 further comprising the step of calculating and outputting for each of said clinically significant segments an average of the amount of movement of the spline curve control points for that segment.
15. A method according to claim 14 wherein the average is weighted in favour of spline curve control points in the middle of each segment.
16. A method according to any one of claims 8 to 15 further comprising the step of calculating and outputting for each of said clinically significant segments a measure of the variation in the amount of movement of the spline curve control points for that segment.
17. A method according to any one of claims 8 to 16 further comprising the step of calculating and outputting for each of said clinically significant segments a measure of the maximal excursion of the spline curve control points for that segment.
18. A method according to claim 11 or any claim dependent therefrom, further comprising the steps of defining a different shape-space, and calculating from the spline function control points the shape-vector corresponding to the different shape-space.
19. A method according to claim 18 wherein a pseudo-inverse of the different shape-space is defined to produce as components of the shape-vector a measure of the movement of the spline function control points for each of the clinically significant segments.
20. A method according to claim 19 further comprising the step of displaying graphically the variation through the sequence of the shape- vector components.
21. A method according to any one of claims 8 to 20 wherein four spline function control points are defined for each of the clinically significant segments.
22. A method of analysing a sequence of images of a deformable object in non-rigid motion to detect inner and outer boundaries of a wall of the object, comprising the steps of: detecting the inner boundary; and searching outside the inner boundary for image features representing the outer boundary.
23. A method according to claim 22 further comprising the step of fitting a spline curve to the detected image features representing the outer boundary.
24. A method according to claim 23 wherein the spline curve is fitted by: manually locating the inner and outer boundaries in only some images of the sequence; calculating a shape-space for the change through the sequence of the distance between the two boundaries; detecting the inner boundary and performing said search outside the inner boundary for image features representing the outer boundary in other images of the sequence; and fitting a spline curve to the detected image features in said other images of the sequence by using said shape-space.
25. A method according to claim 24 further comprising the step of performing a principal component analysis of the change in the distance between the two boundaries, as a basis for said shape-space.
26. A method according to any one of claims 22 to 25 wherein the step of searching outside the inner boundary for image features representing the outer boundary comprises detecting and analysing changes in the image intensity outwards from said inner boundary.
27. A method according to claim 26 further comprising detecting a ridge in a plot of the image intensity outwards from the inner boundary.
28. A method according to claim 27 further comprising performing a wavelet decomposition of the plot of the image intensity to smooth the plot and detecting as said ridge a maximum in the smoothed plot.
29. A method according to claim 26, 27 or 28 wherein the search is conducted along a plurality of search lines spaced along and extending radially outwardly from said inner boundary.
30. A method according to any one of claims 26 to 29 when dependent from claim 23, wherein when fitting the spline curve to the detected image features, the detected image features are weighted down if they imply a high curvature of the outer boundary.
31. A method according to any one of claims 26 to 30 when dependent from claim 24, wherein when fitting the spline curve to the detected image features, the detected image features are weighted down if they imply a difference between the inner and outer boundaries which lies outside the shape-space for that difference.
32. A method according to any one of claims 22 to 31 wherein the images are ultrasound images.
33. A method according to any one of claims 22 to 32 wherein the object is a human or animal organ.
34. A method according to any one of claims 22 to 33 wherein the object is a human or animal heart.
35. A method according to claim 34 wherein the object is the left or right ventricle.
36. A method according to claim 34 or 35 further comprising the step of graphically displaying the change through the sequence of the distance between the inner and outer boundaries as a representation of myocardial thickening.
37. A method according to claim 34, 35 or 36 further comprising segmenting the wall of the heart and graphically displaying for each segment the change through the sequence of the distance between the inner and outer boundaries as a representation of myocardial thickening for that segment.
38. A method according to claim 37 wherein the distance between the inner and outer boundaries is averaged or integrated within each segment.
39. A method according to claim 37 or 38 further comprising the step of calculating the variation within each segment of the change through the sequence of the distance between the inner and outer boundaries.
40. A method according to any one of claims 22 to 39 wherein the inner boundary is detected by the method of any one of claims 1 to 21.
41. A method of analysing a sequence of images of a deformable object in non-rigid motion comprising the steps of detecting a boundary of the object in each of a plurality of frames of the sequence, fitting a spline curve to the boundary, constructing a shape space representation of the movement of the spline curve using a first shape space so that the spline curve tracks the boundary, and decomposing the tracking spline curve using a second, different shape space.
42. A method according to claim 41 wherein the first shape space is based on a principal component analysis of the movement of the boundary.
43. A method according to claim 41 or 42 wherein the second shape space is adapted to select a desired attribute of the motion.
44. A method of analysing a sequence of images of a deformable object in non-rigid motion comprising detecting first and second boundaries of the object in a plurality of frames of the sequence and constructing a shape space representation of the variation through the sequence of the distance between the two boundaries.
45. A method according to claim 44 further comprising the step of fitting a spline curve to the two boundaries.
46. A method according to claim 44 or 45 further comprising the steps of finding, in all frames of the sequence, the position of the first boundary, and predicting the position in all frames of the second boundary on the basis of said shape space and a search for image features representative of said second boundary.
47. A computer program comprising program code means for performing the method of any one of the preceding claims when the program is run on a computer.
48. A computer program storage medium readable by a computer system and encoding a computer program according to claim 47.
PCT/GB2000/002767 1999-08-27 2000-07-19 Non-rigid motion image analysis WO2001016886A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US10/069,291 US7043063B1 (en) 1999-08-27 2000-07-19 Non-rigid motion image analysis
JP2001520357A JP2003508139A (en) 1999-08-27 2000-07-19 Non-rigid motion image analysis
EP00946145A EP1212729A2 (en) 1999-08-27 2000-07-19 Non-rigid motion image analysis

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9920401A GB9920401D0 (en) 1999-08-27 1999-08-27 Non-rigid motion image analysis
GB9920401.8 1999-08-27

Publications (2)

Publication Number Publication Date
WO2001016886A2 true WO2001016886A2 (en) 2001-03-08
WO2001016886A3 WO2001016886A3 (en) 2001-11-01


Country Status (5)

Country Link
US (1) US7043063B1 (en)
EP (1) EP1212729A2 (en)
JP (1) JP2003508139A (en)
GB (1) GB9920401D0 (en)
WO (1) WO2001016886A2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003250801A (en) * 2002-03-04 2003-09-09 Aloka Co Ltd Ultrasonic diagnostic apparatus
JP2005531352A (en) * 2002-06-28 2005-10-20 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ An image processing system for displaying information on the movement of the top of a deformable three-dimensional object
EP1593087A2 (en) * 2003-01-30 2005-11-09 Chase Medical, L. P. A method and system for image processing and contour assessment
WO2006046982A1 (en) * 2004-10-21 2006-05-04 Siemens Medical Solutions Usa, Inc. Automated diastolic function analysis with ultrasound
WO2006083569A1 (en) * 2005-02-03 2006-08-10 Siemens Medical Solutions Usa, Inc. Characterization of cardiac motion with spatial relationship
EP1757230A1 (en) * 2001-07-31 2007-02-28 Koninklijke Philips Electronics N.V. Transesophageal and transnasal, transesophageal ultrasound imaging systems .
WO2007141038A1 (en) * 2006-06-08 2007-12-13 Tomtec Imaging Systems Gmbh Method, device and computer programme for evaluating images of a cavity
DE102007009182A1 (en) 2007-02-26 2008-08-28 Siemens Ag Cyclically moving object i.e. heart, image representation method, involves recording image data of object by single photon emission computed tomography method and assigning recorded image to different phases of motion cycle
US7519206B2 (en) 2000-11-22 2009-04-14 Siemens Medical Solutions Usa, Inc. Detection of features in images
EP2679166A1 (en) * 2012-06-28 2014-01-01 Samsung Medison Co., Ltd. Diagnosis image apparatus and operation method thereof
WO2016022052A1 (en) * 2014-08-05 2016-02-11 Inovacor Ab A cardiac state monitoring system
CN105631931A (en) * 2015-12-21 2016-06-01 电子科技大学 Low-complexity heart surface three-dimensional shape online modeling system and method thereof
US9508140B2 (en) 2012-08-27 2016-11-29 Agency For Science, Technology And Research Quantifying curvature of biological structures from imaging data

Families Citing this family (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7826889B2 (en) * 2000-08-21 2010-11-02 Spectrum Dynamics Llc Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
WO2004042546A1 (en) * 2002-11-04 2004-05-21 V-Target Technologies Ltd. Apparatus and methods for imaging and attenuation correction
US8565860B2 (en) 2000-08-21 2013-10-22 Biosensors International Group, Ltd. Radioactive emission detector equipped with a position tracking system
US8489176B1 (en) 2000-08-21 2013-07-16 Spectrum Dynamics Llc Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
US8036731B2 (en) * 2001-01-22 2011-10-11 Spectrum Dynamics Llc Ingestible pill for diagnosing a gastrointestinal tract
US8909325B2 (en) 2000-08-21 2014-12-09 Biosensors International Group, Ltd. Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
IL157007A0 (en) * 2001-01-22 2004-02-08 Target Technologies Ltd V Ingestible device
US20030086596A1 (en) * 2001-11-07 2003-05-08 Medical Metrics, Inc. Method, computer software, and system for tracking, stabilizing, and reporting motion between vertebrae
DE10163813A1 (en) * 2001-12-22 2003-07-03 Philips Intellectual Property Method for displaying different images of an examination object
JP4503238B2 (en) * 2003-04-17 2010-07-14 株式会社日立メディコ Movement display method and diagnostic imaging apparatus for living tissue
WO2005001769A2 (en) * 2003-06-25 2005-01-06 Siemens Medical Solutions Usa, Inc. Automated regional myocardial assessment for cardiac imaging
US7912528B2 (en) * 2003-06-25 2011-03-22 Siemens Medical Solutions Usa, Inc. Systems and methods for automated diagnosis and decision support for heart related diseases and conditions
US7154498B2 (en) * 2003-09-10 2006-12-26 Siemens Medical Solutions Usa, Inc. System and method for spatio-temporal guidepoint modeling
US7596258B2 (en) * 2003-11-10 2009-09-29 Kabushiki Kaisha Toshiba Image processor
US8586932B2 (en) 2004-11-09 2013-11-19 Spectrum Dynamics Llc System and method for radioactive emission measurement
US7176466B2 (en) 2004-01-13 2007-02-13 Spectrum Dynamics Llc Multi-dimensional image reconstruction
US8571881B2 (en) 2004-11-09 2013-10-29 Spectrum Dynamics, Llc Radiopharmaceutical dispensing, administration, and imaging
WO2005118659A2 (en) * 2004-06-01 2005-12-15 Spectrum Dynamics Llc Methods of view selection for radioactive emission measurements
US7968851B2 (en) * 2004-01-13 2011-06-28 Spectrum Dynamics Llc Dynamic spect camera
US9040016B2 (en) 2004-01-13 2015-05-26 Biosensors International Group, Ltd. Diagnostic kit and methods for radioimaging myocardial perfusion
WO2007010534A2 (en) 2005-07-19 2007-01-25 Spectrum Dynamics Llc Imaging protocols
US9470801B2 (en) 2004-01-13 2016-10-18 Spectrum Dynamics Llc Gating with anatomically varying durations
US7653227B2 (en) * 2004-02-09 2010-01-26 Siemens Medical Solutions Usa, Inc. Hierarchical modeling in medical abnormality detection
DE102004008979B4 (en) * 2004-02-24 2006-12-28 Siemens Ag Method for filtering tomographic 3D representations after reconstruction of volume data
EP1778957A4 (en) 2004-06-01 2015-12-23 Biosensors Int Group Ltd Radioactive-emission-measurement optimization to specific body structures
US7339585B2 (en) * 2004-07-19 2008-03-04 Pie Medical Imaging B.V. Method and apparatus for visualization of biological structures with use of 3D position information from segmentation results
US9943274B2 (en) 2004-11-09 2018-04-17 Spectrum Dynamics Medical Limited Radioimaging using low dose isotope
US8615405B2 (en) 2004-11-09 2013-12-24 Biosensors International Group, Ltd. Imaging system customization using data from radiopharmaceutical-associated data carrier
US8000773B2 (en) * 2004-11-09 2011-08-16 Spectrum Dynamics Llc Radioimaging
US9316743B2 (en) 2004-11-09 2016-04-19 Biosensors International Group, Ltd. System and method for radioactive emission measurement
EP1827505A4 (en) 2004-11-09 2017-07-12 Biosensors International Group, Ltd. Radioimaging
WO2008059489A2 (en) 2006-11-13 2008-05-22 Spectrum Dynamics Llc Radioimaging applications of and novel formulations of teboroxime
EP1844351A4 (en) * 2005-01-13 2017-07-05 Biosensors International Group, Ltd. Multi-dimensional image reconstruction and analysis for expert-system diagnosis
US8837793B2 (en) 2005-07-19 2014-09-16 Biosensors International Group, Ltd. Reconstruction stabilizer and active vision
EP1908011B1 (en) * 2005-07-19 2013-09-04 Spectrum Dynamics LLC Reconstruction stabilizer and active vision
JP5014132B2 (en) * 2005-07-20 2012-08-29 パナソニック株式会社 Ultrasonic diagnostic equipment
EP1933698A2 (en) * 2005-09-16 2008-06-25 The Ohio State University Method and apparatus for detecting intraventricular dyssynchrony
US8131043B2 (en) * 2005-09-16 2012-03-06 The Ohio State University Method and apparatus for detecting interventricular dyssynchrony
EP1952180B1 (en) * 2005-11-09 2017-01-04 Biosensors International Group, Ltd. Dynamic spect camera
US20070167784A1 (en) * 2005-12-13 2007-07-19 Raj Shekhar Real-time Elastic Registration to Determine Temporal Evolution of Internal Tissues for Image-Guided Interventions
AU2006339503A1 (en) * 2005-12-20 2007-09-13 University Of Maryland, Baltimore Method and apparatus for accelerated elastic registration of multiple scans of internal properties of a body
WO2007074466A2 (en) 2005-12-28 2007-07-05 Starhome Gmbh Late forwarding to local voicemail system of calls to roaming users
US8046045B2 (en) * 2006-05-08 2011-10-25 Siemens Medical Solutions Usa, Inc. Anatomical feature condition assessment system using tissue displacement characteristics
US8894974B2 (en) 2006-05-11 2014-11-25 Spectrum Dynamics Llc Radiopharmaceuticals for diagnosis and therapy
US8425418B2 (en) * 2006-05-18 2013-04-23 Eigen, Llc Method of ultrasonic imaging and biopsy of the prostate
KR20090013199A (en) * 2006-05-25 2009-02-04 코닌클리케 필립스 일렉트로닉스 엔.브이. 3d echocardiographic shape analysis
US10353069B2 (en) * 2006-08-09 2019-07-16 Koninklijke Philips N.V. Ultrasound imaging system with image rate sequences
WO2008085193A2 (en) * 2006-08-14 2008-07-17 University Of Maryland Quantitative real-time 4d strees test analysis
US7889912B2 (en) * 2006-09-15 2011-02-15 The General Electric Company Method for real-time tracking of cardiac structures in 3D echocardiography
WO2008048933A2 (en) * 2006-10-16 2008-04-24 Assa Abloy Hospitality, Inc. Centralized wireless network for multi-room large properties
US8064664B2 (en) 2006-10-18 2011-11-22 Eigen, Inc. Alignment method for registering medical images
US7804989B2 (en) * 2006-10-30 2010-09-28 Eigen, Inc. Object recognition system for medical imaging
US9275451B2 (en) 2006-12-20 2016-03-01 Biosensors International Group, Ltd. Method, a system, and an apparatus for using and processing multidimensional data
US20080161687A1 (en) * 2006-12-29 2008-07-03 Suri Jasjit S Repeat biopsy system
US8175350B2 (en) * 2007-01-15 2012-05-08 Eigen, Inc. Method for tissue culture extraction
US20080186378A1 (en) * 2007-02-06 2008-08-07 Feimo Shen Method and apparatus for guiding towards targets during motion
US7856130B2 (en) * 2007-03-28 2010-12-21 Eigen, Inc. Object recognition system for medical imaging
US8989468B2 (en) * 2007-05-25 2015-03-24 Definiens Ag Generating an anatomical model using a rule-based segmentation and classification process
JP4690361B2 (en) * 2007-05-28 2011-06-01 富士フイルム株式会社 Cardiac function analysis apparatus, method and program thereof
US20090048515A1 (en) * 2007-08-14 2009-02-19 Suri Jasjit S Biopsy planning system
US8571277B2 (en) * 2007-10-18 2013-10-29 Eigen, Llc Image interpolation for medical imaging
US8521253B2 (en) 2007-10-29 2013-08-27 Spectrum Dynamics Llc Prostate imaging
US7942829B2 (en) * 2007-11-06 2011-05-17 Eigen, Inc. Biopsy planning and display apparatus
US8208698B2 (en) * 2007-12-14 2012-06-26 Mela Sciences, Inc. Characterizing a texture of an image
CN100547617C (en) * 2007-12-29 2009-10-07 浙江工业大学 Heart three dimensional representation method based on NURBS
US20090324041A1 (en) * 2008-01-23 2009-12-31 Eigen, Llc Apparatus for real-time 3d biopsy
US20100001996A1 (en) * 2008-02-28 2010-01-07 Eigen, Llc Apparatus for guiding towards targets during motion using gpu processing
US20090238404A1 (en) * 2008-03-18 2009-09-24 Fredrik Orderud Methods for using deformable models for tracking structures in volumetric data
DE102009010291A1 (en) * 2009-02-24 2010-05-12 Siemens Aktiengesellschaft Method for determining and classifying wall motion abnormalities of e.g. left ventricular myocardium of patient, involves segmenting myocardium of patient, determining characteristic values, and evaluating characteristic values
US8338788B2 (en) 2009-07-29 2012-12-25 Spectrum Dynamics Llc Method and system of optimized volumetric imaging
US20110064287A1 (en) * 2009-09-14 2011-03-17 Alexandru Bogdan Characterizing a texture of an image
WO2012094308A1 (en) * 2011-01-03 2012-07-12 Massachusetts Institute Of Technology Quantitative elastography
WO2012176848A1 (en) * 2011-06-23 2012-12-27 Toshiba Corporation Image processing device and X-ray diagnosis device
US8861830B2 (en) 2011-11-07 2014-10-14 Paieon Inc. Method and system for detecting and analyzing heart mechanics
JP6259772B2 (en) * 2012-01-16 2018-01-10 Koninklijke Philips N.V. Imaging device
US8879814B2 (en) * 2012-05-22 2014-11-04 General Electric Company Method and apparatus for reducing motion related imaging artifacts using consistency values
WO2014016695A2 (en) 2012-07-27 2014-01-30 Assa Abloy Ab Presence-based credential updating
EP2878142B1 (en) 2012-07-27 2021-05-19 Assa Abloy Ab Setback controls based on out-of-room presence information
US9529080B2 (en) 2012-12-06 2016-12-27 White Eagle Sonic Technologies, Inc. System and apparatus having an application programming interface for flexible control of execution ultrasound actions
US10499884B2 (en) 2012-12-06 2019-12-10 White Eagle Sonic Technologies, Inc. System and method for scanning for a second object within a first object using an adaptive scheduler
US9530398B2 (en) 2012-12-06 2016-12-27 White Eagle Sonic Technologies, Inc. Method for adaptively scheduling ultrasound system actions
US10076313B2 (en) 2012-12-06 2018-09-18 White Eagle Sonic Technologies, Inc. System and method for automatically adjusting beams to scan an object in a body
US9983905B2 (en) 2012-12-06 2018-05-29 White Eagle Sonic Technologies, Inc. Apparatus and system for real-time execution of ultrasound system actions
DE102014225282B4 (en) 2014-12-09 2016-07-21 Siemens Healthcare Gmbh Deformation calculation with cyclical movement of a test object
US10716544B2 (en) 2015-10-08 2020-07-21 Zmk Medical Technologies Inc. System for 3D multi-parametric ultrasound imaging
US11786202B2 (en) 2018-01-08 2023-10-17 Shenzhen Keya Medical Technology Corporation Method, system, and medium for analyzing image sequence of periodic physiological activity
US10499867B2 (en) * 2018-01-08 2019-12-10 Shenzhen Keya Medical Technology Corporation Method, storage medium, and system for analyzing image sequences of periodic physiological activities
JP2019171102A (en) * 2019-05-30 2019-10-10 Konica Minolta, Inc. Image processing device, image processing method, and program
EP3866107A1 (en) * 2020-02-14 2021-08-18 Koninklijke Philips N.V. Model-based image segmentation

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5669382A (en) * 1996-11-19 1997-09-23 General Electric Company System for measuring myocardium in cardiac images
JPH10165401A (en) * 1996-12-16 1998-06-23 Ge Yokogawa Medical Syst Ltd Ultrasonic diagnostic instrument and wall thickness measuring method

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4760548A (en) * 1986-06-13 1988-07-26 International Business Machines Corporation Method and apparatus for producing a curve image
US5214382A (en) 1990-02-23 1993-05-25 Baylor Research Institute Magnetic resonance imaging with selective contrast enhancement
US5054045A (en) 1990-11-14 1991-10-01 Cedars-Sinai Medical Center Coronary tracking display
US5090042A (en) 1990-12-24 1992-02-18 Bejjani Fadi J Videofluoroscopy system for in vivo motion analysis
GB2268351B (en) 1992-06-10 1995-09-06 Sony Broadcast & Communication Motion analysis of moving images
DE4226128A1 (en) 1992-08-07 1994-02-10 Philips Patentverwaltung Procedure for determining motion vectors
US5293574A (en) 1992-10-23 1994-03-08 General Electric Company Digital x-ray imaging system with automatic tracking
JP2942454B2 (en) 1993-05-26 1999-08-30 Matsushita Electric Works, Ltd. Shape recognition method
US5435310A (en) * 1993-06-23 1995-07-25 University Of Washington Determining cardiac wall thickness and motion by imaging and three-dimensional modeling
US5471252A (en) 1993-11-03 1995-11-28 Matsushita Electric Industrial Corporation Of America Method and apparatus for estimating motion vector fields by rejecting local outliers
NO941126D0 (en) 1994-03-25 1994-03-25 Int Digital Tech Inc The multi-domain motion estimator
US5825936A (en) * 1994-09-22 1998-10-20 University Of South Florida Image analyzing device using adaptive criteria
CA2216109A1 (en) 1995-03-22 1996-09-26 Idt International Digital Technologies Deutschland Gmbh Method and apparatus for coordination of motion determination over multiple frames
GB9906420D0 (en) 1999-03-19 1999-05-12 Isis Innovation Method and apparatus for image processing
GB0006598D0 (en) 2000-03-17 2000-05-10 Isis Innovation Three-dimensional reconstructions from images
GB0028491D0 (en) 2000-11-22 2001-01-10 Isis Innovation Detection of features in images


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
CHALANA V ET AL: "A multiple active contour model for cardiac boundary detection on echocardiographic sequences", IEEE Transactions on Medical Imaging, IEEE, New York, US, vol. 15, no. 3, 1 June 1996, pages 290-298, XP000587923, ISSN: 0278-0062 *
JACOB G ET AL: "Robust contour tracking in echocardiographic sequences", Proceedings of the Sixth International Conference on Computer Vision (IEEE Cat. No. 98CH36271), Bombay, India, 4-7 January 1998, Narosa Publishing House, New Delhi, India, pages 408-413, XP002155450, ISBN: 81-7319-221-9, cited in the application *
KASS M ET AL: "Snakes: Active contour models", Proceedings of the First International Conference on Computer Vision, London, 8-11 June 1987, IEEE Computer Society Press, Washington, US, pages 259-268, XP000971219 *
MCEACHEN J C II ET AL: "Shape-based tracking of left ventricular wall motion", IEEE Transactions on Medical Imaging, IEEE, US, vol. 16, no. 3, June 1997, pages 270-283, XP002155446, ISSN: 0278-0062 *
SETAREHDAN S K ET AL: "Automatic left ventricular feature extraction and visualisation from echocardiographic images", Computers in Cardiology, IEEE, New York, US, 1996, pages 9-12, XP000687747, ISBN: 0-7803-3711-5 *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7519206B2 (en) 2000-11-22 2009-04-14 Siemens Medical Solutions Usa, Inc. Detection of features in images
EP1757230A1 (en) * 2001-07-31 2007-02-28 Koninklijke Philips Electronics N.V. Transesophageal and transnasal, transesophageal ultrasound imaging systems
JP2003250801A (en) * 2002-03-04 2003-09-09 Aloka Co Ltd Ultrasonic diagnostic apparatus
JP2005531352A (en) * 2002-06-28 2005-10-20 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ An image processing system for displaying information on the movement of the top of a deformable three-dimensional object
EP1593087A2 (en) * 2003-01-30 2005-11-09 Chase Medical, L. P. A method and system for image processing and contour assessment
EP1593087A4 (en) * 2003-01-30 2006-10-04 Chase Medical Lp A method and system for image processing and contour assessment
WO2006046982A1 (en) * 2004-10-21 2006-05-04 Siemens Medical Solutions Usa, Inc. Automated diastolic function analysis with ultrasound
WO2006083569A1 (en) * 2005-02-03 2006-08-10 Siemens Medical Solutions Usa, Inc. Characterization of cardiac motion with spatial relationship
WO2007141038A1 (en) * 2006-06-08 2007-12-13 Tomtec Imaging Systems Gmbh Method, device and computer programme for evaluating images of a cavity
DE102007009182A1 (en) 2007-02-26 2008-08-28 Siemens Ag Cyclically moving object i.e. heart, image representation method, involves recording image data of object by single photon emission computed tomography method and assigning recorded image to different phases of motion cycle
US8290224B2 (en) 2007-02-26 2012-10-16 Siemens Aktiengesellschaft Method and device for imaging cyclically moving objects
DE102007009182B4 (en) * 2007-02-26 2016-09-22 Siemens Healthcare Gmbh Method and device for image display of cyclically moving objects
EP2679166A1 (en) * 2012-06-28 2014-01-01 Samsung Medison Co., Ltd. Diagnosis image apparatus and operation method thereof
US9508140B2 (en) 2012-08-27 2016-11-29 Agency For Science, Technology And Research Quantifying curvature of biological structures from imaging data
WO2016022052A1 (en) * 2014-08-05 2016-02-11 Inovacor Ab A cardiac state monitoring system
CN105631931A (en) * 2015-12-21 2016-06-01 电子科技大学 Low-complexity heart surface three-dimensional shape online modeling system and method thereof
CN105631931B (en) * 2015-12-21 2018-05-04 University of Electronic Science and Technology of China Low-complexity online modeling system and method for three-dimensional heart-surface shape

Also Published As

Publication number Publication date
WO2001016886A3 (en) 2001-11-01
JP2003508139A (en) 2003-03-04
GB9920401D0 (en) 1999-11-03
US7043063B1 (en) 2006-05-09
EP1212729A2 (en) 2002-06-12

Similar Documents

Publication Publication Date Title
US7043063B1 (en) Non-rigid motion image analysis
US7155042B1 (en) Method and system of measuring characteristics of an organ
Gerard et al. Efficient model-based quantification of left ventricular function in 3-D echocardiography
Lee et al. Automatic left ventricle segmentation using iterative thresholding and an active contour model with adaptation on short-axis cardiac MRI
US7822246B2 (en) Method, a system and a computer program for integration of medical diagnostic information and a geometric model of a movable body
EP1620827B1 (en) Non-invasive left ventricular volume determination
JP4918048B2 (en) Image processing apparatus and method
WO2013131420A1 (en) Device and method for determining boundary of target region of medical image
Veronesi et al. Tracking of left ventricular long axis from real-time three-dimensional echocardiography using optical flow techniques
CN101028187A (en) System and method for image based physiological monitoring of cardiovascular function
Fleagle et al. Automated identification of left ventricular borders from spin-echo magnetic resonance images: experimental and clinical feasibility studies
EP2059173B1 (en) System and method for measuring left ventricular torsion
Mazonakis et al. Development and evaluation of a semiautomatic segmentation method for the estimation of LV parameters on cine MR images
JP2007521862A (en) Stochastic analysis of cardiac function
Richens et al. Segmentation by deformable contours of MRI sequences of the left ventricle for quantitative analysis
Po et al. In-vivo clinical validation of cardiac deformation and strain measurements from 4D ultrasound
Priya et al. Automatic epicardial fat segmentation with the DOOG based confined contrast histogram (DC2H) method
Angelini et al. Comparison of segmentation methods for analysis of endocardial wall motion with real-time three-dimensional ultrasound
Pednekar et al. Intensity and morphology-based energy minimization for the automatic segmentation of the myocardium
Caiani et al. Development and validation of automated endocardial and epicardial contour detection for MRI volumetric and wall motion analysis
Lelieveldt et al. Towards 'One-Stop' Cardiac MR Image Analysis
Ye et al. 3D freehand echocardiography for automatic left ventricle reconstruction and analysis based on multiple acoustic windows
Loeckx et al. Spatiotemporal non-rigid image registration for 3D ultrasound cardiac motion estimation
Liang et al. Left ventricle myocardial border detection in three-dimensional intracardiac ultrasound images
Nesser et al. Volumetric analysis of regional left ventricular function with real-time 3D echocardiography: validation by magnetic resonance and clinical utility testing

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): JP US

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): JP US

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

WWE Wipo information: entry into national phase

Ref document number: 2000946145

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 10069291

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 2000946145

Country of ref document: EP