US20030165263A1 - Histological assessment - Google Patents

Histological assessment

Info

Publication number
US20030165263A1
US20030165263A1 (application US10/274,358)
Authority
US
United States
Prior art keywords
image data
pixels
saturation
image
hue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/274,358
Inventor
Michael Hamer
Maria Petrou
Anastasios Kesidis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qinetiq Ltd
Original Assignee
Qinetiq Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB0218909A external-priority patent/GB0218909D0/en
Priority claimed from GB0222218A external-priority patent/GB0222218D0/en
Application filed by Qinetiq Ltd filed Critical Qinetiq Ltd
Assigned to QINETIQ LIMITED: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KESIDIS, TASOS; PETROU, MARIA; HAMER, MICHAEL J.
Publication of US20030165263A1 publication Critical patent/US20030165263A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S128/00Surgery
    • Y10S128/92Computer assisted medical diagnostics
    • Y10S128/922Computer assisted medical diagnostics including image analysis

Definitions

  • This invention relates to a method, a computer program and an apparatus for histological assessment, and more particularly for making measurements upon histological imagery to provide clinical information on potentially cancerous tissue such as for example (but not exclusively) breast cancer tissue.
  • Breast cancer is a common form of female cancer: once a lesion indicative of breast cancer has been detected, tissue samples are taken and examined by a histopathologist to establish a diagnosis, prognosis and treatment plan.
  • Pathological analysis of tissue samples is a time consuming and inaccurate process. It entails interpretation of colour images by human eye, which is highly subjective: it is characterised by considerable inaccuracies in observations of the same samples by different observers and even by the same observer at different times. For example, two different observers assessing the same ten tissue samples may easily give different opinions for three of the slides, a 30% error. The problem is exacerbated by heterogeneity, i.e. complexity of some tissue sample features. Moreover, there is a shortage of pathology staff.
  • Oestrogen and progesterone receptor (ER and PR) status, C-erb-2 and vascularity are parameters which are data of interest for assisting a clinician to formulate a diagnosis, prognosis and treatment plan for a patient.
  • C-erb-2 is also known as Cerb-B2, her-2, her-2/neu and erb-2.
  • the present invention provides a method of measuring oestrogen or progesterone receptor (ER or PR) status having the steps of:
  • the invention provides the advantage that it is computer-implementable, and hence is carried out in a way which avoids the subjectivity of a manual inspection process.
  • the invention may provide a method of measuring ER or PR status having the steps of:
  • the invention may provide a method of measuring ER or PR status having the steps of:
  • Step b) may be implemented using a K-means clustering algorithm employing a Mahalanobis distance metric.
  • Step c) may be implemented by transforming the image data into a chromaticity space, and deriving hue and saturation from image pixels and a reference colour.
  • Hue may be obtained from an angle φ equal to

    $\varphi = \sin^{-1}\left(\dfrac{\tilde{x}\,y - x\,\tilde{y}}{\sqrt{\tilde{x}^2 + \tilde{y}^2}\,\sqrt{x^2 + y^2}}\right)$

  • where (x, y) and (x̃, ỹ) are respectively image pixel coordinates and reference colour coordinates in the chromaticity space. Hue may be adapted to lie in the range 0 to 90 degrees, and a hue threshold of 80 degrees may be set in step d).
  • a saturation threshold S₀ may be set in step d), S₀ being 0.9 for saturation in the range 0.1 to 1.9 and 0 for saturation outside this range.
  • the fraction of pixels corresponding to preferentially stained cells may be determined by counting the number of pixels having both saturation greater than a saturation threshold and hue modulus less than a hue threshold and expressing such number as a fraction of a total number of pixels in the image: it may be awarded a score 0, 1, 2, 3, 4 or 5 according respectively to whether it is (i) 0, (ii) >0 and <0.01, (iii) ≥0.01 and ≤0.10, (iv) ≥0.11 and ≤0.33, (v) ≥0.34 and ≤0.66 or (vi) ≥0.67 and ≤1.0.
  • Normalised average saturation may be accorded a score 0, 1, 2 or 3 according respectively to whether it is (i) ≤25%, (ii) >25% and ≤50%, (iii) >50% and ≤75% or (iv) >75% and ≤100%.
  • Scores for normalised average saturation and fraction of pixels corresponding to preferentially stained cells may be added together to provide a measurement of ER or PR.
  • the method of the invention may include measuring C-erb-2 status by the following steps:
  • the method of the invention may include measuring vascularity by the following steps:
  • the invention provides a method of measuring C-erb-2 status having the steps of:
  • the window functions may have non-zero values extending over 6, 12, 24 and 48 pixels respectively and zero values elsewhere. Pixels associated with a cell boundary are identified from a maximum correlation with a window function, the window function having a length which provides an estimate of cell boundary width.
  • the brightness-related measure of cell boundary brightness and sharpness may be computed in step d) using a calculation including dividing cell boundaries by their respective widths to provide normalised boundary magnitudes, selecting a fraction of the normalised boundary magnitudes each greater than unselected equivalents and summing the normalised boundary magnitudes of the selected fraction.
  • a brightness-related measure of brightness extent around cell boundaries may be computed using a calculation including dividing normalised boundary magnitudes into different magnitude groups each associated with a respective range of magnitudes, providing a respective magnitude sum of normalised boundary magnitudes for each magnitude group, and subtracting a smaller magnitude sum from a larger magnitude sum.
  • the comparison image having brightness-related measures closest to those determined for the image data may be determined from a Euclidean distance between the brightness-related measures of the comparison image and the image data.
  • step b), identifying in the image data contiguous pixel groups corresponding to respective cell nuclei, may be carried out by an adaptive thresholding technique arranged to maximise the number of contiguous pixel groups identified.
  • the adaptive thresholding technique may include:
  • the first three pairs of RMM and CMM values may be 0.802 and 1.24, 0.903 and 0.903, and 1.24 and 0.802 respectively.
  • Brown pixels may be removed from the thresholded red image if like-located pixels in the cyan image are less than CMM·μ_C; edge pixels may be removed likewise if like-located pixels in a Sobel-filtered cyan image having a standard deviation σ_C are greater than that image's mean plus 1.5σ_C. Pixels corresponding to lipids may also be removed if their red, green and blue pixel values are all greater than the sum of the relevant colour's minimum value and 98% of its range of pixel values in each case.
  • the thresholded red image may be subjected to a morphological closing operation.
  • the present invention provides a method of measuring vascularity having the steps of:
  • the image data may comprise pixels with red, green and blue values designated R, G and B respectively, characterised in that a respective saturation value S is derived in step b) for each pixel by:
  • Hue values designated H may be derived by:
  • the step of producing a segmented image may be implemented by designating for further processing only those pixels having both a hue H in the range 282-356 degrees and a saturation S in the range 0.2 to 0.24.
  • the step of identifying in the segmented image groups of contiguous pixels may include the step of spatially filtering such groups to remove groups having insufficient pixels to contribute to vascularity.
  • the step of determining vascularity may include treating vascularity as having a high or a low value according to whether or not it is at least 31%.
  • the present invention provides a computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of:
  • the present invention provides a computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of:
  • the present invention provides a computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of:
  • the present invention provides a computer program for use in measuring C-erb-2 status arranged to control computer apparatus to execute the steps of:
  • the present invention provides a computer program for use in measuring vascularity arranged to control computer apparatus to execute the steps of:
  • the present invention provides an apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of:
  • the present invention provides an apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of:
  • the present invention provides an apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of:
  • the present invention provides an apparatus for measuring C-erb-2 status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of:
  • the present invention provides an apparatus for measuring vascularity including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, characterised in that the computer apparatus is also programmed to execute the steps of:
  • the computer program and apparatus aspects of the invention may have preferred features corresponding to those of respective method aspects.
  • FIG. 1 is a block diagram of a procedure for measuring indications of cancer to assist in formulating diagnosis and treatment
  • FIG. 2 is a block diagram of a process for measuring ER and PR receptor status in the procedure of FIG. 1;
  • FIG. 3 is a pseudo three dimensional view of a red, green and blue colour space (colour cube) plotted on respective orthogonal axes;
  • FIG. 4 is a transformation of FIG. 3 to form a chromaticity space
  • FIG. 5 is a drawing of a chromaticity space reference system
  • FIG. 6 illustrates use of polar co-ordinates
  • FIG. 7 is a block diagram of a process for measuring C-erb-2 in the procedure of FIG. 1;
  • FIG. 8 is a block diagram of a process for measuring vascularity in the procedure of FIG. 1.
  • A procedure 10 for the assessment of tissue samples in the form of histopathological slides of potential carcinomas of the breast is shown in FIG. 1.
  • This drawing illustrates processes which generate measurements of specialised kinds for use by a pathologist as the basis for assessing patient diagnosis, prognosis and treatment plan.
  • the procedure 10 employs a database which maintains digitised image data obtained from histological slides as will be described later. Sections are taken (cut) from breast tissue samples (biopsies) and placed on respective slides. Slides are stained using a staining agent selected from the following depending on which parameter is to be determined:
  • ER-DAB: Oestrogen receptor
  • PR: Progesterone receptor
  • a clinician places a slide under a microscope and examines a region of it (referred to as a tile) at a magnification of ×40 for indications of C-erb-2, ER and PR status and at ×20 for vascularity.
  • the present invention requires data from histological slides in a suitable form.
  • image data were obtained by a pathologist using a Zeiss Axioskop microscope with a Jenoptik Progres 3012 digital camera.
  • Image data from each slide is a set of digital images obtained at a linear magnification of 40 (i.e. 40×), each image being an electronic equivalent of a tile.
  • a pathologist scans the microscope over a slide, and at 40× magnification selects regions (tiles) of the slide which appear to be most promising in terms of an analysis to be performed. Each of these regions is then photographed using the microscope and digital camera referred to above, which produces for each region a respective digitised image in three colours, i.e. red, green and blue (R, G & B). Three intensity values are obtained for each pixel in a pixel array to provide an image as a combination of R, G and B image planes. This image data is stored temporarily at 12 for later use.
  • Three tiles are required for vascularity measurement at 14, and one tile for each of oestrogen and progesterone receptor measurement at 16 and C-erb-2 measurement at 18. These measurements provide input to a diagnostic report at 20.
  • the prior art manual procedure for scoring C-erb-2 involves a pathologist subjectively and separately estimating stain intensity, stain location and relative number of cells associated with a feature of interest in a tissue sample. The values obtained in this way are combined by a pathologist to give a single measurement for use in diagnosis, prognosis and reaching a decision on treatment.
  • the process hereinafter described in this example replaces the prior art manual procedure with an objective procedure.
  • processing 16 to determine ER status begins with a pre-processing stage 30 in which a K-means clustering algorithm is applied to a colour image using a Mahalanobis metric. This determines or cues image regions of interest for further processing by associating pixels into clusters on the basis of their having similar values of the Mahalanobis metric.
  • the colour image is transformed into a chromaticity space which includes a location of a reference colour.
  • Hue and saturation are calculated at 34 for pixels in clusters cued by K-means clustering.
  • the number of brown stained pixels is computed at 36 by thresholding on the basis of hue and saturation.
  • An ER status measurement is then derived at 38 from a combination of the fraction of stained pixels and average colour saturation.
  • the input for the ER preprocessing stage 30 consists of raw digital data files of a single histopathological colour image or tile.
  • a triplet of image band values for each pixel represents the colour of that pixel in its red, green and blue spectral components or image bands. These values in each of the three image bands are in the range [0, 255], where [0,0,0] corresponds to black and [255,255,255] corresponds to white.
  • the K-means clustering algorithm 30 is applied to the digital colour image using four clusters and the Mahalanobis distance metric.
  • a cluster is a natural grouping of data having similar values of the relevant metric.
  • the Mahalanobis distance metric is a measurement that gives an indication of degree of closeness of data items to a cluster centre. It is necessary to have some means for locating cell nuclei as pixel groups but it is not essential to use four clusters or the Mahalanobis distance metric: these have been found to work well in identifying groups of contiguous pixels which correspond to respective cell nuclei.
  • the K-means algorithm is described by J. A. Hartigan and M. A. Wong, in a paper entitled ‘A K-means clustering algorithm’, Algorithm AS 136, Applied Statistics Journal, 1979.
  • the Mahalanobis distance metric is described by F.
  • cluster centres are set using 30 − (cluster number + 1) × 10 subtracted from the mean of the red, green and blue image bands respectively.
  • the first cluster values would therefore be set at mean_red − 30 + (0 + 1) × 10 (hence mean_red − 20), and similarly for mean_green and mean_blue.
  • the second cluster would be mean_red − 10, mean_green − 10 and mean_blue − 10, and similarly for other clusters. Pixels are then assigned to clusters for later readjustment.
  • Σ^k_ij is the ij th element of the covariance matrix for cluster k,
  • N_k is the number of pixels in cluster k,
  • C_li and C_lj are the values of pixel l in image bands i and j,
  • i, j take values 1, 2, 3, which represent the red, green and blue image bands respectively,
  • μ^k_i is the mean of all pixels in image band i belonging to cluster k, and
  • μ^k_j is the mean of all pixels in image band j belonging to cluster k.
  • each pixel x⃗_i is now treated as a vector having three elements x_i,1, x_i,2, x_i,3, which are the red (x_i,1), green (x_i,2) and blue (x_i,3) pixel values: the red, green and blue image bands are therefore represented by second subscript indices 1, 2 and 3 respectively.
  • μ⃗^k is the mean of all pixel vectors x⃗_i in cluster k, and
  • t indicates the transpose of the difference vector (x⃗_i − μ⃗^k); the Mahalanobis distance of Equation (2) is then $d_k(\vec{x}_i) = (\vec{x}_i - \vec{\mu}^k)^t\,(\Sigma^k)^{-1}\,(\vec{x}_i - \vec{\mu}^k)$.
  • Equation (2) is re-evaluated for the same pixel vector x⃗_i in all other clusters also.
  • Pixel vector x⃗_i has the highest likelihood of belonging to the cluster (denoted k_m) for which d_k(x⃗_i) has a minimum value, i.e. d_{k_m}(x⃗_i) = min over k of d_k(x⃗_i); cluster k_m is then the most suitable to receive pixel x⃗_i.
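  • A minimal Python sketch of this assignment step, assuming four clusters and NumPy; the function names and array layout are illustrative, not taken from the patent:

```python
import numpy as np

def initial_means(image):
    """Initial cluster centres: each band mean - 30 + (cluster number + 1) * 10."""
    band_means = image.reshape(-1, 3).mean(axis=0)
    return np.array([band_means - 30 + (k + 1) * 10 for k in range(4)])

def assign_pixels(pixels, means, covs):
    """pixels: (N, 3) RGB vectors; means: (K, 3); covs: (K, 3, 3) covariance
    matrices. Returns the index k_m of the nearest cluster for every pixel,
    using the squared Mahalanobis distance of Equation (2)."""
    dists = np.empty((pixels.shape[0], means.shape[0]))
    for k, (mu, cov) in enumerate(zip(means, covs)):
        diff = pixels - mu                       # (x_i - mu_k)
        inv = np.linalg.inv(cov)                 # (Sigma^k)^-1
        dists[:, k] = np.einsum('ni,ij,nj->n', diff, inv, diff)
    return dists.argmin(axis=1)                  # cluster k_m per pixel
```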
  • each RGB image is transformed into a chromaticity space.
  • FIG. 3 shows an RGB cube 50 in which red, green and blue pixel values (expressed as R, G and B respectively) are normalised and represented as values in the range 0 to 1. These pixel values are represented on red, green and blue axes 52, 54 and 56 respectively.
  • FIG. 4 shows the axes 52, 54 and 56 and chromaticity space 58 looking broadly speaking along a diagonal of the RGB cube 50 from the point (1,1,1) (not shown) to the origin (0,0,0), now referenced O for convenience.
  • the points (0,0,1), (0,1,0) and (1,0,0) in FIG. 3 are now referenced J, K and L respectively.
  • D is a midpoint of a straight line between J and L.
  • Image pixel values from the input RGB image are projected on to the chromaticity space 58 and the resulting projections become data points for further processing.
  • the co-ordinate system origin is the centre of gravity G of the triangle 58 .
  • hue H is defined as the angle φ between the radius vector (e.g. QP) to an image pixel's point and the radius vector QS to the reference colour.
  • For convenience the definition of hue H is now altered somewhat to render all values positive: the transformation of earlier values φ into a new version θ is shown in Table 1 below.

    TABLE 1
      Condition                    Magnitude of θ (new hue H)
      sin φ > 0 and cos φ > 0      φ
      sin φ > 0 and cos φ ≤ 0      π − φ
      sin φ ≤ 0 and cos φ > 0      −φ
      sin φ ≤ 0 and cos φ ≤ 0      π + φ
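  • A sketch of the hue and saturation computation for one chromaticity point (x, y) against a reference colour (x_ref, y_ref), assuming the Table 1 folding as reconstructed above; names are illustrative:

```python
import numpy as np

def hue_saturation(x, y, x_ref, y_ref):
    """Hue (degrees) and saturation relative to the reference colour."""
    cross = x_ref * y - x * y_ref                  # proportional to sin(angle)
    dot = x * x_ref + y * y_ref                    # proportional to cos(angle)
    phi = np.arcsin(cross / (np.hypot(x_ref, y_ref) * np.hypot(x, y)))
    if cross > 0:
        hue = phi if dot > 0 else np.pi - phi      # Table 1, rows 1 and 2
    else:
        hue = -phi if dot > 0 else np.pi + phi     # Table 1, rows 3 and 4
    saturation = dot / np.hypot(x_ref, y_ref)      # projection onto reference
    return np.degrees(hue), saturation
```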
  • a hue (H) threshold θ₀ is set at 36 by a user or programmer of the procedure as being not more than π/2, a typical value which might be chosen being 80 degrees.
  • the thresholds are used to count selectively the number N_b of pixels which are sufficiently brown (having a large enough value of saturation) having regard to the reference colour. All H and S pixel values in the image are assessed. The conditions to be satisfied by a pixel's hue and saturation values for it to be counted in the brown pixel number N_b are set out in Table 3 below.

    TABLE 3
      Condition                                              Action
      Pixel with both hue modulus |H| less than the hue      Treat as a "saturated" pixel;
      threshold θ₀ and saturation S greater than the         count it in N_b
      saturation threshold S₀
      Any other pixel                                        Do not count
  • the average saturation of the N_b saturated pixels determined in Table 3 is computed by adding all their saturation values S together and dividing the resulting sum by N_b.
  • the maximum saturation value of the saturated pixels is then determined, and the average saturation is normalised by expressing it as a percentage of this maximum: this approach is used to counteract errors due to variation in colour staining between different images.
  • the normalised average saturation is then accorded a score at 38 of 0, 1, 2 or 3 according respectively to whether this percentage is (a) ≤25%, (b) >25% and ≤50%, (c) >50% and ≤75% or (d) >75% and ≤100%.
  • The fraction of saturated pixels, those corresponding to cells stained sufficiently brown relative to surrounding tissue, is computed at 38 from the ratio N_b/N, where N is the total number of pixels in the image. This fraction is then quantised to a score in the range 0 to 5 as set out in Table 5 below.

    TABLE 5
      N_b/N: fraction of image pixels that are stained    Score
      0.00                                                0
      >0.00 and <0.01                                     1
      0.01-0.10                                           2
      0.11-0.33                                           3
      0.34-0.66                                           4
      0.67-1.00                                           5
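  • A sketch of the two quantisations and their combination into the overall ER/PR score (0 to 8); the handling of fractions falling between 0.10 and 0.11 is an assumption, since Table 5 leaves that gap open:

```python
def er_score(norm_avg_sat_pct, stained_fraction):
    """Saturation score (0-3) plus stained-fraction score (0-5)."""
    if norm_avg_sat_pct <= 25:
        sat_score = 0
    elif norm_avg_sat_pct <= 50:
        sat_score = 1
    elif norm_avg_sat_pct <= 75:
        sat_score = 2
    else:
        sat_score = 3

    f = stained_fraction                 # N_b / N
    if f == 0:
        frac_score = 0
    elif f < 0.01:
        frac_score = 1
    elif f <= 0.10:
        frac_score = 2
    elif f <= 0.33:                      # fractions in (0.10, 0.11) score 3 here
        frac_score = 3
    elif f <= 0.66:
        frac_score = 4
    else:
        frac_score = 5
    return sat_score + frac_score
```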
  • Women with an ER score of 7 or 8 will respond favourably to hormonal treatment such as Tamoxifen; women with an ER score in the range 4 to 6 will have a 50% chance of responding to this treatment. Women scoring 2 or 3 will not respond very well, and those scoring 0 or 1 will not respond to hormonal treatment at all.
  • For C-erb-2, the conventional manual technique involves processing a histopathological slide with chemicals to stain it appropriately, after which it is viewed by a clinician.
  • Breast cells on the slide will have stained nuclei with a range of areas which allows discrimination between tissue cells of interest and unwanted cell types which are not important to cancer assessment.
  • Cancerous cells will usually have a larger range of sizes of nuclei which must be allowed for in the discrimination process.
  • Score    Staining pattern
    0        membrane staining in less than 10% of cells
    1        just perceptible membrane staining in more than 10% of cells but membranes incompletely stained
    2        weak to moderate complete membrane staining of more than 10% of cells
    3        strong complete membrane staining of more than 10% of cells
  • Scores 0 and 1 are negative (not justifying treatment), whereas scores 2 and 3 are called positive (justifying treatment).
  • In a crushing artefact, the tissue is inadvertently mechanically deformed, producing more ill-defined staining.
  • An optional preprocessing step 70 is carried out if images of tiles are poor due to camera vignetting or colour errors across the image.
  • Image segmentation is carried out in steps 71 to 78 , i.e. automated separation of objects from a background in a digital image.
  • the original digital image of a tile has red, green and blue image planes: from the green and blue image planes a cyan image plane is derived at 71 and a Sobel-filtered cyan image plane at 72 .
  • Statistical measures of the five image planes are computed at 74 and 76 , and then a segmented image is optimised and generated at 78 which has been filtered to remove unwanted pixels and spatial noise.
  • Step 78 is an adaptive thresholding technique using information from regions around pixels: it is shown in more detail within chain lines 80 with arrows 82 indicating iterations. It is an alternative to the K-means clustering algorithm previously described, which could also be used.
  • if the number of cell nuclei found is less than sixteen, the image is rejected at 86; if it is 16 or greater, then having found the cell nuclei, and hence the cells, the strength, thinness and completeness of each cell's surrounding membrane staining are measured and the membrane stainings are then ranked.
  • a sequence of cross-correlation windows of varying widths is passed along four radii from the cell centroid to determine the cell boundary brightness value, membrane width and distance from the centroid of the most intense staining.
  • Cell boundary brightness value is normalised by dividing by membrane width, and nuclear area and sum of normalised boundary brightness values are then obtained.
  • Statistical measures characterising membrane-staining strength, specificity and completeness are then deduced: these measures are compared with equivalents obtained from four reference images. The measured image is then graded by assigning it the score of the closest reference image, closeness being assessed with a Euclidean distance metric. Other metrics may also be used. Alternatively, the scores of a moderately large sample may be used as references.
  • the C-erb-2 process will now be described in more detail.
  • the process 18 is applied to one image or tile obtained by magnifying by a factor of 40 an area of a histological slide.
  • the optional preprocessing step 70 is carried out by either:
  • (b) preferably, if sufficient images are available from the same camera objective lens, computing its deficiency and correcting it, rather than processing sub-images with more part-cells split across boundaries.
  • the digital image of a slide is a three colour or red green and blue (RGB) image as defined above, i.e. there is a respective image plane for each colour.
  • RGB red green and blue
  • the letters R, G and B for each pixel are treated as the red, green and blue intensities at that pixel.
  • Cyan is used because it is a complementary colour to brown, which is the cell boundary colour produced by conventional chemical staining of a specimen.
  • the blue image plane could be used instead but does not normally produce results as good as the cyan image. If a different colour staining were to be used, the associated complementary colour image would be selected. This process step is not essential, but it greatly assists filtering out unwanted pixels, and it does so without a reference colour (see the ER/PR example which uses an alternative approach).
  • a Sobel edge filter is applied to the cyan image plane: this is a standard image processing technique published in Klette R., & Zamperoni P., ‘Handbook of image processing operators’, John Wiley & Sons, 1995.
  • a Sobel edge filter consists of two 3×3 arrays of numbers S_P and S_Q, each of which is convolved with successive 3×3 arrays of pixels in an image:

    $S_P = \begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{bmatrix} \qquad S_Q = \begin{bmatrix} 1 & 0 & -1 \\ 2 & 0 & -2 \\ 1 & 0 & -1 \end{bmatrix} \qquad (11)$
  • the step 72 initially selects a first cyan 3×3 array of pixels in the top left hand corner of the cyan image: designating as C_ij a general cyan pixel in row i and column j, the top left hand corner of the image consists of pixels C_11 to C_13, C_21 to C_23 and C_31 to C_33.
  • each C_ij is then multiplied by the respective digit of S_P located in the S_P array as C_ij is in the 3×3 cyan pixel array: i.e. C_11 to C_13 are multiplied by 1, 2 and 1 respectively, C_21 to C_23 by zeroes and C_31 to C_33 by −1, −2 and −1 respectively.
  • the products so formed are added algebraically and provide a value p.
  • a general pixel T_ij (row i, column j) in the transformed image is derived from C_i−1,j−1 to C_i−1,j+1, C_i,j−1 to C_i,j+1 and C_i+1,j−1 to C_i+1,j+1 of the cyan image. Because the central row and column of the Sobel filters in Equation (11) respectively are zeros, and the other coefficients are 1s and 2s, p and q for T_ij can each be calculated from only six neighbouring pixels using additions, subtractions and doublings.
  • the Sobel filter cannot calculate values for pixels at image edges having no adjacent pixels on one or other of their sides: i.e. in a pixel array having N rows and M columns, the edge pixels are the top row (T_11 to T_1M), the bottom row (T_N1 to T_NM), the first column (T_11 to T_N1) and the last column (T_1M to T_NM) of the transformed image. By convention in Sobel filtering these edge pixels are set to zero.
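  • A sketch of the cyan plane derivation and Sobel filtering, assuming SciPy; the combination of p and q into T_ij is not reproduced in this extract, so the usual gradient magnitude √(p² + q²) is assumed:

```python
import numpy as np
from scipy.ndimage import convolve

SP = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]])   # S_P of Equation (11)
SQ = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]])   # S_Q of Equation (11)

def sobel_cyan(rgb):
    """rgb: (H, W, 3) float array. Returns the cyan plane and its Sobel image."""
    g, b = rgb[..., 1], rgb[..., 2]
    cyan = (2 * b + g) / 3                 # the (2B+G)/3 cyan image used later
    p = convolve(cyan, SP)                 # kernel rotation only flips the sign,
    q = convolve(cyan, SQ)                 # which the magnitude ignores
    t = np.hypot(p, q)                     # assumed combination of p and q
    t[0, :] = t[-1, :] = t[:, 0] = t[:, -1] = 0   # edge pixels set to zero
    return cyan, t
```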
  • a major problem with measurements on histopathological images is that the staining of different slides can vary enormously, e.g. from blue with dark spots to off-white with brown outlines. The situation can be improved by sifting the slides and using only those that conform to a predetermined colouration. However, it has been found that it is possible to cope with variation in staining to a reasonable extent by using statistical techniques to normalise images: in this connection steps 74 and 76 derive a variety of statistical parameters for use in image segmentation in step 78 .
  • At step 74 the mean and standard deviation of the transformed pixel values T_ij are computed.
  • For this purpose a two dimensional image of N rows and M columns is treated as a single composite line composed of successive rows of the image.
  • x is substituted for T in each pixel value, so T_ij becomes x_k.
  • Equations (14) and (15) respectively are used for computing the mean μ and standard deviation σ of the transformed pixels x_k.
  • the statistical parameters are the mean μ_R and standard deviation σ_R of its pixel values: in Equations (14) and (15), x_k represents a general pixel value in the Red image plane.
  • the Red image plane's pixels are compared with one another to obtain their maximum, minimum and range (maximum − minimum).
  • pixels in each of the Green and Blue image planes are compared with one another to obtain a respective maximum, minimum and range for each plane.
  • pixels' mean and standard deviation are computed using Equations (14) and (15), in which x_k represents a general pixel value in the Cyan image plane.
  • at step 78 the image is segmented to identify and locate cell nuclei.
  • a pixel is counted as part of a cell nucleus if and only if it survives a combination of thresholding operations on the Red, Green, Blue, Cyan and Sobel of Cyan image planes followed by closure of image gaps left after thresholding operations. It is necessary to determine threshold values in a way which allows for variation in chemical staining between different images.
  • the technique employed in this example is to perform a multidimensional optimisation of some thresholds with nuclei-number as the objective-function to be maximised: i.e. for a given image, threshold values are altered intelligently until a near maximum number of nuclei is obtained.
  • Starting values are computed for the optimisation routines by choosing those suitable for provision of threshold levels.
  • two dimensional optimisation is used requiring three starting values indicated by suffixes 1, 2 and 3 and each with two components: the starting values represent vertices of a triangle in a two dimensional plane.
  • the starting values are RMM1/CMM1, RMM2/CMM2 and RMM3/CMM3, RMM indicating a "Red Mean Multiplier" and CMM indicating a "Cyan Mean Multiplier".
  • (a) Produce a thresholded image for the Red image plane (approximately complementary to Blue) as follows: for every Red pixel value that is less than an adaptive threshold, set the corresponding pixel location in the thresholded Red image to 1, otherwise set the latter to 0.
  • a respective adaptive threshold is computed separately for every pixel location as follows.
  • the Red image threshold value is dependent on the presence of enclosing brown stain in the neighbourhood of each pixel, i.e. it is a function of the Cyan mean μ_C and the Red mean μ_R.
  • a check for enclosing brown is performed by searching radially outwards from a pixel under consideration.
  • the procedure is, in the Cyan image plane, to select the same pixel location as in the Red image plane and from it to search in four directions (north, south, east and west) for a distance of seventy pixels (or as many as are available up to seventy).
  • north, south, east and west have the following meanings: north: upward from the pixel in the same column; south: downward from the pixel in the same column; east: rightward from the pixel in the same row; and west: leftward from the pixel in the same row.
  • More directions e.g. diagonals north-east, north-west, south-east and south-west could be used to improve accuracy but four have been found to be adequate for the present example.
  • in each direction, either a cyan pixel will fall below a threshold (indicating a brown pixel) or a radius of 70 pixels will be reached without a cyan pixel doing so.
  • the number R_B of "brown" radii is then used to change the red threshold adaptively in the following way:
  • RTN is the Red image plane threshold,
  • RMM1·μ_R is the product of RMM1 and the Red image plane mean μ_R, and
  • σ_R is the standard deviation of the Red image plane.
  • a limit is placed on RTN giving it a maximum possible value of 255. If the Red image plane pixel under consideration is less than the Red image plane threshold calculated for it, the corresponding pixel at the same location in the thresholded Red image is set to one, otherwise it is set to zero.
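  • A sketch of the radial search for enclosing brown stain; the exact formula relating RTN to R_B is not reproduced in this extract, so rtn() below is a labelled placeholder rather than the patent's formula:

```python
import numpy as np

def brown_radii(cyan, i, j, mu_c, cmm, reach=70):
    """Count the north/south/east/west radii (up to 70 pixels) on which a
    cyan pixel falls below CMM * mu_C, indicating enclosing brown stain."""
    rays = [cyan[max(i - reach, 0):i, j],        # north
            cyan[i + 1:i + 1 + reach, j],        # south
            cyan[i, j + 1:j + 1 + reach],        # east
            cyan[i, max(j - reach, 0):j]]        # west
    return sum(1 for ray in rays if (ray < cmm * mu_c).any())

def rtn(r_b, mu_r, sigma_r, rmm):
    """Placeholder adaptive Red threshold: assumed to be some increasing
    function of the brown-radius count R_B, capped at 255 as stated above."""
    return min(rmm * mu_r + r_b * 0.25 * sigma_r, 255.0)
```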
  • Pixels corresponding to lipids are now removed as follows: using the pixel minimum and range values computed at step 76 , a thresholded Red image is produced using data obtained from the Red, Green and Blue image planes: for each Red, Green and Blue pixel group at a respective pixel location that satisfies all three criteria at (i) to (iii) below, set the pixel at the corresponding location in the thresholded Red image to 0, otherwise do not change the pixel; this has the effect of removing lipid image regions (regions of fat which appear as highly saturated white areas). Removal of these regions is not essential but is desirable to improve processing.
  • the criteria for each set of Red, Green and Blue values at a respective pixel are that (i) the Red, (ii) the Green and (iii) the Blue value each exceed the sum of the relevant colour's minimum value and 98% of its range of pixel values, as computed at step 76.
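  • A sketch of this lipid-removal criterion, with the per-colour minima and ranges computed over the whole tile as at step 76:

```python
import numpy as np

def remove_lipids(thresholded_red, rgb):
    """Zero pixels whose R, G and B values all exceed that colour's
    minimum plus 98% of its range (highly saturated white fat regions)."""
    flat = rgb.reshape(-1, 3)
    mins = flat.min(axis=0)
    ranges = flat.max(axis=0) - mins
    lipid = (rgb > mins + 0.98 * ranges).all(axis=-1)
    out = thresholded_red.copy()
    out[lipid] = 0
    return out
```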
  • Steps (c) and (d) could be moved outside the recursion loop defined within chain lines 80 if desired, with consequent changes to the procedure.
  • (e) The next step is to apply to the binary image obtained at step (d) of 78 above a morphological closing operation, which consists of a dilation operation followed by an erosion operation.
  • morphological operations fuse narrow gaps and eliminate small holes in individual groups of contiguous pixels appearing as blobs in an image. They are not essential but they improve processing. They can be thought of as removal of irregularities or spatial “noise”, and they are standard image processing procedures published in Umbaugh S. C., ‘Colour vision and image processing’, Prentice Hall, 1998.
  • a connected component labelling process is now applied to the binary image produced at step (e).
  • This is a known image processing technique (sometimes referred to as 'blob colouring') published by R Klette and P Zamperoni, 'Handbook of Image Processing Operators', John Wiley & Sons, 1996, and A Rosenfeld and A C Kak, 'Digital Picture Processing', Vols. 1 & 2, Academic Press, New York, 1982. It gives numerical labels to "blobs" in the binary image, blobs being regions or groups of like-valued contiguous or connected pixels in an image: i.e. each group or blob consists of connected pixels which are all 1s, and each is assigned a number different to those of other groups.
  • the number of labelled image regions or blobs in the image is computed from the labels and output. Connected component labelling also determines each labelled image region's centroid (pixel location of region centre), height, width and area. Image regions are now removed from the binary image if they are not of interest because they are too small or too large in area or they have sufficiently dissimilar height and width indicating they are flattened. The remaining regions in the binary image pass to the next stage of processing at (g).
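  • A sketch of steps (e) and (f) using SciPy's morphology and labelling routines; the area and height/width limits are illustrative assumptions, since no numerical limits are given here:

```python
import numpy as np
from scipy.ndimage import binary_closing, label, find_objects

def label_nuclei(binary, min_area=50, max_area=5000, max_aspect=4.0):
    closed = binary_closing(binary)          # step (e): dilation then erosion
    labels, n = label(closed)                # step (f): number each blob
    kept = 0
    for k, sl in enumerate(find_objects(labels), start=1):
        blob = labels[sl] == k
        h, w = blob.shape                    # bounding-box height and width
        aspect = max(h, w) / max(min(h, w), 1)
        if min_area <= blob.sum() <= max_area and aspect <= max_aspect:
            kept += 1
        else:
            labels[sl][blob] = 0             # remove too small/large/flattened
    return labels, kept                      # kept is the objective to maximise
```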
  • Steps (a) to (f) are carried out for all three starting points or triangle vertices RMM1/CMM1, RMM2/CMM2 and RMM3/CMM3: this yields three values for the number of regions remaining in the binary image in each case.
  • This step is referred to as the Downhill Simplex method: it is a standard iterative statistical technique for multidimensional optimisation published in Nelder J. A. and Mead R., Computer Journal, vol. 7, pp 308-313, 1965. It takes as input the three numbers of regions remaining after step (f). It is possible to use other optimisation techniques, such as Powell's method. The starting point/vertex yielding the lowest number of regions remaining is then selected. A new starting point is then generated as the reflection of the selected vertex in the line joining the two other vertices: i.e.
  • the new starting point is 2,2.
  • the selected vertex is then discarded and the other two retained.
  • the new starting point or vertex becomes RMM4/CMM4 and steps (a) to (f) are repeated using it to generate a new number of regions remaining for comparison with those associated with the two retained vertices. Again a vertex yielding the lowest number of regions remaining is selected, and the process of new RMM/CMM values and steps (a) to (f) is iterated as indicated by arrows 82 . Iterations continue until the rate of change of remaining number of image regions (cell nuclei number) slows down, i.e. when successive iterations show a change of less than 10% in this number: at that point optimisation is terminated and the binary image remaining after step (f) selected for further processing is that generated using the RMM/CMM values giving the highest nuclei number.
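  • A sketch of this optimisation delegating the Downhill Simplex iteration to SciPy's Nelder-Mead implementation (with its default stopping rule rather than the 10% rule above); count_regions is assumed to run steps (a) to (f) for given multipliers and return the number of regions remaining:

```python
import numpy as np
from scipy.optimize import minimize

START_SIMPLEX = np.array([[0.802, 1.24],     # the three RMM/CMM vertex
                          [0.903, 0.903],    # pairs given earlier
                          [1.24, 0.802]])

def optimise_multipliers(count_regions):
    result = minimize(lambda v: -count_regions(rmm=v[0], cmm=v[1]),
                      x0=START_SIMPLEX[0], method='Nelder-Mead',
                      options={'initial_simplex': START_SIMPLEX})
    return result.x, int(-result.fun)        # best RMM/CMM and region count
```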
  • the procedure 18 is now concerned with determining quantities referred to as "grand_mean" and "mean_range", to be defined later. If the Downhill Simplex method (g) has determined that there are fewer than a user-specified number of image regions or cell nuclei, sixteen in the present example, then at 84 processing is switched to 86, indicating a problem image which is to be rejected.
  • If the Downhill Simplex method has determined that there are at least sixteen image regions, then at 84 processing is switched to 88 where a search to characterise these regions' boundaries is carried out.
  • the search uses each region's area and centroid pixel location as obtained in connected component labelling at 78 ( f ), and each region is assumed to be a cell with a centroid which is the centre of the cell's nucleus. This assumption is justified for most cells, but there may be misshapen cells for which it does not hold: it is possible to discard misshapen cells by eliminating those with concave boundary regions for example, but this is not implemented in the present example.
  • the search to characterise the regions' boundaries is carried out along the respective north, south, east and west directions (as defined earlier) from the centroid (more directions may be used to improve accuracy): it is carried out in each of these directions for a distance Δ which is either 140 pixels or 2√(region area), whichever is the lesser. It employs the original (2B+G)/3 cyan image because experience shows that this image gives the best defined cell boundaries with the slide staining previously described.
  • designating C_ij as the intensity of a region's centroid pixel in the cyan image at row i and column j, pixels to be searched north, south, east and west of this centroid will have intensities in the cyan image of C_i+1,j to C_i+Δ,j, C_i−1,j to C_i−Δ,j, C_i,j+1 to C_i,j+Δ and C_i,j−1 to C_i,j−Δ respectively.
  • the cyan intensity of each of the pixels to be searched is subtracted from the centroid pixel's cyan intensity C ij to produce a difference value, which may be positive or negative.
  • a cell nucleus is normally blue whereas a boundary is brown (with staining as described earlier).
  • Each pixel is then treated as being part of four linear groups or “windows” of six, twelve, twenty-four and forty-eight pixels each including the pixel and extending from it in a continuous line north, south, east or west (as defined earlier) according respectively to whether the pixel is north, south, east or west of the centroid.
  • pixels in each of the chosen directions have mathematical window functions applied to them, the function having the value 1 at pixels within a group and the value 0 outside it.
  • C_i+1,j is for example grouped with C_i+2,j to C_i+6,j, C_i+2,j to C_i+12,j, C_i+2,j to C_i+24,j and C_i+2,j to C_i+48,j (inclusive in each case).
  • This provides a total of 16 groups, from 4 groups in each of four directions.
  • For each group, the difference between each of its pixels' cyan intensities and that of the centroid is calculated: the differences are summed over the group algebraically (positive and negative differences cancelling one another). This sum is divided by the number of pixels in the group to provide a net difference per pixel between the cyan intensities of the group's pixels and that of the centroid.
  • a maximum of these net differences per pixel is then found for each direction, i.e. north, south, east and west.
  • the four maxima so obtained (one for each direction) and the respective window size in each case are stored.
  • Each maximum is a measure of the region boundary (cell membrane) magnitude in the relevant direction, because in a cyan image the maximum difference as compared to a blue cell nucleus occurs at a brown cell boundary.
  • the window size associated with each maximum indicates the region boundary width, because a boundary gives a higher maximum with a window size that more nearly matches its width than with one that matches it less well. Greater accuracy is obtainable by using more window sizes and windows matched to cell boundary shape, i.e. multiplying pixels in each linear group by respective values collectively forming a boundary shape function.
  • the process is in fact mathematically a correlation operation in which a window shape is correlated with a linear group of pixels.
  • a further option is to record the position of the maximum or boundary (cell radius) as being that of one of the two pixels at the centre of the window in which the maximum occurs: this was not done in the present example, although it would enable misshapen cells to be detected and discarded as being indicated by significant differences in the positions of maxima in the four directions, and it would improve width measure by accounting for oblique intersections of windows and cell boundaries.
  • Each maximum or region boundary magnitude is then divided by the associated window size (region boundary width) used to derive it: this forms what is called for the purposes of this specification a normalised boundary magnitude. It is a measure of both brightness and sharpness, and it enables discrimination against ill-defined staining not attached to a cell membrane.
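  • A sketch of the boundary search along one ray, assuming flat (all-ones) window functions of the four lengths given earlier; diffs holds the centroid-minus-pixel cyan differences along the ray:

```python
import numpy as np

def boundary_on_ray(diffs, windows=(6, 12, 24, 48)):
    """Return the boundary magnitude, estimated width and normalised
    boundary magnitude for one direction."""
    best_score, best_width = 0.0, windows[0]
    for start in range(len(diffs)):
        for w in windows:
            seg = diffs[start:start + w]
            if len(seg) < w:                 # window runs off the ray
                continue
            score = seg.sum() / w            # net difference per pixel
            if score > best_score:
                best_score, best_width = score, w
    return best_score, best_width, best_score / best_width
```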
  • the next step 90 is to apply what is referred to as a “quicksort” to the four normalised boundary magnitudes to sort them into descending order of magnitude.
  • Quicksort is a known technique published by Klette R. and Zamperoni P., 'Handbook of Image Processing Operators', John Wiley & Sons, 1996.
  • a further quicksort is now applied (also at 90 ) to the image regions to sort them into descending order of item 5 values in Table 7 above, i.e. sum of Largest, Second Largest, Third Largest and Smallest normalised boundary magnitudes.
  • a subset of the image regions is now selected as being those having large values of item 5: these are the most significant image regions and they are the best one eighth of the total number of image regions in terms of item 5 magnitude. From this subset of image regions the following parameters are computed at 92: "grand_mean", "mean_range" and "relative_range", as defined below:
  • octile = one eighth of the total number of image regions or cell nuclei   (16)
  • mean_range = [(Σ item 1) − (Σ item 3)]/octile, where Σ denotes a sum over the selected subset of image regions   (22)
  • Grand_mean is indicative of the degree to which an image exhibits good cell boundary sharpness and brightness.
  • Relative_range indicates the degree to which an image exhibits brightness extending around cell boundaries—the smallest boundaries (item 4) are omitted from this computation to provide some robustness against incomplete cells.
  • a cell boundary that exhibits a large value of relative_range will have brightness varying appreciably around the boundary corresponding to non-uniformity of staining or possibly even absence of a boundary.
  • this measure provides an estimate of how far the current cyan image (generated at 71 ) is from each member of a predetermined standard set of images, four images in the present example.
  • M i and RR i become the components of respective four-element vectors M and RR, and are used in the following expression:
  • C-erb-2 indicator = $\min_i \big\{ (M_i - \mathrm{grand\_mean})^2 + (RR_i - \mathrm{relative\_range})^2 \big\}$   (24)
  • the value of the index i is returned as the indicator for the C-erb-2 measurement process.
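  • A sketch of Equation (24), with M and RR the four-element reference vectors described above:

```python
import numpy as np

def cerb2_indicator(grand_mean, relative_range, M, RR):
    """Index of the reference image nearest in squared Euclidean distance."""
    d2 = (np.asarray(M) - grand_mean) ** 2 + (np.asarray(RR) - relative_range) ** 2
    return int(np.argmin(d2))
```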
  • Referring to FIG. 8, there is shown a flow diagram of the process 14 (see FIG. 1) for measurement of vascularity.
  • the process 14 is applied to three images, each at ×20 magnification compared to the histopathological slide from which they were taken.
  • each image is transformed from red/green/blue (RGB) to a different image space hue/saturation/value (HSV).
  • RGB to HSV transformation is described by K. Jack in 'Video Demystified', 2nd ed., HighText Publications, San Diego, 1996.
  • V denotes value or brightness.
  • H and S are calculated for each pixel of the three RGB images using the RGB to HSV transformation of the above reference.
  • the next step 102 is to apply colour segmentation to obtain a binary image.
  • This segmentation is based on thresholding using the Hue and Saturation from the HSV colour space, and is shown in Table 9 below.
  • TABLE 9
      Threshold criterion                                       Binary image pixel value
      Pixel with both Hue H in the range 282-356 degrees        Set pixel to 1
      (scale 0 to 360) and Saturation S in the range 0.2
      to 0.24 (scale 0 to 1)
      Pixel with either Hue outside the range 282-356           Set pixel to 0
      degrees and/or Saturation outside the range 0.2-0.24
  • the next stage 104 is to apply connected component labelling (as defined previously) to the segmented binary image: this provides a binary image with regions of contiguous pixels equal to 1, the regions being uniquely labelled for further processing and their areas being determined.
  • the labelled binary image is then spatially filtered to remove small connected components (image regions with less than 10 pixels) which have insufficient pixels to contribute to vascularity: this provides a reduced binary image.
  • the sum of the areas of the remaining image regions in the reduced binary image is then determined at 106 from the results of connected component labelling, and this sum is then expressed as a percentage of the area of the whole image.
  • This procedure is carried out for each of the original RGB images separately to provide three such percentage area values: the average of the three percentage area values is computed, and it represents an estimate of the percentage of the area of a tissue sample occupied by blood vessels—i.e. the sample vascularity.
  • vascularity is determined to be high or low depending on whether or not it is equal to at least 31%, as set out in Table 10 below.

    TABLE 10
      Description of vascularity    Range
      High                          31%-100%
      Low                           0%-30%
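  • A sketch of the whole vascularity measurement for three tiles; the patent's own RGB to HSV formulas are not reproduced in this extract, so a standard HSV transform is assumed:

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv
from scipy.ndimage import label

def tile_vascularity_pct(rgb):
    """rgb: (H, W, 3) floats in [0, 1]. Percentage of tile area segmented
    as blood vessel by the hue/saturation thresholds of Table 9."""
    hsv = rgb_to_hsv(rgb)
    h = hsv[..., 0] * 360.0                          # hue in degrees
    s = hsv[..., 1]                                  # saturation, 0 to 1
    binary = (h >= 282) & (h <= 356) & (s >= 0.2) & (s <= 0.24)
    labels, n = label(binary)
    areas = np.bincount(labels.ravel())[1:]          # pixels per labelled region
    vessel_pixels = areas[areas >= 10].sum()         # drop regions under 10 pixels
    return 100.0 * vessel_pixels / binary.size

def vascularity(tiles):
    pct = float(np.mean([tile_vascularity_pct(t) for t in tiles]))
    return pct, ('high' if pct >= 31 else 'low')     # Table 10 decision
```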
  • High vascularity corresponds to relatively fast tumour growth because tumour blood supply has been facilitated, and early treatment is indicated.
  • Low vascularity corresponds to relatively slow tumour growth, and early treatment is less important.

Abstract

A method of measuring oestrogen or progesterone receptor (ER or PR) comprises identifying in histopathological specimen image data pixel groups indicating cell nuclei, and deriving image hue and saturation. The image is thresholded using hue and saturation and preferentially stained cells identified. ER or PR status is determined from normalised average saturation and proportion of preferentially stained cells. A method of measuring C-erb-2 comprises correlating window functions with pixel sub-groups to identify cell boundaries, computing measures of cell boundary brightness and sharpness and brightness extent around cell boundaries, and comparing the measures with comparison images associated with different values of C-erb-2. A C-erb-2 value associated with a comparison image having similar brightness-related measures is assigned. A method of measuring vascularity comprises deriving image hue and saturation, producing a segmented image by hue and saturation thresholding and identifying contiguous pixels. Vascularity is determined from contiguous pixel area corresponding to vascularity expressed as a proportion of total image area.

Description

  • This invention relates to a method, a computer program and an apparatus for histological assessment, and more particularly for making measurements upon histological imagery to provide clinical information on potentially cancerous tissue such as for example (but not exclusively) breast cancer tissue. [0001]
  • Breast cancer is a common form of female cancer: once a lesion indicative of breast cancer has been detected, tissue samples are taken and examined by a histopathologist to establish a diagnosis, prognosis and treatment plan. However, pathological analysis of tissue samples is a time consuming and inaccurate process. It entails interpretation of colour images by human eye, which is highly subjective: it is characterised by considerable inaccuracies in observations of the same samples by different observers and even by the same observer at different times. For example, two different observers assessing the same ten tissue samples may easily give different opinions for three of the slides, a 30% error. The problem is exacerbated by heterogeneity, i.e. complexity of some tissue sample features. Moreover, there is a shortage of pathology staff. [0002]
  • Oestrogen and progesterone receptor (ER and PR) status, C-erb-2 and vascularity are parameters which are data of interest for assisting a clinician to formulate a diagnosis, prognosis and treatment plan for a patient. C-erb-2 is also known as Cerb-B2, her-2, her-2/neu and erb-2. [0003]
  • It is an object of the invention to provide a technique for objective measurement of at least one of ER status, PR status, C-erb-2 and vascularity. [0004]
  • In a first aspect, the present invention provides a method of measuring oestrogen or progesterone receptor (ER or PR) status having the steps of: [0005]
  • a) obtaining histopathological specimen image data; and [0006]
  • b) identifying in the image data groups of contiguous pixels corresponding to respective cell nuclei; [0007]
  • characterised in that the method also includes the steps of: [0008]
  • c) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0009]
  • d) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0010]
  • e) determining ER or PR status from proportion of pixels corresponding to preferentially stained cells. [0011]
  • The invention provides the advantage that it is computer-implementable, and hence is carried out in a way which avoids the subjectivity of a manual inspection process. [0012]
  • In an alternative first aspect, the invention may provide a method of measuring ER or PR status having the steps of: [0013]
  • a) obtaining histopathological specimen image data; and [0014]
  • b) identifying in the image data groups of contiguous pixels corresponding to respective cell nuclei; [0015]
  • characterised in that the method also includes the steps of: [0016]
  • c) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0017]
  • d) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0018]
  • e) determining ER or PR status from normalised average saturation. [0019]
  • In a further alternative first aspect, the invention may provide a method of measuring ER or PR status having the steps of: [0020]
  • a) obtaining histopathological specimen image data; and [0021]
  • b) identifying in the image data groups of contiguous pixels corresponding to respective cell nuclei; [0022]
  • characterised in that the method also includes the steps of: [0023]
  • c) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0024]
  • d) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0025]
  • e) determining ER or PR status from normalised average saturation and fraction of pixels corresponding to preferentially stained cells. [0026]
  • Step b) may be implemented using a K-means clustering algorithm employing a Mahalanobis distance metric. [0027]
  • Step c) may be implemented by transforming the image data into a chromaticity space, and deriving hue and saturation from image pixels and a reference colour. Hue may be obtained from an angle φ equal to [0028]

    $\varphi = \sin^{-1}\left(\dfrac{\tilde{x}\,y - x\,\tilde{y}}{\sqrt{\tilde{x}^2 + \tilde{y}^2}\,\sqrt{x^2 + y^2}}\right)$

  • and saturation from the expression [0029]

    $S = \dfrac{x\,\tilde{x} + y\,\tilde{y}}{\sqrt{\tilde{x}^2 + \tilde{y}^2}},$

  • where (x, y) and (x̃, ỹ) are respectively image pixel coordinates and reference colour coordinates in the chromaticity space. Hue may be adapted to lie in the range 0 to 90 degrees and a hue threshold of 80 degrees may be set in step d). A saturation threshold S₀ may be set in step d), S₀ being 0.9 for saturation in the range 0.1 to 1.9 and 0 for saturation outside this range. [0030]
  • The fraction of pixels corresponding to preferentially stained cells may be determined by counting the number of pixels having both saturation greater than a saturation threshold and hue modulus less than a hue threshold and expressing such number as a fraction of a total number of pixels in the image: it may be awarded a score 0, 1, 2, 3, 4 or 5 according respectively to whether it is (i) 0, (ii) >0 and <0.01, (iii) ≥0.01 and ≤0.10, (iv) ≥0.11 and ≤0.33, (v) ≥0.34 and ≤0.66 or (vi) ≥0.67 and ≤1.0. [0031]
  • Normalised average saturation may be accorded a score 0, 1, 2 or 3 according respectively to whether it is (i) ≤25%, (ii) >25% and ≤50%, (iii) >50% and ≤75% or (iv) >75% and ≤100%. [0032]
  • Scores for normalised average saturation and fraction of pixels corresponding to preferentially stained cells may be added together to provide a measurement of ER or PR. [0033]
  • The method of the invention may include measuring C-erb-2 status by the following steps: [0034]
  • a) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries, [0035]
  • b) computing brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries, [0036]
  • c) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and [0037]
  • d) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data. [0038]
  • The method of the invention may include measuring vascularity by the following steps: [0039]
  • a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0040]
  • b) producing a segmented image by thresholding the image data on the basis of hue and saturation; [0041]
  • c) identifying in the segmented image groups of contiguous pixels; and [0042]
  • d) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area. [0043]
  • In a second aspect, the invention provides a method of measuring C-erb-2 status having the steps of: [0044]
  • a) obtaining histopathological specimen image data; and [0045]
  • b) identifying in the image data contiguous pixel groups corresponding to respective cell nuclei associated with surrounding cell boundary staining; [0046]
  • characterised in that the method also includes the steps of: [0047]
  • c) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixel groups to identify pixels associated with cell boundaries, [0048]
  • d) computing brightness-related measures of cell boundary brightness and sharpness and of brightness extent around cell boundaries from pixels corresponding to cell boundaries, [0049]
  • e) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and [0050]
  • f) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data. [0051]
  • In this aspect, at least some of the window functions may have non-zero values extending over lengths of 6, 12, 24 and 48 pixels respectively, and zero values elsewhere. Pixels associated with a cell boundary are identified from a maximum correlation with a window function, whose length provides an estimate of cell boundary width. [0052]
  • The brightness-related measure of cell boundary brightness and sharpness may be computed in step d) using a calculation including dividing cell boundary magnitudes by their respective widths to provide normalised boundary magnitudes, selecting a fraction of the normalised boundary magnitudes each greater than unselected equivalents, and summing the normalised boundary magnitudes of the selected fraction. [0053]
  • In step d) a brightness-related measure of brightness extent around cell boundaries may be computed using a calculation including dividing normalised boundary magnitudes into different magnitude groups each associated with a respective range of magnitudes, providing a respective magnitude sum of normalised boundary magnitudes for each magnitude group, and subtracting a smaller magnitude sum from a larger magnitude sum. [0054]
  • The comparison image having brightness-related measures closest to those determined for the image data may be determined from a Euclidean distance between the brightness-related measures of the comparison image and the image data. [0055]
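  • These two measures and the Euclidean-distance comparison might be sketched as follows. The fraction of magnitudes selected and the median split into two magnitude groups are assumptions of this sketch; the text does not fix those choices.

```python
import numpy as np

def brightness_measures(boundary_mags, boundary_widths, top_frac=0.5):
    """Sketch of the two brightness-related measures described above."""
    # Normalised boundary magnitudes: magnitude divided by membrane width.
    norm = np.asarray(boundary_mags, float) / np.asarray(boundary_widths, float)

    # Measure 1 (brightness and sharpness): sum of the strongest fraction
    # of normalised magnitudes.
    k = max(1, int(top_frac * norm.size))
    measure1 = np.sort(norm)[-k:].sum()

    # Measure 2 (brightness extent): difference between the sums of two
    # magnitude groups, here split at the median.
    median = np.median(norm)
    measure2 = abs(norm[norm > median].sum() - norm[norm <= median].sum())
    return measure1, measure2

def assign_score(measures, reference_measures):
    """Return the index of the comparison image whose measures are closest
    in Euclidean distance; its C-erb-2 value is assigned to the image."""
    refs = np.asarray(reference_measures, float)   # shape (n_refs, 2)
    return int(np.argmin(np.linalg.norm(refs - np.asarray(measures), axis=1)))
```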
  • In step b) identifying in the image data contiguous pixel groups corresponding to respective cell nuclei is carried out by an adaptive thresholding technique arranged to maximise the number of contiguous pixel groups identified. For image data including red, green and blue image planes the adaptive thresholding technique may include: [0056]
  • a) generating a mean value $\mu_R$ and a standard deviation $\sigma_R$ for pixels in the red image plane, [0057]
  • b) generating a cyan image plane from the image data and calculating a mean value $\mu_C$ for its pixels, [0058]
  • c) calculating a product $CMM\,\mu_C$, where CMM is a predetermined multiplier, [0059]
  • d) calculating a quantity $R_B$ equal to the number of adjacent linear groups of pixels of predetermined length including at least one cyan pixel which is less than $CMM\,\mu_C$, [0060]
  • e) for each red pixel calculating a threshold equal to $\{RMM\,\mu_R - \sigma_R(4 - R_B)\}$, where RMM is a predetermined multiplier, [0061]
  • f) forming a thresholded red image by discarding each red pixel that is greater than or equal to the threshold, [0062]
  • g) determining the number of contiguous pixel groups in the thresholded red image, [0063]
  • h) changing the values of RMM and CMM and iterating steps c) to g), [0064]
  • i) changing the values of RMM and CMM once more and iterating steps c) to g), [0065]
  • j) comparing the numbers of contiguous pixel groups determined in steps g) to i), treating the three pairs of values of RMM and CMM as points in a two dimensional space, selecting the pair of values of RMM and CMM associated with the lowest number of contiguous pixel groups, obtaining its reflection in the line joining the other two pairs of values of RMM and CMM, using this reflection as a new pair of values of RMM and CMM and iterating steps c) to g) and this step j). [0066]
  • The first three pairs of RMM and CMM values may be 0.802 and 1.24, 0.903 and 0.903, and 1.24 and 0.802 respectively. [0067]
  • Brown pixels may be removed from the thresholded red image if like-located pixels in the cyan image are less than $CMM\,\mu_C$; edge pixels may be removed likewise if like-located pixels in a Sobel-filtered cyan image having a standard deviation $\sigma_C$ are greater than $(\mu_C + 1.5\sigma_C)$. Pixels corresponding to lipids may also be removed if their red, green and blue pixel values are all greater than the sum of the relevant colour's minimum value and 98% of its range of pixel values in each case. [0068]
  • The thresholded red image may be subjected to a morphological closing operation. [0069]
  • In a third aspect, the present invention provides a method of measuring vascularity having the steps of: [0070]
  • a) obtaining histopathological specimen image data; characterised in that the method also includes the steps of: [0071]
  • b) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0072]
  • c) producing a segmented image by thresholding the image data on the basis of hue and saturation; and [0073]
  • d) identifying in the segmented image groups of contiguous pixels; and [0074]
  • e) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area. [0075]
  • In this aspect the image data may comprise pixels with red, green and blue values designated R, G and B respectively, characterised in that a respective saturation value S is derived in step b) for each pixel by: [0076]
  • a) defining M and m for each pixel as respectively the maximum and minimum of R, G and B; and [0077]
  • b) setting S to zero if M equals zero (guarding the division) and setting S to (M−m)/M otherwise. [0078]
  • Hue values designated H may be derived by: [0079]
  • a) defining new values newr, newg and newb for each pixel given by newr=(M−R)/(M−m), newg=(M−G)/(M−m) and newb=(M−B)/(M−m) in order to convert each pixel value into the difference between its magnitude and that of the maximum of the three colour magnitudes of that pixel, this difference being divided by the difference between the maximum and minimum of R, G and B, and [0080]
  • b) calculating H as tabulated immediately below: [0081]

    M (maximum of R, G, B)    H
    M = 0                     180
    M = R                     60(newb − newg)*
    M = G                     60(2 + newr − newb)*
    M = B                     60(4 + newg − newr)*
  • provided that if H proves to be >360, then 360 is subtracted from it, and if H proves to be <0, 360 is added to it. [0082]
  • The step of producing a segmented image may be implemented by designating for further processing only those pixels having both a hue H in the range 282-356 and a saturation S in the range 0.2 to 0.24. The step of identifying in the segmented image groups of contiguous pixels may include the step of spatially filtering such groups to remove groups having insufficient pixels to contribute to vascularity. The step of determining vascularity may include treating vascularity as having a high or a low value according to whether or not it is at least 31%. [0083]
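  • A minimal Python sketch of this third aspect follows. The minimum group size is an assumption (the text only requires groups "sufficiently large" to correspond to vascularity), and ties between colour bands (including achromatic pixels) are resolved crudely by the order of the branches.

```python
import numpy as np
from scipy import ndimage

def vascularity(rgb, min_pixels=50):
    """Sketch of the vascularity measurement for an (N, M, 3) RGB image."""
    rgb = rgb.astype(float)
    M = rgb.max(axis=2)
    m = rgb.min(axis=2)
    span = np.where(M == m, 1.0, M - m)            # guard against 0/0

    S = np.where(M == 0, 0.0, (M - m) / np.where(M == 0, 1.0, M))
    newr = (M - rgb[..., 0]) / span
    newg = (M - rgb[..., 1]) / span
    newb = (M - rgb[..., 2]) / span

    H = np.full(M.shape, 180.0)                    # M = 0 row of the table
    nz = M > 0
    H = np.where(nz & (M == rgb[..., 0]), 60 * (newb - newg), H)
    H = np.where(nz & (M == rgb[..., 1]), 60 * (2 + newr - newb), H)
    H = np.where(nz & (M == rgb[..., 2]), 60 * (4 + newg - newr), H)
    H = np.where(H > 360, H - 360, np.where(H < 0, H + 360, H))

    # Segment on hue 282-356 and saturation 0.2-0.24, then keep only
    # groups of contiguous pixels large enough to contribute.
    mask = (H >= 282) & (H <= 356) & (S >= 0.2) & (S <= 0.24)
    labels, n = ndimage.label(mask)
    sizes = np.asarray(ndimage.sum(mask, labels, range(1, n + 1)))
    big = np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))
    return big.sum() / mask.size                   # 'high' if at least 0.31
```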
  • In a fourth aspect, the present invention provides a computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of: [0084]
  • a) processing histopathological specimen image data to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei; [0085]
  • characterised in that the program is also arranged to implement the steps of: [0086]
  • b) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0087]
  • c) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0088]
  • d) determining ER or PR status from proportion of pixels corresponding to preferentially stained cells. [0089]
  • In an alternative fourth aspect, the present invention provides a computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of: [0090]
  • a) processing histopathological specimen image data to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei; [0091]
  • characterised in that the program is also arranged to implement the steps of: [0092]
  • b) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0093]
  • c) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0094]
  • d) determining ER or PR status from normalised average saturation. [0095]
  • In a further alternative fourth aspect, the present invention provides a computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of: [0096]
  • a) processing histopathological specimen image data to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei; [0097]
  • characterised in that the program is also arranged to implement the steps of: [0098]
  • b) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0099]
  • c) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0100]
  • d) determining ER or PR status from normalised average saturation and fraction of pixels corresponding to preferentially stained cells. [0101]
  • In a fifth aspect, the present invention provides a computer program for use in measuring C-erb-2 status arranged to control computer apparatus to execute the steps of: [0102]
  • a) processing histopathological specimen image data to identify contiguous pixel groups corresponding to respective cell nuclei associated with surrounding cell boundary staining; [0103]
  • characterised in that the computer program is also arranged to implement the steps of: [0104]
  • b) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries, [0105]
  • c) computing brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries, [0106]
  • d) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and [0107]
  • e) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data. [0108]
  • In a sixth aspect, the present invention provides a computer program for use in measuring vascularity arranged to control computer apparatus to execute the steps of: [0109]
  • a) using histopathological specimen image data to derive hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0110]
  • b) producing a segmented image by thresholding the image data on the basis of hue and saturation; and [0111]
  • c) identifying in the segmented image groups of contiguous pixels; and [0112]
  • d) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area. [0113]
  • In a seventh aspect, the present invention provides an apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of: [0114]
  • a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0115]
  • b) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0116]
  • c) determining ER or PR status from proportion of pixels corresponding to preferentially stained cells. [0117]
  • In an alternative seventh aspect, the present invention provides an apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of: [0118]
  • a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0119]
  • b) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0120]
  • c) determining ER or PR status from normalised average saturation. [0121]
  • In a further alternative seventh aspect, the present invention provides an apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of: [0122]
  • a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0123]
  • b) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and [0124]
  • c) determining ER or PR status from normalised average saturation and fraction of pixels corresponding to preferentially stained cells. [0125]
  • In an eighth aspect, the present invention provides an apparatus for measuring C-erb-2 status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of: [0126]
  • a) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries, [0127]
  • b) computing brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries, [0128]
  • c) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and [0129]
  • d) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data. [0130]
  • In a ninth aspect, the present invention provides an apparatus for measuring vascularity including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, characterised in that the computer apparatus is also programmed to execute the steps of: [0131]
  • a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate; [0132]
  • b) producing a segmented image by thresholding the image data on the basis of hue and saturation; and [0133]
  • c) identifying in the segmented image groups of contiguous pixels; and [0134]
  • d) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area. [0135]
  • The computer program and apparatus aspects of the invention may have preferred features corresponding to those of respective method aspects.[0136]
  • In order that the invention might be more fully understood, embodiments thereof will now be described, by way of example only, with reference to the accompanying drawings, in which:—[0137]
  • FIG. 1 is a block diagram of a procedure for measuring indications of cancer to assist in formulating diagnosis and treatment; [0138]
  • FIG. 2 is a block diagram of a process for measuring ER and PR receptor status in the procedure of FIG. 1; [0139]
  • FIG. 3 is a pseudo three dimensional view of a red, green and blue colour space (colour cube) plotted on respective orthogonal axes; [0140]
  • FIG. 4 is a transformation of FIG. 3 to form a chromaticity space; [0141]
  • FIG. 5 is a drawing of a chromaticity space reference system; [0142]
  • FIG. 6 illustrates use of polar co-ordinates; [0143]
  • FIG. 7 is a block diagram of a process for measuring C-erb-2 in the procedure of FIG. 1; and [0144]
  • FIG. 8 is a block diagram of a process for measuring vascularity in the procedure of FIG. 1.[0145]
  • The examples to be described herein are three different inventions which can be implemented separately or together, because they are all measurements which individually or collectively assist a clinician to diagnose cancer and to formulate a treatment programme. In descending order of importance, the procedures are determination of oestrogen and progesterone receptor status, determination of C-erb-2 and determination of vascularity. [0146]
  • A procedure 10 for the assessment of tissue samples in the form of histopathological slides of potential carcinomas of the breast is shown in FIG. 1. This drawing illustrates processes which generate measurements of specialised kinds for use by a pathologist as the basis for assessing patient diagnosis, prognosis and treatment plan. [0147]
  • The procedure 10 employs a database which maintains digitised image data obtained from histological slides, as will be described later. Sections are taken (cut) from breast tissue samples (biopsies) and placed on respective slides. Slides are stained using a staining agent selected from the following, depending on which parameter is to be determined: [0148]
  • a) Immunohistochemical staining for C-erb-2 with diaminobenzidine (DAB) as substrate (chemical staining agent)—collectively “Cerb-DAB”—this is for assessing C-erb-2 gene amplification status; [0149]
  • b) Oestrogen receptor (ER) with DAB as substrate (collectively “ER-DAB”) for assessing the expression (the amount expressed or emitted) of the oestrogen receptors. Progesterone receptor (PR) status is investigated using chemical treatment giving the same colouration as in ER. [0150]
  • c) Immunohistochemical staining for CD31 with fuchsin (F) as substrate for assessing vascularity (angiogenesis). [0151]
  • In a prior art manual procedure, a clinician places a slide under a microscope and examines a region of it (referred to as a tile) at magnification of ×40 for indications of C-erb-2, ER and PR status and at ×20 for vascularity. [0152]
  • The present invention requires data from histological slides in a suitable form. In the present example, image data were obtained by a pathologist using a Zeiss Axioskop microscope with a Jenoptiks Progres 3012 digital camera. Image data from each slide is a set of digital images obtained at a linear magnification of 40 (i.e. 40×), each image being an electronic equivalent of a tile. [0153]
  • To select images, a pathologist scans the microscope over a slide, and at 40× magnification selects regions (tiles) of the slide which appear to be most promising in terms of an analysis to be performed. Each of these regions is then photographed using the microscope and digital camera referred to above, which produces for each region a respective digitised image in three colours, i.e. red, green and blue (R, G & B). Three intensity values are obtained for each pixel in a pixel array to provide an image as a combination of R, G and B image planes. This image data is stored temporarily at 12 for later use. [0154]
  • Three tiles are required for vascularity measurement at 14, and one tile for each of oestrogen and progesterone receptor measurement at 16 and C-erb-2 measurement at 18. These measurements provide input to a diagnostic report at 20. [0155]
  • The prior art manual procedure for scoring C-erb-2 involves a pathologist subjectively and separately estimating stain intensity, stain location and relative number of cells associated with a feature of interest in a tissue sample. The values obtained in this way are combined by a pathologist to give a single measurement for use in diagnosis, prognosis and reaching a decision on treatment. The process hereinafter described in this example replaces the prior art manual procedure with an objective procedure. [0156]
  • Referring now to FIG. 2, processing 16 to determine ER status will be outlined and then described in more detail later. It begins with a pre-processing stage 30 in which a K-means clustering algorithm is applied to a colour image using a Mahalanobis metric. This determines or cues image regions of interest for further processing by associating pixels into clusters on the basis of their having similar values of the Mahalanobis metric. At 32 the colour image is transformed into a chromaticity space which includes a location of a reference colour. Hue and saturation are calculated at 34 for pixels in clusters cued by K-means clustering. The number of brown stained pixels is computed at 36 by thresholding on the basis of hue and saturation. An ER status measurement is then derived at 38 from a combination of the fraction of stained pixels and average colour saturation. [0157]
  • The input for the ER preprocessing stage 30 consists of raw digital data files of a single histopathological colour image or tile. A triplet of image band values for each pixel represents the colour of that pixel in its red, green and blue spectral components or image bands. These values in each of the three image bands are in the range [0 . . . 255], where [0,0,0] corresponds to black and [255,255,255] corresponds to white. The K-means clustering algorithm 30 is applied to the digital colour image using four clusters and the Mahalanobis distance metric. A cluster is a natural grouping of data having similar values of the relevant metric, and the Mahalanobis distance metric is a measurement that gives an indication of the degree of closeness of data items to a cluster centre. It is necessary to have some means for locating cell nuclei as pixel groups, but it is not essential to use four clusters or the Mahalanobis distance metric: these have been found to work well in identifying groups of contiguous pixels which correspond to respective cell nuclei. The K-means algorithm is described by J. A. Hartigan and M. A. Wong in a paper entitled 'A K-means clustering algorithm', Algorithm AS 136, Applied Statistics Journal, 1979. The Mahalanobis distance metric is described by F. Heijden in 'Image Based Measurement Systems - object recognition and parameter estimation', John Wiley & Sons, 1994, and by R. Schalkoff in 'Pattern Recognition - Statistical, Structural and Neural Approaches', John Wiley & Sons Inc., 1992. The process comprises an initialisation step a) followed by computation of a covariance matrix at step b). This leads to a likelihood calculation at step c), which effectively provides the distance of a pixel from a cluster centre. The procedure is as follows: [0158]
  • a) Initially, cluster centres are set at the mean of the red, green and blue image bands respectively, minus 30 plus (cluster number + 1)×10. For example, the first cluster's values (cluster number 0) would be set at mean_red − 30 + (0 + 1)×10 (hence mean_red − 20), and similarly for mean_green and mean_blue. The second cluster would be mean_red − 10, mean_green − 10 and mean_blue − 10, and similarly for other clusters. Pixels are then assigned to clusters for later readjustment. [0159]
  • b) For each cluster the following computations are carried out: [0160]
  • i) Compute elements $\sigma_{ij}^k$ of a covariance matrix of the image bands, indicating the degree of variation between intensities of different colours in pixels of each cluster, from Equation (1): [0161]

    $$\sigma_{ij}^{k} = \frac{1}{N_k}\sum_{l=1}^{N_k}\left(c_{li} - \mu_i^k\right)\left(c_{lj} - \mu_j^k\right)\qquad(1)$$
  • where: [0162]
  • $\sigma_{ij}^k$ is the ijth element of the covariance matrix, [0163]
  • $N_k$ is the number of pixels in cluster k, [0164]
  • $c_{li}$ and $c_{lj}$ are the values of pixel l in image bands i and j, [0165]
  • i, j take values 1, 2, 3, which represent the red, green and blue image bands respectively, [0166]
  • $\mu_i^k$ is the mean of all pixels in image band i belonging to cluster k, and [0167]
  • $\mu_j^k$ is the mean of all pixels in image band j belonging to cluster k. [0168]
  • ii) Calculate the determinant of the covariance matrix, denoted $\Sigma_{det}^k$. [0169]
  • iii) Calculate the inverse of the covariance matrix, denoted $\Sigma_{inv}^k$. [0170]
  • c) With index i denoting pixel number, each pixel $\vec{x}_i$ is now treated as a vector having three elements $x_{i,1}$, $x_{i,2}$, $x_{i,3}$, which are the red ($x_{i,1}$), green ($x_{i,2}$) and blue ($x_{i,3}$) pixel values: the red, green and blue image bands are therefore represented by second subscript indices 1, 2 and 3 respectively. With i ranging over all pixels in a cluster k, the likelihood $d_k(\vec{x}_i)$ of a pixel vector $\vec{x}_i$ not belonging to that cluster is computed from Equation (2) below: [0171]

    $$d_k(\vec{x}_i) = \ln\left(\Sigma_{det}^k\right) + \frac{1}{2}\left[(\vec{x}_i - \vec{\mu}^k)^t\,\Sigma_{inv}^k\,(\vec{x}_i - \vec{\mu}^k)\right]\qquad(2)$$
  • where $\Sigma_{det}^k$ and $\Sigma_{inv}^k$ are as defined above, [0173]
  • $\vec{\mu}^k$ is the mean of all pixel vectors $\vec{x}_i$ in cluster k, and [0174]
  • t indicates the transpose of the difference vector $(\vec{x}_i - \vec{\mu}^k)$. [0175]
  • Equation (2) is re-evaluated for the same pixel vector $\vec{x}_i$ in all other clusters also. Pixel vector $\vec{x}_i$ has the highest likelihood of belonging to the cluster (denoted $k_m$) for which $d_k(\vec{x}_i)$ has a minimum value, i.e. $d_{k_m}(\vec{x}_i)$; cluster $k_m$ is then the most suitable to receive pixel $\vec{x}_i$, i.e. find: [0176]

    $$d_{k_m}(\vec{x}_i) \le d_k(\vec{x}_i)\ \text{for all}\ k \ne k_m\qquad(3)$$

  • Assign pixel $\vec{x}_i$ to cluster $k_m$. [0177]
  • d) For each cluster k: [0178]
  • Store a record of which pixels belong to cluster k as an array $X^k$, update it with each pixel vector assigned to that cluster and update the number $N_k$ of pixels in that cluster. [0179]
  • Calculate the cluster centre $\mu_j^k$ for each image band j = 1, 2 and 3 from Equation (4), in which the sum runs over the $N_k$ pixel vectors assigned to cluster k: [0180]

    $$\mu_j^k = \frac{1}{N_k}\sum_{i=1}^{N_k} x_{i,j}\qquad(4)$$
  • Iterate steps b) to d) until convergence, i.e. when no more pixels change clusters or the number of iterations reaches a total of 20. [0181]
  • The first cluster (k=1) now corresponds to cell nuclei and the corresponding pixel vectors are those which are cued as of interest for output and further processing. [0182]
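  • As an illustration, steps a) to d) might be sketched in Python as below. Note two assumptions of this sketch: np.cov normalises by N−1 rather than the N of Equation (1), and the initial assignment of pixels to clusters (which the text leaves open) is by nearest centre.

```python
import numpy as np

def kmeans_mahalanobis(img, n_clusters=4, max_iter=20):
    """Cue nucleus pixels: K-means over RGB pixels scored with the
    likelihood of Equation (2).  img is an (N, M, 3) array."""
    X = img.reshape(-1, 3).astype(float)

    # a) Centres at band mean - 30 + (cluster number + 1) * 10.
    band_mean = X.mean(axis=0)
    centres = np.array([band_mean - 30 + (k + 1) * 10 for k in range(n_clusters)])
    labels = np.argmin(np.linalg.norm(X[:, None] - centres, axis=2), axis=1)

    for _ in range(max_iter):
        d = np.empty((X.shape[0], n_clusters))
        for k in range(n_clusters):
            pts = X[labels == k]
            if len(pts) < 4:                       # degenerate cluster guard
                d[:, k] = np.inf
                continue
            mu = pts.mean(axis=0)                  # cluster centre, Eq. (4)
            cov = np.cov(pts, rowvar=False)        # Eq. (1)
            inv = np.linalg.pinv(cov)
            diff = X - mu
            # Eq. (2): ln(determinant) + half the squared Mahalanobis distance
            d[:, k] = (np.log(max(np.linalg.det(cov), 1e-12))
                       + 0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff))
        new_labels = d.argmin(axis=1)              # Eq. (3)
        if np.array_equal(new_labels, labels):
            break                                  # convergence
        labels = new_labels

    # Cluster 0 here corresponds to the text's first cluster (k = 1),
    # i.e. the cell-nucleus pixels cued for further processing.
    return labels.reshape(img.shape[:2])
```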
  • The image is transformed at 32 from red/green/blue (RGB) into a chromaticity space. In the present example, as will be described, a reference colour is used: if necessary, this can be avoided using e.g. the approach of the C-erb-2 example described later. The chemical staining used in the present example results in brown colouration, and the approach used here is arranged to detect that preferentially; a different staining could however be used, in which case the technique would be adapted to detect a different pixel colour. [0183]
  • In practice brightness is liable to vary due to variation in degree of chemical staining and sample thickness across a slide, as well as possible vignetting by a camera lens used to produce the images. In consequence in this example emphasis is placed on computing a measurement of hue (or colour) and saturation as described later. [0184]
  • (a) Referring now also to FIGS. 3 to 6, each RGB image is transformed into a chromaticity space. FIG. 3 shows an RGB cube 50 in which red, green and blue pixel values (expressed as R, G and B respectively) are normalised and represented as values in the range 0 to 1. These pixel values are represented on red, green and blue axes 52, 54 and 56 respectively. The chromaticity space is a plane 58 for which R+G+B = 1: it is triangular within the RGB cube 50 and passes through the points (1,0,0), (0,1,0) and (0,0,1). [0185]
  • (b) FIG. 4 shows the axes 52, 54 and 56 and the chromaticity space 58 looking broadly speaking along a diagonal of the RGB cube 50 from the point (1,1,1) (not shown) to the origin (0,0,0), now referenced O for convenience. The points (0,0,1), (0,1,0) and (1,0,0) in FIG. 3 are now referenced J, K and L respectively. D is the midpoint of a straight line between J and L. Image pixel values from the input RGB image are projected on to the chromaticity space 58 and the resulting projections become data points for further processing. [0186]
  • The projection calculation is as follows: [0187]
  • Red, green and blue pixel chromaticity values r, g and b respectively are defined as: [0188]

    $$r = \frac{R}{R+G+B},\quad g = \frac{G}{R+G+B},\quad b = \frac{B}{R+G+B}\qquad(5)$$

  • Perpendiculars from a point P in the chromaticity space 58 to the lines JK and LD meet the latter at E and G respectively. Perpendiculars from P and G to the plane JOK meet the latter at F and H respectively. Using Equations (5), the point P in the triangular chromaticity space 58 may then be defined by x and y co-ordinates shown in FIG. 4 and given by: [0189]

    $$x = DE = HF = \frac{g - r}{\sqrt{2}}\quad\text{and}\quad y = PE = GD = b\sqrt{\frac{3}{2}}\qquad(6)$$
  • (c) In FIG. 5, the chromaticity space 58 is shown with x and y co-ordinate axes extending from an origin Q. A reference colour, denoted by a point S in the drawing, is now defined as that specified for this purpose by a clinician: it is the colour of that part of the image which is most positively stained (the most intense colour on the part of the original slide from which the image was taken). The reference colour's RGB components are taken from the image and its x and y co-ordinates are computed using Equations (5) and (6): these co-ordinates are denoted (x̃, ỹ). [0190]
  • (d) In FIG. 6, a polar co-ordinate system (r, θ) is now defined on the (R+G+B = 1) plane or chromaticity space 58. The co-ordinate system origin is the centre of gravity of the triangle 58 (referenced Q in FIG. 5). A reference direction for θ = 0 is defined as the direction QS of the radius vector to the reference colour S in FIG. 5. For any point such as P on the triangle, having co-ordinates (x, y), hue H is defined as the angle φ between the radius vector to the point itself (e.g. QP) and the radius vector QS to the reference colour. This is computed at 34 from the following expressions for φ: [0191]

    $$\sin\varphi = \frac{\tilde{x}y - x\tilde{y}}{\sqrt{\tilde{x}^2 + \tilde{y}^2}\sqrt{x^2 + y^2}}\qquad(7)$$

    $$\cos\varphi = \frac{x\tilde{x} + y\tilde{y}}{\sqrt{\tilde{x}^2 + \tilde{y}^2}\sqrt{x^2 + y^2}}\qquad(8)$$

  • and the angle φ is defined to be [0192]

    $$\varphi = \sin^{-1}\frac{\left|\,\tilde{x}y - x\tilde{y}\,\right|}{\sqrt{\tilde{x}^2 + \tilde{y}^2}\sqrt{x^2 + y^2}}\qquad(9)$$
  • For convenience the definition of hue H is now altered somewhat to render all values positive and in the range 0 to π/2: the transformation of earlier values φ into a new version ψ is shown in Table 1 below: [0193]
    TABLE 1
    Condition                      Magnitude of ψ (new hue H)
    sin φ > 0 and cos φ > 0        φ
    sin φ > 0 and cos φ < 0        π − φ
    sin φ < 0 and cos φ > 0        −φ
    sin φ < 0 and cos φ < 0        φ − π
  • A hue (H) threshold ψ₀ is set at 36 by a user or programmer of the procedure as being not more than π/2, a typical value which might be chosen being 80 degrees. Saturation S is defined to be [0194]

    $$S = \frac{x\tilde{x} + y\tilde{y}}{\sqrt{\tilde{x}^2 + \tilde{y}^2}}\qquad(10)$$
  • Two values of saturation threshold S₀ are set according to whether or not the image pixel saturation value S lies in the range 0.1 to 1.9: this is set out in Table 2 below: [0195]
    TABLE 2
    Saturation S               S₀
    S < 0.1 or S > 1.9         0
    0.1 ≦ S ≦ 1.9              0.9
  • At 36, the thresholds are used to count selectively the number $N_b$ of pixels which are sufficiently brown (having a large enough value of saturation) having regard to the reference colour. All H and S pixel values in the image are assessed. The conditions to be satisfied by a pixel's hue and saturation values for it to be counted in the brown pixel number $N_b$ are set out in Table 3 below. [0196]
    TABLE 3
    Condition                                       Action
    Hue modulus |ψ| < ψ₀ and saturation S > S₀      Treat as a "saturated" pixel; increase the count N_b of brown pixels by 1
    |ψ| ≧ ψ₀ and/or saturation S ≦ S₀               Treat as an "unsaturated" pixel; leave N_b unchanged
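  • Putting Equations (5) to (10) and Tables 1 to 3 together, a minimal sketch of the brown-pixel statistics might read as below, using the co-ordinates of Equation (6) as reconstructed above; the folding of φ into the hue magnitude |ψ| and the division-by-zero guards are simplifications assumed here.

```python
import numpy as np

def brown_pixel_stats(rgb, ref_rgb, hue_thresh_deg=80.0):
    """Return (Nb/N, normalised average saturation %) for an (N, M, 3)
    image and a reference colour taken from its most positively
    stained region."""
    def chroma_xy(c):
        c = np.asarray(c, float)
        s = c.sum(axis=-1, keepdims=True)
        s = np.where(s == 0, 1.0, s)
        r, g, b = (c / s)[..., 0], (c / s)[..., 1], (c / s)[..., 2]
        return (g - r) / np.sqrt(2.0), b * np.sqrt(1.5)    # Eq. (6)

    x, y = chroma_xy(rgb)
    xt, yt = chroma_xy(ref_rgb)                            # (x-tilde, y-tilde)

    denom = np.sqrt(xt**2 + yt**2) * np.sqrt(x**2 + y**2)
    denom = np.where(denom == 0, 1.0, denom)
    sinp = (xt * y - x * yt) / denom                       # Eq. (7)
    cosp = (x * xt + y * yt) / denom                       # Eq. (8)
    phi = np.arcsin(np.clip(np.abs(sinp), 0.0, 1.0))       # Eq. (9)
    abs_psi = np.where(cosp >= 0, phi, np.pi - phi)        # |psi|, Table 1

    S = (x * xt + y * yt) / np.sqrt(xt**2 + yt**2)         # Eq. (10)
    S0 = np.where((S >= 0.1) & (S <= 1.9), 0.9, 0.0)       # Table 2

    saturated = (abs_psi < np.radians(hue_thresh_deg)) & (S > S0)  # Table 3
    Nb = int(saturated.sum())
    if Nb == 0:
        return 0.0, 0.0
    norm_avg_pct = 100.0 * S[saturated].mean() / S[saturated].max()
    return Nb / S.size, norm_avg_pct
```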
  • The average saturation of the $N_b$ saturated pixels determined in Table 3 is computed by adding all their saturation values S together and dividing the resulting sum by $N_b$. The maximum saturation value of the saturated pixels is then determined, and the average saturation is normalised by expressing it as a percentage of this maximum: this approach is used to counteract errors due to variation in colour staining between different images. The normalised average saturation is then accorded a score at 38 of 0, 1, 2 or 3 according respectively to whether this percentage is (a) ≦25%, (b) >25% and ≦50%, (c) >50% and ≦75% or (d) >75% and ≦100%. [0197]
  • The fraction of saturated pixels—those corresponding to cells stained sufficiently brown relative to surrounding tissue—is computed at 38 from the ratio $N_b/N$, where N is the total number of pixels in the image. This fraction is then quantised to a score in the range 0 to 5 as set out in Table 5 below. [0198]
    TABLE 5
    N_b/N: fraction of image pixels that are stained    Score
    0.00                                                0
    >0.00 and <0.01                                     1
    0.01-0.10                                           2
    0.11-0.33                                           3
    0.34-0.66                                           4
    0.67-1.00                                           5
  • The two scores determined above, i.e. for normalised average saturation and fraction of sufficiently brown pixels, are now added together to give a measure in the range 0 to 8. The higher this number is, the more oestrogen (ER) positive the sample is, as shown in Table 6 below. [0199]
    TABLE 6
    Description of ER status (ER Score)    Range
    Strongly positive                      7-8
    Positive                               4-6
    Weakly positive                        2-3
    Negative                               0-1
  • Women with an ER score of 7 or 8 will respond favourably to hormonal treatment such as Tamoxifen; women with an ER score in the range 4 to 6 will have a 50% chance of responding to this treatment. Women scoring 2 or 3 will not respond very well, and those scoring 0 or 1 will not respond to hormonal treatment at all. [0200]
  • Images for ER and PR are visually indistinguishable; they are distinguished by the fact that they are produced using different stains. A PR score is therefore produced from stained slides in the same way as the ER score described above. The significance of progesterone receptor (PR) positivity in a breast carcinoma is less well understood than the equivalent for ER. In general, cancers that are ER positive will also be PR positive. However, carcinomas that are PR positive but not ER positive may have a worse prognosis. [0201]
  • Turning now to C-erb-2, the conventional manual technique involves processing a histopathological slide with chemicals to stain it appropriately, after which it is viewed by a clinician. Breast cells on the slide will have stained nuclei with a range of areas which allows discrimination between tissue cells of interest and unwanted cell types which are not important to cancer assessment. Cancerous cells will usually have a larger range of sizes of nuclei, which must be allowed for in the discrimination process. A clinician needs to ignore unwanted cell types and to make a measurement by subjectively grading cells of interest as follows: [0202]
    Score    Staining Pattern
    0        membrane staining in less than 10% of cells
    1        just perceptible membrane staining in more than 10% of cells, but membranes incompletely stained
    2        weak to moderate complete membrane staining of more than 10% of cells
    3        strong complete membrane staining of more than 10% of cells
  • Scores 0 and 1 are negative (not justifying treatment), whereas scores 2 and 3 are called positive (justifying treatment). [0203]
  • Unfortunately, there are artefacts which make measurement more complicated, as follows: [0204]
  • Retraction (shrinking) artefact: less sharply defined than true membrane staining; [0205]
  • Thermal artefact: if an electrocautery instrument is used, rather ill-defined staining occurs; [0206]
  • Crushing artefact: the tissue is inadvertently mechanically deformed allowing more ill-defined staining. [0207]
  • Thermal and crushing artefacts are normally confined to boundaries of a tissue specimen and would hopefully be excluded to some extent by a clinician photographing tiles from a slide. However, it is still important to guard against ill-defined staining not attached to a cell membrane. [0208]
  • The technique of this invention attempts to measure the parameters mentioned above namely: [0209]
  • Completeness of cell membrane staining; [0210]
  • Intensity and thinness of cell membrane staining; and [0211]
  • Proportion of cells with membrane staining. [0212]
  • There are two main stages in the present invention, and these may optionally be preceded by pre-processing if images are poor. The main stages are: [0213]
  • finding cell nuclei which satisfy area and location limitations associated with tumours; and [0214]
  • determining a score which characterises the membranes of the cell nuclei found in the preceding stage. [0215]
  • Referring now to FIG. 7, the C-erb-2 technique of the invention will firstly be outlined and later described in more detail. An optional preprocessing step 70 is carried out if images of tiles are poor due to camera vignetting or colour errors across the image. [0216]
  • Image segmentation is carried out in steps 71 to 78, i.e. automated separation of objects from a background in a digital image. The original digital image of a tile has red, green and blue image planes: from the green and blue image planes a cyan image plane is derived at 71, and a Sobel-filtered cyan image plane at 72. There are now five image planes: of these only the red and blue image planes are essential with conventional colour staining; the other image planes are used for desirable but not essential filtering operations upon the red image plane. Statistical measures of the five image planes are computed at 74 and 76, and then a segmented image is optimised and generated at 78 which has been filtered to remove unwanted pixels and spatial noise. The segmented image identifies cell nuclei. Step 78 is an adaptive thresholding technique using information from regions around pixels: it is shown in more detail within chain lines 80 with arrows 82 indicating iterations. It is an alternative to the K-means clustering algorithm previously described, which could also be used. [0217]
  • If at 84 the number of cells found is less than 16, the image is rejected at 86; if it is 16 or greater, then, having found the cell nuclei and hence the cells, the strength, thinness and completeness of each cell's surrounding membrane staining are measured and the membrane stainings are then ranked. [0218]
  • For each cell, at 88 a sequence of cross-correlation windows of varying widths is passed along four radii from the cell centroid to determine the cell boundary brightness value, membrane width and distance from the centroid of the most intense staining. The cell boundary brightness value is normalised by dividing by membrane width, and the nuclear area and sum of normalised boundary brightness values are then obtained. Statistical measures characterising membrane-staining strength, specificity and completeness are then deduced: these measures are compared with equivalents obtained from four reference images. The measured image is then graded by assigning it the score of the closest reference, using a Euclidean-distance metric. Other metrics may also be used. Alternatively, the scores of a moderately large sample may be used as references. [0219]
  • The C-erb-2 process will now be described in more detail. The process 18 is applied to one image or tile obtained by magnifying by a factor of 40 an area of a histological slide. Referring to FIG. 7 once more, the optional preprocessing step 70 is carried out by either: [0220]
  • (a) dividing the image into a suitable number of tiles (with less individual variability) and processing them separately—this should be considered an option in general, though it is not necessary if there is reasonable uniformity across individual images; or [0221]
  • (b) preferably, if sufficient images are available from the same camera objective lens, computing its deficiency and correcting it, rather than processing sub-images with more part-cells split across boundaries. [0222]
  • The digital image of a slide is a three colour or red, green and blue (RGB) image as defined above, i.e. there is a respective image plane for each colour. For the purposes of the following analysis, the letters R, G and B for each pixel are treated as the red, green and blue intensities at that pixel. The RGB image is used at 71 to compute a cyan image derived from the blue and green image planes: i.e. for each pixel a cyan intensity C is computed from C = (2×B + G)/3, the respective pixel's green (G) intensity being added to twice its blue (B) intensity and the resulting sum being divided by three. When repeated for all pixels this yields a cyan image or image plane. Cyan is used because it is a complementary colour to brown, which is the cell boundary colour produced by conventional chemical staining of a specimen. The blue image plane could be used instead but does not normally produce results as good as the cyan image. If a different colour staining were to be used, the associated complementary colour image would be selected. This process step is not essential, but it greatly assists filtering out unwanted pixels and it does so without a reference colour (see the ER/PR example, which uses an alternative approach). [0223]
  • At 72, a Sobel edge filter is applied to the cyan image plane: this is a standard image processing technique published in Klette R. & Zamperoni P., 'Handbook of image processing operators', John Wiley & Sons, 1995. A Sobel edge filter consists of two 3×3 arrays of numbers $S_P$ and $S_Q$, each of which is convolved with successive 3×3 arrays of pixels in an image. Here [0224]

    $$S_P = \begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{bmatrix}\quad\text{and}\quad S_Q = \begin{bmatrix} 1 & 0 & -1 \\ 2 & 0 & -2 \\ 1 & 0 & -1 \end{bmatrix}\qquad(11)$$
  • The step 72 initially selects a first cyan 3×3 array of pixels in the top left hand corner of the cyan image: designating as $C_{ij}$ a general cyan pixel in row i and column j, the top left hand corner of the image consists of pixels $C_{11}$ to $C_{13}$, $C_{21}$ to $C_{23}$ and $C_{31}$ to $C_{33}$. Each $C_{ij}$ is then multiplied by the respective digit of $S_P$ located in the $S_P$ array as $C_{ij}$ is in the 3×3 cyan pixel array: i.e. $C_{11}$ to $C_{13}$ are multiplied by 1, 2 and 1 respectively, $C_{21}$ to $C_{23}$ by zeroes, and $C_{31}$ to $C_{33}$ by −1, −2 and −1 respectively. The products so formed are added algebraically and provide a value p. [0225]
  • The value of p will be relatively low for pixel values changing slowly between the first and third rows either side of the row of $C_{22}$, and relatively high for pixel values changing rapidly between those rows: in consequence p provides an indication of image edge sharpness across rows. This procedure is repeated using the same pixel array but with $S_Q$ replacing $S_P$, and a value q is obtained: q is relatively low for pixel values changing slowly between the first and third columns either side of the column of $C_{22}$, and relatively high for pixel values changing rapidly between those columns: q therefore provides an indication of image edge sharpness across columns. The square root of the sum of the squares of p and q is then computed, i.e. $\sqrt{p^2 + q^2}$, which is defined as an "edge magnitude" and becomes $T_{22}$ (replacing pixel $C_{22}$ at the centre of the 3×3 array) in the transformed cyan image. It is also possible to derive an edge "phase angle" as $\tan^{-1}(p/q)$, but that is not required in the present example. [0226]
  • A general pixel $T_{ij}$ (row i, column j) in the transformed image is derived from $C_{i-1,j-1}$ to $C_{i-1,j+1}$, $C_{i,j-1}$ to $C_{i,j+1}$ and $C_{i+1,j-1}$ to $C_{i+1,j+1}$ of the cyan image. Because the central row of $S_P$ and the central column of $S_Q$ are zeros, and the other coefficients are 1s and 2s, p and q for $T_{ij}$ can be calculated as follows: [0227]

    $$p = \{C_{i-1,j-1} + 2C_{i-1,j} + C_{i-1,j+1}\} - \{C_{i+1,j-1} + 2C_{i+1,j} + C_{i+1,j+1}\}\qquad(12)$$

    $$q = \{C_{i-1,j-1} + 2C_{i,j-1} + C_{i+1,j-1}\} - \{C_{i-1,j+1} + 2C_{i,j+1} + C_{i+1,j+1}\}\qquad(13)$$
  • Beginning with i = j = 2, p and q are calculated for successive 3×3 pixel arrays by incrementing j by 1 and evaluating Equations (12) and (13) for each such array until the end of a row is reached; i is then incremented by 1 and the procedure is repeated for the next row, and so on until the whole image has been transformed. This transformed image is referred to below as the "Sobel of Cyan" image or image plane. [0228]
  • The Sobel filter cannot calculate values for pixels at image edges having no adjacent pixels on one or other of their sides: i.e. in a pixel array having N rows and M columns, the edge pixels are the top and bottom rows and the first and last columns, or in the transformed image pixels $T_{11}$ to $T_{1M}$, $T_{N1}$ to $T_{NM}$, $T_{11}$ to $T_{N1}$ and $T_{1M}$ to $T_{NM}$. By convention in Sobel filtering these edge pixels are set to zero. [0229]
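  • A direct, if slow, sketch of steps 71 and 72 (cyan plane, then Sobel of Cyan) follows; it evaluates Equations (12) and (13) at each interior pixel and leaves the edge pixels at zero by convention.

```python
import numpy as np

def sobel_of_cyan(rgb):
    """Return the cyan plane C = (2B + G) / 3 and its Sobel edge-magnitude
    image for an (N, M, 3) RGB array."""
    rgb = rgb.astype(float)
    C = (2 * rgb[..., 2] + rgb[..., 1]) / 3.0      # cyan image plane

    T = np.zeros_like(C)                           # edge pixels stay zero
    for i in range(1, C.shape[0] - 1):
        for j in range(1, C.shape[1] - 1):
            p = (C[i-1, j-1] + 2*C[i-1, j] + C[i-1, j+1]) \
                - (C[i+1, j-1] + 2*C[i+1, j] + C[i+1, j+1])   # Eq. (12)
            q = (C[i-1, j-1] + 2*C[i, j-1] + C[i+1, j-1]) \
                - (C[i-1, j+1] + 2*C[i, j+1] + C[i+1, j+1])   # Eq. (13)
            T[i, j] = np.hypot(p, q)               # edge magnitude
    return C, T
```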
  • A major problem with measurements on histopathological images is that the staining of different slides can vary enormously, e.g. from blue with dark spots to off-white with brown outlines. The situation can be improved by sifting the slides and using only those that conform to a predetermined colouration. However, it has been found that it is possible to cope with variation in staining to a reasonable extent by using statistical techniques to normalise images: in this connection steps 74 and 76 derive a variety of statistical parameters for use in image segmentation in step 78. [0230]
  • In step 74 the mean and standard deviation of the transformed pixel values $T_{ij}$ are computed. For convenience a change of nomenclature is implemented: a single index k is substituted for i and j (k = 1 to NM as i, j run from 1, 1 to N, M), which treats the two dimensional image as a single composite line composed of successive rows of the image. Also x is substituted for T in each pixel value, so $T_{ij}$ becomes $x_k$. Equations (14) and (15) below are used for computing the mean μ and standard deviation σ of the transformed pixels $x_k$: [0231]

    $$\mu = \frac{1}{NM}\sum_{k=1}^{NM} x_k\qquad(14)$$

    $$\sigma = \sqrt{\frac{1}{NM-1}\sum_{k=1}^{NM}\left(x_k - \mu\right)^2}\qquad(15)$$
  • At 76, various statistical parameters are computed for the Red, Green, Blue and Cyan image planes using Equations (14) and (15) above. [0232]
  • For the Red image plane the statistical parameters are the mean $\mu_R$ and standard deviation $\sigma_R$ of its pixel values: in Equations (14) and (15), $x_k$ represents a general pixel value in the Red image plane. In addition, the Red image plane's pixels are compared with one another to obtain their maximum, minimum and range (maximum − minimum). Similarly, pixels in each of the Green and Blue image planes are compared with one another to obtain a respective maximum, minimum and range for each plane. Finally, for the Cyan image plane, the pixels' mean $\mu_C$ and standard deviation are computed using Equations (14) and (15), in which $x_k$ represents a general pixel value in the Cyan image plane. [0233]
  • In step 78, the image is segmented to identify and locate cell nuclei. A pixel is counted as part of a cell nucleus if and only if it survives a combination of thresholding operations on the Red, Green, Blue, Cyan and Sobel of Cyan image planes, followed by closure of image gaps left after thresholding operations. It is necessary to determine threshold values in a way which allows for variation in chemical staining between different images. The technique employed in this example is to perform a multidimensional optimisation of the thresholds with nuclei number as the objective function to be maximised: i.e. for a given image, threshold values are altered intelligently until a near maximum number of nuclei is obtained. Starting values are computed for the optimisation routines by choosing those suitable for provision of threshold levels. In this example, two dimensional optimisation is used, requiring three starting values indicated by suffixes 1, 2 and 3, each with two components: the starting values represent vertices of a triangle in a two dimensional plane. The starting values are RMM1/CMM1, RMM2/CMM2 and RMM3/CMM3, RMM indicating a "Red Mean Multiplier" and CMM indicating a "Cyan Mean Multiplier". Tests using a substantial number of images have shown that suitable starting values are RMM1 = 0.802, CMM1 = 1.24, RMM2 = CMM2 = 0.903, RMM3 = 1.24 and CMM3 = 0.802. [0234]
  • For images counterstained with Haematoxylin and Eosin (H&E), cell nuclei are strongly stained blue, i.e. they have very low values in the complementary red plane. Hence the red plane is the primary plane used in thresholding, as follows: [0235]
  • (a) Produce a thresholded image for the Red image plane (approximately complementary to Blue) as follows: for every Red pixel value that is less than an adaptive threshold, set the corresponding pixel location in the thresholded Red image to 1, otherwise set the latter to 0. A respective adaptive threshold is computed separately for every pixel location as follows. At a) in step 78, the Red image threshold value is dependent on the presence of enclosing brown stain in the neighbourhood of each pixel, i.e. it is a function of the Cyan mean $\mu_C$ and the Red mean $\mu_R$. A check for enclosing brown is performed by searching radially outwards from a pixel under consideration. The procedure is to select in the Cyan image plane the same pixel location as in the Red image plane and from it to search in four directions (north, south, east and west) for a distance of seventy pixels (or as many as are available up to seventy). Here north, south, east and west have the following meanings: north, upward from the pixel in the same column; south, downward from the pixel in the same column; east, rightward from the pixel in the same row; and west, leftward from the pixel in the same row. More directions (e.g. diagonals north-east, north-west, south-east and south-west) could be used to improve accuracy, but four have been found adequate for the present example. Along any of these directions or radii either a cyan pixel will fall below a threshold (indicating a brown pixel) or a radius of 70 pixels will be reached without a cyan pixel doing so. The number $R_B$ of "brown" radii (radii intersecting at least one brown pixel) is then used to change the red threshold adaptively in the following way: a new Red image plane threshold $RTN = RMM_1\mu_R - \sigma_R(4 - R_B)$ is calculated, where $RMM_1\mu_R$ is the product of $RMM_1$ and $\mu_R$, and $\sigma_R$ is the standard deviation of the Red image plane. A limit is placed on RTN giving it a maximum possible value of 255. If the Red image plane pixel under consideration is less than the Red image plane threshold calculated for it, the corresponding pixel at the same location in the thresholded Red image is set to one, otherwise it is set to zero. [0236]
  • (b) Using the Cyan image plane, and with the Cyan mean $\mu_C$ from step 76: for every Cyan pixel value that is less than the product of $CMM_1$ and $\mu_C$, set the pixel in the corresponding location in the thresholded Red image to 0, otherwise do not change the pixel. This has the effect of removing excess brown pixels. [0237]
  • (c) Using the Sobel of Cyan image plane, with the mean $\mu_C$ and standard deviation $\sigma_C$ computed for it in step 74: for every Sobel of Cyan pixel value that is greater than $(\mu_C + 1.5\sigma_C)$, set the corresponding pixel in the thresholded Red image to 0, otherwise do not change the pixel. This has the effect of removing brown edge pixels. [0238]
  • (d) Pixels corresponding to lipids are now removed as follows: using the pixel minimum and range values computed at step 76, a thresholded Red image is produced using data obtained from the Red, Green and Blue image planes: for each Red, Green and Blue pixel group at a respective pixel location that satisfies all three criteria at (i) to (iii) below, set the pixel at the corresponding location in the thresholded Red image to 0, otherwise do not change the pixel; this has the effect of removing lipid image regions (regions of fat which appear as highly saturated white areas). Removal of these regions is not essential but is desirable to improve processing. The criteria for each set of Red, Green and Blue values at a respective pixel are: [0239]
  • (i) Red value > Red minimum + 0.98×(Red range), AND [0240]
  • (ii) Green value > Green minimum + 0.98×(Green range), AND [0241]
  • (iii) Blue value > Blue minimum + 0.98×(Blue range). [0242]
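  • Steps (a) to (c) might be sketched as below for a single RMM/CMM pair. The lipid removal of step (d) is omitted for brevity, and the choice of which plane's statistics feed the edge-removal threshold is an assumption noted in the comments.

```python
import numpy as np

def threshold_red(red, cyan, sobel_cyan, rmm, cmm, search=70):
    """Adaptive thresholding sketch; all planes are float arrays of equal shape."""
    mu_r, sd_r = red.mean(), red.std()
    mu_c = cyan.mean()
    mu_s, sd_s = sobel_cyan.mean(), sobel_cyan.std()
    brown = cyan < cmm * mu_c                      # 'brown' cyan pixels
    out = np.zeros(red.shape, bool)

    for i in range(red.shape[0]):
        for j in range(red.shape[1]):
            # R_B: number of N/S/E/W radii (up to `search` pixels long)
            # intersecting at least one brown pixel.
            rb = sum(int(seg.any()) for seg in (
                brown[max(i - search, 0):i, j],    # north
                brown[i + 1:i + 1 + search, j],    # south
                brown[i, j + 1:j + 1 + search],    # east
                brown[i, max(j - search, 0):j]))   # west
            thr = min(rmm * mu_r - sd_r * (4 - rb), 255.0)
            out[i, j] = red[i, j] < thr            # (a) adaptive threshold

    out[brown] = False                             # (b) remove excess brown
    # (c) remove brown edge pixels; the Sobel plane's own mean and standard
    # deviation are assumed here, the text being ambiguous on this point.
    out[sobel_cyan > mu_s + 1.5 * sd_s] = False
    return out
```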
  • Steps (c) and (d) could be moved outside the recursion loop defined within chain lines 80 if desired, with consequent changes to the procedure. [0243]
  • (e) The next step is to apply to the binary image obtained at step (d) of 78 above a morphological closing operation, which consists of a dilation operation followed by an erosion operation. These morphological operations fuse narrow gaps and eliminate small holes in individual groups of contiguous pixels appearing as blobs in an image. They are not essential but they improve processing. They can be thought of as removal of irregularities or spatial "noise", and they are standard image processing procedures published in Umbaugh S. C., 'Colour vision and image processing', Prentice Hall, 1998. [0244]
  • (f) A connected component labelling process is now applied to the binary image produced at step (e). This is a known image processing technique (sometimes referred to as 'blob colouring') published by R. Klette and P. Zamperoni, 'Handbook of Image Processing Operators', John Wiley & Sons, 1996, and A. Rosenfeld and A. C. Kak, 'Digital Picture Processing', Vols. 1 & 2, Academic Press, New York, 1982. It gives numerical labels to "blobs" in the binary image, blobs being regions or groups of like-valued contiguous or connected pixels in an image: i.e. each group or blob consists of connected pixels which are all 1s, and each is assigned a number different to those of other groups. This enables individual blobs to be distinguished from others by means of their labels. The number of labelled image regions or blobs in the image is computed from the labels and output. Connected component labelling also determines each labelled image region's centroid (pixel location of the region centre), height, width and area. Image regions are now removed from the binary image if they are not of interest because they are too small or too large in area, or because they have sufficiently dissimilar height and width indicating that they are flattened. The remaining regions in the binary image pass to the next stage of processing at (g). [0245]
  • Steps (a) to (f) are carried out for all three starting points or triangle vertices RMM1/CMM1, RMM2/CMM2 and RMM3/CMM3: this yields three values for the number of regions remaining in the binary image, one for each starting point. [0246]
  • (g) This step is referred to as the Downhill Simplex method: it is a standard iterative statistical technique for multidimensional optimisation published in Nelder J. A. and Mead R., Computer Journal, vol. 7, pp 308-313, 1965. It takes as input the three numbers of regions remaining after step (f). It is possible to use other optimisation techniques, such as Powell's direction-set method. The starting point/vertex yielding the lowest number of regions remaining is then selected. A new starting point is then generated as the reflection of the selected vertex in the line joining the two other vertices: i.e. if the three vertices were to have been at (1,1), (1,2) and (2,1), and (1,1) was the selected vertex, then the new starting point is (2,2). The selected vertex is then discarded and the other two retained. The new starting point or vertex becomes RMM4/CMM4, and steps (a) to (f) are repeated using it to generate a new number of regions remaining for comparison with those associated with the two retained vertices. Again the vertex yielding the lowest number of regions remaining is selected, and the process of generating new RMM/CMM values and repeating steps (a) to (f) is iterated as indicated by arrows 82. Iterations continue until the rate of change of the remaining number of image regions (cell nuclei number) slows down, i.e. when successive iterations show a change of less than 10% in this number: at that point optimisation is terminated, and the binary image remaining after step (f) selected for further processing is that generated using the RMM/CMM values giving the highest nuclei number. [0247]
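The reflection at the heart of step (g) is a one-line vector operation; the sketch below reproduces the worked example in the text. Vertex coordinates are (RMM, CMM) pairs, the scores are the region counts from step (f), and all names are illustrative.

```python
def reflect_worst(vertices, scores):
    """One simplex reflection: discard the vertex with the lowest region
    count and reflect it through the midpoint of the other two."""
    worst = min(range(3), key=lambda k: scores[k])
    others = [v for k, v in enumerate(vertices) if k != worst]
    new_vertex = (others[0][0] + others[1][0] - vertices[worst][0],
                  others[0][1] + others[1][1] - vertices[worst][1])
    return new_vertex, others

# Worked example from the text: with vertices (1,1), (1,2) and (2,1),
# selecting (1,1) gives the reflected starting point (2,2).
new_v, kept = reflect_worst([(1, 1), (1, 2), (2, 1)], [5, 9, 8])
assert new_v == (2, 2)
```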
  • The procedure 18 is now concerned with determining quantities referred to as “grand_mean” and “mean_range”, to be defined later. If the Downhill Simplex method (g) has determined that there are fewer than a user-specified number of image regions or cell nuclei, sixteen in the present example, then at 84 processing is switched to 86, indicating a problem image which is to be rejected. [0248]
  • If the Downhill Simplex method has determined that there are at least sixteen image regions, then at 84 processing is switched to 88, where a search to characterise these regions' boundaries is carried out. The search uses each region's area and centroid pixel location as obtained in connected component labelling at 78(f), and each region is assumed to be a cell with a centroid which is the centre of the cell's nucleus. This assumption is justified for most cells, but there may be misshapen cells for which it does not hold: it is possible to discard misshapen cells, by eliminating those with concave boundary regions for example, but this is not implemented in the present example. [0249]
  • The search to characterise the regions' boundaries is carried out along the respective north, south, east and west directions (as defined earlier) from the centroid (more directions may be used to improve accuracy): it is carried out in each of these directions for a distance δ which is either 140 pixels or 2√(region area), whichever is the lesser. It employs the original (2B+G)/3 cyan image because experience shows that this image gives the best-defined cell boundaries with the slide staining previously described. Designating Cij as the intensity of a region's centroid pixel in the cyan image at row i and column j, the pixels to be searched north, south, east and west of this centroid will have intensities in the cyan image of Ci+1,j to Ci+δ,j, Ci−1,j to Ci−δ,j, Ci,j+1 to Ci,j+δ and Ci,j−1 to Ci,j−δ respectively. The cyan intensity of each of the pixels to be searched is subtracted from the centroid pixel's cyan intensity Cij to produce a difference value, which may be positive or negative. In a cyan image, a cell nucleus is normally blue whereas a boundary is brown (with staining as described earlier). [0250]
  • Each pixel is then treated as being part of four linear groups or “windows” of six, twelve, twenty-four and forty-eight pixels, each including the pixel and extending from it in a continuous line north, south, east or west (as defined earlier) according respectively to whether the pixel is north, south, east or west of the centroid. In effect, pixels in each of the chosen directions have mathematical window functions applied to them, the function having the value 1 at pixels within a group and the value 0 outside it. In the linear groups in the present example, Ci+1,j is for example grouped with Ci+2,j to Ci+6,j, Ci+2,j to Ci+12,j, Ci+2,j to Ci+24,j, and Ci+2,j to Ci+48,j (inclusive in each case). This provides a total of 16δ groups, from 4δ groups in each of four directions. For each group, the difference between each of its pixels' cyan intensities and that of the centroid is calculated: the differences are summed over the group algebraically (positive and negative differences cancelling one another). This sum is divided by the number of pixels in the group to provide a net difference per pixel between the cyan intensities of the group's pixels and that of the centroid. [0251]
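A sketch of this windowed computation, together with the maximum selection described in the next paragraph; under the stated assumptions this is a boxcar correlation, and the profile and names are illustrative.

```python
import numpy as np

def max_net_difference(profile, c0, window_sizes=(6, 12, 24, 48)):
    """profile: cyan intensities along one search direction, ordered outward
    from (and excluding) the centroid; c0: the centroid's cyan intensity.
    Every pixel anchors one window of each size; the net difference per
    pixel is the mean of (c0 - intensity) over the window. Returns the
    maximum and the window size at which it occurred."""
    diffs = c0 - np.asarray(profile, dtype=float)
    best, best_window = -np.inf, None
    for w in window_sizes:
        for start in range(len(diffs) - w + 1):
            net = diffs[start:start + w].mean()
            if net > best:
                best, best_window = net, w
    return best, best_window
```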
  • For each direction, i.e. north, south, east and west, there is now a respective set of 4δ net differences per pixel: in each set the net differences per pixel are compared and their maximum value is identified. This produces a respective maximum net difference per pixel for each of the sets, i.e. for each of the north, south, east and west directions, together with the size of window (number of pixels in the group) in which the respective maximum occurred. The four maxima so obtained (one for each direction) and the respective window size in each case are stored. Each maximum is a measure of the region boundary (cell membrane) magnitude in the relevant direction, because in a cyan image the maximum difference as compared to a blue cell nucleus occurs at a brown cell boundary. The window size associated with each maximum indicates the region boundary width, because a boundary gives a higher maximum with a window size that more nearly matches its width than with one that matches it less well. Greater accuracy is obtainable by using more window sizes and windows matched to cell boundary shape, i.e. multiplying pixels in each linear group by respective values collectively forming a boundary shape function. The process is in fact mathematically a correlation operation, in which a window shape is correlated with a linear group of pixels. A further option is to record the position of the maximum or boundary (cell radius) as being that of one of the two pixels at the centre of the window in which the maximum occurs: this was not done in the present example, although it would enable misshapen cells to be detected and discarded, as indicated by significant differences in the positions of maxima in the four directions, and it would improve the width measurement by accounting for oblique intersections of windows and cell boundaries. [0252]
  • Each maximum or region boundary magnitude is then divided by the associated window size (region boundary width) used to derive it: this forms what is called, for the purposes of this specification, a normalised boundary magnitude. It is a measure of both brightness and sharpness, and it enables discrimination against ill-defined staining not attached to a cell membrane. [0253]
  • The next step 90 is to apply what is referred to as a “quicksort” to the four normalised boundary magnitudes to sort them into descending order of magnitude. Quicksort is a known technique, published by Klette R. and Zamperoni P., ‘Handbook of Image Processing Operators’, John Wiley & Sons, 1996, and will not be described. It is not essential but convenient. [0254] For each image region, the measurements made as described above are now recorded in a respective 1-dimensional vector as set out in Table 7 below: in this table the directions North, East etc. are lost in the quicksort ordering into largest, second largest, third largest and smallest. [0255]
    TABLE 7
    Item number   Parameter
    1             Largest normalised boundary magnitude
    2             Second Largest normalised boundary magnitude
    3             Third Largest normalised boundary magnitude
    4             Smallest normalised boundary magnitude
    5             Sum of Largest, Second Largest, Third Largest and Smallest normalised boundary magnitudes
  • A further quicksort is now applied (also at 90) to the image regions, to sort them into descending order of item 5 values in Table 7 above, i.e. the sum of Largest, Second Largest, Third Largest and Smallest normalised boundary magnitudes. A subset of the image regions is now selected as being those having large values of item 5: these are the most significant image regions, being the best one eighth of the total number of image regions in terms of item 5 magnitude. From this subset of image regions the following parameters are computed at 92: “grand_mean”, “mean_range” and “relative_range”, as defined below: [0256]
  • octile = one eighth of the total number of image regions or cell nuclei  (16)
  • boundaries = normalised boundary magnitudes  (17)
  • Σ = sum of … (over all boundaries in the subset or best octile)  (18)
  • item 1 = Largest normalised boundary magnitude  (19)
  • item 3 = Third Largest normalised boundary magnitude  (20)
  • grand_mean = 6 × [(Σ Largest boundaries) + (Σ Second Largest boundaries) + (Σ Third Largest boundaries) + (Σ Smallest boundaries)]/(4 × octile)  (21)
  • mean_range = [(Σ item 1) − (Σ item 3)]/octile  (22)
  • relative_range = 10 × mean_range/grand_mean  (23)
  • Grand_mean is indicative of the degree to which an image exhibits good cell boundary sharpness and brightness. Relative_range indicates the degree to which an image exhibits brightness extending around cell boundaries: the smallest boundaries (item 4) are omitted from this computation to provide some robustness against incomplete cells. A cell boundary that exhibits a large value of relative_range will have brightness varying appreciably around the boundary, corresponding to non-uniformity of staining or possibly even absence of a boundary. [0257]
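For concreteness, Equations (21) to (23) can be sketched as below, assuming the Table 7 items 1 to 4 for the best octile of regions are held in a numpy array; reading "/4octile" in Equation (21) as division by 4 × octile is an interpretation.

```python
import numpy as np

def boundary_statistics(best_octile):
    """best_octile: shape (octile, 4), rows holding items 1-4 of Table 7
    (normalised boundary magnitudes, sorted descending) for the selected
    best one-eighth of image regions."""
    octile = best_octile.shape[0]
    grand_mean = 6.0 * best_octile.sum() / (4.0 * octile)                # Eq. (21)
    mean_range = (best_octile[:, 0] - best_octile[:, 2]).sum() / octile  # Eq. (22)
    relative_range = 10.0 * mean_range / grand_mean                      # Eq. (23)
    return grand_mean, mean_range, relative_range
```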
  • At 94 an overall distance measure is computed: this measure provides an estimate of how far the current cyan image (generated at 71) is from each member of a predetermined standard set of images, four images in the present example. The standard images were obtained by dividing a large test dataset of images into four different image types, corresponding respectively to four different C-erb-2 status indicators (as will be described later in more detail). The images of each image type were analysed to determine grand mean and relative range for each image using the process 18. A respective average grand mean Mi (i = 0, 1, 2 and 3) and a respective average relative range RRi were determined for the images of each of the four image types. As an alternative, it is also possible to select four good-quality images of the relevant types by inspection from many images, and to determine Mi and RRi from them. The values Mi and RRi become the components of respective four-element vectors M and RR, and are used in the following expression: [0258]
  • C-erb-2 indicator = mini{(Mi − grand mean)² + (RRi − relative range)²}  (24)
  • where min[0259] i is the value of i (i=0, 1, 2 or 3) for which the expression within curved brackets { } on the right of Equation (24) is a minimum. For the vector M, from the dataset the following elements were determined: MO=12.32, M1=23.16, M2=42.34 and M3=87.35; elements determined likewise for the vector RR were RR0=2.501, RR1=1.85, RR2=1.111 and RR3=0.5394. The value of the index i is returned as the indicator for the C-erb-2 measurement process.
  • If a value of i = 3 is obtained in the C-erb-2 measurement process, this is regarded as a strongly positive result: the patient from whom the original tissue samples were taken is considered highly suitable for treatment, currently with Herceptin. A value of i = 2 is weakly positive, indicating doubtful suitability for treatment, and i = 1 or 0 is a negative result indicating unsuitability. This is tabulated below in Table 8. [0260]
    TABLE 8
    C-erb-2 status      i Value
    Strongly positive   3
    Weakly positive     2
    Negative            0, 1
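Equation (24) is a nearest-prototype classification; a minimal sketch using the Mi and RRi values quoted above follows.

```python
import numpy as np

M = np.array([12.32, 23.16, 42.34, 87.35])      # Mi from the dataset
RR = np.array([2.501, 1.85, 1.111, 0.5394])     # RRi from the dataset

def cerb2_indicator(grand_mean, relative_range):
    """Return the index i minimising (Mi - grand_mean)^2 + (RRi - relative_range)^2."""
    d2 = (M - grand_mean) ** 2 + (RR - relative_range) ** 2
    return int(np.argmin(d2))

# e.g. cerb2_indicator(80.0, 0.6) returns 3: strongly positive per Table 8.
```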
  • Referring now to FIG. 8, there is shown a flow diagram of the process 14 (see FIG. 1) for measurement of vascularity. The process 14 is applied to three images, each at ×20 magnification relative to the histopathological slide from which they were taken. At 100 each image is transformed from red/green/blue (RGB) space to a different image space, hue/saturation/value (HSV). The RGB to HSV transformation is described by K. Jack in ‘Video Demystified’, 2nd ed., HighText Publications, San Diego, 1996. In practice value V (or brightness) is liable to vary due to staining and thickness variations across a slide, as well as possible vignetting by a camera lens used to produce the images. In consequence, in this example the V component is ignored: it is not calculated, and emphasis is placed on the hue (or colour) and saturation values H and S. H and S are calculated for each pixel of the three RGB images as follows: [0261]
  • Let M=maximum of (R,G,B)  (25)
  • Let m=minimum of (R,G,B)  (26)
  • Then newr=(M−R)/(M−m)  (27)
  • newg=(M−G)/(M−m) and  (28)
  • newb=(M−B)/(M−m)  (29)
  • This converts each colour of a pixel into the difference between its magnitude and that of the maximum of the three colour magnitudes of that pixel, this difference being divided by the difference between the maximum and minimum of (R,G,B). [0262]
  • Saturation (S) is set as follows: [0263]
  • if M equals zero, then S=0  (30)
  • if M does not equal zero, then S=(M−m)/M  (31)
  • The calculation for Hue (H) is as follows: from Equation (25), M must be equal to at least one of R, G and B: [0264]
  • if M equals zero, then H=180  (32)
  • If M equals R then H=60(newb−newg)  (33)
  • If M equals G then H=60(2+newr−newb)  (34)
  • If M equals B then H=60(4+newg−newr)  (35)
  • If H is greater than or equal to 360 then H=H−360  (36)
  • If H is less than 0 then H=H+360  (37)
  • The Value V is not used in this example, but were it to be used it would be set to the maximum of (R,G,B). [0265]
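Equations (25) to (37) translate directly into code. The sketch below computes H and S for scalar channel values; the guard for grey pixels (M equal to m), which the text leaves undefined, is an added assumption.

```python
def hue_saturation(r, g, b):
    """Per-pixel H (degrees) and S following Equations (25) to (37)."""
    M, m = max(r, g, b), min(r, g, b)
    if M == 0:
        return 180.0, 0.0                 # Eqs (30) and (32)
    S = (M - m) / M                       # Eq. (31)
    if M == m:
        return 0.0, S                     # grey pixel: added guard, hue arbitrary
    newr = (M - r) / (M - m)              # Eq. (27)
    newg = (M - g) / (M - m)              # Eq. (28)
    newb = (M - b) / (M - m)              # Eq. (29)
    if M == r:
        H = 60.0 * (newb - newg)          # Eq. (33)
    elif M == g:
        H = 60.0 * (2 + newr - newb)      # Eq. (34)
    else:
        H = 60.0 * (4 + newg - newr)      # Eq. (35)
    if H >= 360.0:
        H -= 360.0                        # Eq. (36)
    if H < 0.0:
        H += 360.0                        # Eq. (37)
    return H, S
```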
  • The next step 102 is to apply colour segmentation to obtain a binary image. This segmentation is based on thresholding using the Hue and Saturation from the HSV colour space, and is shown in Table 9 below. [0266]
    TABLE 9
    Threshold Criterion                                     Binary Image Pixel Value
    Pixel with both Hue H in the range 282-356 degrees      Set pixel to 1
    (scale 0 to 360) and Saturation S in the range
    0.2 to 0.24 (scale 0 to 1)
    Pixel with either Hue outside the range 282-356         Set pixel to 0
    degrees and/or Saturation outside the range 0.2-0.24
  • This produces a segmented binary image in which pixels set to 1 are processed further and those set to 0 are discarded. [0267]
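A sketch of the Table 9 thresholding, assuming per-pixel numpy arrays of H and S and treating the range endpoints as inclusive (the table leaves this open).

```python
import numpy as np

def segment(H, S):
    """Binary image per Table 9: 1 where H is in 282-356 degrees and S is
    in 0.2-0.24, 0 elsewhere."""
    keep = (H >= 282) & (H <= 356) & (S >= 0.2) & (S <= 0.24)
    return keep.astype(np.uint8)
```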
  • The next stage 104 is to apply connected component labelling (as defined previously) to the segmented binary image: this provides a binary image with regions of contiguous pixels equal to 1, the regions being uniquely labelled for further processing and their areas being determined. The labelled binary image is then spatially filtered to remove small connected components (image regions with fewer than 10 pixels), which have insufficient pixels to contribute to vascularity: this provides a reduced binary image. [0268]
  • The sum of the areas of the remaining image regions in the reduced binary image is then determined at 106 from the results of connected component labelling, and this sum is expressed as a percentage of the area of the whole image. This procedure is carried out for each of the original RGB images separately to provide three such percentage area values: the average of the three percentage area values is computed, and it represents an estimate of the percentage of the area of a tissue sample occupied by blood vessels, i.e. the sample vascularity. [0269]
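Stages 104 and 106 can be sketched together as follows (scipy.ndimage used for the labelling; names are illustrative).

```python
import numpy as np
from scipy import ndimage

def vascularity_percent(binary):
    """Label the segmented binary image, drop regions of fewer than 10
    pixels, and return the surviving area as a percentage of the image."""
    labels, n = ndimage.label(binary)
    areas = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    kept = areas[areas >= 10].sum()
    return 100.0 * kept / binary.size

# The average of this value over the three images estimates the sample
# vascularity, classed as high when it reaches at least 31% (Table 10).
```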
  • As set out in Table 10 below, vascularity is classed as high or low according to whether or not it is at least 31%. [0270]
    TABLE 10
    Description of vascularity   Range
    High                         31%-100%
    Low                          0%-30%
  • High vascularity corresponds to relatively fast tumour growth because tumour blood supply has been facilitated, and early treatment is indicated. Low vascularity corresponds to relatively slow tumour growth, and early treatment is less important. [0271]
  • The procedures given in the foregoing description for calculating quantities and results can clearly be evaluated by an appropriate computer program recorded on a carrier medium and running on a conventional computer system. Such a program is straightforward for a skilled programmer to implement without requiring invention, because the mathematical expressions used are well known computational procedures. Such a program and system will therefore not be described. [0272]
  • The process steps described in the examples of all three inventions herein are not all essential, and alternatives may be provided. It is for example possible to omit the step of ignoring unsuitably small areas when selecting areas for later processing, if the consequent increase in processing burden is acceptable. The above examples are intended to provide an enabling disclosure, not to limit the invention. [0273]

Claims (105)

1. A method of measuring oestrogen or progesterone receptor (ER or PR) status having the steps of:
a) obtaining histopathological specimen image data; and
b) identifying in the image data groups of contiguous pixels corresponding to respective cell nuclei;
characterised in that the method also includes the steps of:
c) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
d) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
e) determining ER or PR status from proportion of pixels corresponding to preferentially stained cells.
2. A method of measuring ER or PR status having the steps of:
a) obtaining histopathological specimen image data; and
b) identifying in the image data groups of contiguous pixels corresponding to respective cell nuclei;
characterised in that the method also includes the steps of:
c) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
d) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
e) determining ER or PR status from normalised average saturation.
3. A method of measuring ER or PR status having the steps of:
a) obtaining histopathological specimen image data; and
b) identifying in the image data groups of contiguous pixels corresponding to respective cell nuclei;
characterised in that the method also includes the steps of:
c) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
d) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
e) determining ER or PR status from normalised average saturation and fraction of pixels corresponding to preferentially stained cells.
4. A method according to claim 3 characterised in that step b) is implemented using a K-means clustering algorithm.
5. A method according to claim 4 characterised in that the K-means clustering algorithm employs a Mahalanobis distance metric.
6. A method according to claim 3 characterised in that step c) is implemented by transforming the image data into a chromaticity space, and deriving hue and saturation from image pixels and a reference colour.
7. A method according to claim 6 characterised in that hue is obtained from an angle φ equal to
sin⁻¹(|x̃y − xỹ|/(√(x̃² + ỹ²) × √(x² + y²)))
and saturation from an expression
(xx̃ + yỹ)/√(x̃² + ỹ²),
where (x, y) and (x̃, ỹ) are respectively image pixel coordinates and reference colour coordinates in the chromaticity space.
8. A method according to claim 6 characterised in that hue is adapted to lie in the range 0 to 90 degrees and a hue threshold of 80 degrees is set in step d).
9. A method according to claim 6 or 8 characterised in that a saturation threshold So is set in step d), So being 0.9 for saturation in the range 0.1 to 1.9 and 0 for saturation outside this range.
10. A method according to claim 3 characterised in that the fraction of pixels corresponding to preferentially stained cells is determined by counting the number of pixels having both saturation greater than a saturation threshold and hue modulus less than a hue threshold and expressing such number as a fraction of a total number of pixels in the image.
11. A method according to claim 3 characterised in that the normalised average saturation is accorded a score 0, 1, 2 or 3 according respectively to whether it is (i) ≦25%, (ii) >25% and ≦50%, (iii) >50% and ≦75% or (iv) >75% and ≦100%.
12. A method according to claim 11 characterised in that the fraction of pixels corresponding to preferentially stained cells is accorded a score 0, 1, 2, 3, 4 or 5 according respectively to whether it is (i) 0, (ii) >0 and <0.01, (iii) ≧0.01 and ≦0.10, (iv) ≧0.11 and ≦0.33, (v) ≧0.34 and ≦0.66 or (vi) ≧0.67 and ≦1.0.
13. A method according to claim 12 characterised in that the scores for normalised average saturation and fraction of pixels corresponding to preferentially stained cells are added together to provide a measurement of ER or PR.
14. A method according to claim 3 characterised in that the fraction of pixels corresponding to preferentially stained cells is accorded a score 0, 1, 2, 3, 4 or 5 according respectively to whether it is (i) 0, (ii) >0 and <0.01, (iii) ≧0.01 and ≦0.10, (iv) ≧0.11 and ≦0.33, (v) ≧0.34 and ≦0.66 or (vi) ≧0.67 and ≦1.0.
15. A method according to claim 3 characterised in that step e) is carried out by obtaining a score for normalised average saturation and a score for fraction of pixels corresponding to preferentially stained cells and adding the scores together.
16. A method according to claim 1, 2 or 3 characterised in that it also includes measuring C-erb-2 status by the following steps:
a) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries,
b) computing brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries,
c) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and
d) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data.
17. A method according to claim 1, 2, 3 or 16 characterised in that it also includes measuring vascularity by the following steps:
a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
b) producing a segmented image by thresholding the image data on the basis of hue and saturation;
c) identifying in the segmented image groups of contiguous pixels; and
d) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area.
18. A method of measuring C-erb-2 status having the steps of:
a) obtaining histopathological specimen image data; and
b) identifying in the image data contiguous pixel groups corresponding to respective cell nuclei associated with surrounding cell boundary staining;
characterised in that the method also includes the steps of:
c) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries,
d) computing brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries,
e) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and
f) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data.
19. A method according to claim 18 characterised in that at least some of the window functions have non-zero values over lengths of 6, 12, 24 and 48 pixels respectively and zero values elsewhere.
20. A method according to claim 18 characterised in that pixels associated with a cell boundary are identified from a maximum correlation with a window function, the window function having a length which provides an estimate of cell boundary width.
21. A method according to claim 18 characterised in that a brightness-related measure of cell boundary brightness and sharpness is computed in step d) using a calculation including dividing cell boundaries by their respective widths to provide normalised boundary magnitudes, selecting a fraction of the normalised boundary magnitudes each greater than unselected equivalents and summing the normalised boundary magnitudes of the selected fraction.
22. A method according to claim 21 characterised in that in step d) a brightness-related measure of brightness extent around cell boundaries is computed using a calculation including dividing normalised boundary magnitudes into different magnitude groups each associated with a respective range of magnitudes, providing a respective magnitude sum of normalised boundary magnitudes for each magnitude group, and subtracting a smaller magnitude sum from a larger magnitude sum.
23. A method according to claim 22 characterised in that the comparison image having brightness-related measures closest to those determined for the image data is determined from a Euclidean distance between the brightness-related measures of the comparison image and the image data.
24. A method according to claim 18 characterised in that in step b) identifying in the image data contiguous pixel groups corresponding to respective cell nuclei is carried out by an adaptive thresholding technique arranged to maximise the number of contiguous pixel groups identified.
25. A method according to claim 24 wherein the image data includes red, green and blue image planes characterised in that the adaptive thresholding technique includes:
a) generating a mean value μR and a standard deviation σR for pixels in the red image plane,
b) generating a cyan image plane from the image data and calculating a mean value μC for its pixels,
c) calculating a product CMMμC where CMM is a predetermined multiplier,
d) calculating a quantity RB equal to the number of adjacent linear groups of pixels of predetermined length and including at least one cyan pixel which is less than CMMμC,
e) for each red pixel calculating a threshold equal to {RMMμR−σR(4−RB)}, where RMM is a predetermined multiplier,
f) forming a thresholded red image by discarding each red pixel that is greater than or equal to the threshold,
g) determining the number of contiguous pixel groups in the thresholded red image,
h) changing the values of RMM and CMM and iterating steps c) to g),
i) changing the values of RMM and CMM once more and iterating steps c) to g),
j) comparing the numbers of contiguous pixel groups determined in steps g) to i), treating the three pairs of values of RMM and CMM as points in a two dimensional space, selecting the pair of values of RMM and CMM associated with the lowest number of contiguous pixel groups, obtaining its reflection in the line joining the other two pairs of values of RMM and CMM, using this reflection as a new pair of values of RMM and CMM and iterating steps c) to g) and this step j).
26. A method according to claim 25 characterised in that the first three pairs of RMM and CMM values referred to in step j) are 0.802 and 1.24, 0.903 and 0.903, and 1.24 and 0.802 respectively.
27. A method according to claim 25 characterised in that it includes prior to step g) removing brown pixels from the thresholded red image if like-located pixels in the cyan image are less than CMMμC.
28. A method according to claim 25 characterised in that it includes prior to step g) forming a Sobel-filtered cyan image, generating a standard deviation σC for its pixels and removing edge pixels from the thresholded red image if like-located pixels in the Sobel-filtered cyan image are greater than (μC+1.5σC).
29. A method according to claim 25 characterised in that it includes prior to step g) removing pixels corresponding to lipids from the thresholded red image if their red, green and blue pixel values are all greater than the sum of the relevant colour's minimum value and 98% of its range of pixel values in each case.
30. A method according to claim 25 characterised in that it includes prior to step g) subjecting the thresholded red image to a morphological closing operation.
31. A method of measuring vascularity having the steps of:
a) obtaining histopathological specimen image data;
characterised in that the method also includes the steps of:
b) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
c) producing a segmented image by thresholding the image data on the basis of hue and saturation; and
d) identifying in the segmented image groups of contiguous pixels; and
e) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area.
32. A method according to claim 31 wherein the image data comprises pixels with red, green and blue values designated R, G and B respectively, characterised in that a respective saturation value S is derived in step b) for each pixel by:
i) defining M and m for each pixel as respectively the maximum and minimum of R, G and B; and
ii) setting S to zero if M equals zero and setting S to (M−m)/M otherwise.
33. A method according to claim 32 characterised in that hue values designated H are derived by:
a) defining new values newr, newg and newb for each pixel given by newr=(M−R)/(M−m), newg=(M−G)/(M−m) and newb=(M−B)/(M−m) in order to convert each pixel value into the difference between its magnitude and that of the maximum of the three colour magnitudes of that pixel, this difference being divided by the difference between the maximum and minimum of R, G and B, and
b) calculating H as tabulated immediately below:
M    H
0    180
R    60(newb − newg)*
G    60(2 + newr − newb)*
B    60(4 + newg − newr)*
34. A method according to claim 33 characterised in that the step of producing a segmented image is implemented by designating for further processing only those pixels having both a hue H in the range 282-356 and a saturation S in the range 0.2 to 0.24.
35. A method according to claim 34 characterised in that the step of identifying in the segmented image groups of contiguous pixels includes the step of spatially filtering such groups to remove groups having insufficient pixels to contribute to vascularity.
36. A method according to claim 35 characterised in that the step of determining vascularity includes treating vascularity as having a high or a low value according to whether or not it is at least 31%.
37. A computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of:
a) processing histopathological specimen image data to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei;
characterised in that the program is also arranged to implement the steps of:
b) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
c) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
d) determining ER or PR status from proportion of pixels corresponding to preferentially stained cells.
38. A computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of:
a) processing histopathological specimen image data to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei;
characterised in that the program is also arranged to implement the steps of:
b) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
c) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
d) determining ER or PR status from normalised average saturation.
39. A computer program for measuring ER or PR status, the program being arranged to control computer apparatus to execute the steps of:
a) processing histopathological specimen image data to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei;
characterised in that the program is also arranged to implement the steps of:
b) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
c) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
d) determining ER or PR status from normalised average saturation and fraction of pixels corresponding to preferentially stained cells.
40. A computer program according to claim 39 characterised in that step a) is implemented using a K-means clustering algorithm.
41. A computer program according to claim 39 characterised in that step b) is implemented by transforming the image data into a chromaticity space, and deriving hue and saturation from image pixels and a reference colour.
42. A computer program according to claim 41 characterised in that hue is obtained from an angle φ equal to
sin⁻¹(|x̃y − xỹ|/(√(x̃² + ỹ²) × √(x² + y²)))
and saturation from an expression
(xx̃ + yỹ)/√(x̃² + ỹ²),
where (x, y) and (x̃, ỹ) are respectively image pixel coordinates and reference colour coordinates in the chromaticity space.
43. A computer program according to claim 41 characterised in that hue is adapted to lie in the range 0 to 90 degrees and a hue threshold of 80 degrees is set in step c).
44. A computer program according to claim 41 characterised in that a saturation threshold So is set in step c), So being 0.9 for saturation in the range 0.1 to 1.9 and 0 for saturation outside this range.
45. A computer program according to claim 39 characterised in that the fraction of pixels corresponding to preferentially stained cells is determined by counting the number of pixels having both saturation greater than a saturation threshold and hue modulus less than a hue threshold and expressing such number as a fraction of a total number of pixels in the image.
46. A computer program according to claim 39 characterised in that the normalised average saturation is accorded a score 0, 1, 2 or 3 according respectively to whether it is (i) ≦25%, (ii) >25% and ≦50%, (iii) >50% and ≦75% or (iv) >75% and ≦100%.
47. A computer program according to claim 46 characterised in that the fraction of pixels corresponding to preferentially stained cells is accorded a score 0, 1, 2, 3, 4 or 5 according respectively to whether it is (i) 0, (ii) >0 and <0.01, (iii) ≧0.01 and ≦0.10, (iv) ≧0.11 and ≦0.33, (v) ≧0.34 and ≦0.66 or (vi) ≧0.67 and ≦1.0.
48. A computer program according to claim 47 characterised in that the scores for normalised average saturation and fraction of pixels corresponding to preferentially stained cells are added together to provide a measurement of ER or PR.
49. A computer program according to claim 39 characterised in that the fraction of pixels corresponding to preferentially stained cells is accorded a score 0, 1, 2, 3, 4 or 5 according respectively to whether it is (i) 0, (ii) >0 and <0.01, (iii) ≧0.01 and ≦0.10, (iv) ≧0.11 and ≦0.33, (v) ≧0.34 and ≦0.66 or (vi) ≧0.67 and ≦1.0.
50. A computer program according to claim 39 characterised in that step e) is carried out by obtaining a score for normalised average saturation and a score for fraction of pixels corresponding to preferentially stained cells and adding the scores together.
51. A computer program according to claim 37, 38 or 39 characterised in that it is also arranged for derivation of a measure of C-erb-2 status by:
a) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries,
b) computing brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries,
c) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and
d) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data.
52. A computer program according to claim 37, 38, 39 or 51 characterised in that it is also arranged for derivation of a measure of vascularity by:
a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
b) producing a segmented image by thresholding the image data on the basis of hue and saturation; and
c) identifying in the segmented image groups of contiguous pixels; and
d) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area.
53. A computer program for use in measuring C-erb-2 status arranged to control computer apparatus to execute the steps of:
a) processing histopathological specimen image data to identify contiguous pixel groups corresponding to respective cell nuclei associated with surrounding cell boundary staining;
characterised in that the computer program is also arranged to implement the steps of:
b) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries,
c) computing brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries,
d) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and
e) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data.
54. A computer program according to claim 53 characterised in that at least some of the window functions have non-zero values over lengths of 6, 12, 24 and 48 pixels respectively and zero values elsewhere.
55. A computer program according to claim 53 characterised in that pixels associated with a cell boundary are identified from a maximum correlation with a window function, the window function having a length which provides an estimate of cell boundary width.
56. A computer program according to claim 53 characterised in that in step d) a brightness-related measure of cell boundary brightness and sharpness is computed using a calculation including dividing cell boundaries by their respective widths to provide normalised boundary magnitudes, selecting a fraction of the normalised boundary magnitudes each greater than unselected equivalents and summing the normalised boundary magnitudes of the selected fraction.
57. A computer program according to claim 53 characterised in that in step d) a brightness-related measure of brightness extent around cell boundaries is computed using a calculation including dividing normalised boundary magnitudes into different magnitude groups each associated with a respective range of magnitudes, providing a respective magnitude sum of normalised boundary magnitudes for each magnitude group, and subtracting a smaller magnitude sum from a larger magnitude sum.
58. A computer program according to claim 57 characterised in that the comparison image having brightness-related measures closest to those determined for the image data is determined from a Euclidean distance between the brightness-related measures of the comparison image and the image data.
59. A computer program according to claim 53 characterised in that in step b) identifying in the image data contiguous pixel groups corresponding to respective cell nuclei is carried out by an adaptive thresholding technique arranged to maximise the number of contiguous pixel groups identified.
60. A computer program according to claim 59 wherein the image data includes red, green and blue image planes characterised in that the adaptive thresholding technique includes:
a) generating a mean value μR and a standard deviation σR for pixels in the red image plane,
b) generating a cyan image plane from the image data and calculating a mean value μC for its pixels,
c) calculating a product CMMμC where CMM is a predetermined multiplier,
d) calculating a quantity RB equal to the number of adjacent linear groups of pixels of predetermined length and including at least one cyan pixel which is less than CMMμC,
e) for each red pixel calculating a threshold equal to {RMMμR−σR(4−RB)}, where RMM is a predetermined multiplier,
f) forming a thresholded red image by discarding each red pixel that is greater than or equal to the threshold,
g) determining the number of contiguous pixel groups in the thresholded red image,
h) changing the values of RMM and CMM and iterating steps c) to g),
i) changing the values of RMM and CMM once more and iterating steps c) to g),
j) comparing the numbers of contiguous pixel groups determined in steps g) to i), treating the three pairs of values of RMM and CMM as points in a two dimensional space, selecting the pair of values of RMM and CMM associated with the lowest number of contiguous pixel groups, obtaining its reflection in the line joining the other two pairs of values of RMM and CMM, using this reflection as a new pair of values of RMM and CMM and iterating steps c) to g) and this step j).
61. A computer program according to claim 60 characterised in that the first three pairs of RMM and CMM values referred to in step j) are 0.802 and 1.24, 0.903 and 0.903, and 1.24 and 0.802 respectively.
62. A computer program according to claim 60 characterised in that the adaptive thresholding technique includes prior to step g) removing brown pixels from the thresholded red image if like-located pixels in the cyan image are less than CMMμC.
63. A computer program according to claim 60 characterised in that the adaptive thresholding technique includes prior to step g) forming a Sobel-filtered cyan image, generating a standard deviation σC for its pixels and removing edge pixels from the thresholded red image if like-located pixels in the Sobel-filtered cyan image are greater than (μC+1.5σC).
64. A computer program according to claim 60 characterised in that the adaptive thresholding technique includes prior to step g) removing pixels corresponding to lipids from the thresholded red image if their red, green and blue pixel values are all greater than the sum of the relevant colour's minimum value and 98% of its range of pixel values in each case.
65. A computer program according to claim 60 characterised in that the adaptive thresholding technique includes prior to step g) subjecting the thresholded red image to a morphological closing operation.
66. A computer program for use in measuring vascularity characterised in that it is arranged to control computer apparatus to execute the steps of:
a) using histopathological specimen image data to derive hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
b) producing a segmented image by thresholding the image data on the basis of hue and saturation; and
c) identifying in the segmented image groups of contiguous pixels; and
d) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area.
67. A computer program according to claim 66 wherein the image data comprises pixels with red, green and blue values designated R, G and B respectively, characterised in that a respective saturation value S is derived in step b) for each pixel by:
i) defining M and m for each pixel as respectively the maximum and minimum of R, G and B; and
ii) setting S to zero if M equals zero and setting S to (M−m)/M otherwise.
68. A computer program according to claim 67 characterised in that hue values designated H are derived by:
a) defining new values newr, newg and newb for each pixel given by newr=(M−R)/(M−m), newg=(M−G)/(M−m) and newb=(M−B)/(M−m) in order to convert each pixel value into the difference between its magnitude and that of the maximum of the three colour magnitudes of that pixel, this difference being divided by the difference between the maximum and minimum of R, G and B, and
b) calculating H as tabulated immediately below:
M    H
0    180
R    60(newb − newg)*
G    60(2 + newr − newb)*
B    60(4 + newg − newr)*
69. A computer program according to claim 68 characterised in that the step of producing a segmented image is implemented by designating for further processing only those pixels having both a hue H in the range 282-356 and a saturation S in the range 0.2 to 0.24.
70. A computer program according to claim 69 characterised in that the step of identifying in the segmented image groups of contiguous pixels includes the step of spatially filtering such groups to remove groups having insufficient pixels to contribute to vascularity.
71. A computer program according to claim 70 characterised in that the step of determining vascularity includes treating vascularity as having a high or a low value according to whether or not it is at least 31%.
72. Apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of:
a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
b) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
c) determining ER or PR status from proportion of pixels corresponding to preferentially stained cells.
73. Apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of:
a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
b) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
c) determining ER or PR status from normalised average saturation.
74. Apparatus for measuring ER or PR status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of:
a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
b) thresholding the image data on the basis of hue and saturation and identifying pixels corresponding to cells which are preferentially stained relative to surrounding specimen tissue; and
c) determining ER or PR status from normalised average saturation and fraction of pixels corresponding to preferentially stained cells.
75. Apparatus according to claim 74 characterised in that step a) is implemented by transforming the image data into a chromaticity space, and deriving hue and saturation from image pixels and a reference colour.
76. Apparatus according to claim 75 characterised in that hue is obtained from an angle φ equal to
sin⁻¹(|x̃y − xỹ|/(√(x̃² + ỹ²) × √(x² + y²)))
and saturation from an expression
(xx̃ + yỹ)/√(x̃² + ỹ²),
where (x, y) and (x̃, ỹ) are respectively image pixel coordinates and reference colour coordinates in the chromaticity space.
77. Apparatus according to claim 76 characterised in that hue is adapted to lie in the range 0 to 90 degrees and a hue threshold of 80 degrees is set in step b).
78. Apparatus according to claim 74 characterised in that a saturation threshold So is set in step b), So being 0.9 for saturation in the range 0.1 to 1.9 and 0 for saturation outside this range.
79. Apparatus according to claim 74 characterised in that the fraction of pixels corresponding to preferentially stained cells is determined by counting the number of pixels having both saturation greater than a saturation threshold and hue modulus less than a hue threshold and expressing such number as a fraction of a total number of pixels in the image.
80. Apparatus according to claim 74 characterised in that the normalised average saturation is accorded a score 0, 1, 2 or 3 according respectively to whether it is (i) ≦25%, (ii) >25% and ≦50%, (iii) >50% and ≦75% or (iv) >75% and ≦100%.
81. Apparatus according to claim 80 characterised in that the fraction of pixels corresponding to preferentially stained cells is accorded a score 0, 1, 2, 3, 4 or 5 according respectively to whether it is (i) 0, (ii) >0 and <0.01, (iii) ≧0.01 and ≦0.10, (iv) ≧0.11 and ≦0.33, (v) ≧0.34 and ≦0.66 or (vi) ≧0.67 and ≦1.0.
82. Apparatus according to claim 81 characterised in that the scores for normalised average saturation and fraction of pixels corresponding to preferentially stained cells are added together to provide a measurement of ER or PR.
83. Apparatus according to claim 74 characterised in that the fraction of pixels corresponding to preferentially stained cells is accorded a score 0, 1, 2, 3, 4 or 5 according respectively to whether it is (i) 0, (ii) >0 and <0.01, (iii) ≧0.01 and ≦0.10, (iv) ≧0.11 and ≦0.33, (v) ≧0.34 and ≦0.66 or (vi) ≧0.67 and ≦1.0.
84. Apparatus according to claim 74 characterised in that step c) is carried out by obtaining a score for normalised average saturation and a score for fraction of pixels corresponding to preferentially stained cells and adding the scores together.
85. Apparatus according to claim 72, 73 or 74 characterised in that it is also arranged to determine C-erb-2 status and the computer apparatus is also programmed to:
a) correlate window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries,
b) compute brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries,
c) compare the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and
d) assign to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data.
86. Apparatus according to claim 72, 73, 74 or 85 characterised in that it is also arranged to determine vascularity and the computer apparatus is also programmed to:
a) derive hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
b) produce a segmented image by thresholding the image data on the basis of hue and saturation;
c) identify in the segmented image groups of contiguous pixels; and
d) determine vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area.
87. Apparatus for measuring C-erb-2 status including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, the computer apparatus being programmed to identify in the image data groups of contiguous pixels corresponding to respective cell nuclei, characterised in that the computer apparatus is also programmed to execute the steps of:
a) correlating window functions of different lengths with pixel sub-groups within the identified contiguous pixels groups to identify pixels associated with cell boundaries,
b) computing brightness-related measures of cell boundary brightness and sharpness and brightness extent around cell boundaries from pixels corresponding to cell boundaries,
c) comparing the brightness-related measures with predetermined equivalents obtained from comparison images associated with different values of C-erb-2, and
d) assigning to the image data a C-erb-2 value which is that associated with the comparison image having brightness-related measures closest to those determined for the image data.
88. Apparatus according to claim 87 characterised in that at least some of the window functions have non-zero values over lengths of 6, 12, 24 and 48 pixels respectively and zero values elsewhere.
89. Apparatus according to claim 87 characterised in that the computer apparatus is programmed to identify pixels associated with a cell boundary from a maximum correlation with a window function, the window function having a length which provides an estimate of cell boundary width.
90. Apparatus according to claim 87 characterised in that the computer apparatus is programmed to execute step b) by computing a brightness-related measure of cell boundary brightness and sharpness using a calculation including dividing cell boundaries by their respective widths to provide normalised boundary magnitudes, selecting a fraction of the normalised boundary magnitudes each greater than unselected equivalents and summing the normalised boundary magnitudes of the selected fraction.
91. Apparatus according to claim 87 characterised in that the computer apparatus is programmed to execute step b) by computing a brightness-related measure of brightness extent around cell boundaries using a calculation including dividing normalised boundary magnitudes into different magnitude groups each associated with a respective range of magnitudes, providing a respective magnitude sum of normalised boundary magnitudes for each magnitude group, and subtracting a smaller magnitude sum from a larger magnitude sum.
92. Apparatus according to claim 91 characterised in that the computer apparatus is programmed to determine the comparison image having brightness-related measures closest to those determined for the image data from a Euclidean distance between the brightness-related measures of the comparison image and the image data.
93. Apparatus according to claim 87 characterised in that the computer apparatus is programmed to identify in the image data contiguous pixel groups corresponding to respective cell nuclei by an adaptive thresholding technique arranged to maximise the number of contiguous pixel groups identified.
94. Apparatus according to claim 93 wherein the image data includes red, green and blue image planes characterised in that the adaptive thresholding technique includes:
a) generating a mean value μR and a standard deviation σR for pixels in the red image plane,
b) generating a cyan image plane from the image data and calculating a mean value μC for its pixels,
c) calculating a product CMMμC where CMM is a predetermined multiplier,
d) for each red pixel, calculating a quantity RB equal to the number of adjacent linear groups of pixels of predetermined length each including at least one cyan pixel which is less than CMMμC,
e) for each red pixel calculating a threshold equal to {RMMμR−σR(4−RB)}, where RMM is a predetermined multiplier,
f) forming a thresholded red image by discarding each red pixel that is greater than or equal to the threshold,
g) determining the number of contiguous pixel groups in the thresholded red image,
h) changing the values of RMM and CMM and iterating steps c) to g),
i) changing the values of RMM and CMM once more and iterating steps c) to g),
j) comparing the numbers of contiguous pixel groups determined in steps g) to i), treating the three pairs of values of RMM and CMM as points in a two dimensional space, selecting the pair of values of RMM and CMM associated with the lowest number of contiguous pixel groups, obtaining its reflection in the line joining the other two pairs of values of RMM and CMM, using this reflection as a new pair of values of RMM and CMM and iterating steps c) to g) and this step j).
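The search in steps h) to j) amounts to a two-parameter reflection search, akin to a simplex move, seeking the (RMM, CMM) pair that maximises the nucleus count. A minimal sketch follows, under stated assumptions: the cyan plane is taken as the mean of the green and blue planes, RB is evaluated over four wrap-around pixel runs of an assumed length, the start pairs are those given in claim 95 below, and a fixed iteration count stands in for an unspecified stopping rule.

import numpy as np
from scipy.ndimage import label

def count_groups(rgb, rmm, cmm, line_len=5):
    # Steps a) to g) of claim 94 for one (RMM, CMM) pair; the group count
    # is the quantity the search then tries to maximise (claim 93).
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    cyan = (g + b) / 2.0                         # assumed cyan plane
    mu_r, sigma_r, mu_c = r.mean(), r.std(), cyan.mean()
    dark_cyan = cyan < cmm * mu_c                # steps c) and d)
    # step d): per pixel, RB counts how many of four adjacent pixel runs
    # (up, down, left, right; length assumed) hold a dark cyan pixel.
    rb = np.zeros_like(r)
    for axis, sign in ((0, 1), (0, -1), (1, 1), (1, -1)):
        hit = np.zeros(r.shape, dtype=bool)
        for k in range(1, line_len + 1):
            hit |= np.roll(dark_cyan, sign * k, axis=axis)   # wraps at edges
        rb += hit
    threshold = rmm * mu_r - sigma_r * (4.0 - rb)            # step e)
    return label(r < threshold)[1]               # steps f) and g)

def reflect(p, a, b):
    # Step j): mirror point p in the straight line through a and b.
    d = (b - a) / np.linalg.norm(b - a)
    v = p - a
    return a + 2.0 * np.dot(v, d) * d - v

def search_rmm_cmm(rgb, iters=20):
    pts = [np.array(p) for p in ((0.802, 1.24), (0.903, 0.903), (1.24, 0.802))]
    for _ in range(iters):                       # steps h) to j)
        scores = [count_groups(rgb, *p) for p in pts]
        worst = int(np.argmin(scores))
        a, b = (pts[i] for i in range(3) if i != worst)
        pts[worst] = reflect(pts[worst], a, b)
    return max(pts, key=lambda p: count_groups(rgb, *p))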
95. Apparatus according to claim 94 characterised in that the first three pairs of RMM and CMM values referred to in step j) are 0.802 and 1.24, 0.903 and 0.903, and 1.24 and 0.802 respectively.
96. Apparatus according to claim 94 characterised in that the computer apparatus is programmed to remove brown pixels from the thresholded red image prior to step g) if like-located pixels in the cyan image are less than CMMμC.
97. Apparatus according to claim 94 characterised in that the computer apparatus is programmed to form a Sobel-filtered cyan image, generate a standard deviation σC for its pixels and remove edge pixels from the thresholded red image prior to step g) if like-located pixels in the Sobel-filtered cyan image are greater than (μC+1.5σC).
98. Apparatus according to claim 94 characterised in that the computer apparatus is programmed to remove pixels corresponding to lipids from the thresholded red image prior to step g) if their red, green and blue pixel values are all greater than the sum of the relevant colour's minimum value and 98% of its range of pixel values in each case.
99. Apparatus according to claim 94 characterised in that the computer apparatus is programmed to subject the thresholded red image to a morphological closing operation prior to step g).
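Claims 96 to 99 prune the thresholded red image before the contiguous groups are counted. A minimal sketch of those four clean-up steps, assuming scipy's Sobel and morphology operators and taking the lipid test of claim 98 literally:

import numpy as np
from scipy.ndimage import sobel, binary_closing

def clean_thresholded_red(mask, rgb, cyan, cmm, mu_c):
    # mask is the boolean thresholded red image of claim 94; cyan (float)
    # and mu_c are the cyan plane and its mean from the same claim.
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    # claim 96: drop pixels whose like-located cyan value is below CMM*muC
    mask = mask & ~(cyan < cmm * mu_c)
    # claim 97: drop pixels on strong edges of the cyan plane (Sobel
    # magnitude above muC plus 1.5 standard deviations of the edge image)
    edge = np.hypot(sobel(cyan, axis=0), sobel(cyan, axis=1))
    mask = mask & ~(edge > mu_c + 1.5 * edge.std())
    # claim 98: drop lipid pixels, read as all of R, G and B exceeding
    # that colour's minimum plus 98% of its range
    lipid = np.ones(mask.shape, dtype=bool)
    for p in (r, g, b):
        lipid &= p > p.min() + 0.98 * (p.max() - p.min())
    mask = mask & ~lipid
    # claim 99: morphological closing before the groups are counted
    return binary_closing(mask)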
100. Apparatus for measuring vascularity including means for photographing histopathological specimens to provide image data and computer apparatus to process the image data, characterised in that the computer apparatus is also programmed to execute the steps of:
a) deriving hue and saturation for the image data in a colour space having a hue coordinate and a saturation coordinate;
b) producing a segmented image by thresholding the image data on the basis of hue and saturation;
c) identifying in the segmented image groups of contiguous pixels; and
d) determining vascularity from the total area of the groups of contiguous pixels which are sufficiently large to correspond to vascularity, such area being expressed as a proportion of the image data's total area.
101. Apparatus according to claim 100 wherein the image data comprises pixels with red, green and blue values designated R, G and B respectively, characterised in that the computer apparatus is programmed to derive a respective saturation value S for each pixel in step b) by:
i) defining M and m for each pixel as respectively the maximum and minimum of R, G and B; and
ii) setting S to zero if M equals zero and setting S to (M−m)/M otherwise.
102. Apparatus according to claim 101 characterised in that the computer apparatus is programmed to derive hue values designated H by:
a) defining new values newr, newg and newb for each pixel given by newr=(M−R)/(M−m), newg=(M−G)/(M−m) and newb=(M−B)/(M−m) in order to convert each pixel value into the difference between its magnitude and that of the maximum of the three colour magnitudes of that pixel, this difference being divided by the difference between the maximum and minimum of R, G and B, and
b) calculating H as tabulated immediately below:
M    H
0    180
R    60(newb − newg)*
G    60(2 + newr − newb)*
B    60(4 + newg − newr)*
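Apart from the fixed value of 180 for the achromatic row, this table is the standard RGB-to-HSV hue conversion. A per-pixel transcription of claims 101 and 102 follows; reading the table's "0" row as the M = m case, the wrap of negative hues into 0 to 360 (without which the range 282 to 356 in claim 103 below would be unreachable) and the division-by-zero guards are assumptions.

import numpy as np

def hue_saturation(rgb):
    # rgb is an (..., 3) float array of R, G and B values.
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    M = np.maximum(np.maximum(R, G), B)
    m = np.minimum(np.minimum(R, G), B)
    # claim 101: S = 0 where M == 0, else (M - m)/M
    S = np.where(M == 0, 0.0, (M - m) / np.where(M == 0, 1.0, M))
    span = np.where(M == m, 1.0, M - m)          # guard the achromatic case
    newr, newg, newb = (M - R) / span, (M - G) / span, (M - B) / span
    H = np.where(M == R, 60 * (newb - newg),
        np.where(M == G, 60 * (2 + newr - newb),
                          60 * (4 + newg - newr)))
    H = np.where(M == m, 180.0, H % 360)         # table row "0 -> 180"
    return H, S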
103. Apparatus according to claim 102 characterised in that the computer apparatus is programmed to produce a segmented image by designating for further processing only those pixels having both a hue H in the range 282 to 356 and a saturation S in the range 0.2 to 0.24.
104. Apparatus according to claim 103 characterised in that the computer apparatus is programmed to identify in the segmented image groups of contiguous pixels by spatially filtering such groups to remove groups having insufficient pixels to contribute to vascularity.
105. Apparatus according to claim 100 characterised in that the computer apparatus is programmed to determine vascularity by treating it as having a high or a low value according to whether or not it is at least 31%.
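Claims 100 to 105 together define a short, fully specified pipeline. The sketch below reuses hue_saturation() from the previous sketch; min_pixels, which stands in for "sufficiently large to correspond to vascularity" in claim 104, is an arbitrary assumed value.

import numpy as np
from scipy.ndimage import label

def vascularity(rgb, min_pixels=50):
    H, S = hue_saturation(rgb.astype(float))
    # claim 103: keep pixels with H in 282 to 356 and S in 0.2 to 0.24
    mask = (H >= 282) & (H <= 356) & (S >= 0.2) & (S <= 0.24)
    labels, n = label(mask)                      # contiguous groups, step c)
    counts = np.bincount(labels.ravel())[1:]     # pixels per group
    area = counts[counts >= min_pixels].sum()    # claim 104 size filter
    fraction = 100.0 * area / mask.size          # step d), percent of image
    return fraction, ('high' if fraction >= 31.0 else 'low')   # claim 105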
US10/274,358 2002-02-19 2002-10-21 Histological assessment Abandoned US20030165263A1 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
GB02191271.4 2002-02-19
GB0218909.0 2002-08-15
GB0218909A GB0218909D0 (en) 2002-08-15 2002-08-15 Histological assessment
GB0219271A GB0219271D0 (en) 2002-08-15 2002-08-19 Histological assessment
GB0222035.8 2002-09-23
GB0222035A GB0222035D0 (en) 2002-08-15 2002-09-23 Histological assessment
GB0222218A GB0222218D0 (en) 2002-08-15 2002-09-25 Histological assessment
GB0222218.0 2002-09-25

Publications (1)

Publication Number Publication Date
US20030165263A1 true US20030165263A1 (en) 2003-09-04

Family

ID=28678966

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/274,358 Abandoned US20030165263A1 (en) 2002-02-19 2002-10-21 Histological assessment
US10/524,492 Expired - Fee Related US7079675B2 (en) 2002-08-15 2003-08-08 Histological assessment

Family Applications After (1)

Application Number Title Priority Date Filing Date
US10/524,492 Expired - Fee Related US7079675B2 (en) 2002-08-15 2003-08-08 Histological assessment

Country Status (5)

Country Link
US (2) US20030165263A1 (en)
EP (1) EP1529207A2 (en)
JP (1) JP4420821B2 (en)
AU (1) AU2003255763A1 (en)
WO (1) WO2004017052A2 (en)

Cited By (95)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005045734A1 (en) * 2003-10-30 2005-05-19 Bioimagene, Inc. Method and system for automatically determining diagnostic saliency of digital images
US20050136509A1 (en) * 2003-09-10 2005-06-23 Bioimagene, Inc. Method and system for quantitatively analyzing biological samples
US20060067887A1 (en) * 2002-12-23 2006-03-30 Qinetiq Limited Scoring estrogen and progesterone receptors expression based on image analysis
GB2433985A (en) * 2006-01-09 2007-07-11 Cytokinetics Inc Characterization of features within the boundary regions of biological cells
US7337398B1 (en) * 2003-02-28 2008-02-26 Adobe Systems Incorporated Reconstitute tag-delimited tables in a graphics editing application
US20080097225A1 (en) * 2006-10-19 2008-04-24 The General Hospital Corporation Apparatus and method for obtaining and providing imaging information associated with at least one portion of a sample, and effecting such portion(s)
US20090003789A1 (en) * 2004-07-02 2009-01-01 The General Hospital Corporation Imaging system and related techniques
WO2010046821A1 (en) * 2008-10-23 2010-04-29 Koninklijke Philips Electronics N.V. Colour management for biological samples
US7761139B2 (en) 2003-01-24 2010-07-20 The General Hospital Corporation System and method for identifying tissue using low-coherence interferometry
US7796270B2 (en) 2006-01-10 2010-09-14 The General Hospital Corporation Systems and methods for generating data based on one or more spectrally-encoded endoscopy techniques
US7797119B2 (en) 2002-01-24 2010-09-14 The General Hospital Corporation Apparatus and method for rangings and noise reduction of low coherence interferometry LCI and optical coherence tomography OCT signals by parallel detection of spectral bands
US20100284583A1 (en) * 2007-08-22 2010-11-11 Phadia Ab Read-out method and apparatus
US7864822B2 (en) 2003-06-06 2011-01-04 The General Hospital Corporation Process and apparatus for a wavelength tuning source
US7889348B2 (en) 2005-10-14 2011-02-15 The General Hospital Corporation Arrangements and methods for facilitating photoluminescence imaging
US7898656B2 (en) 2008-04-30 2011-03-01 The General Hospital Corporation Apparatus and method for cross axis parallel spectroscopy
US7920271B2 (en) 2006-08-25 2011-04-05 The General Hospital Corporation Apparatus and methods for enhancing optical coherence tomography imaging using volumetric filtering techniques
US7933021B2 (en) 2007-10-30 2011-04-26 The General Hospital Corporation System and method for cladding mode detection
US7949019B2 (en) 2007-01-19 2011-05-24 The General Hospital Corporation Wavelength tuning source based on a rotatable reflector
US7969578B2 (en) 2003-10-27 2011-06-28 The General Hospital Corporation Method and apparatus for performing optical imaging using frequency-domain interferometry
US7982879B2 (en) 2006-02-24 2011-07-19 The General Hospital Corporation Methods and systems for performing angle-resolved fourier-domain optical coherence tomography
US7995210B2 (en) 2004-11-24 2011-08-09 The General Hospital Corporation Devices and arrangements for performing coherence range imaging using a common path interferometer
US8018598B2 (en) 2004-05-29 2011-09-13 The General Hospital Corporation Process, system and software arrangement for a chromatic dispersion compensation using reflective layers in optical coherence tomography (OCT) imaging
US8045177B2 (en) 2007-04-17 2011-10-25 The General Hospital Corporation Apparatus and methods for measuring vibrations using spectrally-encoded endoscopy
US8050747B2 (en) 2001-05-01 2011-11-01 The General Hospital Corporation Method and apparatus for determination of atherosclerotic plaque type by measurement of tissue optical properties
US8054468B2 (en) 2003-01-24 2011-11-08 The General Hospital Corporation Apparatus and method for ranging and noise reduction of low coherence interferometry LCI and optical coherence tomography OCT signals by parallel detection of spectral bands
US8081316B2 (en) 2004-08-06 2011-12-20 The General Hospital Corporation Process, system and software arrangement for determining at least one location in a sample using an optical coherence tomography
US8097864B2 (en) 2009-01-26 2012-01-17 The General Hospital Corporation System, method and computer-accessible medium for providing wide-field superresolution microscopy
US8145018B2 (en) 2006-01-19 2012-03-27 The General Hospital Corporation Apparatus for obtaining information for a structure using spectrally-encoded endoscopy techniques and methods for producing one or more optical arrangements
US8149418B2 (en) 2005-09-29 2012-04-03 The General Hospital Corporation Method and apparatus for optical imaging via spectral encoding
US8174702B2 (en) 2003-01-24 2012-05-08 The General Hospital Corporation Speckle reduction in optical coherence tomography by path length encoded angular compounding
US8175685B2 (en) 2006-05-10 2012-05-08 The General Hospital Corporation Process, arrangements and systems for providing frequency domain imaging of a sample
US8208995B2 (en) 2004-08-24 2012-06-26 The General Hospital Corporation Method and apparatus for imaging of vessel segments
US8351665B2 (en) * 2005-04-28 2013-01-08 The General Hospital Corporation Systems, processes and software arrangements for evaluating information associated with an anatomical structure by an optical coherence ranging technique
USRE44042E1 (en) 2004-09-10 2013-03-05 The General Hospital Corporation System and method for optical coherence imaging
US8428887B2 (en) 2003-09-08 2013-04-23 Ventana Medical Systems, Inc. Method for automated processing of digital images of tissue micro-arrays (TMA)
US8593619B2 (en) 2008-05-07 2013-11-26 The General Hospital Corporation System, method and computer-accessible medium for tracking vessel motion during three-dimensional coronary artery microscopy
US8804126B2 (en) 2010-03-05 2014-08-12 The General Hospital Corporation Systems, methods and computer-accessible medium which provide microscopic images of at least one anatomical structure at a particular resolution
US8861910B2 (en) 2008-06-20 2014-10-14 The General Hospital Corporation Fused fiber optic coupler arrangement and method for use thereof
US8922781B2 (en) 2004-11-29 2014-12-30 The General Hospital Corporation Arrangements, devices, endoscopes, catheters and methods for performing optical imaging by simultaneously illuminating and detecting multiple points on a sample
US8937724B2 (en) 2008-12-10 2015-01-20 The General Hospital Corporation Systems and methods for extending imaging depth range of optical coherence tomography through optical sub-sampling
US20150051484A1 (en) * 2013-08-14 2015-02-19 Siemens Aktiengesellschaft Histological Differentiation Grade Prediction of Hepatocellular Carcinoma in Computed Tomography Images
US8965487B2 (en) 2004-08-24 2015-02-24 The General Hospital Corporation Process, system and software arrangement for measuring a mechanical strain and elastic properties of a sample
US9020221B2 (en) 2012-07-13 2015-04-28 Sony Corporation Method and apparatus for automatic cancer diagnosis scoring of tissue samples
USRE45512E1 (en) 2004-09-29 2015-05-12 The General Hospital Corporation System and method for optical coherence imaging
US9060689B2 (en) 2005-06-01 2015-06-23 The General Hospital Corporation Apparatus, method and system for performing phase-resolved optical frequency domain imaging
US9069130B2 (en) 2010-05-03 2015-06-30 The General Hospital Corporation Apparatus, method and system for generating optical radiation from biological gain media
US9087368B2 (en) 2006-01-19 2015-07-21 The General Hospital Corporation Methods and systems for optical imaging or epithelial luminal organs by beam scanning thereof
CN104854620A (en) * 2012-12-07 2015-08-19 富士施乐株式会社 Image processing device, image processing system, and program
US9176319B2 (en) 2007-03-23 2015-11-03 The General Hospital Corporation Methods, arrangements and apparatus for utilizing a wavelength-swept laser using angular scanning and dispersion procedures
US9178330B2 (en) 2009-02-04 2015-11-03 The General Hospital Corporation Apparatus and method for utilization of a high-speed optical wavelength tuning source
US9186067B2 (en) 2006-02-01 2015-11-17 The General Hospital Corporation Apparatus for applying a plurality of electro-magnetic radiations to a sample
US9282931B2 (en) 2000-10-30 2016-03-15 The General Hospital Corporation Methods for tissue analysis
US9295391B1 (en) 2000-11-10 2016-03-29 The General Hospital Corporation Spectrally encoded miniature endoscopic imaging probe
US9330092B2 (en) 2011-07-19 2016-05-03 The General Hospital Corporation Systems, methods, apparatus and computer-accessible-medium for providing polarization-mode dispersion compensation in optical coherence tomography
US9341783B2 (en) 2011-10-18 2016-05-17 The General Hospital Corporation Apparatus and methods for producing and/or providing recirculating optical delay(s)
US9375158B2 (en) 2007-07-31 2016-06-28 The General Hospital Corporation Systems and methods for providing beam scan patterns for high speed doppler optical frequency domain imaging
US9415550B2 (en) 2012-08-22 2016-08-16 The General Hospital Corporation System, method, and computer-accessible medium for fabrication miniature endoscope using soft lithography
US9418414B2 (en) 2012-05-30 2016-08-16 Panasonic Intellectual Property Management Co., Ltd. Image measurement apparatus, image measurement method and image measurement system
US9441948B2 (en) 2005-08-09 2016-09-13 The General Hospital Corporation Apparatus, methods and storage medium for performing polarization-based quadrature demodulation in optical coherence tomography
US9510758B2 (en) 2010-10-27 2016-12-06 The General Hospital Corporation Apparatus, systems and methods for measuring blood pressure within at least one vessel
US9557154B2 (en) 2010-05-25 2017-01-31 The General Hospital Corporation Systems, devices, methods, apparatus and computer-accessible media for providing optical imaging of structures and compositions
US9569464B1 (en) * 2014-05-28 2017-02-14 Pivotal Software, Inc. Element identification in database
US9615748B2 (en) 2009-01-20 2017-04-11 The General Hospital Corporation Endoscopic biopsy apparatus, system and method
US9629528B2 (en) 2012-03-30 2017-04-25 The General Hospital Corporation Imaging system, method and distal attachment for multidirectional field of view endoscopy
CN106885802A (en) * 2015-12-16 2017-06-23 范石军 The real-time monitoring system of the antibiotic residue of sewage discharge in a kind of breeding process
CN106896100A (en) * 2015-12-18 2017-06-27 范石军 The quick decision-making system of antibiotic residue in a kind of livestock products
US9733460B2 (en) 2014-01-08 2017-08-15 The General Hospital Corporation Method and apparatus for microscopic imaging
US9777053B2 (en) 2006-02-08 2017-10-03 The General Hospital Corporation Methods, arrangements and systems for obtaining information associated with an anatomical sample using optical microscopy
US9784681B2 (en) 2013-05-13 2017-10-10 The General Hospital Corporation System and method for efficient detection of the phase and amplitude of a periodic modulation associated with self-interfering fluorescence
US9795301B2 (en) 2010-05-25 2017-10-24 The General Hospital Corporation Apparatus, systems, methods and computer-accessible medium for spectral analysis of optical coherence tomography images
US9881371B2 (en) 2012-10-24 2018-01-30 Sony Corporation System for visualization of a cancer diagnosis
US9968261B2 (en) 2013-01-28 2018-05-15 The General Hospital Corporation Apparatus and method for providing diffuse spectroscopy co-registered with optical frequency domain imaging
WO2018097883A1 (en) * 2016-11-22 2018-05-31 Agilent Technologies, Inc. A method for unsupervised stain separation in pathological whole slide image
US10058250B2 (en) 2013-07-26 2018-08-28 The General Hospital Corporation System, apparatus and method for utilizing optical dispersion for fourier-domain optical coherence tomography
US10117576B2 (en) 2013-07-19 2018-11-06 The General Hospital Corporation System, method and computer accessible medium for determining eye motion by imaging retina and providing feedback for acquisition of signals from the retina
WO2019009893A1 (en) * 2017-07-05 2019-01-10 Flagship Biosciences Inc. Methods for measuring and reporting vascularity in a tissue sample
US10191053B2 (en) 2014-07-29 2019-01-29 Flagship Biosciences, Inc. Methods for measuring and reporting vascularity in a tissue sample
US10228556B2 (en) 2014-04-04 2019-03-12 The General Hospital Corporation Apparatus and method for controlling propagation and/or transmission of electromagnetic radiation in flexible waveguide(s)
US10241028B2 (en) 2011-08-25 2019-03-26 The General Hospital Corporation Methods, systems, arrangements and computer-accessible medium for providing micro-optical coherence tomography procedures
US10285568B2 (en) 2010-06-03 2019-05-14 The General Hospital Corporation Apparatus and method for devices for imaging structures in or at one or more luminal organs
US10426548B2 (en) 2006-02-01 2019-10-01 The General Hospital Corporation Methods and systems for providing electromagnetic radiation to at least one portion of a sample using conformal laser therapy procedures
US10478072B2 (en) 2013-03-15 2019-11-19 The General Hospital Corporation Methods and system for characterizing an object
US10534129B2 (en) 2007-03-30 2020-01-14 The General Hospital Corporation System and method providing intracoronary laser speckle imaging for the detection of vulnerable plaque
US10628658B2 (en) * 2014-11-10 2020-04-21 Ventana Medical Systems, Inc. Classifying nuclei in histology images
CN111079637A (en) * 2019-12-12 2020-04-28 武汉轻工大学 Method, device and equipment for segmenting rape flowers in field image and storage medium
US10736494B2 (en) 2014-01-31 2020-08-11 The General Hospital Corporation System and method for facilitating manual and/or automatic volumetric imaging with real-time tension or force feedback using a tethered imaging device
US10835110B2 (en) 2008-07-14 2020-11-17 The General Hospital Corporation Apparatus and method for facilitating at least partial overlap of dispersed ration on at least one sample
US10893806B2 (en) 2013-01-29 2021-01-19 The General Hospital Corporation Apparatus, systems and methods for providing information regarding the aortic valve
US10912462B2 (en) 2014-07-25 2021-02-09 The General Hospital Corporation Apparatus, devices and methods for in vivo imaging and diagnosis
US11158049B2 (en) * 2015-12-18 2021-10-26 Abbott Laboratories Methods and systems for assessing histological stains
US11179028B2 (en) 2013-02-01 2021-11-23 The General Hospital Corporation Objective lens arrangement for confocal endomicroscopy
US11452433B2 (en) 2013-07-19 2022-09-27 The General Hospital Corporation Imaging apparatus and method which utilizes multidirectional field of view endoscopy
US11490826B2 (en) 2009-07-14 2022-11-08 The General Hospital Corporation Apparatus, systems and methods for measuring flow and pressure within a vessel
US11490797B2 (en) 2012-05-21 2022-11-08 The General Hospital Corporation Apparatus, device and method for capsule microscopy
CN117575977A (en) * 2024-01-17 2024-02-20 锦恒科技(大连)有限公司 Follicular region enhancement method for ovarian tissue analysis

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030165263A1 (en) * 2002-02-19 2003-09-04 Hamer Michael J. Histological assessment
US7343046B2 (en) * 2004-02-12 2008-03-11 Xerox Corporation Systems and methods for organizing image data into regions
GB2430026A (en) 2005-09-09 2007-03-14 Qinetiq Ltd Automated selection of image regions
US9697582B2 (en) 2006-11-16 2017-07-04 Visiopharm A/S Methods for obtaining and analyzing images
ITVA20060079A1 (en) * 2006-12-19 2008-06-20 St Microelectronics Srl PIXEL CHROMATIC CLASSIFICATION METHOD AND ADAPTIVE IMPROVEMENT METHOD OF A COLOR IMAGE
ES2374686T3 (en) 2007-05-14 2012-02-21 Historx, Inc. SEPARATION IN COMPARTMENTS BY CHARACTERIZATION OF PIXEL USING CLOTHING OF IMAGE DATA.
JP5365011B2 (en) 2008-01-29 2013-12-11 日本電気株式会社 Pathological diagnosis support apparatus, pathological diagnosis support method, and program
EP2327040B1 (en) * 2008-08-15 2013-12-18 Visiopharm A/s A method and a system for determining a target in a biological sample by image analysis
JP5401062B2 (en) * 2008-09-05 2014-01-29 ベックマン コールター, インコーポレイテッド Image pattern determination method, system, apparatus, program, and computer-readable recording medium storing the program
ES2573945T3 (en) 2008-09-16 2016-06-13 Novartis Ag Reproducible quantification of biomarker expression
US20100124372A1 (en) * 2008-11-12 2010-05-20 Lockheed Martin Corporation Methods and systems for identifying/accessing color related information
US9025850B2 (en) 2010-06-25 2015-05-05 Cireca Theranostics, Llc Method for analyzing biological specimens by spectral imaging
US8861851B2 (en) * 2011-05-13 2014-10-14 Dolby Laboratories Licensing Corporation Color highlight reconstruction
WO2013118436A1 (en) 2012-02-09 2013-08-15 日本電気株式会社 Lifeform image analysis system, lifeform image analysis method, and lifeform image analysis program
CA2935473C (en) * 2014-02-21 2022-10-11 Ventana Medical Systems, Inc. Medical image analysis for identifying biomarker-positive tumor cells
JP6346576B2 (en) * 2015-02-27 2018-06-20 Hoya株式会社 Image processing device
US10460439B1 (en) 2015-08-12 2019-10-29 Cireca Theranostics, Llc Methods and systems for identifying cellular subtypes in an image of a biological specimen
JP6725646B2 (en) * 2015-09-02 2020-07-22 ベンタナ メディカル システムズ, インコーポレイテッド Automated analysis of cell samples with a mixture of analytically distinct analyte staining patterns
SE544735C2 (en) * 2018-11-09 2022-11-01 Mm18 Medical Ab Method for identification of different categories of biopsy sample images

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4724543A (en) * 1985-09-10 1988-02-09 Beckman Research Institute, City Of Hope Method and apparatus for automatic digital image analysis
US5134662A (en) * 1985-11-04 1992-07-28 Cell Analysis Systems, Inc. Dual color camera microscope and methodology for cell staining and analysis
US5202931A (en) * 1987-10-06 1993-04-13 Cell Analysis Systems, Inc. Methods and apparatus for the quantitation of nuclear protein
US5235522A (en) * 1990-10-10 1993-08-10 Cell Analysis Systems, Inc. Method and apparatus for automated analysis of biological specimens
US5548661A (en) * 1991-07-12 1996-08-20 Price; Jeffrey H. Operator independent image cytometer
US5577131A (en) * 1993-05-05 1996-11-19 U.S. Philips Corporation Device for segmenting textured images and image segmentation system comprising such a device
US5598481A (en) * 1994-04-29 1997-01-28 Arch Development Corporation Computer-aided method for image feature analysis and diagnosis in mammography
US5787208A (en) * 1995-06-07 1998-07-28 Neopath, Inc. Image enhancement method and apparatus
US5933519A (en) * 1994-09-20 1999-08-03 Neo Path, Inc. Cytological slide scoring apparatus
US6546123B1 (en) * 1995-11-30 2003-04-08 Chromavision Medical Systems, Inc. Automated detection of objects in a biological sample
US20030231791A1 (en) * 2002-06-12 2003-12-18 Torre-Bueno Jose De La Automated system for combining bright field and fluorescent microscopy
US20040071327A1 (en) * 1999-04-13 2004-04-15 Chromavision Medical Systems, Inc., A California Corporation Histological reconstruction and automated image analysis
US20050276457A1 (en) * 2002-08-15 2005-12-15 Qinetiq Limited Histological assessment

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5485527A (en) * 1985-11-04 1996-01-16 Becton Dickinson And Company Apparatus and method for analyses of biological specimens
JPS6379632A (en) * 1986-09-25 1988-04-09 株式会社東芝 Electronic endoscope apparatus
US5332968A (en) * 1992-04-21 1994-07-26 University Of South Florida Magnetic resonance imaging color composites
JP3267739B2 (en) 1993-05-11 2002-03-25 フクダ電子株式会社 Ultrasound color Doppler diagnostic system
US6151405A (en) * 1996-11-27 2000-11-21 Chromavision Medical Systems, Inc. System and method for cellular specimen grading
US6330349B1 (en) * 1995-11-30 2001-12-11 Chromavision Medical Systems, Inc. Automated method for image analysis of residual protein
WO1999044062A1 (en) 1998-02-25 1999-09-02 The United States Of America As Represented By The Secretary Department Of Health And Human Services Cellular arrays for rapid molecular profiling
JP2003502391A (en) 1999-06-17 2003-01-21 リジェネロン・ファーマシューティカルズ・インコーポレイテッド Methods of imaging and targeting tumor vasculature
AU6093400A (en) * 1999-07-13 2001-01-30 Chromavision Medical Systems, Inc. Automated detection of objects in a biological sample
AU6521700A (en) * 1999-08-04 2001-03-05 Chromavision Medical Systems, Inc. Method and apparatus for applying color thresholds in light microscopy
AU782452B2 (en) 1999-12-13 2005-07-28 Government of The United States of America, as represented by The Secretary Department of Health & Human Services, The National Institutes of Health, The High-throughput tissue microarray technology and applications
JP3699399B2 (en) * 2000-01-12 2005-09-28 ヴェンタナ メディカル システムズ インコーポレイテッド Methods for determining response to cancer treatment
IL136884A0 (en) 2000-06-19 2001-06-14 Yissum Res Dev Co A system for cancer detection and typing and for grading of malignancy
IL138123A0 (en) 2000-08-28 2001-10-31 Accuramed 1999 Ltd Medical decision support system and method
US8078262B2 (en) 2001-04-16 2011-12-13 The Johns Hopkins University Method for imaging and spectroscopy of tumors and determination of the efficacy of anti-tumor drug therapies

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4724543A (en) * 1985-09-10 1988-02-09 Beckman Research Institute, City Of Hope Method and apparatus for automatic digital image analysis
US5134662A (en) * 1985-11-04 1992-07-28 Cell Analysis Systems, Inc. Dual color camera microscope and methodology for cell staining and analysis
US5202931A (en) * 1987-10-06 1993-04-13 Cell Analysis Systems, Inc. Methods and apparatus for the quantitation of nuclear protein
US5235522A (en) * 1990-10-10 1993-08-10 Cell Analysis Systems, Inc. Method and apparatus for automated analysis of biological specimens
US5548661A (en) * 1991-07-12 1996-08-20 Price; Jeffrey H. Operator independent image cytometer
US5577131A (en) * 1993-05-05 1996-11-19 U.S. Philips Corporation Device for segmenting textured images and image segmentation system comprising such a device
US5598481A (en) * 1994-04-29 1997-01-28 Arch Development Corporation Computer-aided method for image feature analysis and diagnosis in mammography
US5933519A (en) * 1994-09-20 1999-08-03 Neo Path, Inc. Cytological slide scoring apparatus
US5787208A (en) * 1995-06-07 1998-07-28 Neopath, Inc. Image enhancement method and apparatus
US6546123B1 (en) * 1995-11-30 2003-04-08 Chromavision Medical Systems, Inc. Automated detection of objects in a biological sample
US20040120562A1 (en) * 1995-11-30 2004-06-24 Presley Hays Automated method for image analysis of residual protein
US20040071327A1 (en) * 1999-04-13 2004-04-15 Chromavision Medical Systems, Inc., A California Corporation Histological reconstruction and automated image analysis
US20030231791A1 (en) * 2002-06-12 2003-12-18 Torre-Bueno Jose De La Automated system for combining bright field and fluorescent microscopy
US20050276457A1 (en) * 2002-08-15 2005-12-15 Qinetiq Limited Histological assessment

Cited By (153)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9282931B2 (en) 2000-10-30 2016-03-15 The General Hospital Corporation Methods for tissue analysis
US9295391B1 (en) 2000-11-10 2016-03-29 The General Hospital Corporation Spectrally encoded miniature endoscopic imaging probe
US8150496B2 (en) 2001-05-01 2012-04-03 The General Hospital Corporation Method and apparatus for determination of atherosclerotic plaque type by measurement of tissue optical properties
US8050747B2 (en) 2001-05-01 2011-11-01 The General Hospital Corporation Method and apparatus for determination of atherosclerotic plaque type by measurement of tissue optical properties
US7903257B2 (en) 2002-01-24 2011-03-08 The General Hospital Corporation Apparatus and method for ranging and noise reduction of low coherence interferometry (LCI) and optical coherence tomography (OCT) signals by parallel detection of spectral bands
US7797119B2 (en) 2002-01-24 2010-09-14 The General Hospital Corporation Apparatus and method for rangings and noise reduction of low coherence interferometry LCI and optical coherence tomography OCT signals by parallel detection of spectral bands
US7646905B2 (en) * 2002-12-23 2010-01-12 Qinetiq Limited Scoring estrogen and progesterone receptors expression based on image analysis
US20060067887A1 (en) * 2002-12-23 2006-03-30 Qinetiq Limited Scoring estrogen and progesterone receptors expression based on image analysis
US8054468B2 (en) 2003-01-24 2011-11-08 The General Hospital Corporation Apparatus and method for ranging and noise reduction of low coherence interferometry LCI and optical coherence tomography OCT signals by parallel detection of spectral bands
US7761139B2 (en) 2003-01-24 2010-07-20 The General Hospital Corporation System and method for identifying tissue using low-coherence interferometry
US8174702B2 (en) 2003-01-24 2012-05-08 The General Hospital Corporation Speckle reduction in optical coherence tomography by path length encoded angular compounding
US8559012B2 (en) 2003-01-24 2013-10-15 The General Hospital Corporation Speckle reduction in optical coherence tomography by path length encoded angular compounding
US9226665B2 (en) 2003-01-24 2016-01-05 The General Hospital Corporation Speckle reduction in optical coherence tomography by path length encoded angular compounding
US8078964B1 (en) 2003-02-28 2011-12-13 Adobe Systems Incorporated Reconstitute tag-delimited tables in a graphics editing application
US7337398B1 (en) * 2003-02-28 2008-02-26 Adobe Systems Incorporated Reconstitute tag-delimited tables in a graphics editing application
USRE47675E1 (en) 2003-06-06 2019-10-29 The General Hospital Corporation Process and apparatus for a wavelength tuning source
US8416818B2 (en) 2003-06-06 2013-04-09 The General Hospital Corporation Process and apparatus for a wavelength tuning source
US7995627B2 (en) 2003-06-06 2011-08-09 The General Hospital Corporation Process and apparatus for a wavelength tuning source
US7864822B2 (en) 2003-06-06 2011-01-04 The General Hospital Corporation Process and apparatus for a wavelength tuning source
US8428887B2 (en) 2003-09-08 2013-04-23 Ventana Medical Systems, Inc. Method for automated processing of digital images of tissue micro-arrays (TMA)
US20050266395A1 (en) * 2003-09-10 2005-12-01 Bioimagene, Inc. Method and system for morphology based mitosis identification and classification of digital images
US7979212B2 (en) 2003-09-10 2011-07-12 Ventana Medical Systems, Inc. Method and system for morphology based mitosis identification and classification of digital images
US20050136509A1 (en) * 2003-09-10 2005-06-23 Bioimagene, Inc. Method and system for quantitatively analyzing biological samples
US8515683B2 (en) 2003-09-10 2013-08-20 Ventana Medical Systems, Inc. Method and system for automated detection of immunohistochemical (IHC) patterns
US7941275B2 (en) 2003-09-10 2011-05-10 Ventana Medical Systems, Inc. Method and system for automated detection of immunohistochemical (IHC) patterns
US9377290B2 (en) 2003-10-27 2016-06-28 The General Hospital Corporation Method and apparatus for performing optical imaging using frequency-domain interferometry
US7969578B2 (en) 2003-10-27 2011-06-28 The General Hospital Corporation Method and apparatus for performing optical imaging using frequency-domain interferometry
US8705046B2 (en) 2003-10-27 2014-04-22 The General Hospital Corporation Method and apparatus for performing optical imaging using frequency-domain interferometry
WO2005045734A1 (en) * 2003-10-30 2005-05-19 Bioimagene, Inc. Method and system for automatically determining diagnostic saliency of digital images
US8018598B2 (en) 2004-05-29 2011-09-13 The General Hospital Corporation Process, system and software arrangement for a chromatic dispersion compensation using reflective layers in optical coherence tomography (OCT) imaging
US9664615B2 (en) 2004-07-02 2017-05-30 The General Hospital Corporation Imaging system and related techniques
US8369669B2 (en) 2004-07-02 2013-02-05 The General Hospital Corporation Imaging system and related techniques
US20090003789A1 (en) * 2004-07-02 2009-01-01 The General Hospital Corporation Imaging system and related techniques
US8676013B2 (en) 2004-07-02 2014-03-18 The General Hospital Corporation Imaging system using and related techniques
US7925133B2 (en) 2004-07-02 2011-04-12 The General Hospital Corporation Imaging system and related techniques
US7809225B2 (en) 2004-07-02 2010-10-05 The General Hospital Corporation Imaging system and related techniques
US7809226B2 (en) 2004-07-02 2010-10-05 The General Hospital Corporation Imaging system and related techniques
US20090022463A1 (en) * 2004-07-02 2009-01-22 The General Hospital Corporation Imaging system and related techniques
US8081316B2 (en) 2004-08-06 2011-12-20 The General Hospital Corporation Process, system and software arrangement for determining at least one location in a sample using an optical coherence tomography
US9226660B2 (en) 2004-08-06 2016-01-05 The General Hospital Corporation Process, system and software arrangement for determining at least one location in a sample using an optical coherence tomography
US8965487B2 (en) 2004-08-24 2015-02-24 The General Hospital Corporation Process, system and software arrangement for measuring a mechanical strain and elastic properties of a sample
US9254102B2 (en) 2004-08-24 2016-02-09 The General Hospital Corporation Method and apparatus for imaging of vessel segments
US8208995B2 (en) 2004-08-24 2012-06-26 The General Hospital Corporation Method and apparatus for imaging of vessel segments
US9763623B2 (en) 2004-08-24 2017-09-19 The General Hospital Corporation Method and apparatus for imaging of vessel segments
USRE44042E1 (en) 2004-09-10 2013-03-05 The General Hospital Corporation System and method for optical coherence imaging
USRE45512E1 (en) 2004-09-29 2015-05-12 The General Hospital Corporation System and method for optical coherence imaging
US7995210B2 (en) 2004-11-24 2011-08-09 The General Hospital Corporation Devices and arrangements for performing coherence range imaging using a common path interferometer
US8922781B2 (en) 2004-11-29 2014-12-30 The General Hospital Corporation Arrangements, devices, endoscopes, catheters and methods for performing optical imaging by simultaneously illuminating and detecting multiple points on a sample
US9326682B2 (en) 2005-04-28 2016-05-03 The General Hospital Corporation Systems, processes and software arrangements for evaluating information associated with an anatomical structure by an optical coherence ranging technique
US8351665B2 (en) * 2005-04-28 2013-01-08 The General Hospital Corporation Systems, processes and software arrangements for evaluating information associated with an anatomical structure by an optical coherence ranging technique
US9060689B2 (en) 2005-06-01 2015-06-23 The General Hospital Corporation Apparatus, method and system for performing phase-resolved optical frequency domain imaging
US9441948B2 (en) 2005-08-09 2016-09-13 The General Hospital Corporation Apparatus, methods and storage medium for performing polarization-based quadrature demodulation in optical coherence tomography
US8928889B2 (en) 2005-09-29 2015-01-06 The General Hospital Corporation Arrangements and methods for providing multimodality microscopic imaging of one or more biological structures
US9304121B2 (en) 2005-09-29 2016-04-05 The General Hospital Corporation Method and apparatus for optical imaging via spectral encoding
US8289522B2 (en) 2005-09-29 2012-10-16 The General Hospital Corporation Arrangements and methods for providing multimodality microscopic imaging of one or more biological structures
US9513276B2 (en) 2005-09-29 2016-12-06 The General Hospital Corporation Method and apparatus for optical imaging via spectral encoding
US8149418B2 (en) 2005-09-29 2012-04-03 The General Hospital Corporation Method and apparatus for optical imaging via spectral encoding
US8760663B2 (en) 2005-09-29 2014-06-24 The General Hospital Corporation Method and apparatus for optical imaging via spectral encoding
US7889348B2 (en) 2005-10-14 2011-02-15 The General Hospital Corporation Arrangements and methods for facilitating photoluminescence imaging
GB2433985A (en) * 2006-01-09 2007-07-11 Cytokinetics Inc Characterization of features within the boundary regions of biological cells
US7796270B2 (en) 2006-01-10 2010-09-14 The General Hospital Corporation Systems and methods for generating data based on one or more spectrally-encoded endoscopy techniques
US9516997B2 (en) 2006-01-19 2016-12-13 The General Hospital Corporation Spectrally-encoded endoscopy techniques, apparatus and methods
US8145018B2 (en) 2006-01-19 2012-03-27 The General Hospital Corporation Apparatus for obtaining information for a structure using spectrally-encoded endoscopy techniques and methods for producing one or more optical arrangements
US9646377B2 (en) 2006-01-19 2017-05-09 The General Hospital Corporation Methods and systems for optical imaging or epithelial luminal organs by beam scanning thereof
US10987000B2 (en) 2006-01-19 2021-04-27 The General Hospital Corporation Methods and systems for optical imaging or epithelial luminal organs by beam scanning thereof
US9087368B2 (en) 2006-01-19 2015-07-21 The General Hospital Corporation Methods and systems for optical imaging or epithelial luminal organs by beam scanning thereof
US9791317B2 (en) 2006-01-19 2017-10-17 The General Hospital Corporation Spectrally-encoded endoscopy techniques and methods
US10426548B2 (en) 2006-02-01 2019-10-01 The General Hospital Corporation Methods and systems for providing electromagnetic radiation to at least one portion of a sample using conformal laser therapy procedures
US9186066B2 (en) 2006-02-01 2015-11-17 The General Hospital Corporation Apparatus for applying a plurality of electro-magnetic radiations to a sample
US9186067B2 (en) 2006-02-01 2015-11-17 The General Hospital Corporation Apparatus for applying a plurality of electro-magnetic radiations to a sample
US9777053B2 (en) 2006-02-08 2017-10-03 The General Hospital Corporation Methods, arrangements and systems for obtaining information associated with an anatomical sample using optical microscopy
US7982879B2 (en) 2006-02-24 2011-07-19 The General Hospital Corporation Methods and systems for performing angle-resolved fourier-domain optical coherence tomography
USRE46412E1 (en) 2006-02-24 2017-05-23 The General Hospital Corporation Methods and systems for performing angle-resolved Fourier-domain optical coherence tomography
US10413175B2 (en) 2006-05-10 2019-09-17 The General Hospital Corporation Process, arrangements and systems for providing frequency domain imaging of a sample
US8175685B2 (en) 2006-05-10 2012-05-08 The General Hospital Corporation Process, arrangements and systems for providing frequency domain imaging of a sample
US9364143B2 (en) 2006-05-10 2016-06-14 The General Hospital Corporation Process, arrangements and systems for providing frequency domain imaging of a sample
US7920271B2 (en) 2006-08-25 2011-04-05 The General Hospital Corporation Apparatus and methods for enhancing optical coherence tomography imaging using volumetric filtering techniques
US9968245B2 (en) 2006-10-19 2018-05-15 The General Hospital Corporation Apparatus and method for obtaining and providing imaging information associated with at least one portion of a sample, and effecting such portion(s)
US20080097225A1 (en) * 2006-10-19 2008-04-24 The General Hospital Corporation Apparatus and method for obtaining and providing imaging information associated with at least one portion of a sample, and effecting such portion(s)
US8838213B2 (en) 2006-10-19 2014-09-16 The General Hospital Corporation Apparatus and method for obtaining and providing imaging information associated with at least one portion of a sample, and effecting such portion(s)
US7949019B2 (en) 2007-01-19 2011-05-24 The General Hospital Corporation Wavelength tuning source based on a rotatable reflector
US9176319B2 (en) 2007-03-23 2015-11-03 The General Hospital Corporation Methods, arrangements and apparatus for utilizing a wavelength-swept laser using angular scanning and dispersion procedures
US10534129B2 (en) 2007-03-30 2020-01-14 The General Hospital Corporation System and method providing intracoronary laser speckle imaging for the detection of vulnerable plaque
US8045177B2 (en) 2007-04-17 2011-10-25 The General Hospital Corporation Apparatus and methods for measuring vibrations using spectrally-encoded endoscopy
US9375158B2 (en) 2007-07-31 2016-06-28 The General Hospital Corporation Systems and methods for providing beam scan patterns for high speed doppler optical frequency domain imaging
US8611619B2 (en) * 2007-08-22 2013-12-17 Phadia Ab Read-out method and apparatus
US20100284583A1 (en) * 2007-08-22 2010-11-11 Phadia Ab Read-out method and apparatus
US7933021B2 (en) 2007-10-30 2011-04-26 The General Hospital Corporation System and method for cladding mode detection
US7898656B2 (en) 2008-04-30 2011-03-01 The General Hospital Corporation Apparatus and method for cross axis parallel spectroscopy
US8593619B2 (en) 2008-05-07 2013-11-26 The General Hospital Corporation System, method and computer-accessible medium for tracking vessel motion during three-dimensional coronary artery microscopy
US9173572B2 (en) 2008-05-07 2015-11-03 The General Hospital Corporation System, method and computer-accessible medium for tracking vessel motion during three-dimensional coronary artery microscopy
US8861910B2 (en) 2008-06-20 2014-10-14 The General Hospital Corporation Fused fiber optic coupler arrangement and method for use thereof
US10835110B2 (en) 2008-07-14 2020-11-17 The General Hospital Corporation Apparatus and method for facilitating at least partial overlap of dispersed ration on at least one sample
US8649581B2 (en) 2008-10-23 2014-02-11 Koninklijke Philips N.V. Colour management for biological samples
WO2010046821A1 (en) * 2008-10-23 2010-04-29 Koninklijke Philips Electronics N.V. Colour management for biological samples
CN102197305A (en) * 2008-10-23 2011-09-21 皇家飞利浦电子股份有限公司 Colour management for biological samples
US20110200240A1 (en) * 2008-10-23 2011-08-18 Koninklijke Philips Electronics N.V. Colour management for biological samples
US8937724B2 (en) 2008-12-10 2015-01-20 The General Hospital Corporation Systems and methods for extending imaging depth range of optical coherence tomography through optical sub-sampling
US9615748B2 (en) 2009-01-20 2017-04-11 The General Hospital Corporation Endoscopic biopsy apparatus, system and method
US8097864B2 (en) 2009-01-26 2012-01-17 The General Hospital Corporation System, method and computer-accessible medium for providing wide-field superresolution microscopy
US9178330B2 (en) 2009-02-04 2015-11-03 The General Hospital Corporation Apparatus and method for utilization of a high-speed optical wavelength tuning source
US11490826B2 (en) 2009-07-14 2022-11-08 The General Hospital Corporation Apparatus, systems and methods for measuring flow and pressure within a vessel
US8804126B2 (en) 2010-03-05 2014-08-12 The General Hospital Corporation Systems, methods and computer-accessible medium which provide microscopic images of at least one anatomical structure at a particular resolution
US9642531B2 (en) 2010-03-05 2017-05-09 The General Hospital Corporation Systems, methods and computer-accessible medium which provide microscopic images of at least one anatomical structure at a particular resolution
US10463254B2 (en) 2010-03-05 2019-11-05 The General Hospital Corporation Light tunnel and lens which provide extended focal depth of at least one anatomical structure at a particular resolution
US9408539B2 (en) 2010-03-05 2016-08-09 The General Hospital Corporation Systems, methods and computer-accessible medium which provide microscopic images of at least one anatomical structure at a particular resolution
US9069130B2 (en) 2010-05-03 2015-06-30 The General Hospital Corporation Apparatus, method and system for generating optical radiation from biological gain media
US9951269B2 (en) 2010-05-03 2018-04-24 The General Hospital Corporation Apparatus, method and system for generating optical radiation from biological gain media
US9557154B2 (en) 2010-05-25 2017-01-31 The General Hospital Corporation Systems, devices, methods, apparatus and computer-accessible media for providing optical imaging of structures and compositions
US9795301B2 (en) 2010-05-25 2017-10-24 The General Hospital Corporation Apparatus, systems, methods and computer-accessible medium for spectral analysis of optical coherence tomography images
US10939825B2 (en) 2010-05-25 2021-03-09 The General Hospital Corporation Systems, devices, methods, apparatus and computer-accessible media for providing optical imaging of structures and compositions
US10285568B2 (en) 2010-06-03 2019-05-14 The General Hospital Corporation Apparatus and method for devices for imaging structures in or at one or more luminal organs
US9510758B2 (en) 2010-10-27 2016-12-06 The General Hospital Corporation Apparatus, systems and methods for measuring blood pressure within at least one vessel
US9330092B2 (en) 2011-07-19 2016-05-03 The General Hospital Corporation Systems, methods, apparatus and computer-accessible-medium for providing polarization-mode dispersion compensation in optical coherence tomography
US10241028B2 (en) 2011-08-25 2019-03-26 The General Hospital Corporation Methods, systems, arrangements and computer-accessible medium for providing micro-optical coherence tomography procedures
US9341783B2 (en) 2011-10-18 2016-05-17 The General Hospital Corporation Apparatus and methods for producing and/or providing recirculating optical delay(s)
US9629528B2 (en) 2012-03-30 2017-04-25 The General Hospital Corporation Imaging system, method and distal attachment for multidirectional field of view endoscopy
US11490797B2 (en) 2012-05-21 2022-11-08 The General Hospital Corporation Apparatus, device and method for capsule microscopy
US9418414B2 (en) 2012-05-30 2016-08-16 Panasonic Intellectual Property Management Co., Ltd. Image measurement apparatus, image measurement method and image measurement system
US9020221B2 (en) 2012-07-13 2015-04-28 Sony Corporation Method and apparatus for automatic cancer diagnosis scoring of tissue samples
US9415550B2 (en) 2012-08-22 2016-08-16 The General Hospital Corporation System, method, and computer-accessible medium for fabrication miniature endoscope using soft lithography
US9892510B2 (en) 2012-10-24 2018-02-13 Sony Corporation Method and apparatus for automatic cancer diagnosis using percentage scoring
US10002426B2 (en) 2012-10-24 2018-06-19 Sony Corporation System for cancer diagnosis
US9881371B2 (en) 2012-10-24 2018-01-30 Sony Corporation System for visualization of a cancer diagnosis
US10176577B2 (en) 2012-10-24 2019-01-08 Sony Corporation System for determining a cancer diagnosis score derived from stained nuclei
US9471977B2 (en) 2012-12-07 2016-10-18 Fuji Xerox Co., Ltd. Image processing device, image processing system, and non-transitory computer readable medium
EP2930683A4 (en) * 2012-12-07 2016-09-28 Fuji Xerox Co Ltd Image processing device, image processing system, and program
EP3407258A1 (en) * 2012-12-07 2018-11-28 Fujifilm Corporation Image processing device, image processing system, and program
CN104854620A (en) * 2012-12-07 2015-08-19 富士施乐株式会社 Image processing device, image processing system, and program
US9968261B2 (en) 2013-01-28 2018-05-15 The General Hospital Corporation Apparatus and method for providing diffuse spectroscopy co-registered with optical frequency domain imaging
US10893806B2 (en) 2013-01-29 2021-01-19 The General Hospital Corporation Apparatus, systems and methods for providing information regarding the aortic valve
US11179028B2 (en) 2013-02-01 2021-11-23 The General Hospital Corporation Objective lens arrangement for confocal endomicroscopy
US10478072B2 (en) 2013-03-15 2019-11-19 The General Hospital Corporation Methods and system for characterizing an object
US9784681B2 (en) 2013-05-13 2017-10-10 The General Hospital Corporation System and method for efficient detection of the phase and amplitude of a periodic modulation associated with self-interfering fluorescence
US10117576B2 (en) 2013-07-19 2018-11-06 The General Hospital Corporation System, method and computer accessible medium for determining eye motion by imaging retina and providing feedback for acquisition of signals from the retina
US11452433B2 (en) 2013-07-19 2022-09-27 The General Hospital Corporation Imaging apparatus and method which utilizes multidirectional field of view endoscopy
US10058250B2 (en) 2013-07-26 2018-08-28 The General Hospital Corporation System, apparatus and method for utilizing optical dispersion for fourier-domain optical coherence tomography
US20150051484A1 (en) * 2013-08-14 2015-02-19 Siemens Aktiengesellschaft Histological Differentiation Grade Prediction of Hepatocellular Carcinoma in Computed Tomography Images
US9585627B2 (en) * 2013-08-14 2017-03-07 Siemens Healthcare Gmbh Histological differentiation grade prediction of hepatocellular carcinoma in computed tomography images
US9733460B2 (en) 2014-01-08 2017-08-15 The General Hospital Corporation Method and apparatus for microscopic imaging
US10736494B2 (en) 2014-01-31 2020-08-11 The General Hospital Corporation System and method for facilitating manual and/or automatic volumetric imaging with real-time tension or force feedback using a tethered imaging device
US10228556B2 (en) 2014-04-04 2019-03-12 The General Hospital Corporation Apparatus and method for controlling propagation and/or transmission of electromagnetic radiation in flexible waveguide(s)
US9569464B1 (en) * 2014-05-28 2017-02-14 Pivotal Software, Inc. Element identification in database
US10912462B2 (en) 2014-07-25 2021-02-09 The General Hospital Corporation Apparatus, devices and methods for in vivo imaging and diagnosis
US10191053B2 (en) 2014-07-29 2019-01-29 Flagship Biosciences, Inc. Methods for measuring and reporting vascularity in a tissue sample
US10628658B2 (en) * 2014-11-10 2020-04-21 Ventana Medical Systems, Inc. Classifying nuclei in histology images
CN106885802A (en) * 2015-12-16 2017-06-23 范石军 The real-time monitoring system of the antibiotic residue of sewage discharge in a kind of breeding process
US11158049B2 (en) * 2015-12-18 2021-10-26 Abbott Laboratories Methods and systems for assessing histological stains
CN106896100A (en) * 2015-12-18 2017-06-27 范石军 The quick decision-making system of antibiotic residue in a kind of livestock products
WO2018097883A1 (en) * 2016-11-22 2018-05-31 Agilent Technologies, Inc. A method for unsupervised stain separation in pathological whole slide image
WO2019009893A1 (en) * 2017-07-05 2019-01-10 Flagship Biosciences Inc. Methods for measuring and reporting vascularity in a tissue sample
CN111079637A (en) * 2019-12-12 2020-04-28 武汉轻工大学 Method, device and equipment for segmenting rape flowers in field image and storage medium
CN117575977A (en) * 2024-01-17 2024-02-20 锦恒科技(大连)有限公司 Follicular region enhancement method for ovarian tissue analysis

Also Published As

Publication number Publication date
AU2003255763A1 (en) 2004-03-03
WO2004017052A2 (en) 2004-02-26
WO2004017052A3 (en) 2004-06-03
EP1529207A2 (en) 2005-05-11
JP2005535892A (en) 2005-11-24
JP4420821B2 (en) 2010-02-24
US20050276457A1 (en) 2005-12-15
US7079675B2 (en) 2006-07-18

Similar Documents

Publication Publication Date Title
US7079675B2 (en) Histological assessment
EP1563457B1 (en) Image analysis
Keenan et al. An automated machine vision system for the histological grading of cervical intraepithelial neoplasia (CIN)
US7483919B2 (en) Object based image retrieval
Kazemi et al. Automatic recognition of acute myelogenous leukemia in blood microscopic images using k-means clustering and support vector machine
Cardoso et al. Towards an intelligent medical system for the aesthetic evaluation of breast cancer conservative treatment
TWI412949B (en) Automated selection of image regions
US7027627B2 (en) Medical decision support system and method
US8340372B2 (en) Image analysis
CN109145921A (en) A kind of image partition method based on improved intuitionistic fuzzy C mean cluster
JP2008077677A (en) Measurement of mitotic activity
EP1579366B1 (en) Histological assessment of nuclear pleomorphism
CN116188423B (en) Super-pixel sparse and unmixed detection method based on pathological section hyperspectral image
Lassen Automated determination of crystal orientations from electron backscattering patterns
EP4075325A1 (en) Method and system for the classification of histopathological images based on multiple instance learning
JP4981112B2 (en) Histological assessment
Aubreville et al. Field of Interest Proposal for Augmented Mitotic Cell Count: Comparison of Two Convolutional Networks.
JP2006507486A (en) Automatic histological categorization of tubules
CN117893450A (en) Digital pathological image enhancement method, device and equipment
Tay Algorithms for Tissue Image Analysis using Multifractal Techniques
Venà et al. Micrometastasis Detection Guidance by Whole-Slide Image Texture Analysis in Colorectal Lymph Nodes
Potapov et al. K-means approach in tumors cell color segmentation in lab color space

Legal Events

Date Code Title Description
AS Assignment

Owner name: QINETIQ LIMITED, UNITED KINGDOM

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAMER, MICHAEL J.;PETROU, MARIA;KESIDIS, TASOS;REEL/FRAME:013610/0833;SIGNING DATES FROM 20021031 TO 20021205

STCB Information on status: application discontinuation

Free format text: EXPRESSLY ABANDONED -- DURING PUBLICATION PROCESS