WO1996009604A1 - Apparatus for automated identification of cell groupings on a biological specimen


Info

Publication number
WO1996009604A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, representation, output, residue, nuclei
Application number
PCT/US1995/011460
Other languages
French (fr)
Inventor
Paul S. Wilhelm
Shih-Jong J. Lee
Original Assignee
Neopath, Inc.
Application filed by Neopath, Inc.
Priority to AU3586195A
Publication of WO1996009604A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/60 - Type of objects
    • G06V20/69 - Microscopic objects, e.g. biological cells or cellular parts
    • G06V30/00 - Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 - Character recognition
    • G06V30/14 - Image acquisition
    • G06V30/148 - Segmentation of character regions
    • G06V30/155 - Removing patterns interfering with the pattern to be recognised, such as ruled lines or underlines


Abstract

The detection of cellular aggregates within cytologic samples. An image analysis system includes an image gathering system (511) having a camera (512), a motion controller (520), an illumination system (508) and an image transfer interface, and obtains images (55) of cell groupings. The image gathering system (511) is constructed for gathering image data (38) of a specimen mounted on a slide and is coupled to a data processing system (540). Image data (38) is transferred from the image gathering system (511) to the data processing system (540). The data processing system (540) identifies objects of interest. A four step process finds cellular aggregates. The first step is acquisition of an image for analysis (12). The second step is extraction of image features (14). The third step is classification of the image (16) to determine if any potential cellular aggregates may exist in the image (24). The fourth step is segmentation of objects (18), which includes the substeps of detecting and locating potential cellular aggregates.

Description

APPARATUS FOR AUTOMATED IDENTIFICATION OF CELL GROUPINGS ON A BIOLOGICAL SPECIMEN
This invention relates to the automated detection of aggregates of cells within biologic samples such as cytologic specimens and more particularly to an automated cytology system that classifies cytological specimens, based in part on the analysis of aggregates of cells, as normal or needing human review.
BACKGROUND OF THE INVENTION
Historically, screening of cytologic material has been a task for trained technicians and cytopathologists. Even though screening is done by highly trained individuals, the task is repetitive and requires acute attention at all times. Since screening of cytologic material is repetitive and tedious, it has been thought to be ripe for automation. On the other hand, the complexity and variety of material found in cytologic specimens has proven very difficult to examine in an automated fashion. As a result, automated screening of cytologic specimens has been the unrealized goal of research for many years.
Recent research has demonstrated an effective approach for detection and identification of cellular abnormalities as demonstrated by isolated cells in a cytologic sample (U.S. Patent Application Serial No. 08/179,812, Method for Identifying Objects Using Data Processing Techniques). However, in many cases, significant abnormalities are manifested within cell aggregates rather than as isolated cells. See "Diagnostic Cytopathology of the Uterine Cervix", by Stanley F. Patten, Jr. Identification of cell groupings in biological specimens has previously been achieved by human visual identification. It is important, therefore, that any automated screening device have the capability of identifying abnormalities within cell aggregates as well as among isolated cells. Additionally, in cervical cytology, sample adequacy as defined in The Bethesda System "The Bethesda System for Reporting Cervical/Vaginal Cytologic Diagnoses", Robert J. Kurman, Diane Solomon, Springer-Verlag, 1994, is determined, in part, through the identification of endocervical component cells in the sample that appear almost exclusively within aggregates. Therefore, the design of an automated screening device for cytologic samples must include the ability to detect and identify cellular aggregates.
Therefore it is a motivation of the invention to automatically identify cell groupings within cellular specimens.
SUMMARY OF THE INVENTION
The invention provides a method and apparatus for the detection of cellular aggregates within cytologic samples. The invention comprises an image analysis system that further comprises an image gathering system having a camera, a motion controller, an illumination system, and an image transfer interface. The image gathering system is constructed for gathering image data of a specimen mounted on a slide. The image gathering system is coupled to a data processing system to transfer image data from the image gathering system to the data processing system. The data processing system analyzes the image data to identify objects of interest. The data processing system implements a four step process. The first step is the acquisition of an image for analysis. The second step is the extraction of image features. The third step is classification of the image to determine if any potential cellular aggregates may exist in the image. The fourth step is segmentation of objects, which includes the substeps of detecting and locating potential cellular aggregates.
Other objects, features and advantages of the present invention will become apparent to those skilled in the art through the description of the preferred embodiment, claims and drawings herein wherein like numerals refer to like elements.
BRIEF DESCRIPTION OF THE DRAWINGS
To illustrate this invention, a preferred embodiment will be described herein with reference to the accompanying drawings.
Figures 1A, 1B and 1C show an apparatus for automatic identification of cell groupings on a biomedical specimen.
Figure 2A shows a process flow diagram of the image processing and analysis performed for each image of biologic specimens.
Figure 2B shows a process flow diagram of the method of the invention to segment each image of biologic specimens.
Figure 3 shows a process flow diagram for background object removal.
Figures 4A, 4B, 4C and 4D show a schematic of a combination of two segmentation masks.
Figures 5A, 5B, 5C and 5D show process flow diagrams for the nuclear thresholding of the invention. Figure 5D further comprises Figure 5E and Figure 5F which are intended to be read pieced together as a single figure.
Figure 6 shows the process flow diagram for object refinement of the invention.
Figures 7A and 7B show process flow diagrams for nuclei clustering of the invention. DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
In a presently preferred embodiment of the invention, the camera system disclosed herein is used in a system for analyzing cervical pap smears, such as that shown and disclosed in U.S. Patent Application Serial No. 07/838,064, entitled "Method For Identifying Normal Biomedical Specimens", by Alan C. Nelson, et al., filed February 18, 1992; U.S. Patent Application Serial No. 08/179,812 filed January 10, 1994 which is a continuation in part of U.S. Patent Application Serial No. 07/838,395, entitled "Method For Identifying Objects Using Data Processing Techniques", by S. James Lee, et al., filed February 18, 1992; U.S. Patent Application Serial No. 07/838,070, now U.S. Pat. No. 5,315,700, entitled "Method And Apparatus For Rapidly Processing Data Sequences", by Richard S. Johnston, et al., filed February 18, 1992; U.S. Patent Application Serial No. 07/838,065, filed 02/18/92, entitled "Method and Apparatus for Dynamic Correction of Microscopic Image Signals" by Jon W. Hayenga, et al.; and U.S. Patent Application Serial No. 08/302,355, filed September 7, 1994 entitled "Method and Apparatus for Rapid Capture of Focused Microscopic Images" to Hayenga, et al., which is a continuation-in-part of Application Serial No. 07/838,063 filed on February 18, 1992 the disclosures of which are incorporated herein, in their entirety, by the foregoing references thereto.
The present invention is also related to biological and cytological systems as described in the following patent applications which are assigned to the same assignee as the present invention, filed on September 20, 1994 unless otherwise noted, and which are all hereby incorporated by reference including U.S. Patent Application Serial No. 08/309,118, to Kuan et al. entitled, "Field Prioritization Apparatus and Method," U.S. Patent Application Serial No. 08/309,116 to Meyer et al. entitled "Apparatus for Automated Identification of Thick Cell Groupings on a Biological Specimen," U.S. Patent Application Serial No. 08/309,115 to Lee et al. entitled "Biological Analysis System Self Calibration Apparatus," U.S. Patent Application Serial No. 08/308,992, to Lee et al. entitled "Apparatus for Identification and Integration of Multiple Cell Patterns," U.S. Patent Application Serial No. 08/309,063 to Lee et al. entitled "A Method for Cytological System Dynamic Normalization," U.S. Patent Application Serial No. 08/309,248 to Rosenlof et al. entitled "Method and Apparatus for Detecting a Microscope Slide Coverslip, " U.S. Patent Application Serial No. 08/309,077 to Rosenlof et al. entitled "Apparatus for Detecting Bubbles in Coverslip Adhesive," U.S. Patent Application Serial No. 08/309,931, to Lee et al. entitled "Cytological Slide Scoring Apparatus," U.S. Patent Application Serial No. 08/309,148 to Lee et al. entitled "Method and Apparatus for Image Plane Modulation Pattern Recognition," U.S. Patent Application Serial No. 08/309,250 to Lee et al. entitled "Apparatus for the Identification of Free-Lying Cells," U.S. Patent Application Serial No. 08/309,209 to Oh et al. entitled "A Method and Apparatus for Robust Biological Specimen Classification," U.S. Patent Application Serial No. 08/309,117, to Wilhelm et al. entitled "Method and Apparatus for Detection of Unsuitable Conditions for Automated Cytology Scoring."
Now refer to Figures 1A and 1B which show a schematic diagram of one embodiment of the apparatus of the invention for checking illumination integrity for an automated microscope. While the method and apparatus of the invention will be discussed in terms of an example herein related to an automated cytology apparatus, it will be understood that the invention is not so limited. The features and principles of the invention may be applied to check urine analysis processes, semiconductor process defects, liquid crystal devices and other types of processing systems employing, for example, continuous arc lamps, filament lamps, laser sources, tube cameras, PIN diodes and photomultiplier tubes.
The apparatus of the invention comprises an imaging system 502, a motion control system 504, an image processing system 536, a central processing system 540, and a workstation 542. The imaging system 502 is comprised of an illuminator 508, imaging optics 510, a CCD camera 512, an illumination sensor 514 and an image capture and focus system 516. The image capture and focus system 516 provides video timing data to the CCD camera 512, and the CCD camera 512 provides images comprising scan lines to the image capture and focus system 516. An illumination intensity reading is provided to the image capture and focus system 516 by the illumination sensor 514, which receives a sample of the image from the optics 510. In one embodiment of the invention, the optics may further comprise an automated microscope. The illuminator 508 provides illumination of a slide. The image capture and focus system 516 provides data to a VME bus 538. The VME bus 538 distributes the data to an image processing system 536. The image processing system 536 is comprised of field-of-view processors 568. The images are sent along the image bus 564 from the image capture and focus system 516. A central processor 540 controls the operation of the invention through the VME bus 538. In one embodiment the central processor 562 comprises a Motorola 68030 CPU. The motion controller 504 is comprised of a tray handler 518, a microscope stage controller 520, a microscope turret controller 522, and a calibration slide 524. The motor drivers 526 position the slide under the optics. A bar code reader 528 reads a barcode located on the slide 524. A touch sensor 530 determines whether a slide is under the microscope objectives, and a door interlock 532 prevents operation if the doors are open. Motion controller 534 controls the motor drivers 526 in response to the central processor 540. An Ethernet (TM) communication system 560 communicates to a workstation 542 to provide control of the system. A hard disk 544 is controlled by workstation processor 550. In one embodiment, workstation 542 may comprise a Sun Sparc Classic (TM) workstation. A tape drive 546 is connected to the workstation processor 550 as well as a modem 548, a monitor 552, a keyboard 554, and a mouse pointing device 556. A printer 558 is connected to the Ethernet (TM) network system 560.
During image collection integrity checking, the central computer 540, running a real time operating system, controls the automated microscope and the processor to acquire and digitize images from the microscope. The flatness of the slide may be checked, for example, by contacting the four corners of the slide using a computer controlled touch sensor. The computer 540 also controls the microscope stage to position the specimen under the microscope objective, and from one to 15 field of view (FOV) processors 568 which receive images under control of the computer 540.
Referring now to Figure 1C, there shown is placement of a calibration and test target 1 into an optical path of an automated microscope 3 having a turret 21. The calibration and test target may be mounted on a stage 521 substantially in a horizontal X,Y plane which intersects the optical path. The stage 521 is movable in the X,Y plane as well as along a Z axis which is perpendicular to the X,Y plane and which is parallel to the optical axis of the automated microscope. The turret 21 may comprise multiple objective lenses as is well known in the art. The microscope turret control 522 provides signals in a well known manner for positioning a selected objective lens into position for viewing a slide, for example.
It is to be understood that the various processes described herein may be implemented in software suitable for running on a digital processor. The software may be embedded, for example, in the central processor 540.
Refer now to Figure 2A which shows a process flow diagram of the method of the invention to analyze cell aggregates. An image is acquired in step 12. In a preferred embodiment of the invention, the image is acquired using a digital camera attached to a microscope as shown in Figures 1A, 1B and 1C. The image acquired by the camera is of the cytologic specimen, magnified by an objective lens of 20x magnification. The camera digitizes the image to 512 by 512 pixels to a depth of 8 bits. The magnification of 20x and an image size of 512 by 512 pixels are by way of example and not limitation, and one skilled in the art will appreciate that other magnifications and image sizes may be used without departing from the scope of the invention.
Since cellular aggregates may not exist in every acquired image, and since it is important to process images rapidly, the invention avoids extensive processing of images that contain no material of interest. Image feature extraction 14 and image classification 16 show the process to achieve rapid removal of unproductive images. Features and properties of the image are measured in image feature extraction in step 14. The measured features are then used to determine if the image may contain identifiable cellular aggregates. In one preferred embodiment, if a characteristic called AverageHighPixelValue is greater than 240, then the image is rejected. AverageHighPixelValue may be defined as the average intensity value of all pixels with pixel counts above 200 in an image where 0 is black and 255 is white. The AverageHighPixelValue rule will identify images with very little cellular material. Such images have little chance of representing a cellular aggregate.
Additionally, if:
(SmallDarkEdgeAverage * 35000) + HighPixelCount < 15000, then the image is rejected, where SmallDarkEdgeAverage may be defined as the average value of the image subjected to a 5x5 closing residue operation:
SmallDarkEdgeAverage = (1/NPixels) * Σ_AllPixels [((IOrig ⊕ (5x5)) θ (5x5)) - IOrig]
where NPixels is the number of pixels in the image, Allpixels indicates that the summation covers all pixels in the image, IOrig is the original image, ⊕ is the morphological dilation operator (for example, as disclosed in Serra, J., "Image Analysis and Mathematical Morphology", Academic Press, 1982), θ is the morphological erosion operator, closing residue is represented by the operation enclosed in parenthesis above, and HighPixelCount is the number of pixels with pixel counts above 200 in the original image. The SmallDarkEdgeAverage rule will identify images with so much material that reliable detection and identification of cellular aggregates is unlikely.
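For illustration only, the two rejection rules above may be sketched in Python with numpy and scipy. This is a minimal sketch assuming 8-bit grayscale numpy arrays; the function and variable names are ours, not the patent's:

import numpy as np
from scipy import ndimage

def reject_image(orig):
    """Return True if the image is unlikely to yield cellular aggregates."""
    high = orig[orig > 200]
    # AverageHighPixelValue rule: mostly bright background, little material.
    if high.size and high.mean() > 240:
        return True
    # SmallDarkEdgeAverage rule: mean of the 5x5 closing residue, combined
    # with the count of bright pixels, exactly as stated in the text above.
    residue = ndimage.grey_closing(orig, size=(5, 5)).astype(float) - orig
    small_dark_edge_average = residue.mean()
    high_pixel_count = int((orig > 200).sum())
    return small_dark_edge_average * 35000 + high_pixel_count < 15000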
Refer now to Figure 2B which shows the image segmentation method of the invention. The image segmentation step 18 performs the identification of potential cellular aggregates. It is based on first identifying potential cell nuclei and then determining which nuclei lie close enough to other nuclei to be considered part of a cellular aggregate. In one preferred embodiment, the image segmentation step includes five sub-steps. Segmentation steps 28 and 30 remove background objects, segmentation step 32 is image thresholding, segmentation step 34 is object refinement, and segmentation step 36 is nuclei clustering.
Refer now to Figure 3 which shows the method of the invention to remove large objects from the image. Since nuclei have a finite size range, it is useful to remove objects larger and smaller than that range before image thresholding step 32. In one preferred embodiment, large objects are removed by closing the image 38 with a 27 by 5 flat top structuring element 40 and with a 5 by 27 flat top structuring element 42.
IClosed = min[((IOrig ⊕ (27x5)) θ (27x5)), ((IOrig ⊕ (5x27)) θ (5x27))]
The closed image 44 is then iteratively eroded conditioned on the original image until no more erosion takes place. This is termed ultimate conditional erosion 46. The structuring element for the ultimate conditional erosion is a flat top cross of 3x3 pixels.
IErode(0) = IClosed
IErode(1) = max[IOrig, (IErode(0) θ ((3x3)Cross))]
i = 1
while (IErode(i) ≠ IErode(i-1))
{
    IErode(i+1) = max[IOrig, (IErode(i) θ ((3x3)Cross))]
    i++
}
IErode = IErode(i)
In the above equations, a 3x3 cross structuring element is a center pixel, two adjacent horizontal pixels, and two adjacent vertical pixels, and IErode(i) represents the ith iteration of the conditional erosion equation in the while loop above. The residue 48 of the conditionally closed, eroded image and the original image contains only objects that are small enough to be nuclei; large objects are removed. The residue image is then opened with a 9 by 9 flat top structuring element 50 to remove objects smaller than valid nuclei, thereby producing a residue image 52.
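A minimal Python sketch of this background-object removal, under the same assumptions as the sketch above; scipy's grey-scale morphology stands in for the patent's flat-top structuring elements:

import numpy as np
from scipy import ndimage

CROSS_3x3 = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=bool)

def nuclei_residue(orig):
    # Dual directional closings remove dark objects too large to be nuclei.
    closed = np.minimum(ndimage.grey_closing(orig, size=(27, 5)),
                        ndimage.grey_closing(orig, size=(5, 27)))
    # Ultimate conditional erosion: erode with a 3x3 cross, never dropping
    # below the original image, until a fixed point is reached.
    erode = closed
    while True:
        nxt = np.maximum(orig, ndimage.grey_erosion(erode, footprint=CROSS_3x3))
        if np.array_equal(nxt, erode):
            break
        erode = nxt
    # The residue keeps only objects small enough to be nuclei; the 9x9
    # opening then removes objects smaller than valid nuclei.
    residue = erode.astype(int) - orig.astype(int)
    return ndimage.grey_opening(residue, size=(9, 9))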
To define potential nuclear regions of the image, the gray scale image is thresholded to produce a binary mask in step 32. In a binary mask, pixels may take on one of two values, active or inactive. Active pixels represent regions where potential nuclei have been identified. In one preferred embodiment, thresholding is done by combining the results of two different methods, thereby incorporating the advantages of one to offset the disadvantages of the other. The first method segments the majority of nuclei completely with little problem of over segmentation. In one preferred embodiment, the nuclei identified by the first method are used when the second method confirms that the mask of the first is not a false segmentation, as shown schematically in Figures 4A, 4B, 4C and 4D.
Now refer to Figures 4A, 4B, 4C and 4D, which show a graphical example of combination of two segmentation masks to take advantage of the strengths of each. In this preferred embodiment, the first segmentation mask is created by blurring, scaling, and clipping the original image 55 for use as a threshold image for the image that resulted from step 28 above. Blurring removes information other than background from the threshold image. Scaling provides appropriate threshold levels. Clipping assures that nuclei have at least a minimum residue strength before they are included in the mask. In a preferred embodiment, the functions are implemented as:
IBlur = ((IOrig) θ (9x9)) ⊕ (6x6)
IScale = IBlur / 2
IClip = max(IScale, 10)
then the first mask is generated by the following rule:
For each Pixel i
{
If (IiResidue > IiClip) then IiFirstMask = 1 else IiFirstMask = 0
}
Where IiResidue is the ith pixel of image IResidue and IiClip is the ith pixel of image IClip. In this preferred embodiment, the second mask is created by conditioning the residue image IResidue.
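A minimal sketch of the first mask in the same Python conventions: blur, scale, and clip the original image, then threshold the residue image against it. The names are ours:

import numpy as np
from scipy import ndimage

def first_mask(orig, residue):
    # "Blur" per the equations above: a 9x9 erosion followed by a 6x6 dilation.
    blur = ndimage.grey_dilation(ndimage.grey_erosion(orig, size=(9, 9)),
                                 size=(6, 6))
    scale = blur // 2                   # IScale = IBlur / 2
    clip = np.maximum(scale, 10)        # IClip = max(IScale, 10)
    return residue > clip               # active where the residue is strong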
Often nuclei appear overlapped in an image. If a threshold is calculated for all nuclei without regard to overlap, nuclei that overlap will be segmented as one. If the threshold for overlapping objects is adjusted to segment less, the nuclei may segment as separate objects. When objects overlap, their apparent darkness is greater; their residue will be stronger. Therefore, in one preferred embodiment, objects with strong residue are thresholded differently than those with weak residue.
A modified residue image, IResidueMax, containing only monotonically valued objects is created by conditionally dilating the residue image only in areas where pixels are greater than zero so that each object has all pixels equal to the value of the maximum pixel value for the object in the residue image.
IResidueMax(0) = IResidue
i = 1
while (i ≤ 8)
{
    ITemp = (IResidueMax(i-1) ⊕ ((3x3)Cross))
    For each Pixel j
    {
        If (IjResidue > 0) then IjResidueMax(i) = IjTemp else IjResidueMax(i) = 0
    }
    i++
}
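A minimal sketch of this conditional dilation in the same Python conventions; eight 3x3-cross grey dilations, masked to the non-zero support of the residue, propagate each object's maximum value over the whole object:

import numpy as np
from scipy import ndimage

CROSS_3x3 = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=bool)

def residue_max(residue):
    support = residue > 0
    out = residue.copy()
    for _ in range(8):
        temp = ndimage.grey_dilation(out, footprint=CROSS_3x3)
        out = np.where(support, temp, 0)   # dilate only where residue > 0
    return out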
An object is considered to have a strong residue if its maximum residue value is greater than StrongObjectTestValue, where StrongObjectTestValue is defined by an equation that appears in the original only as an image [equation not reproduced]. The equation uses a binomial filter (in the case of a 3x3, it is a convolution operation, for which the kernel weights would be 1/16, 2/16, 1/16 for the top row, 2/16, 4/16, 2/16 for the middle row, and 1/16, 2/16, 1/16 for the bottom row). An image, IStrongObjects, is produced by retaining only the strong objects.
For each Pixel i
An edge image is created.
From the edge image, the residue image, and the strong object image, a threshold is computed.
The mask for the second method is:
For each Pixel i
The nuclei mask is the conditional dilation of the second mask conditioned on the first mask (the conditional dilation is repeated 4 times to allow the complete dilation of all nuclei).
i=1
while (i ≤ 4)
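A minimal sketch of this mask combination in the same Python conventions. scipy's binary_dilation accepts a mask that confines the growth, so the second mask, restricted to the first, is grown four times with a 3x3 cross without ever leaving the first mask:

import numpy as np
from scipy import ndimage

CROSS_3x3 = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=bool)

def nuclei_mask(first_mask, second_mask):
    seed = second_mask & first_mask
    return ndimage.binary_dilation(seed, structure=CROSS_3x3,
                                   iterations=4, mask=first_mask)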
Now refer to Figure 5A which shows one embodiment of nuclear thresholding as employed in the method of the invention. An original image 38 and a residue image 52 are fed to a first segmenter 172 and a second segmenter 174. The first segmenter 172 generates a first mask 176. The second segmenter 174 generates a second mask 178. The combination of the first mask 176 and the second mask 178 in combiner 89 produces a nuclei image 124.
Now refer to Figure 5B which shows the first segmenter of the invention. The original image 38 is fed to be blurred in 180. The image is then scaled in 182. The image is then clipped in 184. The residue image 52 is then thresholded with the clipped image in 186 to generate a first mask 176.
Now refer to Figure 5C which shows the creation of the second mask 178. The original image 38 is fed to the strong object test value calculator 188. The residue image is fed to create a maximum residue image 190 and an edge image 192. A strong object image is created by thresholding 194. The second mask 178 is created by taking the residue image 52 and thresholding with the threshold images generated from the strong object image 194 and the edge image 192.
Now refer to Figure 5D comprising Figure 5E and Figure 5F which are intended to be pieced together to show the processing flow for a preferred embodiment of nuclear thresholding to find a nuclei image from an original image, where a residue image has already been created from the original image. The cytological image processing method starts with the step of obtaining a digital representation of the cytological image 38. The invention then does a 3x3 erode 81 of the residue image 52 to provide a first representation 101. A 3x3 dilation 82 of the residue image 52 provides a second representation 102. Subtracting 83 the residue image 52 from the second representation 102 provides a third representation 103. Subtracting 84 the third representation 103 from the first representation 101 and setting all negative values to zero provides a fourth representation 104. The invention then compares the residue image 52 to zero to provide a binary condition control signal 125. The invention then repeats a binary conditional dilation 86 with a 3x3 cross, eight times to provide a fifth representation 105. The residue image 52 is transferred to a sixth representation 106 if the fifth representation 105 is greater than a nineteenth representation 119. The invention then morphologically computes 88 a binary result to a seventh representation 107, the binary result being one if the residue image 52 is greater than a predetermined combination of the fourth representation 104, the residue image 52 and the sixth representation 106, zero otherwise. In one embodiment, if the residue image 52 is greater than two times the fourth representation 104 plus 0.5 times the residue image 52 plus .375 times the sixth representation 106, then the seventh representation 107 is set to a one, zero otherwise. A 3x3 blurring 91 of the original image 38 provides an eighth representation 108. A 3x3 dilation 92 of the original image 38 provides a ninth representation 109. A 9x9 erosion 94 of the original image 38 provides a tenth representation 110. The invention then subtracts the original image 38 from the eighth representation 108 to provide an eleventh representation 111. Subtraction 95 of the original image 38 from the ninth representation 109 provides a twelfth representation 112. Dilation 97 of the tenth representation 110 provides a thirteenth representation 113. Conversion of negative pixels to positive pixels of the same magnitude 98 for the eleventh representation 111 gives the fifteenth representation 115. Computation of the pixel average 99 of the twelfth representation 112 provides a fourteenth representation 114. Computation of the pixel average 79 of the fifteenth representation 115 provides a seventeenth representation 117. Computation of the pixel average 78 of the original image 38 provides an eighteenth representation 118. Shifting 100 of the thirteenth representation 113 right one bit provides a sixteenth representation 116. Computation of the sum 77 of the fourteenth representation 114, seventeenth representation 117, three times the eighteenth representation 118 and subtracting 255 provides the nineteenth representation 119. Taking the maximum 75 of the sixteenth representation 116 and the value 10 provides a twentieth representation 120. Comparison 90 of the residue image 52 to the twentieth representation 120 provides a twenty-first representation 121. Conditional dilation 89 of the seventh representation 107 and twenty-first representation 121 provides the nuclei image 124.
Object refinement is conducted in step 34, Figure 2B. Figure 6 shows the object refinement step 34 in more detail. Small holes may be filled in the segmentation by closing with a 3x3 structuring element 126. The segmentation mask may then be smoothed by opening with a 5x5 structuring element 128. Note that the nuclei image is updated upon completion of the smoothing operation, creating the smooth nuclei image 130.
INucleiNoHoles = ((INuclei ⊕ (3x3)) θ (3x3))
INuclei = INucleiSmooth = ((INucleiNoHoles θ (5x5)) ⊕ (5x5))
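A minimal sketch of object refinement in the same Python conventions; on a 0/1 mask the flat grey operators reduce to binary closing and opening:

import numpy as np
from scipy import ndimage

def refine(nuclei):
    no_holes = ndimage.binary_closing(nuclei, structure=np.ones((3, 3)))  # fill small holes
    return ndimage.binary_opening(no_holes, structure=np.ones((5, 5)))    # smooth the outline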
Now refer to Figure 7A which shows the segmentation step for nuclei clustering. A nuclei image 124 has very small objects removed at 198. Remaining objects are expanded depending upon their size, where large objects are expanded more than small objects 199. A dilated image 159 is generated.
Now refer to Figure 7B which shows processing flow for nuclei clustering. In one preferred embodiment, clustering 36 (Figure 2B) is nuclear size dependent. Small nuclei must be close to be considered part of an aggregate, while larger nuclei are not so restricted. Larger nuclei may be more distant and still considered part of an aggregate. Clustering is accomplished by dilating nuclei dependent on size. Size dependent dilation is accomplished by creating nuclei masks for nuclei of different sizes, then dilating each according to size range and "OR"ing the masks to give a final cluster mask.
ISize1 = (INuclei θ (5x5))
ISize2 = (ISize1 θ ((3x3)Cross))
ISize3 = (ISize2 θ ((3x3)Cross))
IDilate1 = (ISize3 ⊕ ((5x5)Diamond)) + ISize2
IDilate2 = (IDilate1 ⊕ (9x9)) + ISize1
IDilate3 = (IDilate2 ⊕ ((7x7)Diamond)) ⊕ (4x4)
where a 5x5 Diamond structuring element is
0 0 1 0 0
0 1 1 1 0
1 1 1 1 1
0 1 1 1 0
0 0 1 0 0
and a 7x7 Diamond is the 5x5 Diamond dilated by a 3x3 Cross. The invention takes the nuclei image 124 and does a 5x5 erosion 132 of the nuclei image 124 to provide a first cluster representation 151. A 3x3 cross erosion 136 of the first cluster representation 151 provides a second cluster representation 152. A 3x3 cross erosion 138 of the second cluster representation 152 provides a third cluster representation 153. A 5x5 diamond dilation 140 of the third cluster representation 153 provides a fourth cluster representation 154. The logical ORing 142 of the second cluster representation 152 and fourth cluster representation 154 provides a fifth cluster representation 155. Dilating 144 the fifth cluster representation provides a sixth cluster representation 156. Logical ORing 146 the first cluster representation 151 and sixth cluster representation 156 provides a seventh cluster representation 157. A 7x7 cross dilation 148 of the seventh cluster representation 157 provides an eighth cluster representation 158. A 4x4 dilation 150 of the eighth cluster representation 158 provides the segmented image 159.
In one preferred embodiment, an object in the segmented image 159 (IDilate3) larger than 2400 pixels is considered to be a potential cellular aggregate. Other objects are removed from consideration.
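A minimal sketch of the size-dependent clustering and the 2400-pixel area test in the same Python conventions; successive erosions sort nuclei into size classes, each class is dilated by an amount that grows with size, the results are OR-ed ("+" in the equations above), and only sufficiently large connected regions are kept:

import numpy as np
from scipy import ndimage

CROSS_3x3 = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=bool)
DIAMOND_5x5 = np.array([[0, 0, 1, 0, 0],
                        [0, 1, 1, 1, 0],
                        [1, 1, 1, 1, 1],
                        [0, 1, 1, 1, 0],
                        [0, 0, 1, 0, 0]], dtype=bool)
# A 7x7 diamond is the 5x5 diamond dilated by a 3x3 cross.
DIAMOND_7x7 = ndimage.binary_dilation(np.pad(DIAMOND_5x5, 1), structure=CROSS_3x3)

def potential_aggregates(nuclei):
    size1 = ndimage.binary_erosion(nuclei, structure=np.ones((5, 5)))
    size2 = ndimage.binary_erosion(size1, structure=CROSS_3x3)
    size3 = ndimage.binary_erosion(size2, structure=CROSS_3x3)
    dil1 = ndimage.binary_dilation(size3, structure=DIAMOND_5x5) | size2
    dil2 = ndimage.binary_dilation(dil1, structure=np.ones((9, 9))) | size1
    dil3 = ndimage.binary_dilation(
        ndimage.binary_dilation(dil2, structure=DIAMOND_7x7),
        structure=np.ones((4, 4)))
    # Keep only connected regions larger than 2400 pixels.
    labels, n = ndimage.label(dil3)
    areas = ndimage.sum(dil3, labels, index=range(1, n + 1))
    return np.isin(labels, 1 + np.flatnonzero(areas > 2400))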
Referring again to Figure 2A, feature extraction 20 is the measurement of features related to the segmented potential cellular aggregates. In one embodiment, the ratio of the standard deviation of the pixel values of the nuclei to the standard deviation of the pixel values of the cluster is measured. Also, the standard deviation of the nuclear compactness is measured, where nuclear compactness is defined as:
NuclearCompactness = (Perimeter^2) / Area
With feature values available, the object classification 22 step may be performed. In one embodiment, an object is classified as a probable squamous artifact if:
(StdNuclei/StdCluster) + (StdNucCompact * 0.038) > 1.14.
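A minimal sketch of this classification rule in the same Python conventions, assuming the per-object statistics have already been measured; the parameter names are ours:

import numpy as np

def is_probable_squamous_artifact(nuclei_values, cluster_values, compactness):
    # compactness holds Perimeter^2 / Area for each nucleus in the object.
    std_nuclei = np.std(nuclei_values)
    std_cluster = np.std(cluster_values)
    std_nuc_compact = np.std(compactness)
    return (std_nuclei / std_cluster) + (std_nuc_compact * 0.038) > 1.14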
The invention has been described herein in considerable detail in order to comply with the Patent Statutes and to provide those skilled in the art with the information needed to apply the novel principles and to construct and use such specialized components as are required. However, it is to be understood that the invention can be carried out by specifically different equipment and devices, and that various modifications, both as to the equipment details and operating procedures, can be accomplished without departing from the scope of the invention itself.
What is claimed is:

Claims

1. An automated cell group recognizer (10) for locating groups of cells on a biological specimen comprising:
(a) an automated microscope (511) for acquiring an image representation (12) of a portion of the biological specimen wherein the automated microscope (511) has an image representation output;
(b) an image feature extractor (14) coupled to the automated microscope (511) to receive the image representation, wherein the image feature extractor (14) has an image feature vector output; and
(c) a classifier (16) coupled to the image feature vector output, wherein the classifier (16) has an output indicative of a likelihood that a predetermined part of the biological specimen contains a group of cells, and wherein the classifier (16) also has a likelihood output.
2. The automated cell group recognizer of claim 1 further including a segmenter (18) connected to the image representation output, the image feature vector output, and the likelihood output where the segmenter (18) has an output that locates cell groups on the biological specimen.
3. The apparatus of claim 2 further including an object feature extractor (20) connected to the group location output, the image feature vector output, and the image representation output, the object feature extractor (20) having a group feature vector output.
4. The apparatus of claim 3 further including an object classifier (22) connected to the group feature vector output and the image feature vector output, wherein the object classifier (22) has an object classification output.
5. The apparatus of claim 1 wherein the biological specimen is a pap smear.
6. The apparatus of claim 1 wherein the biological specimen is a gynecological specimen.
7. The apparatus of claim 1 wherein the group of cells (55) comprises cells that overlap.
8. The apparatus of claim 1 wherein the portion of the biological specimen comprises at least one field of view of the automated microscope (511).
9. The apparatus of claim 1 wherein the image representation further comprises pixels and wherein the image feature extractor (14) further comprises:
(a) a pixel intensity averager (562) connected to the image representation output wherein the pixel intensity averager has an average pixel output for pixels having an intensity greater than a first predetermined intensity;
(b) a small dark averager (562) connected to the image representation output having a small dark averager output; and
(c) a high pixel counter (562) connected to the image representation to count a number of pixels greater than a second predetermined intensity wherein the high pixel counter has a count output.
10. The apparatus of claim 9 wherein the first predetermined intensity is at least 200 for an 8 bit pixel.
11. The apparatus of claim 9 wherein the small dark averager further comprises a computer processor (562) coupled to the image representation output to compute a small dark average by the following equation:
[small dark average equation not reproduced; appears in the source only as image imgf000025_0001]
12. The apparatus of claim 1 wherein the classifier (16) further comprises an average high pixel value comparator (562) to compare an average high pixel value against a first predetermined intensity threshold wherein the average high pixel value comparator outputs the likelihood output.
13. The apparatus of claim 1 wherein the classifier (16) further comprises a small dark averager/high pixel comparator (562) connected to the image representation output wherein the small dark averager/high pixel comparator outputs the likelihood output.
14. The apparatus of claim 1 wherein the classifier (16) further comprises a computer processor (562) coupled to the image representation output to compute a small dark average by the following equation:
[small dark average equation not reproduced; appears in the source only as image imgf000026_0001]
15. The apparatus of claim 1 wherein the classifier (16) further comprises a small dark averager/high pixel comparator connected to the image representation, wherein the small dark averager/high pixel comparator further comprises a processor (562) in which the image representation output is used to compute a small dark average by the following equation:
[small dark average equation not reproduced; appears in the source only as image imgf000026_0002]
16. The apparatus of claim 2 wherein the segmenter (18) further comprises:
(a) a large object remover (28) to filter objects greater than a first predetermined size from the image representation output having a first image output;
(b) a small object remover (30) connected to the first image output to filter objects smaller than a second predetermined size having a second image output;
(c) an image thresholder (32) connected to the second image output to locate a nuclei-like object having a third image output;
(d) an object refiner (34) to fill small holes and remove jagged edges from the third image output having a fourth image output; and
(e) a clusterer (36) connected to the fourth image output to group nuclei that are in close proximity and eliminate those that are not in close proximity, wherein the clusterer (36) has an output that locates all groups on the biological specimen.
17. The apparatus of claim 16 where the large object remover (28) and small object remover (30) comprise grey scale morphological operators.
18. The apparatus of claim 16, wherein the large object remover (28) further comprises a field of view computer (568).
19. The apparatus of claim 16 wherein the small object remover (30) further comprises a morphological processor (88) to perform an opening operation.
20. The apparatus of claim 19 wherein the opening operation comprises a 9x9 flat top operation (94).
21. An image segmenting apparatus (18) for segmenting an image comprising:
(a) a large object remover (28) to filter objects greater than a first predetermined size from the image (55) having a first image output;
(b) a small object remover (30) connected to the first image output to filter objects smaller than a second predetermined size having a second image output;
(c) an image thresholder (32) connected to the second image output to locate a nuclei-like object having a third image output;
(d) an object refiner (34) to fill small holes and remove jagged edges from the third image output having a fourth image output; and
(e) a clusterer (36) connected to the fourth image output to group nuclei that are in close proximity and eliminate those that are not in close proximity.
22. A method of creating a residue image (52) from a digital image (38) comprising the steps of:
(a) first closing the digital image to provide a first representation (40);
(b) second closing the digital image to provide a second representation (42);
(c) minimizing the first representation and the second representation to provide a third representation (44);
(d) ultimately conditionally eroding the third representation with a 3x3 cross (46);
(e) subtracting the third representation from the digital image to provide a fourth representation (48); and
(f) opening the fourth representation (50) to provide the residue image (52).
23. The method of claim 22 wherein step (a), first closing the digital image, comprises a 5x27 closing (40).
24. The method of claim 22 wherein step (b), second closing the digital image, comprises a 27x5 closing (42).
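Read together, claims 22-24 describe a directional-closing residue. A minimal Python sketch under stated assumptions follows: step (d) is interpreted as grey scale geodesic (conditional) erosion of the minimum image toward the original, iterated to stability; the difference is computed as the eroded background minus the original, the order that yields a nonzero residue for dark nuclei (the claim recites the opposite order); and the 3x3 opening size is an assumption, since the claim gives none:

import numpy as np
from scipy import ndimage as ndi

def residue_image(img: np.ndarray) -> np.ndarray:
    # (a), (b) directional closings per claims 23 and 24.
    first = ndi.grey_closing(img, size=(5, 27))
    second = ndi.grey_closing(img, size=(27, 5))
    third = np.minimum(first, second)                # (c) pixelwise minimum

    # (d) conditional erosion with a 3x3 cross, iterated until stable
    # (interpreted as geodesic erosion bounded from below by the original).
    cross3 = ndi.generate_binary_structure(2, 1)
    while True:
        eroded = np.maximum(ndi.grey_erosion(third, footprint=cross3), img)
        if np.array_equal(eroded, third):
            break
        third = eroded

    fourth = np.clip(third.astype(np.int32) - img, 0, 255)             # (e) difference
    return ndi.grey_opening(fourth.astype(img.dtype), size=(3, 3))     # (f) opening; size assumed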
25. A cytological image processing method (Fig. 5D) for finding a nuclei image in an image (55), where a residue image (52) has already been created, the cytological image processing method (Fig. 5D) comprising the steps of:
(a) obtaining an image representation of the image (52);
(b) 3x3 eroding the residue image to provide a first representation (81);
(c) 3x3 dilating the residue image to provide a second representation (82);
(d) subtracting the residue image from the second representation to provide a third representation (83);
(e) subtracting the third representation from the first representation and zeroing negative pixels to provide a fourth representation (84);
(f) comparing the residue image to zero (85) to provide a binary condition control signal (125);
(g) repeating a binary conditional dilation with a 3x3 cross eight times to provide a fifth representation (86);
(h) providing the residue image to a sixth representation if the fifth representation is greater than a nineteenth representation (87);
(i) morphologically computing a binary result to a seventh representation, the binary result being one if the residue image is greater than a predetermined combination of the residue image, the fourth representation, and the sixth representation, zero otherwise (88);
(j) 3x3 blurring the image to provide an eighth representation (91);
(k) 3x3 dilating the image to provide a ninth representation (92);
(l) 9x9 eroding the image to provide a tenth representation (94);
(m) subtracting the image from the eighth representation to provide an eleventh representation (95);
(n) subtracting the image from the ninth representation to provide a twelfth representation (96);
(o) dilating the tenth representation to provide a thirteenth representation (97);
(p) computing an absolute value of the eleventh representation to provide a fifteenth representation (98);
(q) computing a pixel average of the twelfth representation to provide a fourteenth representation (99);
(r) computing a pixel average of the fifteenth representation to provide a seventeenth representation (79);
(s) computing a pixel average of the image to provide an eighteenth representation (78);
(t) shifting the thirteenth representation right one bit to provide a sixteenth representation (100);
(u) computing a sum of the seventeenth representation, fourteenth representation, three times the eighteenth representation and subtracting 255 to provide the nineteenth representation (77);
(v) taking a maximum of the sixteenth representation and the value 10 to provide a twentieth representation (75);
(w) comparing the residue image (52) to the twentieth representation to provide a twenty-first representation (90); and
(x) conditionally dilating the seventh representation and twenty-first representation (89) to provide the nuclei image (124).
26. The cytological image processing method (Fig. 5D) of claim 25 further comprising the steps of closing (126) the nuclei image (124) to provide a closed image and opening the closed image (128) to provide a smoothed nuclei image (130).
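Claim 26's smoothing can be sketched as a closing followed by an opening; the 3x3 cross structuring element below is an assumption, since the claim does not specify one:

import numpy as np
from scipy import ndimage as ndi

def smooth_nuclei(nuclei_mask: np.ndarray) -> np.ndarray:
    cross3 = ndi.generate_binary_structure(2, 1)
    closed = ndi.binary_closing(nuclei_mask, cross3)   # (126) fill small holes and gaps
    return ndi.binary_opening(closed, cross3)          # (128) remove small spurs -> (130)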
27. A method of segmenting an image into a segmented image (36), the method comprising the steps of:
(a) 5x5 eroding a nuclei image (124) to provide a first representation (132);
(b) 3x3 cross eroding the first representation to provide a second representation (136);
(c) 3x3 cross eroding the second representation to provide a third representation (138);
(d) 5x5 diamond dilating the third representation to provide a fourth representation (140);
(e) logical ORing the second representation and fourth representation to provide a fifth representation (142);
(f) dilating the fifth representation to provide a sixth representation (144);
(g) logical ORing the first representation and sixth representation to provide a seventh representation (146);
(h) 7x7 cross dilating the seventh representation to provide an eighth representation (148); and
(i) 4x4 dilating the eighth representation (150) to provide the segmented image (159).
28. A method of nuclear thresholding comprising the steps of:
(a) acquiring an original image of a biological specimen (38);
(b) computing a residue image of the original image (52);
(c) performing a first segmentation method on the original image and the residue image (172);
(d) performing a second segmentation method on the original image and the residue image (174); and
(e) combining results of the first segmentation method and second segmentation method (89) to identify nuclei (124) of objects on the biological specimen.
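One reading of combining step (89) in claim 28 is a reconstruction by dilation: high-confidence seeds from one segmentation grow only inside the mask produced by the other. scipy's binary_dilation supports this directly through its mask argument (iterations=0 repeats until the result stabilizes); treating the combination this way is an interpretation, not the patent's stated formula:

import numpy as np
from scipy import ndimage as ndi

def combine_segmentations(strong_seeds: np.ndarray, permissive_mask: np.ndarray) -> np.ndarray:
    # Grow the high-confidence seeds, but never outside the permissive mask.
    cross3 = ndi.generate_binary_structure(2, 1)
    return ndi.binary_dilation(strong_seeds, structure=cross3,
                               iterations=0, mask=permissive_mask)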
29. The method of claim 28 wherein the first segmentation method (172) comprises:
(a) blurring the original image to form a blurred image (180);
(b) scaling the blurred image to form a scaled image (182);
(c) clipping the scaled image to form a clipped image (184); and
(d) thresholding (186) the residue image (52) with the clipped image (184) to generate a first mask (176) that is combined with an output of the second segmentation method to identify nuclei of cells on the biological specimen.
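Claim 29's blur/scale/clip/threshold pipeline might look like the sketch below; the 3x3 blur window and the scale and clip parameters are illustrative placeholders, since the claim gives no values:

import numpy as np
from scipy import ndimage as ndi

def first_segmentation_mask(img: np.ndarray, residue: np.ndarray,
                            scale: float = 0.5, clip_lo: int = 10,
                            clip_hi: int = 255) -> np.ndarray:
    blurred = ndi.uniform_filter(img.astype(np.float32), size=3)  # (a) blur
    scaled = blurred * scale                                      # (b) scale
    clipped = np.clip(scaled, clip_lo, clip_hi)                   # (c) clip
    return residue > clipped                                      # (d) pixelwise threshold (176)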
30. The method of claim 28 wherein the second segmentation method (174) comprises the steps of:
(a) calculating a strong object test value from the original image (188);
(b) creating a maximum residue image (190) from the residue image (52);
(c) creating an edge image from a clipped image (192);
(d) thresholding a maximum clipped image with the strong object test value to generate a strong object image (194);
(e) creating a threshold image from the strong object image, the edge image, and residue image (196); and
(f) generating a second mask that is combined with an output of the first segmentation method to identify nuclei in the original image of the biological specimen (178).
31. A method of nuclei clustering (Fig. 7A) comprising the steps of:
(a) removing very small objects from a nuclei image (198);
(b) expanding remaining objects depending upon size, where large objects are expanded more than small objects (199), to provide an expanded output and a nuclei clustered output (159).
PCT/US1995/011460 1994-09-20 1995-09-11 Apparatus for automated identification of cell groupings on a biological specimen WO1996009604A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU35861/95A AU3586195A (en) 1994-09-20 1995-09-11 Apparatus for automated identification of cell groupings on a biological specimen

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US30906194A 1994-09-20 1994-09-20
US08/309,061 1994-09-20

Publications (1)

Publication Number Publication Date
WO1996009604A1 true WO1996009604A1 (en) 1996-03-28

Family

ID=23196510

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1995/011460 WO1996009604A1 (en) 1994-09-20 1995-09-11 Apparatus for automated identification of cell groupings on a biological specimen

Country Status (3)

Country Link
US (1) US5978498A (en)
AU (1) AU3586195A (en)
WO (1) WO1996009604A1 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7006674B1 (en) 1999-10-29 2006-02-28 Cytyc Corporation Apparatus and methods for verifying the location of areas of interest within a sample in an imaging system
ES2283347T3 (en) * 1999-10-29 2007-11-01 Cytyc Corporation APPARATUS AND PROCEDURE TO VERIFY THE LOCATION OF AREAS OF INTEREST WITHIN A SAMPLE IN AN IMAGE FORMATION SYSTEM.
US6498863B1 (en) 2000-09-20 2002-12-24 Media Cybernetics Inc. Method, system, and product for analyzing a digitized image of an array to create an image of a grid overlay
WO2002048949A1 (en) * 2000-12-15 2002-06-20 Cellavision Ab Method and arrangment for processing digital image information
WO2003095986A1 (en) * 2002-05-14 2003-11-20 Amersham Biosciences Niagara, Inc. System and methods for rapid and automated screening of cells
US20060050376A1 (en) * 2004-09-02 2006-03-09 Houston Edward S Robotic microscopy apparatus for high throughput observation of multicellular organisms
GB0503629D0 (en) * 2005-02-22 2005-03-30 Durand Technology Ltd Method and apparatus for automated analysis of biological specimen
US7796815B2 (en) * 2005-06-10 2010-09-14 The Cleveland Clinic Foundation Image analysis of biological objects
US20070031043A1 (en) 2005-08-02 2007-02-08 Perz Cynthia B System for and method of intelligently directed segmentation analysis for automated microscope systems
JP4703605B2 (en) * 2007-05-31 2011-06-15 アイシン・エィ・ダブリュ株式会社 Feature extraction method, image recognition method and feature database creation method using the same
US8379960B2 (en) * 2009-03-30 2013-02-19 Ge Healthcare Bio-Sciences Corp. System and method for distinguishing between biological materials
CN103907023B (en) 2011-09-13 2016-10-05 皇家飞利浦有限公司 Abnormal system and method in detection biological sample
CN103793709A (en) * 2012-10-26 2014-05-14 西门子医疗保健诊断公司 Cell recognition method and device, and urine analyzer
CN105095901B (en) * 2014-04-30 2019-04-12 西门子医疗保健诊断公司 Method and apparatus for handling the block to be processed of sediment urinalysis image
WO2015172025A1 (en) 2014-05-08 2015-11-12 The Cleveland Clinic Foundation Systems and methods for detection, analysis, isolation and/or harvesting of biological objects
US10223590B2 (en) * 2016-08-01 2019-03-05 Qualcomm Incorporated Methods and systems of performing adaptive morphology operations in video analytics

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3824393A (en) * 1971-08-25 1974-07-16 American Express Invest System for differential particle counting
US4122518A (en) * 1976-05-17 1978-10-24 The United States Of America As Represented By The Administrator Of The National Aeronautics & Space Administration Automated clinical system for chromosome analysis
US4183013A (en) * 1976-11-29 1980-01-08 Coulter Electronics, Inc. System for extracting shape features from an image
US4175860A (en) * 1977-05-31 1979-11-27 Rush-Presbyterian-St. Luke's Medical Center Dual resolution method and apparatus for use in automated classification of pap smear and other samples
DE2903855A1 (en) * 1979-02-01 1980-08-14 Bloss Werner H Prof Dr Ing METHOD FOR AUTOMATICALLY MARKING CELLS AND DETERMINING THE CHARACTERISTICS OF CELLS FROM CYTOLOGICAL MEASUREMENT DEVICES
US4725543A (en) * 1979-12-04 1988-02-16 Kung Patrick C Hybrid cell line for producing complement-fixing monoclonal antibody to human suppressor T cells, antibody and methods
US4538299A (en) * 1981-12-04 1985-08-27 International Remote Imaging Systems, Inc. Method and apparatus for locating the boundary of an object
US4513438A (en) * 1982-04-15 1985-04-23 Coulter Electronics, Inc. Automated microscopy system and method for locating and re-locating objects in an image
DE3578241D1 (en) * 1985-06-19 1990-07-19 Ibm METHOD FOR IDENTIFYING THREE-DIMENSIONAL OBJECTS BY MEANS OF TWO-DIMENSIONAL IMAGES.
US5281517A (en) * 1985-11-04 1994-01-25 Cell Analysis Systems, Inc. Methods for immunoploidy analysis
US5086476A (en) * 1985-11-04 1992-02-04 Cell Analysis Systems, Inc. Method and apparatus for determining a proliferation index of a cell sample
US4709333A (en) * 1986-01-03 1987-11-24 General Electric Company Method and apparatus for imaging in the presence of multiple high density objects
US5231005A (en) * 1987-03-13 1993-07-27 Coulter Corporation Method and apparatus for screening cells or formed bodies with populations expressing selected characteristics
US5544650A (en) * 1988-04-08 1996-08-13 Neuromedical Systems, Inc. Automated specimen classification system and method
US4965725B1 (en) * 1988-04-08 1996-05-07 Neuromedical Systems Inc Neural network based automated cytological specimen classification system and method
US4973111A (en) * 1988-09-14 1990-11-27 Case Western Reserve University Parametric image reconstruction using a high-resolution, high signal-to-noise technique
US5253302A (en) * 1989-02-28 1993-10-12 Robert Massen Method and arrangement for automatic optical classification of plants
US5162990A (en) * 1990-06-15 1992-11-10 The United States Of America As Represented By The United States Navy System and method for quantifying macrophage phagocytosis by computer image analysis
US5257182B1 (en) * 1991-01-29 1996-05-07 Neuromedical Systems Inc Morphological classification system and method
US5361140A (en) * 1992-02-18 1994-11-01 Neopath, Inc. Method and apparatus for dynamic correction of microscopic image signals
US5315700A (en) * 1992-02-18 1994-05-24 Neopath, Inc. Method and apparatus for rapidly processing data sequences

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4097845A (en) * 1976-11-01 1978-06-27 Rush-Presbyterian-St. Luke's Medical Center Method of and an apparatus for automatic classification of red blood cells
US4975972A (en) * 1988-10-18 1990-12-04 At&T Bell Laboratories Method and apparatus for surface inspection
US5268967A (en) * 1992-06-29 1993-12-07 Eastman Kodak Company Method for automatic foreground and background detection in digital radiographic images

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ACADEMIC PRESS, issued 1982, J. SERRA, "Image Analysis and Mathematical Morphology", pages 377-378. *
APPLIED OPTICS, Volume 26, No. 16, issued 15 August 1987, J.W. BACUS et al., "Optical Microscope System for Standardized Cell Measurements and Analysis", pages 3280-3293. *
IEEE, issued 07 April 1992, A. ELMOATAZ et al., "Segmentation and Classification of Various Types of Cells in Cytological Images", pages 385-388. *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999015875A1 (en) * 1997-09-25 1999-04-01 Macquarie Research Ltd. Apparatus for removing a sample from an array of samples and a cutting tool for use with that apparatus
AU758531B2 (en) * 1997-09-25 2003-03-27 Macquarie Research Limited Apparatus for removing a sample from an array of samples and a cutting tool for use with that apparatus
US7894645B2 (en) 2000-08-10 2011-02-22 Ohio State University High-resolution digital image processing in the analysis of pathological materials
EP1339017A1 (en) * 2000-12-01 2003-08-27 Japan Science and Technology Corporation Nuclear area recognizing method and nuclear genealogy creating method
EP1339017A4 (en) * 2000-12-01 2007-08-29 Japan Science & Tech Corp Nuclear area recognizing method and nuclear genealogy creating method
DE10143441A1 (en) * 2001-09-05 2003-03-27 Leica Microsystems Process and microscope system for observing dynamic processes
US7120281B2 (en) 2001-09-05 2006-10-10 Leica Microsystems Cms Gmbh, Method; microscope system and software program for the observation of dynamic processes
WO2003090169A1 (en) * 2002-04-22 2003-10-30 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for separating a group of cells which are contained in an image sample into individual cells
US9240043B2 (en) 2008-09-16 2016-01-19 Novartis Ag Reproducible quantification of biomarker expression
EP2270712A1 (en) * 2009-06-30 2011-01-05 Koninklijke Philips Electronics N.V. Quality detection method and device for cell and tissue samples
CN108693096A (en) * 2017-12-13 2018-10-23 青岛汉朗智能医疗科技有限公司 Red blood cell detection method and system

Also Published As

Publication number Publication date
AU3586195A (en) 1996-04-09
US5978498A (en) 1999-11-02

Similar Documents

Publication Publication Date Title
US5978498A (en) Apparatus for automated identification of cell groupings on a biological specimen
US5987158A (en) Apparatus for automated identification of thick cell groupings on a biological specimen
Liao et al. An accurate segmentation method for white blood cell images
US5566249A (en) Apparatus for detecting bubbles in coverslip adhesive
US5933519A (en) Cytological slide scoring apparatus
US5757954A (en) Field prioritization apparatus and method
EP2053535B1 (en) Automated detection of cell colonies and coverslip detection using hough transforms
US5828776A (en) Apparatus for identification and integration of multiple cell patterns
US5627908A (en) Method for cytological system dynamic normalization
US6134354A (en) Apparatus for the identification of free-lying cells
US5528703A (en) Method for identifying objects using data processing techniques
CA2200457C (en) Biological analysis system self-calibration apparatus
US6198839B1 (en) Dynamic control and decision making method and apparatus
JPH10506462A (en) Method and apparatus for detecting inappropriate conditions for automated cytological scoring
US8064679B2 (en) Targeted edge detection method and apparatus for cytological image processing applications
JP4864709B2 (en) A system for determining the staining quality of slides using a scatter plot distribution
WO1996041301A1 (en) Image enhancement method and apparatus
GB2329014A (en) Automated identification of tubercle bacilli
EP0595506A2 (en) Automated detection of cancerous or precancerous tissue by measuring malignancy associated changes
JP4897488B2 (en) A system for classifying slides using a scatter plot distribution
Ma et al. A counting and segmentation method of blood cell image with logical and morphological feature of cell
Zhao et al. A survey of sperm detection techniques in microscopic videos
CN111583275A (en) Method, system, device and storage medium for identifying pathological number of section
WO2000062241A1 (en) Method and apparatus for determining microscope specimen preparation type
WO2000062240A1 (en) Automatic slide classification using microscope slide preparation type

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase