WO1991020048A1 - Cellular analysis utilizing video processing and neural network - Google Patents

Cellular analysis utilizing video processing and neural network

Info

Publication number
WO1991020048A1
Authority
WO
WIPO (PCT)
Prior art keywords
cell
color
cells
vectors
nucleus
Prior art date
Application number
PCT/US1991/004410
Other languages
French (fr)
Inventor
Eric T. Espenhahn
Jamie Pereira
Original Assignee
Applied Electronic Vision, Inc.
Priority date
Filing date
Publication date
Application filed by Applied Electronic Vision, Inc. filed Critical Applied Electronic Vision, Inc.
Publication of WO1991020048A1 publication Critical patent/WO1991020048A1/en


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology

Definitions

  • the present invention relates to a system for analyzing and classifying stained cells, viewed under a microscope, that incorporates video processing techniques and a neural network.
  • Microscopic cellular analysis is customarily done by medical technicians. These technicians study the various types of cell structures and analyze the stained cells under a microscope. In blood analysis, error rates sometimes approach twenty-five percent (25%). These error rates are due to the technician viewing cells eight hours or more per day, five days a week, and to the difficulty of identifying subtle variations in color, shape, size, density and texture of the stained cells, among other things.
  • the Hematrak 590, which was the most successful of these image-based systems, was last produced in 1987.
  • the failure of all of these image-based systems was related to their inability to meet price and performance expectations for obtaining blood differentials. Accordingly, there was general disillusionment with image-based systems by the mid-1980s.
  • the Hematrak 590 was manufactured by Geometric Data, a division of Smith-Kline Corporation of Wayne, Pennsylvania.
  • the Coulter electronic machine takes whole blood samples, separates the white cells from the red cells and platelets and then processes the white cells using a flow system.
  • the machine passes the white cells in single file through an impedance measurement sensor to produce a three part differential count.
  • the cytochemical systems utilize the same flow mechanism, but instead of measuring impedance, the machines utilize laser light scattering and absorption patterns of the white blood cells.
  • the Technicon H-l and Coulter VCS produce a six part differential count utilizing these types of systems.
  • the cytochemical and impedance flow systems base their results on classifying a large number of cells, typically on the order of thousands, based upon the known distribution of normal white cell populations. This approach, however, does not provide enough information to accurately classify small distributions of abnormal cells.
  • Most of the instruments provide flags that alert the user to the need to proceed with a vision differential count based upon the presence of abnormal cells. None of them attempts to provide a quantitative measurement of these abnormal cells.
  • the system for microscopic analysis and classification utilizes an automated microscope that can be positioned both laterally and longitudinally as well as focused under the control of positioning commands.
  • the color image signals of the microscopic image of stained cells on a slide are fed to one of a plurality of vision systems.
  • Each vision system includes a frame buffer
  • a host computer assigns tasks to each vision system based upon the vision system's activity level.
  • the vision system identifies a single cell among the plurality of stained cells on the slide by locating color image signals falling within a predetermined color band about a predetermined stained cell color.
  • the vision system calculates vectors representing at least one set of color characteristics of the cell from the identified cell signals. For example, these vectors represent hue, saturation, and intensity histograms of a single cell. In a more comprehensive system, these vectors also include hue, saturation, and intensity histograms of the nucleus of the cell. Other cell features such as whole cell area, area of the nucleus and area of the cell cytoplasm are also used as input data.
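To make the feature-vector construction above concrete, here is a minimal sketch in Python. The bin count, the value ranges, and the function name `cell_feature_vector` are illustrative assumptions, not the patent's implementation; per-pixel hue, saturation, and intensity values are assumed to be already computed.

```python
import numpy as np

def cell_feature_vector(cell_hsi, nucleus_hsi, bins=16):
    """Build a classifier input vector: hue, saturation, and intensity
    histograms for the whole cell and for its nucleus, plus the whole-cell
    area, nucleus area, and cytoplasm area features named in the text.

    cell_hsi / nucleus_hsi are (N, 3) arrays of per-pixel hue (0..360),
    saturation (0..1), and intensity (0..1) values.
    """
    ranges = [(0.0, 360.0), (0.0, 1.0), (0.0, 1.0)]
    feats = []
    for pixels in (cell_hsi, nucleus_hsi):
        for channel, (lo, hi) in enumerate(ranges):
            hist, _ = np.histogram(pixels[:, channel], bins=bins, range=(lo, hi))
            feats.append(hist / len(pixels))  # normalized histogram
    cell_area = float(len(cell_hsi))
    nucleus_area = float(len(nucleus_hsi))
    # Cytoplasm area is the whole-cell area minus the nucleus area.
    areas = [cell_area, nucleus_area, cell_area - nucleus_area]
    return np.concatenate(feats + [areas])
```

The resulting vector, one per isolated cell, is the kind of input the classifier would consume.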
  • the neural network has three paths and is a feed forward network of neurons.
  • One path handles vectors representing color characteristics of the cell
  • another path handles vectors representing color characteristics of the nucleus
  • the third path handles the miscellaneous cell features such as area and nucleus color texture information.
  • These three paths converge towards an output layer in the network.
  • the output layer classifies the single cell based upon the vectors and miscellaneous information.
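A minimal numpy sketch of this three-path, converging feed-forward arrangement follows. The single hidden layer per path, the sigmoid activation, the layer sizes, and the name `ThreePathNet` are illustrative assumptions; the actual topology is given in Fig. 14.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ThreePathNet:
    """Feed-forward network with three input paths (cell color vectors,
    nucleus color vectors, miscellaneous cell features) whose hidden
    activations converge into a single output layer of cell classes."""

    def __init__(self, n_cell, n_nucleus, n_misc, n_hidden, n_classes, seed=0):
        rng = np.random.default_rng(seed)
        self.w_cell = rng.normal(0.0, 0.1, (n_cell, n_hidden))
        self.w_nucleus = rng.normal(0.0, 0.1, (n_nucleus, n_hidden))
        self.w_misc = rng.normal(0.0, 0.1, (n_misc, n_hidden))
        # The output layer sees the concatenation of all three paths.
        self.w_out = rng.normal(0.0, 0.1, (3 * n_hidden, n_classes))

    def forward(self, cell_vec, nucleus_vec, misc_vec):
        h_cell = sigmoid(cell_vec @ self.w_cell)
        h_nucleus = sigmoid(nucleus_vec @ self.w_nucleus)
        h_misc = sigmoid(misc_vec @ self.w_misc)
        merged = np.concatenate([h_cell, h_nucleus, h_misc])
        return sigmoid(merged @ self.w_out)  # one score per cell class
```

The classification would then be the class with the highest output score; thirteen output units would match the thirteen part differential mentioned later in the text.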
  • FIG. 1 illustrates, in block diagram form, the general system in accordance with the principles of the present invention
  • Fig. 2 illustrates, in block diagram form, the functional characteristics of the system
  • Fig. 3 is a block diagram of one vision system
  • Fig. 4 is a general flow chart of the operation of the system
  • Fig. 5A diagrammatically illustrates a blood smear on a slide and the search pattern of the microscope
  • Fig. 5B diagrammatically illustrates the positioning of a plurality of slides on a tray
  • Figs. 6A and 6B show exemplary intensity histograms utilized in the positioning routine for the microscope
  • Fig. 7 is a marked photograph showing red blood cells, white cells and platelets at 1000X microscopic power
  • Figs. 8A, 8B and 8C represent a flow chart of the principal program which includes a scope positioning routine as well as referring to the extraction routine, calculate features routine, classification routine, and display output routine;
  • Fig. 9 diagrammatically illustrates a color band about a predetermined stained cell color that is purple in the example discussed in conjunction with a current application of the present invention
  • Fig. 10 is a flow chart illustrating the steps involved in the extraction routine
  • Fig. 11 is a flow chart illustrating the calculate features routine
  • Fig. 12 is a flow chart illustrating the classification routine
  • Fig. 13 is a diagrammatical representation of a neuron in a neural network
  • Fig. 14 is a diagrammatical representation of the neural network used in conjunction with one application of the present invention.
  • Fig. 15 is a flow chart illustrating the red blood cell classification routine.

Detailed Description of the Preferred Embodiments
  • This invention relates to a system for microscopic analysis and classification of stained cells.
  • Fig. 1 is a general system diagram for the present invention.
  • a plurality of slides, one of which is slide 10, are placed in certain positions on slide tray 12.
  • An exemplary, diagrammatic illustration of a blood smear slide is shown in Fig. 5A and the positioning of the slides on a slide tray 12 is shown in Fig. 5B, both of which will be discussed in detail later.
  • a microscope 14 is controllably driven so that the microscope can be moved laterally as shown by double headed arrow 16, as well as longitudinally as shown by the X within a circle 18 in Fig. 1. Additionally, microscope 14 can be moved towards or away from slide tray 12 as well as focused as noted by the curved line 20 having double arrows thereon.
  • scope control 24 is a stepper motor controller and the mechanical drive mechanism, represented by block 26, is a plurality of stepper motors connected to microscope 14.
  • the motors drive the microscope laterally and longitudinally with respect to the slide under study, as well as towards and away from the slide. Also, the motors focus the scope with respect to the stained cells on the slide.
  • Color image signals representative of the microscopic image of the stained cells are obtained by a video camera 30.
  • the output of the video camera is applied to bus line 32 which, in the preferred embodiment, carries video image frame signals, one frame signal representing the red color image, another representing the blue color image and a third representing the green color image.
  • These video frame signals include timing signals such as vertical and horizontal blanking signals.
  • the color image signals are applied to one of the vision systems, VS1, VS2, VS3, ... VSn, all under the control of host computer 40.
  • the vision systems are coupled to host computer 40 via VME Bus 42 that permits extremely fast transfer of data as well as control and command signals between the various devices.
  • Host computer or CPU 40 also utilizes memory 42 and input/output (I/O) device 44, which are similarly coupled to VME Bus 42.
  • a display monitor 46 is connected to the system via I/O 44 as is keyboard 48.
  • Fig. 2 generally illustrates the functional aspects of the system, wherein each of the major components in Fig. 2 operates relatively independently of the others and generally under the control of host computer or CPU 40.
  • one major function of the system is to position and focus microscope 14.
  • a major functional block is the positioner and focus operation 50.
  • a video frame image, or the color image signals for that video frame, is grabbed or stored by one of the frame grabbers in the vision system, for example, VS1.
  • the positioning and focus operations also utilize a vision system. This vision system is then placed in the VS-Used Queue 52.
  • function block 54 is generally identified as comprising a number of extractors, extractor 1 through extractor n. After the extractors have isolated a particular cell on the slide and quantified certain color characteristics, such as hue, saturation, and intensity, along with certain miscellaneous cell information such as cell area, nucleus area, etc., that information, collectively called herein a "cell block," is placed in cell block queue 56.
  • Since the vision system has now completed its task of isolating and extracting certain information from the video frame, which entails isolation and extraction of all the single-type cells on the video frame, that vision system is returned and placed in the VS-Free queue 58. Accordingly, the vision system is then available to the positioner and focus functional block 50 in order to further focus or position the microscope as well as accept another video image.
  • Cell block queue 56 is utilized by a third major functional block 60 that includes a number of classifiers, classifier 1 through classifier n.
  • the classifiers in the present embodiment are configured as software in the host computer and essentially comprise neural networks that will be described in detail later. However, since neural networks can also be configured as very large scale integrated circuits (VLSI), the present invention is not meant to be limited to the software implementation of a neural network but rather encompasses all types of neural networks, whether implemented as software or hardware.
  • the classifier assigned to analyze the cell block, which generally includes vectors representing color characteristics of a single cell as well as quantified cell features, determines what type of cell is represented by the cell block. For example, a working embodiment of the present invention classifies white blood cells represented by the cell block information.
  • the current embodiment of the present invention identifies a thirteen part differential count, including six normal types of white blood cells and seven abnormal types of white blood cells.
  • the perform differential functional block 64 is the overseer and controller of all the other functional blocks.
  • the perform differential function 64 also displays the identified and classified cell.
  • the perform differential function monitors the number of different cells found. For example, in the working embodiment of the present invention, a two hundred cell, thirteen part differential must be obtained in order to stop the positioning, extraction, and classification of various cells on the blood smear slide. Accordingly, the perform differential function 64 monitors the number of cells classified and stops the other functional processes after the system has identified that specified number of cells. Further, perform differential function 64 monitors the overall process, keeps track of the vision systems that are being used as well as the vision systems that are free, and monitors the error ratio and failure flags on all the processes.
  • Fig. 3 is a block diagram of a single vision system as used in a working embodiment of the present invention.
  • vision system 66 is a board placed in a computer frame in order to achieve real time imaging on VME Bus 42.
  • Inputs 68 comprise color image signals for frame A, that is, separate red, green and blue frame A signals (FrA rgb), as well as timing signals.
  • the vision system 66 includes a color frame grabber, thirty-two bit planes, four flexible 512 X 512 X 8 image buffers (with an optional four image buffers available), arithmetic logic units (ALU), statistical processing at up to 12.5 million pixels per second, inter-image arithmetic including subtraction, real time frame averaging, convolutions, morphology, histograms and area profiles, area-of-interest window processing, and a 68000 on-board microprocessor.
  • Inputs 68 are applied to line 70, which feeds input look-up tables (LUTs) 72 and sync stripper/generator 74.
  • ACRTC 76 is a video controller chip or integrated circuit that produces all the timing signals for video acquisition and processing.
  • Arithmetic logic unit (ALU) 78 further conditions and alters the video frame images and places them into frame buffer 80 which is a 512 X 512 X 32 on board video memory.
  • Frame buffer 80 is connected to VME Bus 42 by an internal VRAM Bus 82 to assist in the very fast input and output of data from the frame buffer.
  • Output look-up tables (LUTs) and digital-to-analog converters (DACs) 84 provide various outputs 86 from the vision system.
  • the vision system also includes a statistical processor 88, an event counter 90, and an interface to MVP-NP 92. These items are connected to an internal processing bus 94, as are ALU 78, frame buffer 80 and other components.
  • the MVP-NP is co-processor 96, which increases the processing speed for neighborhood operations such as morphological transforms, binary pattern matching, feature extraction and color classification.
  • Co-processor 96 is coupled to processing bus 94 through digital expansion bus 98.
  • the vision system includes its own independent CPU or processor 110, which in the working embodiment is a Motorola 68000 microprocessor.
  • Memory 112 is available to VS-CPU 110 through CPU bus 114.
  • Control logic 116 assists the VS-CPU 110 in controlling the operations of the other hardware and software functions.
  • the vision system is a MVP-VME video board manufactured by Matrox Electronic Systems Ltd. of Dorval, Quebec, Canada.
  • Co-processor 96 is the machine vision accelerator sold in conjunction with the MVP-VME image board. Further details and functional specifications of the Matrox image processing board are available from the board's specifications.
  • Fig. 4 is a general system flow chart diagram showing the major steps in the particular application of the invention described hereinafter.
  • The system specifications table identifies certain specifications utilized in the present application of the invention. These system specifications are meant to be exemplary only.
  • Camera: type, CCD color camera; output, separate red, green and blue; resolution, 786H X 493V; manufacturer, Sony (RGB camera). Image Acquisition System:
  • MVP-VME with vision accelerator (MVP-NP) by Matrox Electronic Systems Ltd. of Dorval, Quebec, Canada.
  • step 150 includes changing the microscopic power of the microscope from 200X to 1000X and fine focusing the microscope at the first identified cell location on the slide. These steps are described in detail hereinafter.
  • Step 154 isolates a single cell of a predetermined type, e.g. a white cell, in the video frame image, and step 156 calculates the cell features. Steps 154 and 156 are the extraction steps done by one of the extractors in function block 54 of Fig. 2.
  • step 158 cell features are analyzed by one of the classifiers in Fig. 2.
  • Step 160 calculates and analyzes other cell types, e.g., red blood cells, on the slide under study.
  • Decision step 162 determines whether the differential count has been exceeded.
  • the differential count threshold is set at 100 white cells spanning at least thirteen different classes of white cells. These cells must be identified in a particular blood smear before the system will stop. The differential count must also exceed a certain number of red blood cells and platelets in the blood smear. If the differential count threshold is not exceeded, the no branch is taken from decision step 162, the scope is moved in step 164, and the program returns to the isolate single cell step 154.
  • the system displays results in step 166 to the operator.
  • the operator sees every cell.
  • the cell images are compiled in a special display frame and the operator approves or confirms the system's identification of those cells. This confirmation is noted in step 168.
  • the system displays only the non-classifiable white cells, that is, the white cells having a low probability of classification, and the operator classifies those cells as appropriate.
  • the operator sees every cell classified by the system and approves each cell on an individual basis. Since the type of display and operator confirmation is dependent upon certain commercial aspects of the invention, the display and confirmation steps may be selectable by the operator of the system.
  • the present invention is described in detail in conjunction with analyzing a blood smear.
  • the smear on a slide is diagrammatically illustrated in Fig. 5A.
  • Human blood was stained with Wright stain, which is a standard staining technique for human blood cell microscopic analysis and classification.
  • the predetermined stained cell color from the Wright stain results in red blood cells (R) (see Fig. 7, a 1000X magnification) being colored light red and the white blood cells (W) ranging in color from light blue and purple to orange.
  • the nucleus of the white blood cells (W N ) is dark purple and the white cell cytoplasm (W c ) ranges from light blue to purple dots to light orange.
  • Slide 200 is placed on slide tray 12 at certain locations; in the present embodiment, tray 12 holds ten slides at predetermined locations.
  • the blood smear slides are normally oblong or oval in shape and extend longitudinally with respect to the slide.
  • Fig. 5A diagrammatically illustrates intensity contour lines A through E, wherein intensity contour line A is extremely dark or black and contour line E illustrates the outermost feathered edge of the blood smear that is slightly colored.
  • the slides on slide tray 12 (Fig. 5B) are not only positioned at certain spots on the tray but also aligned such that the feathered edge of each blood smear points towards, for example, fore edge 210 of tray 12.
  • the feathered edge E of each blood smear could point towards the rear edge of slide tray 12 and the positioning routine (step 150 of Fig. 4) could be altered to sense the feathered edge in that direction.
  • Figs. 8A through 8C illustrate the principal or primary software routine encompassing steps 150, 152, 162, 164, 166 and 168 in Fig. 4;
  • Fig. 10 is the extraction or isolation routine corresponding to step 154 in Fig. 4;
  • Fig. 11 is the calculate features routine corresponding to step 156 in Fig. 4;
  • Fig. 12 is the classify white cell routine corresponding to step 158 in Fig. 4;
  • Fig. 15 is the red blood cell classification routine (including platelet identification) corresponding to step 160 in Fig. 4.
  • the principal program in Fig. 8A begins with the step of positioning the scope above the center point (CTPT) of the slide in step 240. Since all the slides are placed at certain locations on slide tray 12 (Fig. 5B) the system can generally identify and move the scope relative to the slide to position the scope above the center point of the slide. For example, with respect to slide 200, the center point 242 is identified and the scope is positioned thereat. The microscope is focused at 200X in step 244 in this particular application regarding the analysis and classification of cells in a blood smear. In step 246, the scope is moved longitudinally with respect to slide 200 until a bimodal intensity peak ratio is found. Ideally, the intensity of a complementary color band about a stained cell color is utilized.
  • the stained cell color in the present example is a Wright stain, which essentially colors the red blood cells (cells R in Fig. 7) red.
  • the blood smear is very dark and almost black due to the great amount of red blood cells in the smear.
  • the complementary color band for the Wright stained cell color is green.
  • the video image signals are essentially three video frames, one frame having the red color signals FrA r, another frame having the green color signals FrA g, and a third frame having the blue color signals FrA b. These frames are kept in frame buffers. Accordingly, there is a red frame buffer, a green frame buffer and a blue frame buffer for frame A (FrA rgb).
  • the green buffer is selected because there is little or no green in the stained red blood cells. Accordingly, the stained red blood cells appear as dark or black spots on the green video frame image (FrA g).
  • a histogram of the intensity of the green buffer frame is obtained. Fig. 6A is an example of such a histogram when the scope is positioned at approximately intensity contour line B in Fig. 5A.
  • a histogram is a frequency distribution of the number of pixels having a certain intensity value. In Fig. 6A, the number of pixels in the FrA g that are darker exceeds the number of pixels that are lighter.
  • Peak 211 is the number of pixels having an intensity or brightness level of approximately 80, whereas peak 213 represents a smaller number of pixels that are brighter (approximately 200 intensity level). Accordingly, a peak ratio is obtained by comparing the peak number of background pixels to the peak number of pixels under study.
  • the background in the green frame buffer is represented by pixels having a bright or high intensity, whereas the cells under study are dark because of the absence of any green in the stained red blood cells and stained white blood cells. Accordingly, the bimodal peak ratio in Fig. 6A is approximately 1.8.
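The bimodal peak ratio can be sketched as follows: histogram the green-channel frame, find the tallest peak on the dark side and on the bright side, and compare their heights. Which peak serves as the numerator is an assumption here, chosen so that the ratio falls toward the 0.5 to 0.2 window as the cell-dense smear gives way to background near the feathered edge; the fixed split point at mid-scale is also an assumption.

```python
import numpy as np

def bimodal_peak_ratio(green_frame, split=128):
    """Peak ratio for an 8-bit green-channel frame: stained cells contain
    little green and appear dark, while the background is bright, so the
    histogram is bimodal.  Returns dark-peak height over bright-peak height."""
    hist, _ = np.histogram(green_frame, bins=256, range=(0, 256))
    dark_peak = hist[:split].max()    # cells under study
    bright_peak = hist[split:].max()  # background
    return dark_peak / max(bright_peak, 1)
```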
  • Step 246 moves the microscope longitudinally down the slide 200 as shown by the dash lines 243 until the bimodal peak ratio reaches approximately 0.5.
  • Fig. 6B graphically illustrates this peak ratio.
  • At intensity contour line C in Fig. 5A there are relatively few cells within FrA g as compared with the background.
  • the microscope is moved towards the feathered edge until the program detects the proper bimodal peak ratio in the intensity of a complementary color band about a stained cell color.
  • the stained cell color in the present invention is the red Wright stain color for the red cells and the white cells.
  • the complementary color band is found in the green buffer.
  • When the system reaches the appropriate bimodal peak ratio, the system causes the microscope to begin a scan pattern as noted in step 468 in Fig. 8A.
  • the scan pattern in the present application is shown as dash lines in Fig. 5A and includes essentially a first lateral movement across the slide, slight longitudinal movement, further lateral movement (opposite the first lateral direction) across the entire feathered edge of the blood smear, additional longitudinal movement, and a repetition of these movements until the bimodal peak ratio passes beyond the pre-established bimodal peak ratio range.
  • the particular scan pattern for the scope can be changed.
  • the scan pattern diagrammatically illustrated in Fig. 5A is greatly exaggerated, because the microscope actually moves very small distances in relation to the slide, both laterally and longitudinally, throughout the scan pattern.
  • the scan pattern is simply tracking the feathered edge of the blood smear and various scanning patterns could be utilized to track this feathered edge.
  • the photograph depicted in Fig. 7 is a 1000X photograph but is generally illustrative of the cell distribution at 200X, that is, the cells are substantially in a single layer along the feathered edge generally between intensity contour line C and contour line E in Fig. 5A.
  • Step 248 obtains the video frame or color image signals FrA rgb. Specifically, there are three video frame images, one for the red buffer, one for the blue buffer and one for the green buffer. In step 250, the program calculates the hue frame FrA H from all three color images.
  • the hue frame FrA H is passed through a threshold filter to identify, in the present application, white cell pixel groups.
  • the thresholding is simply screening or filtering the entire hue frame to identify pixels in a predetermined color band about a predetermined stained cell color.
  • the stained cell color in the present application is the bluish-purple Wright stain color.
  • the predetermined color band is the purple band (shown in Fig. 9).
  • Fig. 9 illustrates that if a zero hue value is the midpoint between the pure blue hue and the pure green hue, the purple band lies approximately at values 30 to 70 (counterclockwise from 0 to 255).
  • Hue value 255 is immediately to the right of the zero value. Accordingly, the hue frame is sent through a color signal filter, thereby passing color image signals falling within the purple band which extends about the predetermined purple stained cell color.
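The band-pass screening described above might be sketched as below; the 30 to 70 purple band on the 0 to 255 hue wheel is taken from the text, while the wraparound handling and the name `hue_band_mask` are assumptions.

```python
import numpy as np

def hue_band_mask(hue_frame, low=30, high=70):
    """Binary mask of pixels whose 0..255 hue value falls within the
    predetermined color band about the stained cell color (purple here)."""
    hue = np.asarray(hue_frame)
    if low <= high:
        return (hue >= low) & (hue <= high)
    # A band straddling the 255/0 boundary wraps around the hue wheel.
    return (hue >= low) | (hue <= high)
```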
  • The purpose of steps 250 and 252 is to identify the gross position of cells within the entire frame, which in this application comprises the red buffer, green buffer and blue buffer.
  • the calculation and creation of the hue frame is mathematical in nature and simply combines the three colors into a single frame in order to reduce processing time. By thresholding that hue frame and mapping pixels in the frame, that is, pixels falling within a predetermined color band (purple), a gross cell location is obtained.
  • mathematical morphology or shape filters are utilized to obtain the center points of the purple pixel groups. The center points are the gross locations of the white blood cells in this particular application.
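One way to sketch the center-point step is a connected-component pass over the binary purple mask, taking each component's centroid as the gross white-cell location. This stands in for the patent's morphological shape filters; the 4-connectivity and the name `cell_center_points` are assumptions.

```python
import numpy as np
from collections import deque

def cell_center_points(mask):
    """Gross cell locations: the centroid of each 4-connected group of
    'on' pixels in a binary hue mask."""
    mask = np.asarray(mask, dtype=bool)
    seen = np.zeros_like(mask)
    centers = []
    for y, x in zip(*np.nonzero(mask)):
        if seen[y, x]:
            continue
        queue, pixels = deque([(y, x)]), []
        seen[y, x] = True
        while queue:  # breadth-first flood fill over one pixel group
            cy, cx = queue.popleft()
            pixels.append((cy, cx))
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if 0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1] \
                        and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        ys, xs = zip(*pixels)
        centers.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centers
```

Each centroid would then be stored, as in step 256, relative to a slide reference point.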
  • In step 256, the center points are stored as ctpt n with respect to a slide reference point, and all the center point positions of the grossly identified white blood cells are stored in a list in memory.
  • In step 258, the red, green and blue frame buffers are cleared or discarded, as is the hue frame created in step 250.
  • In step 260, the microscope is moved a very small amount such that a new video frame is obtained showing different cells.
  • In step 262, the program checks the bimodal peak ratio for the new frame FrB g. Particularly, a bimodal peak ratio of between 0.5 and 0.2 is acceptable.
  • Step 264 determines whether the bimodal peak ratio is acceptable, and if it is, step 266 uses frame B, the new frame, as frame A and returns to step 250.
  • step 268 determines whether the scan pattern has been completed based principally on the failure to obtain a bimodal peak ratio within the prescribed range for the green frame buffer for the next frame. If the scan is not finished, the program jumps from jump point A-2 in Fig. 8B to the same jump point in Fig. 8A immediately preceding the move scope step 260.
  • The microscope is then moved, in step 272, to the location of the first center point ctpt 1 on the list generated by step 256. This scanning and gross identification of the white cell positions decreases processing time in the overall system.
  • Fig. 7 is a picture illustrating white blood cells W, red cells R and platelets P. The picture is taken at 1000X. At 1000X, the white blood cells occupy a pixel block of approximately 70 X 70 pixels, assuming a video frame of 512 X 512 pixels. At 200X, the same white blood cells occupy a pixel block of approximately 13 X 13. If high definition television video signals were used, at 200X the white blood cells would occupy more pixels due to the larger number of pixels in the video frame.
  • the Sobel edge detection filter generates a focus factor, or a detection count or value, based upon the clarity and sharpness of the edges in the video frame. Accordingly, in step 278, the program compares the previous focus factor for frame FrA g to a new focus factor for the frame when the scope's focus has changed. The routine maximizes the focus factor, or the Sobel edge detection count or value, by changing the focus of the scope.
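A focus factor in this spirit can be sketched with Sobel gradients; summing squared gradient magnitudes (a Tenengrad-style measure) is an assumption, since the text only says the filter yields a count or value that grows with edge sharpness.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def focus_factor(frame):
    """Sum of squared Sobel gradient magnitudes over the frame interior:
    sharper focus produces stronger edges and a larger value."""
    f = np.asarray(frame, dtype=float)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            shifted = np.roll(np.roll(f, -dy, axis=0), -dx, axis=1)
            gx += SOBEL_X[dy + 1, dx + 1] * shifted
            gy += SOBEL_Y[dy + 1, dx + 1] * shifted
    # Drop the one-pixel border, where np.roll wraps around the frame.
    return float((gx ** 2 + gy ** 2)[1:-1, 1:-1].sum())
```

The autofocus routine would adjust the scope, recompute this value, and keep the focus position that maximizes it.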
  • Steps 280, 282 and 284 are described in detail in conjunction with Figs. 10, 11 and 12.
  • step 280 extracts a white cell and the white cell nucleus from the video frame images FrA rgb and obtains a cell block consisting of a red, green and blue pixel matrix about the white blood cell as well as the nucleus.
  • step 282 calculates the features of the white blood cell from the cell block.
  • Step 284 classifies the white blood cell from the features obtained in step 282.
  • After the white blood cell has been classified, in step 286, both the white blood cell video image block or image matrix (the red, green and blue buffer portions for that particular white cell) and the classification of that white cell are stored.
  • In step 288, the program repeats the extraction step 280, the calculate features step 282, the classify step 284 and the store step 286 for all white cell center points ctpt n in the high-power window of frame FrA.
  • the scope is moved to the center point of the grossly identified location of the first white blood cell found during the pattern scan.
  • the system operates on that video frame clarifying the focus.
  • the program operates on a single cell located within a high-power window of frame A.
  • the high powered window may be smaller than the actual video frame in order to eliminate any white blood cells that are split by the frame border.
  • Each white blood cell is isolated, and its location and color image signals are extracted for the entire cell as well as the nucleus.
  • Features of that extracted white blood cell are obtained and then the white blood cell itself is classified.
  • the cell video information or data and the classification data are stored, and then the program repeats the extraction, feature calculation and classification, and stores data for each white blood cell within the high-power window of frame A.
  • Step 300 calculates and obtains a hue frame for all of high-power window frame A (FrA H).
  • the hue frame is equalized and expanded to full digital scale.
  • the hue frame is a mathematical composite of the red, green and blue buffers for the video frame. Since the hue may not extend the full digital scale of the histogram for the hue values, step 302 expands the histogram wave form to the full dynamic range of the values. This is a normalizing technique.
  • Step 304 passes the hue frame, as modified by step 302, through a threshold filter and creates a binary image. Essentially, every pixel in the frame having a value less than a pre-set value, e.g., 20, is set to "1" and all pixels in the frame greater than or equal to 20 are set to "0". The threshold level for this hue frame could be changed as necessary depending upon the particular cell under study.
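The normalize-and-threshold sequence of steps 302 and 304 can be sketched as follows, assuming an 8-bit hue frame held in a NumPy array; the function names are illustrative, not from the patent.

```python
import numpy as np

def expand_to_full_scale(hue_frame: np.ndarray) -> np.ndarray:
    """Step 302: stretch the hue values to the full 0-255 dynamic range."""
    lo, hi = int(hue_frame.min()), int(hue_frame.max())
    if hi == lo:                      # flat frame: nothing to stretch
        return np.zeros_like(hue_frame)
    stretched = (hue_frame.astype(np.float64) - lo) * 255.0 / (hi - lo)
    return stretched.round().astype(np.uint8)

def threshold_to_binary(hue_frame: np.ndarray, level: int = 20) -> np.ndarray:
    """Step 304: pixels below `level` become 1, all others 0 (binary image)."""
    return (hue_frame < level).astype(np.uint8)

hue = np.array([[10, 40], [200, 15]], dtype=np.uint8)   # toy 2x2 hue frame
binary = threshold_to_binary(expand_to_full_scale(hue))
```

As in the text, the threshold level (20 here) would be tuned for the particular cell under study.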
  • a morphological open operation is conducted on the binary hue frame.
  • the morphological open operation is a mathematical operation that first erodes the binary image and then dilates the binary image. This morphological open operation removes noise in the hue frame, trims edges and smooths contours.
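A minimal sketch of the morphological open of step 306 (erosion, then dilation) on a binary image with a 3x3 structuring element. Pure NumPy is used here for illustration; a production system would use an image-processing library. The morphological close used later in step 320 is the same pair of primitives in the opposite order (dilate, then erode).

```python
import numpy as np

def erode(img: np.ndarray) -> np.ndarray:
    """A pixel survives only if its full 3x3 neighborhood is 1."""
    padded = np.pad(img, 1, constant_values=0)
    out = np.ones_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out

def dilate(img: np.ndarray) -> np.ndarray:
    """A pixel becomes 1 if any pixel in its 3x3 neighborhood is 1."""
    padded = np.pad(img, 1, constant_values=0)
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out

def morphological_open(img: np.ndarray) -> np.ndarray:
    """Open = erosion then dilation: removes specks, trims edges, smooths contours."""
    return dilate(erode(img))
```

An isolated noise pixel has no full 3x3 neighborhood, so erosion removes it and dilation cannot bring it back, which is exactly the noise-removal behavior described in the text.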
  • the program identifies a white cell block about all of the white cells in the hue frame image as modified and creates a white cell mask for the hue frame.
  • the white cell mask is identified for cell1.
  • the cell block corresponding to the red, green and blue buffer portions representing each white cell and the mask defining the outer boundaries of each cell are stored.
  • the location of each identified cell is stored along with the RGB color image signals.
  • the color image signals fall within a predetermined color band about a predetermined stain cell color.
  • Hue is simply a mathematical representation of color space from the RGB color image signals. Since the hue frame is screened by a color signal filter, all color image signals passing through that hue filter represent the color images of individual white cells in that video frame.
  • the cell block discussed herein is essentially the RGB portion of the frame buffer and the mask layer showing the location of the white cell in the video frame, as well as an identification of the video frame this particular cell was extracted from.
  • the program identifies the nucleus of the white blood cell.
  • the first white cell block color image is transformed into a saturation frame.
  • Saturation is a measure of whiteness of an image and is also mathematically related to the RGB buffer portions for the white blood cell.
  • the saturation frame portion is identified as cell1.
  • the saturation frame portion cell1 is averaged to smooth contour edges.
  • the saturation frame portion is passed through a threshold filter such that everything above, for example, saturation value 75 is set to "1" and everything below that saturation threshold value is set to "0". Accordingly, a binary image of the cell block is created based upon a predetermined chromaticness threshold. Chromaticness is the hue and saturation of a color.
  • In step 320, a morphological close operation is conducted on the binary version of cell1. This morphological close operation first dilates the image and then erodes it in order to fill holes within closed shape figures. Since a nucleus is generally solid (Fig. 7, nucleus W_N), the morphological close operation closes the solid spaces defined by the nucleus.
  • the cell nucleus mask NUC1 is defined for cell1.
  • In step 324, the cytoplasm of the cell, that is, W_C in Fig. 7, is set to "1" and the nucleus of the cell, NUC1, is set to "2".
  • the cell block for cell1 is now defined by the cytoplasm identified by a "1" and the nucleus identified by a "2", as well as the red, green and blue frame portions for the cell.
  • the cell block therefore defines the outer contour of the white cell as well as the contour of the nucleus within the cell and color data for the cell and nucleus.
  • the cytoplasm is simply the difference between the whole cell image mask obtained in step 310 and the nucleus image mask obtained in step 322.
  • the cell block is then the RGB color image signals as well as the mask defining the cell cytoplasm and the cell nucleus.
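The mask arithmetic of steps 322-324 can be sketched directly: the cytoplasm mask is the whole-cell mask minus the nucleus mask, and the combined mask labels cytoplasm pixels "1" and nucleus pixels "2". The array contents below are made-up toy data for illustration.

```python
import numpy as np

# Toy binary masks standing in for the step 310 (whole cell) and
# step 322 (nucleus) results.
cell_mask = np.array([[0, 1, 1, 1, 0],
                      [1, 1, 1, 1, 1],
                      [0, 1, 1, 1, 0]], dtype=np.uint8)

nucleus_mask = np.array([[0, 0, 1, 0, 0],
                         [0, 1, 1, 1, 0],
                         [0, 0, 1, 0, 0]], dtype=np.uint8)

# Cytoplasm = inside the cell but not inside the nucleus.
cytoplasm_mask = cell_mask & ~nucleus_mask

# Combined label mask: 1 = cytoplasm, 2 = nucleus, 0 = background.
combined = cytoplasm_mask * 1 + nucleus_mask * 2
```

This combined mask, together with the RGB buffer portions, is what the text calls the cell block.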
  • the extract white cell program then ends and the program returns to the next step shown in Fig. 8B, the calculate features of white cell from the cell block in step 282.
  • Fig. 11 is the calculate features routine and begins with step 340, which calculates the hue, saturation and intensity histograms of cell1, that is, C_H, C_S and C_I, for the particular cell block.
  • Step 342 calculates the hue, saturation and intensity histograms of the nucleus NUC1 (N_H, N_S, N_I).
  • These steps constitute a means for obtaining vectors representing at least one set of color characteristics of the single cell from the identified cell signals.
  • the RGB frame portions could be part of the color characteristics.
  • these vectors are a series of numbers representing histogram values.
  • the histograms may be represented by a series of numbers ranging from 0 to 255 in value. However, as noted later in conjunction with Fig. 12, the histogram vector is reduced by taking every fourth value, reducing it to approximately a 64-value vector.
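The vector reduction described above is a simple stride operation, sketched here with NumPy:

```python
import numpy as np

# Stand-in for a 256-bin histogram vector; real data would come from
# the hue/saturation/intensity histograms of the cell block.
histogram = np.arange(256)

# Take every fourth value: 256 bins -> 64-value input vector.
reduced = histogram[::4]
assert reduced.shape == (64,)
```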
  • In step 344, miscellaneous cell features and cell nucleus features are identified. Also, the area of the cell C_A, the area of the nucleus N_A and the area of the cell cytoplasm C_C are calculated.
  • In step 346, a texture matrix is made from the color signals associated with the cell nucleus. Texture matrices for visual images are discussed in detail in a book entitled "Digital Image Processing" by R. Gonzalez and P. Wintz, copyrighted in 1987. Essentially, the texture matrix is utilized to determine whether the image is smooth or rough. A smooth image will generate a texture matrix having values falling within a relatively narrow range in a diagonal band from the upper left to the lower right of the texture matrix.
  • a rough look is defined by values of a similar range in a diagonal extending from the lower left to the upper right of the texture matrix.
  • The maximum probability N_P, element difference moment N_M and uniformity factor N_U for the nucleus are also obtained from the texture matrix. If the maximum probability is high, the image is smooth. If the maximum probability is low, then the image is coarse. If the image is coarse, it has a random look.
  • the element difference moment determines whether the texture has some order. If the texture does have order, then the moment value is relatively low. If the texture is random, then the moment is a high value. Uniformity is the opposite of entropy and is discussed in detail in "Digital Image Processing" by Gonzalez and Wintz.
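The three texture features above can be sketched from a gray-level co-occurrence ("texture") matrix. The patent does not give exact formulas, so these follow the standard formulations in Gonzalez and Wintz; the horizontal-neighbor pairing is an assumption.

```python
import numpy as np

def cooccurrence(img: np.ndarray, levels: int) -> np.ndarray:
    """Normalized co-occurrence matrix for horizontally adjacent pixels."""
    glcm = np.zeros((levels, levels), dtype=np.float64)
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[a, b] += 1
    return glcm / glcm.sum()

def texture_features(glcm: np.ndarray):
    i, j = np.indices(glcm.shape)
    max_probability = glcm.max()                 # high -> smooth image
    diff_moment = ((i - j) ** 2 * glcm).sum()    # low -> ordered texture
    uniformity = (glcm ** 2).sum()               # opposite of entropy
    return max_probability, diff_moment, uniformity
```

A smooth region concentrates mass near the matrix diagonal (high maximum probability, low difference moment), while a random region spreads it out, matching the behavior described in the text.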
  • the cell nucleus features give an indication of the degree of randomness, smoothness, and order within the white cell.
  • Fig. 7 illustrates that various white cell nuclei have different shapes, sizes, positions and textures. These miscellaneous nucleus cell features provide factors which are helpful in classifying the cell.
  • the cell features are stored for white cell1.
  • the program then enters the classify white cell program step 284 in Fig. 8B.
  • the classify white cell subroutine is shown in flow chart form in Fig. 12.
  • an input vector is created from the features of cell1. These input vectors are the hue, saturation and intensity histogram values from cell1, the hue and saturation histogram values from NUC1, the area of the cell, the area of the nucleus and the area of the cell cytoplasm, and the miscellaneous factors such as the maximum probability, element difference moment and uniformity factor from the nucleus.
  • these vectors are propagated through a feed forward neural network. The neural network will be described in detail hereinafter.
  • step 354 the program collects the results of the neural network output, and in step 356, the results are analyzed and the white blood cells are classified by type or category or identified as being as non- classifiable.
  • the white cell class table that follows is exemplary of the classifications that may be assigned to a particular white blood cell.
  • Fig. 14 diagrammatically illustrates the neural network used in this present application of the invention.
  • Fig. 13 diagrammatically illustrates a single neuron in a neural network for explanatory purposes. Neural networks are sometimes called parallel distributed processors and such devices are meant to be encompassed within the scope of the present invention.
  • a neural network is a series of neurons associated in layers, an input layer, one or more hidden layers, and an output layer. Each layer has a plurality of neurons that are connected in a certain fashion to the following layer in a feed forward neural network.
  • Fig. 13 illustrates neuron 410 that has inputs I1, I2, I3 ... In. Each input has assigned thereto a weight that changes the value of the input.
  • a weight function 412 may double the value of the input. So upon the appearance of a 1 at I1, a value of 2 would be assigned thereto by weight function 412.
  • function 414 is a sigmoidal threshold function which essentially averages the output to a smooth curve from a straight edge threshold function rising as a step from 0 up to 1.
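The single neuron of Fig. 13 can be sketched as a weighted sum passed through the sigmoidal threshold function, which smooths the hard 0-to-1 step. The weight values below are illustrative only.

```python
import math

def sigmoid(x: float) -> float:
    """Sigmoidal threshold function: a smooth curve from 0 up to 1."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights) -> float:
    """Scale each input by its weight, sum, and apply the threshold function."""
    weighted_sum = sum(i * w for i, w in zip(inputs, weights))
    return sigmoid(weighted_sum)

# A weight of 2.0 doubles its input, as in the weight-function example
# in the text; the other weights are made up.
out = neuron([1.0, 0.0, 1.0], [2.0, 0.5, -1.0])   # sigmoid(2 - 1)
```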
  • the weights for the inputs to the neural network are defined when the neural network is trained to recognize and identify certain known cells. The knowledge of the neural network is stored in the weights. The teaching process or operation sets the weights for each neuron in the network.
  • Fig. 14 diagrammatically illustrates the architecture of the neural network in the present application.
  • This neural network has three paths: path 510, for the input vectors related to the entire cell color characteristics; path 512, handling the miscellaneous parameters such as the cell area, nucleus area, cytoplasm area, and nucleus features such as texture values; and a third path 514, handling color characteristics of the cell nucleus.
  • Each of these paths is relatively independent, and they converge towards an output layer.
  • The neural network also includes an input layer 516, a first hidden layer, a second hidden layer, a third hidden layer and an output layer 518.
  • the hue vector for the entire cell is represented by 64 values
  • the saturation vector for the entire cell is also represented by 64 values
  • Neuron group 520 is an input layer composed of 192 inputs, 64 inputs each for the hue, saturation, and intensity histograms of the cell.
  • Neuron group or cluster group 522 is fully connected (FC) as shown by the three dots, to neuron group 520.
  • Group 522 of hidden layer one consists of 30 neurons.
  • Hidden layer 2 for path 510 consists of ten neurons in group or cluster 524.
  • neuron cluster 522 is fully connected to neuron group 524.
  • Fully connected means that each input is connected to each neuron in the layer immediately below.
  • Neuron cluster 524 is fully connected to hidden layer 3 consisting of neuron cluster 526 comprising 20 neurons.
  • the second path 512 for the miscellaneous parameters of cell area, nucleus area, cytoplasm area, maximum probability from the texture matrix, element difference moment and uniformity factor (7 inputs) is fed to input neuron cluster 528.
  • Neuron cluster 528 consists of 7 neurons which hold the inputs for that path.
  • Neuron cluster 528 is fully connected to neuron cluster 530 of the first hidden layer, which consists of 10 neurons.
  • Neuron cluster 530 is fully connected to hidden layer 3, neuron cluster 526.
  • For the third path 514, input layer 516 includes 128 neurons as neuron cluster 532.
  • Cluster 532 is fully connected to the first hidden layer consisting of 20 neurons in cluster 534.
  • Cluster 534 is fully connected to neuron cluster 536 of the second hidden layer for path 514.
  • Neuron cluster 536 is fully connected to hidden layer three, neuron cluster 526.
  • Neuron cluster 526 of hidden layer three is fully connected to output layer 518 which consists, in this application, of 13 neurons in cluster 538.
  • Each neuron in cluster 538 holds an output from the neural network.
  • Each output neuron corresponds to a different type of white cell class and these classes are shown in the white cell class table.
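The layer dimensions described above can be sketched as a forward pass through the three paths, assuming fully connected layers with sigmoid activations. The weights are random placeholders (a trained network would load the weights set during teaching), and the size of neuron cluster 536 is assumed to be 10, since the text does not state it.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(n_in, n_out):
    """Random placeholder weight matrix for a fully connected layer."""
    return rng.standard_normal((n_in, n_out)) * 0.1

def forward(x, weights):
    """Propagate x through a chain of fully connected sigmoid layers."""
    for w in weights:
        x = 1.0 / (1.0 + np.exp(-(x @ w)))
    return x

# Path 510: whole-cell hue/saturation/intensity histograms (192 -> 30 -> 10).
p1 = forward(rng.random(192), [layer(192, 30), layer(30, 10)])
# Path 512: 7 miscellaneous features (7 -> 10).
p2 = forward(rng.random(7), [layer(7, 10)])
# Path 514: nucleus hue/saturation histograms (128 -> 20 -> 10 assumed).
p3 = forward(rng.random(128), [layer(128, 20), layer(20, 10)])

# Hidden layer 3 (cluster 526, 20 neurons) merges the three paths; the
# output layer (cluster 538) holds 13 white cell class values.
out = forward(np.concatenate([p1, p2, p3]), [layer(30, 20), layer(20, 13)])
```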
  • the output layer may hold: SEG 0.0; LYMPH 0.61; MONO 0.30; EOSIN 0.1, etc. Step 356 in the classify white cell routine (Fig. 12) analyzes these results and either picks the maximum, which in the exemplary case is LYMPH, or may determine whether one or more of the output neuron values exceeds an output threshold value.
  • the output threshold value may be 0.60.
  • the LYMPH white cell classification would be selected as the classification for that particular white cell. However, if two output neurons have a value of more than 0.60, the program would classify this cell as "non-classifiable". Similarly, if all the output neurons had a value less than 0.60, the white cell would be declared non-classifiable.
  • a three path neural network was chosen because of the complexity and the number of different vectors involved in the cellular analysis. It is possible that a neural network having a single path would be appropriate depending upon the cell under study. Whole cell color vectors are preferably input into a single path network. Also, the number of hidden layers was expanded from 1 hidden layer to 3 hidden layers in order to increase the speed of processing as well as to increase the speed of teaching the neural network to recognize traits of white blood cells.
  • the inputs to the neural network are rotationally and positionally invariant because the color characteristics of the entire cell, the color characteristics of the cell nucleus and the miscellaneous cell features are extracted by the vision system, and the vectors or values representing these characteristics are fed to the neural network.
  • were the inputs not invariant, any change in the rotation or the position of the cell could affect the teaching speed of the network and the speed of processing once the network has been properly taught. Accordingly, the system is insensitive to rotational and positional aspects of the white cells.
  • the backward error propagation formula is as follows: the change in weight is equal to the learning rate times the error times the output, plus the momentum times the previous change in weight. This is found in a book entitled "Parallel Distributed Processing" by D.E. Rumelhart, et al., 1986.
  • the present network was taught using back propagation with the above-listed training set. The learning rate was 0.2.
  • the momentum was 0.6, the pattern error was 0.01 and the convergence time was 10.
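The weight-update rule quoted above can be written directly, using the listed constants (learning rate 0.2, momentum 0.6); the numeric inputs below are illustrative.

```python
# delta_w = learning_rate * error * output + momentum * previous_delta_w
LEARNING_RATE = 0.2
MOMENTUM = 0.6

def weight_update(error, output, prev_delta):
    """Back-propagation weight change with a momentum term."""
    return LEARNING_RATE * error * output + MOMENTUM * prev_delta

delta = weight_update(error=0.5, output=0.8, prev_delta=0.1)  # 0.08 + 0.06 = 0.14
```

The momentum term carries a fraction of the previous change forward, which speeds convergence during teaching.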
  • a 100 white cell differential can be found in about ten minutes.
  • the neural network is taught by feeding the network the teaching set and checking the output neurons. The programmer then reviews the cells classified and determines whether the output obtained is correct for that cell. If the output is erroneous, then the weights in the interneural connections are changed based upon the formula.
  • Step 288 repeats the extract cell routine, calculate features routine, classify white cells and store routines for all the cells within the frame window.
  • step 610 masks out the white cells from the red, green and blue frame buffers for frame FrA. It should be noted that the next few steps could be conducted in parallel after the extract white cell and nucleus step 280 in Fig. 8B. Accordingly, it is not necessary that all the steps shown herein be done sequentially, since there are opportunities for parallel processing and multitasking.
  • the hue is calculated in step 612 for the modified frame A. In step 614, the purple pixel blocks less than pixel block size X are counted.
  • In step 616, the small-size purple pixel count is discounted or reduced for error, and the new value is assigned as the platelet count. According to standard blood smear analysis, a rough count of platelets is made to determine whether sufficient platelets are present in the blood. Accordingly, the platelet count is only an estimate. The discount for error in step 616 is necessary to eliminate noise in the video frame.
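Steps 614-616 can be sketched as a connected-component count over a binary "purple" mask: components smaller than the size limit X are platelet candidates, and the count is then discounted for noise. The mask contents, 4-connectivity, size limit, and discount factor are illustrative assumptions.

```python
import numpy as np
from collections import deque

def count_small_blocks(mask: np.ndarray, max_size: int) -> int:
    """Count 4-connected blocks of 1-pixels smaller than max_size."""
    seen = np.zeros_like(mask, dtype=bool)
    count = 0
    for y, x in zip(*np.nonzero(mask)):
        if seen[y, x]:
            continue
        size, queue = 0, deque([(y, x)])   # breadth-first flood fill
        seen[y, x] = True
        while queue:
            cy, cx = queue.popleft()
            size += 1
            for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        if size < max_size:
            count += 1
    return count

purple = np.array([[1, 0, 0, 1],
                   [0, 0, 0, 1],
                   [1, 1, 1, 1]], dtype=np.uint8)   # toy purple mask
platelets = round(count_small_blocks(purple, max_size=4) * 0.9)  # discount for noise
```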
  • In step 618, the classification and count of the red blood cells is made. Details of step 618 are found in Fig. 15, which is a flow chart of the red blood cell classification routine.
  • the red blood cell classification in Fig. 15 begins with step 710 which utilizes the frame A red, green and blue buffers, modified by masking out the white blood cells.
  • the following red cell class table identifies and describes the 7 different types of red cells classified by the program:
  • a. crenated erythrocytes: star or spiked circles.
  • b. leptocytes: target cells (look like a bullseye).
  • c. drepanocytes: sickle cells (look like string beans).
  • d. poikilocytes with pointed and blunt features: look pear shaped, sometimes with double bulbs protruding from an irregular circle, or blob-like with appendages.
  • e. hypochromic erythrocytes: circular with red borders and white centers.
  • f. diffusely basophilic or polychromatophilic: look bluish compared to other red cells.
  • g. normal red cells: round and all red.
  • the red cell class table includes the medical name of the cell, the informal name of the cell and a brief description of what the cell looks like. All the cells except for diffusely basophilic (class f) are red due to the Wright stain in the blood smear.
  • Step 712 calculates a hue frame based upon the modified frame A, that is, the frame wherein the white cells are masked out.
  • Step 714 counts the purple pixel blocks greater than Y size. A narrow color band threshold or filter would be utilized in order to identify the diffusely basophilic or class f cells since those cells are blue rather than red.
  • the class f cells are counted.
  • pixel blocks larger than Y size are chosen to eliminate any recognition of the smaller purple platelets in the video frame.
  • the green buffer is utilized and a threshold is taken to create a binary image of the cells.
  • the green buffer is used because red cells have the smallest amount of green values. Accordingly, the red cells in the green buffer appear dark.
  • the thresholding operates on the intensity histogram of the green buffer, since the morphology or shape of the remaining red blood cells is determined by the balance of the red blood cell classification sub-routine, which is not sensitive to color differences.
  • the program identifies the red cell blocks by identifying pixel blocks larger than Y size and stores the binary image of the cell blocks in a compact red cell shape (RC-shape) frame.
  • a special RC-shape frame is utilized in order to conduct morphological open and morphological close operations on the entire frame. These operations use a relatively large amount of processing time. Since the red blood cell shapes do not occupy the entire video frame under study, these shapes can be placed in the special RC-shape frame and the entire shape frame can be processed, thereby reducing intermediate or background noise and shortening the processing time.
  • Decision step 724 determines whether the RC-shape frame is full. If it is not full, the no branch is taken and the program jumps from jump point A-5 to jump point A-5 in Fig. 8C, immediately after the classify and count red blood cells step 618. As is described in detail later, the general program then cycles through its processes by obtaining a new video frame. Eventually the RC-shape frame becomes full and the yes branch is taken from decision step 724 (Fig. 15).
  • the morphological open operation is configured as a shape filter that filters out or separates the round shaped cells from those cells that are not round.
  • the cells that are not round in the red cell class table are class a, star or spiked circles; class c, sickle cells (which look like string beans); and class d, which are pear shaped, have double bulbs protruding from an irregular circle, or are blob-like with appendages.
  • the other red blood cells are round.
  • the non-round cells or the cells that drop out after the morphological open operation are identified on the yes branch and a further decision step 730 is made as to whether the cells are shaped as linear strings.
  • step 732 determines whether the cells have spiked edges by a spike edge detection routine.
  • the spike edge detection routine in decision step 736 determines whether the cells are class a, that is, shaped as star or spiked circles.
  • the program counts the spiked circles in step 738. If the cells are class d, pointed with blunt features, these cells are counted in step 740.
  • step 760 conducts a morphological close operation on the remaining red cell shapes.
  • a morphological close operation determines whether the red cell shapes are solid in color, indicating a normal red cell, or whether the cells have a white center or look like a bullseye, that is, with a plurality of inner rings.
  • Decision step 762 separates the open red cell shapes from the closed red cell shapes. The no branch from decision step 762 identifies the number of normal red blood cells, class g, in step 764.
  • step 766 determines whether the centers of the cells under study are white. If the center of a cell is white, as determined by an appropriate shape identification routine, the yes branch is taken and in step 768 a count is taken of class e cells. If the no branch is taken, target cells are identified as class b cells in step 770. All of these counting steps lead to step 772, which stores the non-classified cell count by comparing any remaining pixel blocks in the RC-shape frame that have not dropped out through the decision chain. After the red blood cell classification routine ends, the main program picks up in Fig. 8C.
  • step 620 discards the center points identified in the gross location identification for the white blood cells within the high power target window of frame A.
  • step 622 the microscope is moved to the next center point on the list beyond the previously identified center points. This location identification was made in step 256 in Fig. 8A.
  • decision step 624 determines whether the list is done. If the list ctptn is not done, the program jumps to jump point A-4 in Fig. 8B, immediately preceding the extract white cell and nucleus step 280. Alternatively, jump point A-4 may be advanced to a point immediately preceding the conduct focus routine 276 if it is found that, due to the movement of the microscope, a new focus must be obtained after the scope moves.
  • if step 624 determines that the list is done, the program takes the yes branch and step 626 displays all non-classifiable white cell blocks to the operator for assistance in classifying those cells.
  • step 628 the operator inputs data regarding the non-classified white blood cells.
  • step 630 the system displays data regarding all the white cells classified, the red blood cell count, the abnormal red blood cell count and the platelet count.
  • in step 632, the data is stored for the slide, including the white cell block color data, classification data, red cell count data, both normal and abnormal, the name of the operator and the results of the entire test.

Abstract

A system for microscopic analysis and classification includes a plurality of vision systems (VS1 - VSn) and a host computer (40). The vision system assigns vectors representing a color characteristic of a stained cell on a slide (10). These vectors represent hue, saturation and intensity histograms of a cell and/or the nucleus of the cell. Other cell features such as area, area of the nucleus and area of the cell cytoplasm can also be used as vectors. The vectors are fed into a neural network (fig. 14) which performs the classification.

Description

CELLULAR ANALYSIS UTILIZING VIDEO PROCESSING AND NEURAL NETWORK
Field of Invention
The present invention relates to a system for analyzing and classifying stained cells, viewed under a microscope, that incorporates video processing techniques and a neural network.
Microscopic cellular analysis is customarily done by medical technicians. These technicians study the various types of cell structures and analyze the stained cells under a microscope. In blood analysis, error rates sometimes run as high as twenty-five percent (25%). These error rates are due to the technician viewing the cells eight hours or more per day, five days a week, and the difficulty of identifying subtle variations in color, shape, size, density and texture of the stained cells, among other things.
In the late 1970's and early 1980's, some companies attempted to automate blood smear analysis. However, the automation did not meet with commercial success for the following reasons: computer speed was too slow for the large volume of data involved in image acquisition and analysis; computer costs were extremely high, and substantial computing power was needed to achieve the desired speeds; it was difficult, if not impossible, to program conventional software to account for cell complexity and variability; and the vision systems or image processing available at that time were not able to obtain true color (as compared to gray scale) image signals of the microscopic cell and perform the complex digital procedures thereon. Four primary systems were developed and produced at this time: the Perkin-Elmer Cell Scan, the Hematrak 590, the Coulter Diff 3 and the Technicon Hemalog D/90. The Hematrak 590, which was the most successful of these image-based systems, was last produced in 1987. The failure of all of these image-based systems was related to their inability to meet price and performance expectations for obtaining blood differentials. Accordingly, there was general disillusionment with image-based systems by the mid-1980's.
The Hematrak 590 was manufactured by Geometric Data, a division of Smith-Kline Corporation of Wayne, Pennsylvania.
The Coulter electronic machine takes whole blood samples, separates the white cells from the red cells and platelets, and then processes the white cells using a flow system. The machine passes the white cells in single file through an impedance measurement sensor to produce a three part differential count. The cytochemical systems utilize the same flow mechanism, but instead of using impedance flow the machines utilize laser light scattering and absorption patterns of the white blood cells. The Technicon H-1 and Coulter VCS produce a six part differential count utilizing these types of systems. The cytochemical and impedance flow systems base their results on classifying a large number of cells, typically on the order of thousands, based upon the known distribution of normal white cell populations. This approach, however, does not provide enough information to accurately classify small distributions of abnormal cells. Most of the instruments provide flags that alert the user to the need to proceed with a vision differential count based upon the presence of abnormal cells. None of them attempt to provide a quantitative measurement of these abnormal cells.
There are two principal reasons why digital image processing has not been successfully utilized to obtain differential blood counts.
1. The technology required to effectively solve the problem was too expensive. In the last five years, computing power has increased enormously and prices have dropped significantly.
2. Conventional pattern classifiers were not as effective as neural network classifiers to handle input signals with a high degree of variability such as the ones encountered in a white cell differential analysis. Because of these reasons, early machines such as the Hematrak 590 did not perform well over the wide spectrum of white cell classes.
Objects of the Invention
It is an object of the present invention to provide a system for microscopic cellular analysis utilizing video processing and neural networks.
It is another object of the present invention to automate blood smear analysis while achieving a high level of consistency and accuracy.
It is a further object of the present invention to utilize visual imaging techniques, in a digital environment, to obtain inputs for the neural network wherein the inputs are rotationally and positionally invariant.
It is another object of the present invention to provide a system which can produce a thirteen part differential count white blood cell analysis in addition to platelet count and red cell morphology analysis.
It is an object of the present invention to quantitatively measure, as well as identify, abnormal cells.
Summary of the Invention
The system for microscopic analysis and classification utilizes an automated microscope that can be positioned both laterally and longitudinally as well as focused under the control of positioning commands. The color image signal of the microscopic image of stained cells on a slide are fed to one of a plurality of vision systems. Each vision system includes a frame buffer
(dedicated memory) and a microprocessor. A host computer (microprocessor) assigns tasks to each vision system based upon the vision system's activity level. The vision system identifies a single cell among the plurality of stained cells on the slide by locating color image signals falling within a predetermined color band about a predetermined stained cell color. Also, the vision system calculates vectors representing at least one set of color characteristics of the cell from the identified cell signals. For example, these vectors represent hue, saturation, and intensity histograms of a single cell. In a more comprehensive system, these vectors also include hue, saturation, and intensity histograms of the nucleus of the cell. Other cell features such as whole cell area, area of the nucleus and area of the cell cytoplasm are also used as input data. The vectors and this miscellaneous data are fed into a neural network. In the comprehensive embodiment, the neural network has three paths and is a feed forward network of neurons. One path handles vectors representing color characteristics of the cell, another path handles vectors representing color characteristics of the nucleus, and the third path handles the miscellaneous cell features such as area and nucleus color texture information. These three paths converge towards an output layer in the network. The output layer classifies the single cell based upon the vectors and miscellaneous information.
Brief Description of the Drawings
Further objects and advantages of the present invention can be found in the detailed description of the preferred embodiments when taken in conjunction with the accompanying drawings in which:
Fig. 1 illustrates, in block diagram form, the general system in accordance with the principles of the present invention;
Fig. 2 illustrates, in block diagram form, the functional characteristics of the system;
Fig. 3 is a block diagram of one vision system;
Fig. 4 is a general flow chart of the operation of the system;
Fig. 5A diagrammatically illustrates a blood smear on a slide and the search pattern of the microscope;
Fig. 5B diagrammatically illustrates the positioning of a plurality of slides on a tray;
Figs. 6A and 6B show exemplary intensity histograms utilized in the positioning routine for the microscope;
Fig. 7 is a marked photograph showing red blood cells, white cells and platelets at 1000X microscopic power;
Figs. 8A, 8B and 8C represent a flow chart of the principal program which includes a scope positioning routine as well as referring to the extraction routine, calculate features routine, classification routine, and display output routine;
Fig. 9 diagrammatically illustrates a color band about a predetermined stained cell color that is purple in the example discussed in conjunction with a current application of the present invention;
Fig. 10 is a flow chart illustrating the steps involved in the extraction routine;
Fig. 11 is a flow chart illustrating the calculate features routine;
Fig. 12 is a flow chart illustrating the classification routine;
Fig. 13 is a diagrammatical representation of a neuron in a neural network;
Fig. 14 is a diagrammatical representation of the neural network used in conjunction with one application of the present invention; and
Fig. 15 is a flow chart illustrating the red blood cell classification routine.
Detailed Description of the Preferred Embodiments
This invention relates to a system for microscopic analysis and classification of stained cells.
Fig. 1 is a general system diagram for the present invention. A plurality of slides, one of which is slide 10, are placed in certain positions on slide tray 12. An exemplary, diagrammatic illustration of a blood smear slide is shown in Fig. 5A and the positioning of the slides on a slide tray 12 is shown in Fig. 5B, both of which will be discussed in detail later. A microscope 14 is controllably driven so that the microscope can be moved laterally as shown by double headed arrow 16, as well as longitudinally as shown by the X within a circle 18 in Fig. 1. Additionally, microscope 14 can be moved towards or away from slide tray 12 as well as focused as noted by the curved line 20 having double arrows thereon. The mechanical movement and focusing of microscope 14 is provided by position and focus command signals applied on line 22 from a scope control 24. In a working embodiment, scope control 24 is a stepper motor controller and the mechanical drive mechanism, represented by block 26, is a plurality of stepper motors connected to microscope 14. The motors drive the microscope laterally and longitudinally with respect to the slide under study as well as towards and away from the slide. Also, the motors focus the scope with respect to the stained cells on the slide.
Color image signals representative of the microscopic image of the stained cells are obtained by a video camera 30. The output of the video camera is applied to bus line 32 which, in the preferred embodiment, carries video image frame signals, one frame signal representing the red color image, another representing the blue color image and a third representing the green color image. These video frame signals include timing signals such as vertical and horizontal blanking signals. The color image signals are applied to one of the vision systems, VS1, VS2, VS3, ... VSn, all under the control of host computer 40. The vision systems are coupled to host computer 40 via VME Bus 42 that permits extremely fast transfer of data as well as control and command signals between the various devices. Host computer or CPU 40 also utilizes memory 42, and input/output (I/O) device 44 that are similarly coupled to VME Bus 42. A display monitor 46 is connected to the system via I/O 44 as is keyboard 48.
Fig. 2 generally illustrates the functional aspects of the system wherein each of the major components in Fig. 2 operates relatively independently of the others and generally under the control of host computer or CPU 40. As will be described in detail later on, one major function of the system is to position and focus microscope 14. Accordingly, a major functional block is the positioner and focus operation 50. Once the microscope is positioned at a proper location above the slide and properly focused, a video frame image or color image signals for that video frame are grabbed or stored by one of the frame grabbers in the vision system, for example, VS1. The positioning and focus operations also utilize a vision system. This vision system is then placed in the VS-Used Queue 52. The vision systems are extremely fast and therefore the hardware and speed capabilities are utilized in major function block 54 in order to identify a single cell among the plurality of cells on the slide, to extract color image or color information from the particular image video frame, and further to extract or identify and quantify certain cell characteristics. Accordingly, function block 54 is generally identified as comprising a number of extractors, such as extractor1 through extractorn. After the extractors have isolated a particular cell on the slide, and quantified certain color characteristics such as hue, saturation, intensity, and also certain miscellaneous cell information such as cell area, nucleus area, etc., that information, collectively called herein a "cell block", is placed in cell block queue 56. Since the vision system has now completed its task of isolating and extracting certain information from the video frame, which would entail isolation and extraction of all the single type cells on the video frame, that vision system is returned and placed in the VS-Free queue 58.
Accordingly, the vision system is then available to the positioner and focus functional block 50 in order to further focus or position the microscope as well as accept another video image.
Cell block queue 56 is utilized by a third major functional block 60 that includes a number of classifiers, classifier1 through classifiern. The classifiers in the present embodiment are configured as software in the host computer and essentially comprise neural networks that will be described in detail later. However, since neural networks can also be configured as very large scale integrated circuits (VLSI), the present invention is not meant to be limited to the software implementation of a neural network but rather encompasses all types of neural networks whether implemented as software or hardware. The classifier assigned to analyze the cell block, which generally includes vectors representing color characteristics of a single cell as well as quantified cell features, determines what type of cell is represented by the cell block. For example, a working embodiment of the present invention classifies white blood cells represented by the cell block information. In one working embodiment, six different types of white blood cells were identified out of approximately one hundred twenty white cells with an accuracy of approximately ninety five (95%) percent. The current embodiment of the present invention identifies a thirteen part differential count, including six normal types of white blood cells and seven abnormal types of white blood cells.
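As one illustration of such a classifier, a three-path feed forward pass might be sketched as follows; the layer sizes, the ReLU activation, the softmax output, and the random (untrained) weights are assumptions for illustration only and are not the patent's trained network:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def forward(cell_hsv, nucleus_hsv, misc, weights):
    """One forward pass through a three-path feed forward network.

    cell_hsv    -- vector of whole-cell hue/saturation/intensity histogram values
    nucleus_hsv -- vector of nucleus hue/saturation/intensity histogram values
    misc        -- vector of miscellaneous features (areas, texture, ...)
    weights     -- dict of hypothetical weight matrices, one per layer
    """
    # Each path is first processed by its own hidden layer ...
    h_cell = relu(weights["cell"] @ cell_hsv)
    h_nuc = relu(weights["nucleus"] @ nucleus_hsv)
    h_misc = relu(weights["misc"] @ misc)
    # ... then the three paths converge towards a single output layer.
    merged = np.concatenate([h_cell, h_nuc, h_misc])
    out = weights["output"] @ merged
    # Softmax scores over the cell classes; the highest score classifies the cell.
    e = np.exp(out - out.max())
    return e / e.sum()

rng = np.random.default_rng(0)
weights = {
    "cell": rng.normal(size=(8, 64)),      # whole-cell histogram path
    "nucleus": rng.normal(size=(8, 64)),   # nucleus histogram path
    "misc": rng.normal(size=(4, 6)),       # miscellaneous feature path
    "output": rng.normal(size=(13, 20)),   # thirteen-class output layer
}
scores = forward(rng.random(64), rng.random(64), rng.random(6), weights)
```

The thirteen outputs correspond loosely to the thirteen part differential discussed in the text; a real classifier would use weights obtained by training.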
After a classifier, such as classifier1, identifies the single cell, the cell block and cell identification information is passed to the cell block feature queue 62. The perform differential functional block 64 is the overseer and controller of all the other functional blocks. The perform differential function 64 also displays the identified and classified cell utilizing the cell block features on a monitor as well as storing them for future analysis or recall. Further, the perform differential function monitors the number of different cells found. For example, in the working embodiment of the present invention a two hundred cell, thirteen part differential must be obtained in order to stop the positioning, extraction, and classification of various cells on the blood smear slide. Accordingly, the perform differential function 64 monitors the number of cells classified and stops the other functional processes after the system has identified that specified number of cells. Further, perform differential function 64 monitors the overall process, keeps track of the vision systems that are being used as well as the vision systems that are free, and monitors the error ratio and failure flags on all the processes.
Fig. 3 is a block diagram of a single vision system as used in a working embodiment of the present invention. Essentially, vision system 66 is a board placed in a computer frame in order to achieve real time imaging on VME Bus 42. Inputs 68 comprise color image signals for frame A, that is, separate red, blue and green frame A signals (FrAr,g,b) as well as timing signals. The vision system 66 includes a color frame grabber, a thirty two bit plane, four flexible 512 X 512 X 8 image buffers (with an optional four image buffers available), arithmetic logic units (ALU) and statistical processing up to 12.5 million pixels per second, inter-image arithmetic, including subtraction, real time frame averaging, convolutions, morphology, histograms and area profiles, area of interest window processing, and an on-board 68000 microprocessor. Inputs 68 are applied to line 70 and are fed to input look-up tables (LUTS) 72 and sync stripper/generator 74. ACRTC 76 is a video controller chip or integrated circuit that produces all the timing signals for video acquisition and processing. Arithmetic logic unit (ALU) 78 further conditions and alters the video frame images and places them into frame buffer 80, which is a 512 X 512 X 32 on-board video memory. Frame buffer 80 is connected to VME Bus 42 by an internal VRAM Bus 82 to assist in the very fast input and output of data from the frame buffer. Output look-up tables (LUTS) and digital to analog convertors (DACS) 84 provide various outputs 86 from the vision system. The vision system also includes a statistical processor 88, an event counter 90, and an interface to MVP-NP 92. These items are connected to an internal processing bus 94, as are ALU 78, frame buffer 80 and other components. MVP-NP is a co-processor 96 which increases the processing speed for neighborhood operations, such as morphological transforms, binary pattern matching, feature extraction and color classification.
Co-processor 96 is coupled to processing bus 94 through digital expansion bus 98.
The vision system includes its own independent CPU or processor 110 which in the working embodiment is a Motorola 68000 microprocessor. Memory 112 is available to VS-CPU 110 through CPU bus 114. Control logic 116 assists the VS-CPU 110 in controlling the operations of the other hardware and software functions. In the working embodiment, the vision system is a MVP-VME video board manufactured by Matrox Electronic Systems Ltd. of Dorval, Quebec, Canada. Co-processor 96 is the machine vision accelerator sold in conjunction with the MVP-VME image board. Further details and functional specifications of the Matrox image processing board are available from the specifications on the board.
Fig. 4 is a general system flow chart diagram showing the major steps in the particular application of the invention described hereinafter.
The following system specifications table identifies certain specifications utilized in the present application of the invention. These system specifications are meant to be exemplary only.
SYSTEM SPECIFICATIONS TABLE
Two axis linear positioning device
X Axis
Travel length 305 mm
Stepper motor (half step) 400 step/rev
Resolution 0.0055 mm/step
Unidirectional Repeatability Less than 0.025 mm
Bidirectional Repeatability Less than 0.005 mm
Position accuracy Less than 0.005 mm
Maximum linear speed 10 mm/sec
Y Axis
Travel length 50 mm
Stepper motor (half step) 400 step/rev
Resolution 0.005 mm/step
Unidirectional Repeatability Less than 0.005 mm
Bidirectional Repeatability Less than 0.005 mm
Position accuracy Less than 0.005 mm
Maximum linear speed 15 mm/sec
Focus positioning device
Travel length 2 mm
Stepper motor (micro step) 1000 step/rev
Resolution 0.000086 mm/step
Unidirectional Repeatability 0.0005 mm
Bidirectional Repeatability 0.0005 mm
Position accuracy 0.0005 mm
Minimum linear speed 0.0086 mm/sec
Microscope magnification
Low power lens
Type Planachromat dry lens
Magnification 20X
Numerical Aperture 0.5
High power lens
Type Planachromat dry lens
Magnification 100X
Numerical Aperture 0.9
Light source
Type 6V Xenon lamp
Output 20 Watt
Color temperature 3200 K
Monitor Display
Type Analog RGB Color
Input Separate Red, Green, Blue
Resolution 800H X 600V non-interlaced
Display Long persistence phosphor
Video camera
Type CCD color camera
Output Separate Red, Green, Blue
Resolution 786H X 493V
Manufacturer Sony RGB camera
Image Acquisition System
Type True color RGB frame grab
Resolution 512H X 480V X 24 bits
Color Palette Full 16.7 million colors
MVP-VME with vision accelerator (MVP-NP) by Matrox Electronic Systems Ltd. of Dorval, Quebec, Canada.
Current Hardware
Host CPU Motorola 68000 at 20 MHz
Memory 4Mb RAM, 40Mb hard drive
The major steps involved in this particular application of the invention are positioning the microscope or scope near the feathered edge of the slide in step 150, scanning the slide, and identifying cell locations in step 152. Additionally, step 152 includes changing the microscopic power of the microscope from 200X to 1000X and fine focusing the microscope at the first identified cell location on the slide. These steps are described in detail hereinafter. Step 154 isolates a single cell of a predetermined type, e.g., a white cell, in the video frame image, and step 156 calculates the cell features. Steps 154 and 156 are the extraction steps done by one of the extractors in function block 54 of Fig. 2. In step 158, cell features are analyzed by one of the classifiers in Fig. 2. Step 160 calculates and analyzes other cell types, e.g., red blood cells, on the slide under study. Decision step 162 determines whether the differential count has been exceeded. In the particular application described hereinafter, the differential count threshold is set at 100 white cells spanning at least thirteen different classes of white cells. These cells must be identified in a particular blood smear before the system will stop. The differential count must also exceed a certain number of red blood cells and platelets in the blood smear. If the differential count threshold is not exceeded, the no branch is taken from decision step 162; in step 164 the scope is moved, and the program returns to the isolate single cell step 154.
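The loop of steps 154 through 164 can be sketched as follows; the class names, the stand-in behaviors (every other frame empty, a single-label classifier) and the threshold of 10 are illustrative assumptions only, not the patent's implementation:

```python
import itertools

class Scope:
    """Stand-in for the stepper-motor stage; a real system moves the slide."""
    def move(self):
        pass

class Extractor:
    """Stand-in extractor: every other video frame contains one white cell."""
    def __init__(self):
        self._frames = itertools.cycle([None, "white cell"])
    def isolate_single_cell(self):           # step 154
        return next(self._frames)
    def calculate_features(self, cell):      # step 156
        return [0.0] * 6                     # placeholder feature vector

class Classifier:
    """Stand-in classifier: labels every cell a neutrophil."""
    def classify(self, features):            # step 158
        return "neutrophil"

def run_differential(scope, extractor, classifier, threshold=10):
    counts = {}
    while sum(counts.values()) < threshold:          # decision step 162
        cell = extractor.isolate_single_cell()
        if cell is None:                             # empty frame:
            scope.move()                             # move scope (step 164)
            continue
        label = classifier.classify(extractor.calculate_features(cell))
        counts[label] = counts.get(label, 0) + 1
    return counts                                    # displayed in step 166

counts = run_differential(Scope(), Extractor(), Classifier())
```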
If the yes branch is taken, the system displays results in step 166 to the operator. In the particular application described hereinafter, there are three versions for displaying the results. In the first version, the operator sees every cell. The cell images are compiled in a special display frame and the operator approves or confirms the system's identification of those cells. This confirmation is noted in step 168. In another version, the system displays only the non-classifiable white cells, that is, the white cells having a low probability of classification, and the operator classifies those cells as appropriate. In the third version, the operator sees every cell classified by the system and approves each cell on an individual basis. Since the type of display and operator confirmation is dependent upon certain commercial aspects of the invention, the display and confirmation steps may be selectable by the operator of the system.
The present invention is described in detail in conjunction with analyzing a blood smear. The smear on a slide is diagrammatically illustrated in Fig. 5A. Human blood was stained with Wright stain, which is a standard stain technique for human blood cell microscopic analysis and classification. The predetermined stained cell color from the Wright stain results in red blood cells (R) (see Fig. 7, a 1000X magnification) being colored light red and the white blood cells (W) ranging in color from light blue, purple to orange. The nucleus of the white blood cells (WN) is dark purple and the white cell cytoplasm (Wc) ranges from light blue to purple dots to light orange.
Slide 200 is placed on slide tray 12 at certain locations; in the present embodiment, tray 12 holds ten slides at predetermined locations. The blood smear slides are normally oblong or oval in shape and extend longitudinally with respect to the slide. Fig. 5A diagrammatically illustrates intensity contour lines A through E, wherein intensity contour line A is extremely dark or black and contour line E illustrates the outermost feathered edge of the blood smear that is slightly colored. The slides on slide tray 12 (Fig. 5B) are not only positioned at certain spots on the tray but also aligned such that the feathered edge of each blood smear points towards, for example, fore edge 210 of tray 12. Of course, the feathered edge E of each blood smear could point towards the rear edge of slide tray 12 and the positioning routine (step 150 of Fig. 4) could be altered to sense the feathered edge in that direction.
Figs. 8A through 8C illustrate the principal or primary software routine encompassing steps 150, 152, 162, 164, 166 and 168 in Fig. 4; Fig. 10 is the extraction or isolation routine corresponding to step 154 in Fig. 4; Fig. 11 is the calculate features routine corresponding to step 156 in Fig. 4; Fig. 12 is the classify white cell routine corresponding to step 158 in Fig. 4; and, Fig. 15 is the red blood cell classification routine (including platelet identification) corresponding to step 160 in Fig. 4.
The principal program in Fig. 8A begins with the step of positioning the scope above the center point (CTPT) of the slide in step 240. Since all the slides are placed at certain locations on slide tray 12 (Fig. 5B), the system can generally identify and move the scope relative to the slide to position the scope above the center point of the slide. For example, with respect to slide 200, the center point 242 is identified and the scope is positioned thereat. The microscope is focused at 200X in step 244 in this particular application regarding the analysis and classification of cells in a blood smear. In step 246, the scope is moved longitudinally with respect to slide 200 until a bimodal intensity peak ratio is found. Ideally, the intensity of a complementary color band about a stained cell color is utilized. For example, the stained cell color in the present example is a Wright stain which essentially colors the red blood cells (cells R in Fig. 7) red. Within the dense central region defined by contour line A in Fig. 5A, the blood smear is very dark and almost black due to the great number of red blood cells in the smear. The complementary color band for the Wright stained cell color is green. The video image signals are essentially three video frames, one frame having the red color signals FrAr, another frame having the green color signals FrAg and a third frame having the blue color signals FrAb. These frames are kept in frame buffers. Accordingly, there is a red frame buffer, a green frame buffer and a blue frame buffer for frame A (FrAr,g,b). The green buffer is selected because there is little or no green in the stained red blood cells. Accordingly, the stained red blood cells appear as dark or black spots on the green video frame image (FrAg). A histogram of the intensity of the green buffer frame is obtained. Fig. 6A is an example of such a histogram when the scope is positioned at approximately intensity contour line B in Fig. 5A.
A histogram is a frequency distribution of the number of pixels having a certain intensity value. In Fig. 6A, the number of pixels in FrAg that are darker exceeds the number of pixels that are lighter. Peak 211 is the number of pixels having an intensity or brightness level of approximately 80, whereas peak 213 represents a smaller number of pixels that are brighter (approximately 200 intensity level). Accordingly, a peak ratio is obtained by comparing the peak number of pixels under study to the peak number of background pixels. The background in the green frame buffer is represented by pixels having a bright or high intensity, whereas the cells under study are dark because of the absence of any green in the stained red blood cells and stained white blood cells. Accordingly, the bimodal peak ratio in Fig. 6A is approximately 1.8.
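Under the assumption that the ratio compares the dark (cell) histogram peak to the bright (background) peak, with a mid-scale split at intensity 128, the bimodal peak ratio might be sketched as:

```python
import numpy as np

def bimodal_peak_ratio(green_frame):
    """Ratio of the dark (cell) histogram peak to the bright (background) peak.

    green_frame -- 2-D array of 8-bit intensities from the green frame buffer.
    The split at intensity 128 is an assumption for illustration.
    """
    hist, _ = np.histogram(green_frame, bins=256, range=(0, 256))
    dark_peak = hist[:128].max()      # stained cells (little green, so dark)
    bright_peak = hist[128:].max()    # background (bright)
    return dark_peak / bright_peak

# Synthetic frame roughly like the histogram of Fig. 6A: more dark cell
# pixels near intensity 80 than bright background pixels near 200.
rng = np.random.default_rng(1)
frame = np.where(rng.random((512, 512)) < 0.6,
                 rng.normal(80, 10, (512, 512)),    # cell pixels near level 80
                 rng.normal(200, 10, (512, 512)))   # background near level 200
ratio = bimodal_peak_ratio(np.clip(frame, 0, 255))
```

With more cell pixels than background pixels the ratio exceeds one, and it falls toward the 0.5 to 0.2 range as the scope approaches the feathered edge where cells thin out.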
Step 246 moves the microscope longitudinally down the slide 200 as shown by the dash lines 243 until the bimodal peak ratio reaches approximately 0.5. Fig. 6B graphically illustrates this peak ratio. At this point on the slide, diagrammatically illustrated at intensity contour line C in Fig. 5A, there is a relatively small number of cells within FrAg as compared with the background. Essentially, the microscope is moved towards the feathered edge until the program detects the proper bimodal peak ratio in the intensity of a complementary color band about a stained cell color. The stained cell color in the present invention is the red Wright stain color for the red cells and the white cells. The complementary color band is found in the green buffer.
When the system reaches the appropriate bimodal peak ratio, the system causes the microscope to begin a scan pattern as noted in step 468 in Fig. 8A. The scan pattern, in the present application, is shown as dash lines in Fig. 5A and includes essentially a first lateral movement across the slide, slight longitudinal movement, further lateral movement (opposite the first lateral direction) across the entire feathered edge of the blood smear, additional longitudinal movement, and a repetition of these movements until the bimodal peak ratio passes beyond the pre-established bimodal peak ratio range. The particular scan pattern for the scope can be changed. Also, the scan pattern diagrammatically illustrated in Fig. 5A is not drawn to scale because the microscope moves very small distances in relation to the slide both laterally and longitudinally throughout the scan pattern. Essentially, the scan pattern is simply tracking the feathered edge of the blood smear, and various scanning patterns could be utilized to track this feathered edge. The photograph depicted in Fig. 7 is a 1000X photograph but is generally illustrative of the cell distribution at 200X, that is, the cells are substantially in a single layer along the feathered edge generally between intensity contour line C and contour line E in Fig. 5A.
Step 248 obtains the video frame or color image signals FrAr,g,b. Specifically, there are three video frame images, one for the red buffer, one for the blue buffer and one for the green buffer. In step 250, the program calculates the hue frame FrAH from all three color images.
In step 252, the hue frame FrAH is passed through a threshold filter to identify, in the present application, white cell pixel groups. The thresholding is simply screening or filtering the entire hue frame to identify pixels in a predetermined color band about a predetermined stained cell color. The stained cell color in the present application is the bluish-purple Wright stain color. The predetermined color band is the purple band (shown in Fig. 9). Fig. 9 illustrates that if a zero hue value is the midpoint between the pure blue hue and the pure green hue, the purple band lies approximately at values 30 to 70 (counterclockwise from 0 to 255). Hue value 255 is immediately to the right of zero value. Accordingly, the hue frame is sent through a color signal filter, thereby passing color image signals falling within the purple band which extends about the predetermined purple stain cell color.
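The purple-band screening of the hue frame can be sketched as follows; the hue frame is assumed to already be expressed on the 0 to 255 wheel of Fig. 9, and the band limits 30 and 70 are taken from the text:

```python
import numpy as np

def purple_band_mask(hue_frame, lo=30, hi=70):
    """Pass only pixels whose hue falls in the purple band (Fig. 9).

    hue_frame -- 2-D array of hue values on the document's 0..255 hue wheel.
    lo, hi    -- band limits; 30 and 70 are taken from the text.
    """
    return (hue_frame >= lo) & (hue_frame <= hi)

# Tiny made-up hue frame: only the values 35, 50 and 70 fall in the band.
hue = np.array([[10, 35, 50],
                [70, 71, 255]])
mask = purple_band_mask(hue)
```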
The purpose of steps 250 and 252 is to identify the gross position of cells within the entire frame which, in this application, is the red buffer, green buffer and blue buffer. The calculation and creation of the hue frame is mathematical in nature and simply combines the three colors into a single frame in order to reduce processing time. By thresholding that hue frame and mapping pixels in the frame, that is, pixels falling within a predetermined color band (purple), a gross cell location is obtained. In step 254, mathematical morphology or shape filters are utilized to obtain the center points of the purple pixel groups. The center points are the gross locations of the white blood cells in this particular application. In step 256 the center points are stored as ctptn with respect to a slide reference point and all the center point positions of the grossly identified white blood cells are stored in a list in memory. In step 258, the red, green and blue frame buffers are cleared or discarded as well as the hue frame created in step 250. In step 260, the microscope is moved a very small amount such that a new video frame is obtained showing different cells. In step 262, the program checks the bimodal peak ratio for the new frame FrBg. Particularly, a bimodal peak ratio of between 0.5 and 0.2 is acceptable. Referring to Fig. 5A, once the microscope is moved beyond the feathered edge E, that is, beyond the feathered edge of the blood smear, the intensity levels of the green buffer would move to the high intensity value range and eventually the dual peak or bimodal peak would disappear since all the pixels in the green frame buffer would represent background, which are high intensity pixel values. After the bimodal peak ratio in the FrBg intensity histogram falls below the pre-set range, e.g., 0.2 (see Figs.
6A and 6B), the scope changes direction and moves either longitudinally or laterally in order to search for a video frame having the bimodal peak ratio within the range. Decision step 264 determines whether the bimodal peak ratio is acceptable, and if it is, step 266 uses frame B, the new frame, as frame A and returns to step 250. If the bimodal peak ratio is not acceptable even after a change in the scan pattern, the no branch is taken and the program jumps from jump point A-1 in Fig. 8A to jump point A-1 in Fig. 8B. In step 268, the scope changes direction either from a lateral movement direction to a longitudinal movement direction or from a longitudinal movement direction to a lateral movement direction shown in Fig. 5A. Decision step 270 determines whether the scan pattern has been completed based principally on the failure to obtain a bimodal peak ratio within the prescribed range for the green frame buffer for the next frame. If the scan is not finished, the program jumps from jump point A-2 in Fig. 8B to the same jump point in Fig. 8A immediately preceding the move scope step 260. If the yes branch is taken from finish scan decision step 270, the microscope is moved, in step 272, to the location of the first center point ctpt1 on the list generated by step 256. This scanning and gross identification of the white cell positions decreases processing time in the overall system.
In step 274, the microscope is switched from low power (200X) to high power (HP), or approximately 1000X. In step 276, a focus routine is utilized incorporating a Sobel edge detection filter on the magnified green buffer frame FrAg. To provide some perspective on this image processing, Fig. 7 is a picture illustrating white blood cells W, red cells R and platelets P. The picture is taken at 1000X. At 1000X, the white blood cells occupy a pixel block of approximately 70 X 70 pixels assuming a video frame of 512 X 512 pixels. At 200X the same white blood cells occupy a pixel block of approximately 13 X 13. If high definition television video signals were used, at 200X the white blood cells would occupy more pixels due to the larger number of pixels in the video frame. The Sobel edge detection filter generates a focus factor or a detection count or value based upon the clarity and sharpness of the edges in the video frame. Accordingly, in step 278, the program compares the previous focus factor for frame FrAg to a new focus factor for the frame when the scope's focus has changed. The routine maximizes the focus factor or the Sobel edge detection count or value by changing the focus of the scope.
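A Sobel-based focus factor along these lines can be sketched as follows; the squared-gradient (Tenengrad) form used here is one common variant of a Sobel focus measure, not necessarily the exact detection count computed by the vision board:

```python
import numpy as np

def sobel_focus_factor(frame):
    """Focus factor from Sobel gradients: sharp focus yields large edge energy."""
    kx = np.array([[-1.0, 0.0, 1.0], [-2.0, 0.0, 2.0], [-1.0, 0.0, 1.0]])
    ky = kx.T
    def conv(img, k):                       # 'valid' 2-D correlation, 3x3 kernel
        h, w = img.shape
        out = np.zeros((h - 2, w - 2))
        for i in range(3):
            for j in range(3):
                out += k[i, j] * img[i:h - 2 + i, j:w - 2 + j]
        return out
    gx, gy = conv(frame, kx), conv(frame, ky)
    return float((gx ** 2 + gy ** 2).sum())   # sum of squared gradient magnitude

sharp = np.zeros((32, 32)); sharp[:, 16:] = 255.0         # crisp vertical edge
blurred = np.tile(np.linspace(0.0, 255.0, 32), (32, 1))   # same rise, gradual
```

Maximizing this factor while stepping the focus motor corresponds to the comparison loop of step 278: the crisp edge scores far higher than the gradual ramp.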
Steps 280, 282 and 284 are described in detail in conjunction with Figs. 10, 11 and 12. In general, step 280 extracts a white cell and the white cell nucleus from the video frame images FrAr,g,b and obtains a cell block consisting of a red, green and blue pixel matrix about the white blood cell as well as the nucleus. Step 282 calculates the features of the white blood cell from the cell block. Step 284 classifies the white blood cell from the features obtained in step 282.
After the white blood cells have been classified, in step 286, both the white blood cell video image block or image matrix (the red, green and blue buffer portions for that particular white cell) and the classification of that white cell are stored. In step 288, the program repeats the extraction step 280, the calculate feature step 282, the classify step 284 and the store step 286 for all white cell center points ctptn in the high powered window of frame FrA.
In the current embodiment, the scope is moved to the center point of the grossly identified location of the first white blood cell found during the pattern scan. The system operates on that video frame, refining the focus. Thereafter, the program operates on a single cell located within high powered window frame A. The high powered window may be smaller than the actual video frame in order to eliminate any white blood cells that are split by the frame border. Each white blood cell is isolated, and its location and color image signals are extracted for the entire cell as well as the nucleus. Features of that extracted white blood cell are obtained and then the white blood cell itself is classified. The cell video information or data and classification data are stored, and then the program repeats the extraction, feature calculation, classification and storage steps for each white blood cell within the high power window of frame A.
The extraction routine referred to in step 280 of Fig. 8B is shown in detail in Fig. 10. Step 300 calculates and obtains a hue frame for all of high power window frame A (FrAH). The magnified, high powered window frame discussed earlier herein is referred to as frame A. In step 302, the hue frame is equalized and expanded to full digital scale. The hue frame is a mathematical composite of the red, green and blue buffers for the video frame. Since the hue may not extend over the full digital scale of the histogram for the hue values, step 302 expands the histogram waveform to the full dynamic range of the values. This is a normalizing technique.
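The expansion to full digital scale can be sketched as a linear contrast stretch; this shows only the stretch, not the equalization also mentioned in the text, and the sample values are made up:

```python
import numpy as np

def stretch_to_full_scale(frame, full=255):
    """Linearly expand a frame's values to the full 0..255 digital scale."""
    lo, hi = frame.min(), frame.max()
    if hi == lo:                      # flat frame: nothing to stretch
        return np.zeros_like(frame, dtype=float)
    return (frame - lo) * (full / (hi - lo))

hue = np.array([40.0, 60.0, 80.0])   # a narrow band of hue values
stretched = stretch_to_full_scale(hue)
```

The narrow 40 to 80 band is mapped onto the full 0 to 255 range, so subsequent thresholds operate over the whole dynamic range.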
Step 304 passes the hue frame, as modified by step 302, through a threshold filter and creates a binary image. Essentially, every pixel in the frame having a value less than a pre-set value, e.g., 20, is set to one (1) and all pixels in the frame greater than or equal to 20 are set to zero (0). The threshold level for this hue frame could be changed as necessary depending upon the particular cell under study. In step 306, a morphological open operation is conducted on the binary hue frame. The morphological open operation is a mathematical operation that first erodes the binary image and then dilates the binary image. This morphological open operation removes noise in the hue frame, trims edges and smooths contours. In step 310, the program identifies a white cell block about all of the white cells in the hue frame image as modified and creates a white cell mask for the hue frame. The white cell mask is identified for cell1. In step 312, the cell block corresponding to the red, green and blue buffer portions representing each white cell and the mask defining the outer boundaries of each cell are stored. Also, the location of each identified cell is stored along with the RGB color image signals. The color image signals fall within a predetermined color band about a predetermined stained cell color. Hue is simply a mathematical representation of color space derived from the RGB color image signals. Since the hue frame is screened by a color signal filter, all color image signals passing through that hue filter represent the color image of individual white cells in that video frame. The cell block discussed herein is essentially the RGB portion of the frame buffer and the mask layer showing the location of the white cell in the video frame as well as an identification of what video frame this particular cell was extracted from.
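The open operation of step 306 (erosion followed by dilation) can be sketched with a 3 X 3 structuring element; the element size and the synthetic test image are illustrative assumptions:

```python
import numpy as np

def erode(binary):
    """3x3 erosion: a pixel survives only if its whole neighborhood is set."""
    padded = np.pad(binary, 1, constant_values=False)
    out = np.ones_like(binary)
    for di in range(3):
        for dj in range(3):
            out &= padded[di:di + binary.shape[0], dj:dj + binary.shape[1]]
    return out

def dilate(binary):
    """3x3 dilation: a pixel is set if any pixel in its neighborhood is set."""
    padded = np.pad(binary, 1, constant_values=False)
    out = np.zeros_like(binary)
    for di in range(3):
        for dj in range(3):
            out |= padded[di:di + binary.shape[0], dj:dj + binary.shape[1]]
    return out

def open_binary(binary):
    """Morphological open (erode then dilate): removes specks, trims edges."""
    return dilate(erode(binary))

# A cell-sized 5x5 blob plus a single-pixel noise speck; the open operation
# removes the speck while restoring the blob to its original extent.
img = np.zeros((12, 12), dtype=bool)
img[2:7, 2:7] = True    # cell blob
img[9, 9] = True        # noise speck
opened = open_binary(img)
```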
In step 314, the program identifies the nucleus of the white blood cell: the first white cell block color image is transformed into a saturation frame. Saturation is a measure of the whiteness of an image and is also mathematically related to the RGB buffer portions for the white blood cell. The saturation frame portion is identified as cell1. In step 316, the saturation frame portion cell1 is averaged to smooth contour edges. In step 318, the saturation frame portion is passed through a threshold filter such that everything above, for example, saturation value 75 is set to "1" and everything below that saturation threshold value is set to "0". Accordingly, a binary image of the cell block is created based upon a predetermined chromaticness threshold. Chromaticness is the hue and saturation of a color.
In step 320, a morphological close operation is conducted on the binary version of cell1. This morphological close operation first dilates the image and then erodes it in order to fill holes within closed shapes. Since a nucleus is generally solid (Fig. 7, nucleus WN), the morphological close operation closes the open spaces within the nucleus. In step 322, the cell nucleus mask NUC1 is defined for cell1. In step 324, the cytoplasm of the cell, that is WC in Fig. 7, is set to "1" and the nucleus of the cell NUC1 is set to "2". Accordingly, the cell block for cell1 is now defined by cytoplasm identified by a "1" and a nucleus identified by a "2", as well as red, green and blue frame portions for the cell. The cell block therefore defines the outer contour of the white cell as well as the contour of the nucleus within the cell, and color data for the cell and nucleus. The cytoplasm is simply the difference between the whole cell image mask obtained in step 310 and the nucleus image mask obtained in step 322. The cell block is then the RGB color image signals as well as the mask defining the cell cytoplasm and the cell nucleus. The extract white cell program then ends and the program returns to the next step shown in Fig. 8B, calculating features of the white cell from the cell block in step 282.
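The open (step 306) and close (step 320) operations reduce to compositions of erosion and dilation. A minimal sketch using a 4-neighbourhood structuring element (the actual kernel used by the program is not disclosed):

```python
import numpy as np

def _dilate(img):
    # 4-neighbourhood binary dilation: a pixel becomes 1 if it or any
    # edge-neighbour is 1.
    p = np.pad(img, 1)
    return p[1:-1, 1:-1] | p[:-2, 1:-1] | p[2:, 1:-1] | p[1:-1, :-2] | p[1:-1, 2:]

def _erode(img):
    # 4-neighbourhood binary erosion: a pixel stays 1 only if it and
    # every edge-neighbour are 1.
    p = np.pad(img, 1)
    return p[1:-1, 1:-1] & p[:-2, 1:-1] & p[2:, 1:-1] & p[1:-1, :-2] & p[1:-1, 2:]

def morph_open(img):
    """Erode then dilate: removes specks and trims ragged edges."""
    return _dilate(_erode(img))

def morph_close(img):
    """Dilate then erode: fills small holes inside a solid shape
    such as a nucleus."""
    return _erode(_dilate(img))

speck = np.zeros((5, 5), dtype=int)
speck[2, 2] = 1                   # isolated noise pixel
holed = np.ones((5, 5), dtype=int)
holed[2, 2] = 0                   # solid shape with a one-pixel hole
```

Opening removes the isolated speck entirely, while closing fills the hole in the solid shape, matching the noise-removal and hole-filling roles the text assigns to the two operations.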
Fig. 11 is the calculate features routine and begins with step 340, which calculates the hue, saturation and intensity histograms of cell1, that is CH, CS and CI for the particular cell block. Step 342 calculates the hue, saturation and intensity histograms of the nucleus NUC1 (NH, NS, NI). These steps constitute a means for obtaining vectors representing at least one set of color characteristics of the single cell from the identified cell signals. Similarly, the RGB frame portions could be part of the color characteristics. As will be discussed later, these vectors are a series of numbers representing histogram values. The histograms may be represented by a series of numbers ranging from 0 to 255 in value. However, as noted later in conjunction with Fig. 12, the histogram vector is reduced by taking every fourth value, reducing it to approximately a 64-value vector.
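The histogram reduction described above can be sketched as follows; `step=4` reflects the every-fourth-value sampling, and the helper name is illustrative:

```python
import numpy as np

def reduced_histogram(channel, step=4):
    """256-bin histogram of an 8-bit channel (hue, saturation or
    intensity), sampled at every fourth bin to give the 64-value
    input vector described in the text."""
    hist, _ = np.histogram(channel, bins=256, range=(0, 256))
    return hist[::step]

# A channel whose values fall only in bins 0, 4 and 8.
channel = np.array([0, 0, 4, 8])
vec = reduced_histogram(channel)
```

Note that sampling every fourth bin simply discards the intervening bins; summing each group of four bins would be an alternative reduction, but the text states that every fourth value is taken.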
In step 344, miscellaneous cell features and cell nucleus features are identified. Also, the area of the cell CA, the area of the nucleus NA, and the area of the cell cytoplasm CC are calculated. In step 346, a texture matrix is made from the color signals associated with the cell nucleus. Texture matrices for visual images are discussed in detail in the book "Digital Image Processing" by R. Gonzalez and P. Wintz, copyrighted in 1987. Essentially, the texture matrix is utilized to determine whether the image is smooth or rough. A smooth image will generate a texture matrix having values falling within a relatively narrow range in a diagonal band from the upper left to the lower right of the texture matrix. In contrast, a rough look is defined by values of a similar range in a diagonal extending from the lower left to the upper right of the texture matrix. The maximum probability NP, element difference moment NM and uniformity factor NU for the nucleus are also obtained from the texture matrix. If the maximum probability is high, the image is smooth. If the maximum probability is low, then the image is coarse. If the image is coarse, it has a random look. The element difference moment determines whether the texture has some order. If the texture does have order, then the moment value is relatively low. If the texture is random, then the moment is a high value. Uniformity is the opposite of entropy and is discussed in detail in "Digital Image Processing" by Gonzalez and Wintz. The cell nucleus features give an indication of the degree of randomness, smoothness and order within the white cell. Fig. 7 illustrates that various white cell nuclei have different shapes, sizes, positions and textures. These miscellaneous nucleus cell features provide factors which are helpful in classifying the cell. In step 348, the cell features are stored for white cell1.
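One common realization of such a texture matrix is the grey-level co-occurrence matrix described by Gonzalez and Wintz. The sketch below, with hypothetical names, computes the three nucleus features named above under that assumption, pairing each pixel with its horizontal neighbour:

```python
import numpy as np

def glcm_features(img, levels=4):
    """Grey-level co-occurrence matrix over horizontal neighbour pairs,
    plus the three nucleus features used in the text: maximum
    probability, element difference moment (order 2), and uniformity
    (sum of squared probabilities, the opposite of entropy)."""
    img = np.asarray(img)
    glcm = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[a, b] += 1
    glcm /= glcm.sum()                       # normalise to probabilities
    i, j = np.indices(glcm.shape)
    return {
        "max_probability": glcm.max(),
        "element_difference_moment": ((i - j) ** 2 * glcm).sum(),
        "uniformity": (glcm ** 2).sum(),
    }

# A constant image is perfectly smooth: all co-occurrence mass sits in
# one diagonal cell, so max probability and uniformity are both 1 and
# the element difference moment is 0.
smooth = np.zeros((4, 4), dtype=int)
f = glcm_features(smooth)
```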
At the end of the calculate features routine, the program then enters the classify white cell step 284 in Fig. 8B. The classify white cell subroutine is shown in flow chart form in Fig. 12. In step 350, an input vector is created from the features of cell1. These input vectors are the hue, saturation and intensity histogram values from cell1, the hue and saturation histogram values from NUC1, the area of the cell, the area of the nucleus, the area of the cell cytoplasm, and the miscellaneous factors such as the maximum probability, element difference moment and uniformity factor from the nucleus. In step 352, these vectors are propagated through a feed forward neural network. The neural network will be described in detail hereinafter. In step 354, the program collects the results of the neural network output, and in step 356, the results are analyzed and the white blood cells are classified by type or category or identified as being non-classifiable. The white cell class table that follows is exemplary of the classifications that may be assigned to a particular white blood cell.
WHITE CELL CLASS TABLE
Normal:
Segmented Neutrophils
Banded Neutrophils
Lymphocytes
Monocytes
Eosinophils
Basophils
Abnormal:
Variant Lymphocytes
Metamyelocytes
Myelocytes
Promyelocytes
Blasts
Smudge Cells
Basket Cells
Fig. 14 diagrammatically illustrates the neural network used in this present application of the invention. Fig. 13 diagrammatically illustrates a single neuron in a neural network for explanatory purposes. Neural networks are sometimes called parallel distributed processors, and such devices are meant to be encompassed within the scope of the present invention. A neural network is a series of neurons organized in layers: an input layer, one or more hidden layers, and an output layer. Each layer has a plurality of neurons that are connected in a certain fashion to the following layer in a feed forward neural network. Fig. 13 illustrates neuron 410, which has inputs I1, I2, I3 through In. Each input has assigned thereto a weight that changes the value of the input. For example, with respect to I1, a weight function 412 may double the value of the input. So upon the appearance of a 1 at I1, a value of 2 would be assigned thereto by weight function 412. After being weighted, the inputs are summed by neuron 410 and modified by a function 414 before a single output is developed. In the present case, function 414 is a sigmoidal threshold function, which essentially smooths the straight-edge threshold function, rising as a step from 0 up to 1, into a smooth curve. The weights for the inputs to the neural network are defined when the neural network is trained to recognize and identify certain known cells. The knowledge of the neural network is stored in the weights. The teaching process or operation sets the weights for each neuron in the network.
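The single neuron of Fig. 13 can be sketched as follows (the bias term is an assumption; the text describes only weights, summation and the sigmoid):

```python
import math

def neuron(inputs, weights, bias=0.0):
    """One neuron as in Fig. 13: weight each input, sum, then pass the
    sum through a sigmoidal threshold function (a smooth step from
    0 to 1)."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# The text's example weight function doubles its input: a 1 at I1
# contributes 2 to the sum before the sigmoid is applied.
out = neuron([1.0], [2.0])
```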
Fig. 14 diagrammatically illustrates the architecture of the neural network in the present application. This neural network has three paths: path 510, for the input vectors related to the entire cell color characteristics; path 512, handling the miscellaneous parameters such as the cell area, nucleus area, cytoplasm area, and nucleus features such as texture values; and a third path 514, handling color characteristics of the cell nucleus. Each of these paths is relatively independent, and they converge towards an output layer. The neural network also includes an input layer 516, a first hidden layer, a second hidden layer, a third hidden layer and an output layer 518. The hue vector for the entire cell is represented by 64 values, as are the saturation vector and the intensity or brightness vector for the cell. Neuron group 520 is an input layer composed of 192 inputs, 64 inputs each for the hue, saturation and intensity histograms of the cell. Neuron group or cluster 522 is fully connected (FC), as shown by the three dots, to neuron group 520. Group 522 of hidden layer one consists of 30 neurons. Hidden layer 2 for path 510 consists of ten neurons in group or cluster 524. Again, neuron cluster 522 is fully connected to neuron group 524. Fully connected means that each input is connected to each neuron in the layer immediately below. Neuron cluster 524 is fully connected to hidden layer 3, consisting of neuron cluster 526 comprising 20 neurons. The second path 512 for the miscellaneous parameters of cell area, nucleus area, cytoplasm area, maximum probability from the texture matrix, element difference moment, and uniformity factor (7 inputs) is fed to input neuron cluster 528. Neuron cluster 528 is 7 neurons which hold the inputs for that path. Neuron cluster 528 is fully connected to neuron cluster 530 of the first hidden layer, which consists of 10 neurons.
Neuron cluster 530 is fully connected to hidden layer 3, neuron cluster 526.
With respect to path 514, input layer 516 includes 128 neurons as neuron cluster 532. Cluster 532 is fully connected to the first hidden layer, consisting of 20 neurons in cluster 534. Cluster 534 is fully connected to neuron cluster 536 of the second hidden layer for path 514. Neuron cluster 536 is fully connected to hidden layer three, neuron cluster 526. Neuron cluster 526 of hidden layer three is fully connected to output layer 518, which consists, in this application, of 13 neurons in cluster 538. Each neuron in cluster 538 holds an output from the neural network. Each output neuron corresponds to a different type of white cell class, and these classes are shown in the white cell class table. As an example of some output values in neuron cluster 538, for a single cell whose color characteristics (entire cell), nucleus color characteristics and miscellaneous cell feature values are processed through the neural network, the output layer may hold: SEG. 0; LYMPH 0.61; MONO. 0.30; EOSIN 0.1, etc. Step 356 in the classify white cell routine (Fig. 12) analyzes these results and either picks the maximum, which in the exemplary case is LYMPH, or may determine whether one or more of the output neuron values exceeds an output threshold value. For example, the output threshold value may be 0.60. In this case, the LYMPH white cell classification would be selected as the classification for that particular white cell. However, if two output neurons have a value of more than 0.60, the program would classify this cell as "non-classifiable". Similarly, if all the output neurons had a value less than 0.60, the white cell would be declared non-classifiable.
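The thresholding variant of step 356 described above can be sketched as follows (the class names and the 0.60 threshold follow the text's example; the dictionary interface is hypothetical):

```python
def classify(outputs, threshold=0.60):
    """Pick a class from the output neurons: the single class whose
    value exceeds `threshold` wins; zero or multiple classes above the
    threshold mean the cell is declared non-classifiable."""
    above = [name for name, value in outputs.items() if value > threshold]
    return above[0] if len(above) == 1 else "non-classifiable"

# The exemplary output values from the text: only LYMPH exceeds 0.60.
result = classify({"SEG": 0.0, "LYMPH": 0.61, "MONO": 0.30, "EOSIN": 0.1})
```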
A three path neural network was chosen because of the complexity and the number of different vectors involved in the cellular analysis. It is possible that a neural network having a single path would be appropriate depending upon the cell under study. Whole cell color vectors would preferably be input into such a single path network. Also, the number of hidden layers was expanded from 1 hidden layer to 3 hidden layers in order to increase the speed of processing as well as the speed of teaching the neural network to recognize traits of white blood cells. The inputs to the neural network are rotationally and positionally invariant because the color characteristics of the entire cell, the color characteristics of the cell nucleus, and the miscellaneous cell features are extracted by the vision system, and the vectors or values representing these characteristics are fed to the neural network. If the cell images were fed directly into the neural network, any change in the rotation or the position of the cell could affect the teaching speed of the network and the speed of processing once the network has been properly taught. Accordingly, the system is insensitive to rotational and positional aspects of the white cells.
In order to teach this network, a backward error propagation for a feed forward neural network was utilized. The following teaching set table was utilized to teach the working embodiment of the network.
TEACHING SET TABLE
Seg.: 100
In general, the backward error propagation formula is as follows: the change in weight is equal to the learning rate times the error times the output, plus the momentum times the previous change in weight. This is found in the book "Parallel Distributed Processing" by D.E. Rumelhart et al., 1986. The present network was taught using back propagation with the above listed training set. The learning rate was 0.2, the momentum was 0.6, the pattern error was 0.01 and the convergence time was 10. With the taught neural network, a 100 white cell differential can be found in about ten minutes. Essentially, the neural network is taught by feeding the network the teaching set and checking the output neurons. The programmer then reviews the cells classified and determines whether the output obtained is correct for each cell. If the output is erroneous, then the weights in the interneural connections are changed based upon the formula.
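The stated update rule can be written directly. The example call uses the text's working values of learning rate 0.2 and momentum 0.6; the error, output and previous-delta values are arbitrary illustration:

```python
def weight_update(learning_rate, error, output, momentum, prev_delta):
    """Backward-error-propagation update exactly as stated in the text:
    new change in weight = learning rate * error * output
                           + momentum * previous change in weight."""
    return learning_rate * error * output + momentum * prev_delta

# 0.2 * 0.5 * 1.0 + 0.6 * 0.05 = 0.13
delta = weight_update(0.2, 0.5, 1.0, 0.6, 0.05)
```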
Returning to Fig. 8B, the classification data and image data for each white cell are stored on a per cell block basis for the entire image frame window. Each single cell is classified, and the cell block color data, the classification of the cell and the location of the cell on the slide are stored. Step 288 repeats the extract cell, calculate features, classify white cell and store routines for all the cells within the frame window.
The program then jumps to jump point A-3 in Fig. 8C. The next few steps in the main program correspond to the general step 160 in Fig. 4 relating to calculating and analyzing other cell types. In Fig. 8C, step 610 masks out the white cells from the red, green and blue frame buffers for frame FrA. It should be noted that these next few steps could be conducted in parallel after the extract white cell and nucleus step 280 in Fig. 8B. Accordingly, it is not necessary that all the steps shown herein be done sequentially, since there are opportunities for parallel processing and multitasking. In step 612, the hue is calculated for the modified frame A. In step 614, the purple pixel blocks less than pixel block size X are counted. These purple pixel blocks of small size correspond to platelets P in the picture shown in Fig. 7. The platelets generally fall within the purple hues and are very small in relation to both the white cells W and the red blood cells R. In step 616, the small size purple pixel count is discounted or reduced for error, and the new value is assigned as the platelet count. According to standard blood smear analysis, a rough count of platelets is made to determine whether sufficient platelets are present in the blood. Accordingly, the platelet count is only an estimate. The discount for error in step 616 is necessary to eliminate noise in the video frame. In step 618, the classification and count of the red blood cells are made. Details of step 618 are found in Fig. 15, which is a flow chart of the red blood cell classification routine.
The red blood cell classification in Fig. 15 begins with step 710, which utilizes the frame A red, green and blue buffers, modified by masking out the white blood cells. The following red cell class table identifies and describes the 7 different types of red cells classified by the program.
RED CELL CLASS TABLE
a. crenated erythrocytes: star or spiked circles.
b. leptocytes: target cells (look like a bullseye).
c. drepanocytes: sickle cells (look like stringbeans).
d. poikilocytes with pointed and blunt features (look pear shaped, sometimes with double bulbs protruding from an irregular circle, blob-like appendages).
e. hypochromic erythrocytes (circular with red borders and white centers).
f. diffusely basophilic or polychromatophilic (look bluish compared to other red cells).
g. normal red cells (round and all red).
The red cell class table includes the medical name of the cell, the informal name of the cell and a brief description of what the cell looks like. All the cells except the diffusely basophilic (class f) cells are red due to the Wright stain in the blood smear. Step 712 calculates a hue frame based upon the modified frame A, that is, the frame wherein the white cells are masked out. Step 714 counts the purple pixel blocks greater than Y size; pixel blocks larger than Y size are chosen in order to eliminate any recognition of the smaller purple platelets in the video frame. A narrow color band threshold or filter is utilized in order to identify the diffusely basophilic or class f cells, since those cells are blue rather than red. In step 718, the class f cells are counted.
In step 720, the green buffer is utilized and a threshold is taken to create a binary image of the cells. The green buffer is used because red cells have the smallest amount of green values; accordingly, the red cells in the green buffer appear dark. The thresholding operates on the intensity histogram of the green buffer, since the morphology or shape of the remaining red blood cells is determined by the balance of the red blood cell classification subroutine, which is not sensitive to color differences. In step 722, the program identifies the red cell blocks by identifying pixel blocks larger than Y size and stores the binary image of the cell blocks in a compact red cell shape (RC-shape) frame. A special RC-shape frame is utilized in order to conduct morphological open and morphological close operations on the entire frame. These operations use a relatively large amount of processing time. Since the red blood cell shapes do not occupy the entire video frame under study, these shapes can be placed in a special RC-shape frame and the entire shape frame can be processed, thereby reducing intermediate or background noise and shortening the processing time. Decision step 724 determines whether the RC-shape frame is full. If it is not full, the no branch is taken and the program jumps from jump point A-5 to jump point A-5 in Fig. 8C, immediately after the classify and count red blood cells step 618. As is described in detail later, the general program then cycles through its processes by obtaining a new video frame. Eventually the RC-shape frame becomes full and the yes branch is taken from decision step 724 (Fig. 15). The next step, 726, conducts a morphological open operation on the RC-shape frame.
A determination is made in decision step 728 whether the cell shapes are round, that is, whether the cell shapes differ from round. Essentially, the morphological open operation is configured as a shape filter that separates the round shaped cells from those cells that are not round. The cells that are not round in the red cell class table are class a, star or spiked circles; class c, sickle cells (which look like string beans); and class d, which are pear-shaped, have double bulbs protruding from an irregular circle, or are blob-like with appendages. The other red blood cells are round. The non-round cells, that is, the cells that drop out after the morphological open operation, are identified on the yes branch, and a further decision is made in step 730 as to whether the cells are shaped as linear strings. If the cells are linear strings, a count of those cells is made and this count is classified as the class c, sickle cell, count in step 732. If the cells are not shaped as linear strings, the no branch is taken, and step 734 determines whether the cells have spiked edges by a spike edge detection routine. Decision step 736 then determines whether the cells are class a, that is, shaped as stars or spiked circles. The program counts the spiked circles in step 738. If the cells are class d, pointed with blunt features, these cells are counted in step 740.
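The non-round branch of this decision chain (steps 730 through 740) can be sketched as follows; the predicate names are hypothetical stand-ins for the shape-filter results:

```python
def classify_non_round(is_linear_string, has_spiked_edges):
    """Non-round branch of the Fig. 15 decision chain: string-like
    shapes are sickle cells (class c), spiked circles are crenated
    erythrocytes (class a), and anything else is counted as the
    pointed/blunt poikilocytes (class d)."""
    if is_linear_string:
        return "c"
    if has_spiked_edges:
        return "a"
    return "d"
```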
If the no branch is taken from the cell difference decision block 728, that is, the red cell blocks in the RC-shape frame are round, step 760 conducts a morphological close operation on the remaining red cell shapes. A morphological close operation determines whether the red cell shapes are solid in color, indicating a normal red cell, or whether the cells have a white center or look like a bullseye, that is, with a plurality of inner rings. Decision step 762 separates the open red cell shapes from the closed red cell shapes. The no branch from decision step 762 identifies the number of normal red blood cells, class g, in step 764. If the yes branch is taken from decision step 762, cell class b, target cells, and cell class e, red cells with red borders and white centers, are passed to decision step 766, which determines whether the centers of the cells under study are white. If the center of a cell is white, as determined by an appropriate shape identification routine, the yes branch is taken and, in step 768, a count is taken of class e cells. If the no branch is taken, target cells are identified as class b cells in step 770. All of these counting steps lead to step 772, which stores the non-classified cell count by counting any remaining pixel blocks in the RC-shape frame that have not dropped out through the decision chain. After the red blood cell classification routine ends, the main program picks up in Fig. 8C with step 620, which discards the center points identified in the gross location identification for the white blood cells within the high power target window of frame A. In step 622, the microscope is moved to the next center point on the list beyond the previously identified center points. This location identification was made in step 256 in Fig. 8A. Decision step 624 then determines whether the list is done. If the list is not done, the program jumps to jump point A-4 in Fig. 8B, immediately preceding the extract white cell and nucleus step 280. Alternatively, jump point A-4 may be advanced to a point immediately preceding the conduct focus routine 276 if it is found that, due to the movement of the microscope, a new focus must be obtained after the scope moves.
If decision step 624 determines that the list is done, the program takes the yes branch, and step 626 displays all non-classifiable white cell blocks to the operator for assistance in classifying those cells. In step 628, the operator inputs data regarding the non-classified white blood cells. In step 630, the system displays data regarding all the white cells classified, the red blood cell count, the abnormal red blood cell count and the platelet count. In step 632, the data for the slide are stored: the white cell block color data, classification data, red cell count data, both normal and abnormal, the name of the operator and the results of the entire test.
The claims appended hereto are meant to cover modifications and changes within the scope and spirit of the present invention.
What is claimed is:

Claims

1. A system for microscopic analysis and classification of a plurality of stained cells comprising: means for obtaining color image signals representative of a microscopic image of said stained cells; means for identifying a single cell within said plurality of stained cells by locating signals falling within a predetermined color band; means for obtaining vectors representing at least one set of color characteristics of said single cell from the identified cell signals; a neural network means for classifying said single cell based upon said vectors.
2. A system as claimed in claim 1 including means for identifying a nucleus within said single cell by locating signals exceeding a predetermined chromaticness threshold within said color band, wherein said means for obtaining vectors obtains cell vectors and includes means for obtaining nucleus vectors representing at least one set of color characteristics of said nucleus from the identified nucleus signals, and wherein said neural network means classifies said cell based upon said cell vectors and said nucleus vectors.
3. A system as claimed in claim 1 including means for identifying a nucleus within said single cell by locating signals exceeding a predetermined chromaticness threshold within said color band, wherein said means for obtaining vectors obtains cell vectors and includes means for obtaining nucleus vectors representing at least one texture characteristic of said nucleus from the identified nucleus signals, and wherein said neural network means classifies said cell based upon said cell vectors and said nucleus vectors.
4. A system as claimed in claim 2 wherein said neural network means includes at least a dual path, feed forward network of neurons that converge towards an output neural layer, wherein one path receives said cell vectors and the other path receives said nucleus vectors.
5. A system as claimed in claim 2 including means for identifying cellular size feature data from said identified cell signals, and wherein said neural network means classifies said cell based upon said cell vectors, said nucleus vectors and said cell feature data.
6. A system as claimed in claim 5 wherein said neural network means includes a three path, feed forward network of neurons that converge towards an output neural layer, wherein a first path receives said cell vectors and a second path receives said nucleus vectors, and a third path receives said cell feature data.
7. A system as claimed in claim 6 wherein said means for identifying said single cell includes a hue filter to locate signals falling within said color band.
8. A system as claimed in claim 7 wherein said means for identifying said nucleus includes a saturation filter for locating signals exceeding said chromaticness threshold.
9. A system as claimed in claim 8 wherein each path of said network of neurons is fully connected.
10. A system as claimed in claim 9 wherein said first and second path within said network of neurons includes at least two layers of neuron groups.
11. A system as claimed in claim 1 wherein said one set of color characteristics represents one of a hue, a saturation and an intensity histogram.
12. A system as claimed in claim 2 wherein said one set of color characteristics represents one of a hue, a saturation and an intensity histogram.
13. A system as claimed in claim 2 wherein said one set of color characteristics of said single cell represents a hue histogram and said one set of color characteristics of said nucleus represents a saturation histogram.
14. A system as claimed in claim 1 including means for identifying cellular size feature data from said identified cell signals, and wherein said neural network means classifies said cell based upon said cell vectors and said cell feature data.
15. A system as claimed in claim 14 wherein said neural network means includes a two path, feed forward network of neurons that converge towards an output neural layer, wherein a first path receives said cell vectors and a second path receives said cell feature data.
16. A system for microscopic analysis and classification of a plurality of stained cells on a slide under a microscope comprising: means for obtaining color image signals representative of a microscopic image of said stained cells via said microscope; means for scanning said slide and finding a feather edge of said stained cells on said slide based upon said color image signals; a low power cell position identifier that locates a multiplicity of single cells of a predetermined type near said feather edge; means for positioning said microscope adjacent a first one of said multiplicity of located single cells; means for isolating, under a high power, color image signals representing single cells of said first type of cell with a color filter about a predetermined color band; means for identifying a nucleus within said color image signals representative of said single cells with a further color filter; means for obtaining cell and nucleus vectors representing at least one set of color characteristics of said single cell from the color image signals representative of said single cells and the respective nucleus therein; a neural network means for classifying each said single cell based upon said vectors.
17. A system for microscopic analysis and classification of a plurality of stained cells comprising: a camera generating color image signals representative of a microscopic image of said stained cells; a color signal filter passing color image signals falling within a predetermined color band about a predetermined stain cell color wherein the passed color image signals represent single cells of said plurality of cells; a vector generator converting the passed color image signals into color vectors unique to each said single cell; and a neural network operating on said unique color vectors, said neural network having an output layer classifying said single cells.
18. A system for microscopic analysis and classification of a plurality of stained cells comprising: means for obtaining color image signals representative of a microscopic image of said stained cells; means for identifying a single cell within said plurality of stained cells by locating signals falling within a predetermined color band; means for obtaining positionally and rotationally invariant data on said single cell from the identified cell signals; and, a neural network means for classifying said single cell based upon said positionally and rotationally invariant data.
19. A system as claimed in claim 18 wherein said means for obtaining rotationally and positionally invariant data generates the cell data utilizing color filters and shape filters.
20. A system as claimed in claim 1 wherein said means for identifying includes means for grossly identifying single cell positions and means for extracting cellular color image signals; said means for obtaining vectors operating on said cellular color image signals.
21. A system as claimed in claim 20 wherein said means for grossly identifying utilizes a color intensity filter.
22. A system as claimed in claim 21 wherein said means for extracting utilizes a hue filter and a shape filter.
23. A system as claimed in claim 2 wherein said means for identifying includes means for grossly identifying single cell positions and means for extracting cellular color image signals; said means for obtaining vectors operating on said cellular color image signals.
24. A system as claimed in claim 23 wherein said means for grossly identifying utilizes a color intensity filter.
25. A system as claimed in claim 24 wherein said means for extracting utilizes a hue filter and a shape filter.
26. A system as claimed in claim 2 wherein said means for identifying a nucleus utilizes a saturation filter and a shape filter.
27. A method for classifying a plurality of stained cells comprising the steps of: obtaining a color image of said stained cells under a microscope at a first magnification; locating a feather edge of said plurality of stained cells; scanning said edge and grossly identifying single cell positions; increasing the magnification of said microscope substantially beyond said first magnification and obtaining a further color image of said stained cells about said cell positions; deriving at least one of color, shape and size data from said further color image for single cells at said cell positions; providing a neural network; classifying said single cells with said neural network utilizing the single cell data.
28. A method as claimed in claim 27 wherein the step of locating said feather edge determines whether an object to background intensity ratio of said color image exceeds a threshold.
29. A method as claimed in claim 28 wherein the step of grossly identifying single cell positions determines whether predetermined color data from said color image exceeds a threshold.
30. A method as claimed in claim 29 wherein said thresholding color data step filters said color image about a predetermined color band.
31. A method as claimed in claim 27 wherein the step of grossly identifying single cell positions filters said color image by cell shape.
32. A method as claimed in claim 29 wherein the step of grossly identifying single cells filters the thresholded color data by cell shape.
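Claims 31 and 32 filter candidate objects by cell shape. A common shape measure for this purpose is circularity, 4πA/P², which is 1.0 for a perfect circle and smaller for ragged clumps or debris. The patent does not name a specific shape metric, so this sketch, with an illustrative threshold, is only one plausible reading:

```python
import math

def circularity(area, perimeter):
    """4*pi*A/P^2: 1.0 for a perfect circle, smaller for ragged shapes."""
    return 4 * math.pi * area / (perimeter ** 2)

def shape_filter(candidates, min_circularity):
    """Keep only candidate objects round enough to be single cells.
    The 0.8 cutoff used below is illustrative, not from the patent."""
    return [c for c in candidates
            if circularity(c["area"], c["perimeter"]) >= min_circularity]

candidates = [{"area": 314, "perimeter": 63},   # near-circular: a single cell
              {"area": 100, "perimeter": 80}]   # ragged: clump or debris
print(shape_filter(candidates, 0.8))
```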
33. A method as claimed in claim 27 including the step of returning to the first grossly identified cell position prior to the step of increasing magnification.
34. A method as claimed in claim 27 including the step of storing further color images of single cells and optionally displaying said further color images of said single cells subsequent to classifying said single cells.
35. A method for classifying stained cells from microscopic color images of said stained cells comprising the steps of: identifying single cell color images from said microscopic color images; extracting at least one of color, shape and size data from said single cell color images; providing a trained neural network; and, classifying the single cells based upon the single cell data with said neural network.
36. A method as claimed in claim 35 including the step of storing said single cell color images for later display.
37. A method as claimed in claim 36 wherein the step of identifying includes identification of a first type of single cells to be classified by said neural network and identification of a second type of single cells, the method further comprising the steps of: filtering out said first type of single cells from said microscopic color images; and, classifying said second type of single cells by filtering cell shapes, sizes and color.
38. A method as claimed in claim 37 including the step of counting the cells filtered by shapes, size and color.
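Claims 35–38 extract color, shape and size data from single-cell images and classify the cells with a trained neural network. As a minimal sketch of that pipeline: compute a tiny feature vector (area and a crude perimeter) from a binary cell mask, then score it with a single sigmoid unit. The hand-set weights stand in for the trained network; in the claimed system they would come from training on labelled cells, and the real feature set would be far richer.

```python
import math

def cell_features(mask):
    """Small stand-ins for the claimed size and shape data: area (pixel
    count) and a perimeter estimate from exposed 4-neighbour pixel sides."""
    height, width = len(mask), len(mask[0])
    area = sum(v for row in mask for v in row)
    perimeter = 0
    for y in range(height):
        for x in range(width):
            if mask[y][x]:
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < height and 0 <= nx < width) or not mask[ny][nx]:
                        perimeter += 1
    return [area, perimeter]

def classify(features, weights, bias):
    """One sigmoid unit standing in for the neural network: returns a
    score in (0, 1) interpretable as class membership confidence."""
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

mask = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]          # a one-pixel "cell"
features = cell_features(mask)
print(features)              # area and perimeter of the toy mask
print(classify(features, weights=[0.5, -0.25], bias=0.0))
```

The counting step of claim 38 would then simply tally the candidates that survive the shape, size and color filters.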
PCT/US1991/004410 1990-06-21 1991-06-21 Cellular analysis utilizing video processing and neural network WO1991020048A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US54170590A 1990-06-21 1990-06-21
US541,705 1990-06-21

Publications (1)

Publication Number Publication Date
WO1991020048A1 true WO1991020048A1 (en) 1991-12-26

Family

ID=24160696

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1991/004410 WO1991020048A1 (en) 1990-06-21 1991-06-21 Cellular analysis utilizing video processing and neural network

Country Status (2)

Country Link
AU (1) AU8229591A (en)
WO (1) WO1991020048A1 (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4129854A (en) * 1976-10-25 1978-12-12 Hitachi, Ltd. Cell classification method
US4965725A (en) * 1988-04-08 1990-10-23 Neuromedical Systems, Inc. Neural network based automated cytological specimen classification system and method
US4965725B1 (en) * 1988-04-08 1996-05-07 Neuromedical Systems Inc Neural network based automated cytological specimen classification system and method

Non-Patent Citations (3)

Title
IEEE ASSP MAGAZINE, April 1987, LIPPMAN, "An Introduction to Computing with Neural Nets", pages 4-22. *
NEURAL NETWORKS, Vol. 1, No. 1, 1988, DAYHOFF et al., "Segmentation of True Color Microscopic Images Using a Back Propagating Neural Network", page 169. *
Report No. CONF-871175L: RADIOLOGICAL SOCIETY OF NORTH AMERICA INC., 29 November 1987, OLDHAM et al., "Neural Recognition of Mammographic Lesions", page 318. *

Cited By (14)

Publication number Priority date Publication date Assignee Title
WO1997011350A2 (en) * 1995-09-19 1997-03-27 Morphometrix Technologies Inc. A neural network assisted multi-spectral segmentation system
WO1997011350A3 (en) * 1995-09-19 1997-05-22 Morphometrix Techn Inc A neural network assisted multi-spectral segmentation system
AU726049B2 (en) * 1995-09-19 2000-10-26 Veracel Inc. A neural network assisted multi-spectral segmentation system
US6463425B2 (en) 1995-09-19 2002-10-08 Morphometrix Technologies Inc. Neural network assisted multi-spectral segmentation system
WO1997041416A1 (en) * 1996-04-27 1997-11-06 Boehringer Mannheim Gmbh Automated, microscope-assisted examination process of tissue or bodily fluid samples
US6246785B1 (en) 1996-04-27 2001-06-12 Roche Diagnostics Gmbh Automated, microscope-assisted examination process of tissue or bodily fluid samples
WO2013037119A1 (en) * 2011-09-16 2013-03-21 长沙高新技术产业开发区爱威科技实业有限公司 Device and method for erythrocyte morphology analysis
US9170256B2 (en) 2011-09-16 2015-10-27 Ave Science & Technology Co., Ltd Device and method for erythrocyte morphology analysis
EP3598194A1 (en) * 2018-07-20 2020-01-22 Olympus Soft Imaging Solutions GmbH Method for microscopic assessment
EP3598195A1 (en) 2018-07-20 2020-01-22 Olympus Soft Imaging Solutions GmbH Method for microscopic assessment
CN110806636A (en) * 2018-07-20 2020-02-18 奥林巴斯软成像解决方案公司 Microscopic analysis method
US11199689B2 (en) 2018-07-20 2021-12-14 Olympus Soft Imaging Solutions Gmbh Method for microscopic analysis
CN110806636B (en) * 2018-07-20 2024-01-23 奥林巴斯软成像解决方案公司 Microscopic analysis method
EP3608701A1 (en) * 2018-08-09 2020-02-12 Olympus Soft Imaging Solutions GmbH Method for providing at least one evaluation method for samples

Also Published As

Publication number Publication date
AU8229591A (en) 1992-01-07

Similar Documents

Publication Publication Date Title
US5933519A (en) Cytological slide scoring apparatus
EP1977371B1 (en) Method and system for identifying illumination fields in an image
US5764792A (en) Method and apparatus for processing images
Singhal et al. Local binary pattern for automatic detection of acute lymphoblastic leukemia
EP0336608B1 (en) Neural network based automated cytological specimen classification system and method
CA2228062C (en) Robustness of classification measurement
Shirazi et al. Efficient leukocyte segmentation and recognition in peripheral blood image
US6330350B1 (en) Method and apparatus for automatically recognizing blood cells
Gautam et al. Classification of white blood cells based on morphological features
WO1996009606A1 (en) Field prioritization apparatus and method
WO1992013308A1 (en) Morphological classification system and method
CN105320970B (en) A kind of potato disease diagnostic device, diagnostic system and diagnostic method
Sobrevilla et al. White blood cell detection in bone marrow images
Kumari et al. Performance analysis of support vector machine in defective and non defective mangoes classification
WO1991020048A1 (en) Cellular analysis utilizing video processing and neural network
GB2329014A (en) Automated identification of tubercle bacilli
Sabino et al. Toward leukocyte recognition using morphometry, texture and color
Francis et al. Screening of bone marrow slide images for leukemia using multilayer perceptron (MLP)
CN114037868B (en) Image recognition model generation method and device
Mui et al. Automated classification of blood cell neutrophils.
GB2305723A (en) Cytological specimen analysis system
WO2020120039A1 (en) Classification of cell nuclei
Gunasinghe et al. Domain generalisation for glaucoma detection in retinal images from unseen fundus cameras
Shelke et al. Diabetic Retinopathy Detection Using SVM
Vejjanugraha et al. An automatic screening method for primary open-angle glaucoma assessment using binary and multi-class support vector machines

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BB BG BR CA FI HU JP KP KR LK MC MG MW NO RO SD SU

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE BF BJ CF CG CH CI CM DE DK ES FR GA GB GN GR IT LU ML MR NL SE SN TD TG

NENP Non-entry into the national phase

Ref country code: CA

122 Ep: pct application non-entry in european phase