WO2000004497A1 - Automatic masking of objects in images - Google Patents

Automatic masking of objects in images

Info

Publication number
WO2000004497A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
intensity
threshold
pin
background
Prior art date
Application number
PCT/US1999/015796
Other languages
French (fr)
Inventor
Bruce E. DeSimas II
Jeff A. Levi
Original Assignee
The Perkin-Elmer Corporation Pe Biosystems Division
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Perkin-Elmer Corporation Pe Biosystems Division filed Critical The Perkin-Elmer Corporation Pe Biosystems Division
Priority to AU49898/99A priority Critical patent/AU754884B2/en
Priority to AT99933959T priority patent/ATE217995T1/en
Priority to JP2000560543A priority patent/JP2002520746A/en
Priority to CA002336885A priority patent/CA2336885A1/en
Priority to DE69901565T priority patent/DE69901565T2/en
Priority to EP99933959A priority patent/EP1095357B1/en
Publication of WO2000004497A1 publication Critical patent/WO2000004497A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10064Fluorescence image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20016Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30072Microarray; Biochip, DNA array; Well plate

Definitions

  • This invention relates to digital imaging, particularly boundary determination of objects in images, and more particularly such determination in optical analytical instruments.
  • light is received from a plurality of samples to effect separate image areas associated with the samples.
  • the light may be transmitted through the samples, or may be emitted by the samples such as with fluorescence or chemiluminescence samples.
  • An analysis is performed for each of the image areas.
  • chemiluminescence light is emitted via a chemical reaction and may be detected with a luminometer.
  • fluorescence for example, in an optical instrument for monitoring polymerase chain reaction production of DNA in a reaction apparatus, fluorescing dye is attached to double stranded DNA so as to produce fluorescence when excited with high energy light.
  • fluorescing dye is attached to double stranded DNA so as to produce fluorescence when excited with high energy light.
  • AOI area of interest
  • the intensities of the objects correspond to the concentration of DNA or other sample material, and the intensities are processed by computer to provide such information.
  • computer programming is needed to effect adaptive masking to accurately identify object areas from the image data.
  • acquisition includes sampling, digitization, storage and (optionally) compression.
  • Manipulation includes enhancement (if required), segmentation and morphology.
  • Extraction involves object representation and/or quantitation for the desired purpose.
  • the present invention is particularly directed to segmentation which preliminarily defines object edges from grey-scale images, and extraction of the objects for higher level quantitation.
  • a histogram is used to analyze intensities.
  • a "histogram” has ordinary meaning herein as a plot (or stored data equivalent) of frequency of each intensity vs. intensity for an image or section thereof.
  • a threshold intensity in a histogram separates the intensity clusters of objects and background. Once thresholds for each pixel are determined, object pixels can be identified and preliminary edges of the objects detected, subject to refinement, such as by morphological operations.
  • Thresholding is reviewed, for example, in an article "A Survey of Thresholding Techniques" by P. Sahoo, S. Soltani and A. Wong, Computer Vision, Graphics, and Image Processing, 41, 233-260 (1988).
  • In one method, histogram data is fitted to curves such as a sum of two Gaussian curves (one for each cluster in a "bimodal" histogram), and a threshold intensity is determined statistically to minimize overlap areas.
  • Another method for computing threshold is iterative selection taught in an article "Picture Thresholding Using an Iterative Selection Method" by T.W. Ridler and S. Calvard, IEEE Transactions on Systems, Man and Cybernetics, Vol. SMC-8, No. 8, 1978.
  • Yet another method is taught in an article "A Threshold Selection Method from Gray-Level Histograms" by N. Otsu, IEEE Transactions on Systems, Man and Cybernetics, SMC-9, No. 1, Jan. 1979. This involves a criterion function that is selected to maximize the separation of the two pixel classes to give the best threshold. The function is derived from discriminant analysis.
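As a concrete illustration of Otsu's criterion, the following is a minimal sketch (not the patent's implementation; the function name and histogram representation are assumptions for illustration). It selects the threshold that maximizes the between-class variance of the two pixel classes in an 8-bit grey-level histogram:

```cpp
#include <vector>

// Sketch of Otsu's criterion: choose the threshold t that maximizes the
// between-class variance of the classes [0..t] and [t+1..255] of an
// 8-bit grey-level histogram.
int otsuThreshold(const std::vector<long>& hist) {
    long total = 0;
    double sumAll = 0.0;
    for (int i = 0; i < 256; ++i) { total += hist[i]; sumAll += double(i) * hist[i]; }
    long weightB = 0;        // pixels at or below t (background class)
    double sumB = 0.0;       // intensity sum of the background class
    double bestVar = -1.0;
    int bestT = 0;
    for (int t = 0; t < 256; ++t) {
        weightB += hist[t];
        if (weightB == 0) continue;            // no background class yet
        long weightF = total - weightB;        // foreground (object) class
        if (weightF == 0) break;               // no foreground class left
        sumB += double(t) * hist[t];
        double meanB = sumB / weightB;
        double meanF = (sumAll - sumB) / weightF;
        double between = double(weightB) * double(weightF)
                       * (meanB - meanF) * (meanB - meanF);
        if (between > bestVar) { bestVar = between; bestT = t; }
    }
    return bestT;
}
```

For a well-separated bimodal histogram the returned threshold falls between the two clusters, which is exactly the behavior the block-by-block thresholding below relies on.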
  • the foregoing thresholding methods have been applied to entire images, to determine a "global threshold" which is adequate when all object and background pixels have well separated grey levels.
  • the global threshold may give poor results by returning too many erroneous object pixels in some regions and missing object pixels in others.
  • Another form of segmentation is edge detection.
  • Several techniques have been used for identifying object boundaries or edges as taught, for example, in an article "A Survey of Edge Detection Techniques" by L. Davis, Computer Graphics and Image Processing, 4, 248-270 (1975).
  • a significant problem is pixel noise in the image, making the edges irregular and blurred.
  • Still another technique is region growing, in which an initial "seed" pixel is selected and subsequent pixels are added to the region as they qualify.
  • a variation is to use “split and merge” where the image is successively split until each region is homogeneous. The image sub-regions are then merged if similar characteristics exist.
  • noise is a problem.
  • In region growing, extensive computations are used, which impacts performance in both running time and computer memory.
  • a common method for image processing used after segmentation is morphological operations.
  • Morphological operations generally use a structuring system to define the object shape.
  • One such operation is dilation which grows or dilates the target object, and another is erosion which removes pixels from the object. These may be combined in either “opening” which is performed by eroding followed by dilation, or “closing” which is dilation followed by erosion.
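The dilation, erosion and opening operations just described can be sketched on a binary grid; this is a minimal illustration assuming a 3x3 square structuring element (the function names and grid type are illustrative, not the patent's):

```cpp
#include <vector>

using Grid = std::vector<std::vector<int>>;

// Dilation: a pixel becomes object (1) if any 3x3 neighbor is object.
Grid dilate(const Grid& g) {
    int H = (int)g.size(), W = (int)g[0].size();
    Grid out(H, std::vector<int>(W, 0));
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            for (int dy = -1; dy <= 1 && !out[y][x]; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int ny = y + dy, nx = x + dx;
                    if (ny >= 0 && ny < H && nx >= 0 && nx < W && g[ny][nx]) {
                        out[y][x] = 1; break;
                    }
                }
    return out;
}

// Erosion: a pixel stays object only if its whole 3x3 neighborhood is object.
Grid erode(const Grid& g) {
    int H = (int)g.size(), W = (int)g[0].size();
    Grid out(H, std::vector<int>(W, 1));
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            for (int dy = -1; dy <= 1 && out[y][x]; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int ny = y + dy, nx = x + dx;
                    if (ny < 0 || ny >= H || nx < 0 || nx >= W || !g[ny][nx]) {
                        out[y][x] = 0; break;
                    }
                }
    return out;
}

// "Opening" = erosion followed by dilation; removes small speckle objects.
Grid opening(const Grid& g) { return dilate(erode(g)); }
```

Opening a solid block leaves it intact while isolated noise pixels are eliminated, which is why such operations are useful for refining preliminary object edges.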
  • An objective of the invention is to provide a novel method and a novel means for establishing boundaries of objects in a stored digital image associated with pixels that define an image area.
  • a particular objective is to provide such a method and a means for establishing boundaries automatically.
  • Another objective is to provide such a method and a means for establishing boundaries in an optical instrument for analyzing a plurality of samples such as by fluorescence or chemiluminescence.
  • Total intensity for a section of pixels is considered to be bimodal as representative of both background intensity and object intensity, or unimodal as representative of only background intensity or object intensity.
  • the image area is divided into nonoverlapping blocks each formed of a plurality of pixels. Identification is made as to whether each block has unimodal intensity or bimodal intensity.
  • a threshold is computed for each block having bimodal intensity, such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity.
  • a block threshold is designated for each block, the block threshold for bimodal intensity being the computed threshold, and for unimodal intensity being an assigned threshold derived from neighboring block thresholds.
  • a pixel threshold is derived for each pixel from block thresholds, preferably by interpolating a pixel threshold for each pixel from block thresholds. From the pixel threshold, each pixel is identified as having either an object intensity or a background intensity.
  • a first value is assigned to each pixel having an object intensity, and a second value to each pixel having a background intensity, thereby converting image objects to binary objects. The binary objects are delineated to thereby establish the boundaries of the image objects.
  • the image area has bimodal intensity
  • the step of dividing the image area comprises n-fold (advantageously 4-fold) subdividing the image area into non-overlapping n-fold sections, and successively n-fold subdividing such sections a selected number of times until final sections define the blocks.
  • Each set of n-fold sections is considered to be a set of child sections of a parent that itself is a section or is the image area.
  • the step of designating an assigned threshold comprises identifying whether each section has unimodal intensity or bimodal intensity, computing thresholds for the image area and each section having bimodal intensity, and designating each computed threshold as a section threshold.
  • the step of designating an assigned threshold further comprises, for each section having unimodal intensity in a set of n-fold sections, designating an average of the computed thresholds for other sections in the set if said other sections have bimodal intensity, and otherwise designating the threshold intensity of the parent of the set, the section thresholds being designated successively through successive subdividings until a block threshold is designated for each block.
  • a search entity is defined mathematically as a combination of a pin, a rod pivotable on the pin, and a wheel attached to the rod approximately one pixel space from the pin.
  • the pin is constrained to move among object pixels
  • the wheel is constrained to move among background pixels.
  • the step of delineating the binary objects comprises, in sequence, steps of searching, alternately moving the wheel and the pin and, if necessary, skipping. Searching is effected in a first-hand direction from a corner of the image area (e.g. right to left from the upper right corner) until an initial object pixel is located.
  • the pin is placed on the initial pixel with the wheel in an adjacent background pixel.
  • the wheel is moved in a second-hand direction (e.g. clockwise) opposite from the first-hand direction until the wheel is constrained by an object pixel.
  • the pin is moved in the first-hand direction (e.g. counterclockwise) to a next object pixel and, unless constrained, the pin is further moved until the pin is constrained by a background pixel.
  • The foregoing objects are achieved by an instrument receptive of light from a plurality of samples to effect separate image objects for analysis thereof.
  • the instrument includes a processor for effecting such analysis, with the processor comprising means for effecting the foregoing steps.
  • Objects are further achieved by a computer readable storage medium for utilization with an instrument receptive of light from a plurality of samples to effect separate image objects for analysis thereof.
  • the storage medium contains program code embodied therein so as to be readable by the processor, the program code including means for effecting the foregoing steps.
  • FIG. 1 is a schematic drawing of an instrument incorporating the invention.
  • FIG. 2 is an illustration of a computer screen display of an image area with objects corresponding to a plurality of light emitting samples with respect to the instrument of FIG. 1.
  • FIG. 3 is an idealized histogram of frequency vs. intensity for a section of an image area such as illustrated in FIG. 2, illustrating a threshold between background and object intensities.
  • FIG. 4 is a schematic drawing of 4-fold subdividing of an image area such as in FIG. 2 into blocks.
  • FIG. 5 is a flow chart for determination of block thresholds and pixel thresholds utilizing a processor of the instrument of FIG. 1 and histograms such as in FIG. 3.
  • FIG. 6 is an illustration of a computer screen display of an image area showing block thresholds determined according to the flow chart of FIG. 5 for blocks effected according to the subdividing of FIG. 4.
  • FIG. 7 is a schematic drawing for a procedure of deriving pixel thresholds from block thresholds in an image area according to the flow chart of FIG. 5.
  • FIG. 8 is an illustration of a computer screen display of an image area showing pixel thresholds interpolated from the block thresholds of FIG. 6.
  • FIG. 9 is a flow chart for determination of object boundaries from pixel intensities and the pixel thresholds determined according to the flow chart of FIG. 5.
  • FIG. 10 is a flow chart for a delineation procedure of the flow chart of FIG. 9.
  • FIG. 11 is a schematic drawing of a portion of an image area illustrating a "pin-and-wheel” procedure utilized in the flow chart of FIG. 10.
  • An apparatus 10 and method of the invention are generally useful for establishing boundaries of objects in a stored digital image associated with pixels that define an image area.
  • the images are generally of the type that are acquired by a video camera 12 which has a lens 14 to focus an image onto an array detector 16.
  • the detector may be, for example, a charge injection device (CID) or, preferably, a charge coupled device (CCD).
  • CID charge injection device
  • CCD charge coupled device
  • a conventional video camera containing a CCD detector, a lens and associated electronics for the detector should be suitable, such as an Electrim model 1000L which has 751 active pixels horizontal and 242 (non-interlaced) active pixels vertical.
  • This camera includes a circuit board that directly interfaces to a computer ISA bus.
  • Analog/digital (A/D) interfacing 17 and framegrabber circuitry are in this circuit but may be provided separately.
  • any other digital imaging device or subsystem may be used or adapted that is capable of taking still or freeze- frame images that are stored in memory 18, e.g. in a linear representation, for processing by a processing unit (CPU) 20 in a computer 22 and display of a processed image or associated information on a monitor 24.
  • the raw image typically is stored in a linear representation where scan lines (rows) of pixels are held sequentially in memory.
  • Image intensity for each pixel typically is stored as a byte or word of 8 or 16 bits respectively.
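The linear (row-major) layout just described addresses pixel (x, y) at byte offset y * width + x; a minimal accessor assuming 8-bit pixels (the function name is an illustrative assumption):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Read pixel (x, y) from a raw image buffer stored as sequential scan
// lines (rows) of 8-bit intensities, as described above.
uint8_t pixelAt(const std::vector<uint8_t>& buf, int width, int x, int y) {
    return buf[static_cast<std::size_t>(y) * width + x];
}
```

For 16-bit images the same offset arithmetic applies with a `uint16_t` buffer.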
  • the invention is particularly advantageous with an optical analytical instrument 26 wherein light 28 is received from a plurality of samples 30 to effect separate image areas associated with the samples.
  • the DNA reaction apparatus has vials 32 as sample holders containing light-emitting sample material that effect a pattern of well defined regions 34 in the image area 36 (FIG. 2) in a low intensity background 37.
  • the intensity in each region is processed to provide information on the samples such as concentration of DNA.
  • AOI areas of interest
  • the present invention utilizing computer programming is useful for automatic adaptive masking to accurately identify object areas from the image data, in which the shapes may be irregular and the shapes and background may have non-uniform illumination.
  • the invention also should be more broadly useful for outlining objects in any image that may include scenery or the like.
  • the computer programming is conventional such as with C++ language. Adaptations of the programming for the present invention from flow charts and descriptions herein will readily be recognized and achieved by those skilled in the art.
  • the flow charts represent means and steps for carrying out various aspects of the invention. Details of the computer and the programming are not important to the invention, and any conventional or other desired computer and programming components that would serve the purposes described herein should be deemed equivalent and interchangeable.
  • a conventional or other desired graphics program such as Image Pro Plus™ from Media Cybernetics may be incorporated to display the image if desired.
  • A suitable image file format is TIFF (Tagged Image File Format), Adobe version 6.0.
  • a suitable development software package is Microsoft Visual C++ Developer Studio. Grey scale 8-bit and 16-bit programming should be used without compression (or with lossless compression), as a present goal is to identify object shape based on intensity and contrast.
  • Each pixel has an associated intensity (brightness, represented digitally) corresponding to background or an object in the image. There also may be fluctuations due to noise.
  • a group or section of pixels may be bimodal as representative of both background intensity and object intensity, or unimodal as representative of only background intensity or object intensity.
  • Bimodal intensity of a region may be seen in a grey-scale histogram 38 (FIG. 3) of frequency (number of pixels with a given intensity) against intensity. This histogram essentially is a combination of two bell shaped curves, one for background 40 and the other for an object 42.
  • a threshold 44 is computed from the pixels in a selected region of the image, using a histogram, for use in classifying the intensity for each pixel in the region as either object or background. Any of a variety of methods of computing the threshold may be used.
  • One conventional method is minimizing error of classification which assumes two Gaussian distributions.
  • a simplified form of this, useful for the present case, is to compute the half-way point between the two peaks using peak analysis. After smoothing, all peaks are found in the histogram. Then a two-stage hill-climbing method is used to identify the two major peaks corresponding to the "background" and "object" distributions.
  • the background peak is identified as the highest peak at low intensity, as found by a search of peaks from left to right.
  • the object peak is identified as the highest peak at high intensity, as found by a search of peaks from right to left.
  • an initial histogram size may be 10 bits, with additional bits added if the raw histogram does not contain at least 64 bins.
  • a test for bimodality should be performed, where two different peaks must be found and the valley must be lower than both peaks.
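The peak-analysis and bimodality test above can be sketched as follows. This is a simplified stand-in for the two-stage hill-climbing: it takes the two highest local peaks of an already-smoothed histogram rather than performing the directional searches, requires the valley between them to be lower than both, and returns the half-way threshold (or -1 when the test fails); all names are illustrative assumptions:

```cpp
#include <algorithm>
#include <vector>

// Simplified bimodality test on a smoothed histogram: locate the two
// major peaks, require a valley lower than both, and return the
// half-way point between the peaks as the threshold (-1 if unimodal).
int bimodalThreshold(const std::vector<long>& hist) {
    int n = (int)hist.size();
    std::vector<int> peaks;   // strict local maxima
    for (int i = 1; i + 1 < n; ++i)
        if (hist[i] > hist[i - 1] && hist[i] > hist[i + 1]) peaks.push_back(i);
    if (peaks.size() < 2) return -1;          // not bimodal
    // Two major peaks by height; lower intensity = background, higher = object.
    int p1 = peaks[0], p2 = -1;
    for (int p : peaks) if (hist[p] > hist[p1]) p1 = p;
    for (int p : peaks) if (p != p1 && (p2 < 0 || hist[p] > hist[p2])) p2 = p;
    int bg = std::min(p1, p2), obj = std::max(p1, p2);
    // The valley between the peaks must be lower than both peaks.
    long valley = hist[bg];
    for (int i = bg; i <= obj; ++i) valley = std::min(valley, hist[i]);
    if (valley >= hist[bg] || valley >= hist[obj]) return -1;
    return (bg + obj) / 2;                    // half-way point between peaks
}
```

A block whose histogram fails this test is treated as unimodal and receives its threshold from siblings or its parent, as described below.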
  • Another suitable method for computing the threshold is iterative selection taught in the aforementioned article by T.W. Ridler and S. Calvard. A preferred method is taught in the aforementioned article by N. Otsu, incorporated herein by reference in its entirety.
  • Unimodal intensity represents either background or an object, and "threshold" is zero representing "invalid". This will exist for a region small enough to encompass only an object or area of interest in a sample pattern, or a background area between objects or along the edge of the image. Procedures for dealing with this are described below.
  • the global (full) image area is sub-divided in a sequence of steps, advantageously utilizing quad-tree procedures (FIG. 4).
  • the image area 36 is fourfold subdivided into non-overlapping, fourfold sections 46 (i.e. with four sections in the image area) which preferably are dimensionally equal. These sections are similarly subdivided into four more sections 48 which may further be subdivided into more fourfold sections 50.
  • the successive fourfold subdividing is effected a selected number of times until a final pattern of sections constitutes blocks of suitable size for analysis. (For clarity FIG. 4 shows successive child subdivisions of only one parent.)
  • each set of fourfold sections is deemed to be the child sections of a parent that is the image area or a section, at each pair of adjacent generation levels.
  • Although fourfold subdivisions advantageously provide sufficient detail for the present purpose, others may be used such as twofold, threefold or eightfold; this is symbolized more broadly herein as "n-fold". Even more broadly, "n" may vary from level to level.
  • An initial threshold is computed 52 (FIG. 5) for each section through the subdivisions including each final block.
  • the computed threshold for unimodal intensity is zero, and for bimodal intensity is nonzero such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity.
  • the zero or nonzero computation effectively identifies whether each block has unimodal intensity or bimodal intensity. Zero is designated as an invalid threshold and, substituted therefor, a section threshold is derived from the thresholds of neighboring sections (siblings). If enough siblings have nonzero thresholds, an average of their thresholds should be suitable; otherwise the threshold of the parent is designated. (Zero may be substituted with another value different from any possible value for the bimodal intensity.)
  • section thresholds are designated successively through the successive subdivided sections until a block threshold is established for each final block. If the computed threshold 55 for a child section 58 being considered is nonzero 54 (bimodal), the computed threshold is designated 56 to be the section threshold 57. If a computed threshold for the child section 58 is zero 60 (unimodal), the sibling sections 66 in its fourfold set are analyzed. If all of the computed thresholds 64 of the siblings are nonzero 68, then the section threshold for the section 58 under consideration is designated 56 to be an average 62 of the computed thresholds for sibling sections in the set. Otherwise 70, the previously-determined section threshold 72 of the parent 74 is designated 56. Although an average may be sufficient if only two of the other sections have nonzero threshold intensity, preferably the average is utilized only if all three of the other sections have nonzero threshold intensity. Similar procedures can be used for other types of "n-fold".
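The designation rule just described can be sketched as a recursive pass over a quad-tree. This is a minimal illustration under the stated assumptions (a node stores its computed threshold with 0 meaning unimodal/invalid, and a unimodal child receives the sibling average only when all three siblings are bimodal, otherwise the parent's designated threshold); the type and function names are illustrative:

```cpp
#include <vector>

// One quad-tree section: its computed threshold (0 = unimodal/invalid),
// its finally designated threshold, and its four child sections (if any).
struct Node {
    double computed = 0;
    double designated = 0;
    std::vector<Node> children;   // empty, or exactly four child sections
};

// Recursively designate section thresholds top-down, as in FIG. 5:
// a bimodal section keeps its computed threshold; a unimodal section
// takes the average of its three siblings when all are bimodal,
// and otherwise inherits its parent's designated threshold.
void designate(Node& n, double parentThreshold) {
    n.designated = (n.computed != 0) ? n.computed : parentThreshold;
    for (std::size_t i = 0; i < n.children.size(); ++i) {
        Node& c = n.children[i];
        double assigned = n.designated;              // parent fallback
        if (c.computed == 0) {
            double sum = 0; int valid = 0;
            for (std::size_t j = 0; j < n.children.size(); ++j)
                if (j != i && n.children[j].computed != 0) {
                    sum += n.children[j].computed; ++valid;
                }
            if (valid == 3) assigned = sum / valid;  // all three siblings bimodal
        }
        designate(c, assigned);
    }
}
```

Calling `designate` on the root (whole image area) propagates thresholds down to every final block in one pass.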
  • (FIG. 7 labels: A = upper left threshold; B = upper right threshold.)
  • a set of virtual outside pixels 79 is established outside of the boundary. These outside pixels are spaced from their nearest pixels by the same distance as the spacings between neighboring center pixels within the image boundary, and thus lie on an outer perimeter 87 spaced a half block out from the image boundary.
  • the outside pixels are given thresholds by extrapolation 77 from one or more neighboring pixels; for example, an outside pixel 79' may be given the same threshold as its nearest neighboring center pixel.
  • a linear extrapolation is made from two adjacent center pixels 67'.
  • the four corner outside pixels may be extrapolated diagonally.
  • the interpolations for the pixels P' lying outside of the inner perimeter utilize the virtual pixels.
  • FIG. 8 shows a more uniformly varying pattern of pixel thresholds interpolated from the block thresholds of FIG. 6.
  • the thresholds for the outer pixels P' may be determined directly by extrapolation.
  • interpolation includes such extrapolation near the boundary.
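The interpolation of per-pixel thresholds from block-center thresholds (cf. FIG. 7) can be sketched as a bilinear blend. This is a minimal illustration, not the patent's implementation: blocks are B x B pixels, the block threshold is taken to sit at the block's center pixel, and positions outside the grid of centers are simply clamped to the nearest center, a crude stand-in for the extrapolated virtual outside pixels described above; all names are assumptions:

```cpp
#include <cmath>
#include <vector>

static double clampd(double v, double lo, double hi) { return v < lo ? lo : (v > hi ? hi : v); }
static int    clampi(int v, int lo, int hi)          { return v < lo ? lo : (v > hi ? hi : v); }

// Bilinearly interpolate the threshold for pixel (px, py) from a grid of
// block thresholds, each regarded as located at its block's center.
double pixelThreshold(const std::vector<std::vector<double>>& blockT,
                      int B, int px, int py) {
    int rows = (int)blockT.size(), cols = (int)blockT[0].size();
    double gx = (px - (B - 1) / 2.0) / B;   // pixel position in block-center units
    double gy = (py - (B - 1) / 2.0) / B;
    int x0 = clampi((int)std::floor(gx), 0, cols - 1);
    int y0 = clampi((int)std::floor(gy), 0, rows - 1);
    int x1 = x0 + 1 < cols ? x0 + 1 : x0;
    int y1 = y0 + 1 < rows ? y0 + 1 : y0;
    double fx = clampd(gx - x0, 0.0, 1.0);  // fractional position between centers
    double fy = clampd(gy - y0, 0.0, 1.0);
    double top = blockT[y0][x0] * (1 - fx) + blockT[y0][x1] * fx;
    double bot = blockT[y1][x0] * (1 - fx) + blockT[y1][x1] * fx;
    return top * (1 - fy) + bot * fy;
}
```

The result is the smoothly varying per-pixel threshold surface illustrated in FIG. 8, against which each pixel's intensity is then compared.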
  • identification 76 is made as to whether its actual intensity 78 is either object 80 or background 82.
  • a first value 84 (e.g. 1) is assigned 86 to each pixel having an object intensity
  • a second value 88 (e.g. 0) is assigned to each pixel having a background intensity, thereby converting image objects to binary objects.
  • the assigned values are arbitrary. In the case of light emitting samples (FIG. 2), the objects will be brighter than the background. More generally, for other types of images, either could be brighter.
  • the binary objects are then delineated 90, and this delineation establishes the boundaries 92 of the image objects.
  • the global image area should be bimodal for the foregoing procedures, although in some circumstances this may not be necessary.
  • the image intensities should have sufficiently smooth gradients across the blocks in the image to support the assumption that the center pixel in each block has a threshold equal to the block threshold. For these reasons, at least where the procedures are used for a fixed pattern of objects as with an instrument with light emitting areas, the edges of the fixed objects may be determined only once with a suitable image area. Once these are determined, all subsequent measurements may be made from the same delineated areas. Alternatively, if there is the possibility of area drift, a model pattern may be provided for periodic use in delineating the object areas.
  • an initial neighbor evaluation is made to label every object pixel having at least one neighboring background pixel.
  • the neighbors may be 4-connected boundary pixels of the object (4 neighbors excluding diagonals) or, preferably for reduced ambiguities, 8-connected (all 8 neighbors including diagonals).
  • each such object pixel is assigned a third value (e.g. 2), thereby creating associated tertiary pixels that identify boundary objects.
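The preliminary edge labeling just described can be sketched directly: every object pixel (binary value 1) with at least one 8-connected background neighbor (value 0) is relabeled with the tertiary value 2. A minimal illustration (names are assumptions):

```cpp
#include <vector>

using Image = std::vector<std::vector<int>>;

// Relabel every object pixel (1) that touches background (0) in its
// 8-connected neighborhood with the tertiary edge value 2.
void labelEdges(Image& img) {
    int H = (int)img.size(), W = (int)img[0].size();
    Image in = img;   // read from a copy so labeling is order-independent
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            if (in[y][x] != 1) continue;
            bool edge = false;
            for (int dy = -1; dy <= 1 && !edge; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int ny = y + dy, nx = x + dx;
                    // Pixels beyond the image border count as background.
                    if (ny < 0 || ny >= H || nx < 0 || nx >= W || in[ny][nx] == 0) {
                        edge = true; break;
                    }
                }
            if (edge) img[y][x] = 2;
        }
}
```

Interior object pixels keep the binary value 1; only the one-pixel-thick rim becomes tertiary, giving the pin-and-wheel search a small set of candidate starting points.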
  • Another technique for delineation is a "pin-and-wheel" search described below.
  • Advantageously both are used consecutively, with the 8-connected neighbor evaluation 91 (FIG. 10) to preliminarily label boundary or edge objects 93 for a purpose of initial searching for starting points, followed by the pin-and-wheel to refine and record the boundaries.
  • an initial scanning search 104 (FIGS. 10 & 11) is made beginning in the upper right corner 106 of the image area, traversing leftward line by line until an initial object pixel 108 is located.
  • If a neighbor evaluation 91 has preliminarily labeled edge (tertiary) objects 93, only the latter are searched for. The reason is that such a search more readily allows subsequent searches for an initial object pixel for other objects, without sorting out pixels of objects already delineated by the pin-and-wheel technique.
  • each pixel is considered to be either a background pixel 102 or an object pixel 103.
  • An object (non-background) pixel may have either the ordinary binary value (e.g. 1) or may be an object pixel labeled as a preliminary edge pixel 93 having a tertiary value (e.g. 2).
  • a search entity is defined mathematically as a combination of a pin 94, a rod 96 pivotable on the pin, and a wheel 98 attached to the rod approximately one pixel space from the pin. The pin is constrained to move only among object pixels, and the wheel is constrained to move only among background pixels.
  • the pin is placed 110 on the initial pixel with the wheel in an adjacent background pixel 112, preferably above the pin for a right-to-left search, so as to ensure that the wheel is outside of the object.
  • the wheel is moved 116 in a clockwise direction until the wheel is constrained by an object pixel 118.
  • the pin is moved 120 counterclockwise to a next object pixel 122.
  • In a sort of walking motion (effected virtually by the computer), the steps of moving the pin across object pixels are repeated until the pin is constrained 124 by a background pixel, and then the wheel is moved 116 again.
  • the pin and the wheel may both be found to be constrained simultaneously 126 by an adjacent object pixel 128 and an adjacent background pixel 130 respectively. In such case, the wheel is skipped 132 over the adjacent object pixel to a next background pixel 130, and the alternating moving of the pin and the wheel is resumed 116, 120.
  • These boundary pixels have their locations stored for future use in analyzing the objects, for example analyzing intensity of fluorescent samples.
  • the boundary pixels advantageously are given a fourth value (e.g. 3) replacing the original preliminary (tertiary) values which otherwise may be ignored or returned to the binary value (e.g. 1).
  • the boundary pixels thereby delineate a binary object and consequently an actual object.
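The pin-and-wheel walk is not reproduced here in full, but the path it traces around an object is closely related to classical Moore-neighbor boundary tracing; the following is a minimal tracer under that interpretation, not the patent's exact procedure (the names, the clockwise neighbor ordering, and the simple return-to-start stopping rule are illustrative assumptions; a production tracer would use a more robust stopping criterion):

```cpp
#include <utility>
#include <vector>

using Image = std::vector<std::vector<int>>;
using Pt = std::pair<int, int>;   // (y, x)

// Trace the 8-connected boundary of the first object found by a
// row-major scan, returning the boundary pixels in walking order.
std::vector<Pt> traceBoundary(const Image& img) {
    int H = (int)img.size(), W = (int)img[0].size();
    // 8 neighbors in clockwise order starting east.
    static const int dy[8] = {0, 1, 1, 1, 0, -1, -1, -1};
    static const int dx[8] = {1, 1, 0, -1, -1, -1, 0, 1};
    auto obj = [&](int y, int x) {
        return y >= 0 && y < H && x >= 0 && x < W && img[y][x] != 0;
    };
    // Scan for a starting boundary pixel.
    int sy = -1, sx = 0;
    for (int y = 0; y < H && sy < 0; ++y)
        for (int x = 0; x < W; ++x)
            if (img[y][x]) { sy = y; sx = x; break; }
    if (sy < 0) return {};                    // no object at all
    std::vector<Pt> boundary{{sy, sx}};
    int cy = sy, cx = sx, search = 4;         // scan guarantees west of start is background
    while (true) {
        int moved = -1;
        for (int i = 0; i < 8; ++i) {         // sweep neighbors clockwise
            int d = (search + i) % 8;
            if (obj(cy + dy[d], cx + dx[d])) { moved = d; break; }
        }
        if (moved < 0) break;                 // isolated single pixel
        cy += dy[moved]; cx += dx[moved];
        if (cy == sy && cx == sx) break;      // back at start (simple stop rule)
        boundary.push_back({cy, cx});
        search = (moved + 6) % 8;             // resume just past the backtrack pixel
    }
    return boundary;
}
```

Each returned coordinate would then be given the fourth value and stored, delineating the binary object as described above.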
  • the scanning search 104' is resumed to find an initial pixel for another object.
  • the search is made easier by searching only for preliminary (tertiary) edge pixels that have not already been identified as actual boundary pixels by the pin-and-wheel.
  • the steps of searching and applying pin and wheel are further repeated 136 to identify other binary objects until no further initial pixel is located.
  • the tertiary and fourth values may be returned to the binary value (e.g. 1). Alternatively the fourth values may be retained to allow display of the object boundaries.
  • the scanning search for an initial pixel for each object should be a grid search preferably beginning in any corner and going from right to left, left to right, etc.
  • the wheel motion may be clockwise or counterclockwise.
  • the pin motion must be opposite from the wheel motion to achieve the "walk".
  • the wheel motion should be "opposite" from the search direction, meaning if the search is in rows from right to left from the upper right (like an initial counterclockwise), then the wheel should be moved clockwise, and vice versa.
  • first-hand direction refers to the initial searching direction which may be right to left in successive lines starting from the upper right, or left to right in successive lines starting from the lower left, or top to bottom in successive columns starting from the upper left, or bottom to top starting from the lower right; i.e. "counterclockwise" for the initial line or column search.
  • first-hand direction may initially be "clockwise", e.g. left to right from the upper left.
  • the pin motion should always be this same “first-hand direction”, i.e. counterclockwise if the search starts “counterclockwise”, or clockwise if the search starts "clockwise”.
  • the term "second-hand direction" for the wheel means oppositely from the first-hand direction, i.e. clockwise if the initial search and wheel motion are counterclockwise, and vice versa.
  • although it is preferable for the search to begin at a corner, other search patterns may be suitable, in which case the term "first-hand direction" should be adapted accordingly and deemed equivalent and interchangeable to the foregoing patterns.

Abstract

Object boundaries, such as for light emitting samples in an optical analytical instrument, are determined in a digital image area by subdividing the image area into blocks of pixels. Thresholds between background and objects are determined for each block. Pixel thresholds are determined by interpolation of block thresholds. From each pixel intensity and its threshold, each pixel is assigned a bimodal intensity representing either object or background. Bimodal objects are delineated to determine the object boundaries. A pin-and-wheel procedure is disclosed for the delineation.

Description

AUTOMATIC MASKING OF OBJECTS IN IMAGES
This application claims benefit of copending provisional application serial No. 60/092,785 filed 07/14/98.
This invention relates to digital imaging, particularly boundary determination of objects in images, and more particularly such determination in optical analytical instruments.
BACKGROUND ART

Digital imaging has evolved with the advent of digital processing and digital image acquisition such as with video cameras, ultrasound imaging, robotic vision, scanning of optical images such as photographs, x-rays and radar images, and the like. One aspect of such imaging is masking or determination of boundaries of objects or areas of interest in images so as to allow analysis of the objects such as object identification or classification. Studies and developments in masking have included counting of blood cells and detection of foreign cells in blood, detection of specified objects in x-rays, and surface analysis in materials science.
In certain optical analytical instruments, light is received from a plurality of samples to effect separate image areas associated with the samples. The light may be transmitted through the samples, or may be emitted by the samples such as with fluorescence or chemiluminescence samples. An analysis is performed for each of the image areas. In chemiluminescence, light is emitted via a chemical reaction and may be detected with a luminometer. In fluorescence, for example, in an optical instrument for monitoring polymerase chain reaction production of DNA in a reaction apparatus, fluorescing dye is attached to double stranded DNA so as to produce fluorescence when excited with high energy light. Such an instrument is disclosed in co-pending provisional patent applications serial No. 60/085765 filed 16 May 1998, and serial No. 60/092784 filed July 14, 1998, filed as International Application No. PCT/US99/11088 on May 17, 1999, entitled "Instrument for Monitoring Polymerase Chain Reaction of DNA".
In such analytical instruments, light from the samples is focused on an array detector that effects an image formed of a low level background with light areas corresponding to the sample emissions, termed "areas of interest" (AOI) or "objects". The intensities of the objects correspond to the concentration of DNA or other sample material, and the intensities are processed by computer to provide such information. In order to allow the processing, it is necessary to identify boundaries for the areas of interest (AOI) in the image. Thus computer programming is needed to effect adaptive masking to accurately identify object areas from the image data.
There are three main categories in low-level vision or digital image processing, namely acquisition, manipulation and extraction. Acquisition includes sampling, digitization, storage and (optionally) compression. Manipulation includes enhancement (if required), segmentation and morphology. Extraction involves object representation and/or quantitation for the desired purpose. The present invention is particularly directed to segmentation which preliminarily defines object edges from grey-scale images, and extraction of the objects for higher level quantitation.
Successive steps for segmentation are thresholding and edge detection. A histogram is used to analyze intensities. (A "histogram" has ordinary meaning herein as a plot (or stored data equivalent) of frequency of each intensity vs. intensity for an image or section thereof.) A threshold intensity in a histogram separates the intensity clusters of objects and background. Once thresholds for each pixel are determined, object pixels can be identified and preliminary edges of the objects detected, subject to refinement, such as by morphological operations.
Thresholding is reviewed, for example, in an article "A Survey of Thresholding Techniques" by P. Sahoo, S. Saltani and A. Wong, Computer Graphics and Image Processing, 41, 233-260 (1988). In a conventional method, histogram data is fitted to curves such as a sum of two Gaussian curves (one for each cluster in a "bimodal" histogram), and a threshold intensity is determined statistically to minimize overlap areas. Another method for computing threshold is iterative selection taught in an article "Picture Thresholding Using an Iterative Selection Method" by T.W. Ridler and S. Calvard, IEEE Transactions on Systems, Man and Cybernetics, Vol. SMC-8, No. 8, Jan. 1978. Yet another method is taught in an article "A Threshold Selection Method from Gray-Level Histograms" by N. Otsu, IEEE Transactions on Systems, Man and Cybernetics, SMC-9, No. 1, Jan. 1979. This involves a criterion function that is selected to maximize the separation of the two pixel classes to give the best threshold. The function is derived from discriminant analysis.
The foregoing thresholding methods have been applied to entire images, to determine a "global threshold" which is adequate when all object and background pixels have well separated grey levels. For images with spatially varying object and/or background intensities, the global threshold may give poor results by returning too many erroneous object pixels in some regions and missing object pixels in others.
This problem is addressed in an article "Automatic Boundary Detection of the Left Ventricle from Cineangiograms" by C. Chow and T. Kaneko, Computers and Biomedical Research 5, 388-410 (1972), which teaches a method whereby the image is divided into a fixed number of overlapping regions. Histograms are selected for regions with large variances and appreciable bimodality. Thresholds from the center points of these regions are interpolated to all image points. Results are presented in the article to successfully demonstrate feasibility. However, the static selection criteria used therein must be defined somewhat arbitrarily, and implementation may be complex.
Another form of segmentation is edge detection. Several techniques have been used for identifying object boundaries or edges as taught, for example, in an article "A Survey of Edge Detection Techniques" by L. Davis, Computer Graphics and Image Processing, 4 248-270 (1975). A significant problem is pixel noise in the image, making the edges irregular and blurred.
One common method for edge detection involves derivatives, where a derivative is taken of the intensity across pixel locations in an image to depict significant changes in intensity. A second derivative finds the center in a transition range of edge pixels. Variations include derivative operations in multiple directions, as implemented in the Sobel operator (Pratt, W., "Digital Image Processing", John Wiley and Sons, Inc., New York, 1991, and Russ, John C., "The Image Processing Handbook", Second Ed., CRC Press, Boca Raton, 1994). A problem with derivatives is that the center of the transition range is not necessarily the exact edge because of uncertainties particularly due to noise.
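The multi-directional derivative approach can be conveyed with the Sobel operator cited above. The following Python sketch is illustrative only (the function name and the zeroed one-pixel border are assumptions, not code from this disclosure); it convolves the image with the two 3x3 Sobel kernels and combines the results into a gradient magnitude:

```python
import numpy as np

def sobel_magnitude(img):
    """Approximate gradient magnitude using the 3x3 Sobel kernels.

    Illustrative sketch: `img` is a 2-D array, and the one-pixel
    border is simply left at zero rather than handled specially.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal derivative
    ky = kx.T                                                         # vertical derivative
    out = np.zeros(img.shape, dtype=float)
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            patch = img[r - 1:r + 2, c - 1:c + 2]
            gx = np.sum(kx * patch)      # response to vertical edges
            gy = np.sum(ky * patch)      # response to horizontal edges
            out[r, c] = np.hypot(gx, gy)
    return out
```

On a sharp vertical step edge the magnitude peaks along the transition columns and is zero in flat regions, which illustrates why, with noise added, the peak location is only an approximation of the true edge.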
A "Mexican hat filter" (Kurzweil, R., The Age of Intelligent Machines. Massachusetts Institute of Technology (1990)) has been developed to combine smoothing and edge detection using convolution techniques, as summarized, for example, in "Vision, A Computational Investigation into the Human Representation and Processing of Visual Information" by David Marr (W.H. Freeman and Co., N.Y. 1982)
Another category of segmentation is "region growing" in which an initial "seed" pixel is selected and subsequent pixels are added to the region as they qualify. A variation is to use "split and merge" where the image is successively split until each region is homogeneous. The image sub-regions are then merged if similar characteristics exist. There are advanced techniques for region growing but, again, noise is a problem. Also, in region growing, extensive computations are used which impacts performance in both running time and computer memory.
A common method for image processing used after segmentation is morphological operations.
Morphological operations generally use a structuring system to define the object shape. One such operation is dilation which grows or dilates the target object, and another is erosion which removes pixels from the object. These may be combined in either "opening" which is performed by eroding followed by dilation, or "closing" which is dilation followed by erosion.
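The dilation/erosion combinations described above can be sketched in a few lines of Python. This fragment is illustrative (the 8-connected structuring element, the function names, and the zero-padded border behavior are assumptions for exposition, not part of the disclosure):

```python
import numpy as np

def dilate(mask):
    """Binary dilation with a 3x3 (8-connected) structuring element.

    Pixels outside the image are treated as background (zero padding).
    """
    padded = np.pad(mask, 1)
    out = np.zeros_like(mask)
    h, w = mask.shape
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out |= padded[1 + dr:1 + dr + h, 1 + dc:1 + dc + w]
    return out

def erode(mask):
    """Binary erosion, expressed as the complement of a dilation of
    the complement (note: this sketch does not shrink the object at
    the image border)."""
    return 1 - dilate(1 - mask)

def opening(mask):
    return dilate(erode(mask))   # erosion then dilation: removes small specks

def closing(mask):
    return erode(dilate(mask))   # dilation then erosion: fills small holes
```

Applied to a mask containing a 3x3 block plus an isolated stray pixel, `opening` removes the stray pixel while restoring the block, which is the noise-cleaning behavior that motivates using opening after segmentation.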
Contour following is the act of finding the boundary or border of an object, (Ballard, Dana H. and Brown, Christopher M., "Computer Vision", Prentice-Hall, Inc., Englewood Cliffs, New Jersey, 1982, and Pavlidis, T., "Algorithms for Graphics and Image Processing", Computer Science Press, Rockville, MD, 1982). In simple algorithms, changes in initial pixel locations may produce different borders. This is a disadvantage because it is desirable to find the exact border independent of the initial search pixel. An objective of the invention is to provide a novel method and a novel means for establishing boundaries of objects in a stored digital image associated with pixels that define an image area. A particular objective is to provide such a method and a means for establishing boundaries automatically. Another objective is to provide such a method and a means for establishing boundaries in an optical instrument for analyzing a plurality of samples such as by fluorescence or chemiluminescence.
SUMMARY OF THE INVENTION

The foregoing and other objectives are achieved, at least in part, by a method for establishing boundaries of objects in a stored digital image associated with pixels that define an image area, each pixel having an associated intensity corresponding to background or an object in the image. Total intensity for a section of pixels is considered to be bimodal as representative of both background intensity and object intensity, or unimodal as representative of only background intensity or object intensity. The image area is divided into nonoverlapping blocks each formed of a plurality of pixels. Identification is made as to whether each block has unimodal intensity or bimodal intensity. A threshold is computed for each block having bimodal intensity, such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity. A block threshold is designated for each block, the block threshold for bimodal intensity being the computed threshold, and for unimodal intensity being an assigned threshold derived from neighboring block thresholds. A pixel threshold is derived for each pixel from block thresholds, preferably by interpolating a pixel threshold for each pixel from block thresholds. From the pixel threshold, each pixel is identified as having either an object intensity or a background intensity. A first value is assigned to each pixel having an object intensity, and a second value to each pixel having a background intensity, thereby converting image objects to binary objects. The binary objects are delineated to thereby establish the boundaries of the image objects.
In a preferred aspect, the image area has bimodal intensity, and the step of dividing the image area comprises n-fold (advantageously 4-fold) subdividing the image area into non-overlapping n-fold sections, and successively n-fold subdividing such sections a selected number of times until final sections define the blocks. Each set of n-fold sections is considered to be a set of child sections of a parent that itself is a section or is the image area. The step of designating an assigned threshold comprises identifying whether each section has unimodal intensity or bimodal intensity, computing thresholds for the image area and each section having bimodal intensity, and designating each computed threshold as a section threshold. The step of designating an assigned threshold further comprises, for each section having unimodal intensity in a set of n-fold sections, designating an average of the computed thresholds for other sections in the set if said other sections have bimodal intensity, and otherwise designating the threshold intensity of the parent of the set, the section thresholds being designated successively through successive subdividings until a block threshold is designated for each block.
Each pixel having the second value of intensity is deemed to be a background pixel, and each other pixel is an object pixel. In another preferred aspect, a search entity is defined mathematically as a combination of a pin, a rod pivotable on the pin, and a wheel attached to the rod approximately one pixel space from the pin. The pin is constrained to move among object pixels, and the wheel is constrained to move among background pixels. The step of delineating the binary objects comprises, in sequence, steps of searching, alternately moving the wheel and the pin and, if necessary, skipping. Searching is effected in a first-hand direction from a corner of the image area (e.g. right to left from the upper right) to locate an initial object pixel, and the pin is placed on the initial pixel with the wheel in an adjacent background pixel. Unless constrained, and with the pin retained, the wheel is moved in a second-hand direction (e.g. clockwise) opposite from the first-hand direction until the wheel is constrained by an object pixel. Unless constrained, and with the wheel retained on its latest background pixel, the pin is moved in the first-hand direction (e.g. counterclockwise) to a next object pixel and, unless constrained, the pin is further moved until the pin is constrained by a background pixel. If the pin and the wheel are respectively constrained simultaneously by an adjacent object pixel and an adjacent background pixel, the wheel is skipped over the adjacent object pixel to a next background pixel. The steps of moving and (if necessary) skipping are repeated until the initial object pixel is reached. All pixels traversed by the pin are identified as boundary pixels, thereby delineating a binary object. The steps of searching, moving, identifying and skipping are further repeated until no further initial pixel is located, thereby delineating other binary objects.
Objectives are also achieved by an instrument receptive of light from a plurality of samples to effect separate image objects for analysis thereof. The instrument includes a processor for effecting such analysis, with the processor comprising means for effecting the foregoing steps.
Objectives are further achieved by a computer readable storage medium for utilization with an instrument receptive of light from a plurality of samples to effect separate image objects for analysis thereof. The storage medium contains program code embodied therein so as to be readable by the processor, the program code including means for effecting the foregoing steps.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic drawing of an instrument incorporating the invention.
FIG. 2 is an illustration of a computer screen display of an image area with objects corresponding to a plurality of light emitting samples with respect to the instrument of FIG. 1.
FIG. 3 is an idealized histogram of frequency vs. intensity for a section of an image area such as illustrated in FIG. 2, illustrating a threshold between background and object intensities.
FIG. 4 is a schematic drawing of 4-fold subdividing of an image area such as in FIG. 2 into blocks.
FIG. 5 is a flow chart for determination of block thresholds and pixel thresholds utilizing a processor of the instrument of FIG. 1 and histograms such as in FIG. 3.
FIG. 6 is an illustration of a computer screen display of an image area showing block thresholds determined according to the flow chart of FIG. 5 for blocks effected according to the subdividing of FIG. 4.

FIG. 7 is a schematic drawing for a procedure of deriving pixel thresholds from block thresholds in an image area according to the flow chart of FIG. 5.
FIG. 8 is a computer screen display of an image area illustrating pixel thresholds determined according to the flow chart of FIG. 5.
FIG. 9 is a flow chart for determination of object boundaries from pixel intensities and the pixel thresholds determined according to the flow chart of FIG. 5.
FIG. 10 is a flow chart for a delineation procedure of the flow chart of FIG. 9.
FIG. 11 is a schematic drawing of a portion of an image area illustrating a "pin-and-wheel" procedure utilized in the flow chart of FIG. 10.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
An apparatus 10 (FIG. 1) and method of the invention are generally useful for establishing boundaries of objects in a stored digital image associated with pixels that define an image area. The images are generally of the type that are acquired by a video camera 12 which has a lens 14 to focus an image onto an array detector 16. The detector may be, for example, a charge injection device (CID) or, preferably, a charge coupled device (CCD). A conventional video camera containing a CCD detector, a lens and associated electronics for the detector should be suitable, such as an Electrim model 1000L which has 751 active pixels horizontal and 242 (non-interlaced) active pixels vertical. This camera includes a circuit board that directly interfaces to a computer ISA bus. Analog/digital (A/D) interfacing 17 and framegrabber circuitry are in this circuit but may be provided separately. Essentially any other digital imaging device or subsystem may be used or adapted that is capable of taking still or freeze-frame images that are stored in memory 18, e.g. in a linear representation, for processing by a processing unit (CPU) 20 in a computer 22 and display of a processed image or associated information on a monitor 24. The raw image typically is stored in a linear representation where scan lines (rows) of pixels are held sequentially in memory. Image intensity for each pixel typically is stored as a byte or word of 8 or 16 bits respectively.

The invention is particularly advantageous with an optical analytical instrument 26 wherein light 28 is received from a plurality of samples 30 to effect separate image areas associated with the samples. Particularly of interest is an optical instrument for monitoring polymerase chain reaction production of DNA in a reaction apparatus, in which fluorescing dye is attached to double stranded DNA so as to produce fluorescence when excited with high energy light.
Such an instrument is disclosed in the aforementioned co-pending provisional patent application serial No. 60/085765 filed 16 May 1998 [Attorney docket No. BT-4584] incorporated herein by reference in its entirety. The invention also is particularly useful for chemiluminescence analysis of multiple samples.
The DNA reaction apparatus has vials 32 as sample holders containing light-emitting sample material that effect a pattern of well defined regions 34 in the image area 36 (FIG. 2) in a low intensity background 37. The intensity in each region is processed to provide information on the samples such as concentration of DNA. In order to allow the processing, it is necessary to identify the areas of interest (AOI) in the image. The present invention utilizing computer programming is useful for automatic adaptive masking to accurately identify object areas from the image data, in which the shapes may be irregular and the shapes and background may have non-uniform illumination. The invention also should be more broadly useful for outlining objects in any image that may include scenery or the like.
The computer programming is conventional such as with C++ language. Adaptations of the programming for the present invention from flow charts and descriptions herein will readily be recognized and achieved by those skilled in the art. The flow charts represent means and steps for carrying out various aspects of the invention. Details of the computer and the programming are not important to the invention, and any conventional or other desired computer and programming components that would serve the purposes described herein should be deemed equivalent and interchangeable.
A conventional or other desired graphics program such as Image Pro Plus™ from Media Cybernetics may be incorporated to display the image if desired. A Tagged Image File Format (TIFF), e.g. Adobe version 6.0 TIFF, is useful as it is well documented and in the public domain. A suitable development software package is Microsoft Visual C++ Developer Studio. Grey scale 8-bit and 16-bit programming should be used without compression (or with lossless compression), as a present goal is to identify object shape based on intensity and contrast.
Each pixel has an associated intensity (brightness, represented digitally) corresponding to background or an object in the image. There also may be fluctuations due to noise. A group or section of pixels may be bimodal as representative of both background intensity and object intensity, or unimodal as representative of only background intensity or object intensity. Bimodal intensity of a region may be seen in a grey-scale histogram 38 (FIG. 3) of frequency (number of pixels with a given intensity) against intensity. This histogram essentially is a combination of two bell shaped curves, one for background 40 and the other for an object 42.
For the bimodal case, a threshold 44 is computed from the pixels in a selected region of the image, using a histogram, for use in classifying the intensity for each pixel in the region as either object or background. Any of a variety of methods of computing the threshold may be used.
One conventional method is minimizing error of classification which assumes two Gaussian distributions. A simplified form of this, useful for the present case, is to compute the half-way point between the two peaks using peak analysis. After smoothing, all peaks are found in the histogram. Then a two-stage hill-climbing method is used to identify the two major peaks corresponding to the "background" and "object" distributions. The background peak is identified as the highest peak at low intensity, as found by a search of peaks from left to right. The object peak is identified as the highest peak at high intensity, as found by a search of peaks from right to left. To deal with histogram noise and unwanted peaks, conventional boxcar smoothing (weighted average with all weights equal to 1/n) or the like may be used, with additional filtering of spurious peaks such as by data reduction if necessary as with 16-bit images. In 16-bit images, an initial histogram size may be 10 bits, with additional bits added if the raw histogram does not contain at least 64 bins. A test for bimodality should be performed, where two different peaks must be found and the valley must be lower than both peaks. Another suitable method for computing the threshold is iterative selection taught in the aforementioned article by T.W. Ridler and S. Calvard. A preferred method is taught in the aforementioned article by N. Otsu, incorporated herein by reference in its entirety.
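The preferred method of Otsu cited above can be sketched as follows. This Python implementation is illustrative (the function name, the 8-bit histogram assumption, and the convention that the returned threshold belongs to the background class are assumptions for exposition, not the instrument's code); it selects the threshold maximizing the between-class variance of the two pixel classes:

```python
import numpy as np

def otsu_threshold(pixels, nbins=256):
    """Otsu's method: choose the grey level maximizing between-class
    variance.  Sketch assuming integer grey levels in [0, nbins);
    intensities <= t form one class, intensities > t the other.
    """
    hist, _ = np.histogram(pixels, bins=nbins, range=(0, nbins))
    total = hist.sum()
    mu_total = np.dot(np.arange(nbins), hist)  # total intensity mass
    best_t, best_var = 0, -1.0
    cum_w = 0.0   # pixel count in class 0 so far
    cum_mu = 0.0  # intensity mass in class 0 so far
    for t in range(nbins - 1):
        cum_w += hist[t]
        cum_mu += t * hist[t]
        w0, w1 = cum_w, total - cum_w
        if w0 == 0 or w1 == 0:
            continue  # one class empty: threshold not meaningful
        m0 = cum_mu / w0
        m1 = (mu_total - cum_mu) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

For a cleanly bimodal histogram (e.g. half the pixels near grey level 20 and half near 200) the returned threshold falls between the two clusters, so a simple `pixels > t` comparison separates object from background.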
Unimodal intensity represents either background or an object, and "threshold" is zero representing "invalid". This will exist for a region small enough to encompass only an object or area of interest in a sample pattern, or a background area between objects or along the edge of the image. Procedures for dealing with this are described below.
In procedures to compute threshold values, the global (full) image area is sub-divided in a sequence of steps, advantageously utilizing quad-tree procedures (FIG. 4). The image area 36 is fourfold subdivided into non-overlapping, fourfold sections 46 (i.e. with four sections in the image area) which preferably are dimensionally equal. These sections are similarly subdivided into four more sections 48 which may further be subdivided into more fourfold sections 50. The successive fourfold subdividing is effected a selected number of times until a final pattern of sections constitutes blocks of suitable size for analysis. (For clarity FIG. 4 shows successive child subdivisions of only one parent. All parent sections are actually subdivided.) It was found that three subdivisions yielding 64 blocks is suitable for an intensity analysis of 96, 384 and 1536 objects (areas of interest, e.g. regions of sample luminescence). More subdivision is possible, but at some level the sections approach unimodal and the results become essentially the same, i.e. extra computations do not provide additional regional sensitivity.
The subdivisions may be visualized as the image area being a grandparent, the first sections being parents, the next sections being children of each of the parent sections, and subsequent sections being grandchildren. However, as termed herein and in the claims, each set of fourfold sections is deemed to be child sections of a parent that is the image area or a section, in each pair of generation levels. Although fourfold subdivisions advantageously provide sufficient detail for the present purpose, others may be used such as twofold, threefold or eightfold; this is symbolized more broadly herein as "n-fold". Even more broadly, "n" may vary from level to level. An initial threshold is computed 52 (FIG. 5) for each section through the subdivisions including each final block. The computed threshold for unimodal intensity is zero, and for bimodal intensity is nonzero such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity. The zero or nonzero computation effectively identifies whether each block has unimodal intensity or bimodal intensity. Zero is designated as an invalid threshold and, substituted therefor, a section threshold is derived from neighboring thresholds (siblings). If enough nearby sections (siblings) have nonzero thresholds, an average of their thresholds should be suitable; otherwise the threshold of the parent is designated. (Zero may be substituted with another value different than any possible values for the bimodal intensity.)
After the computed thresholds are determined for all sections, section thresholds are designated successively through the successive subdivided sections until a block threshold is established for each final block. If the computed threshold 55 for a child section 58 being considered is nonzero 54 (bimodal), the computed threshold is designated 56 to be the section threshold 57. If a computed threshold for the child section 58 is zero 60 (unimodal), the sibling sections 66 in its fourfold set are analyzed. If all of the computed thresholds 64 of the siblings are nonzero 68, then the section threshold for the section 58 under consideration is designated 56 to be an average 62 of the computed thresholds for sibling sections in the set. Otherwise 70, the previously-determined section threshold 72 of the parent 74 is designated 56. Although an average may be sufficient if only two of the other sections have nonzero threshold intensity, preferably the average is utilized only if all three of the other sections have nonzero threshold intensity. Similar procedures can be used for other types of "n-fold".
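The sibling/parent designation rule just described can be sketched as a small helper function. The function below is hypothetical (its name and signature are assumptions for illustration); it applies the preferred rule for one child section in a fourfold set:

```python
def designate_threshold(computed, siblings, parent_threshold):
    """Designate a section threshold per the quad-tree inheritance rule:
    a bimodal section keeps its computed (nonzero) threshold; a unimodal
    section (computed == 0) takes the average of its three siblings'
    computed thresholds only when all of them are nonzero, and otherwise
    takes the parent's already-designated threshold.
    """
    if computed != 0:
        return computed  # bimodal: keep the computed threshold
    if siblings and all(s != 0 for s in siblings):
        return sum(siblings) / len(siblings)  # all siblings bimodal
    return parent_threshold  # fall back to the parent's threshold
```

Applying this from the image area downward through each generation of sections yields a valid block threshold for every final block, including blocks whose own histograms are unimodal.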
The procedures are repeated 61 until the section thresholds are determined down to the blocks, providing block thresholds 63. FIG. 6 shows possible block thresholds for a subdivision of 64 blocks 75 in an image area 36. Effective pixel thresholds are derived from the block thresholds. The block thresholds may be sufficient as pixel thresholds for certain applications. However, the block thresholds are stepped from block to block, and a comparison with FIG. 2 reveals that such block thresholds could cause problems in distinguishing a larger number such as 96 objects. A further procedure (FIGS. 5 & 7) determines pixel thresholds having smoother variations.
It is assumed that the center pixel in each block has the block threshold. The center pixels 67 are identified 69, e.g. from a rectangle computed from each block edge. Pixel thresholds 71 for all other pixels are established by interpolation 73 from the center values. Although higher order interpolation may be used, linear interpolation generally should be sufficient. For example, with each pixel P having a position within a rectangle having corners defined by four center pixels having thresholds A, B, C, D, a simple formula for the pixel threshold PT by interpolation is:
PT = ( bdA + bcB + adC + acD ) / ( (a+b)(c+d) )

where A = upper left threshold; B = upper right threshold;
C = lower left threshold; D = lower right threshold;
and
a = distance down from A to pixel of interest; b = distance up from C to pixel of interest; c = distance to the right from C to pixel of interest; d = distance to the left from D to pixel of interest.
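As a sketch, the interpolation formula above translates directly into code (the function name is an assumption for illustration):

```python
def pixel_threshold(A, B, C, D, a, b, c, d):
    """Bilinear interpolation of a pixel threshold from the four
    block-center thresholds at the rectangle corners: A upper left,
    B upper right, C lower left, D lower right; a = distance down
    from A, b = distance up from C, c = distance right from C,
    d = distance left from D.
    """
    return (b * d * A + b * c * B + a * d * C + a * c * D) / ((a + b) * (c + d))
```

At a corner (e.g. a = 0, c = 0) the formula returns that corner's threshold exactly, and at the rectangle center it returns the plain average of the four corner thresholds, as expected of bilinear interpolation.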
For pixels P' lying outside of an inner perimeter 83 defined by the center pixels nearest the boundary 85 of the image 36, a set of virtual outside pixels 79 is established outside of the boundary. These outside pixels are spaced from their nearest pixels by the same distance as the spacings between neighboring center pixels within the image boundary, and thus lie on an outer perimeter 87 spaced a half block out from the image boundary. The outside pixels are given thresholds by extrapolation 77 from one or more neighboring pixels; for example, an outside pixel 79' may be given the same threshold as its nearest neighboring center pixel. Preferably a linear extrapolation is made from two adjacent center pixels 67'. The four corner outside pixels may be extrapolated diagonally. The interpolations for the pixels P' lying outside of the inner perimeter utilize the virtual pixels. FIG. 8 shows a more uniformly varying pattern of pixel thresholds interpolated from the block thresholds of FIG. 6.
Alternatively, the thresholds for the outer pixels P' may be determined directly by extrapolation. As used in the claims, the term "interpolation" includes such extrapolation near the boundary.
From the pixel threshold 71 (FIG. 9) for each pixel, identification 76 is made as to whether its actual intensity 78 is either object 80 or background 82. A first value 84 (e.g. 1) is assigned 86 to each pixel having an object intensity, and a second value 88 (e.g. 0) is assigned to each pixel having a background intensity, thereby converting image objects to binary objects. The assigned values are arbitrary. In the case of light emitting samples (FIG. 2), the objects will be brighter than the background. More generally, for other types of images, either could be brighter. The binary objects are then delineated 90, and this delineation establishes the boundaries 92 of the image objects.
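For bright objects on a dark background, the conversion to binary objects is a single comparison of each pixel's intensity against its interpolated threshold, sketched here for illustration (the function name and array-based convention are assumptions):

```python
import numpy as np

def binarize(intensity, pixel_thresholds):
    """Assign the first value (1) to object pixels whose intensity
    exceeds the per-pixel threshold, and the second value (0) to
    background pixels; assumes bright objects on a dark background.
    """
    return (intensity > pixel_thresholds).astype(np.uint8)
```

For dark objects on a bright background the comparison would simply be reversed, consistent with the observation that the assigned values are arbitrary.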
Generally the global image area should be bimodal for the foregoing procedures, although in some circumstances this may not be necessary. Also, the image intensities should have sufficiently smooth gradients across the blocks in the image to support the assumption that the center pixel in each block has a threshold equal to the block threshold. For these reasons, at least where the procedures are used for a fixed pattern of objects as with an instrument with light emitting areas, the edges of the fixed objects may be determined only once with a suitable image area. Once these are determined, all subsequent measurements may be made from the same delineated areas. Alternatively, if there is the possibility of area drift, a model pattern may be provided for periodic use in delineating the object areas. For cases not having a fixed pattern, such as for analyzing a photographic type of image, the number of subdivisions of sections may be increased until the block thresholds become mostly unimodal.
The delineation of binary objects, which otherwise may be made in any desired manner, should be consistent with the accuracy needed. In one delineation technique under the present invention, an initial neighbor evaluation is made to label every object pixel having at least one neighboring background pixel. The neighbors may be 4-connected boundary pixels of the object (4 neighbors excluding diagonals) or, preferably for reduced ambiguities, 8-connected (all 8 neighbors including diagonals). For the label, each such object pixel is assigned a third value (e.g. 2), thereby creating associated tertiary pixels that identify boundary objects. Another technique for delineation is a "pin-and-wheel" search described below. Advantageously both are used consecutively, with the 8-connected neighbor evaluation 91 (FIG. 10) used to preliminarily label boundary or edge objects 93 as starting points for the initial search, followed by the pin-and-wheel to refine and record the boundaries.
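The preliminary 8-connected neighbor evaluation can be sketched as follows (an illustrative Python version; the function name and grid values are chosen for the example, not taken from the patent):

```python
# Sketch: label every object pixel (1) that has at least one 8-connected
# background neighbor (0) with the tertiary edge value (2). Pixels outside
# the image are treated as background here, so objects touching the border
# are labeled as edges too.
def label_edges(binary):
    rows, cols = len(binary), len(binary[0])

    def is_background(r, c):
        return not (0 <= r < rows and 0 <= c < cols) or binary[r][c] == 0

    labeled = [row[:] for row in binary]
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] == 1 and any(
                is_background(r + dr, c + dc)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            ):
                labeled[r][c] = 2  # tertiary value marking an edge pixel
    return labeled

# A 3x3 object inside a 5x5 field: the outer ring of the object becomes 2,
# while its fully surrounded center pixel keeps the binary value 1.
binary = [[0] * 5 for _ in range(5)]
for r in range(1, 4):
    for c in range(1, 4):
        binary[r][c] = 1
labeled = label_edges(binary)
```

The same loop with a 4-neighbor offset list would give the 4-connected variant; as the text notes, the 8-connected form is preferred because it leaves fewer ambiguous diagonal configurations.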
After the neighbor evaluation (if any), an initial scanning search 104 (FIGS. 10 & 11) is made beginning in the upper right corner 106 of the image area, traversing leftward line by line until an initial object pixel 108 is located. If a neighbor evaluation 91 has preliminarily labeled edge (tertiary) objects 93, only the latter are searched for. The reason is that such a search more readily allows subsequent searches for an initial object pixel for other objects, without sorting out pixels of objects already delineated by the pin-and-wheel technique.
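Assuming the tertiary edge value 2 from the neighbor evaluation, the right-to-left scanning search can be sketched as (names are illustrative):

```python
# Sketch: traverse the image line by line from the top, right to left within
# each line (matching the upper-right starting corner described above), and
# return the first pixel carrying the preliminary edge label. Returns None
# when no unprocessed edge pixel remains.
def find_initial_pixel(labeled, edge_value=2):
    for r, row in enumerate(labeled):
        for c in range(len(row) - 1, -1, -1):
            if row[c] == edge_value:
                return (r, c)
    return None

grid = [
    [0, 0, 0, 0],
    [0, 2, 2, 0],
    [0, 2, 2, 0],
    [0, 0, 0, 0],
]
print(find_initial_pixel(grid))  # (1, 2): first labeled line, rightmost pixel
```

Because only pixels still carrying the edge value are matched, pixels already promoted to boundary status by a previous trace are automatically skipped on later passes.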
For the pin-and-wheel technique, each pixel is considered to be either a background pixel 102 or an object pixel 103. An object (non-background) pixel may have either the ordinary binary value (e.g. 1) or may be an object pixel labeled as a preliminary edge pixel 93 having a tertiary value (e.g. 2). A search entity is defined mathematically as a combination of a pin 94, a rod 96 pivotable on the pin, and a wheel 98 attached to the rod approximately one pixel space from the pin. The pin is constrained to move only among object pixels, and the wheel is constrained to move only among background pixels.
The pin is placed 110 on the initial pixel with the wheel in an adjacent background pixel 112, preferably above the pin for a right-to-left search, so as to ensure that the wheel is outside of the object. Unless the wheel is constrained (as indicated above), with the pin retained on the initial pixel, the wheel is moved 116 in a clockwise direction until the wheel is constrained by an object pixel 118. Next, unless the pin is constrained, with the wheel retained on its latest background pixel, the pin is moved 120 counterclockwise to a next object pixel 122. In a sort of walking motion (virtually by computer), the steps of moving the pin across object pixels are repeated until the pin is constrained 124 by a background pixel, and then the wheel is moved 116 again.
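The alternating pin and wheel moves amount to a boundary-following walk around the object. The sketch below uses the closely related, classical Moore-neighbor tracing formulation rather than the pin-and-wheel construction itself, so it is an analogue under simplifying assumptions (single-pass stop at the start pixel), not the patent's method:

```python
# Analogue sketch of the boundary walk: from a start pixel, sweep the eight
# neighbors clockwise beginning just past the direction we came from, and
# step to the first object pixel found. Stopping on first return to the
# start pixel is a simplification (robust tracers use Jacob's criterion).
def trace_boundary(binary, start):
    rows, cols = len(binary), len(binary[0])
    # Eight neighbor offsets in clockwise order, starting from "up".
    offsets = [(-1, 0), (-1, 1), (0, 1), (1, 1),
               (1, 0), (1, -1), (0, -1), (-1, -1)]

    def is_object(r, c):
        return 0 <= r < rows and 0 <= c < cols and binary[r][c] != 0

    boundary = [start]
    prev_dir = 6  # backtrack direction assumed west for this demo
    current = start
    while True:
        for i in range(8):  # clockwise sweep, like the wheel
            d = (prev_dir + 1 + i) % 8
            r, c = current[0] + offsets[d][0], current[1] + offsets[d][1]
            if is_object(r, c):
                current = (r, c)
                prev_dir = (d + 4) % 8  # new backtrack: where we came from
                break
        else:
            return boundary  # isolated single pixel, no neighbors to walk
        if current == start:
            return boundary
        boundary.append(current)

# A 2x2 object in a 4x4 field: the walk visits all four pixels in order.
grid = [[0] * 4 for _ in range(4)]
for r, c in [(1, 1), (1, 2), (2, 1), (2, 2)]:
    grid[r][c] = 1
print(trace_boundary(grid, (1, 1)))  # [(1, 1), (1, 2), (2, 2), (2, 1)]
```

The clockwise neighbor sweep plays the role of the wheel rolling through background pixels, and the step to the next object pixel plays the role of the pin advancing; the pin-and-wheel skip rule handles the degenerate simultaneous-constraint case that this simplified sweep resolves implicitly.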
Occasionally, upon a detection test 125, the pin and the wheel may both be found to be constrained simultaneously 126 by an adjacent object pixel 128 and an adjacent background pixel 130 respectively. In such a case, the wheel is skipped 132 over the adjacent object pixel to a next background pixel 130, and the alternating moving of the pin and the wheel is resumed 116, 120.
The steps of moving and (as necessary) skipping are repeated until a return 134 to the initial object pixel 108 is effected. (If there is no return, the "object" is rejected as a non-entity.) Each object (non-background) pixel including the initial pixel traversed by the pin, whether or not a preliminary (tertiary) edge pixel, is identified 114 as a boundary pixel 92, preferably subject to an object (area of interest) test to qualify the entire object as desirable, i.e. whether it is within a selected range of length or size. These boundary pixels have their locations stored for future use in analyzing the objects, for example analyzing intensity of fluorescent samples. The boundary pixels advantageously are given a fourth value (e.g. 3) replacing the original preliminary (tertiary) values which otherwise may be ignored or returned to the binary value (e.g. 1). The boundary pixels thereby delineate a binary object and consequently an actual object.
The scanning search 104' is resumed to find an initial pixel for another object. The search is made easier by searching only for preliminary (tertiary) edge pixels not already identified as actual boundaries by the pin-and-wheel. The steps of searching and applying the pin and wheel are further repeated 136 to identify other binary objects until no further initial pixel is located. After all objects are found, the tertiary and fourth values may be returned to the binary value (e.g. 1). Alternatively the fourth values may be retained to allow display of the object boundaries. It will be appreciated that the scanning search for an initial pixel for each object should be a grid search, preferably beginning in any corner and going from right to left, left to right, etc. Similarly the wheel motion may be clockwise or counterclockwise. However, for the technique to be properly effective, there are further restrictions: the pin motion must be opposite from the wheel motion to achieve the "walk", and the wheel motion should be "opposite" from the search direction, meaning that if the search is in rows from right to left from the upper right (like an initial counterclockwise), then the wheel should be moved clockwise, and vice versa.
To generalize this situation, the term "first-hand direction" as used herein and in the claims refers to the initial searching direction, which may be right to left in successive lines starting from the upper right, or left to right in successive lines starting from the lower left, or top to bottom in successive columns starting from the upper left, or bottom to top starting from the lower right; i.e. "counterclockwise" for the initial line or column search. Alternatively the first-hand direction may initially be "clockwise", e.g. left to right from the upper left. The pin motion should always be in this same "first-hand direction", i.e. counterclockwise if the search starts "counterclockwise", or clockwise if the search starts "clockwise". The term "second-hand direction" for the wheel means oppositely from the first-hand direction, i.e. clockwise if the initial search and wheel motion are counterclockwise, and vice versa. Although it is preferable for the search to begin at a corner, other search patterns may be suitable, in which case the term "first-hand direction" should be adapted accordingly and deemed equivalent and interchangeable with the foregoing patterns.
Although the foregoing description is directed particularly to masking of sample images in optical analytical instruments, the invention may be utilized in other fields. These include ultrasound imaging, robotic vision, scanning of optical images such as photographs, x-rays and radar images, counting of blood cells and detection of foreign cells in blood. The present techniques have also been applied to an ordinary image of a scene with people. Although the procedures described herein are advantageously computer automated, method steps may be initiated sequentially by an operator. While the invention has been described above in detail with reference to specific embodiments, various changes and modifications which fall within the spirit of the invention and scope of the appended claims will become apparent to those skilled in this art. Therefore, the invention is intended only to be limited by the appended claims or their equivalents.

Claims

What is claimed is:
1. A method for establishing boundaries of objects in a stored digital image associated with pixels that define an image area, each pixel having an associated intensity corresponding to background or an object in the image, such that total intensity for a section of pixels is bimodal as representative of both background intensity and object intensity, or unimodal as representative of only background intensity or object intensity; the method comprising steps of:
dividing the image area into nonoverlapping blocks each formed of a plurality of pixels;
identifying whether each block has unimodal intensity or bimodal intensity;
computing a threshold for each block having bimodal intensity, such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity;
designating a block threshold for each block, the block threshold for bimodal intensity being the computed threshold, and for unimodal intensity being an assigned threshold derived from neighboring block thresholds;
deriving a pixel threshold for each pixel from block thresholds;
identifying from the pixel threshold each pixel having an object intensity and each pixel having a background intensity, and assigning a first value to each pixel having an object intensity and a second value to each pixel having a background intensity, thereby converting image objects to binary objects; and
delineating the binary objects to thereby establish the boundaries of the image objects.
2. The method of claim 1 wherein the step of deriving comprises interpolating a pixel threshold for each pixel from block thresholds.
3. The method of claim 1 wherein the image area has bimodal intensity, and wherein:
the step of dividing the image area comprises n-fold subdividing the image area into non-overlapping n-fold sections, and successively n-fold subdividing each of the sections a selected number of times until final sections define the blocks, each set of n-fold sections being a set of child sections of a parent that is the image area or a section;
the step of designating an assigned threshold comprises identifying whether each section has unimodal intensity or bimodal intensity, computing thresholds for the image area and each section having bimodal intensity, and designating each computed threshold as a section threshold; and
the step of designating an assigned threshold further comprises, for each section having unimodal intensity in a set of n-fold sections, designating an average of the computed thresholds for other sections in the set if said other sections have bimodal intensity, and otherwise designating the threshold intensity of the parent of the set, the section thresholds being designated successively through successive subdividings until a block threshold is designated for each block.
4. The method of claim 3 wherein the average of the computed thresholds for other sections in the set is designated only if all other sections in the set have nonzero threshold intensity.
5. The method of claim 3 wherein n-fold is fourfold.
6. The method of claim 1 wherein each pixel having the second value of intensity is a background pixel, each other pixel is an object pixel, and a search entity is defined mathematically as a combination of a pin, a rod pivotable on the pin, and a wheel attached to the rod approximately one pixel space from the pin, the pin being constrained to move among object pixels, and the wheel being constrained to move among background pixels, and the step of delineating the binary objects comprises, in sequence: searching in a first-hand direction from a corner of the image area to locate an initial object pixel, and placing the pin on the initial pixel with the wheel in an adjacent background pixel;
unless constrained, and with the pin retained, moving the wheel in a second-hand direction opposite from the first-hand direction until the wheel is constrained by an object pixel;
unless constrained, and with the wheel retained on its latest background pixel, moving the pin in the first-hand direction to a next object pixel and, unless constrained, repeating the step of moving the pin until the pin is constrained by a background pixel;
if the pin and the wheel are respectively constrained simultaneously by an adjacent object pixel and an adjacent background pixel, skipping the wheel over the adjacent object pixel to a next background pixel;
repeating the steps of moving and skipping until the initial object pixel is reached;
identifying all pixels traversed by the pin as boundary pixels, thereby delineating a binary object; and
further repeating the steps of searching, moving, identifying and skipping until no further initial pixel is located, thereby delineating other binary objects.
7. The method of claim 6 wherein the step of searching comprises initially labeling as an edge pixel each object pixel having at least one neighboring background pixel, and searching only for edge pixels to locate an initial pixel.
8. The method of claim 7 wherein the boundary pixels of each binary object are identified before further searching for a next initial pixel, and the step of identifying comprises assigning a further value to each boundary pixel so as to remove associated edge pixels from the further searching.
9. The method of claim 8 further comprising reassigning each boundary pixel with the first value after no further initial pixel is located.
10. The method of claim 6 wherein the step of identifying comprises assigning a further value to each boundary pixel.
11. The method of claim 10 further comprising utilizing each further value to display boundary images.
12. An instrument receptive of light from a plurality of samples to effect separate image objects for analysis thereof, the instrument including a processor for effecting such analysis, the processor comprising:
means for storing the objects in a stored digital image associated with pixels that define an image area, each pixel having an associated intensity corresponding to background or an object in the image, such that total intensity for a section of pixels is bimodal as representative of both background intensity and object intensity, or unimodal as representative of only background intensity or object intensity;
means for dividing the image area into nonoverlapping blocks each formed of a plurality of pixels;
means for identifying whether each block has unimodal intensity or bimodal intensity;
means for computing a threshold for each block having bimodal intensity, such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity;
means for designating a block threshold for each block, the block threshold for bimodal intensity being the computed threshold, and for unimodal intensity being an assigned threshold derived from neighboring block thresholds; means for deriving a pixel threshold for each pixel from block thresholds;
means for identifying from the pixel threshold each pixel having an object intensity and each pixel having a background intensity, and assigning a first value to each pixel having an object intensity and a second value to each pixel having a background intensity, thereby converting image objects to binary objects; and
means for delineating the binary objects to thereby establish the boundaries of the image objects for analysis thereof.
13. The instrument of claim 12 wherein the means for deriving comprises means for interpolating a pixel threshold for each pixel from block thresholds.
14. The instrument of claim 12 wherein the image area has bimodal intensity, and wherein:
the means for dividing the image area comprises means for n-fold subdividing the image area into non-overlapping n-fold sections, and successively n-fold subdividing each of the sections a selected number of times until final sections define the blocks, each set of n-fold sections being a set of child sections of a parent that is the image area or a section;
the means for designating an assigned threshold comprises means for identifying whether each section has unimodal intensity or bimodal intensity, means for computing thresholds for the image area and each section having bimodal intensity, and means for designating each computed threshold as a section threshold; and
the means for designating an assigned threshold further comprises, for each section having unimodal intensity in a set of n-fold sections, means for designating an average of the computed thresholds for other sections in the set if said other sections have bimodal intensity, and otherwise designating the threshold intensity of the parent of the set, the section thresholds being designated successively through successive subdividings until a block threshold is designated for each block.
15. The instrument of claim 14 wherein the average of the computed thresholds for other sections in the set is designated only if all other sections in the set have nonzero threshold intensity.
16. The instrument of claim 14 wherein n-fold is fourfold.
17. The instrument of claim 12 wherein each pixel having the second value of intensity is a background pixel, each other pixel is an object pixel, and a search entity is defined mathematically as a combination of a pin, a rod pivotable on the pin, and a wheel attached to the rod approximately one pixel space from the pin, the pin being constrained to move among object pixels, and the wheel being constrained to move among background pixels, and the means for delineating the binary objects comprise, in sequence:
means for searching in a first-hand direction from a corner of the image area to locate an initial object pixel;
means for placing the pin on the initial pixel with the wheel in an adjacent background pixel;
means for moving the wheel in a wheel motion with the pin retained, the wheel motion being in a second-hand direction opposite from the first-hand direction until the wheel is constrained by an object pixel;
means for moving the pin in a pin motion with the wheel retained on its latest background pixel, the pin motion being in the first-hand direction to a next object pixel and, unless the pin is constrained, further moving the pin until the pin is constrained by a background pixel;
means for detecting whether the pin and the wheel are respectively constrained simultaneously by an adjacent object pixel and an adjacent background pixel, and, if the pin and the wheel are so constrained, means for skipping the wheel over the adjacent object pixel to a next background pixel; means for repeating the pin motion and the wheel motion alternately, and the detecting and the skipping if simultaneous constraining is detected, until the initial object pixel is reached;
means for identifying all pixels traversed by the pin as boundary pixels, thereby delineating a binary object; and
means for further repeating the searching, the pin motion and the wheel motion, and the detecting and the skipping if simultaneous constraining is detected, and the identifying, until no further initial pixel is located, thereby delineating other binary objects.
18. The instrument of claim 17 wherein the means for searching comprise means for initially labeling as an edge pixel each object pixel having at least one neighboring background pixel, and means for searching only for edge pixels to locate an initial pixel.
19. The instrument of claim 18 wherein the boundary pixels of each binary object are identified before further searching for a next initial pixel, and the means for identifying comprise means for assigning a further value to each boundary pixel so as to remove associated edge pixels from the further searching.
20. The instrument of claim 19 further comprising means for reassigning each boundary pixel with the first value after no further initial pixel is located.
21. The instrument of claim 17 wherein the means for identifying comprise means for assigning a further value to each boundary pixel.
22. The instrument of claim 21 further comprising means for utilizing each further value to display boundary images.
23. A computer readable storage medium for utilization with an instrument receptive of light from a plurality of samples to effect separate image objects for analysis thereof, the instrument including a processor for such analysis, the objects being stored in the processor in a stored digital image associated with pixels that define an image area, each pixel having an associated intensity corresponding to background or an object in the image, such that total intensity for a section of pixels is bimodal as representative of both background intensity and object intensity, or unimodal as representative of only background intensity or object intensity, and the storage medium having program code embodied therein so as to be readable by the processor, wherein the program code comprises:
means for dividing the image area into nonoverlapping blocks each formed of a plurality of pixels;
means for identifying whether each block has unimodal intensity or bimodal intensity;
means for computing a threshold for each block having bimodal intensity, such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity;
means for designating a block threshold for each block, the block threshold for bimodal intensity being the computed threshold, and for unimodal intensity being an assigned threshold derived from neighboring block thresholds;
means for deriving a pixel threshold for each pixel from block thresholds;
means for identifying from the pixel threshold each pixel having an object intensity and each pixel having a background intensity, and assigning a first value to each pixel having an object intensity and a second value to each pixel having a background intensity, thereby converting image objects to binary objects; and
means for delineating the binary objects to thereby establish the boundaries of the image objects.
24. The storage medium of claim 23 wherein the means for deriving comprises means for interpolating a pixel threshold for each pixel from block thresholds.
25. The storage medium of claim 23 wherein the image area has bimodal intensity, and wherein:
the means for dividing the image area comprises means for n-fold subdividing the image area into non-overlapping n-fold sections, and successively n-fold subdividing each of the sections a selected number of times until final sections define the blocks, each set of n-fold sections being a set of child sections of a parent that is the image area or a section;
the means for designating an assigned threshold comprises means for identifying whether each section has unimodal intensity or bimodal intensity, means for computing thresholds for the image area and each section having bimodal intensity, and means for designating each computed threshold as a section threshold; and
the means for designating an assigned threshold further comprises, for each section having unimodal intensity in a set of n-fold sections, means for designating an average of the computed thresholds for other sections in the set if said other sections have bimodal intensity, and otherwise designating the threshold intensity of the parent of the set, the section thresholds being designated successively through successive subdividings until a block threshold is designated for each block.
26. The storage medium of claim 25 wherein the average of the computed thresholds for other sections in the set is designated only if all other sections in the set have nonzero threshold intensity.
27. The storage medium of claim 25 wherein n-fold is fourfold.
28. The storage medium of claim 23 wherein each pixel having the second value of intensity is a background pixel, each other pixel is an object pixel, and a search entity is defined mathematically as a combination of a pin, a rod pivotable on the pin, and a wheel attached to the rod approximately one pixel space from the pin, the pin being constrained to move among object pixels, and the wheel being constrained to move among background pixels, and the means for delineating the binary objects comprise, in sequence:
means for searching in a first-hand direction from a corner of the image area to locate an initial object pixel;
means for placing the pin on the initial pixel with the wheel in an adjacent background pixel;
means for moving the wheel in a wheel motion with the pin retained, the wheel motion being in a second-hand direction opposite from the first-hand direction until the wheel is constrained by an object pixel;
means for moving the pin in a pin motion with the wheel retained on its latest background pixel, the pin motion being in the first-hand direction to a next object pixel and, unless the pin is constrained, further moving the pin until the pin is constrained by a background pixel;
means for detecting whether the pin and the wheel are respectively constrained simultaneously by an adjacent object pixel and an adjacent background pixel, and, if the pin and the wheel are so constrained, means for skipping the wheel over the adjacent object pixel to a next background pixel;
means for repeating the pin motion and the wheel motion alternately, and the detecting and the skipping if simultaneous constraining is detected, until the initial object pixel is reached;
means for identifying all pixels traversed by the pin as boundary pixels, thereby delineating a binary object; and means for further repeating the searching, the pin motion and the wheel motion, and the detecting and the skipping if simultaneous constraining is detected, and the identifying, until no further initial pixel is located, thereby delineating other binary objects.
29. The storage medium of claim 28 wherein the means for searching comprise means for initially labeling as an edge pixel each object pixel having at least one neighboring background pixel, and means for searching only for edge pixels to locate an initial pixel.
30. The storage medium of claim 29 wherein the boundary pixels of each binary object are identified before further searching for a next initial pixel, and the means for identifying comprise means for assigning a further value to each boundary pixel so as to remove associated edge pixels from the further searching.
31. The storage medium of claim 30 further comprising means for reassigning each boundary pixel with the first value after no further initial pixel is located.
32. The storage medium of claim 28 wherein the means for identifying comprise means for assigning a further value to each boundary pixel.
33. The storage medium of claim 32 further comprising means for utilizing each further value to display boundary images.
PCT/US1999/015796 1998-07-14 1999-07-13 Automatic masking of objects in images WO2000004497A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
AU49898/99A AU754884B2 (en) 1998-07-14 1999-07-13 Automatic masking of objects in images
AT99933959T ATE217995T1 (en) 1998-07-14 1999-07-13 AUTOMATIC MASKING OF OBJECTS IN IMAGES
JP2000560543A JP2002520746A (en) 1998-07-14 1999-07-13 Automatic masking of objects in images
CA002336885A CA2336885A1 (en) 1998-07-14 1999-07-13 Automatic masking of objects in images
DE69901565T DE69901565T2 (en) 1998-07-14 1999-07-13 AUTOMATIC MASKING OF OBJECTS IN IMAGES
EP99933959A EP1095357B1 (en) 1998-07-14 1999-07-13 Automatic masking of objects in images

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US9278598P 1998-07-14 1998-07-14
US60/092,785 1999-07-13
US09/351,660 1999-07-13

Publications (1)

Publication Number Publication Date
WO2000004497A1 true WO2000004497A1 (en) 2000-01-27

Family

ID=22235150

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/015796 WO2000004497A1 (en) 1998-07-14 1999-07-13 Automatic masking of objects in images

Country Status (1)

Country Link
WO (1) WO2000004497A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5119439A (en) * 1990-02-06 1992-06-02 Dainippon Screen Mfg. Co., Ltd. Method of and apparatus for extracting image contour data
WO1997043732A1 (en) * 1996-05-10 1997-11-20 Oncometrics Imaging Corp. Method and apparatus for automatically detecting malignancy-associated changes
US5901245A (en) * 1997-01-23 1999-05-04 Eastman Kodak Company Method and system for detection and characterization of open space in digital images

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1617207A3 (en) * 1999-07-21 2006-05-31 Applera Corporation Luminescence detection workstation
WO2003012742A2 (en) * 2001-07-27 2003-02-13 3M Innovative Properties Company AUTOTHRESHOLDING OF NOISY IMAGES
WO2003012742A3 (en) * 2001-07-27 2004-02-19 3M Innovative Properties Co AUTOTHRESHOLDING OF NOISY IMAGES
US6961476B2 (en) 2001-07-27 2005-11-01 3M Innovative Properties Company Autothresholding of noisy images
WO2007105107A2 (en) * 2006-03-14 2007-09-20 Agency For Science, Technology And Research Methods, apparatus and computer-readable media for image segmentation
WO2007105107A3 (en) * 2006-03-14 2008-01-24 Agency Science Tech & Res Methods, apparatus and computer-readable media for image segmentation
WO2017158560A1 (en) * 2016-03-18 2017-09-21 Leibniz-Institut Für Photonische Technologien E.V. Method for examining distributed objects by segmenting an overview image
CN109478230A (en) * 2016-03-18 2019-03-15 光学技术注册协会莱布尼兹研究所 The method for checking distributed objects by segmentation general view image
US11599738B2 (en) 2016-03-18 2023-03-07 Leibniz-Institut Für Photonische Technologien E.V. Method for examining distributed objects by segmenting an overview image

Similar Documents

Publication Publication Date Title
US5566249A (en) Apparatus for detecting bubbles in coverslip adhesive
CN111445478B (en) Automatic intracranial aneurysm region detection system and detection method for CTA image
Kittler et al. Threshold selection based on a simple image statistic
JP3296494B2 (en) Adaptive display system
US5982916A (en) Method and apparatus for automatically locating a region of interest in a radiograph
JP3296492B2 (en) How to identify and characterize valid objects by color
Spirkovska A summary of image segmentation techniques
JP3296493B2 (en) How to determine interior points of objects in the background
Naufal et al. Preprocessed mask RCNN for parking space detection in smart parking systems
US7609887B2 (en) System and method for toboggan-based object segmentation using distance transform
JP3490482B2 (en) Edge and contour extraction device
EP0681722A1 (en) Methods for determining the exterior points of an object in a background
US7565009B2 (en) System and method for dynamic fast tobogganing
WO2001008098A1 (en) Object extraction in images
EP1095357B1 (en) Automatic masking of objects in images
WO2000004497A1 (en) Automatic masking of objects in images
Wilkinson Automated and manual segmentation techniques in image analysis of microbes
Dinç et al. Super-thresholding: Supervised thresholding of protein crystal images
Bhardwaj et al. An imaging approach for the automatic thresholding of photo defects
CN113420636A (en) Nematode identification method based on deep learning and threshold segmentation
Tsai A new approach for image thresholding under uneven lighting conditions
JP3539581B2 (en) Particle image analysis method
Durak et al. Automated Coronal-Loop Detection based on Contour Extraction and Contour Classification from the SOHO/EIT Images
Li et al. Automatic tracking of proteins in sequences of fluorescence images
Qin Text Spotting in the Wild

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2336885

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 49898/99

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 1999933959

Country of ref document: EP

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2000 560543

Kind code of ref document: A

Format of ref document f/p: F

WWP Wipo information: published in national office

Ref document number: 1999933959

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 1999933959

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 49898/99

Country of ref document: AU