WO1996009610A1 - Method and apparatus for detecting a microscope slide coverslip - Google Patents


Info

Publication number
WO1996009610A1
WO1996009610A1 (PCT/US1995/011391)
Authority
WO
WIPO (PCT)
Prior art keywords
edge
coverslip
slide
image
microscope
Prior art date
Application number
PCT/US1995/011391
Other languages
French (fr)
Inventor
Mikel D. Rosenlof
Robert C. Schmidt
Shi-Jong J. Lee
Chih-Chau L. Kuan
Original Assignee
Neopath, Inc.
Priority date
Filing date
Publication date
Application filed by Neopath, Inc. filed Critical Neopath, Inc.
Priority to DE0782738T priority Critical patent/DE782738T1/en
Priority to EP95931766A priority patent/EP0782738A4/en
Priority to CA002200445A priority patent/CA2200445C/en
Priority to AU35083/95A priority patent/AU709029B2/en
Priority to JP8510932A priority patent/JPH10506461A/en
Publication of WO1996009610A1 publication Critical patent/WO1996009610A1/en
Priority to GR980300049T priority patent/GR980300049T1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G06V10/245 Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning

Definitions

  • This invention relates to an apparatus for detecting a microscope slide coverslip, and more particularly to an apparatus for determining the boundaries of a microscope slide coverslip prior to processing of images of areas under the coverslip.
  • Automated cytology systems such as the NeoPath Autopap 300 (TM) automated cytology system available from NeoPath Corporation of Bellevue, Washington analyze an image of a biological specimen for evidence of objects such as squamous intraepithelial lesions and cancerous cells.
  • the biological specimen is typically fixed to a microscope slide where the microscope slide includes a coverslip over the biological specimen. It is advantageous to detect the microscope slide coverslip prior to the analysis of the biological specimen for a number of reasons.
  • the image of the specimen is considered of highest quality, exhibiting regular and uniform optical properties only if it is contained under the coverslip.
  • the coverslip position is invariant with respect to the sample, and, as a result, the coverslip may be used as a fiducial optical element for locating specific objects in the sample.
  • microscope objective lenses are designed to image specimens under a coverslip of consistent thickness.
  • images acquired from any biological specimens not lying under the coverslip lack the proper optical properties necessary for high quality image analysis. Therefore, the location of the coverslip must be detected with a very high degree of certainty, or the entire slide may be rejected as unsatisfactory for analysis.
  • the coverslip may span more than one field of view.
  • Traditional edge detection techniques do not address the analysis of a small number of partial edges for reconstruction of a coverslip object spanning multiple fields of view.
  • coverslips are typically applied by hand; as a result, the quantity of adhesive used tends to vary from slide to slide. In some cases the adhesive does not reach the edge of the coverslip and voids are created in the adhesive. The edges of the voids may appear much like coverslip edges. In other cases, the adhesive may ooze out from under the coverslip, causing dirt to collect on the adhesive, which changes the optical properties around the edge.
  • the specimen may have structures that, in some cases, resemble edges. Such structures may degrade or prevent the successful location of the coverslip.
  • the coverslip and adhesive together range in thickness from about 0.10 mm to 0.14 mm.
  • the average coverslip with adhesive is approximately 0.12 mm thick.
  • the slide is approximately 1.0 mm thick. The differences in thickness between the slide and the coverslip lead to the slide forming a much stronger edge image which tends to overwhelm the image formed by the coverslip edge when magnified by the microscope.
  • corners of the coverslip may be used as fiducial marks for defining a coordinate system to locate objects on the slide.
  • An error even as small as one degree in calculating the angle of the edge line, from the center of a 60 mm coverslip yields an error at the coverslip extreme of 0.52 mm. Therefore, a high degree of accuracy and repeatability is necessary to define coordinates which are useful for precisely locating such objects. It is one motivation of the invention to provide an apparatus to locate a microscope slide coverslip spanning multiple fields of view to provide a longer basis for more accurately determining the coverslip position.
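As a quick check of the figure quoted above, a one degree angular error acting over the 30 mm half-length of a 60 mm coverslip displaces the far corner by roughly half a millimetre. The short calculation below only illustrates that arithmetic; it is not part of the patented method.

```python
import math

# 1 degree error over half of a 60 mm coverslip (a 30 mm lever arm)
print(30.0 * math.tan(math.radians(1.0)))  # about 0.524 mm, i.e. roughly 0.52 mm
```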
  • the invention provides a coverslip detection apparatus that locates all four coverslip edges.
  • the apparatus of the invention comprises a field of view processor coupled to receive image data from a charge coupled device camera.
  • the camera views a slide and coverslip that is mounted on a movable frame.
  • the slide is illuminated with a uniform light source.
  • the moveable frame is controlled by a computer in response to the host computer.
  • the host locates the coverslip by first positioning the movable frame to view a portion of the slide within a predetermined area of the slide.
  • the slide is then reimaged after the movable frame moves the slide in a chosen direction toward an edge.
  • Edge objects are located and followed over multiple fields of view. If an edge object satisfies a set of criteria, the coverslip edge has been found.
  • Figures 1A, 1B, 1C, 1D and 1E schematically illustrate the slide imaging apparatus of the invention.
  • Figure 2 shows a schematic of a microscope slide in a slide receptacle with the smallest standard coverslip installed at two positional extremes.
  • Figure 3 shows a schematic diagram of processing used to detect the field of view edge.
  • Figure 4 shows a schematic of processing required to image the edge with a filter.
  • Figure 5 shows a coverslip detection flowchart.
  • Figure 6 shows a method of accepting or rejecting an edge.
  • Figures 7A, 7B and 7C show edge fit methods used to determine a true edge.
  • Figures 8A, 8B, 8C, 8D, 8E and 8F show examples of filtered projection results.
  • the system disclosed herein is used in a system for analyzing cervical pap smears, such as that shown and disclosed in U.S. Patent Application Serial No. 07/838,064, entitled "Method For Identifying Normal Biomedical Specimens", by Alan C. Nelson, et al., filed February 18, 1992; U.S. Patent Application Serial No. 08/179,812, filed January 10, 1994, which is a continuation in part of U.S. Patent Application Serial No. 07/838,395, entitled "Method For Identifying Objects Using Data Processing Techniques", by S. James Lee, et al.
  • the present invention is also related to biological and cytological systems as described in the following patent applications, which are assigned to the same assignee as the present invention, filed on September 20, 1994 unless otherwise noted, and which are all hereby incorporated by reference, including U.S. Patent Application Serial No. 08/309,118, to Kuan et al., entitled "Field Prioritization Apparatus and Method," U.S. Patent Application Serial No. 08/309,061, to Wilhelm et al., entitled "Apparatus for Automated Identification of Cell Groupings on a Biological Specimen," U.S. Patent Application Serial No. 08/309,116 to Meyer et al.
  • the apparatus of the invention comprises an imaging system 502, a motion control system 504, an image processing system 536, a central processing system 540, and a workstation 542.
  • the imaging system 502 is comprised of an illuminator 508, imaging optics 510, a CCD camera 512, an illumination sensor 514 and an image capture and focus system 516.
  • the image capture and focus system 516 provides video timing data to the CCD cameras 512, and the CCD cameras 512 provide images comprising scan lines to the image capture and focus system 516.
  • the illumination sensor 514 receives a sample of the image from the optics 510 and provides an illumination intensity signal to the image capture and focus system 516.
  • the optics may further comprise an automated microscope 511.
  • the illuminator 508 provides illumination of a slide.
  • the image capture and focus system 516 provides data to a VME bus 538.
  • the VME bus distributes the data to an image processing system 536.
  • the image processing system 536 is comprised of field of view processors 568.
  • the images are sent along the image bus 564 from the image capture and focus system 516.
  • a central processor 540 controls the operation of the invention through the VME bus 538.
  • the central processor 562 comprises a MOTOROLA 68030 (TM) CPU.
  • the motion controller 504 is comprised of a tray handler 518, a microscope stage controller 520, a microscope tray controller 522, and a calibration slide 524.
  • the motor drivers 526 position the slide under the optics.
  • a bar code reader 528 reads a barcode located on the slide 524.
  • a touch sensor 530 determines whether a slide is under the microscope objectives, and a door interlock 532 prevents operation if the doors are open.
  • Motion controller 534 controls the motor drivers 526 in response to the central processor 540.
  • An Ethernet (TM) communication system 560 communicates to a workstation 542 to provide control of the system.
  • a hard disk 544 is controlled by workstation 550.
  • workstation 550 may comprise a SUN SPARC CLASSIC (TM) workstation.
  • a tape drive 546 is connected to the workstation 550 as well as a modem 548, a monitor 552, a keyboard 554, and a mouse pointing device 556.
  • a printer 558 is connected to the Ethernet (TM) communication system 560.
  • the central computer 540 controls the microscope 511 and the processor to acquire and digitize images from the microscope 511.
  • the flatness of the slide may be checked, for example, by contacting the four corners of the slide using a computer controlled touch sensor.
  • the computer 540 also controls the microscope 511 stage to position the specimen under the microscope objective, and from one to fifteen field of view (FOV) processors 568 which receive images under control of the computer 540.
  • FOV (field of view)
  • the host computer 568 runs a real time operating system that controls the operation of the remainder of the system comprising a microscope 26 with a xenon strobe 508 for illumination, a processor 536 to acquire and digitize images from the microscope 26, a computer controlled microscope stage 40 to position the specimen under the microscope objective 30, and from one to 15 Field Of View (FOV) processors 568 which receive images under control of the host computer and from the image acquisition processor 536.
  • the FOV processors 568 perform image processing and object classification operations on the images and transmit the results to the host computer.
  • Figure IE shows an expanded view of one corner of a microscope slide 12 covered by coverslip 14.
  • the adhesive 11 bonds the coverslip 14 to the slide.
  • Figure 2 shows an example slide 12 in slide receiving receptacle 20.
  • Slide 12 is a typical specimen glass slide having two general areas, the label area 18, which may comprise, for example, frosted glass largely covered by an opaque label, and the transparent specimen area 19.
  • a glass coverslip 14 is shown here in two locations indicated by dotted coverslips 14a, 14b.
  • a first slide location 15a and a second slide location 15b are examples of coverslip placement showing placement on the light transmissive portion 19 of slide 12.
  • the slide location 15a shows the location of the coverslip in the upper-left-most extreme in relation to Figure 2.
  • the slide location 15b shows the coverslip in the bottom-right-most extreme position.
  • a window region indicated by 16 and outlined by a border 17 is the region of the slide receptacle 20 where, due to the physical restrictions of the slide mount, a coverslip must always be present.
  • the slide 12 may be positioned anywhere within the limits of the slide receptacle 20 from the upper left hand corner of slide receptacle 20 to the bottom right hand corner. Regardless of the location of the slide 12 with respect to the slide receptacle 20, window region 16 provides a window where the coverslip will always be found. The invention exploits this condition by starting the search for the coverslip edge within the window region 16.
  • width refers to a dimension in X
  • height refers to a dimension in Y on a conventional Cartesian coordinate system when viewing a slide shown in a top view, such as in Figure 2.
  • the slide 12 position may vary within a rectangle of dimensions stageWidth by stageHeight.
  • stageWidth represents the width of the space for the slide within the receiving receptacle 20, and stageHeight represents the height of that space.
  • a portion of the slide 12 is designated the label area 18 which may not be covered by any portion of the coverslip 14a, 14b. The label area spans the height of the slide 12 and has a width defined as labelWidth from the left edge of the slide 12.
  • a coverslip 14a, 14b has minimum dimensions of coverslipWidth by coverslipHeight, and may be adhered to any point on the slide 12 such that no part of the coverslip extends beyond the slide edge, and no part of the coverslip extends into the label area. It follows that coverslipWidth is a parameter measuring the coverslip width and coverslipHeight is a parameter measuring the coverslip height .
  • the region 16 that must be covered by the coverslip may be computed.
  • maxLeft = stageWidth - coverslipWidth.
  • minRight = labelWidth + coverslipWidth.
  • the apparatus of the invention includes a tray 24 that may hold at least one slide, but preferably a number of slides. In one example of the invention, tray 24 holds up to eight slides in position for moving about under the microscope lens 30.
  • the tray 24 includes receptacle 20 where receptacle 20 includes an opening suitably sized to accept a standard range of slide sizes and where the opening may be advantageously large enough to facilitate insertion and removal of the slides.
  • the receptacle is sized to accommodate slide positions that may vary by as much as 2 millimeters about the X and Y axes relative to the receptacle 20.
  • since the coverslip 14 applied to the slide 12 is smaller than the slide 12, the coverslip position relative to the slide edges 22 may vary. This variation depends on both the slide size and the coverslip size.
  • the total variation in the coverslip position is the sum of the variation of the coverslip position on the slide 12 and the variation of the slide position in the tray 24.
  • the optical system 28 for illuminating the specimen requires that the area illuminated be clear of obstructions for 0.318 mm from the edge of any field imaged.
  • Coverslip edges, because of their effect on illumination, are considered obstructions.
  • as coverslip thicknesses and slide thicknesses vary, so will the size of the area that must be clear of obstructions.
  • the corners of the coverslip 14 may be used as fiducial marks to define a coordinate system for each slide.
  • a coordinate system based on the coverslip corners is invariant with respect to translation and rotation of the slide. Therefore, an object located in a coordinate system based on the coverslip corners may still be found if the slide is removed from the slide tray and replaced, or moved around in the tray. The slide may even be moved to a different type of microscope, provided that the different type of microscope can be used to accurately locate the same coverslip corners, and that the coverslip has not moved relative to the slide.
  • One embodiment of a coverslip edge detection process as contemplated by the invention comprises two sub-processes.
  • a first sub-process, called Edge Detection FOV Processing, is performed on the image processing computers to analyze one image per FOV acquired from the microscope and determine whether an edge exists.
  • a second sub-process, called FOV Processing Result Collection and Analysis, directs a searching pattern, collects the individual image results from the image processing computers, and analyzes the collected data to determine the edge positions and direct additional searches.
  • An edge detection FOV processor, which may be one of the FOV processors 568, receives one image as acquired from the microscope with a low power objective, such as a 4X objective.
  • an image comprises 512x512 pixels, with each pixel having a size of 2.75 microns square, yielding an image size of 1.408 millimeters square. It will be understood that the size of the image may vary depending upon the type of equipment used and the invention is not considered to be limited to the specific parameters of this illustrative example.
  • Image preparation uses several input parameters passed from the host computer 568, including queue location, edge search orientation (that is, horizontal or vertical orientation) and left or right edge position prioritization and normalizes the image 33. Normalization comprises the steps of orienting the image 33 to search in a vertical direction only and positioning the image in a predetermined image buffer for the image processing step. Horizontal edges are searched for by rotating the image 90 degrees and searching in the vertical direction.
  • the image processing operation accepts the original image in gray scale, enhances vertically oriented edges, and removes any falsely detected artifacts in the image 33, thereby improving the signal to noise ratio of the edge objects.
  • the original gray scale image 33 is horizontally dilated in step 52. It is also vertically dilated in step 54 and a horizontal closing is performed on it in step 56.
  • the output of the horizontal dilation step 52 may be subtracted from the original gray scale image in step 58 to provide a 35x1 horizontal dilation residue image 64.
  • the output of the vertical dilation step 54 is subtracted from the original image, step 60, to produce a 1x65 vertical dilation residue in the image at step 66.
  • the output of the 13x1 horizontal closing step 56 is subtracted from the original image, step 62, to produce a 13x1 closing residue image 68.
  • the 13x1 closing residue image 68 is subtracted from the 1x65 vertical dilation residue image to produce a thin edge image 76.
  • the 35x1 horizontal dilation residue image 64 is subtracted from the 1x65 vertical dilation residue image 66 to produce a thick edge image 74.
  • the thin edge image 76 is multiplied by two, step 80, and added to the thick edge image in step 78 to produce a combined edge image 82.
  • the combined edge image is filtered with the process outlined in Figure 4 to produce a filtered edge image 86.
  • the projection method of the invention determines whether there are edges in an image from the filtered edge image 86.
  • the image is divided into a number of regions. In one example the image is divided into 4 horizontal regions. This enables the method to adapt to local image contrast, since each region has its own filtered threshold.
  • Multiple regions allow the invention to process skewed edges. For example, given a skewed edge crossing the 4 regions, 1/4 of the edge will span each region, becoming an object hit. These hits may be correlated into one skewed edge in the final step of the projection process of the invention, resulting in a very robust method.
  • the correlation step processes broken edge images as well.
  • Figures 8A, 8B, 8C, 8D, 8E and 8F show the processing of a filtered edge image.
  • Figure 8A shows a filtered edge image.
  • Figure 8B shows the image divided into 4 equal horizontal regions where a vertical projection of pixels in each region has been performed. The sum of the pixel values of each column region has been computed.
  • Figure 8C shows the results of projecting each region. Higher values indicate more vertical edge energy.
  • a low pass filter is applied to each projection result curve.
  • Figure 8D shows curves that are the low pass results. The curves serve as thresholds to determine object hits in each region.
  • Figure 8E shows the object hits as detected. They might not be continuous due to poor image quality, improper focus or non-ideal image processing.
  • Figure 8F shows the edge detected as the object hits correlate to meet the minimum error criteria.
  • Referring now to Figure 4, a schematic of the processing required to image the edge with a filter is shown.
  • the combined edge image 82 is vertically eroded in a 1x5 vertical erode step 94, then vertically dilated in a 1x9 vertical dilate step 96 .
  • the image is then vertically eroded again in a 1x13 vertical erode step 104 and vertically dilated again in a 1x17 vertical dilate step 102. It is subsequently vertically eroded again in a 1x26 vertical erode step 98 and vertically dilated in a 1x36 vertical dilate step 106.
  • Another vertical erosion in 1x51 vertical erosion step 108 is followed by a vertical dilation step in 1x67 vertical dilate step 110 again, respectively.
  • the image is then vertically eroded again in a 1x101 vertical erode step 112 and vertically dilated in a 1x134 vertical dilate step 114, each with larger and larger structuring elements.
  • the processed output is a filtered edge image 86.
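The chain of growing vertical openings above can be summarized compactly. The following is a minimal sketch, assuming "1xN vertical" denotes a structuring element one pixel wide and N pixels tall; it illustrates the filtering idea rather than reproducing the patented implementation.

```python
# Sketch of the Figure 4 filtering chain (steps 94-114): successive vertical
# erode/dilate pairs with growing structuring elements remove objects that
# lack substantial contiguous vertical energy.
import numpy as np
from scipy.ndimage import grey_erosion, grey_dilation

def filter_edge_image(combined_edge: np.ndarray) -> np.ndarray:
    img = combined_edge
    # (erode height, dilate height) pairs taken from the steps above.
    for erode_h, dilate_h in [(5, 9), (13, 17), (26, 36), (51, 67), (101, 134)]:
        img = grey_erosion(img, size=(erode_h, 1))    # 1xN vertical erode
        img = grey_dilation(img, size=(dilate_h, 1))  # 1xM vertical dilate
    return img  # filtered edge image (86)
```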
  • the image may be substantially equally divided horizontally into four regions and the detected edge intensity may be projected onto these regions.
  • Each region is equal in width to the image and represents the projection of one fourth of the edge intensity.
  • the edge intensity projection comprises the summation of pixels in the vertical direction within the region. Objects with greater vertical energy sum to larger numbers than objects with less vertical energy.
  • the object projection values are lowpass filtered in one direction and thresholded. Each occurrence of the region projection exceeding the filtered thresholded projection is called an object hit.
  • FIGS. 8A, 8B, 8C, 8D, 8E and 8F show examples of edges that have been detected as object hits.
  • the object hits from each group are analyzed for their correlation to a perfect vertical line.
  • Potential edges are made up of one object hit from each region.
  • qualifications for an edge include a minimum of 3 region hits, i.e. the potential edge must span three regions or three quarters of the image. The span need not include contiguous regions.
  • Yaw (rotation) of the objects must not vary from the vertical by more than 10 degrees.
  • Line segment correlation preferably meets a minimum error criterion. If the points being considered meet the above criteria, then an edge is considered to be detected and the image pixel position for the edge is saved and returned.
  • a maximum of three edges that satisfy the correlation criteria are returned to the host. If more than three edges are present, one additional direction input to the FOV processing determines the position (leftmost or rightmost) of the three edges returned.
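To make the qualification test above concrete, here is a hedged sketch. The hit representation (region index, column) and the numeric error limit are assumptions introduced for illustration; only the three-region minimum and the 10 degree yaw limit come from the text.

```python
import numpy as np

MAX_YAW_DEG = 10.0     # from the text: rotation from vertical must not exceed 10 degrees
MAX_FIT_ERROR = 3.0    # assumed pixel tolerance for the line correlation criterion

def qualify_edge(hits, rows_per_region):
    """hits: list of (region_index, column), at most one hit per region.
    Returns an estimated edge column if the hits form an acceptable
    near-vertical edge, otherwise None."""
    if len(hits) < 3:                              # must span at least 3 of the 4 regions
        return None
    ys = np.array([(r + 0.5) * rows_per_region for r, _ in hits], dtype=float)
    xs = np.array([c for _, c in hits], dtype=float)
    slope, intercept = np.polyfit(ys, xs, 1)       # least-squares line x = slope*y + intercept
    yaw_deg = np.degrees(np.arctan(slope))         # deviation from a perfect vertical line
    if abs(yaw_deg) > MAX_YAW_DEG:
        return None
    error = np.mean(np.abs(xs - (slope * ys + intercept)))
    if error > MAX_FIT_ERROR:
        return None
    return float(slope * ys.mean() + intercept)    # edge position near the image centre
```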
  • the slide coordinate system is specified with the long edge of the slide as the X axis, and the short side as the Y axis.
  • a label region of the slide, typically with frosted glass, is placed to the left, so the positive X axis points away from the label, and the positive Y axis points upward.
  • Edges are referred to as the left edge-the vertical edge with the smaller X value the right edge-the vertical edge with the larger X value the bottom edge-the horizontal edge with the smaller Y value and the top edge-the horizontal edge with the larger Y value.
  • the intersection of the bottom and left slide edges is the origin of the coordinate system.
  • a scan comprises the steps of specifying, to the automated microscope image acquisition and processing system, a starting point, a direction, a spacing, and a count of images to acquire. Also specified is the type of edge to look for, horizontal or vertical.
  • the image collection and processing system acquires images, and returns the positions of the edges detected within each image. Edges are searched in the order of left, step 124, bottom, step 126, top, step 128, then right, step 130. Failing to find any one edge aborts any attempt to locate edges later in the list in step 132.
  • the search for a particular edge begins with a scan orthogonal to the edge in question. This scan is called a crossing search step 154.
  • the crossing search starts inside the area which must be covered by the coverslip, region 16, with a spacing of 75 percent of the width of one image acquired from the microscope. If the bottom or top edge is the target of the search, a horizontal edge is expected; a vertical edge is expected for the left and right edge searches.
  • a following search 156 is performed.
  • the following search 156 begins centered at the point where the edge was detected, the direction is parallel to the edge in question, the spacing and count are chosen to give enough edge detections to ensure a good edge fit, and to not search outside the area that has been calculated to be within the coverslip.
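As an illustration of the spacing rule above, the small sketch below lays out crossing-search positions in millimetres; the image width comes from the 512 pixel by 2.75 micron figure given earlier, while the scan count and units are assumptions.

```python
IMAGE_WIDTH_MM = 512 * 0.00275          # 1.408 mm per image, from the text
CROSSING_SPACING_MM = 0.75 * IMAGE_WIDTH_MM

def crossing_search_positions(start_mm, outer_bound_mm, count=5):
    """Scan positions along the axis orthogonal to the sought edge, starting
    inside region 16 and stepping toward the coverslip outer bound."""
    step = CROSSING_SPACING_MM if outer_bound_mm >= start_mm else -CROSSING_SPACING_MM
    return [start_mm + i * step for i in range(count)]
```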
  • FIG. 5 shows one example of a coverslip detection method of the invention.
  • the coverslip detection process starts with step 122.
  • the process first tries to find the left edge of the coverslip in step 124. If the left edge is found, the process proceeds to find the bottom edge in step 126. If the left edge is not found the process flows to the reject step 132 where the specimen is rejected. If the bottom edge is found in step 126 the process flows to step 128 to find the top edge. If the top edge is found, the process flows to step 130 to find the right edge. If the bottom, top, or right edge is not found the processes all flow to step 132. If in step 130 the right edge is found, the process flows to step 136 to project the corners of the coverslip.
  • step 134 determines whether the corners are square.
  • the detected edges are validated to ensure that true edges have been found.
  • the coverslip corners are projected by calculating the intersection of adjacent edge pairs. These intersection points are used in some of the following evaluations.
  • a coverslip is rectangular, with 90 degree corners, so the angle between adjacent edges is compared against a pre-defined standard. If a corner angle is outside the standard, the slide 12 is rejected for analysis as unable to locate the coverslip. If in step 134 the corners are determined not to be square then the slide is rejected in step 132.
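A minimal sketch of the corner projection and squareness test follows. The line representation (a point plus a direction vector for each fitted edge) and the angular tolerance are assumptions for illustration; the patent only states that the angle is compared to a pre-defined standard.

```python
import numpy as np

CORNER_TOLERANCE_DEG = 2.0   # assumed tolerance about 90 degrees

def intersect(line_a, line_b):
    """Each line is (point, direction) in stage coordinates; returns the
    intersection of the two (non-parallel) lines, i.e. a projected corner."""
    p, d = np.asarray(line_a[0], float), np.asarray(line_a[1], float)
    q, e = np.asarray(line_b[0], float), np.asarray(line_b[1], float)
    A = np.array([[d[0], -e[0]], [d[1], -e[1]]])
    t, _ = np.linalg.solve(A, q - p)      # solve p + t*d = q + s*e
    return p + t * d

def corner_is_square(line_a, line_b):
    d, e = np.asarray(line_a[1], float), np.asarray(line_b[1], float)
    cos_angle = abs(np.dot(d, e)) / (np.linalg.norm(d) * np.linalg.norm(e))
    angle = np.degrees(np.arccos(np.clip(cos_angle, 0.0, 1.0)))
    return abs(angle - 90.0) <= CORNER_TOLERANCE_DEG
```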
  • step 138 determines whether the lengths are multiples of 10 mm.
  • Coverslips are typically manufactured with a width that is a multiple of 10 mm. The width of the detected coverslip is compared to 40, 50, and 60 mm within a pre-defined tolerance in step 140. If the width is outside the tolerance at all of these widths, one more search is made to attempt to find the true edge.
  • the right edge is assumed to be correct, and a follow search 156 is performed at positions 40, 50, and 60 mm from the right edge. If such an edge is found, the coverslip corner angles are evaluated again, as are the following criteria. This retry search is performed only one time. The technique of assuming one edge position is true and looking for the opposite edge at fixed distances could be extended to all four edges, but practice up to now has not shown a need to do this. If they are not multiples of 10 mm the process searches for the left edge at 40 mm, 50 mm and 60 mm from the right in step 140.
  • If the left edge is found, the process returns to use the new data obtained in step 140 to project the corners again in step 136.
  • Returning to steps 134 and 138: if in step 138 the lengths are 10 mm multiples, the process flows to step 142 to determine whether the width of the coverslip is acceptable. If in step 138 the length is not found to be 10 mm or a multiple thereof after the process is repeated, the process flows to step 132. If in step 142 the width is acceptable, the process flows to step 144 to determine whether the skew angle is acceptable, and if the skew angle is acceptable in step 144, the process flows to step 148 to report that the coverslip may be accepted. If the width is not acceptable in step 142, the process rejects the slide in step 146.
  • If in step 140 the search for a left edge at 40 mm, 50 mm and 60 mm from the right is unsuccessful, the process flows to step 146 to reject the slide.
  • Figure 6 shows a detailed process flow for determining where an edge is while looking for the left, bottom, top or right edge.
  • the process starts at step 152 to begin the edge finding process.
  • the process flows to step 154 to perform a crossing search. If an edge is found in step 154, the process flows to the following search 156; if no edge is found, the crossing search is repeated up to five times until an edge is found. If after five searches no edge has been found, the process reports that the edge was not found. After the following search, the process flows to step 158 to rank edge fit candidates.
  • The process then flows to step 172 to select the highest ranked candidate.
  • step 174 determines whether the highest candidate qualifies against a predetermined set of criteria. If the candidate does qualify the process flows to step 176 to report that the edge has been accepted. If the candidate does not qualify the process flows to step 178 to select the next highest candidate. If the next highest candidate does qualify, the process flows to step 176 again. If all the candidates have been exhausted, the process flows to step 180 to reject the edge.
  • Figure 7A shows a cross section of one scan of the apparatus of the invention.
  • a bubble edge 202 and an adhesive globule 204 are shown with the slide edge 208 and the coverslip edge 206.
  • Figure 7A shows what typically is found while imaging structures on a slide including a slide edge 208 next to a coverslip edge 206, and a globule of adhesive which extends beyond the coverslip.
  • Figure 7B shows detected edge positions represented by diamonds, D. Images f1 through f7 represent a following search. Images c1 through c5 represent a crossing search. Figure 7B shows a five image crossing search, and the location of the first detected edge from the crossing search. It also shows the locations of the images of a seven image following search; the edge detected within each image is indicated with a diamond symbol. The areas shown for the crossing search are offset slightly for clarity.
  • Figure 7C schematically illustrates how the process of the invention determines four lines 182, 184, 186, and 188 that may be fit from combinations of the detected edges.
  • the positions where edges are detected form a set of points on a Cartesian plane to which lines can be fit.
  • Trial lines are fit using at most one edge from each image.
  • the line fit minimizes the sum of the squared distance of the used points from the line in a conventional least squares linear fit.
  • Possible edge fits are evaluated by a process using at least four points to define the line.
  • the total error, defined as the sum of the absolute differences of the measured positions from the calculated positions, normalized by the number of points used in the fit (pointCount), must not exceed a pre-defined value:
  • totalError = SUM( abs( measured_i - calculated_i ) ) / pointCount. Lines with a smaller total error, as defined in the equation above, are favored. Lines closer to the center of the coverslip are favored.
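The sketch below illustrates this evaluation: a conventional least-squares line is fit to the detected points and the mean absolute deviation is compared with a limit. The error limit value and the choice to fit x as a function of y (appropriate for a near-vertical edge; the roles swap for horizontal edges) are assumptions for illustration.

```python
import numpy as np

MAX_TOTAL_ERROR = 2.0    # assumed limit on the mean absolute deviation

def fit_and_score(points):
    """points: detected edge positions (x, y), at most one per image; a
    valid fit uses at least four points. Returns ((slope, intercept), error)
    for the line x = slope*y + intercept, or None if the fit is rejected."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 4:
        return None
    x, y = pts[:, 0], pts[:, 1]
    slope, intercept = np.polyfit(y, x, 1)          # conventional least-squares fit
    calculated = slope * y + intercept
    total_error = np.sum(np.abs(x - calculated)) / len(pts)
    if total_error > MAX_TOTAL_ERROR:
        return None
    return (slope, intercept), total_error
```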
  • edge fit criteria listed above with respect to Figure 6 are in priority order.
  • the search keeps data on the best edge found so far in step 158 and 172, and as a new candidate is evaluated checks each item on the list.
  • the first two are absolute standards which must be met in step 174 for the candidate to be considered.
  • the third item is used to select the line which gives the lowest total error. Selecting the line closest to the center of the coverslip is used as a tie breaker step 178.
  • Coverslips are typically manufactured with a height within a specific tolerance. If the detected coverslip height is outside this tolerance, the slide is rejected because it is not possible to locate the coverslip. Experience shows that height failures are much less common than width failures, so no retry search is implemented.
  • a coverslip may be affixed to a microscope slide so that the coverslip edges are not precisely parallel to the slide edges.
  • the slide may also be placed in the tray so that the slide edges are not precisely aligned with the stage's coordinate system. The sum of these variations in placement yield a maximum angle that the coverslip may be rotated relative to the stage coordinate system.
  • the detected coverslip orientation is determined by calculating the angle of the line from the lower left to lower right corner projection points. If this angle is outside a predetermined range, the slide is rejected for excessive coverslip skew.
  • the invention locates all four coverslip edges, with the detected edge no more than a specified distance inside the coverslip edge so that an adequate percentage of the sample under the coverslip is analyzed.
  • the method of the invention also locates edges with the goal of taking as little time as possible. In practical use, in one embodiment of the invention the implementation requires about 30 seconds.

Abstract

A field of view processor (568) receives image data from a charge coupled device camera (512) which images a slide (12) and coverslip (14) mounted on a movable frame (40) illuminated from below with a uniform light source (508). The field of view processor locates the coverslip by finding edge objects which satisfy a predetermined set of criteria.

Description

METHOD AND APPARATUS FOR DETECTING A MICROSCOPE SLIDE COVERSLIP
FIELD OF THE INVENTION
This invention relates to an apparatus for detecting a microscope slide coverslip, and more particularly to an apparatus for determining the boundaries of a microscope slide coverslip prior to processing of images of areas under the coverslip.
BACKGROUND OF THE INVENTION Automated cytology systems such as the NeoPath Autopap 300 (TM) automated cytology system available from NeoPath Corporation of Bellevue, Washington analyze an image of a biological specimen for evidence of objects such as squamous intraepithelial lesions and cancerous cells. The biological specimen is typically fixed to a microscope slide where the microscope slide includes a coverslip over the biological specimen. It is advantageous to detect the microscope slide coverslip prior to the analysis of the biological specimen for a number of reasons. The image of the specimen is considered of highest quality, exhibiting regular and uniform optical properties only if it is contained under the coverslip. When applied, the coverslip position is invariant with respect to the sample, and, as a result, the coverslip may be used as a fiducial optical element for locating specific objects in the sample.
In automated cytology systems, microscope objective lenses are designed to image specimens under a coverslip of consistent thickness. Thus, images acquired from any biological specimens not lying under the coverslip lack the proper optical properties necessary for high quality image analysis. Therefore, the location of the coverslip must be detected with a very high degree of certainty, or the entire slide may be rejected as unsatisfactory for analysis.
In some imaging applications the coverslip may span more than one field of view. Traditional edge detection techniques do not address the analysis of a small number of partial edges for reconstruction of a coverslip object spanning multiple fields of view.
In the prior art, locating a coverslip by its edges has proven to be difficult for several reasons. For example, coverslips are typically applied by hand; as a result, the quantity of adhesive used tends to vary from slide to slide. In some cases the adhesive does not reach the edge of the coverslip and voids are created in the adhesive. The edges of the voids may appear much like coverslip edges. In other cases, the adhesive may ooze out from under the coverslip, causing dirt to collect on the adhesive, which changes the optical properties around the edge.
In addition, the specimen may have structures that, in some cases, resemble edges. Such structures may degrade or prevent the successful location of the coverslip.
In a typical slide the coverslip and adhesive together range in thickness from about 0.10 mm to 0.14 mm. The average coverslip with adhesive is approximately 0.12 mm thick. The slide is approximately 1.0 mm thick. The differences in thickness between the slide and the coverslip lead to the slide forming a much stronger edge image which tends to overwhelm the image formed by the coverslip edge when magnified by the microscope.
Traditional edge detection techniques also have difficulty in differentiating transition areas from slide edges. For example, microscope specimens typically have an opaque label attached at one end of a slide. The transition area between the opaque label and the transparent slide creates a very strong edge image. Traditional edge detection techniques often detect this strong edge image better than the weaker images of the edges of the coverslip.
Since the position of a coverslip may vary, it is necessary to accurately locate the coverslip on each slide before a slide specimen can be analyzed. For acceptable results, analysis must take place more than a predetermined distance away from the coverslip edges. Further still, data may only be considered valid if it is acquired from under the coverslip.
Further, since slide analysis requires that a data plane of material on the slide be measured at a number of positions, a map is usually developed to estimate correct focus positions. Because a microscope objective lens is typically corrected for spherical aberrations associated with a particular coverslip and adhesive thickness, focus positions are only meaningful if focused through the coverslip and adhesive. Therefore, focus map positions must be measured through the coverslip.
Further still, corners of the coverslip may be used as fiducial marks for defining a coordinate system to locate objects on the slide. An error even as small as one degree in calculating the angle of the edge line, from the center of a 60 mm coverslip yields an error at the coverslip extreme of 0.52 mm. Therefore, a high degree of accuracy and repeatability is necessary to define coordinates which are useful for precisely locating such objects. It is one motivation of the invention to provide an apparatus to locate a microscope slide coverslip spanning multiple fields of view to provide a longer basis for more accurately determining the coverslip position. SUMMARY OF THE INVENTION
The invention provides a coverslip detection apparatus that locates all four coverslip edges. The apparatus of the invention comprises a field of view processor coupled to receive image data from a charge coupled device camera. The camera views a slide and coverslip that is mounted on a movable frame. The slide is illuminated with a uniform light source. The movable frame is controlled by a computer in response to the host computer. The host locates the coverslip by first positioning the movable frame to view a portion of the slide within a predetermined area of the slide. The slide is then reimaged after the movable frame moves the slide in a chosen direction toward an edge. Edge objects are located and followed over multiple fields of view. If an edge object satisfies a set of criteria, the coverslip edge has been found.
Other objects, features and advantages of the present invention will become apparent to those skilled in the art through the description of the preferred embodiment, claims and drawings herein wherein like numerals refer to like elements. BRIEF DESCRIPTION OF THE DRAWINGS To illustrate this invention, a preferred embodiment will be described herein with reference to the accompanying drawings.
Figures 1A, 1B, 1C, 1D and 1E schematically illustrate the slide imaging apparatus of the invention. Figure 2 shows a schematic of a microscope slide in a slide receptacle with the smallest standard coverslip installed at two positional extremes.
Figure 3 shows a schematic diagram of processing used to detect the field of view edge. Figure 4 shows a schematic of processing required to image the edge with a filter.
Figure 5 shows a coverslip detection flowchart. Figure 6 shows a method of accepting or rejecting an edge. Figures 7A, 7B and 7C show edge fit methods used to determine a true edge.
Figures 8A, 8B, 8C, 8D, 8E and 8F show examples of filtered projection results.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT In a presently preferred embodiment of the invention, the system disclosed herein is used in a system for analyzing cervical pap smears, such as that shown and disclosed in U.S. Patent Application Serial No. 07/838,064, entitled "Method For Identifying Normal Biomedical Specimens", by Alan C. Nelson, et al . , filed February 18, 1992; U.S. Patent Application Serial No. 08/179,812 filed January 10, 1994 which is a continuation in part of U.S. Patent Application Serial No. 07/838,395, entitled "Method For Identifying Objects Using Data Processing Techniques", by S. James Lee, et al. , filed February 18, 1992; U.S. Patent Application Serial No. 07/838,070, now U.S. Pat. No. 5,315,700, entitled "Method And Apparatus For Rapidly Processing Data Sequences", by Richard S. Johnston, et al . , filed February 18, 1992; U.S. Patent Application Serial No. 07/838,065, filed 02/18/92, entitled "Method and Apparatus for Dynamic Correction of Microscopic Image Signals" by Jon W. Hayenga, et al. ,* and U.S. Patent Application Serial No. 08/302,355, filed September 7, 1994 entitled "Method and Apparatus for Rapid Capture of Focused Microscopic Images" to Hayenga, et al . , which is a continuation- in-part of Application Serial No. 07/838,063 filed on February 18, 1992 the disclosures of which are incorporated herein, in their entirety, by the foregoing references thereto.
The present invention is also related to biological and cytological systems as described in the following patent applications which are assigned to the same assignee as the present invention, filed on September 20, 1994 unless otherwise noted, and which are all hereby incorporated by reference including U.S. Patent Application Serial No. 08/309,118, to Kuan et al . entitled, "Field Prioritization Apparatus and Method," U.S. Patent Application Serial No. 08/309,061, to Wilhelm et al. , entitled "Apparatus for Automated Identification of Cell Groupings on a Biological Specimen," U.S. Patent Application Serial No. 08/309,116 to Meyer et al . entitled "Apparatus for Automated Identification of Thick Cell Groupings on a Biological Specimen," U.S. Patent Application Serial No. 08/309,115 to Lee et al. entitled "Biological Analysis System Self Calibration Apparatus," U.S. Patent Application Serial No. 08/308,992, to Lee et al . entitled "Apparatus for Identification and Integration of Multiple Cell Patterns," U.S. Patent Application Serial No. 08/309,063 to Lee et al . entitled "A Method for Cytological System Dynamic Normalization," U.S. Patent Application Serial No. 08/309,077 to Rosenlof et al . entitled "Apparatus for Detecting Bubbles in Coverslip Adhesive," U.S. Patent Application Serial No. 08/309,931, to Lee et al. entitled "Cytological Slide Scoring Apparatus," U.S. Patent Application Serial No. 08/309,148 to Lee et al . entitled "Method and Apparatus for Image Plane Modulation Pattern Recognition," U.S. Patent Application Serial No. 08/309,250 to Lee et al . entitled "Apparatus for the Identification of Free-Lying Cells," U.S. Patent Application Serial No. 08/309,209 to Oh et al . entitled "A Method and Apparatus for Robust Biological Specimen Classification," U.S. Patent Application Serial No. 08/309,117, to Wilhelm et al. entitled "Method and Apparatus for Detection of Unsuitable Conditions for Automated Cytology Scoring."
It is to be understood that the various processes described herein may be implemented in software suitable for running on a digital processor. The software may be embedded, for example, in the central processor 540.
Now refer to Figures 1A, IB and 1C which show a schematic diagram of one embodiment of the apparatus of the invention for field of view prioritization. The apparatus of the invention comprises an imaging system 502, a motion control system 504, an image processing system 536, a central processing system 540, and a workstation 542. The imaging system 502 is comprised of an illuminator 508, imaging optics 510, a CCD camera 512, an illumination sensor 514 and an image capture and focus system 516. The image capture and focus system 516 provides video timing data to the CCD cameras 512, the CCD cameras 512 provide images comprising scan lines to the image capture and focus system 516. An illumination sensor intensity is provided to the image capture and focus system 516 where an illumination sensor 514 receives the sample of the image from the optics 510. In one embodiment of the invention, the optics may further comprise an automated microscope 511. The illuminator 508 provides illumination of a slide. The image capture and focus system 516 provides data to a VME bus 538. The VME bus distributes the data to an image processing system 536. The image processing system 536 is comprised of field of view processors 568. The images are sent along the image bus 564 from the image capture and focus system 516. A central processor 540 controls the operation of the invention through the VME bus 538. In one embodiment the central processor 562 comprises a MOTOROLA 68030 (TM) CPU. The motion controller 504 is comprised of a tray handler 518, a microscope stage controller 520, a microscope tray controller 522, and a calibration slide 524. The motor drivers 526 position the slide under the optics. A bar code reader 528 reads a barcode located on the slide 524. A touch sensor 530 determines whether a slide is under the microscope objectives, and a door interlock 532 prevents operation if the doors are open. Motion controller 534 controls the motor drivers 526 in response to the central processor 540. An Ethernet (TM) communication system 560 communicates to a workstation 542 to provide control of the system. A hard disk 544 is controlled by workstation 550. In one embodiment, workstation 550 may comprise a SUN SPARC CLASSIC (TM) workstation. A tape drive 546 is connected to the workstation 550 as well as a modem 548, a monitor 552, a keyboard 554, and a mouse pointing device 556. A printer 558 is connected to the Ethernet (TM) communication system 560.
During coverslip detection, the central computer 540, running a real time operating system, controls the microscope 511 and the processor to acquire and digitize images from the microscope 511. The flatness of the slide may be checked, for example, by contacting the four corners of the slide using a computer controlled touch sensor. The computer 540 also controls the microscope 511 stage to position the specimen under the microscope objective, and from one to fifteen field of view (FOV) processors 568 which receive images under control of the computer 540. Referring now to Figures 1D and 1E, which show schematics of the coverslip detection apparatus of the invention, the coverslip detection apparatus comprises a computer and image acquisition system comprising a host computer 568. The host computer 568 runs a real time operating system that controls the operation of the remainder of the system comprising a microscope 26 with a xenon strobe 508 for illumination, a processor 536 to acquire and digitize images from the microscope 26, a computer controlled microscope stage 40 to position the specimen under the microscope objective 30, and from one to 15 Field Of View (FOV) processors 568 which receive images under control of the host computer and from the image acquisition processor 536. The FOV processors 568 perform image processing and object classification operations on the images and transmit the results to the host computer.
Figure 1E shows an expanded view of one corner of a microscope slide 12 covered by coverslip 14. The adhesive 11 bonds the coverslip 14 to the slide.
Figure 2 shows an example slide 12 in slide receiving receptacle 20. Slide 12 is a typical specimen glass slide having two general areas, the label area 18, which may comprise, for example, frosted glass largely covered by an opaque label, and the transparent specimen area 19. A glass coverslip 14 is shown here in two locations indicated by dotted coverslips 14a, 14b. A first slide location 15a and a second slide location 15b are examples of coverslip placement showing placement on the light transmissive portion 19 of slide 12. The slide location 15a shows the location of the coverslip in the upper-left-most extreme in relation to Figure 2. The slide location 15b shows the coverslip in the bottom-right-most extreme position. A window region indicated by 16 and outlined by a border 17 is the region of the slide receptacle 20 whereby due to physical restriction of the slide mount, there must always exist a coverslip. The slide 12 may be positioned anywhere within the limits of the slide receptacle 20 from the upper left hand corner of slide receptacle 20 to the bottom right hand corner. Regardless of the location of the slide 12 with respect to the slide receptacle 20, window region 16 provides a window where the coverslip will always be found. The invention exploits this condition by starting the search for the coverslip edge within the window region 16.
In the following descriptions, width refers to a dimension in X, and height refers to a dimension in Y on a conventional Cartesian coordinate system when viewing a slide shown in a top view, such as in Figure 2.
On the slide receiving receptacle 20, the slide 12 position may vary within a rectangle of dimensions stageWidth by stageHeight, where stageWidth represents the width of the space for the slide within the receiving receptacle 20 and stageHeight represents the height of that space. A portion of the slide 12 is designated the label area 18 which may not be covered by any portion of the coverslip 14a, 14b. The label area spans the height of the slide 12 and has a width defined as labelWidth from the left edge of the slide 12. A coverslip 14a, 14b has minimum dimensions of coverslipWidth by coverslipHeight, and may be adhered to any point on the slide 12 such that no part of the coverslip extends beyond the slide edge, and no part of the coverslip extends into the label area. It follows that coverslipWidth is a parameter measuring the coverslip width and coverslipHeight is a parameter measuring the coverslip height.
Given the stage limits and the minimum coverslip size, the region 16 that must be covered by the coverslip may be computed. The rightmost position of the left edge, maxLeft is calculated as maxLeft = stageWidth - coverslipWidth. The leftmost position of the right edge, minRight, is calculated as minRight = labelWidth + coverslipWidth.
The limits for the topmost value of the bottom coverslip edge, minTop, and the bottom-most position, maxBottom, of the top coverslip edge are similarly calculated: maxBottom = stageHeight - coverslipHeight and minTop = coverslipHeight. The region bounded by maxLeft and minRight in the X axis and maxBottom and minTop in the Y axis must fall under the coverslip. Similarly, an outer bound for the coverslip area exists, defined by the parameters minLeft, maxRight, minBottom, and maxTop, which are given by minLeft = labelWidth, maxRight = stageWidth, minBottom = 0, and maxTop = stageHeight. Thus, to search for a particular edge of the coverslip, a search begins inside the region 16 which must be covered by the coverslip, proceeds in a direction orthogonal to the edge in question in one embodiment, and ends outside the coverslip outer bound area.
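These bounds translate directly into code. The sketch below is only a transcription of the formulas above; the example dimensions in the comment are assumptions, not values taken from the patent.

```python
def coverslip_regions(stage_width, stage_height, label_width,
                      coverslip_width, coverslip_height):
    # Region 16: must lie under the coverslip regardless of placement.
    must_cover = {
        "maxLeft":   stage_width - coverslip_width,
        "minRight":  label_width + coverslip_width,
        "maxBottom": stage_height - coverslip_height,
        "minTop":    coverslip_height,
    }
    # Outer bound: the coverslip can never extend beyond these limits.
    outer_bound = {
        "minLeft":   label_width,
        "maxRight":  stage_width,
        "minBottom": 0.0,
        "maxTop":    stage_height,
    }
    return must_cover, outer_bound

# Example call with assumed dimensions in millimetres:
# coverslip_regions(stage_width=77.0, stage_height=27.0, label_width=20.0,
#                   coverslip_width=40.0, coverslip_height=22.0)
```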
The apparatus of the invention includes a tray 24 that may hold at least one slide, but preferably a number of slides. In one example of the invention, tray 24 holds up to eight slides in position for moving about under the microscope lens 30. The tray 24 includes receptacle 20 where receptacle 20 includes an opening suitably sized to accept a standard range of slide sizes and where the opening may be advantageously large enough to facilitate insertion and removal of the slides. In one example of the invention receptacle is sized to accommodate slide positions that may vary by as much as 2 millimeters about the X and Y axes relative to the receptacle 20. Also, since the coverslip 14 applied to the slide 12 is smaller than the slide 12, the coverslip position relative to the slide edges 22 may vary. This variation depends on both the slide size and the coverslip size. The total variation in the coverslip position is the sum of the variation of the coverslip position on the slide 12 and the variation of the slide position in the tray 24.
The total variation is significant because, in certain applications, the optical system 28 for illuminating the specimen requires that the area illuminated be clear of obstructions for 0.318 mm from the edge of any field imaged. Coverslip edges, because of their effect on illumination, are considered obstructions. Those skilled in the art will recognize that as coverslip thicknesses and slide thicknesses vary, so will the size of the area that must be clear of obstructions.
Because the coverslip 14 is permanently attached to the slide 12, the corners of the coverslip 14 may be used as fiducial marks to define a coordinate system for each slide. A coordinate system based on the coverslip corners is invariant with respect to translation and rotation of the slide. Therefore, an object located in a coordinate system based on the coverslip corners may still be found if the slide is removed from the slide tray and replaced, or moved around in the tray. The slide may even be moved to a different type of microscope, provided that the different type of microscope can be used to accurately locate the same coverslip corners, and that the coverslip has not moved relative to the slide.
One embodiment of a coverslip edge detection process as contemplated by the invention comprises two sub-processes. A first sub-process, called Edge Detection FOV Processing, is performed on the image processing computers to analyze one image per FOV acquired from the microscope and determine if an edge exists or not. A second sub-process, called FOV Processing Result Collection and Analysis, directs a searching pattern, collects the individual image results from the image processing computers, and analyzes the collected data to determine the edge positions and direct additional searches. Referring now to Figure 3, a schematic diagram of processing used to detect the field of view edge is shown. An edge detection FOV processor, which may be one of the FOV processors 568, receives one image as acquired from the microscope with a low power objective, such as a 4X objective. In one embodiment of the invention, an image comprises 512x512 pixels, with each pixel having a size of 2.75 microns square, yielding an image size of 1.408 millimeters square. It will be understood that the size of the image may vary depending upon the type of equipment used and the invention is not considered to be limited to the specific parameters of this illustrative example.
Image preparation uses several input parameters passed from the host computer 562, including queue location, edge search orientation (that is, horizontal or vertical) and left or right edge position prioritization, and normalizes the image 33. Normalization comprises the steps of orienting the image 33 so that the search is performed in a vertical direction only and positioning the image in a predetermined image buffer for the image processing step. Horizontal edges are searched for by rotating the image 90 degrees and searching in the vertical direction.
As described in more detail below with regard to the steps of Figure 3, the image processing operation accepts the original image in gray scale, enhances vertically oriented edges, and removes any falsely detected artifacts in the image 33, thereby improving the signal to noise ratio of the edge objects. Successive openings 84 are performed using structuring elements of unit width and successively larger vertical length, filtering out any objects without substantial contiguous vertical energy.
The original gray scale image 33 is horizontally dilated in step 52. It is also vertically dilated in step 54, and a horizontal closing is performed on it in step 56. The output of the horizontal dilation step 52 may be subtracted from the original gray scale image in step 58 to provide a 35x1 horizontal dilation residue image 64. The output of the vertical dilation step 54 is subtracted from the original image in step 60 to produce a 1x65 vertical dilation residue image 66. The output of the 13x1 horizontal closing step 56 is subtracted from the original image in step 62 to produce a 13x1 closing residue image 68. At step 72, the 13x1 closing residue image 68 is subtracted from the 1x65 vertical dilation residue image 66 to produce a thin edge image 76. At step 70, the 35x1 horizontal dilation residue image 64 is subtracted from the 1x65 vertical dilation residue image 66 to produce a thick edge image 74. The thin edge image 76 is multiplied by two in step 80 and added to the thick edge image 74 in step 78 to produce a combined edge image 82. The combined edge image 82 is filtered with the process outlined in Figure 4 to produce a filtered edge image 86.
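These steps can be sketched as follows, assuming the scipy.ndimage greyscale morphology routines. The residues are computed here as the morphological result minus the original, and the differences are clipped to non-negative values; these sign conventions are assumptions of the sketch, since the description does not fix them precisely.

import numpy as np
from scipy import ndimage

def combined_edge_image(gray):
    # Structuring elements follow the width x height notation in the text;
    # numpy size tuples are (rows, cols) = (height, width).
    g = gray.astype(np.float32)
    hdil = ndimage.grey_dilation(g, size=(1, 35))   # 35x1 horizontal dilation (52)
    vdil = ndimage.grey_dilation(g, size=(65, 1))   # 1x65 vertical dilation (54)
    hclo = ndimage.grey_closing(g, size=(1, 13))    # 13x1 horizontal closing (56)

    hres = hdil - g      # horizontal dilation residue image (64)
    vres = vdil - g      # vertical dilation residue image (66)
    cres = hclo - g      # closing residue image (68)

    thick = np.clip(vres - hres, 0, None)   # step 70: thick edge image (74)
    thin = np.clip(vres - cres, 0, None)    # step 72: thin edge image (76)
    return 2.0 * thin + thick               # steps 78/80: combined edge image (82)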
The projection method of the invention determines whether there are edges in an image from the filtered edge image 86. The image is divided into a number of regions; in one example the image is divided into 4 horizontal regions. This allows the method to adapt to local image contrast, since each region has its own filtered threshold. Processing multiple regions also allows the invention to handle skewed edges. For example, given a skewed edge crossing 4 fields of view, 1/4 of the edge will span each field of view, becoming an object hit. These hits may be correlated into one skewed edge in the final step of the projection process of the invention, resulting in a very robust method. The correlation step processes broken edge images as well.
Now refer to Figures 8A, 8B, 8C, 8D, 8E and 8F which show the processing of a filtered edge image.
Figure 8A shows a filtered edge image. Figure 8B shows the image divided into 4 equal horizontal regions where a vertical projection of pixels in each region has been performed. The sum of the pixel values in each column of each region has been computed. Figure 8C shows the results of projecting each region. Higher values indicate more vertical edge energy. A low pass filter is performed on each projection result curve. Figure 8D shows curves that are the low pass results. The curves serve as thresholds to determine object hits in each region. Figure 8E shows the object hits as detected. They might not be continuous due to poor image quality, improper focus or non-ideal image processing. Figure 8F shows the edge detected when the object hits correlate to meet the minimum error criterion.
Now refer to Figure 4, which shows a schematic of the processing used to filter the edge image. The combined edge image 82 is vertically eroded in a 1x5 vertical erode step 94, then vertically dilated in a 1x9 vertical dilate step 96. The image is then vertically eroded again in a 1x13 vertical erode step 104 and vertically dilated again in a 1x17 vertical dilate step 102. It is subsequently vertically eroded again in a 1x26 vertical erode step 98 and vertically dilated in a 1x36 vertical dilate step 106. Another vertical erosion in a 1x51 vertical erode step 108 is followed by a vertical dilation in a 1x67 vertical dilate step 110. The image is then vertically eroded in a 1x101 vertical erode step 112 and vertically dilated in a 1x134 vertical dilate step 114, each step using a larger and larger structuring element. The processed output is a filtered edge image 86.
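A sketch of this cascade, again assuming the scipy.ndimage greyscale morphology routines, is shown below; the structuring element heights follow the step sizes listed above, and each element is one pixel wide.

import numpy as np
from scipy import ndimage

def filter_edge_image(combined):
    # Alternate vertical erosions and dilations with progressively larger
    # structuring elements so that only objects with long contiguous vertical
    # runs survive.  numpy size tuples are (height, width).
    img = combined.astype(np.float32)
    steps = [(5, "erode"), (9, "dilate"), (13, "erode"), (17, "dilate"),
             (26, "erode"), (36, "dilate"), (51, "erode"), (67, "dilate"),
             (101, "erode"), (134, "dilate")]
    for height, op in steps:
        if op == "erode":
            img = ndimage.grey_erosion(img, size=(height, 1))
        else:
            img = ndimage.grey_dilation(img, size=(height, 1))
    return img                                # filtered edge image (86)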
Once the image is enhanced by the above-described process such that the remaining image has only objects of substantial vertical energy, the image may be substantially equally divided horizontally into four regions and the detected edge intensity may be projected onto these regions. Each region is equal in width to the image and represents the projection of one fourth of the edge intensity. The edge intensity projection comprises the summation of pixels in the vertical direction within the region. Objects with greater vertical energy sum to larger numbers than objects with less vertical energy. The object projection values are lowpass filtered in one direction and thresholded. Each occurrence of the region projection exceeding the filtered thresholded projection is called an object hit.
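A minimal sketch of this projection and thresholding step is given below; the choice of a uniform low pass filter and its window size are illustrative assumptions.

import numpy as np
from scipy.ndimage import uniform_filter1d

def object_hits(filtered, n_regions=4, smooth=31):
    # Split the filtered edge image into horizontal bands, sum pixels down
    # each column, low-pass filter the projection, and flag columns whose
    # projection exceeds the filtered curve.
    h, w = filtered.shape
    band_h = h // n_regions
    hits = []
    for r in range(n_regions):
        band = filtered[r * band_h:(r + 1) * band_h, :]
        proj = band.sum(axis=0)                        # vertical projection
        threshold = uniform_filter1d(proj, size=smooth)
        hits.append(np.flatnonzero(proj > threshold))  # candidate edge columns
    return hits                                        # one array per region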
The invention locates edges with the goal of rejecting no slide with a correctly applied coverslip. Practical use has required a false rejection rate of no more than two or three percent. Figures 8A, 8B, 8C, 8D, 8E and 8F show examples of edges that have been detected as object hits. The object hits from each group are analyzed for their correlation to a perfect vertical line. Potential edges are made up of one object hit from each region. Qualifications for an edge include a minimum of 3 region hits, i.e., the potential edge must span at least three regions, or three quarters of the image. The span need not include contiguous regions.
Yaw (rotation) of the objects must not vary from the vertical by more than 10 degrees. Line segment correlation preferably meets a minimum error criterion. If the points being considered meet the above criteria, then an edge is considered to be detected and the image pixel position for the edge is saved and returned.
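These qualification criteria can be sketched as follows; the error tolerance value and the use of a simple least squares fit of column against row are assumptions of the sketch.

import math
import numpy as np

def qualifies_as_edge(hit_cols, region_rows, max_yaw_deg=10.0, max_error=2.0):
    # hit_cols: one detected column per region, or None where a region had no
    # hit; region_rows: a representative row (e.g. the band center) per region.
    pts = [(r, c) for r, c in zip(region_rows, hit_cols) if c is not None]
    if len(pts) < 3:
        return None                       # must span at least three regions
    rows = np.array([p[0] for p in pts], dtype=float)
    cols = np.array([p[1] for p in pts], dtype=float)
    slope, intercept = np.polyfit(rows, cols, 1)   # column as a function of row
    yaw = math.degrees(math.atan(slope))           # deviation from vertical
    error = np.mean(np.abs(cols - (slope * rows + intercept)))
    if abs(yaw) > max_yaw_deg or error > max_error:
        return None
    return slope, intercept               # accepted edge, in image pixel units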
In one useful embodiment of the invention, a maximum of three edges that satisfy the correlation criteria are returned to the host. If more than three edges are present, one additional direction input to the FOV processing determines the position (leftmost or rightmost) of the three edges returned. The slide coordinate system is specified with the long edge of the slide as the X axis and the short side as the Y axis. A label region of the slide, typically with frosted glass, is placed to the left, so the positive X axis points away from the label, and the positive Y axis points upward. Edges are referred to as follows: the left edge is the vertical edge with the smaller X value; the right edge is the vertical edge with the larger X value; the bottom edge is the horizontal edge with the smaller Y value; and the top edge is the horizontal edge with the larger Y value. The intersection of the bottom and left slide edges is the origin of the coordinate system.
Referring now jointly to Figure 5 and Figure 6, the search for a particular coverslip edge is divided into a series of scans. Briefly summarized, a scan comprises specifying, to the automated microscope image acquisition and processing system, a starting point, a direction, a spacing, and a count of images to acquire. Also specified is the type of edge to look for, horizontal or vertical. The image collection and processing system acquires images and returns the positions of the edges detected within each image. Edges are searched in the order of left, step 124, bottom, step 126, top, step 128, then right, step 130. Failing to find any one edge aborts any attempt to locate edges later in the list in step 132.
The search for a particular edge begins with a scan orthogonal to the edge in question. This scan is called a crossing search, step 154. The crossing search starts inside the area which must be covered by the coverslip, region 16, with a spacing of 75 percent of the width of one image acquired from the microscope. If the bottom or top edge is the target of the search, a horizontal edge is expected; for the left and right edge searches, a vertical edge is expected.
If an edge is detected in one or more of the fields processed from the crossing scan 154, a following search 156 is performed. The following search 156 begins centered at the point where the edge was detected; the direction is parallel to the edge in question; and the spacing and count are chosen to give enough edge detections to ensure a good edge fit while not searching outside the area that has been calculated to be within the coverslip.
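One way to represent such a scan, as a sketch only, is shown below; the field names, direction encoding, and the numeric values in the example are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Scan:
    start_x: float
    start_y: float
    direction: str     # '+x', '-x', '+y' or '-y'
    spacing: float     # stage step between successive images (mm)
    count: int         # number of images to acquire
    edge_type: str     # 'horizontal' or 'vertical'

    def positions(self):
        # Stage positions at which images are acquired for this scan.
        step = {'+x': (self.spacing, 0.0), '-x': (-self.spacing, 0.0),
                '+y': (0.0, self.spacing), '-y': (0.0, -self.spacing)}[self.direction]
        return [(self.start_x + i * step[0], self.start_y + i * step[1])
                for i in range(self.count)]

# A crossing search for the left edge might step in -x from inside region 16,
# with a spacing of 75 percent of the 1.408 mm image width (start point illustrative).
left_crossing = Scan(start_x=35.0, start_y=12.0, direction='-x',
                     spacing=0.75 * 1.408, count=5, edge_type='vertical')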
Now refer to Figure 5, which shows one example of a coverslip detection method of the invention. The coverslip detection process starts with step 122. The process first tries to find the left edge of the coverslip in step 124. If the left edge is found, the process proceeds to find the bottom edge in step 126. If the left edge is not found, the process flows to the reject step 132 where the specimen is rejected. If the bottom edge is found in step 126, the process flows to step 128 to find the top edge. If the top edge is found, the process flows to step 130 to find the right edge. If the bottom, top, or right edge is not found, the processes all flow to step 132. If in step 130 the right edge is found, the process flows to step 136 to project the corners of the coverslip. The process then flows to step 134 to determine whether the corners are square. In step 134, after edges are detected, the detected edges are validated to ensure that true edges have been found. The coverslip corners are projected by calculating the intersection of adjacent edge pairs. These intersection points are used in some of the following evaluations. A coverslip is rectangular, with 90 degree corners, so the angle between adjacent edges is compared against a pre-defined standard. If a corner angle is outside the standard, the slide 12 is rejected for analysis because the coverslip cannot be located. If in step 134 the corners are determined not to be square, then the slide is rejected in step 132.
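The corner projection and squareness test can be sketched as follows, with each detected edge expressed as a point and a direction vector; the angular tolerance value is an illustrative assumption.

import math

def intersect(line_a, line_b):
    # Intersection of two lines, each given as ((x, y), (dx, dy)).
    (x1, y1), (dx1, dy1) = line_a
    (x2, y2), (dx2, dy2) = line_b
    det = dx2 * dy1 - dx1 * dy2
    if abs(det) < 1e-12:
        raise ValueError("edges are parallel")
    t = (dx2 * (y2 - y1) - dy2 * (x2 - x1)) / det
    return (x1 + t * dx1, y1 + t * dy1)

def corner_angle(dir_a, dir_b):
    # Angle in degrees between two edge directions.
    dot = dir_a[0] * dir_b[0] + dir_a[1] * dir_b[1]
    na, nb = math.hypot(*dir_a), math.hypot(*dir_b)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def project_and_check_corners(edges, tol_deg=2.0):
    # edges: [left, bottom, right, top], each a ((x, y), (dx, dy)) line fit.
    adjacent = [(0, 1), (1, 2), (2, 3), (3, 0)]
    corners = [intersect(edges[i], edges[j]) for i, j in adjacent]
    angles = [corner_angle(edges[i][1], edges[j][1]) for i, j in adjacent]
    square = all(abs(a - 90.0) <= tol_deg for a in angles)
    return corners, square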
If the corners are determined to be square the process flows to step 138 to determine whether the lengths are multiples of 10 mm. Coverslips are typically manufactured with a width that is a multiple of 10 mm. The width of the detected coverslip is compared to 40, 50, and 60 mm within a pre-defined tolerance in step 140. If the width is outside the tolerance at all of these widths, one more search is made to attempt to find the true edge.
Experience shows that the left edge, which is near the label portion of the slide, is the edge most likely to be missed or falsely detected by this process. Therefore, the right edge is assumed to be correct, and a following search 156 is performed at positions 40, 50, and 60 mm from the right edge. If such an edge is found, the coverslip corner angles are evaluated again, as are the following criteria. This retry search is performed only one time. The technique of assuming one edge position is true and looking for the opposite edge at fixed distances could be extended to all four edges, but practice up to now has not shown a need to do this. If the lengths are not multiples of 10 mm, the process searches for the left edge at 40 mm, 50 mm and 60 mm from the right in step 140. If the left edge is found, the process returns to use the new data obtained in step 140 to project the corners again in step 136, and steps 134 and 138 are repeated once. If in step 138 the lengths are 10 mm multiples, the process flows to step 142 to determine whether the width of the coverslip is acceptable. If in step 138 the length is not found to be 10 mm or a multiple thereof after the process is repeated, the process flows to step 132. If in step 142 the width is acceptable, the process flows to step 144 to determine if the skew angle is acceptable, and if the skew angle is acceptable in step 144 the process flows to step 148 to report that the coverslip may be accepted. If the width is not acceptable in step 142, the process rejects the slide in step 146. If in step 140 the search for a left edge at 40 mm, 50 mm and 60 mm from the right is unsuccessful, the process flows to step 146 to reject the slide.

Now referring to Figure 6, which shows a detailed process flow for determining where an edge is while looking for the left, bottom, top or right edge, the process starts at step 152 to begin the edge finding process. The process flows to step 154 to perform a crossing search. If an edge is found in step 154, the process flows to the following search 156; if no edge is found, the crossing search is repeated up to five times. If after five searches no edge has been found, the process reports that the edge was not found. In the following search, the process flows to step 158 to rank edge fit candidates. The process then flows to step 172 to select the highest candidate. The process flows to step 174 to determine whether the highest candidate qualifies against a predetermined set of criteria. If the candidate qualifies, the process flows to step 176 to report that the edge has been accepted. If the candidate does not qualify, the process flows to step 178 to select the next highest candidate. If the next highest candidate qualifies, the process flows to step 176. If all the candidates have been exhausted, the process flows to step 180 to reject the edge.
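The Figure 6 flow can be sketched as the following skeleton, in which crossing_search, following_search, rank, and qualifies are assumed hooks standing in for the operations described above rather than functions defined by the disclosure.

def find_edge(crossing_search, following_search, rank, qualifies,
              max_crossing_attempts=5):
    # crossing_search() returns a seed point or None; following_search(seed)
    # returns detected edge points; rank(points) returns candidate line fits
    # ordered best-first; qualifies(candidate) applies the acceptance criteria.
    seed = None
    for _ in range(max_crossing_attempts):     # step 154, retried up to five times
        seed = crossing_search()
        if seed is not None:
            break
    if seed is None:
        return None                            # edge not found
    points = following_search(seed)            # step 156
    for candidate in rank(points):             # steps 158 and 172, best first
        if qualifies(candidate):               # step 174
            return candidate                   # step 176: edge accepted
    return None                                # step 180: edge rejected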
Refer now to Figure 7A, which shows a cross section of one scan of the apparatus of the invention. A bubble edge 202 and an adhesive globule 204 are shown with the slide edge 208 and the coverslip edge 206. Figure 7A shows what is typically found while imaging structures on a slide, including a slide edge 208 next to a coverslip edge 206 and a globule of adhesive which extends beyond the coverslip.
Figure 7B shows detected edge positions represented by diamonds, D. Images f1 through f7 represent a following search, and images c1 through c5 represent a crossing search. Figure 7B shows a five image crossing search and the location of the first detected edge from the crossing search. It also shows the locations of the images of a seven image following search; the edge detected within each image is indicated with a diamond symbol. The areas shown for the crossing search are offset slightly for clarity.
Figure 7C schematically illustrates how the process of the invention determines four lines 182, 184, 186, and 188 that may be fit from combinations of the detected edges. The positions where edges are detected form a set of points on a Cartesian plane to which lines can be fit. Trial lines are fit using at most one edge from each image. The line fit minimizes the sum of the squared distances of the used points from the line in a conventional least squares linear fit. For each following search 156, a large number of lines may be fit which satisfy these criteria. Possible edge fits are evaluated by a process using at least four points to define the line. The total error, the sum of the absolute differences between the measured positions and the calculated positions, normalized by the number of points used in the fit, called pointCount, must not exceed a pre-defined value:

totalError = SUM( abs( measured_i - calculated_i ) ) / pointCount

Lines with a smaller total error, as defined in the equation above, are favored. Lines closer to the center of the coverslip are favored.
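A sketch of the fit and of the normalized total error for one candidate set of points is given below; fitting the X position as a function of Y, so that near-vertical edges have a finite slope, is an implementation assumption of the sketch.

import numpy as np

def fit_candidate(points):
    # points: (y, x) edge detections, at most one per image, at least four total.
    ys = np.array([p[0] for p in points], dtype=float)
    xs = np.array([p[1] for p in points], dtype=float)
    slope, intercept = np.polyfit(ys, xs, 1)                 # least squares line fit
    calculated = slope * ys + intercept
    error = np.sum(np.abs(xs - calculated)) / len(points)    # normalized total error
    return (slope, intercept), error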
At most eight images may be processed in one following scan, and up to three edge hits are returned from the FOV processing. Combining points into all possible edge fits gives 3^8, or 6,561, possible lines to fit; however, in practical use, the actual number is usually much smaller, so that an exhaustive search of the combinations may be done in a reasonable time. The edge fit criteria listed above with respect to Figure 6 are in priority order. The search keeps data on the best edge found so far in steps 158 and 172, and as a new candidate is evaluated checks each item on the list. The first two are absolute standards which must be met in step 174 for the candidate to be considered. The third item is used to select the line which gives the lowest total error. Selecting the line closest to the center of the coverslip is used as a tie breaker in step 178. Coverslips are typically manufactured with a height within a specific tolerance. If the detected coverslip height is outside this tolerance, the slide is rejected because it is not possible to locate the coverslip. Experience shows that height failures are much less common than width failures, so no retry search is implemented.
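The exhaustive enumeration of candidate fits can be sketched as follows; allowing an image to be skipped (since at most one edge from each image is used) slightly enlarges the 3^8 bound mentioned above, and the tie-break on distance to the coverslip center is omitted from this sketch.

import itertools
import numpy as np

def best_line(hits_per_image, min_points=4):
    # hits_per_image: one list of (y, x) detections per image (up to 3 each).
    best = None
    for combo in itertools.product(*[hits + [None] for hits in hits_per_image]):
        pts = [p for p in combo if p is not None]
        if len(pts) < min_points:          # at least four points define a line
            continue
        ys = np.array([p[0] for p in pts], dtype=float)
        xs = np.array([p[1] for p in pts], dtype=float)
        slope, intercept = np.polyfit(ys, xs, 1)
        err = np.mean(np.abs(xs - (slope * ys + intercept)))
        if best is None or err < best[1]:
            best = ((slope, intercept), err)
    return best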
A coverslip may be affixed to a microscope slide so that the coverslip edges are not precisely parallel to the slide edges. The slide may also be placed in the tray so that the slide edges are not precisely aligned with the stage's coordinate system. The sum of these variations in placement yields a maximum angle by which the coverslip may be rotated relative to the stage coordinate system. The detected coverslip orientation is determined by calculating the angle of the line from the lower left to the lower right corner projection points. If this angle is outside a predetermined range, the slide is rejected for excessive coverslip skew. The invention locates all four coverslip edges, with the detected edge no more than a specified distance inside the coverslip edge, so that an adequate percentage of the sample under the coverslip is analyzed. The method of the invention also locates edges with the goal of taking as little time as possible; in practical use, one embodiment of the invention requires about 30 seconds.
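The skew test can be sketched as follows; the tolerance value is an illustrative assumption, not a limit given in the description.

import math

def coverslip_skew_ok(lower_left, lower_right, max_skew_deg=3.0):
    # Angle of the line through the lower-left and lower-right corner
    # projections, relative to the stage X axis, must stay within a tolerance.
    dx = lower_right[0] - lower_left[0]
    dy = lower_right[1] - lower_left[1]
    skew = math.degrees(math.atan2(dy, dx))
    return abs(skew) <= max_skew_deg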
The invention has been described herein in considerable detail in order to comply with the Patent Statutes and to provide those skilled in the art with the information needed to apply the novel principles and to construct and use such specialized components as are required. However, it is to be understood that the invention can be carried out by specifically different equipment and devices, and that various modifications, both as to the equipment details and operating procedures, can be accomplished without departing from the scope of the invention itself. What is claimed is:

Claims

1. A microscope slide coverslip detection apparatus comprising:
(a) a movable stage (520) for mounting a microscope slide (12) having a microscope slide coverslip (14), where the microscope slide coverslip (14) has microscope slide coverslip edges (206);
(b) an imaging means (502) for obtaining an image of the microscope slide coverslip (14);
(c) an image processor (536) to receive and process the image and to locate the microscope slide coverslip edges (206); and
(d) a light source (508) positioned to illuminate the microscope slide coverslip (14) .
2. The apparatus of claim 1 wherein the image processor (536) comprises:
(a) a field of view processor (568) connected to the image processor (536) to process images from the image processor (536) ; and (b) a host computer (562) to control the field of view processors (568) .
3. The apparatus of claim 1 where the image processor (536) searches in a predefined area of the microscope slide coverslip (14) .
4. The apparatus of claim 1 where the image processor (536) performs at least one morphological operation to generate an edge image .
5. The apparatus of claim 2 wherein the field of view processor (568) performs a 35x1 horizontal dilate (52) , a 1x65 vertical dilate (54) and a 13x1 horizontal closing
(56) to generate a thick edge image (74), a thin edge image (76) and a combined edge image (82) .
6. The apparatus of claim 1 wherein a combined edge image (82) is processed into a filtered edge image (86) .
7. The apparatus of claim 1 wherein the microscope slide (12) further comprises a specimen prepared by the Papanicolaou method.
8. The apparatus of claim 1 wherein the imaging means (502) comprises a CCD camera (512) .
9. The apparatus of claim 1 wherein the light source (508) is an arc lamp.
10. The apparatus of claim 1 wherein the light source (508) is a strobed arc lamp.
11. The apparatus of claim 1 further comprising a means for accumulating at least one set of point coordinates for an object of interest, wherein the at least one set of point coordinates are fit to a line (182, 184, 186, 188) with a predetermined fit (Figure 7C) .
12. The apparatus of claim 11 wherein the predetermined fit is a least squares error fit (Figure 7C) .
13. The apparatus of claim 11 wherein a plurality of sets of point coordinates are fit with the predetermined fit (Figure 7C) and one of the plurality of sets of point coordinates is chosen that satisfies a predetermined criteria (174) .
14. The apparatus of claim 13 wherein the predetermined fit comprises a number of points (Figure 7C) , and wherein the predetermined criteria (174) comprises a total error defined by a difference substantially equal to a measured position subtracted from a calculated position of each of the plurality of sets of points
(Figure 7C) , normalized by the number of points used in the predetermined fit does not exceed a pre-defined value.
15. The apparatus of claim 1 further comprising a means for checking whether the microscope slide coverslip edges (206) are substantially perpendicular (134) to each other within a predetermined tolerance.
16. The apparatus of claim 1 wherein the image processor (536) further comprises a plurality of field of view processors (568) to process multiple fields of view.
17. A method of detecting a coverslip on a slide comprising the steps of:
(a) finding a left edge (124) ;
(b) finding a bottom edge (126);
(c) finding a top edge (128) ; (d) finding a right edge (130) ;
(e) projecting a corner of each intersecting pair of edges (136);
(f) determining that each corner is square (134); (g) rejecting the slide if each corner is not square (132), if each corner is square, determining whether a length of the coverslip is a multiple of a first distance (138), and, if each length is not a multiple of the first distance, then searching for a left edge at a second distance, a third distance and a fourth distance from the right edge of the slide (140); and (h) determining if a width of the coverslip is proper (142), if it is not, rejecting the slide (146), if it is, determining if a coverslip skew angle is proper (144), if it is, accepting the slide (148), and if it isn't, then rejecting the slide (146).
18. The method of claim 17 wherein the first distance is 10 mm (138) , the second distance is 40 mm, the third distance is 50mm, and the fourth distance is 60 mm (140) .
19. A coverslip slide ranking method comprising the steps of :
(a) performing a crossing search to find an edge (154) ;
(b) doing a following search to obtain a plurality of candidate edges (156) ;
(c) ranking each one of the plurality of candidate edges according to a predetermined set of criteria (158);
(d) selecting at least one candidate edge (172) ;
(e) determining whether the at least one candidate edge qualifies (174), and if the at least one candidate edge qualifies, accepting the at least one candidate edge (176), otherwise selecting a next highest edge (178); and
(f) selecting only edges that qualify (180) .
20. A method of registering objects under a coverslip comprising the steps of :
(a) locating a microscope coverslip (14) in relation to a microscope slide (12) ;
(b) locating at least one object of interest in relation to the coverslip (14) ; and
(c) remounting the microscope slide (12) and using the coverslip (14) as a basis for a coordinate system for locating the at least one object of interest.
21. A method of detecting a coverslip (14) on a slide (12) comprising the steps of: (a) obtaining an image of any portion of the slide (12) in response to computer control (562); (b) illuminating the slide (12) from below with a uniform light source (508);
(c) positioning the slide (12) to view a portion (16) of the slide (12) within a predetermined area (17) of the slide
(12);
(d) reimaging the slide (12) after the slide (12) has moved toward a chosen edge direction; (e) locating edge objects (154);
(f) following the edge object over multiple fields of view (156); and
(g) determining if an edge object satisfies a predetermined set of criteria (174) , and if it does determining that the coverslip edge (206) has been found.
22. The method of claim 21 wherein the predetermined set of criteria further comprise checking if the edge objects are substantially perpendicular (134) to each other within a predetermined tolerance.
23. The method of claim 21 wherein a detected edge (206) is no more than a first distance (138) away from a physical edge so that measurements of a data surface for developing a focus map are taken entirely from data under the coverslip (14).
24. A method of projecting a digital representation of a microscope slide coverslip image wherein the image (33) spans more than one field of view comprising: (a) dividing the image (33) into a plurality of unique images (74, 76) of the field of view such that each one of the plurality of unique images (74, 76) contains data elements from a slide coverslip edge (206) ;
(b) determining an object hit for each data element comprising an edge element (Figure 7B) ; and
(c) correlating the edge elements (Figure 7C) in each sub image into a microscope edge image (82) .
25. The method of claim 24 wherein the plurality of unique images comprise filtered edge images (86) .
26. The method of claim 24 wherein a low pass filter operator is applied to the filtered edge image (86).
27. The method of claim 24 wherein the data elements representing the edge are thresholded to determine the object hits
(Figure 8D) .
28. The method of claim 24 further comprising the steps of filtering the plurality of unique images (74, 76) with at least one morphological operator to produce a filtered edge image (86) .
29. The method of claim 28 wherein the at least one morphological operator (52, 54, 56, 64, 66, 68) is selected from a group consisting of erosions, dilations, vertical erosions and vertical dilations.
30. The method of claim 29 comprising the steps of dividing the plurality of unique images
(74, 76) into a plurality of sub-images such that the sub-images contain a portion of the image edge energy.
31. The method of claim 30 wherein the filtered edge image (86) is thresholded and each occurrence of a region projection exceeding a filtered threshold projection is determined to be an object hit (Figure 8D) .
PCT/US1995/011391 1994-09-20 1995-09-07 Method and apparatus for detecting a microscope slide coverslip WO1996009610A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
DE0782738T DE782738T1 (en) 1994-09-20 1995-09-07 METHOD AND APPARATUS FOR DETECTING THE COVER MASK OF A MICROSCOPE SLIDE
EP95931766A EP0782738A4 (en) 1994-09-20 1995-09-07 Method and apparatus for detecting a microscope slide coverslip
CA002200445A CA2200445C (en) 1994-09-20 1995-09-07 Method and apparatus for detecting a microscope slide coverslip
AU35083/95A AU709029B2 (en) 1994-09-20 1995-09-07 Method and apparatus for detecting a microscope slide coverslip
JP8510932A JPH10506461A (en) 1994-09-20 1995-09-07 Method and apparatus for detecting cover slips on microscope slides
GR980300049T GR980300049T1 (en) 1994-09-20 1998-07-31 Method and apparatus for detecting a microscope slide coverslip

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/309,248 US5638459A (en) 1994-09-20 1994-09-20 Method and apparatus for detecting a microscope slide coverslip
US08/309,248 1994-09-20

Publications (1)

Publication Number Publication Date
WO1996009610A1 true WO1996009610A1 (en) 1996-03-28

Family

ID=23197374

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1995/011391 WO1996009610A1 (en) 1994-09-20 1995-09-07 Method and apparatus for detecting a microscope slide coverslip

Country Status (9)

Country Link
US (2) US5638459A (en)
EP (1) EP0782738A4 (en)
JP (1) JPH10506461A (en)
AU (1) AU709029B2 (en)
CA (1) CA2200445C (en)
DE (1) DE782738T1 (en)
ES (1) ES2122949T1 (en)
GR (1) GR980300049T1 (en)
WO (1) WO1996009610A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2053535A3 (en) * 2007-10-22 2012-06-27 Genetix Corporation Automated detection of cell colonies and coverslip detection using hough transforms
WO2018091586A1 (en) * 2016-11-18 2018-05-24 Ventana Medical Systems, Inc. Method and system to detect substrate placement accuracy

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5266495A (en) 1990-03-02 1993-11-30 Cytyc Corporation Method and apparatus for controlled instrumentation of particles with a filter device
US5835620A (en) * 1995-12-19 1998-11-10 Neuromedical Systems, Inc. Boundary mapping system and method
US6148099A (en) * 1997-07-03 2000-11-14 Neopath, Inc. Method and apparatus for incremental concurrent learning in automatic semiconductor wafer and liquid crystal display defect classification
US6122397A (en) * 1997-07-03 2000-09-19 Tri Path Imaging, Inc. Method and apparatus for maskless semiconductor and liquid crystal display inspection
US6130967A (en) * 1997-07-03 2000-10-10 Tri Path Imaging, Inc. Method and apparatus for a reduced instruction set architecture for multidimensional image processing
US6130956A (en) * 1998-02-17 2000-10-10 Butterworth; Francis M. Continuous microbiotal recognition method
US6558623B1 (en) * 2000-07-06 2003-05-06 Robodesign International, Inc. Microarray dispensing with real-time verification and inspection
ES2283347T3 (en) * 1999-10-29 2007-11-01 Cytyc Corporation APPARATUS AND PROCEDURE TO VERIFY THE LOCATION OF AREAS OF INTEREST WITHIN A SAMPLE IN AN IMAGE FORMATION SYSTEM.
US7369304B2 (en) * 1999-10-29 2008-05-06 Cytyc Corporation Cytological autofocusing imaging systems and methods
US7006674B1 (en) 1999-10-29 2006-02-28 Cytyc Corporation Apparatus and methods for verifying the location of areas of interest within a sample in an imaging system
CA2368753C (en) * 1999-11-22 2009-09-01 Ventana Medical Systems, Inc. Stackable non-stick coverslip
GB2356063B (en) * 1999-11-22 2001-10-24 Genpak Ltd Adhesive label with grid for microscope slide
US6759011B1 (en) 1999-11-22 2004-07-06 Ventana Medical Systems, Inc. Stackable non-stick coverslip
US7025933B2 (en) * 2000-07-06 2006-04-11 Robodesign International, Inc. Microarray dispensing with real-time verification and inspection
AU2883702A (en) * 2000-11-03 2002-05-15 Cytyc Corp Cytological imaging systems and methods
US6740530B1 (en) * 2000-11-22 2004-05-25 Xerox Corporation Testing method and configurations for multi-ejector system
US6993169B2 (en) * 2001-01-11 2006-01-31 Trestle Corporation System and method for finding regions of interest for microscopic digital montage imaging
US7155049B2 (en) * 2001-01-11 2006-12-26 Trestle Acquisition Corp. System for creating microscopic digital montage images
US6993187B2 (en) * 2003-02-14 2006-01-31 Ikonisys, Inc. Method and system for object recognition using fractal maps
US7062079B2 (en) * 2003-02-14 2006-06-13 Ikonisys, Inc. Method and system for image segmentation
US20050094263A1 (en) * 2003-10-31 2005-05-05 Vincent Vaccarelli Microscope slide designed for educational purposes
US20060180489A1 (en) * 2005-02-02 2006-08-17 Cytyc Corporation Slide tray
US20100072272A1 (en) * 2005-10-26 2010-03-25 Angros Lee H Microscope slide coverslip and uses thereof
CN101346655B (en) * 2005-10-26 2010-10-06 利·H·安格罗斯 Microscope coverslip and uses thereof
US20100073766A1 (en) * 2005-10-26 2010-03-25 Angros Lee H Microscope slide testing and identification assembly
GB2474611B (en) * 2006-09-08 2011-09-07 Thermo Shandon Ltd Image based detector for slides or baskets for use in a slide processing apparatus
EP2153401B1 (en) 2007-05-04 2016-12-28 Leica Biosystems Imaging, Inc. System and method for quality assurance in pathology
EP2037407B1 (en) * 2007-09-13 2014-07-23 Delphi Technologies, Inc. Method for detecting an object
EP2376964A1 (en) * 2008-12-19 2011-10-19 Abbott Laboratories Method and apparatus for detecting microscope slide coverslips
JP2012514814A (en) 2009-01-06 2012-06-28 シーメンス・ヘルスケア・ダイアグノスティックス・インコーポレーテッド Method and apparatus for automatic detection of the presence and type of caps on vials and other containers
JP5187851B2 (en) * 2009-03-05 2013-04-24 株式会社ミューチュアル Inspection apparatus and inspection method
US20110115896A1 (en) * 2009-11-19 2011-05-19 Drexel University High-speed and large-scale microscope imaging
JP5703609B2 (en) * 2010-07-02 2015-04-22 ソニー株式会社 Microscope and region determination method
DE102012101377B4 (en) 2012-02-21 2017-02-09 Leica Biosystems Nussloch Gmbh Method of preparing samples for microscopy and device for checking the cover quality of samples
JP6019798B2 (en) 2012-06-22 2016-11-02 ソニー株式会社 Information processing apparatus, information processing system, and information processing method
US10156503B2 (en) * 2013-03-05 2018-12-18 Ventana Medical Systems, Inc. Methods and apparatuses for detecting microscope slide coverslips
US9581800B2 (en) * 2014-11-21 2017-02-28 General Electric Company Slide holder for detection of slide placement on microscope
EP3899783A1 (en) * 2018-12-20 2021-10-27 BD Kiestra B.V. A system and method for monitoring bacterial growth of bacterial colonies and predicting colony biomass
EP3839596B1 (en) * 2019-12-20 2023-08-23 Euroimmun Medizinische Labordiagnostika AG Device and method for identification of covering glass areas of an object holder

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3824393A (en) * 1971-08-25 1974-07-16 American Express Invest System for differential particle counting
US4175860A (en) * 1977-05-31 1979-11-27 Rush-Presbyterian-St. Luke's Medical Center Dual resolution method and apparatus for use in automated classification of pap smear and other samples
US5086478A (en) * 1990-12-27 1992-02-04 International Business Machines Corporation Finding fiducials on printed circuit boards to sub pixel accuracy
US5129014A (en) * 1989-12-08 1992-07-07 Xerox Corporation Image registration
US5138667A (en) * 1989-06-08 1992-08-11 Bobst Sa Process and device for detecting print registration marks on a web from a multi-color printing press
US5173946A (en) * 1991-05-31 1992-12-22 Texas Instruments Incorporated Corner-based image matching
US5233669A (en) * 1990-11-22 1993-08-03 Murata Manufacturing Co., Ltd. Device for and method of detecting positioning marks for cutting ceramic laminated body
US5257182A (en) * 1991-01-29 1993-10-26 Neuromedical Systems, Inc. Morphological classification system and method
US5267325A (en) * 1991-09-06 1993-11-30 Unisys Corporation Locating characters for character recognition
US5287272A (en) * 1988-04-08 1994-02-15 Neuromedical Systems, Inc. Automated cytological specimen classification system and method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4538299A (en) * 1981-12-04 1985-08-27 International Remote Imaging Systems, Inc. Method and apparatus for locating the boundary of an object
US5058181A (en) * 1989-01-25 1991-10-15 Omron Tateisi Electronics Co. Hardware and software image processing system
US5072382A (en) * 1989-10-02 1991-12-10 Kamentsky Louis A Methods and apparatus for measuring multiple optical properties of biological specimens
US5196350A (en) * 1991-05-29 1993-03-23 Omnigene, Inc. Ligand assay using interference modulation
US5315700A (en) * 1992-02-18 1994-05-24 Neopath, Inc. Method and apparatus for rapidly processing data sequences
US5361140A (en) * 1992-02-18 1994-11-01 Neopath, Inc. Method and apparatus for dynamic correction of microscopic image signals

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3824393A (en) * 1971-08-25 1974-07-16 American Express Invest System for differential particle counting
US4175860A (en) * 1977-05-31 1979-11-27 Rush-Presbyterian-St. Luke's Medical Center Dual resolution method and apparatus for use in automated classification of pap smear and other samples
US5287272A (en) * 1988-04-08 1994-02-15 Neuromedical Systems, Inc. Automated cytological specimen classification system and method
US5287272B1 (en) * 1988-04-08 1996-08-27 Neuromedical Systems Inc Automated cytological specimen classification system and method
US5138667A (en) * 1989-06-08 1992-08-11 Bobst Sa Process and device for detecting print registration marks on a web from a multi-color printing press
US5129014A (en) * 1989-12-08 1992-07-07 Xerox Corporation Image registration
US5233669A (en) * 1990-11-22 1993-08-03 Murata Manufacturing Co., Ltd. Device for and method of detecting positioning marks for cutting ceramic laminated body
US5086478A (en) * 1990-12-27 1992-02-04 International Business Machines Corporation Finding fiducials on printed circuit boards to sub pixel accuracy
US5257182A (en) * 1991-01-29 1993-10-26 Neuromedical Systems, Inc. Morphological classification system and method
US5257182B1 (en) * 1991-01-29 1996-05-07 Neuromedical Systems Inc Morphological classification system and method
US5173946A (en) * 1991-05-31 1992-12-22 Texas Instruments Incorporated Corner-based image matching
US5267325A (en) * 1991-09-06 1993-11-30 Unisys Corporation Locating characters for character recognition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP0782738A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2053535A3 (en) * 2007-10-22 2012-06-27 Genetix Corporation Automated detection of cell colonies and coverslip detection using hough transforms
WO2018091586A1 (en) * 2016-11-18 2018-05-24 Ventana Medical Systems, Inc. Method and system to detect substrate placement accuracy
US10957071B2 (en) * 2016-11-18 2021-03-23 Ventana Medical Systems, Inc. Method and system to detect substrate placement accuracy
US11600016B2 (en) 2016-11-18 2023-03-07 Ventana Medical Systems, Inc. Method and system to detect substrate placement accuracy

Also Published As

Publication number Publication date
GR980300049T1 (en) 1998-07-31
US5638459A (en) 1997-06-10
EP0782738A4 (en) 1998-05-06
DE782738T1 (en) 1998-11-12
ES2122949T1 (en) 1999-01-01
AU709029B2 (en) 1999-08-19
JPH10506461A (en) 1998-06-23
CA2200445C (en) 2002-05-28
AU3508395A (en) 1996-04-09
CA2200445A1 (en) 1996-03-28
US5812692A (en) 1998-09-22
EP0782738A1 (en) 1997-07-09

Similar Documents

Publication Publication Date Title
CA2200445C (en) Method and apparatus for detecting a microscope slide coverslip
US5647025A (en) Automatic focusing of biomedical specimens apparatus
JP3822242B2 (en) Method and apparatus for evaluating slide and sample preparation quality
US5566249A (en) Apparatus for detecting bubbles in coverslip adhesive
US7668362B2 (en) System and method for assessing virtual slide image quality
TWI478101B (en) Method for assessing image focus quality
CA2200457C (en) Biological analysis system self-calibration apparatus
JP2003513251A (en) Apparatus and method for verifying the location of a region of interest in a sample in an image generation system
EP3410395B1 (en) System and method for assessing virtual slide image quality
US8103072B2 (en) Method and system for identifying biological specimen slides using unique slide fingerprints
JPH10506462A (en) Method and apparatus for detecting inappropriate conditions for automated cytological scoring
WO2000062241A1 (en) Method and apparatus for determining microscope specimen preparation type
WO2000062240A1 (en) Automatic slide classification using microscope slide preparation type

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2200445

Country of ref document: CA

Ref country code: CA

Ref document number: 2200445

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 1995931766

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1995931766

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1995931766

Country of ref document: EP