WO1998021687A1 - Method and system for imaging an object or pattern - Google Patents
- Publication number
- WO1998021687A1 (PCT/US1997/020432)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- image
- function
- determining
- signal
- Prior art date
Classifications
- G06T5/50—Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T2200/32—Indexing scheme involving image mosaicing
- G06T2207/30108—Industrial image inspection
- G06T2207/30148—Semiconductor; IC; Wafer
- G06T2207/30152—Solder
- G01N21/8806—Specially adapted optical and illumination features
- G01N2021/8822—Dark field detection
- G01N2021/8825—Separate detection of dark field and bright field
- G01N21/9501—Semiconductor wafers
- G01N21/95607—Inspecting patterns on the surface of objects using a comparative method
- G01N2021/95638—Inspecting patterns on the surface of objects for PCB's
- G01N2021/95646—Soldering
- G02B21/367—Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
Definitions
- the present invention is directed to a system and method for imaging objects or patterns. More particularly, the present invention is directed to a system and method for simultaneously obtaining a plurality of images of an object or pattern from a plurality of different viewpoints.
- Machine vision systems are commonly used in industry for high speed inspections. In particular, these systems are used to obtain digital images of objects in order to determine, with a computer, whether the object is of "acceptable" quality with respect to predetermined specifications. For example, a system may inspect a semiconductor chip package to determine whether each of the leads of the package has the proper dimensions. A system may also inspect for coplanarity of solder balls on ball grid arrays.
- Patterns such as bar codes and data codes are also imaged by such systems. Images of these patterns are analyzed by a computer in order to "read" the information represented by these codes.
- In a machine vision system, an object (or pattern) is typically imaged by illuminating the object with light sources and capturing the light reflected from the object with a video camera (i.e., a photodetector).
- a digital image is formed from the image received by the camera and the digital data is analyzed by a computer in order to determine characteristics of the object or pattern.
- Obtaining a proper contrast between the object or pattern and the background is critical to obtaining an image of sufficient clarity for accurate analysis by a computer.
- an engineer or knowledgeable user obtains the proper contrast by varying the positions of the light sources with respect to the object or pattern being viewed and with respect to the video camera recording the scene. Additionally, the intensity and possibly the polarization and color of the light sources are varied.
- the illumination is often manipulated to make the background either dark with respect to the object features or pattern (dark-field illumination) or bright with respect to the object features or pattern (bright-field illumination).
- Obtaining the proper illumination is particularly difficult when working with specular (mirror-like) surfaces, especially when the specular surfaces are curved or multifaceted.
- the combined image tends to look smeared if there is any relative motion between the object and the camera. For example, vibration may cause the object to move slightly. Since an image of the object before the motion and after the motion will not exactly coincide, the combined image will appear smeared.
- the present invention provides a system and method for simultaneously obtaining a plurality of images of an object or pattern from a plurality of different viewpoints.
- proper image contrast is obtained by replacing the light sources of earlier systems with equivalent light sensitive devices and replacing the cameras of earlier systems with equivalent light sources. With such a system, bright-field images and dark-field images may be simultaneously obtained.
- a light source is positioned to illuminate at least a portion of an object.
- a plurality of light guides having input ends are positioned to simultaneously receive light reflected from the object and transmit the received light to a plurality of photodetectors.
- the light guides are arranged such that their respective input ends are spaced substantially equally along at least a portion of a surface of an imaginary hemisphere surrounding the object.
- the signals generated by the photodetectors are processed and a plurality of images of the object are formed.
- Another aspect of the invention provides a method for generating composite images from simultaneously obtained images. Equivalent regions of each image (corresponding to geographically identical subpictures) are compared.
- the subpicture having the highest entropy is selected and stored. This process continues until all subpictures have been considered. A new composite picture is generated by pasting together the selected subpictures.
- the vector of relative light values gathered for each pixel or region of an object illuminated or scanned (i.e., one value for each photodetector) is used to determine reflectance properties of points or regions illuminated on the object or pattern.
- the reflectance properties may be stored in a matrix and the matrix used to read, for example, a Bar Code.
- Fig. 1A is a diagram of a bright-field illumination system
- Fig. 1B is an illustration of an image obtained using the bright-field method of illumination
- Fig. 2A is a diagram of a dark-field illumination system
- Fig. 2B is an illustration of an image obtained using the dark-field method of illumination
- Fig. 3 illustrates a sequential illumination system
- Fig. 4A is a diagram of an exemplary system illustrating the principles of the present invention.
- Fig. 4B is a diagram of an exemplary photodetector arrangement
- Fig. 4C is a diagram of a sequential illumination system for reading characters
- Fig. 4D is a diagram of an exemplary system in accordance with the principles of the present invention corresponding to Fig. 4C;
- Fig. 5 illustrates the principles of the present invention in further detail
- Fig. 6 illustrates the scanner and photodiode arrangement of Fig. 5 in further detail
- Fig. 7A is a flowchart of an illustrative process for patching an image
- Fig. 7B is a flowchart illustrating composite gradient image generation
- Fig. 8 illustrates a scanner illuminating a point on a surface
- Fig. 9 illustrates a matrix representing reflectance properties of a 2-D Bar Code
- Fig. 10A illustrates reflecting properties of a shiny surface
- Fig. 10B illustrates reflecting properties of a diffuse surface
- Fig. 10C illustrates reflecting properties of a mirror (specular) surface
- Fig. 11 is a diagram of an exemplary embodiment of preprocessing hardware
- Fig. 12 is a diagram of an enhancement to Fig. 11.
- Referring to FIG. 1A, there is illustrated a simple bright-field illumination system 100.
- a video camera 110 having a lens 115 is positioned to image a shiny plate 120 having a diffuse (Lambertian) gray circle 125 painted on it.
- the reflecting properties of shiny, diffuse, and mirror (specular) surfaces are shown in Figs. 10A, 10B, and 10C respectively.
- the shiny plate 120 is orthogonal to the viewing axis of the camera 110.
- the shiny plate 120 reflects light directly back to the camera 110.
- Fig. 1B illustrates the bright-field image formed by the camera 110. As shown, the image of the circle 125B appears dark relative to the bright background 120B. If the shiny plate is replaced with a true mirror, beam splitter 160 and lamp 170 would be required to direct the light parallel to the camera axis to obtain true bright-field illumination.
- Figure 2A illustrates a dark-field illumination system.
- a camera 210, lens 215, and shiny plate 220 with a gray circle 225 are positioned in the same manner as in Fig. 1A.
- light sources (“lower light sources”) 260 and 265 are each positioned off to the side (with respect to the camera 210 field of view) and close to the shiny plate 220.
- the light sources 260 and 265 are also positioned approximately equi-distantly from the shiny plate 220.
- Light shrouds 275 prevent light from passing directly from lamps 260 and 265 to lens 215.
- Fig. 2B illustrates the dark-field image captured by the camera 210.
- the image of the circle 225B appears bright relative to the dark background 220B.
- a single system may include upper light sources 330 and 335
- each set of light sources (i.e., upper light sources 330 and 335, and lower light sources 360 and 365) may be illuminated in sequence, with an image captured for each illumination
- the most "useful" portions of the bright-field image and the dark-field image captured can be analyzed independently or can be combined to provide a single image of the object.
- Points on some surfaces have complex reflectance properties that are combinations of those shown in Figs. 10A, 10B, and 10C. Also, there may be surface regions viewed by the system of Fig. 3 that are curved or tilted with respect to the horizontal, which may spoil the bright-field or dark-field views. Therefore, the system of Fig. 3 may not satisfy a wide range of conditions that include unusual reflectance characteristics or curved or multi-sloped surfaces.
- this sequential illumination method increases capture time since a picture, e.g., a video frame, is required for each illumination. Furthermore, the combined image will appear smeared if there is any relative movement between the camera and the object.
- the present invention solves many imaging problems by simultaneously obtaining a plurality of images of the object. Specifically, the present invention provides the proper "contrast" by replacing the light sources of earlier systems with equivalent “cameras” and the cameras of the earlier systems with equivalent light sources. With such a system, a wide choice of illumination viewpoints may be obtained to obtain bright-field or dark-field images regardless of the exact local surface properties or orientation of the object or pattern being viewed.
- a scanner 410 is positioned to illuminate a shiny plate 420 having a diffuse gray circle 425 painted on it.
- the scanner 410 which has a light beam that may be, for example, continuous or AC or pulse modulated, generates a raster scanned light spot that scans across the object but emanates from the location previously occupied by the camera lens 315 of Fig. 3.
- the light spot may be "white" or a single color as is generated, for example, by a light emitting diode (LED).
- the light spot may be a single wavelength as may be generated by a laser.
- the light sources of Fig. 3, i.e., 330, 335, 360, and 365, are replaced with photodetectors 430, 435, 460, and 465 such as, for example, photodiode pickups. Because the light spot is scanned in a raster pattern, each of the photodetectors 430, 435, 460, and 465 generates a "video" signal that is synchronized with all other photodetectors 430, 435, 460, and 465.
- the signal generated at each photodetector 430, 435, 460, and 465 results from the illumination of the same "pixel" (light spot on a small region of the object).
- the signals generated at each photodetector 430, 435, 460, and 465 vary in amplitude according to the reflectance properties and orientation of the area being illuminated with respect to the relative position of the scanner 410 and the photodetector 430, 435, 460, and 465.
- a region of the object i.e., the shiny plate 420 that would appear bright to the camera 310 of Fig. 3 when illuminated by a particular light source, will generate a strong signal when illuminated with a light source (i.e., scanner 410) at the position of the original camera, but sensed by a photodetector at the location of the original light source.
- a region that appeared to be dim to the camera 310 of Fig. 3 when illuminated by a particular light source will generate a weak signal when illuminated with a light source (i.e., scanner 410) at the position of the original camera 310 of Fig. 3, but sensed by a light sensor at the location of the original light source.
- when the background of the shiny plate 420 is illuminated by the scanner 410, photodetectors 430 and 435 generate a relatively strong signal while photodetectors 460 and 465 generate a relatively weak signal. Furthermore, when the gray diffuse circle 425 is illuminated by the scanner 410, photodetectors 430 and 435 generate a relatively weak signal while photodetectors 460 and 465 generate relatively strong signals. Accordingly, photodetectors 430 and 435 capture bright-field images of the shiny plate 420 while photodetectors 460 and 465 capture dark-field images of the shiny plate 420.
- the light sensitive devices of the illustrated embodiments may employ lenses, fiber optics, light guides, or simple photodetectors.
- the photodetectors may be photomultipliers or semiconductor photodiodes such as, for example, avalanche photodiodes, or phototransistors.
- Referring to FIG. 4B, an exemplary photodetector arrangement is illustrated, generally corresponding to an array of lensed LEDs used in many machine vision applications.
- Each lens 410B of a lenslet array 420B focuses light onto a corresponding photodiode 430B of a photodiode array 440B.
- the output signal from each photodiode 430B is applied to a summing amplifier 450B.
- the output signal from the summing amplifier 450B may then be sampled.
- the output signal from each individual photodiode 430B may be individually sampled.
- a single fiber bundle or light guide may be used to gather the light from each lenslet at its focal point and the light from all of the bundles or light guides may be summed at a single photodetector.
- Fiber-Lite®, manufactured by Dolan-Jenner Industries, is an illumination system that couples a light input through a fiber optic assembly to form a line of light (MV-IND150LA), an area of light (MV-IND150ABL), a point of light (MV-IND150FO), or a ring of light (MV-IND150RL).
- a camera 450C must capture three separate images of the characters - one per light source.
- each of the light sources of Fig. 4C (i.e., 410C, 420C, and 430C) is replaced with a corresponding photodetector (410D, 420D, and 430D)
- the camera 450C of Fig. 4C is replaced with a laser scanner
- the characters on the wafer are required to be scanned only one time, with three images being simultaneously captured by the photodetectors 410D, 420D, and 430D.
- Fig. 5 is a diagram illustrating the general principles of the present invention in further detail.
- a scanner 500 positioned to illuminate an object 501 (such as, for example, a semiconductor package) , is controlled by scan control circuitry 502 (under the control of sync signals from a microprocessor or associated hardware 503) to scan the object 501 in a raster pattern.
- the scan control circuitry 502 provides horizontal and vertical scan control signals 504 and a pixel clock signal 505, to control the scanner 500 to sequentially illuminate each spot (i.e., pixel) on the object 501.
- Photodetectors for example, photodiodes PD1 - PDn, are positioned to capture light reflected by the object 501 (as a result of illumination by the scanner 500) from various viewpoints. As illustrated, for a horizontal surface, the photodiodes positioned closest to the object 501 (i.e., photodiodes PD5-PDn) provide dark-field images while the remaining photodiodes (i.e., PD1-PD4) are positioned for bright-field imaging.
- PD1-PD4 and PD5-PDn may reverse in which case PD5-PDn would provide bright-field images and PD1-PD4 would provide dark-field images.
- more or less photodiodes than are shown in Fig. 5 may be used to gather sufficient data for a particular machine vision application.
- Each of the photodiodes PD1 - PDn is connected to an amplifier A1 - An for amplifying the signals (representing the intensity of the reflected light detected by the photodiodes PD1 - PDn) generated by the photodiodes PD1 - PDn. Due to the variation in specularity of the surfaces of the scanned object 501, light intensity levels into the photodiodes PD1 - PDn may have a very large dynamic range. Accordingly, logarithmic amplifiers may be advantageously used. In the exemplary embodiment of the present invention, logarithmic amplifiers provide several advantages over linear amplifiers (although other types of amplifiers may be used):
- logarithmic output signals may be easily normalized since dividing the output signal by a reference signal is performed by simply subtracting the reference from the output.
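As a minimal sketch of this property (the function and values are illustrative, not from the patent), normalization against a reference becomes a simple subtraction in the log domain:

```python
import math

def normalize_log(signal_v, reference_v):
    """With logarithmic amplification, dividing a raw signal by a
    reference is equivalent to subtracting their log-domain outputs."""
    return math.log10(signal_v) - math.log10(reference_v)

# Equivalent to log10(signal / reference):
ratio = normalize_log(0.8, 0.2)  # log10(4)
```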
- Each of the amplifiers A1 - An is connected to a sample and hold circuit (or register) SH1 - SHn for sampling the signals output by the amplifiers A1 - An.
- the sample and hold circuits SH1 - SHn are synchronized with the scanner by scan control circuitry 502 so that signals representing the intensity of reflected light detected by photodetectors PD1 - PDn at the same given instant in time are sampled for each spot of the object illuminated by the scanner 500.
- the signals output by the sample and hold circuitry SH1 - SHn are applied to a multiplexer MPX.
- the analog signals from the sample and hold circuitry SH1 - SHn are sequentially applied to an analog to digital (A/D) converter 507 by the multiplexer MPX.
- the digital signals generated by the A/D converter 507 are buffered in a buffer memory 508 (or other recording device) at addresses identified by data control circuitry (under the control of the microprocessor 503).
- each spot illuminated by the scanner 500 is simultaneously imaged by the photodiodes PD1 - PDn. That is, for each spot illuminated at a given X-Y coordinate, a digital intensity value is stored in the buffer memory 508 representing the intensity of the light reflected by the object 501 as detected by each of the photodiodes PD1 - PDn.
- n images of the object 501 are captured (i.e., one image per photodiode) as a result of a single scan of the object 501.
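The acquisition just described can be sketched in software (a hypothetical model with invented names; `sample_detectors` stands in for the synchronized sample-and-hold, multiplexer, and A/D chain of Fig. 5). A single raster scan populates all n images at once:

```python
def capture_images(scan_positions, sample_detectors, n_detectors, width, height):
    """One raster scan yields n images, one per photodetector.
    sample_detectors(x, y) returns one intensity per detector for
    the spot currently illuminated at pixel (x, y)."""
    images = [[[0] * width for _ in range(height)] for _ in range(n_detectors)]
    for (x, y) in scan_positions:
        values = sample_detectors(x, y)  # simultaneous sample of all detectors
        for d, v in enumerate(values):
            images[d][y][x] = v
    return images
```

Because every detector is sampled at the same pixel clock tick, the n images are registered with each other by construction.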
- Fig. 6 illustrates the scanner 500 and the photodiode PD1 - PDn arrangement of Fig. 5 in further detail.
- a light beam 600 from a light source 610 is directed through a beam splitter 620 into a lens 630 that focuses the beam 600 to a predetermined spot size on plane 660 via X and Y mirror galvanometers 640 and 650.
- the X galvanometer 640 controlled by the X and Y scan control signals 504 of Fig. 5 and preferably oscillating in accordance with pixel clock 505, reflects the beam 600 onto a Y mirror galvanometer 650.
- the Y galvanometer 650 also controlled by the X and Y scan control signals 504, reflects the beam 600 onto a point of the object 660 under examination.
- sequentially moving the X galvanometer 640 in the direction of arrow 640A causes the beam 600 to illuminate points on the object's surface along an X axis
- sequentially moving the Y galvanometer 650 in the direction of arrow 650A causes the beam 600 to illuminate points along a Y axis
- the scanner 500 is controllable to illuminate each point on the object's surface 660 in a raster pattern. This spot may be illuminated continuously or just during a brief interval at each pixel position as it travels from one pixel position to another according to the pixel clock signal 505.
- the X galvanometer 640 may be replaced with a fixed mirror so that the object 660 is scanned in a single line along the Y axis. The object 660 may then be translated in the X direction via a conveyor or translation table in order to raster scan the object 660.
- the Y galvanometer 650 may be replaced with a fixed mirror and the object 660 translated in the Y direction, or both galvanometers 640 and 650 may be replaced with fixed mirrors and the object translated by an X-Y translation table.
- although galvanometers 640 and 650 are shown, other deflection devices such as rotating polygons with mirrored surfaces, rotating prisms, and acousto-optic beam deflectors, all of which are well known in the art, may be used to obtain the desired scan pattern.
- the light beam deflection optics may have many variants, such as the use of optical scan lenses 680 (for example, an F-Theta or telecentric lens) between the last beam deflector (here, galvanometer 650) and the object, which can be used to provide a more uniform scan pattern or a pattern in which the beam remains substantially perpendicular to surface 660 over all X,Y beam positions.
- the scanner 500 of the exemplary embodiment further includes a lens 670 which focuses light reflected from the object along the beam path 600 onto a photodiode PDn+1 to sample the light that returns directly along the illumination path.
- This photodiode corresponds to light source 170 of Fig. 1A.
- a stop 671 is included to absorb light that is deflected by beam splitter 620.
- light guides LG1 - LGn are distributed around the periphery of an imaginary hemisphere surrounding the object such that their respective input ends are uniformly angularly spaced when viewed from the center of the hemisphere (i.e., the approximate object location).
- Simple patterns such as closely packed circles or hexagons may be used to evenly space the input ends of the light guides LG1 - LGn in azimuth and elevation along the entire surface of the hemisphere, each of the ends at the center of a circle or hexagon.
- the axis of the center beam from scanner 500 may be aligned with a corner where three hexagons meet.
- the axis of the center ray beam of the scanner 500 may be aligned with the center of the "top" hexagon.
- Many other distribution schemes are possible.
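One such scheme can be sketched as follows (an illustrative ring-based construction, not the patent's exact circle/hexagon packing): place viewpoints on elevation rings, with the number of azimuth positions per ring scaled by the ring's circumference so the angular spacing stays roughly uniform.

```python
import math

def hemisphere_viewpoints(n_rings, base_count):
    """Return (azimuth, elevation) pairs roughly uniformly spaced over
    a hemisphere: equal elevation steps, with azimuth counts per ring
    proportional to cos(elevation) (the ring circumference)."""
    points = []
    for r in range(n_rings):
        elev = (r + 0.5) * (math.pi / 2) / n_rings
        count = max(1, round(base_count * math.cos(elev)))
        for k in range(count):
            az = 2 * math.pi * k / count
            points.append((az, elev))
    return points
```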
- each of the individual light guide LG1 - LGn input ends may lie above or below the surface of the hemisphere. However, each light guide LG1 - LGn input end is positioned to maintain the desired angular location when viewed from the object. Output variations between the photodetectors that are connected to light guides LG1 - LGn whose input ends are closer or further to the object may be removed during equipment calibration or via computation. Computations are based on the known distance between each input end and the normal object location using the inverse square law.
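A minimal sketch of that inverse-square computation (names and values are illustrative): a guide whose input end sits at twice the reference radius receives one quarter of the light, so its reading is scaled up by four.

```python
def radial_correction(measured, distance, reference_distance):
    """Scale a photodetector reading to what it would be if its light
    guide input end sat at the reference (hemisphere) radius, per the
    inverse square law."""
    return measured * (distance / reference_distance) ** 2
```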
- the output ends of the light guides LG1 - LGn are proximity focused onto associated photodetectors, i.e., photodiodes PD1 - PDn (described above in connection with Fig. 5).
- a separate lens may be used to image the output end of each light guide LG1 - LGn onto its corresponding photodiode PD1 - PDn.
- a separate lens may be used to image the field of view onto each light guide input end.
- the separate lens may be selected with a large numerical aperture for maximum light capture. Depth of field and exact focus are not as important considerations as they would be in a camera lens, which must resolve adjacent pixels. If the lens associated with the input end of the fiber is somewhat out of focus so that light spills outside of the fiber end, it merely reduces the amount of light captured - it does not affect the sharpness of the captured image. Conversely, depth of field and focus are important issues for the raster scanned beam.
- the captured image will be blurred since the light spot impinging on the surface being scanned may be significantly larger than the space between pixels as defined by the distance between illuminating pulses or recording intervals. Maximizing the depth of field requires minimizing the numerical aperture of the spot scanner optics, which makes it important to choose a bright source if it is desired to maintain a high level of illumination.
- a laser is a suitable light source.
- narrow bandpass light filters (for example, of 10 nm width) may be used
- Such filters may be placed anywhere in the light path between the scene and the photodetectors.
- when substituting photodetectors for a light source at a particular viewpoint as described in connection with Figs. 4A, 5, and 6, several factors may be taken into account, such as, for example, sensitivity at chosen wavelengths, dynamic range, and frequency response.
- Avalanche photodiodes are generally very fast devices with large dynamic ranges and are particularly well suited for capturing high speed pulses at extremely low light levels due to their very high sensitivity.
- Photomultipliers have similar properties.
- the ordinary photodiode, p-i-n photodiode, or phototransistor is also capable of good performance in video applications but is of less utility with extremely high speed pulses at extremely low light levels. All of the solid state photodetector devices lose their high frequency capability as their area (and hence capacitance) increases. Accordingly, although it would appear easiest to emulate a light source viewpoint by locating the photosensitive device at the desired position and enlarging its area until it equaled that of the light source it was replacing, the loss of high frequency response and the increase in attendant noise (due to increased area) will not always permit this approach.
- One exemplary approach is to use a lens to image the scanned scene onto the photosensitive device. This increases the energy collected to that collected over the area of the lens without increasing the area of the photodetector (with all of the corresponding disadvantages).
- a non-imaging device such as, for example, a tapered light pipe, may be used instead of a lens. Within certain limits, the gain achieved via a tapered light pipe is equal to the input area divided by the area exiting to the photodetector. If an attempt is made to achieve too high a gain, the output rays will emerge almost parallel to the photodetector surface and, by Fresnel relationships, be reflected by the photodetector rather than absorbed.
- the exemplary embodiment illustrated in Figs. 5 and 6 may be calibrated by scanning a flat white object and normalizing the output from each photodiode PD1 - PDn with respect to each other.
- the correction values for normalization may then be stored in a table in memory accessible by microprocessor 503, and used during image processing. Although one value could be recorded for each illuminated position of the raster scan, it may only be necessary to store a small subset of this information since the correction values will generally vary very slowly across the field of view.
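The white-target normalization might be modeled as follows (a hypothetical sketch with invented helper names; the patent stores the correction values in a table accessible by microprocessor 503):

```python
def calibration_gains(white_scan):
    """white_scan: per-detector mean responses from scanning a flat
    white target. Returns multiplicative correction factors that
    equalize all detectors to the strongest response."""
    peak = max(white_scan)
    return [peak / v for v in white_scan]

def apply_gains(samples, gains):
    """Apply the stored correction factors to one pixel's samples."""
    return [s * g for s, g in zip(samples, gains)]
```

As the text notes, a coarse table is usually enough, since the correction varies slowly across the field of view.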
- equivalent regions of each image are compared (step 710). Since the "useful" portion of a scene will generally be the portion with the highest entropy -- in a practical sense, the portion with the most change or "detail" -- the subpicture having the highest entropy (for the image information sought) is selected and is stored in memory (step 720).
- One way to determine the entropy of each subpicture is to pass each subpicture through a 2-D high-pass spatial frequency filter and then square each resulting pixel value. If desired, each pixel may be compared to a threshold value and set to zero if less than the threshold value (in order to eliminate pixels representing noise). The pixel values in the subpicture may then be summed to obtain a value for the entropy of the subpicture.
- the subpicture may be passed through a corresponding bandpass spatial filter in place of, or in addition to, the aforementioned high-pass filter.
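The entropy measure described above can be sketched in a few lines. The 3x3 Laplacian kernel standing in for the 2-D high-pass spatial frequency filter is an assumption (the text does not name a specific kernel), as is the function name:

```python
import numpy as np

def subpicture_entropy(subpic, threshold=0.0):
    """Estimate the 'entropy' (detail content) of a subpicture:
    high-pass filter, square each pixel, optionally zero out values
    below a noise threshold, then sum.  The Laplacian kernel below is
    an illustrative stand-in for the unspecified high-pass filter."""
    kernel = np.array([[0, -1, 0],
                       [-1, 4, -1],
                       [0, -1, 0]], dtype=float)
    h, w = subpic.shape
    filtered = np.zeros((h, w), dtype=float)
    # Direct 2-D convolution over interior pixels (zero border).
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            filtered[y, x] = np.sum(kernel * subpic[y-1:y+2, x-1:x+2])
    energy = filtered ** 2            # square each filtered pixel value
    energy[energy < threshold] = 0.0  # suppress pixels likely due to noise
    return energy.sum()
```

A flat region yields zero, while a region containing edges yields a positive value, which is exactly the ordering the selection step needs.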
- the process of Fig. 7A continues until all subpictures have been considered (step 730).
- a new composite picture is then generated (by patching together the selected subpictures) (step 740) that best expresses the detail or structure of the pattern or object being examined.
- patching is simple because there is no perspective distortion due to the varied viewpoints.
- Data captured at the same instant of time at each viewpoint will almost always be from the same illuminated spot or "pixel." Occasionally light may be received via a multiple reflection, but this will not be the usual case.
- each bit in one of the subpictures (for example, the dark-field subpicture) is XOR'ed with a "1" so as to "flip the bits" of that subpicture and conform it to the bright-field subpicture.
- the dark-field subpicture is converted to an equivalent bright-field subpicture.
- the bright-field subpicture may be converted to an equivalent dark-field subpicture in a similar manner.
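The bit-flip conversion amounts to inverting each pixel of a binary subpicture; a minimal sketch (function name illustrative):

```python
def flip_field(binary_subpicture):
    """Convert a binary dark-field subpicture to its equivalent
    bright-field subpicture (or vice versa) by XOR'ing every bit
    with 1, i.e. flipping the bits."""
    return [[bit ^ 1 for bit in row] for row in binary_subpicture]
```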
- a composite edge image or composite gradient magnitude image may also be derived directly from the individual images obtained from each of the photodetectors.
- Fig. 7B is a flowchart of an exemplary process for deriving the composite gradient magnitude image directly from the individual images.
- in step 710B, df/dx is derived (for each of the image matrices P1 - Pn obtained from the photodetectors) from the convolution of each image matrix with the Sobel horizontal mask hh (i.e., the Sobel kernel sensitive to vertical edges):
- in step 720B, df/dy is derived for each of the image matrices P1 - Pn from the convolution of each image matrix with the Sobel vertical mask hv (i.e., the Sobel kernel sensitive to horizontal edges):
- the composite gradient image may be optionally thresholded (step 750B) .
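The Fig. 7B flow might be sketched as follows. The combination rule (pixel-wise maximum across the per-detector gradient magnitudes) is an assumption, since the excerpt above only names the Sobel steps and the optional thresholding:

```python
import numpy as np

def composite_gradient_magnitude(images, threshold=None):
    """Per-image Sobel gradients, then a composite gradient-magnitude
    image taken (by assumption) as the pixel-wise maximum across all
    photodetector images; optional thresholding as in step 750B."""
    h_h = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal mask
    h_v = h_h.T                                                        # vertical mask

    def conv2(img, k):
        out = np.zeros(img.shape, dtype=float)
        H, W = img.shape
        for y in range(1, H - 1):
            for x in range(1, W - 1):
                out[y, x] = np.sum(k * img[y-1:y+2, x-1:x+2])
        return out

    mags = []
    for img in images:
        gx = conv2(img, h_h)            # df/dx (step 710B)
        gy = conv2(img, h_v)            # df/dy (step 720B)
        mags.append(np.hypot(gx, gy))   # gradient magnitude
    composite = np.maximum.reduce(mags)
    if threshold is not None:           # optional step 750B
        composite = (composite >= threshold).astype(float)
    return composite
```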
- Reflectance The vector of relative light values gathered for each pixel or region being illuminated (i.e., one value for each photodiode PD1 - PDn of Fig. 5) provides a means to infer reflectance properties (e.g., whether the surface is specular or matte) of points or regions illuminated on the object or pattern.
- For each vector of relative light values collected for each pixel, the following may be determined by the processor 503 or external circuitry (not shown): 1) the location of the photodetector that has the largest signal and its signal amplitude (the signal amplitude may be used as a reference and the location used to determine the orientation of the point corresponding to the pixel on the object surface); 2) the total (relative) energy received
- the distance of each sensor from the reference sensor (for a given configuration the location of each of the sensors is known, so the distance is easily calculated)
- a significant fraction of the total energy received (e.g., the fraction of the total number of detectors that capture almost all of the energy)
- this may be calculated, for example, by adding the largest signals (in size order, largest first) until the total reaches a predetermined percentage of the total energy received by the system, and determining how many signals were added
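The counting procedure just described is straightforward; a sketch, where the 90% cutoff is an illustrative choice for the "predetermined percentage":

```python
def detectors_capturing(signal_values, fraction=0.9):
    """Count how many of the largest signals must be summed (largest
    first) before the running total reaches `fraction` of the total
    energy received by the system."""
    total = sum(signal_values)
    running, count = 0.0, 0
    for v in sorted(signal_values, reverse=True):  # size order, largest first
        running += v
        count += 1
        if running >= fraction * total:
            break
    return count
```

A specular surface concentrates the energy in one detector (count near 1), while a matte surface spreads it (count near n).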
- reflectance properties may be inferred. If, for example, a point on an object is highly specular (in an ideal sense), one photodetector would receive all (or most) of the reflected light energy. As shown in Fig. 8, for example, a scanner 810 illuminates a point 820 on a surface of an object 830. If the object is specular at the point 820, the reflected light intensity detected by one of the photodiodes (in this case, photodiode PD2) will likely be significantly larger than the light detected by neighboring photodiodes (here, photodiodes PD1 and PD3). Conversely, if the vector of light intensity values associated with illumination of the point 820 consists of approximately equal values (except for the cosine falloff with angle), the surface being illuminated is diffuse or matte.
- reflectance properties would be evident from all of the values calculated in connection with 1) through 7) above.
- the value calculated for item 5 (i.e., the fraction of the total number of detectors that capture almost all of the energy) may be sufficient to infer specularity (e.g., item 5 would be very small).
- the computed values corresponding to items 1, 3, and 4 will be close in amplitude.
- the computed values corresponding to items 6 and 7 will be small.
- Since the reflectance properties corresponding to a point are contained in the computed relationships of the values in the vector (as set forth, for example, in items 1-7 above), the amount of information that must be stored and processed to sufficiently describe these properties can be significantly decreased. Rather than storing each element of the vector (each element corresponding to one photodetector), it may only be necessary to store a subset of values corresponding to the computed properties of the vector. For example, under certain circumstances, recording items 1, 5, and 6 (above) for each point scanned (as derived from the vector of light intensity values associated with that point) provides sufficient information to infer reflectance. This may be done on a pixel-by-pixel basis.
- Pre-Processing Circuitry When a reduction in the amount of data recorded as described above is desired, the input vector must be examined or pre-processed between illumination of adjacent pixels. Accordingly, special purpose hardware can be used to ensure that the required processing can be done in the inter-pixel time interval. Such special purpose processing may take advantage of parallel architecture or mixed analog and digital processing to obtain the necessary speed. The portion of the system outlined in dashed lines 511 illustrated in Fig. 5 may be replaced with such pre-processing hardware.
- Fig. 11 illustrates an exemplary embodiment of the pre- processing hardware.
- the signals from each of the log amplifiers A1 - An of Fig. 5 are applied to corresponding analog-to-digital (A/D) converters A/D1 - A/Dn, where the analog signals are converted to eight-bit digital signals.
- the digital signals from the A/D converters A/D1 - A/Dn are applied, in parallel, to logic circuit 1101, which identifies and extracts the largest digital signal.
- The details of logic circuit 1101 are not shown since there are any number of ways that it can be designed. For example, the most significant bit of each vector element may be examined. Elements having a "0" in this position may be eliminated from further consideration if any of the other elements have a "1" in the same position. This may be repeated for each bit position, one at a time, until the least significant bit of the elements has been considered. At that time, only the largest of the elements will remain. Although this can be performed as a sequential process, it may be advantageous to implement this operation (circuit 1101) as a parallel process using hard-wired logic (for example, a PAL, an ASIC, etc.) to obtain high-speed operation. Using additional gating at the logic circuit 1101, a processor 1102 may address any of the vector elements in order to use the logic circuit 1101 as a demultiplexer or selector switch.
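The bit-serial elimination scheme described for logic circuit 1101 can be modeled in software (an eight-bit word width is assumed, matching the A/D converters):

```python
def largest_elements(vector, bits=8):
    """Model of logic circuit 1101: scan from the most significant bit
    down, eliminating elements that have a 0 in a position where any
    surviving element has a 1.  Returns the indices of the element(s)
    holding the largest value."""
    candidates = list(range(len(vector)))
    for b in range(bits - 1, -1, -1):        # MSB first
        ones = [i for i in candidates if (vector[i] >> b) & 1]
        if ones:                             # drop elements with a 0 here
            candidates = ones
    return candidates
```

Note that ties survive to the end, which mirrors the hardware: equal largest signals all remain after the last bit position is considered.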
- the element extracted as the "largest" (i.e., a "reference" value) can now be used to normalize the other vector elements. Since the vector elements are log functions (as when amplifiers A1 - An are log amplifiers), normalization can be accomplished using digital subtraction circuits DS1 - DSn. Specifically, the digital signals from A/D converters A/D1 - A/Dn are applied to the positive inputs of corresponding subtraction circuits DS1 - DSn, where the "largest" vector element from logic circuit 1101 is subtracted from each. The result for each element (except, of course, for any element equal to the "largest" vector element) is a negative number proportional to the log of the ratio of that element to the reference value.
- the processor 1102 polls the element values to rapidly determine which element values are larger than some particular fraction of the energy of the reference value. For example, if the processor is to determine the number of elements whose power is at least 1/e**2 of the reference value, each of the signals output from the digital subtractors DS1 - DSn is applied to the positive input of a corresponding digital subtraction circuit DSS1 - DSSn, and log(1/e**2) (1103) is subtracted therefrom. The elements that have more power than 1/e**2 of the reference value produce a positive value at the output of the corresponding subtraction circuits DSS1 - DSSn; the elements that have less power produce a negative value.
- the signals from the digital subtraction circuits DSS1 - DSSn are applied to corresponding sign function (sgn) circuits SGN1 - SGNn, each of which outputs a high or positive signal if the input signal is positive and a low or negative signal if the input signal is negative.
- the signals output by the sgn circuits SGN1 - SGNn are transmitted to the processor 1102.
- a processor (i.e., processor 1102) having an n-bit word can thus identify which of the n element values exceed a particular fraction of the reference power.
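The chain of Fig. 11 can be mimicked in software. The sketch below assumes natural-log amplifiers (only the base has to match throughout) and uses the 1/e**2 fraction from the example above; function and variable names are illustrative:

```python
import math

def threshold_flags(linear_powers, fraction=math.exp(-2)):
    """Software model of Fig. 11: log-convert (amplifiers A1 - An),
    subtract the largest log value (subtractors DS1 - DSn), subtract
    log(fraction) (subtractors DSS1 - DSSn), and take the sign
    (circuits SGN1 - SGNn).  A flag of 1 means that element's power
    is at least `fraction` of the reference power."""
    logs = [math.log(p) for p in linear_powers]
    ref = max(logs)                        # output of logic circuit 1101
    normalized = [v - ref for v in logs]   # log(element / reference), <= 0
    thresh = math.log(fraction)            # constant 1103
    return [1 if (v - thresh) >= 0 else 0 for v in normalized]
```

Packing the flags into one n-bit word is then a trivial shift-and-or, which is how processor 1102 can read the whole comparison in a single word.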
- the diagram of Fig. 12 illustrates an enhancement to the pre-processing circuitry of Fig. 11.
- the logic circuitry illustrated in Fig. 12 replaces logic circuit 1101.
- the analog signals from amplifiers A1 - An of Fig. 5 are applied to corresponding high-speed operational amplifier ("op amp") and diode networks OP1 - OPn.
- the op amp network(s) OP1 - OPn corresponding to the largest signal(s) input from amplifiers A1 - An generate a positive output signal.
- the remaining networks OP1 - OPn generate a negative output signal.
- a negative pulse 1201 resets latches 1202 before each illumination light pulse. This pulse is also inverted by inverter 1203 (after a delay) and applied to each of the second terminals of NAND gates N1 - Nn. The signals output by each NAND gate N1 - Nn are applied, in parallel, to latches 1202, which latch the applied signals. The signals from latches 1202 are then applied to a selector switch 1204 to select the appropriate signal (i.e., the largest signal) from the signals received from A/D1 - A/Dn. The selected signal is then output at output terminal 1205 and may then be used as the reference signal (described above in connection with Fig. 11).
- 2-D Bar Code Determining the reflectance property of each point on a surface is particularly useful in a machine vision application such as reading two-dimensional ("2-D") Bar Codes and data matrix symbols (as described in U.S. Patent Nos. 4,939,354 and 5,053,609, both expressly incorporated herein by reference). Bar Codes and data matrix symbols are typically generated on a part by altering its local reflectance properties via laser marking, sandblasting, peening, or other means.
- the processor 503 analyzes the information stored regarding each point (e.g., items 1, 5, and 6 above) and generates a two dimensional bit matrix representing the inferred reflectance property of each point illuminated on the object's surface.
- FIG. 9 illustrates a portion of the generated matrix 910 representing a data matrix symbol sandblasted on a specular surface (such as, for example, stainless steel).
- a "1" identifies that the corresponding point on the object's surface is highly reflective or specular while a "0" identifies that the point is matte.
- the vector of relative light values gathered for each pixel also provides a means to infer the surface orientation of points or regions illuminated on the object or pattern. As illustrated in Fig. 8, the normal to the surface at that location may be determined by observing which photodiode detected the highest-intensity reflected light, since the approximate locations of the light source 810, the photodiodes PD1 - PD3, and the object 830 are known.
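For an ideally specular point, the surface normal bisects the unit vectors from the point toward the source and toward the brightest photodetector. A geometric sketch under that mirror-reflection assumption (names illustrative):

```python
import math

def surface_normal(point, source_pos, brightest_pd_pos):
    """Estimate the surface normal at an illuminated point, assuming
    the brightest photodetector lies on the mirror-reflection
    direction: the normal is the normalized sum (bisector) of the unit
    vectors toward the source and toward that detector."""
    def unit(v):
        mag = math.sqrt(sum(c * c for c in v))
        return tuple(c / mag for c in v)
    to_src = unit(tuple(s - p for s, p in zip(source_pos, point)))
    to_det = unit(tuple(d - p for d, p in zip(brightest_pd_pos, point)))
    return unit(tuple(a + b for a, b in zip(to_src, to_det)))
```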
- 2-D And 3-D Images As a person of skill in the art will understand, the present invention can be applied to simultaneously obtaining multiple 2-D images of an object, simultaneously obtaining multiple three-dimensional (3-D) images of an object, and simultaneously obtaining both 2-D and 3-D images of an object.
- One 3-D imaging technique (obtaining a single 3-D image) is described in U.S. Patent No. 4,957,369 to Antonsson, expressly incorporated herein by reference.
- a common housing puts all of the optical scanning and photodetection equipment into one easily handled package, and creates a known fixed geometric relationship between the light source and the various photodetectors. This fixed relationship is useful for computing ranges via triangulation and for taking distance and angle into account when determining reflectance properties.
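Range by triangulation over the fixed baseline can be sketched with the law of sines. The formula below is the standard relation rather than one given in the text, and it assumes both angles are measured from the baseline:

```python
import math

def triangulate_range(baseline, angle_source, angle_detector):
    """Distance from the light source to the illuminated spot: with a
    known baseline between source and photodetector, and the angles
    (radians) that the outgoing beam and the returning ray make with
    the baseline, the law of sines gives the range."""
    spot_angle = math.pi - angle_source - angle_detector  # angle at the spot
    return baseline * math.sin(angle_detector) / math.sin(spot_angle)
```

For example, with a unit baseline and both angles at 45 degrees, the spot sits at the apex of a right isosceles triangle, at range sqrt(1/2) from the source.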
- the photodetectors may be multi-output position-sensing devices, where a ratio of signals indicates angle and the sum of the signals indicates the light value, as disclosed in U.S. Patent 4,957,369 issued to Antonsson, expressly incorporated herein by reference.
- the laser scanning function can be operator held and made as small and lightweight as possible.
- the photodetectors and processing equipment may be distributed throughout the room or work area where the hand-held scanner will be used. Most of the photodetectors, however, should not be shadowed from the area of the object that is being scanned by the light spot.
- the object can be located at a large distance from the hand-held laser scanner as long as the laser spot is kept reasonably well focused on the object surface.
- focus can be maintained in several ways. The simplest are based on normal camera-based autofocus techniques such as sonic ranging, maximizing detail, meter ranging, etc.
- the scanner 500 could have its own internal light source (AC or pulsed) used as a target for two photodetectors on a known baseline. This allows the system to track the location of the hand scanner with respect to the known baseline. Since the system knows both the scanner location and the target location (the latter by tracking the light spot on the object surface), the processor 503 could compute the range between the scanner and the target and use this information to adjust the optics in the scanner to maintain focus.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biochemistry (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Image Processing (AREA)
- Image Input (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Non-Silver Salt Photosensitive Materials And Non-Silver Salt Photography (AREA)
- Holo Graphy (AREA)
- Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
- Preparing Plates And Mask In Photomechanical Process (AREA)
- Thermal Transfer Or Thermal Recording In General (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
Abstract
Description
Claims
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP97946611A EP1016028B1 (en) | 1996-11-12 | 1997-11-07 | Method and system for imaging an object or pattern |
DK97946611T DK1016028T3 (en) | 1996-11-12 | 1997-11-07 | Method and system for mapping an object or pattern |
CA002271492A CA2271492C (en) | 1996-11-12 | 1997-11-07 | Method and system for imaging an object or pattern |
AU51747/98A AU5174798A (en) | 1996-11-12 | 1997-11-07 | Method and system for imaging an object or pattern |
AT97946611T ATE250249T1 (en) | 1996-11-12 | 1997-11-07 | METHOD AND SYSTEM FOR REPRESENTING AN OBJECT OR PATTERN |
DE69725021T DE69725021T2 (en) | 1996-11-12 | 1997-11-07 | METHOD AND SYSTEM FOR PRESENTING AN OBJECT OR PATTERN |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/748,040 | 1996-11-12 | ||
US08/748,040 US6075883A (en) | 1996-11-12 | 1996-11-12 | Method and system for imaging an object or pattern |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1998021687A1 true WO1998021687A1 (en) | 1998-05-22 |
Family
ID=25007738
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US1997/020432 WO1998021687A1 (en) | 1996-11-12 | 1997-11-07 | Method and system for imaging an object or pattern |
Country Status (10)
Country | Link |
---|---|
US (3) | US6075883A (en) |
EP (2) | EP1359534B1 (en) |
AT (2) | ATE250249T1 (en) |
AU (1) | AU5174798A (en) |
CA (1) | CA2271492C (en) |
DE (2) | DE69736818D1 (en) |
DK (1) | DK1016028T3 (en) |
ES (1) | ES2207754T3 (en) |
PT (1) | PT1016028E (en) |
WO (1) | WO1998021687A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011056282A1 (en) * | 2009-11-05 | 2011-05-12 | Aerospace Corporation | Refraction assisted illumination for imaging |
WO2011119372A1 (en) * | 2010-03-26 | 2011-09-29 | The Aerospace Corporation | Refraction assisted illumination for imaging |
US8450688B2 (en) | 2009-11-05 | 2013-05-28 | The Aerospace Corporation | Refraction assisted illumination for imaging |
US9007454B2 (en) | 2012-10-31 | 2015-04-14 | The Aerospace Corporation | Optimized illumination for imaging |
Families Citing this family (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6705526B1 (en) | 1995-12-18 | 2004-03-16 | Metrologic Instruments, Inc. | Automated method of and system for dimensioning objects transported through a work environment using contour tracing, vertice detection, corner point detection, and corner point reduction methods on two-dimensional range data maps captured by an amplitude modulated laser scanning beam |
US20020014533A1 (en) | 1995-12-18 | 2002-02-07 | Xiaxun Zhu | Automated object dimensioning system employing contour tracing, vertice detection, and forner point detection and reduction methods on 2-d range data maps |
JP4143788B2 (en) * | 1999-08-04 | 2008-09-03 | 澁谷工業株式会社 | Ball mounting apparatus and mounting method |
US6268093B1 (en) * | 1999-10-13 | 2001-07-31 | Applied Materials, Inc. | Method for reticle inspection using aerial imaging |
US6912076B2 (en) * | 2000-03-17 | 2005-06-28 | Accu-Sort Systems, Inc. | Coplanar camera scanning system |
EP1269414B1 (en) * | 2000-03-30 | 2003-10-22 | BRITISH TELECOMMUNICATIONS public limited company | Image processing |
US7173648B1 (en) * | 2000-04-21 | 2007-02-06 | Advanced Micro Devices, Inc. | System and method for visually monitoring a semiconductor processing system |
US7196782B2 (en) * | 2000-09-20 | 2007-03-27 | Kla-Tencor Technologies Corp. | Methods and systems for determining a thin film characteristic and an electrical property of a specimen |
US6891627B1 (en) | 2000-09-20 | 2005-05-10 | Kla-Tencor Technologies Corp. | Methods and systems for determining a critical dimension and overlay of a specimen |
AU2002226951A1 (en) * | 2000-11-17 | 2002-05-27 | Oregon Health And Science University | Stereotactic wands, endoscopes, and methods using such wands and endoscopes |
US6547409B2 (en) * | 2001-01-12 | 2003-04-15 | Electroglas, Inc. | Method and apparatus for illuminating projecting features on the surface of a semiconductor wafer |
US7171033B2 (en) * | 2001-03-28 | 2007-01-30 | The Boeing Company | System and method for identifying defects in a composite structure |
US7072502B2 (en) | 2001-06-07 | 2006-07-04 | Applied Materials, Inc. | Alternating phase-shift mask inspection method and apparatus |
US20020186878A1 (en) * | 2001-06-07 | 2002-12-12 | Hoon Tan Seow | System and method for multiple image analysis |
US6766954B2 (en) * | 2001-06-15 | 2004-07-27 | Symbol Technologies, Inc. | Omnidirectional linear sensor-based code reading engines |
US6845178B1 (en) * | 2001-06-27 | 2005-01-18 | Electro Scientific Industries, Inc. | Automatic separation of subject pixels using segmentation based on multiple planes of measurement data |
US7113633B2 (en) * | 2001-07-02 | 2006-09-26 | Photoinaphoto.Com, Inc. | System and method for discovering and categorizing attributes of a digital image |
EP1273928A1 (en) * | 2001-07-06 | 2003-01-08 | Leica Geosystems AG | Method and device for suppressing electromagnetic background radiation in an image |
US7356453B2 (en) | 2001-11-14 | 2008-04-08 | Columbia Insurance Company | Computerized pattern texturing |
US7221805B1 (en) * | 2001-12-21 | 2007-05-22 | Cognex Technology And Investment Corporation | Method for generating a focused image of an object |
US7344082B2 (en) * | 2002-01-02 | 2008-03-18 | Metrologic Instruments, Inc. | Automated method of and system for dimensioning objects over a conveyor belt structure by applying contouring tracing, vertice detection, corner point detection, and corner point reduction methods to two-dimensional range data maps of the space above the conveyor belt captured by an amplitude modulated laser scanning beam |
DE10209269C1 (en) * | 2002-03-01 | 2003-07-03 | Leuze Electronic Gmbh & Co | Optoelectronic device for detecting barcode markings has receiver enclosed by light shield with channel structure allowing passage of light reflected by beam deflection device |
US6813016B2 (en) * | 2002-03-15 | 2004-11-02 | Ppt Vision, Inc. | Co-planarity and top-down examination method and optical module for electronic leaded components |
US7016525B2 (en) * | 2002-05-02 | 2006-03-21 | Mitutoyo Corporation | Systems and methods for continuously varying wavelength illumination |
US20040076321A1 (en) * | 2002-07-16 | 2004-04-22 | Frank Evans | Non-oriented optical character recognition of a wafer mark |
US20050004472A1 (en) * | 2002-08-17 | 2005-01-06 | Greg Pratt | Medical socket contour scanning system |
US6900942B2 (en) * | 2002-08-20 | 2005-05-31 | Veridian Systems | Wide field-of-view (FOV) coherent beam combiner/detector |
DE10239548A1 (en) * | 2002-08-23 | 2004-03-04 | Leica Microsystems Semiconductor Gmbh | Device and method for inspecting an object |
US8171567B1 (en) | 2002-09-04 | 2012-05-01 | Tracer Detection Technology Corp. | Authentication method and system |
WO2004023209A1 (en) * | 2002-09-04 | 2004-03-18 | Brooks Automation, Inc. | Device and process for reading out identification information on reticles |
JP3777149B2 (en) * | 2002-09-19 | 2006-05-24 | 株式会社ナムコ | Program, information storage medium, and image generation apparatus |
US7181066B1 (en) | 2002-12-26 | 2007-02-20 | Cognex Technology And Investment Corporation | Method for locating bar codes and symbols in an image |
JP3738291B2 (en) * | 2003-06-09 | 2006-01-25 | 住友大阪セメント株式会社 | 3D shape measuring device |
US7236625B2 (en) * | 2003-07-28 | 2007-06-26 | The Boeing Company | Systems and method for identifying foreign objects and debris (FOD) and defects during fabrication of a composite structure |
JP3953988B2 (en) * | 2003-07-29 | 2007-08-08 | Tdk株式会社 | Inspection apparatus and inspection method |
US7162322B2 (en) * | 2003-11-28 | 2007-01-09 | The Ohio Willow Wood Company | Custom prosthetic liner manufacturing system and method |
US8934702B2 (en) * | 2003-12-02 | 2015-01-13 | The Boeing Company | System and method for determining cumulative tow gap width |
US7289656B2 (en) | 2003-12-02 | 2007-10-30 | The Boeing Company | Systems and methods for determining inconsistency characteristics of a composite structure |
ATE433164T1 (en) | 2004-03-12 | 2009-06-15 | Ingenia Technology Ltd | METHOD AND DEVICES FOR GENERATING AUTHENTICABLE ITEMS AND THEIR SUBSEQUENT VERIFICATION |
EP2128790A3 (en) * | 2004-03-12 | 2011-01-26 | Ingenia Technology Limited | Authenticity verification with linearised data |
US7193696B2 (en) * | 2004-04-12 | 2007-03-20 | United Technologies Corporation | Systems and methods for using light to indicate defect locations on a composite structure |
US20060027657A1 (en) | 2004-08-04 | 2006-02-09 | Laurens Ninnink | Method and apparatus for high resolution decoding of encoded symbols |
GB2444139B (en) * | 2004-08-13 | 2008-11-12 | Ingenia Technology Ltd | Authenticity verification methods products and apparatuses |
US7175090B2 (en) | 2004-08-30 | 2007-02-13 | Cognex Technology And Investment Corporation | Methods and apparatus for reading bar code identifications |
DE102004053293A1 (en) * | 2004-11-04 | 2006-05-11 | Giesecke & Devrient Gmbh | Scanning device for barcodes |
US7424902B2 (en) | 2004-11-24 | 2008-09-16 | The Boeing Company | In-process vision detection of flaw and FOD characteristics |
US7963448B2 (en) * | 2004-12-22 | 2011-06-21 | Cognex Technology And Investment Corporation | Hand held machine vision method and apparatus |
US9552506B1 (en) | 2004-12-23 | 2017-01-24 | Cognex Technology And Investment Llc | Method and apparatus for industrial identification mark verification |
US7898524B2 (en) | 2005-06-30 | 2011-03-01 | Logitech Europe S.A. | Optical displacement detection over varied surfaces |
US20100033080A1 (en) * | 2005-08-31 | 2010-02-11 | Kenji Yoneda | Coaxial light irradiation device |
US20070077671A1 (en) * | 2005-10-03 | 2007-04-05 | Applied Materials | In-situ substrate imaging |
DE102005061834B4 (en) * | 2005-12-23 | 2007-11-08 | Ioss Intelligente Optische Sensoren & Systeme Gmbh | Apparatus and method for optically examining a surface |
EP1969525A1 (en) | 2005-12-23 | 2008-09-17 | Ingenia Holdings (UK)Limited | Optical authentication |
US20070202476A1 (en) * | 2006-02-03 | 2007-08-30 | Mark Williamson | Techniques for inspecting an electronic device |
US8170322B2 (en) * | 2006-03-22 | 2012-05-01 | Jadak Llc | Optical imaging system and method using a reflective background |
JP4890096B2 (en) * | 2006-05-19 | 2012-03-07 | 浜松ホトニクス株式会社 | Image acquisition apparatus, image acquisition method, and image acquisition program |
US8108176B2 (en) | 2006-06-29 | 2012-01-31 | Cognex Corporation | Method and apparatus for verifying two dimensional mark quality |
JP4306741B2 (en) | 2006-07-20 | 2009-08-05 | 株式会社デンソーウェーブ | Optical information reader |
GB2443457B (en) * | 2006-10-31 | 2011-11-02 | Hewlett Packard Development Co | Image processing system and method |
CN101191719A (en) * | 2006-12-01 | 2008-06-04 | 鸿富锦精密工业(深圳)有限公司 | Image focal point synthetic system and method |
US8169478B2 (en) * | 2006-12-14 | 2012-05-01 | Cognex Corporation | Method and apparatus for calibrating a mark verifier |
WO2008124397A1 (en) | 2007-04-03 | 2008-10-16 | David Fishbaine | Inspection system and method |
TWI475874B (en) * | 2007-06-15 | 2015-03-01 | Camtek Ltd | Optical inspection system using multi-facet imaging |
US8011583B2 (en) | 2007-07-02 | 2011-09-06 | Microscan Systems, Inc. | Systems, devices, and/or methods for managing data matrix lighting |
US9734376B2 (en) | 2007-11-13 | 2017-08-15 | Cognex Corporation | System and method for reading patterns using multiple image frames |
US8462328B2 (en) * | 2008-07-22 | 2013-06-11 | Orbotech Ltd. | Efficient telecentric optical system (ETOS) |
US8760507B2 (en) * | 2008-08-05 | 2014-06-24 | Inspectron, Inc. | Light pipe for imaging head of video inspection device |
GB2466311B (en) | 2008-12-19 | 2010-11-03 | Ingenia Holdings | Self-calibration of a matching algorithm for determining authenticity |
GB2466465B (en) * | 2008-12-19 | 2011-02-16 | Ingenia Holdings | Authentication |
SG164298A1 (en) * | 2009-02-24 | 2010-09-29 | Visionxtreme Pte Ltd | Object inspection system |
US8441532B2 (en) * | 2009-02-24 | 2013-05-14 | Corning Incorporated | Shape measurement of specular reflective surface |
US20100226114A1 (en) * | 2009-03-03 | 2010-09-09 | David Fishbaine | Illumination and imaging system |
US20110080476A1 (en) * | 2009-10-02 | 2011-04-07 | Lasx Industries, Inc. | High Performance Vision System for Part Registration |
GB2476226B (en) | 2009-11-10 | 2012-03-28 | Ingenia Holdings Ltd | Optimisation |
DE102010032469B4 (en) * | 2010-07-28 | 2014-12-04 | Ioss Intelligente Optische Sensoren & Systeme Gmbh | Method and apparatus for reading codes on solar cell wafers |
US9265629B2 (en) | 2011-04-01 | 2016-02-23 | The Ohio Willow Wood Company | Fabric covered polymeric prosthetic liner |
EP2831539B1 (en) * | 2012-03-30 | 2019-06-05 | Nikon Metrology NV | Improved optical scanning probe |
JP5783953B2 (en) * | 2012-05-30 | 2015-09-24 | 株式会社日立ハイテクノロジーズ | Pattern evaluation apparatus and pattern evaluation method |
EP2909807B1 (en) * | 2012-10-17 | 2020-02-19 | Cathx Research Ltd. | Improvements in relation to underwater imaging for underwater surveys |
KR101510553B1 (en) * | 2013-11-14 | 2015-04-08 | 주식회사 포스코 | Apparatus for detecting steel plate of steel plate using shadow area |
CN107000334B (en) * | 2014-12-03 | 2019-08-13 | 庞巴迪公司 | To composite construction in X -ray inspection X |
CN107110639B (en) | 2014-12-22 | 2020-10-09 | 倍耐力轮胎股份公司 | Device for inspecting tyres on a tyre production line |
BR112017013327B1 (en) * | 2014-12-22 | 2022-05-24 | Pirelli Tyre S.P.A. | Method and apparatus for checking tires on a tire production line. |
JP6832650B2 (en) | 2016-08-18 | 2021-02-24 | 株式会社Screenホールディングス | Inspection equipment and inspection method |
CN109754425B (en) * | 2017-11-01 | 2023-07-28 | 浙江舜宇智能光学技术有限公司 | Calibration equipment and calibration method of TOF (time of flight) camera module |
JP7472111B2 (en) | 2018-09-24 | 2024-04-22 | アプライド マテリアルズ インコーポレイテッド | Machine Vision as Input to CMP Process Control Algorithms |
CN113447485A (en) * | 2020-03-26 | 2021-09-28 | 捷普电子(新加坡)公司 | Optical detection method |
US11847776B2 (en) | 2020-06-29 | 2023-12-19 | Applied Materials, Inc. | System using film thickness estimation from machine learning based processing of substrate images |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4152723A (en) * | 1977-12-19 | 1979-05-01 | Sperry Rand Corporation | Method of inspecting circuit boards and apparatus therefor |
US4343553A (en) * | 1979-09-03 | 1982-08-10 | Hitachi, Ltd. | Shape testing apparatus |
US5455870A (en) * | 1991-07-10 | 1995-10-03 | Raytheon Company | Apparatus and method for inspection of high component density printed circuit board |
Family Cites Families (67)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US891013A (en) * | 1907-01-25 | 1908-06-16 | John Hammond Smith | Method of reproducing objects. |
US1596458A (en) * | 1924-03-13 | 1926-08-17 | Schiesari Mario | Method of obtaining data for reproducing three-dimensional objects |
US2177737A (en) * | 1936-11-13 | 1939-10-31 | Warner Bros | Photographic apparatus |
CH442965A (en) * | 1966-07-08 | 1967-08-31 | Koch Carl | Photographic focusing screen camera |
US4092068A (en) * | 1976-05-05 | 1978-05-30 | Domtar Inc. | Surface sensor |
US4146327A (en) * | 1976-12-27 | 1979-03-27 | Autech | Optical triangulation gauging system |
US4373804A (en) * | 1979-04-30 | 1983-02-15 | Diffracto Ltd. | Method and apparatus for electro-optically determining the dimension, location and attitude of objects |
US4238147A (en) * | 1979-05-23 | 1980-12-09 | Solid Photography Inc. | Recording images of a three-dimensional surface by focusing on a plane of light irradiating the surface |
US4286293A (en) * | 1979-05-30 | 1981-08-25 | Western Electric Company, Inc. | Laser scanning and multiple detection for video image processing |
US4594001A (en) * | 1981-07-07 | 1986-06-10 | Robotic Vision Systems, Inc. | Detection of three-dimensional information with a projected plane of light |
US4494874A (en) * | 1981-07-07 | 1985-01-22 | Robotic Vision Systems, Inc. | Detection of three-dimensional information using a projected point or line of light |
US4441124A (en) * | 1981-11-05 | 1984-04-03 | Western Electric Company, Inc. | Technique for inspecting semiconductor wafers for particulate contamination |
US4443705A (en) * | 1982-10-01 | 1984-04-17 | Robotic Vision Systems, Inc. | Method for locating points on a three-dimensional surface using light intensity variations |
US4527893A (en) * | 1982-10-13 | 1985-07-09 | Taylor Francis M | Method and apparatus for optically measuring the distance to a workpiece |
US4529316A (en) * | 1982-10-18 | 1985-07-16 | Robotic Vision Systems, Inc. | Arrangement of eliminating erroneous data in three-dimensional optical sensors |
US4645348A (en) * | 1983-09-01 | 1987-02-24 | Perceptron, Inc. | Sensor-illumination system for use in three-dimensional measurement of objects and assemblies of objects |
US4590367A (en) * | 1983-10-06 | 1986-05-20 | Robotic Vision Systems, Inc. | Arrangement for the expansion of the dynamic range of optical devices |
US4682894A (en) * | 1985-03-21 | 1987-07-28 | Robotic Vision Systems, Inc. | Calibration of three-dimensional space |
JPS6279644A (en) * | 1985-10-02 | 1987-04-13 | Sanwa Electron Kk | Inspecting device for ic pin |
US4762990A (en) * | 1985-10-21 | 1988-08-09 | International Business Machines Corporation | Data processing input interface determining position of object |
US4688939A (en) * | 1985-12-27 | 1987-08-25 | At&T Technologies, Inc. | Method and apparatus for inspecting articles |
JPH0776754B2 (en) * | 1986-06-25 | 1995-08-16 | 日立テクノエンジニアリング株式会社 | IC lead bending detection method |
US4740708A (en) * | 1987-01-06 | 1988-04-26 | International Business Machines Corporation | Semiconductor wafer surface inspection apparatus and method |
US5490084A (en) * | 1987-06-01 | 1996-02-06 | Hitachi Techno Engineering Co., Ltd. | Method and apparatus for detecting bent leads in electronic components |
JPH01249181A (en) * | 1988-03-31 | 1989-10-04 | Tdk Corp | Parts sorting method for automatic appearance screening machine for chip parts |
US4824251A (en) * | 1987-09-25 | 1989-04-25 | Digital Signal Corporation | Optical position sensor using coherent detection and polarization preserving optical fiber |
FR2624600B1 (en) * | 1987-12-09 | 1990-04-13 | Snecma | METHOD AND DEVICE FOR CONTROLLING CONTACTLESS GEOMETRIC CONTOURS |
US4991968A (en) * | 1988-07-20 | 1991-02-12 | Robotic Vision Systems, Inc. | Three dimensional object surface determination with automatic sensor control |
US4925308A (en) * | 1988-08-09 | 1990-05-15 | Robotic Vision System, Inc. | Calibration of three-dimensional space |
US5030008A (en) * | 1988-10-11 | 1991-07-09 | Kla Instruments, Corporation | Method and apparatus for the automated analysis of three-dimensional objects |
US5247585A (en) * | 1988-12-28 | 1993-09-21 | Masaharu Watanabe | Object recognition device |
US4957369A (en) * | 1989-01-23 | 1990-09-18 | California Institute Of Technology | Apparatus for measuring three-dimensional surface geometries |
US5635697A (en) * | 1989-03-01 | 1997-06-03 | Symbol Technologies, Inc. | Method and apparatus for decoding two-dimensional bar code |
US5509104A (en) * | 1989-05-17 | 1996-04-16 | At&T Corp. | Speech recognition employing key word modeling and non-key word modeling |
JPH03209737A (en) * | 1990-01-11 | 1991-09-12 | Tokyo Electron Ltd | Probe equipment |
US5060065A (en) * | 1990-02-23 | 1991-10-22 | Cimflex Teknowledge Corporation | Apparatus and method for illuminating a printed circuit board for inspection |
JPH04105341A (en) * | 1990-08-24 | 1992-04-07 | Hitachi Ltd | Method and equipment for detecting bending and floating of lead of semiconductor device |
JP2591292B2 (en) * | 1990-09-05 | 1997-03-19 | 日本電気株式会社 | Image processing device and automatic optical inspection device using it |
US5245421A (en) * | 1990-09-19 | 1993-09-14 | Control Automation, Incorporated | Apparatus for inspecting printed circuit boards with surface mounted components |
IL99823A0 (en) * | 1990-11-16 | 1992-08-18 | Orbot Instr Ltd | Optical inspection method and apparatus |
US5365084A (en) * | 1991-02-20 | 1994-11-15 | Pressco Technology, Inc. | Video inspection system employing multiple spectrum LED illumination |
US5172005A (en) * | 1991-02-20 | 1992-12-15 | Pressco Technology, Inc. | Engineered lighting system for tdi inspection comprising means for controlling lighting elements in accordance with specimen displacement |
JP2722287B2 (en) * | 1991-05-15 | 1998-03-04 | ファナック株式会社 | Position detection method in laser sensor |
JP2714277B2 (en) * | 1991-07-25 | 1998-02-16 | 株式会社東芝 | Lead shape measuring device |
US5187611A (en) | 1991-08-27 | 1993-02-16 | Northeast Robotics, Inc. | Diffuse on-axis light source |
US5351126A (en) * | 1991-10-31 | 1994-09-27 | Matsushita Electric Works, Ltd. | Optical measurement system for determination of an object's profile or thickness |
JP2969402B2 (en) * | 1991-12-02 | 1999-11-02 | 株式会社新川 | Bonding wire inspection device |
US5179413A (en) * | 1992-02-05 | 1993-01-12 | Eastman Kodak Company | System for illuminating a linear zone which reduces the effect of light retroflected from outside the zone on the illumination |
US5260779A (en) * | 1992-02-21 | 1993-11-09 | Control Automation, Inc. | Method and apparatus for inspecting a printed circuit board |
US5448650A (en) * | 1992-04-30 | 1995-09-05 | International Business Machines Corporation | Thin-film latent open optical detection with template-based feature extraction |
TW278212B (en) * | 1992-05-06 | 1996-06-11 | Sumitomo Electric Industries | |
US5371375A (en) * | 1992-06-24 | 1994-12-06 | Robotic Vision Systems, Inc. | Method for obtaining three-dimensional data from multiple parts or devices in a multi-pocketed tray |
US5463227A (en) * | 1992-06-24 | 1995-10-31 | Robotic Vision Systems, Inc. | Method for obtaining three-dimensional data from multiple parts or devices in a multi-pocketed tray |
US5313542A (en) * | 1992-11-30 | 1994-05-17 | Breault Research Organization, Inc. | Apparatus and method of rapidly measuring hemispherical scattered or radiated light |
JP2745102B2 (en) * | 1992-12-02 | 1998-04-28 | ローレルバンクマシン株式会社 | Coin discriminator |
US5305091A (en) * | 1992-12-07 | 1994-04-19 | Oreo Products Inc. | Optical coordinate measuring system for large objects |
US5367439A (en) * | 1992-12-24 | 1994-11-22 | Cognex Corporation | System for frontal illumination |
US5461417A (en) * | 1993-02-16 | 1995-10-24 | Northeast Robotics, Inc. | Continuous diffuse illumination method and apparatus |
US5406372A (en) * | 1993-04-16 | 1995-04-11 | Modular Vision Systems Inc. | QFP lead quality inspection system and method |
US5517235A (en) * | 1993-11-03 | 1996-05-14 | Control Automation, Inc. | Method and apparatus for inspecting printed circuit boards at different magnifications |
JP3333615B2 (en) * | 1993-12-21 | 2002-10-15 | 三菱電機株式会社 | Apparatus and method for measuring dimensions of semiconductor device |
US5506793A (en) * | 1994-01-14 | 1996-04-09 | Gerber Systems Corporation | Method and apparatus for distortion compensation in an automatic optical inspection system |
US5463213A (en) * | 1994-05-03 | 1995-10-31 | Honda; Takafaru | Code mark reader |
US5546189A (en) * | 1994-05-19 | 1996-08-13 | View Engineering, Inc. | Triangulation-based 3D imaging and processing method and system |
US5465152A (en) * | 1994-06-03 | 1995-11-07 | Robotic Vision Systems, Inc. | Method for coplanarity inspection of package or substrate warpage for ball grid arrays, column arrays, and similar structures |
US5635700A (en) * | 1994-07-27 | 1997-06-03 | Symbol Technologies, Inc. | Bar code scanner with multi-channel light collection |
US5550583A (en) * | 1994-10-03 | 1996-08-27 | Lucent Technologies Inc. | Inspection apparatus and method |
- 1996
  - 1996-11-12 US US08/748,040 patent/US6075883A/en not_active Expired - Lifetime
- 1997
  - 1997-11-07 WO PCT/US1997/020432 patent/WO1998021687A1/en active IP Right Grant
  - 1997-11-07 DE DE69736818T patent/DE69736818D1/en not_active Expired - Fee Related
  - 1997-11-07 AT AT97946611T patent/ATE250249T1/en not_active IP Right Cessation
  - 1997-11-07 DK DK97946611T patent/DK1016028T3/en active
  - 1997-11-07 ES ES97946611T patent/ES2207754T3/en not_active Expired - Lifetime
  - 1997-11-07 EP EP03077317A patent/EP1359534B1/en not_active Expired - Lifetime
  - 1997-11-07 DE DE69725021T patent/DE69725021T2/en not_active Expired - Fee Related
  - 1997-11-07 CA CA002271492A patent/CA2271492C/en not_active Expired - Fee Related
  - 1997-11-07 AU AU51747/98A patent/AU5174798A/en not_active Abandoned
  - 1997-11-07 AT AT03077317T patent/ATE342549T1/en not_active IP Right Cessation
  - 1997-11-07 PT PT97946611T patent/PT1016028E/en unknown
  - 1997-11-07 EP EP97946611A patent/EP1016028B1/en not_active Expired - Lifetime
- 2000
  - 2000-03-03 US US09/518,559 patent/US6603874B1/en not_active Expired - Fee Related
- 2003
  - 2003-03-12 US US10/387,940 patent/US20030215127A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4152723A (en) * | 1977-12-19 | 1979-05-01 | Sperry Rand Corporation | Method of inspecting circuit boards and apparatus therefor |
US4343553A (en) * | 1979-09-03 | 1982-08-10 | Hitachi, Ltd. | Shape testing apparatus |
US5455870A (en) * | 1991-07-10 | 1995-10-03 | Raytheon Company | Apparatus and method for inspection of high component density printed circuit board |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011056282A1 (en) * | 2009-11-05 | 2011-05-12 | Aerospace Corporation | Refraction assisted illumination for imaging |
US8138476B2 (en) | 2009-11-05 | 2012-03-20 | The Aerospace Corporation | Refraction assisted illumination for imaging |
US8212215B2 (en) | 2009-11-05 | 2012-07-03 | The Aerospace Corporation | Refraction assisted illumination for imaging |
US8450688B2 (en) | 2009-11-05 | 2013-05-28 | The Aerospace Corporation | Refraction assisted illumination for imaging |
US8461532B2 (en) | 2009-11-05 | 2013-06-11 | The Aerospace Corporation | Refraction assisted illumination for imaging |
WO2011119372A1 (en) * | 2010-03-26 | 2011-09-29 | The Aerospace Corporation | Refraction assisted illumination for imaging |
US9007454B2 (en) | 2012-10-31 | 2015-04-14 | The Aerospace Corporation | Optimized illumination for imaging |
Also Published As
Publication number | Publication date |
---|---|
US6603874B1 (en) | 2003-08-05 |
ATE342549T1 (en) | 2006-11-15 |
DE69725021T2 (en) | 2004-04-22 |
DE69725021D1 (en) | 2003-10-23 |
DE69736818D1 (en) | 2006-11-23 |
EP1016028A1 (en) | 2000-07-05 |
US6075883A (en) | 2000-06-13 |
PT1016028E (en) | 2004-02-27 |
EP1359534A1 (en) | 2003-11-05 |
CA2271492C (en) | 2007-05-29 |
EP1016028A4 (en) | 2001-11-21 |
CA2271492A1 (en) | 1998-05-22 |
DK1016028T3 (en) | 2004-01-26 |
ES2207754T3 (en) | 2004-06-01 |
EP1016028B1 (en) | 2003-09-17 |
ATE250249T1 (en) | 2003-10-15 |
EP1359534B1 (en) | 2006-10-11 |
US20030215127A1 (en) | 2003-11-20 |
AU5174798A (en) | 1998-06-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6075883A (en) | Method and system for imaging an object or pattern | |
US7573569B2 (en) | System for 2-D and 3-D vision inspection | |
CN106959293B (en) | System and method for detecting defects on reflective surface through vision system | |
US9618329B2 (en) | Optical inspection probe | |
US6091834A (en) | Method of illuminating a digital representation of an image | |
US5118195A (en) | Area scan camera system for detecting streaks and scratches | |
US6661508B2 (en) | Inspection systems using sensor array and double threshold arrangement | |
JPH06168321A (en) | Method and apparatus for processing of two-dimensional image | |
JPH11185028A (en) | Method for detecting artifact on surface of transmissive image medium | |
KR20010101576A (en) | Method and device for inspecting objects | |
WO2001037025A1 (en) | Confocal imaging | |
US9746420B2 (en) | Differential scan imaging systems and methods | |
US7869021B2 (en) | Multiple surface inspection system and method | |
US8144968B2 (en) | Method and apparatus for scanning substrates | |
US5648853A (en) | System for inspecting pin grid arrays | |
JP2005528593A (en) | Imaging method and apparatus | |
KR20020093507A (en) | Apparatus for inspecting parts | |
TWI687672B (en) | Optical inspection system and image processing method thereof | |
KR20190129693A (en) | High-sensitivity low-power camera system for 3d structured light application | |
JP2000513099A (en) | Triangulation-based three-dimensional image forming / processing method and apparatus | |
JP2006038550A (en) | Painted surface inspection device | |
JP2000295639A (en) | Lighting device for inspecting solid-state image pickup element and adjustment tool used for the same | |
JPH05272922A (en) | Visual sensor apparatus | |
KR100275565B1 (en) | Apparatus for inspecting printed shape of solder paste on printed circuit board | |
JP2003294429A (en) | Measuring instrument for polishing angle of optical connector and measuring method using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1. Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW AM AZ BY KG KZ MD RU TJ TM |
| AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): GH KE LS MW SD SZ UG ZW AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL |
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| ENP | Entry into the national phase | Ref document number: 2271492. Country of ref document: CA |
| WWE | Wipo information: entry into national phase | Ref document number: 1997946611. Country of ref document: EP |
| REG | Reference to national code | Ref country code: DE. Ref legal event code: 8642 |
| WWP | Wipo information: published in national office | Ref document number: 1997946611. Country of ref document: EP |
| WWG | Wipo information: grant in national office | Ref document number: 1997946611. Country of ref document: EP |