US20040047140A1 - Programmable illuminator for vision system - Google Patents
- Publication number
- US20040047140A1 (application US10/657,286)
- Authority
- US
- United States
- Prior art keywords
- workpiece
- view
- image
- field
- imaging system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
Definitions
- the present invention is directed to machine vision and in particular to providing illumination for such systems.
- Machine vision has been applied to a number of production and testing tasks.
- workpieces such as printed-circuit boards, integrated-circuit chips, and other articles of manufacture are brought into the field of view of a camera.
- the camera typically generates an image in digital form, which digital circuitry normally in the form of a microprocessor and related circuitry processes in accordance with the task to be performed.
- a superior solution is to move neither the camera nor the workpiece, but rather to move the camera's field of view by employing deflector mechanisms.
- Galvanometer-mounted pivoting mirrors, pivoting prisms, and rotating reflector polygons are among the mechanisms commonly employed in optical systems to perform image deflection. Although these still are moving parts, they are ordinarily relatively small and take advantage of optical leverage to change the field of view faster than systems that move the entire workpiece or camera.
- the lighting system would have to be optimized for a wide range of resultant relationships between the lighting system's position and that of the camera's field of view.
- the difficulty of solving this problem has confounded attempts to employ field-of-view deflection.
- “Dark-field illumination” is an illumination approach that takes advantage of the fact that a specularly reflecting feature in the midst of a diffuse-reflecting background will appear dark if that feature's specular reflection images the main light source outside the camera's field of view. That is, since the angle of reflection of all light striking a specular reflector equals that light's angle of incidence, the reflected light will not pass through the camera's entrance pupil unless that sole angle of reflection yields that result. But light striking the diffusely reflecting background is reflected in a range of angles, so a substantial amount may enter the camera even if the specular-reflection angle would not result in a ray that does. The specularly reflecting feature is therefore readily identified because it appears dark against a lighter background.
- FIG. 1 is a diagram that illustrates specular reflection
- FIG. 2 is a similar diagram that illustrates diffuse reflection
- FIG. 3 is a diagram that illustrates specularly reflecting indicia in a diffusely reflecting background
- FIG. 4 is a diagram of a vision system that employs the present invention's teachings
- FIG. 5 is a block diagram of the control system that an application using the FIG. 4 system may employ;
- FIG. 6 is a diagram of the processes that the system of FIG. 5 performs.
- FIG. 7 is a diagram of the system that employs the present invention's teachings in a laser-scribing system.
- an incident ray 12 strikes a specularly reflecting surface 14 at an angle θi with respect to the normal 16 to that surface. If the surface 14 is a mirror or other specularly reflecting surface, essentially all light that strikes the surface at the angle of incidence θi reflects at an angle of reflection θo equal to the angle of incidence θi.
- the point 20 at which the light ray strikes the surface 14 may be located within the field of view of a camera, but that light will not contribute to the camera image unless ray 18 extends through that camera's entrance pupil. For the purposes of this discussion, we will assume that it does not. So point 20 is not illuminated so far as the camera is concerned.
- the reflection of ray 12 includes not only ray 18 , whose angle of reflection equals the angle of incidence, but also a plume of other rays, such as ray 22 , whose angles with the normal differ from that of the incident ray.
- the camera pupil may be so positioned that it receives some of these rays even if it does not receive ray 18 . So spot 20 is illuminated from the camera's point of view, even though it would not be if it reflected only specularly.
- the surface 14 includes both FIG. 3's specularly reflecting indicia 24 and its diffuse background 26 , an image is formed in which the indicia are readily distinguished if the camera is positioned with an entrance pupil that does not receive the specularly reflected rays but does receive some of the rays that result from diffuse reflection.
- the indicia may not have perfectly mirror-like surfaces, as the drawing suggests, but they will be distinguishable so long as they yield light plumes that are significantly more compact than the plumes produced by the indicia's surroundings.
- FIG. 4 depicts an apparatus for applying the technique of dark-field illumination to a field-of-view-deflection system.
- a camera that includes a detector 30 , lenses 32 , 34 , and 36 , and a field-of-view deflector depicted for purposes of illustration as including a galvanometer-mounted mirror 38 .
- Detector 30 will typically take the form of an array of charge-coupled devices, whose outputs are converted to digital form for processing in accordance with the particular application to which the system is applied.
- That lens system includes an image-forming lens 32 spaced by its focal length from the detector 30 . It also includes collimating lenses 34 and 36 , which collimate the light from a target region in which a workpiece 37 is disposed; i.e., lenses 34 and 36 image workpiece 37 at infinity. But other embodiments may, say, include only a single lens, corresponding to lens 32 but positioned to image the workpiece 37 on the detector 30 .
- a camera field of view consisting of a volume of points in space from which the lenses and field-of-view deflector provide optical paths to the detector 30 .
- the field of view intersects the target region so that the camera can “see” a portion 40 of the workpiece.
- the field-of-view deflector's position determines where on the workpiece portion 40 falls; i.e., the field viewed by the detector can change without relative movement between the camera and the workpiece, although such movement may occur, too.
- the workpiece may be on a conveyor, which typically would not be capable of moving as quickly as the field of view.
- the drawing depicts the field-of-view deflector as comprising only a single mirror, many embodiments of the present invention will employ two, which deflect the camera's field of view along mutually orthogonal axes. Also, although the drawing shows lenses 34 and 36 between the mirror 38 and the workpiece 37 , some embodiments that employ collimating lenses may instead place them between the field-of-view deflector and the detector.
- the system further includes an array of lamps suspended above the workpiece.
- the number of such lamps will be relatively large, but FIG. 4 depicts only two such lamps, 42 and 44 , which are controlled by illumination-control circuitry not shown in FIG. 4. Operation of these lamps is coordinated with the position of the mirror 38 and thus of the camera's field of view. As that mirror pivots, the workpiece portion 40 of which the camera can form an image on the detector 30 moves about the workpiece's surface.
- the lamps in the array shine on the workpiece at any given time, but selected ones are prevented from doing so as the portion 40 within the field of view moves about the surface of the workpiece 37 .
- any lamp is prevented from shining on the workpiece if under the assumption of specular reflection it would be imaged into the field of view. That is, any source that the camera could “see” if the workpiece portion 40 were a mirror is prevented from shining on the workpiece so that the camera receives no specular reflection from it.
- lamps prevented from shining on the workpiece to prevent specular reflection into the camera typically shine on it again after the field of view moves beyond their images. So a dark region of unlit sources moves about against a background of sources that are lit as the scanning process proceeds.
- the “sources” can be reflectors, for instance, and “lit” array elements could be the reflectors on which a remote source or sources selectively shine, while the unlit elements would be the reflectors on which the source or sources do not shine.
- Another approach is to provide continuously operating lamps and selectively operable baffles that can selectively hide lamps from the workpiece; the “lit” elements would be the lamps not hidden.
- the array elements consist of respective light-emitting diodes (“LEDs”) that are simply turned on to cause them to shine on the workpiece and turned off to prevent them from doing so, and the following description is based on this assumption.
- Source 42 contributes to the image because any diffuse reflection resulting from ray 48 will result in a plume of rays 52 , of which some, such as rays 54 and 56 , pass through the camera's entrance pupil.
- the portion 40 viewable by the camera 28 may reach a point from which specular reflection in response to light from source 42 would produce rays that contribute to the image on detector 30 whereas reflection of light from source 44 would no longer do so.
- source 42 will be turned off and source 44 will typically be turned back on. Note that this operation of turning off selected elements occurs in addition to other illumination-control operations that may be occurring. It may be desirable, for instance, to operate different ones of the sources with differing intensities so as to achieve illumination that is optimally uniform for the currently prevailing camera angle. Also, the sources that “remain” lit may actually be strobed so as to “freeze” the workpiece image despite continuous relative motion between the camera's field of view and the workpiece.
- Controlling the sources can take any of a number of forms. Since it is well within the skill of those familiar with such optical systems to predict which sources can be “seen” at various angles, an algorithm for converting scan-angle position to lamp selection can readily be written, and the determination can accordingly be made algorithmically in real time. Or that calculation can be made ahead of time to populate a look-up table used to make the real-time conversion from field-of-view position to lamp selection. Alternatively, the look-up table could be populated in accordance with experimental results. Regardless of how the look-up table is populated, it can be used in a straight table look-up or as part of a combination of table look-up and, say, real-time interpolation.
- FIG. 5 depicts the system as including a computer 60 that runs a software application 62 for which image data are required.
- the application may operate various other software modules such as modules 64 , 66 , and 68 , which respectively control the field-of-view deflector, lights, and detector.
- the computer 60 communicates with appropriate interface circuitry represented by blocks 70 , 72 , and 74 to coordinate these operations.
- FIG. 6 indicates, an application requiring image data would perform a routine whose object is to acquire image data at a particular location.
- FIG. 6's block 76 represents entering such a routine.
- This routine may concurrently call FIG. 5's several processes 64 , 66 , and 68 .
- the scan-control whose entry block 77 represents would typically perform operations such as computing the deflector positions required to place the field of view in the desired location.
- Block 78 represents this step. Once the appropriate locations are determined, the scan-control process would cause FIG. 5's scanner interface to move, say, a galvanometer to produce the desired field-of-view deflection.
- Block 80 represents this operation, after which the process terminates in a step 82 . That step may include setting a flag to indicate that the movement operation has been completed.
- FIG. 6's block 84 represents entering the light-control process, which includes determining from the commanded field-of-view location which lamps need to be switched on or off.
- Block 86 represents performing this conversion, which, as was explained above, may involve an algorithmic determination, a table look-up, or a combination of both.
- the desired light actuations are determined, they are performed by communications with FIG. 5's lighting interface 72 in a step that FIG. 6's block 88 represents.
- That drawing's block 90 represents ending the process in a step that may involve setting a flag to indicate that the lights have been properly set.
- the image-processing operation whose entry FIG. 6's block 92 represents, depends on proper illumination and proper positioning of the field-of-view deflector.
- Block 94 accordingly represents testing the scan-control operation's flag to determine whether the field of view is positioned as required.
- the process proceeds to step 96 to determine whether the illumination has been set properly. If it has, the process operates FIG. 5's detector interface 74 to obtain the camera's output data, as block 98 indicates, and the resultant data are supplied to the requesting application, as block 100 indicates.
- FIG. 7 illustrates one example of an apparatus in which the present invention is advantageous.
- the apparatus in FIG. 7 differs from that of FIG. 4 in that it additionally includes a laser source 102 and a beam splitter 104 .
- a system of the FIG. 7 type typically will include a module for controlling the laser as well as a laser interface by which that module exercises such control.
- the beam splitter 104 cooperates with the field-of-view deflector 38 to direct the laser light to a workpiece location in the camera's field of view.
- the application program may employ the vision system to identify fiducial marks or other identifying features previously made on the workpiece and thereby properly locate the position at which the new mark is to be made.
- the vision system also may be used to perform quality control on the marking process, possibly in a closed-loop fashion so as to adjust laser-beam positioning in accordance with the results of previous observations.
- The system of FIG. 7 is more convenient than some prior-art marking systems.
- the marking apparatus was located separately from the imaging system, so there could be relative movement between the workpiece and the camera and lighting systems.
- the illustrated system employs the same lens and deflector apparatus as the imaging system, so positions being inspected and marked are readily correlated.
Description
- The present invention is directed to machine vision and in particular to providing illumination for such systems.
- Machine vision has been applied to a number of production and testing tasks. In general, workpieces, such as printed-circuit boards, integrated-circuit chips, and other articles of manufacture are brought into the field of view of a camera. The camera typically generates an image in digital form, which digital circuitry normally in the form of a microprocessor and related circuitry processes in accordance with the task to be performed.
- In many cases, the workpiece is too large for a practical-sized camera to image with adequate resolution, but this problem is readily solved by taking an image of only a small part of the workpiece at any single time. This yields the requisite resolution, and images of respective segments of the workpiece can be taken as the workpiece is stepped through the camera's field of view.
- Although this approach is acceptable in a number of applications, it can be throughput- and accuracy-limiting in some others. There are often practical limits to the speed at which the workpiece can be advanced through the camera's field of view. Additionally, the need for accurate correlation between successive images can impose severe accuracy requirements on the workpiece-advancing system. To a greater or lesser degree, the same limitations apply regardless of whether it is the camera or the workpiece that is moved.
- For some applications, a superior solution is to move neither the camera nor the workpiece, but rather to move the camera's field of view by employing deflector mechanisms. Galvanometer-mounted pivoting mirrors, pivoting prisms, and rotating reflector polygons are among the mechanisms commonly employed in optical systems to perform image deflection. Although these still are moving parts, they are ordinarily relatively small and take advantage of optical leverage to change the field of view faster than systems that move the entire workpiece or camera.
- Despite this advantage, there is a class of applications to which workers in this field have been slow to apply the field-of-view-deflection approach. One example of this class is the type of application that involves reading laser-scribed marks on workpieces such as semiconductor wafers or electronic-component packages. Marks of that type are hard to detect reliably because they are quite subtle. So considerable effort has been applied to illuminating the workpiece in such a manner as to minimize noise contributed by surface irregularities in non-marked regions. But achieving this result is greatly complicated in systems that use field-of-view deflectors. In systems that move the workpiece or the camera, the illumination apparatus always has the same position with respect to the field of view, so illumination characteristics need to be optimized for that relationship only. In field-of-view-deflector systems, on the other hand, the lighting system would have to be optimized for a wide range of resultant relationships between the lighting system's position and that of the camera's field of view. For some applications, the difficulty of solving this problem has confounded attempts to employ field-of-view deflection.
- But we have recognized that imaging results for such systems can be greatly improved by emphasizing the dark-field-illumination aspects of the problem and adapting to it a method previously used to vary dark-field illumination in response to camera-objective changes.
- “Dark-field illumination” is an illumination approach that takes advantage of the fact that a specularly reflecting feature in the midst of a diffuse-reflecting background will appear dark if that feature's specular reflection images the main light source outside the camera's field of view. That is, since the angle of reflection of all light striking a specular reflector equals that light's angle of incidence, the reflected light will not pass through the camera's entrance pupil unless that sole angle of reflection yields that result. But light striking the diffusely reflecting background is reflected in a range of angles, so a substantial amount may enter the camera even if the specular-reflection angle would not result in a ray that does. The specularly reflecting feature is therefore readily identified because it appears dark against a lighter background.
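The dark-field condition described above can be sketched numerically. The following is only an illustrative model, not anything from the patent itself: angles are measured from the surface normal in a single plane, and the 5° pupil acceptance cone, the 30° lamp angle, and the sampled diffuse plume are all assumed values.

```python
import math

def reflect(theta_i):
    # A specular surface sends the ray out at the same angle on the
    # opposite side of the normal: theta_o = theta_i in magnitude.
    return -theta_i

def enters_pupil(theta_out, pupil_angle=0.0, pupil_half_angle=math.radians(5)):
    # The entrance pupil accepts only rays within a small cone about the
    # camera's viewing direction (assumed straight down the normal).
    return abs(theta_out - pupil_angle) <= pupil_half_angle

# A lamp illuminates the surface 30 degrees off the normal (assumed value).
theta_i = math.radians(30)

# Specular feature: the single outgoing ray leaves 30 degrees off-normal,
# outside the 5-degree pupil cone, so the feature appears dark.
print(enters_pupil(reflect(theta_i)))       # False

# Diffuse background: a plume of outgoing angles; some fall inside the
# cone, so the background appears bright and the dark feature stands out.
plume = [math.radians(a) for a in range(-60, 61, 5)]
print(any(enters_pupil(t) for t in plume))  # True
```

This captures the asymmetry the text relies on: a specular reflector has exactly one outgoing direction to test against the pupil, while a diffuse reflector offers a whole plume of candidates.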
- Because of this effect, there is a rich store of work directed to dark-field illumination, and we have recognized that properly adapting it can yield significantly improved results for field-of-view-deflection systems. It had long ago been recognized in systems such as that described in U.S. Pat. No. 4,604,648 to Kley that individual elements of a light-source array should be selectively operated in accordance with the particular objective or zoom position of the imaging camera. We have adapted this concept by so operating elements of a light-source array that the position within the array at which one or more light sources is not lit moves around the array as the deflector changes the field of view's position.
- Specifically, as the deflector so moves as to change the field of view on the workpiece, we selectively turn off any elements of the source array that will be imaged into the camera's field if specular reflection occurs in the portion of the workpiece within that field of view. As the field of view moves so that an element previously thus imaged no longer is, the element is typically turned on again.
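As a sketch of that selective turn-off rule, the snippet below assumes (purely for illustration) a one-dimensional lamp array above a horizontal workpiece and a camera that, through the deflector, views each spot straight down; the lamp spacing, height, and angular tolerance are invented values. A lamp is switched off exactly when, under the mirror assumption, its specular reflection at the viewed spot would travel back up the viewing direction.

```python
import math

def specularly_imaged(lamp_x, lamp_h, spot_x, tol=math.radians(3)):
    """Treating the workpiece as a horizontal mirror, return True if light
    from the lamp reflecting at the viewed spot would travel back up the
    (assumed straight-down) viewing direction into the camera."""
    reflected_angle = math.atan2(spot_x - lamp_x, lamp_h)  # from vertical
    return abs(reflected_angle) <= tol

# Hypothetical array: ten lamps 20 mm apart, 100 mm above the workpiece.
lamp_xs = [20.0 * i for i in range(10)]

def off_lamps(spot_x):
    # Elements imaged into the field of view are switched off; all others lit.
    return [i for i, x in enumerate(lamp_xs) if specularly_imaged(x, 100.0, spot_x)]

# As the field of view scans along the workpiece, the dark (off) region
# moves along the array against a background of lit sources.
print(off_lamps(20.0))   # [1]
print(off_lamps(60.0))   # [3]
```

As the text notes, a lamp turned off at one deflector position is typically turned back on once the field of view moves past its image, which is exactly what the per-position recomputation above produces.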
- The invention description below refers to the accompanying drawings, of which:
- FIG. 1 is a diagram that illustrates specular reflection;
- FIG. 2 is a similar diagram that illustrates diffuse reflection;
- FIG. 3 is a diagram that illustrates specularly reflecting indicia in a diffusely reflecting background;
- FIG. 4 is a diagram of a vision system that employs the present invention's teachings;
- FIG. 5 is a block diagram of the control system that an application using the FIG. 4 system may employ;
- FIG. 6 is a diagram of the processes that the system of FIG. 5 performs; and
- FIG. 7 is a diagram of the system that employs the present invention's teachings in a laser-scribing system.
- Before we describe the system of the present invention, we briefly review the concept of dark-field illumination. In FIG. 1, an
incident ray 12 strikes a specularly reflecting surface 14 at an angle θi with respect to the normal 16 to that surface. If the surface 14 is a mirror or other specularly reflecting surface, essentially all light that strikes the surface at the angle of incidence θi reflects at an angle of reflection θo equal to the angle of incidence θi. The point 20 at which the light ray strikes the surface 14 may be located within the field of view of a camera, but that light will not contribute to the camera image unless ray 18 extends through that camera's entrance pupil. For the purposes of this discussion, we will assume that it does not. So point 20 is not illuminated so far as the camera is concerned.
- Now consider FIG. 2, where we change the specular-reflection assumption and instead assume that the target surface 14 is a diffuse reflector. In that case, the reflection of ray 12 includes not only ray 18, whose angle of reflection equals the angle of incidence, but also a plume of other rays, such as ray 22, whose angles with the normal differ from that of the incident ray. The camera pupil may be so positioned that it receives some of these rays even if it does not receive ray 18. So spot 20 is illuminated from the camera's point of view, even though it would not be if it reflected only specularly.
- If the surface 14 includes both FIG. 3's specularly reflecting indicia 24 and its diffuse background 26, an image is formed in which the indicia are readily distinguished if the camera is positioned with an entrance pupil that does not receive the specularly reflected rays but does receive some of the rays that result from diffuse reflection. Of course, the indicia may not have perfectly mirror-like surfaces, as the drawing suggests, but they will be distinguishable so long as they yield light plumes that are significantly more compact than the plumes produced by the indicia's surroundings.
- FIG. 4 depicts an apparatus for applying the technique of dark-field illumination to a field-of-view-deflection system. In an electro-optical process head 28 is mounted a camera that includes a detector 30, lenses 32, 34, and 36, and a field-of-view deflector depicted for purposes of illustration as including a galvanometer-mounted mirror 38. Detector 30 will typically take the form of an array of charge-coupled devices, whose outputs are converted to digital form for processing in accordance with the particular application to which the system is applied.
- Other detection devices can be employed, of course, as can lens systems different from that of FIG. 4. That lens system includes an image-forming lens 32 spaced by its focal length from the detector 30. It also includes collimating lenses 34 and 36, which collimate the light from a target region in which a workpiece 37 is disposed; i.e., lenses 34 and 36 image workpiece 37 at infinity. But other embodiments may, say, include only a single lens, corresponding to lens 32 but positioned to image the workpiece 37 on the detector 30.
- Extending downward to infinity from the electro-optical head 28 is a camera field of view consisting of a volume of points in space from which the lenses and field-of-view deflector provide optical paths to the detector 30. The field of view intersects the target region so that the camera can “see” a portion 40 of the workpiece. The field-of-view deflector's position determines where on the workpiece portion 40 falls; i.e., the field viewed by the detector can change without relative movement between the camera and the workpiece, although such movement may occur, too. For instance, the workpiece may be on a conveyor, which typically would not be capable of moving as quickly as the field of view.
- Although the drawing depicts the field-of-view deflector as comprising only a single mirror, many embodiments of the present invention will employ two, which deflect the camera's field of view along mutually orthogonal axes. Also, although the drawing shows lenses 34 and 36 between the mirror 38 and the workpiece 37, some embodiments that employ collimating lenses may instead place them between the field-of-view deflector and the detector.
- To illuminate the workpiece 37 so that the camera can form an adequate image, the system further includes an array of lamps suspended above the workpiece. For most of the present invention's embodiments, the number of such lamps will be relatively large, but FIG. 4 depicts only two such lamps, 42 and 44, which are controlled by illumination-control circuitry not shown in FIG. 4. Operation of these lamps is coordinated with the position of the mirror 38 and thus of the camera's field of view. As that mirror pivots, the workpiece portion 40 of which the camera can form an image on the detector 30 moves about the workpiece's surface.
- In accordance with the present invention, most or a significant portion of the lamps in the array shine on the workpiece at any given time, but selected ones are prevented from doing so as the portion 40 within the field of view moves about the surface of the workpiece 37. As that portion 40 moves, any lamp is prevented from shining on the workpiece if under the assumption of specular reflection it would be imaged into the field of view. That is, any source that the camera could “see” if the workpiece portion 40 were a mirror is prevented from shining on the workpiece so that the camera receives no specular reflection from it. As the mirror 38 continues moving and continues to deflect the camera's field of view, lamps prevented from shining on the workpiece to prevent specular reflection into the camera typically shine on it again after the field of view moves beyond their images. So a dark region of unlit sources moves about against a background of sources that are lit as the scanning process proceeds.
- Any way of achieving such a dark region moving about a background of sources can be used. The “sources” can be reflectors, for instance, and “lit” array elements could be the reflectors on which a remote source or sources selectively shine, while the unlit elements would be the reflectors on which the source or sources do not shine. Another approach is to provide continuously operating lamps and selectively operable baffles that can selectively hide lamps from the workpiece; the “lit” elements would be the lamps not hidden. Preferably, though, the array elements consist of respective light-emitting diodes (“LEDs”) that are simply turned on to cause them to shine on the workpiece and turned off to prevent them from doing so, and the following description is based on this assumption.
- To determine which lamps are in the “viewed” subset and should thus be turned off, we assume that the workpiece is a mirror. Under this assumption, an
example ray 45 emitted by source 44 will be reflected along path 46 to contribute to formation of the image on detector 30. That is, if region 40 were a mirror, at least part of source 44 would be imaged into the camera's field of view. Any source for which this is true is part of the viewed subset and is therefore turned off. Source 42, on the other hand, remains lit because specular reflection of any rays, such as ray 48, that strike region 40 will result in rays, such as ray 50, that do not enter the camera: that source belongs in the “unviewed” set. (We note in passing that the “mirror” 40 need not have the horizontal orientation that the drawing depicts; any expected workpiece-surface angle can be used for determining the viewed and unviewed sets' respective memberships.) Source 42 contributes to the image because any diffuse reflection resulting from ray 48 will result in a plume of rays 52, of which some, such as rays 54 and 56, pass through the camera's entrance pupil.
- As deflection continues, the portion 40 viewable by the camera 28 may reach a point from which specular reflection in response to light from source 42 would produce rays that contribute to the image on detector 30 whereas reflection of light from source 44 would no longer do so. When the deflector reaches such a point, source 42 will be turned off and source 44 will typically be turned back on. Note that this operation of turning off selected elements occurs in addition to other illumination-control operations that may be occurring. It may be desirable, for instance, to operate different ones of the sources with differing intensities so as to achieve illumination that is optimally uniform for the currently prevailing camera angle. Also, the sources that “remain” lit may actually be strobed so as to “freeze” the workpiece image despite continuous relative motion between the camera's field of view and the workpiece.
- Controlling the sources can take any of a number of forms. Since it is well within the skill of those familiar with such optical systems to predict which sources can be “seen” at various angles, an algorithm for converting scan-angle position to lamp selection can readily be written, and the determination can accordingly be made algorithmically in real time. Or that calculation can be made ahead of time to populate a look-up table used to make the real-time conversion from field-of-view position to lamp selection. Alternatively, the look-up table could be populated in accordance with experimental results. Regardless of how the look-up table is populated, it can be used in a straight table look-up or as part of a combination of table look-up and, say, real-time interpolation.
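One way to realize the look-up-table option just described can be sketched as follows. The table contents here are invented placeholders; a real table would be populated from the system's geometry or from experimental results, and the nearest-entry selection stands in loosely for the real-time interpolation the text mentions.

```python
import bisect

# Hypothetical precomputed table: deflector scan angle (degrees) -> set of
# lamp indices that must be off at that angle (placeholder values only).
scan_angles = [-10.0, -5.0, 0.0, 5.0, 10.0]
off_sets = [{0}, {1}, {2}, {3}, {4}]

def lamps_off(angle):
    """Convert a field-of-view position to a lamp selection by table
    look-up, choosing the nearest tabulated scan angle."""
    i = bisect.bisect_left(scan_angles, angle)
    if i == 0:
        return off_sets[0]
    if i == len(scan_angles):
        return off_sets[-1]
    # Pick whichever bracketing entry is closer to the requested angle.
    if angle - scan_angles[i - 1] <= scan_angles[i] - angle:
        return off_sets[i - 1]
    return off_sets[i]

print(lamps_off(-7.0))  # {1}: nearest tabulated angle is -5 degrees
print(lamps_off(9.0))   # {4}: nearest tabulated angle is 10 degrees
```

Replacing the nearest-entry choice with a blend of the two bracketing entries (for example, interpolating per-lamp intensities rather than on/off sets) would correspond to the table-look-up-plus-interpolation combination the paragraph describes.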
- FIGS. 5 and 6 show how machine-vision applications will typically implement the present invention's teachings. Although it is apparent that dedicated "random logic" could be separately designed for each of the functions to be described below, most embodiments will employ programming to configure common microprocessor circuitry to act as the various circuits that perform these operations. So FIG. 5 depicts the system as including a computer 60 that runs a software application 62 for which image data are required. To this end, the application may operate various other software modules by which the computer 60 communicates with appropriate interface circuitry, represented by blocks in that drawing.
- As FIG. 6 indicates, an application requiring image data would perform a routine whose object is to acquire image data at a particular location. FIG. 6's
block 76 represents entering such a routine. This routine may concurrently call several processes. The scan-control process that entry block 77 represents would typically perform operations such as computing the deflector positions required to place the field of view in the desired location. Block 78 represents this step. Once the appropriate positions are determined, the scan-control process would cause FIG. 5's scanner interface to move, say, a galvanometer to produce the desired field-of-view deflection. Block 80 represents this operation, after which the process terminates in a step 82. That step may include setting a flag to indicate that the movement operation has been completed.
- FIG. 6's
block 84 represents entering the light-control process, which includes determining from the commanded field-of-view location which lamps need to be switched on or off. Block 86 represents performing this conversion, which, as was explained above, may involve an algorithmic determination, a table look-up, or a combination of both. Once the desired light actuations are determined, they are performed by communications with FIG. 5's lighting interface 72 in a step that FIG. 6's block 88 represents. That drawing's block 90 represents ending the process in a step that may involve setting a flag to indicate that the lights have been properly set.
- The image-processing operation, whose entry FIG. 6's
block 92 represents, depends on proper illumination and proper positioning of the field-of-view deflector. Block 94 accordingly represents testing the scan-control operation's flag to determine whether the field of view is positioned as required. Once it has determined that the desired position has been reached, the process proceeds to step 96 to determine whether the illumination has been set properly. If it has, the process operates FIG. 5's detector interface 74 to obtain the camera's output data, as block 98 indicates, and the resultant data are supplied to the requesting application, as block 100 indicates.
- The particular nature of whatever
application 62 the invention supports is not a feature of the invention, but FIG. 7 illustrates one example of an apparatus in which the present invention is advantageous. The apparatus in FIG. 7 differs from that of FIG. 4 in that it additionally includes a laser source 102 and a beam splitter 104. And, in addition to the program modules depicted in FIG. 5, a system of the FIG. 7 type typically will include a module for controlling the laser as well as a laser interface by which that module exercises such control. To mark the workpiece, the beam splitter 104 cooperates with the field-of-view deflector 38 to direct the laser light to a workpiece location in the camera's field of view.
- The application program may employ the vision system to identify fiducial marks or other identifying features previously made on the workpiece and thereby properly locate the position at which the new mark is to be made. The vision system also may be used to perform quality control on the marking process, possibly in a closed-loop fashion so as to adjust laser-beam positioning in accordance with the results of previous observations.
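The coordination among FIG. 6's concurrent scan-control, light-control, and image-acquisition processes described above can be sketched with event flags. This is a minimal illustration under assumed names and an assumed threading framework, not the patent's implementation.

```python
import threading

# Flags corresponding to the "movement completed" and "lights set"
# indications that blocks 82 and 90 describe.
scan_done = threading.Event()
lights_done = threading.Event()

def scan_control(target_position, move_deflector):
    """Blocks 77-82: position the field-of-view deflector, then flag completion."""
    move_deflector(target_position)  # blocks 78/80: compute and command deflection
    scan_done.set()                  # block 82: flag that movement is complete

def light_control(target_position, lamps_for_position, set_lamps):
    """Blocks 84-90: determine and actuate the lamp set, then flag completion."""
    set_lamps(lamps_for_position(target_position))  # blocks 86/88
    lights_done.set()                               # block 90: flag lights set

def acquire_image(read_detector):
    """Blocks 92-100: wait for position and illumination, then read the camera."""
    scan_done.wait()        # block 94: is the field of view in position?
    lights_done.wait()      # block 96: is the illumination set properly?
    return read_detector()  # blocks 98/100: capture and return the image data
```

A caller would start the scan-control and light-control processes concurrently and then invoke `acquire_image`, which blocks until both flags are set.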
- The system of FIG. 7 is more convenient than some prior-art marking systems, in which the marking apparatus was located separately from the imaging system, so there could be relative movement between the workpiece and the camera and lighting systems. In contrast, the illustrated marking system employs the same lens and deflector apparatus as the imaging system, so the positions being inspected and marked are readily correlated.
- It is another aspect of the present invention to reverse the arrangement described above in order to provide "bright field" illumination. This type of illumination is valuable when the background for the relatively specular indicia is relatively unreflective. In such a situation, it is preferable to emphasize reflection from the relatively specular indicia without unnecessarily illuminating the background. For this purpose, the sources that specular reflection would make visible to the camera are the ones that are lit, and the others are the ones that are not lit.
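Under the geometry described above, the bright-field selection is simply the complement of the dark-field selection. A minimal sketch, where `specularly_visible` is an assumed predicate supplied by the ray-tracing calculation:

```python
def lit_sources(sources, specularly_visible, bright_field):
    """Select which sources to light for a given field-of-view position.

    specularly_visible(s) is an assumed predicate: True if specular
    reflection of source s from the expected workpiece-surface angle
    would enter the camera's field of view.  Dark-field operation lights
    the sources that are NOT specularly visible; bright-field operation
    reverses the selection.
    """
    if bright_field:
        return {s for s in sources if specularly_visible(s)}
    return {s for s in sources if not specularly_visible(s)}
```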
- From the foregoing description, it is apparent that the present invention can be employed in a wide range of embodiments and constitutes a significant advance in the art.
Claims (14)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/657,286 US20040047140A1 (en) | 1999-04-27 | 2003-09-08 | Programmable illuminator for vision system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/301,002 US6633338B1 (en) | 1999-04-27 | 1999-04-27 | Programmable illuminator for vision system |
US10/657,286 US20040047140A1 (en) | 1999-04-27 | 2003-09-08 | Programmable illuminator for vision system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/301,002 Continuation US6633338B1 (en) | 1999-04-27 | 1999-04-27 | Programmable illuminator for vision system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20040047140A1 true US20040047140A1 (en) | 2004-03-11 |
Family
ID=28791804
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/301,002 Expired - Lifetime US6633338B1 (en) | 1999-04-27 | 1999-04-27 | Programmable illuminator for vision system |
US10/657,286 Abandoned US20040047140A1 (en) | 1999-04-27 | 2003-09-08 | Programmable illuminator for vision system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/301,002 Expired - Lifetime US6633338B1 (en) | 1999-04-27 | 1999-04-27 | Programmable illuminator for vision system |
Country Status (1)
Country | Link |
---|---|
US (2) | US6633338B1 (en) |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7028899B2 (en) * | 1999-06-07 | 2006-04-18 | Metrologic Instruments, Inc. | Method of speckle-noise pattern reduction and apparatus therefore based on reducing the temporal-coherence of the planar laser illumination beam before it illuminates the target object by applying temporal phase modulation techniques during the transmission of the plib towards the target |
JP2000090233A (en) * | 1998-09-08 | 2000-03-31 | Olympus Optical Co Ltd | Image processor |
US6788411B1 (en) * | 1999-07-08 | 2004-09-07 | Ppt Vision, Inc. | Method and apparatus for adjusting illumination angle |
GB2356996A (en) * | 1999-12-03 | 2001-06-06 | Hewlett Packard Co | Improvements to digital cameras |
IL133696A (en) * | 1999-12-23 | 2006-04-10 | Orbotech Ltd | Cam reference inspection of multi-color and contour images |
JP2007509368A (en) * | 2003-10-17 | 2007-04-12 | ジーエスアイ・ルモニクス・コーポレーション | Flexible scanning range |
US7593593B2 (en) | 2004-06-16 | 2009-09-22 | Microsoft Corporation | Method and system for reducing effects of undesired signals in an infrared imaging system |
US7344273B2 (en) | 2005-03-22 | 2008-03-18 | Binary Works, Inc. | Ring light with user manipulable control |
US7315361B2 (en) * | 2005-04-29 | 2008-01-01 | Gsi Group Corporation | System and method for inspecting wafers in a laser marking system |
US20110199764A1 (en) * | 2005-08-26 | 2011-08-18 | Camtek Ltd. | Device and method for controlling an angular coverage of a light beam |
US7911444B2 (en) | 2005-08-31 | 2011-03-22 | Microsoft Corporation | Input method for surface of interactive display |
WO2007063909A1 (en) | 2005-11-30 | 2007-06-07 | Nikon Corporation | Observing device |
US7630002B2 (en) * | 2007-01-05 | 2009-12-08 | Microsoft Corporation | Specular reflection reduction using multiple cameras |
US8212857B2 (en) | 2007-01-26 | 2012-07-03 | Microsoft Corporation | Alternating light sources to reduce specular reflection |
DE102007063453B3 (en) * | 2007-12-28 | 2009-10-08 | Göpel electronic GmbH | Assembled printed circuit board optical inspection arrangement for quality control, has control system connected with color filters to ensure that inspection fields are optically represented in successive manner on image sensor surface |
CN106124510B (en) * | 2010-01-26 | 2019-10-18 | 戴比尔斯英国有限公司 | Gemstone sparkle analysis |
US20120274838A1 (en) * | 2010-10-15 | 2012-11-01 | Triune Ip Llc | Illumination and image capture |
CN103650474B (en) * | 2012-06-20 | 2017-05-24 | 株式会社日立制作所 | Automatic image compositing device |
DE102013017795C5 (en) * | 2013-10-25 | 2018-01-04 | Lessmüller Lasertechnik GmbH | Process monitoring method and apparatus |
MX360408B (en) * | 2013-11-04 | 2018-10-29 | Tomra Sorting Nv | Inspection apparatus. |
CN104023179B (en) * | 2014-06-27 | 2017-08-15 | 北京智谷睿拓技术服务有限公司 | Image formation control method and equipment |
US10926416B2 (en) * | 2018-11-21 | 2021-02-23 | Ford Global Technologies, Llc | Robotic manipulation using an independently actuated vision system, an adversarial control scheme, and a multi-tasking deep learning architecture |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4475796A (en) * | 1981-03-13 | 1984-10-09 | Olympus Optical Co., Ltd. | Epidark illumination system |
US4484069A (en) * | 1981-10-15 | 1984-11-20 | St. Regis Paper Company | Apparatus and method for sensing distance |
US4604648A (en) * | 1984-10-12 | 1986-08-05 | Kley Victor B | Electronic viewing system for integrated circuit packages |
US4706168A (en) * | 1985-11-15 | 1987-11-10 | View Engineering, Inc. | Systems and methods for illuminating objects for vision systems |
US4893223A (en) * | 1989-01-10 | 1990-01-09 | Northern Telecom Limited | Illumination devices for inspection systems |
US4918284A (en) * | 1988-10-14 | 1990-04-17 | Teradyne Laser Systems, Inc. | Calibrating laser trimming apparatus |
US4972093A (en) * | 1987-10-09 | 1990-11-20 | Pressco Inc. | Inspection lighting system |
US5038258A (en) * | 1989-03-02 | 1991-08-06 | Carl-Zeiss-Stiftung | Illuminating arrangement for illuminating an object with incident light |
US5129009A (en) * | 1990-06-04 | 1992-07-07 | Motorola, Inc. | Method for automatic semiconductor wafer inspection |
US5185638A (en) * | 1991-04-26 | 1993-02-09 | International Business Machines Corporation | Computer controlled, multiple angle illumination system |
US5515452A (en) * | 1992-12-31 | 1996-05-07 | Electroglas, Inc. | Optical character recognition illumination method and system |
US5519496A (en) * | 1994-01-07 | 1996-05-21 | Applied Intelligent Systems, Inc. | Illumination system and method for generating an image of an object |
US5585616A (en) * | 1995-05-05 | 1996-12-17 | Rockwell International Corporation | Camera for capturing and decoding machine-readable matrix symbol images applied to reflective surfaces |
US5615013A (en) * | 1995-06-27 | 1997-03-25 | Virtek Vision Corp. | Galvanometer and camera system |
US5684530A (en) * | 1993-02-16 | 1997-11-04 | Northeast Robotics, Inc. | Continuous diffuse illumination method and apparatus |
US5720424A (en) * | 1995-05-30 | 1998-02-24 | Kabushiki Kaisha Shinkawa | Wire bonding apparatus |
US5724139A (en) * | 1996-06-28 | 1998-03-03 | Polaroid Corporation | Dark field, photon tunneling imaging probes |
US5737122A (en) * | 1992-05-01 | 1998-04-07 | Electro Scientific Industries, Inc. | Illumination system for OCR of indicia on a substrate |
US5799135A (en) * | 1994-06-28 | 1998-08-25 | Fanuc, Ltd. | Robot controlling method and apparatus using laser sensor |
US5822053A (en) * | 1995-04-25 | 1998-10-13 | Thrailkill; William | Machine vision light source with improved optical efficiency |
US5965042A (en) * | 1996-07-24 | 1999-10-12 | Miyachi Technos Corporation | Method and apparatus for laser marking with laser cleaning |
- 1999-04-27: US US09/301,002 patent/US6633338B1/en, not_active Expired - Lifetime
- 2003-09-08: US US10/657,286 patent/US20040047140A1/en, not_active Abandoned
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050197672A1 (en) * | 2000-02-04 | 2005-09-08 | Freeman Gary A. | Integrated resuscitation |
US20100212182A1 (en) * | 2005-06-06 | 2010-08-26 | Krishna Singh | Method and apparatus for dehydrating high level waste based on dew point temperature measurements |
AU2012244240B2 (en) * | 2007-08-29 | 2015-04-30 | Scientific Games, Llc | Enhanced scanner design |
US20100155379A1 (en) * | 2008-12-19 | 2010-06-24 | Applied Materials, Inc. | Illumination methods and systems for laser scribe detection and alignment in thin film solar cell fabrication |
WO2010080595A2 (en) * | 2008-12-19 | 2010-07-15 | Applied Materials, Inc. | Illumination methods and systems for laser scribe detection and alignment in thin film solar cell fabrication |
WO2010080595A3 (en) * | 2008-12-19 | 2010-10-14 | Applied Materials, Inc. | Illumination methods and systems for laser scribe detection and alignment in thin film solar cell fabrication |
CN104148810A (en) * | 2013-01-28 | 2014-11-19 | 先进科技新加坡有限公司 | Method of radiatively grooving a semiconductor substrate |
CN112858169A (en) * | 2021-02-01 | 2021-05-28 | 苏州维嘉科技股份有限公司 | Light source detection device, light source lighting method thereof and light source control device |
WO2023166643A1 (en) * | 2022-03-03 | 2023-09-07 | 三菱電機株式会社 | Appearance inspection device and appearance inspection method |
JP7471533B2 (en) | 2022-03-03 | 2024-04-19 | 三菱電機株式会社 | Appearance inspection device and appearance inspection method |
Also Published As
Publication number | Publication date |
---|---|
US6633338B1 (en) | 2003-10-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6633338B1 (en) | Programmable illuminator for vision system | |
EP1581781B1 (en) | Method and apparatus for simultaneous 2-d and topographical inspection | |
US6464126B2 (en) | Bonding apparatus and bonding method | |
US4608494A (en) | Component alignment apparatus | |
JP2750953B2 (en) | Barcode imaging method | |
US8681211B2 (en) | High speed optical inspection system with adaptive focusing | |
KR950024099A (en) | How to create images of lighting systems and objects | |
WO2000011400A1 (en) | Light array system and method for illumination of objects imaged by imaging systems | |
CN101603926B (en) | Multi-surface detection system and method | |
KR20030015207A (en) | Imaging system | |
JP5807772B2 (en) | Defect detection apparatus and method | |
JPH11183389A (en) | Observing device | |
CN1243970C (en) | Scanning head and outer inspection method and apparatus capable of using said scanning head | |
KR20110069058A (en) | Apparatus and method for optically converting a three-dimensional object into a two-dimensional planar image | |
TWI697662B (en) | Illumination system, inspection tool with illumination system, method for inspecting an object, and method of operating an illumination system | |
US20070024846A1 (en) | Device for Dark Field Illumination and Method for Optically Scanning of Object | |
JP2005091049A (en) | Light irradiator for image processing and light irradiation method for image processing | |
KR101581777B1 (en) | Multiple surface inspection system and method | |
JP2000028320A (en) | Image recognizing equipment and image recognizing method | |
JP2001522997A (en) | Apparatus and method for detecting the position of a component and / or for detecting the position of a connection of a component, and a mounting head having a device for detecting the position of a component and / or detecting the position of a connection of a component | |
US20040114035A1 (en) | Focusing panel illumination method and apparatus | |
JPH0545142A (en) | Method for inspecting surface state | |
JPH11242002A (en) | Observing device | |
JP2013007588A (en) | Defect detection device and its method | |
RU2366932C2 (en) | Method for control of paper web quality in process of its production and device for control of paper web quality in process of its production |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: ELECTRO SCIENTIFIC INDUSTRIES, INC., OREGON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GSI GROUP CORPORATION;GSI GROUP INC;REEL/FRAME:030582/0160 Effective date: 20130503 |
|
AS | Assignment |
Owner name: ELECTRO SCIENTIFIC INDUSTRIES, INC., OREGON Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION SERIAL NUMBER 11776904 PREVIOUSLY RECORDED ON REEL 030582 FRAME 0160. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:GSI GROUP CORPORATION;GSI GROUP INC.;REEL/FRAME:056424/0287 Effective date: 20130503 |