WO1990009561A2 - Laser range imaging system using projective geometry - Google Patents

Laser range imaging system using projective geometry

Info

Publication number
WO1990009561A2
Authority
WO
WIPO (PCT)
Prior art keywords
apparatus defined
height
image
light
determining
Prior art date
Application number
PCT/US1990/000832
Other languages
French (fr)
Other versions
WO1990009561A3 (en)
Inventor
Constantine J. Tsikos
Original Assignee
Tsikos Constantine J
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsikos Constantine J filed Critical Tsikos Constantine J
Publication of WO1990009561A2 publication Critical patent/WO1990009561A2/en
Publication of WO1990009561A3 publication Critical patent/WO1990009561A3/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518 Projection by scanning of the object
    • G01B11/2522 Projection by scanning of the object the position of the object changing and being recorded
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147 Details of sensors, e.g. sensor lenses

Definitions

  • the present invention relates to a system (both method and apparatus) for producing a "range image" of an area of the surface of an object, and a method for calibrating a system of this type. More particularly, the present invention relates to a range imaging system, and a method for calibrating the system, which are based on the principles of projective geometry.
  • Active, optical range imaging systems collect three dimensional coordinate data from visible object surfaces in a scene. These systems can be used in a wide variety of automation applications, including shape acquisition, bin picking, robotic assembly, inspection, gauging, mobile robot navigation, automated cartography, medical diagnosis (biostereometrics) and automated military applications.
  • the range imaging sensors in such systems are unique imaging devices in that the image data points explicitly represent seen surface geometry as sampled surface points.
  • Range images are known by many other names depending on the context.
  • range images have been referred to, variously, as a range map, depth map, 3-D image, 2.5-D image, digital terrain map (DTM), topographic map, surface profile, XYZ point list, surface distance matrix, contour map, and surface height map.
  • DTM digital terrain map
  • Radar measures "time of flight" transmission time to and from an object surface.
  • the transmitted energy may be electromagnetic radiation or sonic waves.
  • the transmitted energy may be pulsed, a continuous wave or an amplitude or frequency modulated wave.
  • Triangulation measures two interior angles, angle AB and angle BC, and the baseline B of a triangle ABC, and then determines the lengths A and C from the viewing apparatus to the object surface.
  • either the ambient light reflected from the object surface may be viewed from two angles, on opposite ends of the base line, or light may be projected onto the object's surface from one end of the base line and viewed or detected from the opposite end of the baseline.
  • the light projected may be "structured" as a single point, straight line, multiple points, multiple lines, grid, circle or the like.
  • one amplitude-modulated spatial signal (reflected light from the scene) is multiplied by another (the viewing grating) to create an output signal with surface depth information encoded as a phase difference.
  • coherent light from two separate laser beams, focused at a common surface point, is added and the surface depth is encoded in the detected phase difference.
  • Lens focusing determines the distance to a point on an object surface viewed through a lens that is in focus.
  • exact in-focus images of the grating are formed at regular, periodic distance intervals whereas the grating images are out of focus in a predictable manner between the end points of these intervals.
  • the calibration method comprises the steps of:
  • Figure 1a is a representational diagram of a general camera model.
  • Figure 1b is a representational diagram of a pinhole camera model.
  • Figure 2 is a representational diagram showing three points under projection.
  • Figure 3 is a diagram giving the cross-ratio or anharmonic ratio definition.
  • Figure 4 is a representational diagram illustrating the invariance of the cross-ratio under central projection.
  • Figure 6 is a partly perspective, partly representational diagram of a laser range imaging system according to the preferred embodiment of the present invention.
  • Figure 7 is a block diagram showing the camera subsystem and the electronics subsystem in the laser range imaging system of Fig. 6.
  • Figure 8 is a representational diagram illustrating the operation of the laser range imaging system of Fig. 6.
  • Figure 9 is a block diagram of the RAM address generation and read/write control logic in the electronics subsystem of Fig. 7.
  • Figure 10 is a memory map diagram containing an example of the contents of the LUT EPROM in the electronics subsystem of Fig. 7.
  • Figure 11 is a perspective view of a simple calibration target which may be used in the calibration of the range imaging system of Fig. 6, wherein the height h(A) equals zero.
  • Figure 12 is a perspective view illustrating the correct placement of the calibration target of Fig. 11 on the scanning platform in the range imaging system of Fig. 6.
  • Figure 13 is a representational diagram illustrating the use of the cross ratio for height computation from three non-zero heights h(A), h(B) and h(D).
  • Figure 14 is a perspective view of a simple, generalized calibration target which may be used in the calibration of the range imaging system of Fig. 6, wherein the height h(A) is non-zero.
  • Figure 15 is a perspective view illustrating the correct placement of the calibration target of Fig. 14 on the scanning platform in the range imaging system of Fig. 6.
  • Figure 16 is a perspective view of a simple calibration target which may be used in the calibration of the range imaging system of Fig. 6 to determine the height at each possible point.
  • the range imaging system according to the present invention is based on the invariance of the cross-ratio, a well known ratio from projective geometry.
  • the theoretical background of the invention will first be explained; the preferred embodiment of the invention will be described immediately thereafter.
  • FIG. 1 shows a conventional lens camera in Fig. 1A and the pinhole camera in Fig. IB.
  • the length of a line segment is the key to metric geometry, there is one fundamental concept of projective geometry in terms of which all projective properties of figures can be expressed. This concept - namely, the invariance of the so-called "cross-ratio" - forms the basis of the present invention.
  • any three points A, B, C on a straight line l can always be coordinated with any three points A', B', C' on another line l' by two successive projections, as is illustrated in Figure 2.
  • the line l' is rotated about the point C' until it assumes a position l'' parallel to l.
  • the cross-ratio is neither a length, nor a ratio of two lengths, but the ratio of two such ratios.
  • ABCD cross-ratio
  • (ABCD) is the cross-ratio of the four points A, B, C, D taken in that order. It will now be proven that the cross-ratio of four points is invariant under projection.
  • cross-ratio of A, B, C, D depends only on the angles subtended at O by the segments joining A, B, C, D. Since these angles are the same for any four points A', B', C', D' into which A, B, C, D may be projected from O, it follows that the cross-ratio remains invariant under projection. QED.
  • line l represents the cross-section of a laser plane of light normal to the horizontal plane.
  • Line l' represents the cross-section of a camera's image plane.
  • Point O represents the camera's center of lens.
  • Point L represents the laser source.
  • point A is the intersection of the laser line with the horizontal reference plane and represents a point of minimum height visible by the camera.
  • Point B represents an arbitrary point whose distance from the horizontal plane is known, or can be measured.
  • Point C represents a point on a surface whose height is to be determined.
  • point D represents a point of maximum height visible by the camera.
  • Points A, B and D are known, or can be accurately physically measured. In this example, the height of point A is set equal to zero. The height of the arbitrary point C is the unknown. Points A', B', C', D' represent the images of the points A, B, C, D, respectively, onto the camera's image plane under central projection from the camera's center of lens (i.e., point O).
  • the displacements (in pixels) of the points A', B', and D' are known or can be accurately measured during calibration.
  • the displacement of point A' is set equal to zero.
  • The displacement of point C' is accurately determined in real time.
  • the unknown height of point C is computed by solving for CA in the cross-ratio equation:
  • the unknown height of point C i.e. h(C)
  • h(C) is independent of the geometry (i.e. the base-line distance between camera and laser, laser angle, camera angle).
  • h(C) is independent of camera parameters (i.e. lens focal length, and the distance between the center of lens and the image plane) .
  • the unknown height h(C) is a function of one variable, d(C), and four constants: h(B), h(D), d(B), and d(D).
  • the constants h(B), h(D) , d(B), and d(D) are measured.
  • the system measures d(C) in real time and computes h(C); i.e., the unknown height at point C.
  • the laser range imaging system comprises four subsystems: (1) a laser and a cylindrical lens or vibrating mirror for producing a planar beam of light; (2) an electronic camera equipped with a lens and an appropriate interference filter; (3) an electronic circuit for height (depth) measurements and video image generation; and (4) a scanning mechanism (for example, a linear or rotary conveyor, or a robot manipulator with the laser/camera apparatus attached to the robot wrist) .
  • This range imaging system is illustrated generally in Figure 6.
  • the Laser Subsystem In the best mode of practicing the invention, a 5 mW Helium-Neon (HeNe) laser light source 20, emitting at 632.8 nanometers, is employed to produce a pencil light beam.
  • a cylindrical lens 22, or an oscillating or rotating mirror, is preferably used to spread the laser beam into a plane of light 24 as indicated in Figure 6.
  • the oscillating or rotating mirror, if used, is provided with means for oscillating or rotating the mirror surface.
  • the cylindrical lens, if used, has a cylinder axis which intersects and is transverse to the pencil beam.
  • the Camera Subsystem In the best mode the camera subsystem is comprised of a B/W CCD video camera 26, with gen-lock capability, a 16 mm lens 28, and a 632.8 nm laser interference filter 30 mounted in front of the lens.
  • the purpose of the interference filter is to block out ambient light and to allow only the laser light to be seen by the camera.
  • the camera is oriented and mounted so that the image of the laser stripe 32, caused by the plane of light 24 incident onto the horizontal plane 34, is seen by the camera as a vertical straight line.
  • the viewing axis of the camera is arranged at an angle in the range of 10° - 80° with respect to the plane of the light beam. This arrangement is shown in Figure 8.
  • the Electronics Subsystem The best mode of the electronics subsystem is illustrated in detail in Figure 7. As is shown there, a master clock oscillator 36 drives a video sync generator 38 to generate the vertical and horizontal sync signals for the camera and a control logic. In addition, the master clock oscillator clocks a stripe displacement counter chain 40 to count the displacement, in pixels, of the light stripe from the edge of the scanned image.
  • the horizontal sync pulse is delayed in a start pulse generator 42 by a (camera dependent) programmable amount of time. This delayed pulse labeled "start” enables the counter chain. Counting continues until a significant peak in the video signal is detected by a signal peak finder 44. When this occurs, counting stops.
  • the electronic subsystem can be provided with a signal threshold device, such as a Schmitt Trigger, or a signal centroid determining device for determining the presence and center of the video pulse representing the light stripe.
  • the threshold device can be made variable, if desired, to adjust to the best signal threshold level.
  • the counter now contains a number d(i) which is the number of pixels that the light stripe, as "seen” by the camera, is displaced in the camera's image plane at the point "i". This operation is illustrated in Figure 8.
  • This number d(i) (for example, d(l), d(2), d(3) , in Figure 8) is used as an address to the cross-ratio LUT (Look Up Table) EPROM 46 which linearizes the space and converts displacement numbers into height (or distance) .
  • the generation of the LUT is described hereinbelow.
  • the data from the LUT is stored in a RAM 48 as a height number h(i) .
  • d(l) becomes h(l)
  • d(2) becomes h(2), and so on.
  • the RAM row address counter 50 (Fig. 9) is incremented by one.
  • a total of 240 points are computed (one for each horizontal raster line) and are stored in a 256x256x8 RAM as a column of consecutive height numbers. In other words, the system generates one cross-section of the surface under inspection every 16.67 ms.
  • the data from the entire RAM bank is supplied to an 8-bit video D/A converter to generate a standard RS-170 or CCIR video signal.
  • This signal can be displayed on a monitor or can be digitized by an external machine vision system.
  • the video image is a picture of the RAM contents. The higher the height value at a point, the brighter the image on the video display at that point. In other words, the image intensity will be proportional to height.
  • This range image generation process is illustrated in Figure 8.
  • two eight-bit binary counters 50 and 52 are used to generate the RAM's row and column addresses.
  • the RAM bank always contains a window of the last 256 cross sections. Older cross sections are overwritten as new height data is coming in. In other words the RAM behaves as a non-recirculating, two-dimensional shift register.
  • the system has two modes of operation. In the asynchronous mode, selected via a mode switch 56 shown in Figure 9, h(i) data is constantly written into the RAM. In the synchronous mode, an external trigger signal allows only the next 256 columns of h(i) data to be written into the RAM. The RAM then becomes write-protected until the next external trigger pulse arrives. This mode allows synchronization to an external event and preservation of height data h(i) for further inspection and processing.
  • the Scanning Subsystem One best mode of the scanning subsystem according to the invention is shown in Figure 6.
  • This subsystem comprises a scanning platform or conveyor 58 driven by a motor 60 and screw 62.
  • this subsystem is highly user dependent.
  • the laser range imaging system of the invention uses a fixed laser/camera geometry, where the laser and camera subsystems are mounted on a common support.
  • the scene is linearly translated under a stationary laser/camera, or the laser/camera is linearly translated over a stationary scene.
  • Rotational cross-sections can be easily generated by replacing the linear translator with a rotating mechanism. Dense or sparse cross-sections can be generated by varying the speed of the scanning mechanism. Generation of true aspect ratio images requires the use of a simple servo loop to control the velocity of the scanning mechanism.
  • the Look-Up Table is stored in an Electrically Programmable Read Only Memory (EPROM) .
  • EPROM Electrically Programmable Read Only Memory
  • the address space of the EPROM 46 is divided into two sections, low and high, as shown in Figure 10.
  • the user selects a memory section via a mode switch, which is indicated in Figure 7. Only one section can be selected at any time.
  • when the mode switch 64 is in the "CALIBRATE" position, the low section is selected.
  • when the mode switch is in the "RUN" position, the high section is selected.
  • the calibration procedure consists of the following steps:
  • the electronics subsystem is switched into the "CALIBRATE" mode. This selects the low section of the LUT EPROM.
  • the laser is oriented perpendicular to the scanning platform (or conveyor) .
  • a two step, parallel plane calibration target is constructed as shown in Figure 11. The top plane and the lower plane of the target are designated as D and B, respectively.
  • the lengths h(D) and h(B) in some units of length are then measured as shown in Figure 11 and stored in a computer. It is recommended that the target be constructed so that h(D) covers the camera's field of view, that is, so that the laser stripe on the top plane D will be imaged onto the right-most pixels in the camera's image plane.
  • the length h(B) should preferably be about half the size of h(D) .
  • the target is placed on the scanning platform
  • the above program prints a list of EPROM addresses and the corresponding EPROM data at each address.
  • This list is passed to a commercially available EPROM Programmer Device to program the high section of the LUT EPROM as is illustrated in Figure 10.
  • Equation [26] solves for the unknown height h(C) in the general case (i.e. when h(A) and d(A) are not equal to zero). Substituting d(i) for d(C) into equation [26] yields:
  • a range imaging system which implements this equation is identical to that shown in Fig. 6. It is not necessary, in this case, to align the camera in such a way that the images of points at height h(A) are aligned on the first scan line.
  • the target is then placed on the scanning platform (or conveyor) as illustrated in Figure 15.
  • a complete range image is generated and the displacements of points lying on the planes A, B and D, respectively, are saved in the lower section of the LUT EPROM as the values of d(A), d(B) and d(D), respectively.
  • h(D) Measured Physical Height of Plane D.
  • h(B) Measured Physical Height of Plane B.
  • h(A) Measured Physical Height of Plane A.
  • d(D) Measured Displacement of Plane D.
  • d(B) Measured Displacement of Plane B.
  • d(A) Measured Displacement of Plane A.
  • h(i) = [h(A) * [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)] - h(B) * [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]] / [[d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)] - [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]]   [27]
  • the height h(i) at successive points along a light stripe may be determined in various alternative ways with the range imaging system according to the present invention.
  • a microcomputer can be provided to perform the calculation for each h(i) , in accordance with equation [16] or [27] , in dependence upon the values of h(A) , h(B) , h(D) , d(A) , d(B) and d(D) determined during calibration and stored in RAM.
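The LUT generation just described can be sketched in code. The Python below is an illustrative reconstruction, not code from the patent: the function name, table size and calibration numbers are assumptions, and the height formula is the cross-ratio relation of equation [27] solved for h(i).

```python
def build_height_lut(h_A, h_B, h_D, d_A, d_B, d_D, n_entries=256):
    """Tabulate the height h(i) for every possible stripe displacement
    d(i), playing the role of the high section of the LUT EPROM."""
    lut = []
    for d_i in range(n_entries):
        if d_i == d_B:
            lut.append(h_B)  # removable singularity: the factor k below -> infinity
            continue
        try:
            # Cross-ratio of displacements equated to cross-ratio of heights,
            # solved for the unknown height h(i).
            k = ((d_i - d_A) / (d_i - d_B)) * ((d_D - d_B) / (d_D - d_A)) \
                * ((h_D - h_A) / (h_D - h_B))
            lut.append((h_A - k * h_B) / (1.0 - k))
        except ZeroDivisionError:
            lut.append(float("nan"))  # displacement maps to no finite height
    return lut

# Illustrative calibration: planes at 0, 10 and 40 mm imaged at pixels 0, 100 and 240.
lut = build_height_lut(h_A=0.0, h_B=10.0, h_D=40.0, d_A=0, d_B=100, d_D=240)
print(lut[0], lut[100], round(lut[240], 6))  # 0.0 10.0 40.0
```

Once tabulated, each measured displacement count d(i) simply indexes the table, which is what lets the hardware convert displacement to height in real time without computing equation [27] per pixel.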

Abstract

A range imaging system, and a method for calibrating such a system, are based on the principles of projective geometry. The system comprises four subsystems: (1) a laser and a cylindrical lens or vibrating mirror for producing a planar beam of light; (2) an electronic camera equipped with a lens and an appropriate interference filter; (3) an electronic circuit for height (depth) measurements and video image generation; and (4) a scanning mechanism for moving the object with respect to the light beam and the camera so as to scan an area of the object surface. The system is calibrated by determining the position in the electronic image of the object surface at three different heights. The range image is then generated from these three known heights, using either a previously determined look-up table or a calculation based on the invariance of the cross-ratio, a well known ratio from projective geometry.

Description

LASER RANGE IMAGING SYSTEM USING PROJECTIVE GEOMETRY
BACKGROUND OF THE INVENTION
The present invention relates to a system (both method and apparatus) for producing a "range image" of an area of the surface of an object, and a method for calibrating a system of this type. More particularly, the present invention relates to a range imaging system, and a method for calibrating the system, which are based on the principles of projective geometry.
Active, optical range imaging systems collect three dimensional coordinate data from visible object surfaces in a scene. These systems can be used in a wide variety of automation applications, including shape acquisition, bin picking, robotic assembly, inspection, gauging, mobile robot navigation, automated cartography, medical diagnosis (biostereometrics) and automated military applications. The range imaging sensors in such systems are unique imaging devices in that the image data points explicitly represent seen surface geometry as sampled surface points.
Range images are known by many other names depending on the context. For example, range images have been referred to, variously, as a range map, depth map, 3-D image, 2.5-D image, digital terrain map (DTM), topographic map, surface profile, XYZ point list, surface distance matrix, contour map, and surface height map. Many techniques are known in the art for obtaining range images, but most active optical techniques are based on one of the following five principles:
(1) Radar measures "time of flight" transmission time to and from an object surface. The transmitted energy may be electromagnetic radiation or sonic waves. The transmitted energy may be pulsed, a continuous wave or an amplitude or frequency modulated wave. (2) Triangulation measures two interior angles, angle AB and angle BC, and the baseline B of a triangle ABC, and then determines the lengths A and C from the viewing apparatus to the object surface. Basically, either the ambient light reflected from the object surface may be viewed from two angles, on opposite ends of the base line, or light may be projected onto the object's surface from one end of the base line and viewed or detected from the opposite end of the baseline. The light projected may be "structured" as a single point, straight line, multiple points, multiple lines, grid, circle or the like.
(3) The Moire method and holographic interferometry both use interference phenomena to determine the distance to an object surface. With Moire, one amplitude-modulated spatial signal (e.g. reflected light from a scene) is multiplied by another amplitude modulated spatial signal
(the viewing grating) to create an output signal with surface depth information encoded as a phase difference. In holographic interferometry, coherent light from two separate laser beams, focused at a common surface point, is added and the surface depth is encoded in the detected phase difference.
(4) Lens focusing determines the distance to a point on an object surface viewed through a lens that is in focus. Similarly, with Fresnel diffraction of coherent light passed through a diffraction grating, exact in-focus images of the grating are formed at regular, periodic distance intervals whereas the grating images are out of focus in a predictable manner between the end points of these intervals.
(5) Stadimetry determines the distance to an object surface from the size of the image in a viewed scene. This technique requires that the size of the object be recognized and compared to the size of the viewed image. All of the techniques heretofore known for producing range images require complex mechanical/optical/electronic means for implementation. Furthermore, the use of these techniques normally requires that the viewing geometry and dimensions, or in the case of stadimetry the object geometry and dimensions, be known in advance. Furthermore, complex calculations are normally required to produce the range image so that the range imaging apparatus is relatively costly.
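As a concrete illustration of principle (2), the triangle ABC can be solved with the law of sines. The sketch below is standard textbook geometry, not code from the patent; the function and variable names are our own.

```python
import math

def triangulation_range(baseline, angle_a, angle_b):
    """Perpendicular distance from the baseline to the object point,
    given the two interior angles (in radians) measured at the ends
    of a baseline of known length."""
    apex = math.pi - angle_a - angle_b                    # angle at the object point
    side = baseline * math.sin(angle_b) / math.sin(apex)  # law of sines
    return side * math.sin(angle_a)                       # drop a perpendicular

# Equilateral sanity check: 60-degree angles over a unit baseline.
print(round(triangulation_range(1.0, math.pi / 3, math.pi / 3), 6))  # 0.866025
```

Note how the result depends on knowing the baseline and both angles in advance, which is exactly the burden the cross-ratio approach of the present invention avoids.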
SUMMARY OF THE INVENTION
It is a principal object of the present invention to provide a range imaging system (both method and apparatus) , capable of producing a range image of an area of the surface of an object, which system is substantially simpler and less costly than range imaging systems known in the art.
It is a further object of the present invention to provide a range imaging system which is based on the principles of projective geometry.
It is a further object of the present invention to provide a range imaging system which uses the invariance of the so-called "cross-ratio" under central projection to determine the height of an object surface at a plurality of points.
It is a further object of the present invention to provide a simple method of calibration for a range imaging system.
It is a further object of the present invention to provide a range imaging system which is compatible with any video standard and may thus be interfaced with any video system or "frame-grabber" circuit board. It is a further object of the present invention to provide a range imaging system which makes digital range data available in a system memory, for access by a computer or parallel I/O.
It is a further object of the present invention to provide a range imaging system which is capable of operation in an asynchronous mode, wherein the image is constantly generated and updated, and in a synchronous or "trigger-and- freeze" mode in which the system operation is synchronized with, or triggered by, an external event.
These objects, as well as further objects which will become apparent from the discussion that follows, are achieved, according to the present invention, by:
(a) projecting a substantially planar light beam onto an object surface to illuminate the surface along a light stripe;
(b) viewing the object surface, as illuminated by the light beam, at an angle with respect to the plane of the light beam, and converting the viewed image into an image representation thereof which is divided into a plurality of picture scan elements;
(c) determining the relative positions, within the image representation, of the scan elements thereof corresponding to the light stripe on the object surface;
(d) determining the height of the object surface at a plurality of points along the light stripe; and
(e) moving the object with respect to the planar light beam so as to scan an area of the object surface with the light beam. The calibration method, according to the present invention, comprises the steps of:
(a) viewing a target object having a surface with at least three known heights h(A), h(B) and h(C) at different points A, B and C, and producing an electronic representation of the target object which is divided into a plurality of picture scan elements; and
(b) determining the relative positions, d(A) , d(B) and d(C) , within the electronic representation of the scan elements thereof, of the object surface at the known heights h(A) , h(B) and h(C) .
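Step (b) above can be sketched as a per-raster-line search for the stripe. The Python below is an illustrative stand-in for the hardware peak finder; the threshold value and the synthetic scan lines are assumptions, not figures from the patent.

```python
def stripe_displacement(scan_line, threshold=128):
    """Pixel displacement of the laser stripe in one raster line: the
    index of the brightest pixel, provided it rises above the threshold
    (a software stand-in for the start-pulse/peak-finder counter chain)."""
    peak = max(scan_line)
    if peak < threshold:
        return None  # no stripe visible on this line
    return scan_line.index(peak)

# Synthetic raster lines for two calibration planes (one bright stripe pixel each).
line_B = [0] * 60 + [255] + [0] * 195   # plane B imaged at pixel 60
line_D = [0] * 240 + [255] + [0] * 15   # plane D imaged at pixel 240
print(stripe_displacement(line_B), stripe_displacement(line_D))  # 60 240
```

Repeating this for every raster line of the target image yields the calibration constants d(A), d(B) and d(C) used in the height computation.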
For a full understanding of the present invention, reference should now be made to the following detailed description of the preferred embodiments of the invention and to the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1a is a representational diagram of a general camera model.
Figure 1b is a representational diagram of a pinhole camera model.
Figure 2 is a representational diagram showing three points under projection.
Figure 3 is a diagram giving the cross-ratio or anharmonic ratio definition. Figure 4 is a representational diagram illustrating the invariance of the cross-ratio under central projection.
Figure 5 is a representational diagram illustrating the use of the cross-ratio for height computation from one zero height [h(A) = 0] and two non-zero heights [h(B) and h(D)].
Figure 6 is a partly perspective, partly representational diagram of a laser range imaging system according to the preferred embodiment of the present invention.
Figure 7 is a block diagram showing the camera subsystem and the electronics subsystem in the laser range imaging system of Fig. 6.
Figure 8 is a representational diagram illustrating the operation of the laser range imaging system of Fig. 6.
Figure 9 is a block diagram of the RAM address generation and read/write control logic in the electronics subsystem of Fig. 7.
Figure 10 is a memory map diagram containing an example of the contents of the LUT EPROM in the electronics subsystem of Fig. 7.
Figure 11 is a perspective view of a simple calibration target which may be used in the calibration of the range imaging system of Fig. 6, wherein the height h(A) equals zero.
Figure 12 is a perspective view illustrating the correct placement of the calibration target of Fig. 11 on the scanning platform in the range imaging system of Fig. 6. Figure 13 is a representational diagram illustrating the use of the cross ratio for height computation from three non-zero heights h(A), h(B) and h(D).
Figure 14 is a perspective view of a simple, generalized calibration target which may be used in the calibration of the range imaging system of Fig. 6, wherein the height h(A) is non-zero.
Figure 15 is a perspective view illustrating the correct placement of the calibration target of Fig. 14 on the scanning platform in the range imaging system of Fig. 6.
Figure 16 is a perspective view of a simple calibration target which may be used in the calibration of the range imaging system of Fig. 6 to determine the height at each possible point.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
The range imaging system according to the present invention is based on the invariance of the cross-ratio, a well known ratio from projective geometry. The theoretical background of the invention will first be explained; the preferred embodiment of the invention will be described immediately thereafter.
A. THEORETICAL BACKGROUND
The "pinhole" model for a camera will be used without loss of generality in the following explanation. In particular, it will be assumed that the camera can be approximated by a pinhole and that the image is formed onto the camera's image plane under a central projection. Figure 1 shows a conventional lens camera in Fig. 1A and the pinhole camera in Fig. 1B. Just as the length of a line segment is the key to metric geometry, there is one fundamental concept of projective geometry in terms of which all projective properties of figures can be expressed. This concept - namely, the invariance of the so-called "cross-ratio" - forms the basis of the present invention.
If three points A, B, C lie on a straight line, a projection will in general change not only the distances AB and BC but also the ratio AB/BC. In fact, any three points A, B, C on a straight line l can always be coordinated with any three points A', B', C' on another line l' by two successive projections, as is illustrated in Figure 2. To do this, the line l' is rotated about the point C' until it assumes a position l'' parallel to l. Then line l is projected onto l'' by a projection parallel to the line joining C and C', thus defining three points A'', B'' and C'' (where C'' = C'). The lines joining A', A'' and B', B'' will intersect at a point O, which is chosen as the center of a second projection. Therefore, a parallel and a central projection can coordinate any three points on a line l with any three points on a line l'.
As has been just shown, no quantity that involves only three points on a line can be invariant under projection. The decisive discovery of projective geometry is the fact that if we have four points A, B, C, D on a straight line and project these onto A', B', C', D', as shown in Figure 3, then there is a quantity, called the "cross-ratio" of the four points, that remains invariant under projection. This is a mathematical property of a set of four points on a line that is not destroyed by projection and that can be recognized in any image of the line.
Cross-Ratio Definition and Proof of Invariance:
The cross-ratio is neither a length, nor a ratio of two lengths, but the ratio of two such ratios. Consider, for example, four points A, B, C, D on a line. Given the ratios CA/CB and DA/DB, their cross-ratio (ABCD) is given by:

(ABCD) = (CA/CB) / (DA/DB)   (Cross-ratio definition)   [1]
By definition, (ABCD) is the cross-ratio of the four points A, B, C, D taken in that order. It will now be proven that the cross-ratio of four points is invariant under projection.
Theorem (from Projective Geometry): If A, B, C, D and A', B', C', D' are corresponding points on two lines related by a projection, then:

(ABCD) = (CA/CB) / (DA/DB) = (C'A'/C'B') / (D'A'/D'B') = (A'B'C'D')   [2]
Proof: The area of the triangles shown in Figure 4 may be computed as follows:

area(OCA) = 1/2 * h * (CA) = 1/2 * (OA) * (OC) * sin(COA)   [2a]
area(OCB) = 1/2 * h * (CB) = 1/2 * (OB) * (OC) * sin(COB)   [2b]
area(ODA) = 1/2 * h * (DA) = 1/2 * (OA) * (OD) * sin(DOA)   [2c]
area(ODB) = 1/2 * h * (DB) = 1/2 * (OB) * (OD) * sin(DOB)   [2d]

Substituting [2a], [2b], [2c] and [2d] into equation [2] yields:

(ABCD) = [(CA)*(DB)] / [(CB)*(DA)]
       = [(OA)*(OC)*sin(COA) * (OB)*(OD)*sin(DOB)] / [(OB)*(OC)*sin(COB) * (OA)*(OD)*sin(DOA)]
       = [sin(COA) * sin(DOB)] / [sin(COB) * sin(DOA)]
Hence the cross-ratio of A, B, C, D depends only on the angles subtended at O by the segments joining A, B, C, D. Since these angles are the same for any four points A', B', C', D' into which A, B, C, D may be projected from O, it follows that the cross-ratio remains invariant under projection. QED.
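The invariance just proven is easy to check numerically. The following Python sketch (the points, the center of projection and the target line are invented for this example and are not from the specification) projects four collinear points through a center O onto a second, non-parallel line and compares the two cross-ratios:

```python
def cross_ratio(a, b, c, d):
    """Cross-ratio (ABCD) = (CA/CB) / (DA/DB) for scalar coordinates on a line."""
    return ((c - a) / (c - b)) / ((d - a) / (d - b))

def project_x(px, center, m, k):
    """Centrally project the point (px, 0) from `center` onto the line
    y = m*x + k, returning the image's x-coordinate (an affine coordinate
    along the target line, which is all the cross-ratio needs)."""
    cx, cy = center
    t = (cy - m * cx - k) / (cy + m * (px - cx))  # ray parameter at the line
    return cx + t * (px - cx)

# Four collinear points on the x-axis and a center of projection O above them.
A, B, C, D = 0.0, 1.0, 3.0, 7.0
O = (2.5, 5.0)
images = [project_x(p, O, 1.0, -3.0) for p in (A, B, C, D)]  # line y = x - 3

r_original = cross_ratio(A, B, C, D)
r_projected = cross_ratio(*images)
assert abs(r_original - r_projected) < 1e-9   # invariant: both equal 9/7
```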
B. USE OF CROSS-RATIO FOR REAL-TIME RANGE IMAGE GENERATION
Referring now to Figure 5, line l represents the cross-section of a laser plane of light normal to the horizontal plane. Line l' represents the cross-section of a camera's image plane. Point O represents the camera's center of lens. Point L represents the laser source. Considering now the four points on line l, point A is the intersection of the laser line with the horizontal reference plane and represents a point of minimum height visible by the camera. Point B represents an arbitrary point whose distance from the horizontal plane is known, or can be measured. Point C represents a point on a surface whose height is to be determined. Finally, point D represents a point of maximum height visible by the camera.
The heights of points A, B and D are known, or can be accurately physically measured. In this example, the height of point A is set equal to zero. The height of the arbitrary point C is the unknown. Points A', B', C', D' represent the images of the points A, B, C, D, respectively, on the camera's image plane under central projection from the camera's center of lens (i.e., point O).
In particular, the displacements (in pixels) of the points A', B', and D' are known or can be accurately measured during calibration. In this example, the displacement of point A' is set equal to zero. The displacement of point C' is accurately determined in real time. The unknown height of point C is computed by solving for CA in the cross-ratio equation:
(CA/CB) / (DA/DB) = (C'A'/C'B') / (D'A'/D'B')   [4]
Solving for CA yields the height of point C:
(CA) = (C'A') * [(CB)*(D'B')*(DA)] / [(C'B')*(DB)*(D'A')]   [5]
The following substitutions will now be made:

(CA) = h(C)   (height at point C)   [6]
(CB) = h(C)-h(B)   (height difference between points C and B)   [7]
(DA) = h(D)   (height at point D)   [8]
(DB) = h(D)-h(B)   (height difference between points D and B)   [9]
(C'A') = d(C)   (displacement of point C in the image)   [10]
(C'B') = d(C)-d(B)   (displacement difference between C and B)   [11]
(D'A') = d(D)   (displacement of point D in the image)   [12]
(D'B') = d(D)-d(B)   (displacement difference between D and B)   [13]
Substituting [6], [7], [8], [9], [10], [11], [12] and [13] into equation [5] yields:

h(C) = d(C) * ([h(C)-h(B)]*[d(D)-d(B)]*h(D)) / ([d(C)-d(B)]*[h(D)-h(B)]*d(D))   [14]

Solving for h(C) yields:

h(C) = d(C)*h(B)*h(D)*[d(D)-d(B)] / ([d(C)*h(D)*[d(D)-d(B)]] - [[d(C)-d(B)]*[h(D)-h(B)]*d(D)])   [15]
In equation [15] the unknown height of point C, i.e. h(C), is independent of the geometry (i.e. the base-line distance between camera and laser, laser angle, camera angle). Furthermore h(C) is independent of camera parameters (i.e. lens focal length, and the distance between the center of lens and the image plane). Instead, the unknown height h(C) is a function of one variable, d(C), and four constants: h(B), h(D), d(B), and d(D).
During calibration of the range imaging system, according to the present invention, the constants h(B), h(D), d(B), and d(D) are measured. During range image generation, the system measures d(C) in real time and computes h(C); i.e., the unknown height at point C.
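Equation [15] reduces to a one-line computation. The following Python sketch (function and variable names are assumptions, not from the specification; the calibration numbers are hypothetical) computes h(C) from the real-time displacement d(C) and the four calibration constants:

```python
def height_from_displacement(d_C, h_B, h_D, d_B, d_D):
    """Equation [15]: the unknown height h(C) from the measured stripe
    displacement d(C), given the calibration constants h(B), h(D), d(B), d(D).
    Assumes h(A) = d(A) = 0 (reference plane imaged at zero displacement)."""
    numerator = d_C * h_B * h_D * (d_D - d_B)
    denominator = d_C * h_D * (d_D - d_B) - (d_C - d_B) * (h_D - h_B) * d_D
    return numerator / denominator

# Sanity check: the two calibration points must map back to their own heights.
assert height_from_displacement(100, 50.0, 100.0, 100, 256) == 50.0
assert height_from_displacement(256, 50.0, 100.0, 100, 256) == 100.0
```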
C. THE LASER RANGE IMAGING SYSTEM
The laser range imaging system, according to the preferred embodiment of the present invention, comprises four subsystems: (1) a laser and a cylindrical lens or vibrating mirror for producing a planar beam of light; (2) an electronic camera equipped with a lens and an appropriate interference filter; (3) an electronic circuit for height (depth) measurements and video image generation; and (4) a scanning mechanism (for example, a linear or rotary conveyor, or a robot manipulator with the laser/camera apparatus attached to the robot wrist) . This range imaging system is illustrated generally in Figure 6.
The Laser Subsystem: In the best mode of practicing the invention a 5 mW Helium-Neon (HeNe) laser light source 20, emitting at 632.8 nanometers, is employed to produce a pencil light beam. A cylindrical lens 22, or an oscillating or rotating mirror, is preferably used to spread the laser beam into a plane of light 24 as indicated in Figure 6. The oscillating or rotating mirror, if used, is provided with means for oscillating or rotating the mirror surface. The cylindrical lens, if used, has a cylinder axis which intersects and is transverse to the pencil beam.
The Camera Subsystem: In the best mode the camera subsystem is comprised of a B/W CCD video camera 26 with gen-lock capability, a 16mm lens 28, and a 632.8 nm laser interference filter 30 mounted in front of the lens. The purpose of the interference filter is to block out ambient light and to allow only the laser light to be seen by the camera. The camera is oriented and mounted so that the image of the laser stripe 32, caused by the plane of light 24 incident onto the horizontal plane 34, is seen by the camera as a vertical straight line. The viewing axis of the camera is arranged at an angle in the range of 10° - 80° with respect to the plane of the light beam. This arrangement is shown in Figure 8.
The Electronics Subsystem: The best mode of the electronics subsystem is illustrated in detail in Figure 7. As is shown there, a master clock oscillator 36 drives a video sync generator 38 to generate the vertical and horizontal sync signals for the camera and a control logic. In addition, the master clock oscillator clocks a stripe displacement counter chain 40 to count the displacement, in pixels, of the light stripe from the edge of the scanned image.
For each horizontal raster line the horizontal sync pulse is delayed in a start pulse generator 42 by a (camera dependent) programmable amount of time. This delayed pulse labeled "start" enables the counter chain. Counting continues until a significant peak in the video signal is detected by a signal peak finder 44. When this occurs, counting stops.
Instead of a signal peak finder the electronic subsystem can be provided with a signal threshold device, such as a Schmitt Trigger, or a signal centroid determining device for determining the presence and center of the video pulse representing the light stripe. The threshold device can be made variable, if desired, to adjust to the best signal threshold level.
The counter now contains a number d(i) which is the number of pixels that the light stripe, as "seen" by the camera, is displaced in the camera's image plane at the point "i". This operation is illustrated in Figure 8.
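In software terms, the counter and stripe-detector stage amounts to scanning one raster line until the stripe pulse is found. This Python sketch (all names and pixel values are hypothetical; a simple threshold stands in for the peak finder or Schmitt trigger described above) returns the pixel displacement d(i) for one line:

```python
def stripe_displacement(scanline, threshold=128):
    """Count pixels from the start of one raster line until the video
    signal first crosses a threshold (a stand-in for the peak finder /
    Schmitt trigger described above). Returns None if no stripe is seen."""
    for d, value in enumerate(scanline):
        if value >= threshold:
            return d
    return None

# A synthetic 256-pixel raster line with a stripe pulse starting at pixel 38.
line = [0] * 37 + [90, 200, 250, 180, 60] + [0] * 214
assert stripe_displacement(line) == 38   # pixel 37 (value 90) is below threshold
```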
This number d(i) (for example, d(1), d(2), d(3) in Figure 8) is used as an address to the cross-ratio LUT (Look-Up Table) EPROM 46, which linearizes the space and converts displacement numbers into height (or distance). The generation of the LUT is described hereinbelow.
The data from the LUT is stored in a RAM 48 as a height number h(i). As is shown in Figure 8, d(1) becomes h(1), d(2) becomes h(2), and so on. During the horizontal sync pulse the RAM row address counter 50 (Fig. 9) is incremented by one. During a TV field time of 16.67 ms, a total of 240 points are computed (one for each horizontal raster line) and are stored in a 256x256x8 RAM as a column of consecutive height numbers. In other words, the system generates one cross-section of the surface under inspection every 16.67 ms.
Simultaneously, the data from the entire RAM bank is supplied to an 8-bit video D/A converter to generate a standard RS-170 or CCIR video signal. This signal can be displayed on a monitor or can be digitized by an external machine vision system. The video image is a picture of the RAM contents. The higher the height value at a point, the brighter the image on the video display at that point. In other words, the image intensity will be proportional to height. This range image generation process is illustrated in Figure 8.
As shown in Figure 9, two eight-bit binary counters 50 and 52 are used to generate the RAM's row and column addresses. The RAM bank always contains a window of the last 256 cross sections. Older cross sections are overwritten as new height data comes in. In other words, the RAM behaves as a non-recirculating, two-dimensional shift register. The system has two modes of operation. In the asynchronous mode, selected via a mode switch 56 shown in Figure 9, d(i) data is constantly written in RAM. In the synchronous mode, an external trigger signal allows only the next 256 columns of h(i) data to be written into the RAM. The RAM then becomes write-protected until the next external trigger pulse arrives. This mode allows synchronization to an external event and preservation of height data h(i) for further inspection and processing.
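The RAM bank's behavior as a non-recirculating, two-dimensional shift register can be modeled with a fixed-length queue. This sketch is illustrative only (the class name is invented and the sizes are shrunk for the example): the newest cross section always displaces the oldest.

```python
from collections import deque

class RangeImageRAM:
    """Model of the RAM bank: a sliding window over the last N columns
    of height data, with the oldest column overwritten by the newest."""
    def __init__(self, columns=256, rows=240):
        self.rows = rows
        self.bank = deque(maxlen=columns)   # old columns drop off the front

    def write_cross_section(self, heights):
        if len(heights) != self.rows:
            raise ValueError("one height per raster line expected")
        self.bank.append(list(heights))

    def image(self):
        """Current window as a list of columns, oldest first."""
        return list(self.bank)

ram = RangeImageRAM(columns=4, rows=3)      # tiny sizes for the example
for n in range(6):                          # write 6 cross sections
    ram.write_cross_section([n, n, n])
assert [col[0] for col in ram.image()] == [2, 3, 4, 5]   # first two overwritten
```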
The Scanning Subsystem: One best mode of the scanning subsystem according to the invention is shown in Figure 6. This subsystem comprises a scanning platform or conveyor 58 driven by a motor 60 and screw 62. However, this subsystem is highly user dependent. Preferably the laser range imaging system of the invention uses a fixed laser/camera geometry, where the laser and camera subsystems are mounted on a common support. To generate a range image, either the scene is linearly translated under a stationary laser/camera, or the laser/camera is linearly translated over a stationary scene. Rotational cross-sections can be easily generated by replacing the linear translator with a rotating mechanism. Dense or sparse cross-sections can be generated by varying the speed of the scanning mechanism. Generation of true aspect ratio images requires the use of a simple servo loop to control the velocity of the scanning mechanism.
D. CALIBRATION PROCEDURE AND LOOK-UP TABLE GENERATION
The Look-Up Table is stored in an Electrically Programmable Read Only Memory (EPROM) . The address space of the EPROM 46 is divided into two sections, low and high, as shown in Figure 10. The user selects a memory section via a mode switch, which is indicated in Figure 7. Only one section can be selected at any time. When the mode switch 64 is in the "CALIBRATE" position the low section is selected. When the mode switch is in the "RUN" position, the high section is selected.
The low section of the LUT EPROM is a linear, one-to-one mapping between the displacement values d(i) and the height values h(i); i.e., h(i) = d(i). The high section of the EPROM is a non-linear mapping between the displacement values d(i) and the height values h(i); i.e., the high section of the LUT EPROM contains solutions to equation [15]. Substituting d(i) for d(C) into equation [15], and assuming d(A) = h(A) = 0, h(i) becomes:

h(i) = d(i)*h(B)*h(D)*[d(D)-d(B)] / ([d(i)*h(D)*[d(D)-d(B)]] - [[d(i)-d(B)]*[h(D)-h(B)]*d(D)])   [16]
The calibration procedure consists of the following steps:
1. The electronics subsystem is switched into the "CALIBRATE" mode. This selects the low section of the LUT EPROM.
2. The laser is oriented perpendicular to the scanning platform (or conveyor) .
3. The camera is oriented so that the laser stripe appears as a straight vertical line on the leftmost side of the monitor. More specifically, in this step the camera is aligned so that zero displacements d(i)=0 correspond to zero heights h(i)=0; i.e., the intensity of the generated range image is zero (i.e. a black picture).

4. A two-step, parallel-plane calibration target is constructed as shown in Figure 11. The top plane and the lower plane of the target are designated as D and B, respectively.
The lengths h(D) and h(B) in some units of length (e.g. millimeters) are then measured as shown in Figure 11 and stored in a computer. It is recommended that the target be constructed so that h(D) covers the camera's field of view, that is, so that the laser stripe on the top plane D will be imaged onto the right-most pixels in the camera's image plane. The length h(B) should preferably be about half the size of h(D).
5. The target is placed on the scanning platform (or conveyor) as illustrated in Figure 12. A complete range image is then generated (requiring about 4.27 seconds). This image is digitized using a commercially available image digitizer or "frame-grabber" board.
6. The displacement of any point lying on the top plane D of the target is determined, and this value is saved in the low section of the LUT EPROM as the value of d(D) . The displacement of any point lying on the lower plane B of the target is then determined and this value is saved as the value of d(B) . Assuming an 8-bit digitizer, the maximum value of d(D) will be 256.
7. The following Look-Up Table generator program is then run on the computer to generate the high section of the LUT EPROM contents:

BEGIN                  /* Begin the Look-Up Table generation program */
h(D) = Measured physical height of plane D.   /* from step 4 */
h(B) = Measured physical height of plane B.   /* from step 4 */
d(D) = Measured displacement of plane D.      /* from step 6 */
d(B) = Measured displacement of plane B.      /* from step 6 */
d(i) = 1               /* Initialize d(i). Calculate h(i) using Eq. [16] */
10 h(i) = d(i)*h(B)*h(D)*[d(D)-d(B)] /
          ([d(i)*h(D)*[d(D)-d(B)]] - [[d(i)-d(B)]*[h(D)-h(B)]*d(D)])
PRINT "EPROM ADDRESS" = d(i), "EPROM DATA" = h(i)   /* Print */
d(i) = d(i) + 1        /* Increment the calculation loop counter */
IF d(i) < 257 THEN     /* Check if all 256 entries computed */
GOTO 10                /* If not, go back and calculate another entry */
ELSE                   /* Otherwise */
STOP                   /* Stop, and exit from the loop */
END                    /* End the Look-Up Table generation program */
The above program prints a list of EPROM addresses and the corresponding EPROM data at each address . This list is passed to a commercially available EPROM Programmer Device to program the high section of the LUT EPROM as is illustrated in Figure 10.
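The same generator can be expressed directly in Python. This sketch (the calibration values below are hypothetical, and the function name is an invention for the example) produces the (address, data) pairs for the high section of the LUT EPROM from equation [16]:

```python
def generate_lut(h_B, h_D, d_B, d_D, entries=256):
    """Tabulate equation [16] for d(i) = 1..entries, with d(A) = h(A) = 0."""
    lut = {}
    for d_i in range(1, entries + 1):
        numerator = d_i * h_B * h_D * (d_D - d_B)
        denominator = d_i * h_D * (d_D - d_B) - (d_i - d_B) * (h_D - h_B) * d_D
        lut[d_i] = numerator / denominator
    return lut

# Hypothetical calibration: h(B)=50 mm imaged at pixel 100, h(D)=100 mm at 256.
lut = generate_lut(h_B=50.0, h_D=100.0, d_B=100, d_D=256)
for address in (1, 100, 256):
    print("EPROM ADDRESS =", address, "EPROM DATA =", round(lut[address], 2))
assert lut[100] == 50.0 and lut[256] == 100.0   # calibration points recovered
```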
E . GENERALIZED SOLUTION
It will be understood that the cross-ratio equations [15] and [16] employed with the laser range imaging system described above assume that the lowermost height h(A) corresponds to the minimum intensity (0, or a black image) and that the displacement at this height, d(A), is also zero. This simplification is not necessary, of course, for operation of the range imaging system. Set forth below is a generalized solution of the cross-ratio formula which includes the variables h(A) and d(A).
Referring to Figure 13, equation [5] is repeated here for completeness:

(CA) = (C'A') * [(CB)*(D'B')*(DA)] / [(C'B')*(DB)*(D'A')]   [5]
The following substitutions will now be made:

(CA) = h(C)-h(A)   (Height difference between points C, A)   [17]
(CB) = h(C)-h(B)   (Height difference between points C, B)   [18]
(DA) = h(D)-h(A)   (Height difference between points D, A)   [19]
(DB) = h(D)-h(B)   (Height difference between points D, B)   [20]
(C'A') = d(C)-d(A)   (Displacement difference between C, A in the image)   [21]
(C'B') = d(C)-d(B)   (Displacement difference between C, B in the image)   [22]
(D'A') = d(D)-d(A)   (Displacement difference between D, A in the image)   [23]
(D'B') = d(D)-d(B)   (Displacement difference between D, B in the image)   [24]
Substituting [17], [18], [19], [20], [21], [22], [23], and [24] into equation [5] yields:

                              [h(C)-h(B)] * [d(D)-d(B)] * [h(D)-h(A)]
[h(C)-h(A)] = [d(C)-d(A)] * -------------------------------------------   [25]
                              [d(C)-d(B)] * [h(D)-h(B)] * [d(D)-d(A)]

Solving for h(C) (the height at point C) yields:

               h(B) * [d(C)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
       h(A) -  ----------------------------------------------
               [d(C)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
h(C) = -------------------------------------------------------   [26]
                [d(C)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
           1 -  ----------------------------------------
                [d(C)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
Equation [26] solves for the unknown height h(C) in the general case (i.e. when h(A) and d(A) are not equal to zero). Substituting d(i) for d(C) into equation [26] yields:

               h(B) * [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
       h(A) -  ----------------------------------------------
               [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
h(i) = -------------------------------------------------------   [27]
                [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
           1 -  ----------------------------------------
                [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
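Equation [27] can be cross-multiplied into a single quotient, which is algebraically equivalent under the definitions above and avoids the division by zero the nested fractions would hit at d(i) = d(B). This Python sketch (the function name and the calibration values are invented for the example) implements that form:

```python
def height_general(d_i, h_A, h_B, h_D, d_A, d_B, d_D):
    """Equation [27] in cross-multiplied form: the unknown height h(i)
    when h(A) and d(A) are not necessarily zero."""
    p = (d_i - d_B) * (d_D - d_A) * (h_D - h_B)   # denominator of inner ratio
    q = (d_i - d_A) * (d_D - d_B) * (h_D - h_A)   # numerator of inner ratio
    return (h_A * p - h_B * q) / (p - q)

# The three calibration planes must each map back to their own height.
planes = dict(h_A=10.0, h_B=50.0, h_D=100.0, d_A=20, d_B=100, d_D=256)
assert height_general(20, **planes) == 10.0
assert height_general(100, **planes) == 50.0
assert height_general(256, **planes) == 100.0
```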
A range imaging system which implements this equation is identical to that shown in Fig. 6. It is not necessary, in this case, to align the camera in such a way that the images of points at height h(A) fall on the first scan line.
Calibration of this system follows the same method set forth in Section D above. In this case, however, a three-step, parallel-plane calibration target is constructed as shown in Figure 14. The lower plane, middle plane and top plane of the target are designated as A, B and D, respectively. The heights h(A), h(B) and h(D) are measured and stored in the computer memory.
The target is then placed on the scanning platform (or conveyor) as illustrated in Figure 15. A complete range image is generated and the displacements of points lying on the planes A, B and D, respectively, are saved in the low section of the LUT EPROM as the values of d(A), d(B) and d(D), respectively.
Thereafter, the following look-up table generator program is run to generate the high section of the LUT EPROM contents:
BEGIN
h(D) = Measured Physical Height of Plane D.
h(B) = Measured Physical Height of Plane B.
h(A) = Measured Physical Height of Plane A.
d(D) = Measured Displacement of Plane D.
d(B) = Measured Displacement of Plane B.
d(A) = Measured Displacement of Plane A.
d(i) = 1

                  h(B) * [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
          h(A) -  ----------------------------------------------
                  [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
10 h(i) = -------------------------------------------------------
                   [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
              1 -  ----------------------------------------
                   [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]

PRINT "EPROM ADDRESS" = d(i), "EPROM DATA" = h(i)
d(i) = d(i) + 1
IF d(i) < 257 THEN
GOTO 10
ELSE
STOP
END
It will be understood that the height h(i) at successive points along a light stripe may be determined in various alternative ways with the range imaging system according to the present invention. For example, instead of a look-up table (LUT), a microcomputer can be provided to perform the calculation for each h(i), in accordance with equation [16] or [27], in dependence upon the values of h(A), h(B), h(D), d(A), d(B) and d(D) determined during calibration and stored in RAM.
Similarly, it is possible to store in a look-up table all of the possible values of h(i) and d(i) which are determined experimentally during an initial calibration. In this case, calibration of the system follows the same procedures as are outlined above, except that a "wedge" shaped calibration target is used as is illustrated in Figure 16. With this type of target the height h(i) increases linearly from the minimum height h(A), which can be set equal to zero, to the maximum height h(D), which can be set equal to 256. The displacements d(A), d(2), d(3) ... d(i) ... d(D) are then measured during calibration and stored in the LUT EPROM for use by the system during operation.
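A minimal sketch of that experimental approach (all numbers are hypothetical; the function name is invented): scan the wedge, record each observed displacement together with the known height that produced it, and index the resulting table directly at run time, with no cross-ratio formula needed:

```python
def build_wedge_lut(calibration_scan):
    """Map each displacement observed on the wedge target to the known
    height that produced it; run-time lookup then needs no computation."""
    return {d: h for h, d in calibration_scan}

# Hypothetical wedge scan: height rises linearly from h(A)=0 to h(D)=256,
# and the measured displacement grows monotonically with height.
scan = [(h, int(1.2 * h) + 3) for h in range(0, 257, 8)]
lut = build_wedge_lut(scan)
assert lut[3] == 0                       # displacement of the lowest point
assert lut[int(1.2 * 256) + 3] == 256    # displacement of the highest point
```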
There has thus been shown and described a novel laser range imaging system based on projective geometry which fulfills all the objects and advantages sought therefor. Many changes, modifications, variations and other uses and applications of the subject invention will, however, become apparent to those skilled in the art after considering this specification and the accompanying drawings which disclose the preferred embodiments thereof. All such changes, modifications, variations and other uses and applications which do not depart from the spirit and scope of the invention are deemed to be covered by the invention which is limited only by the claims which follow.

Claims

1. Apparatus for producing a range image of an area of the surface of an object, said apparatus comprising, in combination:
(a) light source means for projecting a substantially planar light beam onto said object surface to illuminate said surface along a light stripe;
(b) electronic camera means, arranged to view said object surface, as illuminated by said light beam, at an angle with respect to the plane of said light beam, for converting the viewed image into an electronic representation thereof which is divided into a plurality of picture scan elements;
(c) electronic signal processing means, coupled to said electronic camera means, comprising:
(1) means, responsive to said electronic representation of said image, for determining the relative position, within said electronic representation, of the scan elements thereof corresponding to said light stripe on said object surface;
(2) table look-up means responsive to said position determining means for determining the height of said object surface at a plurality of points along said light stripe; and
(3) means responsive to said height determining means for storing each height in association with its respective point on said object surface; and
(d) means for moving at least one of (i) said object with respect to said light source means and said camera means and (ii) said light source means and said camera means with respect to said object, thereby to scan an area of said object surface with said planar light beam.
2. The apparatus defined in claim 1, wherein said light source means comprises:
(1) means for producing a pencil beam of light; and
(2) means for scanning said pencil beam in a substantially linear direction.
3. The apparatus defined in claim 2, wherein said scanning means includes a mirror having a mirror surface arranged in the path of said pencil beam, and means for moving said mirror to adjust the angle of incidence of said pencil beam with respect to said mirror surface.
4. The apparatus defined in claim 1, wherein said light source means comprises:

(1) means for producing a pencil beam of light; and
(2) means for spreading said pencil beam in a substantially linear direction.
5. The apparatus defined in claim 4, wherein said spreading means includes a cylindrical lens arranged in the path of said pencil beam.
6. The apparatus defined in claim 5, wherein said cylindrical lens has a cylinder axis which intersects and is transverse to said pencil beam.
7. The apparatus defined in claim 2, wherein said means for producing said pencil beam includes a laser.
8. The apparatus defined in claim 4, wherein said means for producing said pencil beam includes a laser.
9. The apparatus defined in claim 1, wherein said light source means projects said light beam in a direction generally perpendicular to said object surface.
10. The apparatus defined in claim 9, wherein said object surface has a minimum height which is generally planar, and wherein the plane of said light beam is substantially perpendicular to the minimum height plane of said object surface.
11. The apparatus defined in claim 1, wherein said electronic camera means comprises video camera means for scanning the viewed image and producing a video signal representing a plurality of horizontal scan lines.
12. The apparatus defined in claim 11, wherein said scan lines form said scan elements of said electronic representation.
13. The apparatus defined in claim 11, wherein said video camera is oriented such that said scan lines are substantially perpendicular to the image of said light stripe.
14. The apparatus defined in claim 1, wherein said camera means has a viewing axis which is arranged at an angle in the range of 10°- 80° with respect to the plane of said light beam.
15. The apparatus defined in claim 1, wherein said position determining means includes means for counting the number of scan elements in said electronic representation of said image from the edge of said image to said scan elements corresponding to the image of said light stripe.
16. The apparatus defined in claim 15, wherein said position determining means further includes means for determining the presence of said light stripe image in said electronic representation of said viewed image.
17. The apparatus defined in claim 16, wherein said light stripe presence determining means include signal peak determining means for determining the peak of an electronic representation of said light stripe image.
18. The apparatus defined in claim 16, wherein said light stripe presence determining means include signal threshold means for determining when said electronic representation of said light stripe image exceeds a threshold.
19. The apparatus defined in claim 18, wherein said signal threshold means includes means for varying said threshold.
20. The apparatus defined in claim 16, wherein said light stripe presence determining means include signal centroid determining means for determining the centroid of an electronic representation of said light stripe image.
21. The apparatus defined in claim 15, wherein said counting means include means for generating clock pulses; a pulse counter, responsive to said pulse generator means, for counting said clock pulses during a prescribed period of time; sync generator means, coupled to said pulse generator means, to said camera means and to said pulse counter, for enabling said pulse counter at the beginning of a scan of said camera means; and light stripe detection means, responsive to said electronic representation of said image, for stopping said pulse counter when said light stripe is scanned by said camera means.
22. The apparatus defined in claim 1, wherein said table look-up means includes a table containing the height associated with each possible relative position.
23. The apparatus defined in claim 1, wherein said height determining means include means for storing the known, calibrated height of at least two points (B, D) of different height.
24. The apparatus defined in claim 23, wherein said height determining means include means for storing the known, calibrated height of at least three points (A,B,D) of different height.
25. The apparatus defined in claim 24, wherein points A and D are at substantially minimum and maximum viewing height, respectively, of said camera means, and point B is substantially mid-height between points A and D.
26. Apparatus for producing a range image of an area of the surface of an object, said apparatus comprising, in combination:
(a) light source means for projecting a substantially planar light beam onto said object surface to illuminate said surface along a light stripe;
(b) electronic camera means, arranged to view said object surface, as illuminated by said light beam, at an angle with respect to the plane of said light beam, for converting the viewed image into an electronic representation thereof which is divided into a plurality of picture scan elements;
(c) electronic signal processing means, coupled to said electronic camera means, comprising:
(1) means, responsive to said electronic representation of said image, for determining the relative position, within said electronic representation, of the scan elements thereof corresponding to said light stripe on said object surface;
(2) means responsive to said position determining means for determining the height of said object surface at a plurality of points along said light stripe, said height determining means including means for determining the height (h) of said object surface at a plurality of points (1, 2, 3...i...N) along said light stripe in dependence upon the heights (h(A), h(B), h(D)) of three known points (A, B, D) and the cross ratio (d(A), d(B), d(i), d(D)) with respect to point i, where "d" is the position of the respective point;
(3) means responsive to said height determining means for storing each height in association with its respective point on said object surface; and
(d) means for moving said object with respect to said light source means and said camera means so as to scan an area of said object surface with said planar light beam.
27. The apparatus defined in claim 26, wherein said heights of said three known points (A, B, D) are, respectively, at substantially minimum height viewable by said camera means, at substantially intermediate height between said minimum and maximum heights viewable by said camera means, and at substantially maximum height viewable by said camera means.
28. The apparatus defined in claim 26, wherein said height h(i) is determined in accordance with the formula:

h(i) = d(i)*h(B)*h(D)*[d(D)-d(B)] / ([d(i)*h(D)*[d(D)-d(B)]] - [[d(i)-d(B)]*[h(D)-h(B)]*d(D)])   [16]

where d(A) = h(A) = 0.
29. The apparatus defined in claim 28, wherein the height h(i) associated with each point (i) is stored in a look-up table in association with the number of scan elements d(i) associated with that point.
30. The apparatus defined in claim 1, wherein said light source means and said camera means are immovably mounted with respect to each other, and wherein said moving means includes conveyor means for moving said object past said planar light beam.
31. The apparatus defined in claim 1, wherein said light source means and said camera means are mounted on a common support, and wherein said moving means include means for moving said common support past said object so as to scan an area of said object surface with said planar light beam.
32. The apparatus defined in claim 26, wherein said height h(i) is determined in accordance with the formula:

               h(B) * [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
       h(A) -  ----------------------------------------------
               [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
h(i) = -------------------------------------------------------   [27]
                [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
           1 -  ----------------------------------------
                [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
33. The apparatus defined in claim 26, wherein said height determining means includes:
(i) means for precalculating the height h(i) at a plurality of points, with each point i associated with a different scan element d(i) of said electronic representation;
(ii) means for storing said precalculated heights h(i) in association with their respective scan elements d(i); and
(iii) means, coupled to said position determining means and responsive to the scan element position d(i) corresponding to said light stripe, for retrieving from said storage means the precalculated height h(i) of said light stripe at point i.
34. The apparatus defined in claim 33, wherein said precalculating means calculates the height h(i) at each point i in accordance with the formula:
                 h(B) * [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
        h(A)  -  ----------------------------------------------
                 [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
h(i) = ---------------------------------------------------------
                 [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
        1     -  ----------------------------------------------
                 [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
35. The apparatus defined in claim 34, wherein d(A) = h(A) = 0.
36. The apparatus defined in claim 26, wherein said height determining means includes means, coupled to said position determining means and responsive to the scan element position d(i) corresponding to said light stripe, for calculating the height h(i) of said light stripe at point i.
37. The apparatus defined in claim 36, wherein said calculating means calculates the height h(i) at point i in accordance with the formula
                 h(B) * [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
        h(A)  -  ----------------------------------------------
                 [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
h(i) = ---------------------------------------------------------
                 [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
        1     -  ----------------------------------------------
                 [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
38. The apparatus defined in claim 37, wherein d(A) = h(A) = 0.
39. The apparatus defined in claim 26, wherein d(i) is the number of scan elements from the edge of said electronic representation of the respective point i.
40. The apparatus defined in claim 26, further comprising target means, positionable as said object, for calibrating said apparatus, said target means having at least two known heights h(B) and h(C).
41. The apparatus defined in claim 40, wherein h(A) = 0 and h(B) and h(C) are at substantially mid and maximum viewing heights, respectively, of said camera means.
42. The apparatus defined in claim 40, wherein said target means has three known heights h(A), h(B) and h(C).
43. The apparatus defined in claim 42, wherein h(A), h(B) and h(C) are at substantially minimum, mid and maximum heights, respectively, of said camera means.
44. The apparatus defined in claim 22, further comprising target means, positionable as said object, for calibrating said apparatus, said target means having a different, known height associated with each possible relative position.
45. The apparatus defined in claim 44, wherein said target means is in the shape of a wedge.
46. A method of calibrating apparatus for producing a range image of an area of the surface of an object, said method comprising the steps of:
(a) viewing a target object having a surface with at least three known heights h(A), h(B) and h(D) at different points A, B and D along a light stripe, and producing an electronic representation of said target object along said light stripe which is divided into a plurality of picture scan elements; and
(b) determining the relative positions, d(A), d(B) and d(D), within said electronic representation of the scan elements thereof, of said target object surface at said known heights h(A), h(B) and h(D).
47. The method defined in claim 46, wherein the height h(A) of point A equals 0, and wherein said target object is viewed such that said relative position d(A) equals 0.
48. The method defined in claim 46, further comprising the steps of determining the height h(i) of a point i at each possible relative position d(i), and storing said heights in a look-up table.
49. The method defined in claim 48, wherein said heights h(i) are determined in accordance with the formula:

                 h(B) * [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
        h(A)  -  ----------------------------------------------
                 [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
h(i) = ---------------------------------------------------------   [27]
                 [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
        1     -  ----------------------------------------------
                 [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
50. The method defined in claim 49, wherein the height h(A) of point A equals 0, and wherein said target object is viewed such that said relative position d(A) equals 0.
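The calibration method of claims 46-50 can be sketched as a single routine (a sketch only, with invented names): given the scan-element positions observed for the known target heights, fill a look-up table with h(i) for every possible d(i). This sketch assumes the simplification of claims 47 and 50, d(A) = h(A) = 0:

```python
def calibrate(d_B, d_D, h_B, h_D, n_elements):
    """Build the claim-48 look-up table via formula [27] with
    d(A) = h(A) = 0.  d_B, d_D: observed stripe positions for the
    target heights h_B, h_D; n_elements: scan elements per line."""
    lut = []
    for d_i in range(n_elements):
        if d_i == d_B:
            # formula [27] is singular at d(i) = d(B); its limit is h(B)
            lut.append(h_B)
            continue
        k = (d_i * (d_D - d_B) * h_D) / ((d_i - d_B) * d_D * (h_D - h_B))
        lut.append((0.0 - k * h_B) / (1.0 - k))
    return lut
```

For example, with a wedge target viewed so that d(B)=100, d(D)=200 correspond to h(B)=5, h(D)=10, the table pins the calibrated heights exactly at entries 0, 100 and 200 and interpolates projectively in between.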
51. A method for producing a range image of an area of the surface of an object, said method comprising the steps of:
(a) projecting a substantially planar light beam onto said object surface to illuminate said surface along a light stripe;
(b) viewing said object surface, as illuminated by said light beam, at an angle with respect to the plane of said light beam; and converting the viewed image into an image representation thereof which is divided into a plurality of picture scan elements;
(c) determining the relative positions, within said representation, of the scan elements thereof corresponding to said light stripe on said object surface;
(d) determining the height of said object surface at a plurality of points along said light stripe; and (e) moving said object with respect to said planar light beam so as to scan an area of said object surface with said light beam.
52. The method defined in claim 51, wherein said height determining step includes the steps of:
(1) producing a look-up table containing the height associated with each possible relative position, within said image representation, of the scan elements thereof; and
(2) referring to said look-up table to determine the height of said object surface at a plurality of points along said light stripe in dependence upon the relative position of the scan elements corresponding to said light stripe on said object surface.
53. The method defined in claim 51, wherein said height determining step comprises the steps of:
(1) determining the height of at least three points (A,B,D) of different height within the field of view of said object surface, as illuminated by said light beam; and
(2) calculating from said known heights and from the relative positions, within said image representation, of the scan elements thereof corresponding to said light stripe on said object surface, the height of said object surface at a plurality of points along said light stripe.
54. The method defined in claim 53, wherein said height h(i) at each point (i) is calculated according to the formula:

                 h(B) * [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
        h(A)  -  ----------------------------------------------
                 [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]
h(i) = ---------------------------------------------------------   [27]
                 [d(i)-d(A)] * [d(D)-d(B)] * [h(D)-h(A)]
        1     -  ----------------------------------------------
                 [d(i)-d(B)] * [d(D)-d(A)] * [h(D)-h(B)]

where d(i) is the number of scan elements from the edge of the image representation to each respective point i.
55. The method defined in claim 54, wherein said viewing step is carried out such that d(A) = h(A) = 0 in the viewed image.
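The method steps of claims 51-52 can be illustrated as a toy software loop (hypothetical helper names; the patent describes hardware processing): for each relative object position, find the stripe in the scan line, convert its scan-element position to a height via the claim-52 look-up table, and accumulate one profile sample per scan position:

```python
def range_image(scanlines, lut, threshold):
    """One height per scan position: locate the stripe (step c),
    look up its height (step d / claim 52), repeat over the scan
    (step e).  scanlines: per-position pixel intensities."""
    image = []
    for line in scanlines:
        # step (c): relative position of the stripe's scan element
        d = next((i for i, v in enumerate(line) if v > threshold), None)
        # step (d): height from the precomputed look-up table
        image.append(None if d is None else lut[d])
    return image
```

In a real scan each entry would be a full row of heights (one per point along the stripe); a single value per line is used here only to keep the sketch short.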
56. The apparatus defined in claim 6, wherein said camera means has a viewing axis which is arranged at an angle of 10° - 80° with respect to the plane of said light beam.
57. The apparatus defined in claim 56, wherein said cylindrical lens has an optical axis arranged in line with said pencil beam, and wherein said viewing axis of said camera is perpendicular to said optical axis.
58. The apparatus defined in claim 18, wherein said signal threshold means is a Schmitt Trigger circuit.
59. The apparatus defined in claim 1, wherein said signal processing means is continuously operative to repeatedly process and store successive range images of the object surface viewed by said camera means.
60. The apparatus defined in claim 1, wherein said signal processing means is responsive to an externally generated trigger signal to process and store a single range image of the object surface viewed by said camera means.
61. The apparatus defined in claim 1, wherein said signal processing means include a mode switch which is switchable into asynchronous mode and synchronous mode positions, respectively: wherein said signal processing means is continuously operative, when said mode switch is placed in the asynchronous mode position, to repeatedly process and store successive range images of the object surface viewed by said camera means; and wherein said signal processing means is operative, in response to the setting of said mode switch in the synchronous mode position and to an externally generated trigger signal, to process and store a single range image of the object surface viewed by said camera means.
62. The apparatus defined in claim 1, wherein said moving means includes means for varying the speed with which said object is moved with respect to said light source means and said camera means, thereby to vary the pixel density of the scanned image in the direction of scan.
63. The apparatus defined in claim 33, wherein said storage means is an electrically programmable read only memory (EPROM) .
64. The method defined in claim 49, further comprising the steps of determining the height h(i) of a point i at each possible relative position d(i), and storing said heights in a look-up table.
65. The method defined in claim 64, wherein said look-up table is an electrically programmable read-only memory (EPROM).
66. The apparatus defined in claim 26, wherein said light source means comprises: (1) means for producing a pencil beam of light; and
(2) means for scanning said pencil beam in a substantially linear direction.
67. The apparatus defined in claim 66, wherein said scanning means includes a mirror having a mirror surface arranged in the path of said pencil beam, and means for moving said mirror to adjust the angle of incidence of said pencil beam with respect to said mirror surface.
68. The apparatus defined in claim 26, wherein said light source means comprises:
(1) means for producing a pencil beam of light; and
(2) means for spreading said pencil beam in a substantially linear direction.
69. The apparatus defined in claim 68, wherein said spreading means includes a cylindrical lens arranged in the path of said pencil beam.
70. The apparatus defined in claim 69, wherein said cylindrical lens has a cylinder axis which intersects and is transverse to said pencil beam.
71. The apparatus defined in claim 66, wherein said means for producing said pencil beam includes a laser.
72. The apparatus defined in claim 68, wherein said means for producing said pencil beam includes a laser.
73. The apparatus defined in claim 26, wherein said light source means projects said light beam in a direction generally perpendicular to said object surface.
74. The apparatus defined in claim 73, wherein said object surface has a minimum height which is generally planar, and wherein the plane of said light beam is substantially perpendicular to the minimum height plane of said object surface.
75. The apparatus defined in claim 26, wherein said electronic camera means comprises video camera means for scanning the viewed image and producing a video signal representing a plurality of horizontal scan lines.
76. The apparatus defined in claim 75, wherein said scan lines form said scan elements of said electronic representation.
77. The apparatus defined in claim 75, wherein said video camera is oriented such that said scan lines are substantially perpendicular to the image of said light stripe.
78. The apparatus defined in claim 26, wherein the viewing axis of said camera means is arranged at an angle in the range of 10° - 80° with respect to the plane of said light beam.
79. The apparatus defined in claim 26, wherein said position determining means includes means for counting the number of scan elements in said electronic representation of said image from the edge of said image to said scan elements corresponding to the image of said light stripe.
80. The apparatus defined in claim 79 wherein said position determining means further includes means for determining the presence of said light stripe image in said electronic representation of said viewed image.
81. The apparatus defined in claim 80, wherein said light stripe presence determining means include signal peak determining means for determining the peak of an electronic representation of said light stripe image.
82. The apparatus defined in claim 80, wherein said light stripe presence determining means include signal threshold means for determining when said electronic representation of said light stripe image exceeds a threshold.
83. The apparatus defined in claim 82, wherein said signal threshold means includes means for varying said threshold.
84. The apparatus defined in claim 80, wherein said light stripe presence determining means include signal centroid determining means for determining the centroid of an electronic representation of said light stripe image.
85. The apparatus defined in claim 79, wherein said counting means include means for generating clock pulses; a pulse counter, responsive to said pulse generator means, for counting said clock pulses during a prescribed period of time; sync generator means, coupled to said pulse generator means, to said camera means and to said pulse counter, for enabling said pulse counter at the beginning of a scan of said camera means; and light stripe detection means, responsive to said electronic representation of said image, for stopping said pulse counter when said light stripe is scanned by said camera means.
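The position-determining variants of claims 79-84 (count scan elements from the image edge; detect the stripe by threshold or by intensity centroid) can be sketched in software terms, with the caveat that claim 85 realizes the counting in hardware (clock pulses, sync generator, pulse counter). All names below are invented for the sketch:

```python
def stripe_position_threshold(scanline, threshold):
    """Claims 79/82: number of scan elements from the edge to the
    first element whose intensity exceeds the threshold."""
    for d, value in enumerate(scanline):
        if value > threshold:
            return d
    return None  # claim 80: stripe absent from this line


def stripe_position_centroid(scanline):
    """Claim 84: intensity-weighted centroid of the stripe image,
    giving sub-element position resolution."""
    total = sum(scanline)
    if total == 0:
        return None
    return sum(d * v for d, v in enumerate(scanline)) / total
```

The threshold form mirrors the claim-85 counter, which stops on stripe detection; the centroid form trades that simplicity for finer positional resolution on a stripe several elements wide.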
86. The apparatus defined in claim 26, wherein said height determining means include means for storing the known, calibrated height of at least two points (B, D) of different height.
87. The apparatus defined in claim 86, wherein said height determining means include means for storing the known, calibrated height of at least three points (A,B,D) of different height.
88. The apparatus defined in claim 87, wherein points A and D are at substantially minimum and maximum viewing height, respectively, of said camera means, and point B is substantially mid-height between points A and D.
89. The apparatus defined in claim 26, wherein said light source means and said camera means are immovably mounted with respect to each other, and wherein said moving means includes conveyor means for moving said object past said planar light beam.
90. The apparatus defined in claim 26, wherein said light source means and said camera means are mounted on a common support, and wherein said moving means include means for moving said common support past said object so as to scan an area of said object surface with said planar light beam.
PCT/US1990/000832 1989-02-17 1990-02-15 Laser range imaging system using projective geometry WO1990009561A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US07/312,635 US4979815A (en) 1989-02-17 1989-02-17 Laser range imaging system based on projective geometry
US312,635 1989-02-17

Publications (2)

Publication Number Publication Date
WO1990009561A2 true WO1990009561A2 (en) 1990-08-23
WO1990009561A3 WO1990009561A3 (en) 1990-10-04

Family

ID=23212346

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1990/000832 WO1990009561A2 (en) 1989-02-17 1990-02-15 Laser range imaging system using projective geometry

Country Status (3)

Country Link
US (1) US4979815A (en)
IL (1) IL93292A0 (en)
WO (1) WO1990009561A2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0489641A1 (en) * 1990-12-05 1992-06-10 Commissariat A L'energie Atomique Calibration method for a tri-dimensional metrology system
FR2685764A1 (en) * 1991-12-30 1993-07-02 Kreon Ind COMPACT AND HIGH RESOLUTION OPTICAL SENSOR FOR THREE DIMENSIONAL SHAPE ANALYSIS.
WO2014170607A1 (en) * 2013-04-17 2014-10-23 R & D Vision Laser sheet localised imaging and profilometry system
EP2085744A4 (en) * 2006-12-25 2015-06-03 Nec Corp Distance measuring device, method, and program
JP2017067753A (en) * 2015-09-29 2017-04-06 株式会社東京精密 Device and method for identifying works
US20230026608A1 (en) * 2021-07-20 2023-01-26 Keyence Corporation Shape inspection device, processing device, height image processing device

Families Citing this family (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6631842B1 (en) 2000-06-07 2003-10-14 Metrologic Instruments, Inc. Method of and system for producing images of objects using planar laser illumination beams and image detection arrays
US6736321B2 (en) 1995-12-18 2004-05-18 Metrologic Instruments, Inc. Planar laser illumination and imaging (PLIIM) system employing wavefront control methods for reducing the power of speckle-pattern noise digital images acquired by said system
US6732929B2 (en) 1990-09-10 2004-05-11 Metrologic Instruments, Inc. Led-based planar light illumination beam generation module employing a focal lens for reducing the image size of the light emmiting surface of the led prior to beam collimation and planarization
US5245409A (en) * 1991-11-27 1993-09-14 Arvin Industries, Inc. Tube seam weld inspection device
CA2067400A1 (en) * 1992-04-28 1993-10-29 Robert E. Bredberg Laser thickness gauge
US5588428A (en) * 1993-04-28 1996-12-31 The University Of Akron Method and apparatus for non-invasive volume and texture analysis
US5513276A (en) * 1994-06-02 1996-04-30 The Board Of Regents Of The University Of Oklahoma Apparatus and method for three-dimensional perspective imaging of objects
US5684531A (en) * 1995-04-10 1997-11-04 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Ranging apparatus and method implementing stereo vision system
US5673082A (en) * 1995-04-10 1997-09-30 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Light-directed ranging system implementing single camera system for telerobotics applications
US6133948A (en) * 1995-12-04 2000-10-17 Virginia Tech Intellectual Properties, Inc. Automatic identification of articles having contoured surfaces
US6705526B1 (en) 1995-12-18 2004-03-16 Metrologic Instruments, Inc. Automated method of and system for dimensioning objects transported through a work environment using contour tracing, vertice detection, corner point detection, and corner point reduction methods on two-dimensional range data maps captured by an amplitude modulated laser scanning beam
US20020014533A1 (en) 1995-12-18 2002-02-07 Xiaxun Zhu Automated object dimensioning system employing contour tracing, vertice detection, and forner point detection and reduction methods on 2-d range data maps
US6830189B2 (en) * 1995-12-18 2004-12-14 Metrologic Instruments, Inc. Method of and system for producing digital images of objects with subtantially reduced speckle-noise patterns by illuminating said objects with spatially and/or temporally coherent-reduced planar laser illumination
US6629641B2 (en) 2000-06-07 2003-10-07 Metrologic Instruments, Inc. Method of and system for producing images of objects using planar laser illumination beams and image detection arrays
US5831719A (en) * 1996-04-12 1998-11-03 Holometrics, Inc. Laser scanning system
US6097994A (en) * 1996-09-30 2000-08-01 Siemens Corporate Research, Inc. Apparatus and method for determining the correct insertion depth for a biopsy needle
US6249713B1 (en) 1996-09-30 2001-06-19 Siemens Corporate Research, Inc. Apparatus and method for automatically positioning a biopsy needle
US7245958B1 (en) 1996-09-30 2007-07-17 Siemens Corporate Research, Inc. Trigonometric depth gauge for biopsy needle
US5889582A (en) * 1997-03-10 1999-03-30 Virtek Vision Corporation Image-directed active range finding system
US7028899B2 (en) * 1999-06-07 2006-04-18 Metrologic Instruments, Inc. Method of speckle-noise pattern reduction and apparatus therefore based on reducing the temporal-coherence of the planar laser illumination beam before it illuminates the target object by applying temporal phase modulation techniques during the transmission of the plib towards the target
US6055449A (en) * 1997-09-22 2000-04-25 Siemens Corporate Research, Inc. Method for localization of a biopsy needle or similar surgical tool in a radiographic image
US6028912A (en) * 1997-09-30 2000-02-22 Siemens Corporate Research, Inc. Apparatus and method for point reconstruction and metric measurement on radiographic images
US7006132B2 (en) * 1998-02-25 2006-02-28 California Institute Of Technology Aperture coded camera for three dimensional imaging
US7612870B2 (en) * 1998-02-25 2009-11-03 California Institute Of Technology Single-lens aperture-coded camera for three dimensional imaging in small volumes
US7584893B2 (en) 1998-03-24 2009-09-08 Metrologic Instruments, Inc. Tunnel-type digital imaging system for use within retail shopping environments such as supermarkets
US6101455A (en) * 1998-05-14 2000-08-08 Davis; Michael S. Automatic calibration of cameras and structured light sources
US6252623B1 (en) 1998-05-15 2001-06-26 3Dmetrics, Incorporated Three dimensional imaging system
US6988660B2 (en) 1999-06-07 2006-01-24 Metrologic Instruments, Inc. Planar laser illumination and imaging (PLIIM) based camera system for producing high-resolution 3-D images of moving 3-D objects
US6959869B2 (en) * 1999-06-07 2005-11-01 Metrologic Instruments, Inc. Automatic vehicle identification (AVI) system employing planar laser illumination and imaging (PLIIM) based subsystems
US6959870B2 (en) * 1999-06-07 2005-11-01 Metrologic Instruments, Inc. Planar LED-based illumination array (PLIA) chips
FR2796458B1 (en) * 1999-07-12 2001-09-21 Cogema METHOD AND INSTALLATION FOR RELIEF MEASUREMENT ON A FACE OF AN AXISYMMETRIC PART
US6603870B1 (en) * 1999-09-30 2003-08-05 Siemens Corporate Research, Inc. Method and apparatus for visual servoing of a linear apparatus
US7034272B1 (en) 1999-10-05 2006-04-25 Electro Scientific Industries, Inc. Method and apparatus for evaluating integrated circuit packages having three dimensional features
US7006141B1 (en) * 1999-11-23 2006-02-28 Panavision, Inc. Method and objective lens for spectrally modifying light for an electronic camera
US6975353B1 (en) 2000-04-19 2005-12-13 Milinusic Tomislav F Immersive camera system
FI111754B (en) 2000-08-25 2003-09-15 Outokumpu Oy Ways of measuring the surface level of a material layer lying on a conveyor track and to be heat treated
JP2002139304A (en) * 2000-10-30 2002-05-17 Honda Motor Co Ltd Distance measuring device and distance measuring method
US6809809B2 (en) * 2000-11-15 2004-10-26 Real Time Metrology, Inc. Optical method and apparatus for inspecting large area planar objects
WO2002040970A1 (en) * 2000-11-15 2002-05-23 Real Time Metrology, Inc. Optical method and apparatus for inspecting large area planar objects
US7954719B2 (en) 2000-11-24 2011-06-07 Metrologic Instruments, Inc. Tunnel-type digital imaging-based self-checkout system for use in retail point-of-sale environments
US7077319B2 (en) * 2000-11-24 2006-07-18 Metrologic Instruments, Inc. Imaging engine employing planar light illumination and linear imaging
US7140543B2 (en) * 2000-11-24 2006-11-28 Metrologic Instruments, Inc. Planar light illumination and imaging device with modulated coherent illumination that reduces speckle noise induced by coherent illumination
CA2327894A1 (en) * 2000-12-07 2002-06-07 Clearview Geophysics Inc. Method and system for complete 3d object and area digitizing
WO2002046713A2 (en) 2000-12-08 2002-06-13 Cyberoptics Corporation Automated system with improved height sensing
FI112279B (en) * 2001-11-21 2003-11-14 Mapvision Oy Ltd Method for determining offset points
US7513428B2 (en) 2001-11-21 2009-04-07 Metrologic Instruments, Inc. Planar laser illumination and imaging device employing laser current modulation to generate spectral components and reduce temporal coherence of laser beam, so as to achieve a reduction in speckle-pattern noise during time-averaged detection of images of objects illuminated thereby during imaging operations
US7344082B2 (en) * 2002-01-02 2008-03-18 Metrologic Instruments, Inc. Automated method of and system for dimensioning objects over a conveyor belt structure by applying contouring tracing, vertice detection, corner point detection, and corner point reduction methods to two-dimensional range data maps of the space above the conveyor belt captured by an amplitude modulated laser scanning beam
US6741177B2 (en) 2002-03-28 2004-05-25 Verifeye Inc. Method and apparatus for detecting items on the bottom tray of a cart
US7389199B2 (en) * 2003-06-17 2008-06-17 Troxler Electronics Laboratories, Inc. Method of determining a dimension of a sample of a construction material and associated apparatus
US9587938B2 (en) 2003-06-17 2017-03-07 Troxler Electronic Laboratories, Inc. Method and apparatus for determining a characteristic of a construction material
US7916898B2 (en) * 2003-09-15 2011-03-29 Deere & Company Method and system for identifying an edge of a crop
US7460250B2 (en) * 2003-10-24 2008-12-02 3Dm Devices Inc. Laser triangulation system
US8098275B2 (en) * 2003-12-30 2012-01-17 The Trustees Of The Stevens Institute Of Technology Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration
US7620209B2 (en) * 2004-10-14 2009-11-17 Stevick Glen R Method and apparatus for dynamic space-time imaging system
US7375801B1 (en) 2005-04-13 2008-05-20 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Video sensor with range measurement capability
US7209577B2 (en) 2005-07-14 2007-04-24 Logitech Europe S.A. Facial feature-localized and global real-time video morphing
JP4533824B2 (en) * 2005-08-30 2010-09-01 株式会社日立製作所 Image input device and calibration method
CA2625775A1 (en) 2005-10-14 2007-04-19 Applied Research Associates Nz Limited A method of monitoring a surface feature and apparatus therefor
DE102005054658A1 (en) 2005-11-16 2007-05-24 Sick Ag Method for automatically paramenting measuring systems
GB0608841D0 (en) 2006-05-04 2006-06-14 Isis Innovation Scanner system and method for scanning
KR100735565B1 (en) * 2006-05-17 2007-07-04 삼성전자주식회사 Method for detecting an object using structured light and robot using the same
DE102007017747B4 (en) * 2007-04-12 2009-05-07 V & M Deutschland Gmbh Method and device for the optical measurement of external threads
US8243289B2 (en) * 2009-05-29 2012-08-14 Perceptron, Inc. System and method for dynamic windowing
EA020917B1 (en) * 2010-04-07 2015-02-27 Валерий Александрович Бердников Device for measuring volume and mass of bulk material on conveyor traction mechanism
EP2615494B1 (en) 2010-09-07 2016-11-09 Dai Nippon Printing Co., Ltd. Projection-type footage display device
EP3104059B1 (en) 2010-09-07 2019-08-21 Dai Nippon Printing Co., Ltd. Illumination apparatus using coherent light source
WO2012032668A1 (en) 2010-09-07 2012-03-15 大日本印刷株式会社 Scanner device and device for measuring three-dimensional shape of object
CA2991928C (en) 2011-06-06 2021-03-02 Troxler Electronic Laboratories, Inc. Optical method and apparatus for determining a characteristic such as volume and density of an excavated void in a construction material
US8756792B2 (en) * 2011-06-08 2014-06-24 The Boeing Company Digitally designed shims for joining parts of an assembly
US9179844B2 (en) 2011-11-28 2015-11-10 Aranz Healthcare Limited Handheld skin measuring or monitoring device
DE102012003620B4 (en) * 2012-02-20 2014-02-27 Salzgitter Mannesmann Grobblech Gmbh Method and device for non-contact geometric measurement of a measurement object
US9939389B2 (en) * 2012-09-28 2018-04-10 Thomas Engineering Solutions & Consulting, Llc Data acquisition system useful for inspection of tubulars
US9669509B2 (en) 2012-09-28 2017-06-06 Thomas Engineering Solutions & Consulting, Llc Methods for external cleaning and inspection of tubulars
BR112015013804B1 (en) * 2012-12-14 2021-02-09 Bp Corporation North America Inc measuring system for three-dimensional measurement of an underwater structure, method for laser triangulation of an underwater structure and non-transient computer-readable medium coded with instructions
WO2014178047A1 (en) 2013-04-30 2014-11-06 Inuitive Ltd. System and method for video conferencing
US10013527B2 (en) 2016-05-02 2018-07-03 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
DE102016110444A1 (en) * 2016-06-06 2017-12-07 Jabil Optics Germany GmbH Method and determination device for determining a surface shape
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
TWI627380B (en) * 2016-11-23 2018-06-21 綠點高新科技股份有限公司 Method And System For Estimating Contour Of A Surface Of A 3D Object
CZ307628B6 (en) * 2016-12-09 2019-01-23 VÚTS, a.s. A measuring station for diagnosis of stringers
EP4183328A1 (en) 2017-04-04 2023-05-24 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
US10402988B2 (en) * 2017-11-30 2019-09-03 Guangdong Virtual Reality Technology Co., Ltd. Image processing apparatuses and methods
CN108629325B (en) * 2018-05-11 2021-06-22 北京旷视科技有限公司 Method, device and system for determining position of article
US10867186B2 (en) 2018-05-15 2020-12-15 Genetec Inc. Transaction monitoring

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4498778A (en) * 1981-03-30 1985-02-12 Technical Arts Corporation High speed scanning method and apparatus
EP0159187A2 (en) * 1984-04-17 1985-10-23 Simon-Carves Limited A surface topography measuring system
EP0160781A1 (en) * 1984-04-12 1985-11-13 International Business Machines Corporation Measuring and detecting printed circuit wiring defects
JPS60253806A (en) * 1984-05-30 1985-12-14 Toyota Central Res & Dev Lab Inc Method and apparatus for detecting shape
EP0168557A1 (en) * 1984-06-22 1986-01-22 Dornier Gmbh Digital method for interpreting an image which is representative of the topography of an object
JPH06131906A (en) * 1992-10-14 1994-05-13 Tokyo Electric Co Ltd Luminaire
JPH06129706A (en) * 1992-10-15 1994-05-13 Sanden Corp Attitude controlling device for power transfer plate for air-conditioner

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2645971A (en) * 1949-06-28 1953-07-21 Rca Corp Surface contour measurement
US2786096A (en) * 1951-11-10 1957-03-19 Du Mont Allen B Lab Inc Television range finder
US3361912A (en) * 1964-12-07 1968-01-02 Barkley & Dexter Lab Inc Sun shield for photoelectric cloud height detector
GB1125306A (en) * 1966-07-04 1968-08-28 Marconi Internat Marine Compan Improvements in or relating to range and bearing measuring apparatus
US3773422A (en) * 1971-04-07 1973-11-20 Singer Co Calculating linear dimensions from tv images
US3895870A (en) * 1971-06-01 1975-07-22 Autech Corp Laser dimension comparator
US3796492A (en) * 1971-06-01 1974-03-12 Autech Corp Laser dimension comparator
US3975102A (en) * 1974-07-29 1976-08-17 Zygo Corporation Scanning photoelectric autocollimator
US4175862A (en) * 1975-08-27 1979-11-27 Solid Photography Inc. Arrangement for sensing the geometric characteristics of an object
US4070683A (en) * 1976-03-04 1978-01-24 Altschuler Bruce R Optical surface topography mapping system
US4113389A (en) * 1976-04-19 1978-09-12 Morton Kaye Optical measurement system
US4089608A (en) * 1976-10-21 1978-05-16 Hoadley Howard W Non-contact digital contour generator
US4165178A (en) * 1978-06-29 1979-08-21 International Business Machines Corporation Gap measurement tool
US4294541A (en) * 1979-06-18 1981-10-13 Abler William L Bi-periscopic instrument for use in determining terrestrial positions through celestial observation
US4294544A (en) * 1979-08-03 1981-10-13 Altschuler Bruce R Topographic comparator
US4391514A (en) * 1981-02-13 1983-07-05 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Sidelooking laser altimeter for a flight simulator
US4492472A (en) * 1981-07-27 1985-01-08 Kawasaki Steel Corporation Method and system for determining shape in plane to be determined in atmosphere of scattering materials
US4412121A (en) * 1981-08-28 1983-10-25 S R I International Implement positioning apparatus and process
US4443705A (en) * 1982-10-01 1984-04-17 Robotic Vision Systems, Inc. Method for locating points on a three-dimensional surface using light intensity variations
US4541722A (en) * 1982-12-13 1985-09-17 Jenksystems, Inc. Contour line scanner
NO152065C (en) * 1983-02-24 1985-07-24 Tomra Systems As Method for continuous recognition of totally or partially transparent articles, e.g. bottles
US4541721A (en) * 1983-03-17 1985-09-17 Perceptron, Inc. Optical checking apparatus and method of using same
US4493968A (en) * 1983-07-13 1985-01-15 Caterpillar Tractor Co. Adaptive welder with laser TV-scanner
US4523809A (en) * 1983-08-04 1985-06-18 The United States Of America As Represented By The Secretary Of The Air Force Method and apparatus for generating a structured light beam array
US4645348A (en) * 1983-09-01 1987-02-24 Perceptron, Inc. Sensor-illumination system for use in three-dimensional measurement of objects and assemblies of objects
US4648717A (en) * 1984-02-06 1987-03-10 Robotic Vision Systems, Inc. Method of three-dimensional measurement with few projected patterns
US4845373A (en) * 1984-02-22 1989-07-04 Kla Instruments Corporation Automatic alignment apparatus having low and high resolution optics for coarse and fine adjusting
FR2560377B1 (en) * 1984-02-29 1988-05-13 Commissariat Energie Atomique OPTICAL DEVICE FOR MEASURING SURFACE PROXIMITY AND ITS APPLICATION TO MEASURING A SURFACE PROFILE
US4657394A (en) * 1984-09-14 1987-04-14 New York Institute Of Technology Apparatus and method for obtaining three dimensional surface contours
US4641972A (en) * 1984-09-14 1987-02-10 New York Institute Of Technology Method and apparatus for surface profilometry
US4672562A (en) * 1984-12-11 1987-06-09 Honeywell Inc. Method and apparatus for determining location and orientation of objects
SE447848B (en) * 1985-06-14 1986-12-15 Anders Bengtsson Instrument for measuring surface topography
US4734766A (en) * 1985-08-19 1988-03-29 Kawasaki Steel Corporation Method and system for locating and inspecting seam weld in metal seam-welded pipe
US4687326A (en) * 1985-11-12 1987-08-18 General Electric Company Integrated range and luminance camera
US4653316A (en) * 1986-03-14 1987-03-31 Kabushiki Kaisha Komatsu Seisakusho Apparatus mounted on vehicles for detecting road surface conditions
US4741621A (en) * 1986-08-18 1988-05-03 Westinghouse Electric Corp. Geometric surface inspection system with dual overlap light stripe generator
US4791482A (en) * 1987-02-06 1988-12-13 Westinghouse Electric Corp. Object locating system
US4847510A (en) * 1987-12-09 1989-07-11 Shell Oil Company Method for comparison of surfaces
US4847687A (en) * 1988-04-18 1989-07-11 General Electric Company Video ranging system

Patent Citations (7)

Publication number Priority date Publication date Assignee Title
US4498778A (en) * 1981-03-30 1985-02-12 Technical Arts Corporation High speed scanning method and apparatus
EP0160781A1 (en) * 1984-04-12 1985-11-13 International Business Machines Corporation Measuring and detecting printed circuit wiring defects
EP0159187A2 (en) * 1984-04-17 1985-10-23 Simon-Carves Limited A surface topography measuring system
JPS60253806A (en) * 1984-05-30 1985-12-14 Toyota Central Res & Dev Lab Inc Method and apparatus for detecting shape
EP0168557A1 (en) * 1984-06-22 1986-01-22 Dornier Gmbh Digital method for interpreting an image which is representative of the topography of an object
JPH06131906A (en) * 1992-10-14 1994-05-13 Tokyo Electric Co Ltd Luminaire
JPH06129706A (en) * 1992-10-15 1994-05-13 Sanden Corp Attitude controlling device for power transfer plate for air-conditioner

Non-Patent Citations (5)

Title
Eighth International Conference on Pattern Recognition, Paris, France, 27-31 October 1986, IEEE, H. YAMAMOTO et al.: "Range Imaging System Based on Binary Image Accumulation", pages 233-235 *
PATENT ABSTRACTS OF JAPAN, Volume 10, No. 127 (P-455) (2184), 13 May 1986, & JP-A-60253806 (Toyoda Chuo Kenkyusho K.K.) 14 December 1985 *
PATENT ABSTRACTS OF JAPAN, Volume 10, No. 183 (P-472) (2239), 26 June 1986; & JP-A-6129706 (Matsushita Electric Works Ltd) 10 February 1986 *
PATENT ABSTRACTS OF JAPAN, Volume 10, No. 183 (P-472) (2239), 26 June 1986; & JP-A-6131906 (Hitachi Zosen Corp.) 14 February 1986 *
Proceedings IEEE Computer Society Conference on Pattern Recognition and Image Processing, Las Vegas, Nevada, 14-17 June 1982, IEEE, (New York, US), J.B.K. TIO et al.: "Curved Surface Measurement for Robot Vision", pages 370-378 *

Cited By (11)

Publication number Priority date Publication date Assignee Title
EP0489641A1 (en) * 1990-12-05 1992-06-10 Commissariat A L'energie Atomique Calibration method for a three-dimensional metrology system
FR2670283A1 (en) * 1990-12-05 1992-06-12 Commissariat Energie Atomique METHOD FOR CALIBRATING A THREE-DIMENSIONAL METROLOGY SYSTEM.
FR2685764A1 (en) * 1991-12-30 1993-07-02 Kreon Ind COMPACT AND HIGH RESOLUTION OPTICAL SENSOR FOR THREE DIMENSIONAL SHAPE ANALYSIS.
EP0550300A1 (en) * 1991-12-30 1993-07-07 Sa Kreon Industrie Compact optical sensor of high resolution for analysing three-dimensional shapes
US5424835A (en) * 1991-12-30 1995-06-13 Kreon Industrie High-resolution compact optical sensor for scanning three-dimensional shapes
CN1039603C (en) * 1991-12-30 1998-08-26 克隆恩工业公司 High-resolution compact optical sensor for scanning three-dimensional shapes
EP2085744A4 (en) * 2006-12-25 2015-06-03 Nec Corp Distance measuring device, method, and program
WO2014170607A1 (en) * 2013-04-17 2014-10-23 R & D Vision Laser sheet localised imaging and profilometry system
FR3004818A1 (en) * 2013-04-17 2014-10-24 R & D Vision Localised imaging and laser-sheet profilometry system
JP2017067753A (en) * 2015-09-29 2017-04-06 株式会社東京精密 Device and method for identifying works
US20230026608A1 (en) * 2021-07-20 2023-01-26 Keyence Corporation Shape inspection device, processing device, height image processing device

Also Published As

Publication number Publication date
US4979815A (en) 1990-12-25
WO1990009561A3 (en) 1990-10-04
IL93292A0 (en) 1990-11-29

Similar Documents

Publication Publication Date Title
US4979815A (en) Laser range imaging system based on projective geometry
Beraldin et al. Active 3D sensing
EP1596158B1 (en) Three-dimensional shape input device
Gühring Dense 3D surface acquisition by structured light using off-the-shelf components
US4601053A (en) Automatic TV ranging system
EP0335035B1 (en) Method and apparatus for measuring a three-dimensional curved surface shape
US5612905A (en) Three-dimensional measurement of large objects
US7215430B2 (en) Integrated system for quickly and accurately imaging and modeling three-dimensional objects
EP2839238B1 (en) 3d scanner using merged partial images
WO1993008448A1 (en) High-speed 3-d surface measurement surface inspection and reverse-cad system
Moring et al. Acquisition of three-dimensional image data by a scanning laser range finder
GB2264601A (en) Object inspection
US3726591A (en) Stereoplotting apparatus for correlating image points disposed along epipolar lines
EP1680689B1 (en) Device for scanning three-dimensional objects
JPH04232414A (en) Apparatus for variable-depth triangulation distance-measuring device
WO1994015173A1 (en) Scanning sensor
US20220130112A1 (en) Hybrid photogrammetry
Hattori et al. Accurate rangefinder with laser pattern shifting
JP2003279332A (en) Three-dimensional shape input unit and method for detecting positional deviation
CN108036742A (en) New lines structural light three-dimensional method for sensing and device
JP2000304508A (en) Three-dimensional input device
EP0618461B1 (en) Distance measuring method and apparatus
Reid et al. A laser scanning camera for range data acquisition
Beraldin et al. Active 3D sensing for heritage applications
JP2731681B2 (en) 3D measurement system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): BR CA JP NO

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH DE DK ES FR GB IT LU NL SE

AK Designated states

Kind code of ref document: A3

Designated state(s): BR CA JP NO

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH DE DK ES FR GB IT LU NL SE

NENP Non-entry into the national phase

Ref country code: CA