WO1992014118A1 - An optical sensor - Google Patents
An optical sensor
- Publication number: WO1992014118A1 (PCT/GB1992/000221)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/026—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
Definitions
- This invention relates to an optical sensor and, more particularly, to a sensor used to sense the range of one or more parts of an object so as to form an image thereof.
- the system detects which parts of the object are in focus by analysing the detected signal for high frequency components which are caused by features, such as edges or textured parts, which change rapidly across the scene. Because of this, the system is not suitable for imaging plain or smooth surfaces, such as a flat painted wall, which have no such features.
- The sensor detects those parts of the target object that are in focus by analysing the image for features that match the spatial frequencies present in the projected pattern.
- Various analysis techniques such as convolution and synchronous detection are described.
- a further patent by the same company US4640620 describes a method which aims to overcome this problem by the use of a liquid crystal light valve device to convert the required high spatial frequency components present in the image into an amplitude variation that can be detected directly.
- An optical sensor comprising: a structured light source for producing a pattern of contrasting areas; a detector which comprises an array of detector elements having dimensions matched to the pattern produced by the light source; an optical system for projecting a primary image of the light source onto an object that is to be sensed and for forming a secondary image on the detector of the primary image thus formed on the object; adjustment means for adjusting at least part of the optical system so as to vary the focussing of the primary image on the object, the arrangement being such that when the primary image is in focus on the object, the secondary image on the detector is also in focus; and processing means for analysing signals produced by the detector in conjunction with information on the adjustment of the optical system, wherein the structured light source is adjustable so as to interchange the positions of contrasting areas of the pattern produced by the light source and in that the processing means is arranged to analyse the secondary images received by the detector elements with the contrasting areas in the interchanged positions to determine those parts of the secondary images which are in focus on the detector and thereby determine the range of the corresponding parts of the object.
- A method of determining the range of at least part of an object being viewed using an optical sensor comprising: a structured light source which produces a pattern of contrasting areas; a detector which comprises an array of detector elements having dimensions matched to the pattern produced by the light source; an optical system which projects a primary image of the light source onto an object that is to be sensed and forms a secondary image on the detector of the primary image thus formed on the object; adjustment means which adjusts at least part of the optical system so as to vary the focussing of the primary image on the object, the arrangement being such that when the primary image is in focus on the object, the secondary image on the detector is also in focus; and processing means which analyses signals produced by the detector in conjunction with information on the adjustment of the optical system, the method involving adjustment of the structured light source so as to interchange the positions of contrasting areas of the pattern produced by the light source and the processing means being arranged to analyse the secondary images received by the detector elements with the contrasting areas in the interchanged positions to determine those parts of the secondary images which are in focus on the detector and thereby determine the range of the corresponding parts of the object.
- the invention thus transforms analysis of the image sensed by the detector into the temporal domain.
- Figure 1 illustrates the basic concept of an optical sensor such as that described in US4629324 and shows the main components and the optical pathways between them;
- Figure 2 shows a box diagram of an optical sensor of the type shown in Figure 1 with a particular control unit and signal processing arrangement according to one embodiment of the present invention
- Figure 3 illustrates how an image of an object being viewed by the type of system shown in Figures 1 and 2 can be built up and displayed;
- Figures 4, 5 and 6 show alternative optical arrangements which may be used in the sensor;
- Figure 7 shows an alternative form of beam splitter which may be used in the sensor
- Figures 8(A) and 8(B) show a further embodiment of a sensor according to the invention.
- Figure 9(A) and 9(B) show another embodiment of a sensor according to the invention.
- Figure 10 shows a box diagram of a control unit and a signal processor which may be used with the embodiment shown in Figure 9.
- the optical sensor described herein combines the concepts of a 'sweep focus ranger' of the type described above with the concept of an active confocal light source as described in US4629324 together with means to transform the depth information analysis into the temporal domain.
- 'confocal' is used in this specification to describe an optical system arranged so that two images formed by the system are in focus at the same time. In most cases, this means that if the relevant optical paths are 'unfolded', the respective images or objects coincide with each other.
- the basic concept of a sensor such as that described in US4629324 is illustrated in Figure 1.
- the sensor comprises a detector 1 and a lens system 2 for focussing an image of an object 3 which is to be viewed onto the detector 1.
- Positioning means 4 are provided to adjust the position of the lens system 2 to focus different 'slices' of the object 3 on the detector 1.
- the sensor corresponds to a conventional 'sweep focus sensor'.
- the sensor is also provided with a structured light source comprising a lamp 5, a projector lens 6 and a grid or spatial filter 7 together with a beam-splitter 8 which enables an image of the filter 7 to be projected through the lens system 2 onto the object 3.
- the filter 7 and detector 1 are accurately positioned so that when a primary image of the filter 7 is focussed by the lens system 2 onto the object 3, a secondary image of the primary image formed on the object 3 is also focussed by the lens system 2 onto the detector 1. This is achieved by positioning the detector 1 and filter 7 equidistant from the beam splitter 8 so the system is 'confocal'.
- An absorber 9 is also provided to help ensure that the portion of the beam from the filter 7 which passes through the beam splitter 8 is absorbed and is not reflected onto the detector 1.
- the lens system 2 has a wide aperture with a very small depth of focus and in order to form an image of the object 3 the lens system 2 is successively positioned at hundreds of discrete, precalculated positions and the images received by the detector 1 for each position of the lens system 2 are analysed to detect those parts of the image which are in focus.
- The distance between the lens system 2 and the parts of the object 3 which are in focus at that time can be calculated using the standard lens equation, which for a simple lens is:
- 1/f = 1/U + 1/V
- where f is the focal length of the lens system 2, U is the object distance (i.e. the distance between the lens system 2 and the in-focus parts of the object 3) and V is the image distance (i.e. the distance between the lens system 2 and the detector 1).
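As an illustrative sketch (not part of the patent), the lens equation can be rearranged to give the range of the in-focus plane from the lens setting; the function name and units below are assumptions:

```python
def object_distance(f_mm: float, v_mm: float) -> float:
    """Solve the thin-lens equation 1/f = 1/U + 1/V for the object
    distance U, given the focal length f and the image distance V
    set by the sweep mechanism. All distances in millimetres."""
    return 1.0 / (1.0 / f_mm - 1.0 / v_mm)

# Example: a 50 mm lens with the detector 55 mm behind it is in
# focus on a plane 550 mm away.
distance = object_distance(50.0, 55.0)
```

Sweeping V through a set of precalculated values and evaluating U at each step reproduces the succession of focus 'slices' described above.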
- the images formed on the object 3 and detector 1 are preferably in the form of a uniform structured pattern with a high spatial frequency, i.e. having a small repeat distance, comprising contrasting areas, such as a series of light and dark bands.
- Such patterns can, for example, be provided by slots or square cut-outs in the filter 7.
- those parts of the image which are in focus on the object 3 being viewed produce a corresponding image on the detector 1 of light and dark bands whereas the out of focus parts of the image rapidly 'break-up' and so produce a more uniform illumination of the detector 1 which is also significantly less bright than the light areas of the parts of the image which are in focus.
- The structured pattern should preferably have as high a spatial frequency as possible, since the depth resolution is proportional to this, although the spatial frequency is limited by the resolution of the lens system 2 and the size of the detector array.
- FIG. 2 is a box diagram of a sensor of the type shown in Figure 1 together with control and processing units as used in an embodiment of the invention to be described.
- the light source 5 of the sensor is controlled by a projector lamp control unit 9, a piezo-electric grid control unit 10 is provided for moving the grid or filter 7 (for reasons to be discussed later) and adjustment of the lens system 2 is controlled by a sweep lens control unit 11.
- the control units 9, 10 and 11, together with the output of the CCD detector 1, are connected to a computer 12 provided with an image processor, frame grabber, frame stores (computer memory corresponding to the pixel array of the detector 1) and a digital signal processor ( DSP) 13 for processing the signals received and providing appropriate instructions to the control units.
- the output of the signal processor may then be displayed on a monitor 14.
- the detector 1 comprises an array of detector elements or pixels such as charged coupled devices (CCD's) or charged injection devices (CID's).
- Such detectors are preferred over other TV type sensors because the precise nature of the detector geometry makes it possible to align the unfolded optical path of the structured image with the individual pixels of the detector array.
- the grid or spatial filter 7 consists of a uniform structured pattern with a high spatial frequency i.e. having a small repeat distance, comprising contrasting areas, such as a series of light and dark bands or a chequer-board pattern.
- the spatial frequency of the grid pattern 7 should be matched to the pixel dimensions of the detector 1, i.e. the repeat distance of the pattern should be n pixels wide, where n is a small integer e.g. two (which is the preferred repeat distance). The value of n will be limited by the resolution of the lens system 2.
- The detector should be matched with the pattern in both dimensions, so for optimum resolution it should comprise an array of square rather than rectangular pixels, as the lens system 2 will have the same resolving power in both the x and y dimensions. Also, the use of square pixels simplifies the analysis of data received by the detector.
- The light and dark portions of the pattern should also be complementary so that, added together, they give a uniform intensity.
- the choice of pattern will depend on the resolution characteristics of the optical system used but should be chosen such that the pattern 'breaks up' as quickly and cleanly as possible as it goes out of focus.
- the pattern need not have a simple light/dark or clear/opaque structure. It could also comprise a pattern having smoothly changing (e.g. sinusoidal) opacity. This would tend to reduce the higher frequency components of the pattern so that it 'breaks up' more smoothly as it moves out of focus.
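The banded and smoothly-varying patterns described above can be sketched numerically (an illustration, not part of the patent; NumPy and the function names are assumptions). Shifting either pattern by half its repeat distance interchanges the contrasting areas, and the two positions sum to a uniform intensity:

```python
import numpy as np

def checkerboard(h, w, repeat=2):
    """Binary chequer-board pattern; 'repeat' is the pattern repeat
    distance in pixels (each square is repeat/2 pixels wide)."""
    sq = repeat // 2
    yy, xx = np.indices((h, w))
    return ((yy // sq + xx // sq) % 2).astype(float)

def sinusoidal_grid(h, w, repeat=2):
    """Vertical bands with smoothly (sinusoidally) varying opacity."""
    xx = np.arange(w)
    row = 0.5 * (1.0 + np.cos(2.0 * np.pi * xx / repeat))
    return np.tile(row, (h, 1))
```

The complementarity check (pattern plus half-repeat-shifted pattern equals a uniform field) is exactly the property exploited when the grid is moved between frame grabs.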
- In order to avoid edge effects in the image formed on the object 3 or detector 1, it is preferable for the grid pattern to comprise more repeats than the CCD detector, i.e. for the image formed to be larger than the detector array 1.
- A suitable grid pattern 7 is that known as a 'Ronchi ruling' resolution target, which comprises a pattern of parallel stripes which are alternately clear and opaque.
- a good quality lens system 2 (such as that used in a camera) will typically be able to resolve a pattern having between 50 and 125 lines/mm in the image plane depending upon its aperture, and special lenses (such as those used in aerial-photography) would be even better.
- the resolution can also be further improved by limiting the wavelength band used, e.g. by using a laser or sodium lamp to illuminate the grid pattern 7.
- a typical CCD detector has pixels spaced at about 20 microns square so with two pixel widths, i.e. 40 microns, corresponding to the pattern repeat distance, this sets a resolution limit of 25 lines/mm.
- the number of repeats of the structured pattern across the area of the object being viewed should preferably be as high as possible for the lens system and detector array used. With a typical detector array of the type described above comprising a matrix of 512 ⁇ 512 pixels, the maximum number of repeats would be 256 across the image. Considerably higher repeat numbers can be achieved using a linear detector array (described further below).
- the lower limit on the spatial frequency of the grid pattern is set by the minimum depth resolution required. As the grid pattern becomes coarser, the number of repeats of the pattern across the image is reduced and the quality of the image produced by the sensor is degraded. An image with less than, say, 40 pattern repeats across it is clearly going to provide only very crude information about the object being sensed.
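The figures quoted above can be checked with a little arithmetic (an illustrative sketch; the variable names are assumptions):

```python
# 20-micron square pixels with a two-pixel pattern repeat distance:
pixel_pitch_um = 20.0
repeat_pixels = 2
detector_width_px = 512

repeat_um = pixel_pitch_um * repeat_pixels        # 40 micron repeat distance
lines_per_mm = 1000.0 / repeat_um                 # resolution limit of 25 lines/mm
max_repeats = detector_width_px // repeat_pixels  # at most 256 repeats across the image
```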
- positioning means may be provided for moving the grid pattern 7. This is used to move the grid 7 in its own plane by one half of the pattern repeat distance between each frame grab. This movement is typically very small, e.g. around 20 microns (where the pattern repeats every 2 pixels), and is preferably carried out using piezo-electric positioning means.
- The sum of the intensities i1 and i2, I = i1 + i2, is a measure of the brightness of that pixel, and the difference is a measure of the depth of modulation. Dividing the difference signal by the brightness gives a normalized measure of the high-pass component, M = (i1 - i2)/I, which is a measure of how 'in focus' that pixel is.
- the sign of the term "i1 - i2" will, of course, alternate from pixel to pixel depending whether i1 or i2 corresponds to a light area of the grid pattern.
- a correction can be applied by capturing an image with the light source switched off in a third frame store.
- The background intensity 'i3' for each pixel can then be subtracted from the values i1 and i2 used in the above equations.
- In order to avoid spurious, noisy data, it is desirable to impose a minimum threshold value on the signal 'I'. Where 'I' is very low, for example when looking at a black object or when the object is very distant, that sample should be ignored (e.g. by setting M to zero or some such value).
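A minimal sketch of the per-pixel processing just described (not from the patent; NumPy, the function name and the threshold value are assumptions). Two frames are grabbed with the grid in its two half-repeat positions, an optional dark frame is subtracted, and the brightness I and normalized focus measure M are formed:

```python
import numpy as np

def focus_measure(i1, i2, i3=None, threshold=0.05):
    """Return per-pixel brightness I = i1 + i2 and normalized focus
    measure M = |i1 - i2| / I. The absolute value absorbs the sign
    alternation of (i1 - i2) from pixel to pixel; i3 is an optional
    background frame captured with the light source switched off."""
    i1 = np.asarray(i1, dtype=float)
    i2 = np.asarray(i2, dtype=float)
    if i3 is not None:
        i1 = i1 - np.asarray(i3, dtype=float)
        i2 = i2 - np.asarray(i3, dtype=float)
    I = i1 + i2
    M = np.zeros_like(I)
    valid = I > threshold          # ignore dark / very distant pixels
    M[valid] = np.abs(i1 - i2)[valid] / I[valid]
    return I, M
```

An in-focus pixel sees strong modulation between the two grid positions (M near 1); an out-of-focus pixel sees nearly uniform illumination in both frames (M near 0).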
- the DSP is now used to perform the following functions on the raw image data: for each pixel (or group of pixels) the functions I and M described above are constructed.
- the function M can be displayed from frame store 1 on the monitor (this corresponds to the in-focus contours which are shown bright against a dark background).
- The intensity information signal 'I' is also helpful in interpreting the data; it corresponds to the information obtained by a conventional sweep focus ranger, since the average of i1 and i2 is effectively a uniform illumination signal (i.e. as though without a structured light source). Displaying the signal 'I' on a monitor gives a normal TV-style image of the scene, except for the very small depth of focus inherent in the system. If desired, changing patterns of intensity found using standard edge detector methods can therefore be used to provide information on in-focus edges and textured areas of the object, and this information can be added to that obtained from the 'M' signal (which is provided by the use of a structured light source).
- the technique of temporal modulation has the advantage that as each pixel in the image is analysed independently, edges or textures present on the object do not interfere with the depth measurement (this may not be the case for techniques based upon spatial modulation).
- the technique can also be used with a chequer-board grid pattern instead of vertical or horizontal bands. Such a pattern would probably 'break-up' more effectively than the bands as the lens system 2 moves out of focus and would therefore be the preferred choice.
- the sensors described above use a structured light source of relatively high spatial frequency to enable the detector to detect accurately when the image is in focus and when it 'breaks-up' as it goes out of focus. This is in contrast with some known range finding systems which rely on detecting a shift between two halves of the image as it goes out of focus.
- the calculations described above may be performed pixel by pixel in software or by using specialist digital signal processor (DSP) hardware at video frame rates.
- a display of the function M on a CRT would have the appearance of a dark screen with bright contours corresponding to the "in- focus" components of the object being viewed. As the focus of the lens system is swept, so these contours would move to show the in-focus parts of the object. From an analysis of these contours a 3-dimensional map of the object can be constructed.
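Constructing the 3-dimensional map from the sweep can be sketched as follows (illustrative only; NumPy and the names are assumptions). For each pixel, the lens position giving the largest focus measure M indicates the range of the corresponding part of the object:

```python
import numpy as np

def range_map(m_stack, lens_positions):
    """m_stack: array of shape (n_positions, height, width) holding the
    focus measure M for each lens position in the sweep. Returns, per
    pixel, the lens position at which that pixel was sharpest, which
    maps to range via the lens equation."""
    best = np.argmax(m_stack, axis=0)
    return np.asarray(lens_positions)[best]
```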
- a piezo-electric positioner can also be used on the detector 1 to increase the effective resolution in both the spatial and depth dimensions. To do this, the detector array is moved so as to effectively provide a smaller pixel dimension (a technique used in some high definition CCD cameras) and thus allow a finer grid pattern to be used.
- The use of piezo-electric devices for the fine positioning of an article is well known, e.g. in the high definition CCD cameras mentioned above and in scanning tunnelling microscopes; such devices are capable of positioning an article very accurately, even down to atomic dimensions.
- other devices could be used such as a loudspeaker voice coil positioning mechanism.
- Figure 3 illustrates how the lens system 2 is adjusted to focus on successive planes of the object 3; a 3-dimensional electronic image, or range map, of the object is built up from these 'slices' in the computer memory.
- the contours of the object 3 which are in focus for any particular position of the lens system 2 can be displayed on the monitor 14.
- Other images of the 3-dimensional model built up in this way can also be displayed using well known image display and processing techniques.
- the optical sensor described above is an 'active' system (i.e. uses a light source to illuminate the object being viewed) rather than passive (i.e. relying on ambient light to illuminate the object).
- By projecting a pattern onto the object being viewed, the sensor is able to sense plain, un-textured surfaces, such as painted walls, floors, skin, etc., and is not restricted to edge or textured features like a conventional sweep focus ranger.
- The use of a pattern which is projected onto the object to be viewed, and which is of a form which can be easily analysed to determine those parts which are in focus, thus provides significant advantages over a conventional 'sweep focus ranger'.
- Since the outgoing projected beam is subject to the same focussing sweep as the incoming beam sensed by the detector 1, the projected pattern is only focussed on those parts of the object which are themselves in focus. This improves on the resolving capability of the conventional sweep focus ranger by effectively 'doubling up' the focussing action using both the detector 1 and the light source.
- the symmetry of the optical system means that when the object is in focus, the spatial frequency of the signal formed at the detector 1 will exactly equal that of the grid pattern 7.
- the scale of measurements over which the type of sensor described above can be used ranges from the large, e.g. a few metres, down to the small, e.g. a few millimetres. It should also be possible to apply the same method to the very small, so effectively giving a 3D microscope, using relatively simple and inexpensive equipment.
- the technology is easier to apply at the smaller rather than the larger scale. This is because for sensors working on the small scale, the depth resolution is comparable with the spatial resolution, whereas on the larger scale the depth resolution falls off (which is to be expected as the effective triangulation angle of the lens system is reduced). Also, being an active system, the illumination required will increase as the square of the distance between the sensor and the object being viewed. Nevertheless, the system is well suited to use as a robot sensor covering a range up to several metres.
- Another common microscopy technique that can be adapted for use with the sensor described above is fluorescence. This involves staining the sample with a fluorescent dye so that, when illuminated with light of a particular wavelength, e.g. UV light, it will emit light of some other wavelength, say yellow. In this way it is possible to see which part of a sample has taken up the dye. If a light source of the appropriate wavelength is used and the detector is arranged, by the use of an appropriate filter, to respond only to the resulting fluorescence, a 3D fluorescence model of the sample can be built up.
- the lens system 2 acts as the microscope objective lens.
- Figure 4 shows a "side-by-side" arrangement in which separate lens systems 2A and 2B are used to focus a primary image of the grid 7 on the object 3 and to focus the secondary image thereof on the CCD detector 1.
- the system is again arranged so that when the primary image is in focus on the object 3, the secondary image on the detector is also in focus.
- This is achieved by making the lens systems 2A and 2B identical and positioning them side by side or one above the other (depending whether a horizontal or vertical pattern is used) so they are equi-distant from the projector grid 7 and detector 1, respectively. Movement of the two lens systems 2A and 2B would also be matched during the sweeping action.
- the two optical systems in this case are effectively combined by overlap on the object 3 of the area illuminated by the projector grid 7 and the area sensed by the detector 1.
- Figure 5 shows an arrangement in which separate optical systems are combined using a mirror (or prism) 15 and a beam-splitter 16. Again, care is taken to ensure that the two lens systems 2A and 2B are identical and accurately positioned, and that their movements are matched, to ensure the system is 'confocal'.
- Figure 6 shows part of an arrangement corresponding to that of Figure 1 in which an intermediate imaging stage is inserted in the detector optics.
- the secondary image is first focussed on a translucent screen 17, e.g. of ground glass, by the optical system 2 and then focussed onto the detector 1 by a detector lens 18.
- the arrangement shown in Figure 6 is particularly suited to long range sensors (e.g. more than 2 m) where it is desirable to use a larger lens (to give better depth resolution) than would normally be applicable for use with a CCD detector which is typically about 1 cm square.
- It is also possible to use a mirror type of lens system, such as a Cassegrain mirror type microscope objective, as this offers a much larger aperture, and hence better depth resolution, than a conventional lens system.
- the positions of the light source and detector 1 may also be interchanged if desired.
- The beam splitter 8 may be a simple half-silvered mirror of conventional design. However, a prism type of beam splitter, as shown in Figure 7, is preferred to help reduce the amount of stray light reaching the detector 1. As shown in Figure 7, total internal reflection prevents light from the grid pattern 7 impinging directly on the detector 1. Polarizing beam splitters combined with polarizing filters may also be used to further reduce the problems caused by stray light reaching the detector 1.
- FIG. 8A shows a linear detector array 1A, a lens system 2, lamp 5, projector lens 6 and grid 7A and a beam splitter 8 arranged in a similar manner to the components of the embodiment shown in Figure 1.
- An optional scanning mirror 19 (to be described further below) is also shown.
- Figure 8B shows front views of the linear CCD detector array 1A and the linear grid pattern 7A comprising a line of light and dark areas which have dimensions corresponding to those of the pixels of the CCD detector 1A.
- the 1-dimensional arrangement shown in Figure 8 corresponds to a single row of data from a 2-dimensional sensor and gives a 2-dimensional cross-sectional measurement of the object 3 being viewed rather than a full 3-dimensional image. There are many applications where this is all that is required but a full 3-dimensional image can be obtained if a mechanism is used to scan the beam through an angle in order to give the extra dimension.
- a scanning mirror 19 is shown performing this task although many other methods could be used.
- the scanning device 19 may either operate faster or slower than the sweep scan of the lens system 2 whichever is more convenient, i.e. either a complete lens system 2 focus sweep can be carried out for each scan position or a complete scan of the object is carried out for each position of the lens system 2 sweep.
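The two scan orderings described above amount to a choice of loop nesting; a hypothetical sketch (the hardware-control callables set_angle, set_lens and grab_line are stand-ins, not from the patent):

```python
def sweep_per_angle(scan_angles, lens_positions, set_angle, set_lens, grab_line):
    """A complete focus sweep of the lens system at each mirror angle."""
    data = {}
    for angle in scan_angles:
        set_angle(angle)
        for pos in lens_positions:
            set_lens(pos)
            data[(angle, pos)] = grab_line()
    return data

def scan_per_lens_position(scan_angles, lens_positions, set_angle, set_lens, grab_line):
    """A complete angular scan of the object at each lens position."""
    data = {}
    for pos in lens_positions:
        set_lens(pos)
        for angle in scan_angles:
            set_angle(angle)
            data[(angle, pos)] = grab_line()
    return data
```

Both orderings collect the same set of (angle, lens position) samples; in practice whichever device is slower to settle would sit in the outer loop.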
- the extra dimension can be obtained by moving the object being viewed.
- This arrangement is therefore suitable for situations where the object is being carried past the sensor, e.g. by a conveyor belt.
- a 1-dimensional system has a number of advantages:-
- The signal level at the detector 1A is much lower than for a 2-dimensional arrangement. This is because the projected light falls over a large area, most of which is not viewed by the detector 1A. Hence, the further out of focus the object is, the smaller the observed signal will be. This helps simplify the data analysis, as the fall-off in intensity is proportional to the 'out-of-focus' distance (as well as to the normal factors). This is in contrast to the 2-dimensional detector arrangement, in which the average light level remains much the same and simply falls off with the distance between the sensor and the object according to the normal inverse square law.
- a 1-dimensional version of the sensor is therefore suitable for use in sensing:
- Extrusions e.g. for checking the cross-sections of extruded metal, rubber or plastics articles or processed food products.
- the grid pattern and detector of the sensor may each be further reduced to a 'structured point'.
- this can be provided by a small grid pattern, e.g. in the form of a two-by-two chequer-board pattern 7B and a small detector, e.g. in the form of a quadrant detector 1B, as shown in Figures 9A and 9B.
- a two element detector and corresponding two element grid pattern may be used. The signals from such a detector are processed as if forming a single pixel of information. It will be appreciated that the optical arrangement of such a sensor is similar to that of a scanning confocal microscope but with the addition of a structured light source and detector rather than a simple point.
- this form of sensor acts as a 1D range finder.
- extra dimensions can be obtained by using scanning mechanisms to deflect the beam across the object of interest and/or by moving the object past the sensor, e.g. on a conveyor.
- a double galvanometer scanner could, for example, be used to give a full 3-dimensional image, or a single galvanometer scanner to give a 2-dimensional cross-sectional image.
- matched light sources, e.g. laser diodes, are arranged in a quadrant and connected together in opposite pairs, with an optical system to focus the light onto a smaller quadrant grid mask.
- the mean illumination level remains constant and the AC synchronous component of the signal is detected to indicate when the image is in focus (as the opposite pairs of quadrants will be alternately bright and dark).
- the out of focus components will simply provide a level DC signal (as all four quadrants receive similar signals for both states of the illumination grid).
- the electronics is arranged to detect the difference in signals received by opposite pairs of quadrants in synchrony with the changing light source.
- the intensity level falls off rapidly when the object is out of focus. In this case, however, the fall-off is very much faster, being (in addition to the normal factors) approximately proportional to the square of the "out-of-focus" distance (as similarly occurs in a scanning confocal microscope).
- the synchronous detection scheme described above further distinguishes spurious light scattered from the fog and any background illumination from the genuine signal reflected from an in-focus object.
- Modulation depth M' = (A+C) - (B+D), where A, B, C and D are the signals from the four quadrants; the corresponding mean intensity is I' = A+B+C+D.
- This modulation depth signal M' is then passed through a standard synchronous detection circuit.
- the signal M' is multiplied by plus or minus one depending upon the phase of the clock.
- the outputs M' and I' then pass through a low pass filter and a divider to give a signal M that corresponds to a measure of the 'in-focus' signal.
- the signals M and I are then digitized and sent to the computer.
- the ordering of the above operations could be changed with no net difference to the result.
- the division could be performed before or after the low pass filtering.
- the division could also be performed digitally after the analog to digital conversion (ADC) stage.
- the signal M may be integrated to give a better signal to noise ratio.
- a test will also be required to reject measurements where I is very low.
- this sensor is such that the mean intensity I rapidly drops away if the sweep position is not close to the in-focus position. This fact can be used to skip rapidly over non-useful data.
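The processing chain above (modulation depth, phase-sensitive demodulation, low-pass filtering, normalisation by the mean intensity, and rejection of low-intensity readings) can be sketched digitally. This is a hedged illustration, not the patent's analog circuit: the single-pole filter coefficient `alpha` and the threshold `i_min` are arbitrary assumed values.

```python
def lock_in_focus_signal(samples, clock, alpha=0.1, i_min=1e-3):
    """Digital sketch of the synchronous detection chain.

    samples : sequence of (A, B, C, D) quadrant intensities, one per
              half-cycle of the switched illumination
    clock   : sequence of +1/-1 values giving the illumination phase
    alpha   : low-pass filter coefficient (single-pole IIR) - assumed value
    i_min   : reject readings whose mean intensity I falls below this
    """
    m_filt = 0.0  # low-pass state for the demodulated modulation depth
    i_filt = 0.0  # low-pass state for the mean intensity
    for (a, b, c, d), phase in zip(samples, clock):
        m_prime = (a + c) - (b + d)          # modulation depth M'
        i_prime = a + b + c + d              # mean intensity I'
        demod = m_prime * phase              # multiply by +/-1 in phase with the clock
        m_filt += alpha * (demod - m_filt)   # low-pass filter
        i_filt += alpha * (i_prime - i_filt)
    if i_filt < i_min:
        return None                          # intensity too low: unreliable reading
    return m_filt / i_filt                   # normalised 'in-focus' signal M
```

For an in-focus object the opposite quadrant pairs alternate bright and dark in step with the clock, so the demodulated signal accumulates and M is large; for an out-of-focus object all four quadrants see similar light, M' stays near zero, and M collapses towards zero.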
- the net effect is that only the temporally modulated signal corresponding to the in-focus grid pattern is detected.
- the lens sweep position corresponding to the maximum M value can thus be determined and hence the range of the object.
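Picking the sweep position of maximum M can be sketched as follows; the mapping from lens position to object range would come from a calibration of the real instrument, so the `positions` values here are purely illustrative.

```python
def range_from_sweep(m_values, positions):
    """Return the lens sweep position giving the strongest in-focus signal M.

    m_values  : 'in-focus' signal M recorded at each lens sweep position
                (None where the reading was rejected as too weak)
    positions : corresponding lens positions or calibrated object ranges
    """
    # discard sweep positions where no valid reading was obtained
    valid = [(m, p) for m, p in zip(m_values, positions) if m is not None]
    if not valid:
        return None
    best_m, best_p = max(valid)   # position at the maximum of M = in-focus point
    return best_p
```

The returned position locates the in-focus condition and hence, via calibration, the range of the object.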
- the lens system 2 may simply comprise a fast, high quality camera lens which has the advantage of being relatively inexpensive.
- a zoom lens may also be used.
- a wide angle view could then be taken, followed by a telephoto close-up of areas of interest to give a higher resolution.
- a disadvantage of most zoom lenses is their restricted aperture which limits the depth resolution possible.
- the large number of optical surfaces within the lens system also tends to lead to problems with stray light.
- a mirror lens may be the preferred option for larger scale ranging as conventional lenses become too heavy and cumbersome.
- a camera mirror lens is compact and its Cassegrain geometry offers a triangulation diameter greater than its aperture number and so should provide better depth resolution.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP4504040A JP2973332B2 (en) | 1991-02-12 | 1992-02-06 | Light sensor |
US08/104,084 US5381236A (en) | 1991-02-12 | 1992-02-06 | Optical sensor for imaging an object |
DE69207176T DE69207176T2 (en) | 1991-02-12 | 1992-02-06 | Optical sensor |
EP92904077A EP0571431B1 (en) | 1991-02-12 | 1992-02-06 | An optical sensor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB919102903A GB9102903D0 (en) | 1991-02-12 | 1991-02-12 | An optical sensor |
GB9102903.3 | 1991-02-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1992014118A1 true WO1992014118A1 (en) | 1992-08-20 |
Family
ID=10689869
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB1992/000221 WO1992014118A1 (en) | 1991-02-12 | 1992-02-06 | An optical sensor |
Country Status (7)
Country | Link |
---|---|
US (1) | US5381236A (en) |
EP (1) | EP0571431B1 (en) |
JP (1) | JP2973332B2 (en) |
AU (1) | AU1195892A (en) |
DE (1) | DE69207176T2 (en) |
GB (1) | GB9102903D0 (en) |
WO (1) | WO1992014118A1 (en) |
Families Citing this family (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7983817B2 (en) * | 1995-06-07 | 2011-07-19 | Automotive Technologies Internatinoal, Inc. | Method and arrangement for obtaining information about vehicle occupants |
KR100417567B1 (en) * | 1995-06-07 | 2004-02-05 | 제이콥 엔 올스테드터 | A camera for generating image of a scene in a three dimensional imaging system |
AU746605B2 (en) * | 1995-06-07 | 2002-05-02 | Jacob N. Wohlstadter | Three-dimensional imaging system |
US5867604A (en) * | 1995-08-03 | 1999-02-02 | Ben-Levy; Meir | Imaging measurement system |
US6044170A (en) * | 1996-03-21 | 2000-03-28 | Real-Time Geometry Corporation | System and method for rapid shape digitizing and adaptive mesh generation |
US5870220A (en) * | 1996-07-12 | 1999-02-09 | Real-Time Geometry Corporation | Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation |
IT1286838B1 (en) * | 1996-09-25 | 1998-07-17 | Consiglio Nazionale Ricerche | METHOD FOR COLLECTING IMAGES IN CONFOCAL MICROSCOPY |
US6172349B1 (en) * | 1997-03-31 | 2001-01-09 | Kla-Tencor Corporation | Autofocusing apparatus and method for high resolution microscope system |
AU737617B2 (en) * | 1997-04-04 | 2001-08-23 | Isis Innovation Limited | Microscopy imaging apparatus and method |
JP3585018B2 (en) * | 1997-05-15 | 2004-11-04 | 横河電機株式会社 | Confocal device |
DE19720832C2 (en) * | 1997-05-17 | 2003-02-27 | Diehl Stiftung & Co | Target detection device |
US6094269A (en) * | 1997-12-31 | 2000-07-25 | Metroptic Technologies, Ltd. | Apparatus and method for optically measuring an object surface contour |
US6098031A (en) * | 1998-03-05 | 2000-08-01 | Gsi Lumonics, Inc. | Versatile method and system for high speed, 3D imaging of microscopic targets |
US6366357B1 (en) * | 1998-03-05 | 2002-04-02 | General Scanning, Inc. | Method and system for high speed measuring of microscopic targets |
AU3991799A (en) | 1998-05-14 | 1999-11-29 | Metacreations Corporation | Structured-light, triangulation-based three-dimensional digitizer |
IL125659A (en) * | 1998-08-05 | 2002-09-12 | Cadent Ltd | Method and apparatus for imaging three-dimensional structure |
US20140152823A1 (en) * | 1998-11-30 | 2014-06-05 | American Vehicular Sciences Llc | Techniques to Obtain Information About Objects Around a Vehicle |
NL1011080C2 (en) * | 1999-01-20 | 2000-07-21 | Kwestar B V | Apparatus and method for sorting asparagus. |
US6734962B2 (en) * | 2000-10-13 | 2004-05-11 | Chemimage Corporation | Near infrared chemical imaging microscope |
DE19944516B4 (en) * | 1999-09-16 | 2006-08-17 | Brainlab Ag | Three-dimensional shape detection with camera images |
GB9926014D0 (en) * | 1999-11-04 | 2000-01-12 | Burton David R | Measurement of objects |
US6505140B1 (en) * | 2000-01-18 | 2003-01-07 | Intelligent Automation, Inc. | Computerized system and method for bullet ballistic analysis |
US6785634B2 (en) * | 2000-01-18 | 2004-08-31 | Intelligent Automation, Inc. | Computerized system and methods of ballistic analysis for gun identifiability and bullet-to-gun classifications |
US6462814B1 (en) | 2000-03-15 | 2002-10-08 | Schlumberger Technologies, Inc. | Beam delivery and imaging for optical probing of a device operating under electrical test |
US7065242B2 (en) * | 2000-03-28 | 2006-06-20 | Viewpoint Corporation | System and method of three-dimensional image capture and modeling |
EP1297486A4 (en) | 2000-06-15 | 2006-09-27 | Automotive Systems Lab | Occupant sensor |
US6369879B1 (en) * | 2000-10-24 | 2002-04-09 | The Regents Of The University Of California | Method and apparatus for determining the coordinates of an object |
ATE493683T1 (en) * | 2001-04-07 | 2011-01-15 | Zeiss Carl Microimaging Gmbh | METHOD AND ARRANGEMENT FOR DEPTH-RESOLVED OPTICAL DETECTION OF A SAMPLE |
US7274446B2 (en) * | 2001-04-07 | 2007-09-25 | Carl Zeiss Jena Gmbh | Method and arrangement for the deep resolved optical recording of a sample |
US6968073B1 (en) | 2001-04-24 | 2005-11-22 | Automotive Systems Laboratory, Inc. | Occupant detection system |
JP2003098439A (en) * | 2001-09-25 | 2003-04-03 | Olympus Optical Co Ltd | Microscope capable of changing over observation |
US6597437B1 (en) * | 2002-01-03 | 2003-07-22 | Lockheed Martin Corporation | Closed loop tracking and active imaging of an out-of-band laser through the use of a fluorescent conversion material |
GB0200819D0 (en) * | 2002-01-15 | 2002-03-06 | Cole Polytechnique Federale De | Microscopy imaging apparatus and method for generating an image |
US6750974B2 (en) | 2002-04-02 | 2004-06-15 | Gsi Lumonics Corporation | Method and system for 3D imaging of target regions |
US7255558B2 (en) * | 2002-06-18 | 2007-08-14 | Cadent, Ltd. | Dental imaging instrument having air stream auxiliary |
US6898377B1 (en) * | 2002-06-26 | 2005-05-24 | Silicon Light Machines Corporation | Method and apparatus for calibration of light-modulating array |
US7218336B2 (en) * | 2003-09-26 | 2007-05-15 | Silicon Light Machines Corporation | Methods and apparatus for driving illuminators in printing applications |
US7406181B2 (en) * | 2003-10-03 | 2008-07-29 | Automotive Systems Laboratory, Inc. | Occupant detection system |
EP1607064B1 (en) | 2004-06-17 | 2008-09-03 | Cadent Ltd. | Method and apparatus for colour imaging a three-dimensional structure |
US7212949B2 (en) * | 2004-08-31 | 2007-05-01 | Intelligent Automation, Inc. | Automated system and method for tool mark analysis |
US7115848B1 (en) * | 2004-09-29 | 2006-10-03 | Qioptiq Imaging Solutions, Inc. | Methods, systems and computer program products for calibration of microscopy imaging devices |
US20060095172A1 (en) * | 2004-10-28 | 2006-05-04 | Abramovitch Daniel Y | Optical navigation system for vehicles |
US7573631B1 (en) | 2005-02-22 | 2009-08-11 | Silicon Light Machines Corporation | Hybrid analog/digital spatial light modulator |
US7477400B2 (en) * | 2005-09-02 | 2009-01-13 | Siimpel Corporation | Range and speed finder |
DE102007018048A1 (en) * | 2007-04-13 | 2008-10-16 | Michael Schwertner | Method and arrangement for optical imaging with depth discrimination |
US8184364B2 (en) * | 2007-05-26 | 2012-05-22 | Zeta Instruments, Inc. | Illuminator for a 3-D optical microscope |
US7729049B2 (en) * | 2007-05-26 | 2010-06-01 | Zeta Instruments, Inc. | 3-d optical microscope |
US20090002271A1 (en) * | 2007-06-28 | 2009-01-01 | Boundary Net, Incorporated | Composite display |
US20090323341A1 (en) * | 2007-06-28 | 2009-12-31 | Boundary Net, Incorporated | Convective cooling based lighting fixtures |
CA2597891A1 (en) * | 2007-08-20 | 2009-02-20 | Marc Miousset | Multi-beam optical probe and system for dimensional measurement |
US8712116B2 (en) * | 2007-10-17 | 2014-04-29 | Ffei Limited | Image generation based on a plurality of overlapped swathes |
DE102008016767B4 (en) * | 2008-04-02 | 2016-07-28 | Sick Ag | Optoelectronic sensor and method for detecting objects |
US10568535B2 (en) * | 2008-05-22 | 2020-02-25 | The Trustees Of Dartmouth College | Surgical navigation with stereovision and associated methods |
US11690558B2 (en) * | 2011-01-21 | 2023-07-04 | The Trustees Of Dartmouth College | Surgical navigation with stereovision and associated methods |
JP5403458B2 (en) * | 2008-07-14 | 2014-01-29 | 株式会社ブイ・テクノロジー | Surface shape measuring method and surface shape measuring apparatus |
US20100019997A1 (en) * | 2008-07-23 | 2010-01-28 | Boundary Net, Incorporated | Calibrating pixel elements |
US20100020107A1 (en) * | 2008-07-23 | 2010-01-28 | Boundary Net, Incorporated | Calibrating pixel elements |
US20100019993A1 (en) * | 2008-07-23 | 2010-01-28 | Boundary Net, Incorporated | Calibrating pixel elements |
US7978346B1 (en) * | 2009-02-18 | 2011-07-12 | University Of Central Florida Research Foundation, Inc. | Methods and systems for realizing high resolution three-dimensional optical imaging |
US9389408B2 (en) | 2010-07-23 | 2016-07-12 | Zeta Instruments, Inc. | 3D microscope and methods of measuring patterned substrates |
AT509884B1 (en) * | 2010-07-27 | 2011-12-15 | Alicona Imaging Gmbh | Microscopy method and device |
US8581962B2 (en) * | 2010-08-10 | 2013-11-12 | Larry Hugo Schroeder | Techniques and apparatus for two camera, and two display media for producing 3-D imaging for television broadcast, motion picture, home movie and digital still pictures |
US9561022B2 (en) | 2012-02-27 | 2017-02-07 | Covidien Lp | Device and method for optical image correction in metrology systems |
US10546441B2 (en) | 2013-06-04 | 2020-01-28 | Raymond Anthony Joao | Control, monitoring, and/or security, apparatus and method for premises, vehicles, and/or articles |
US9675430B2 (en) | 2014-08-15 | 2017-06-13 | Align Technology, Inc. | Confocal imaging apparatus with curved focal surface |
JP6027220B1 (en) * | 2015-12-22 | 2016-11-16 | Ckd株式会社 | 3D measuring device |
JP7035831B2 (en) | 2018-06-13 | 2022-03-15 | オムロン株式会社 | 3D measuring device, controller, and control method in 3D measuring device |
JP2020153798A (en) * | 2019-03-19 | 2020-09-24 | 株式会社リコー | Optical device, distance measuring optical unit, distance measuring device, and distance measuring system |
US10809378B1 (en) * | 2019-09-06 | 2020-10-20 | Mitutoyo Corporation | Triangulation sensing system and method with triangulation light extended focus range using variable focus lens |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4640620A (en) * | 1983-12-29 | 1987-02-03 | Robotic Vision Systems, Inc. | Arrangement for rapid depth measurement using lens focusing |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4629324A (en) * | 1983-12-29 | 1986-12-16 | Robotic Vision Systems, Inc. | Arrangement for measuring depth based on lens focusing |
JP2928548B2 (en) * | 1989-08-02 | 1999-08-03 | 株式会社日立製作所 | Three-dimensional shape detection method and device |
1991
- 1991-02-12 GB GB919102903A patent/GB9102903D0/en active Pending
1992
- 1992-02-06 AU AU11958/92A patent/AU1195892A/en not_active Abandoned
- 1992-02-06 DE DE69207176T patent/DE69207176T2/en not_active Expired - Fee Related
- 1992-02-06 JP JP4504040A patent/JP2973332B2/en not_active Expired - Fee Related
- 1992-02-06 WO PCT/GB1992/000221 patent/WO1992014118A1/en active IP Right Grant
- 1992-02-06 US US08/104,084 patent/US5381236A/en not_active Expired - Lifetime
- 1992-02-06 EP EP92904077A patent/EP0571431B1/en not_active Expired - Lifetime
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4640620A (en) * | 1983-12-29 | 1987-02-03 | Robotic Vision Systems, Inc. | Arrangement for rapid depth measurement using lens focusing |
Non-Patent Citations (6)
Title |
---|
APPLIED OPTICS. vol. 26, no. 12, 15 June 1987, NEW YORK US pages 2416 - 2420; T.R.CORLE E.A.: 'DISTANCE MEASUREMENTS BY DIFFERENTIAL CONFOCAL OPTICAL RANGING' * |
APPLIED OPTICS. vol. 29, no. 10, 1 April 1990, NEW YORK US pages 1474 - 1476; J.DIRICKX E.A.: 'AUTOMATIC CALIBRATION METHOD FOR PHASE SHIFT SHADOW MOIRE INTERFEROMETRY' * |
IBM TECHNICAL DISCLOSURE BULLETIN. vol. 16, no. 2, July 1973, NEW YORK US pages 433 - 444; J.R. MALIN: 'OPTICAL MICROMETER' * |
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE vol. 11, no. 11, November 1989, NEW YORK US pages 1225 - 1228; MAKOTO MATSUKI E.A.: 'A REAL-TIME SECTIONAL IMAGE MEASURING SYSTEM USING TIME SEQUENTIALLY CODED GRATING METHOD' * |
OPTICAL ENGINEERING. vol. 29, no. 12, December 1990, BELLINGHAM US pages 1439 - 1444; JIAN LI E.A.: 'IMPROVED FOURIER TRANSFORM PROFILOMETRY FOR THE AUTOMATIC MEASUREMENT OF THREE-DIMENSIONAL OBJECT SHAPES' * |
TECHNISCHE RUNDSCHAU. vol. 79, no. 41, 9 October 1987, BERN CH pages 94 - 98; E.SENN: 'DREIDIMENSIONALE MULTIPUNKTMESSUNG MIT STRUKTURIERTEM LICHT' * |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1997002466A1 (en) * | 1995-06-30 | 1997-01-23 | Siemens Aktiengesellschaft | Optical distance sensor |
US5991040A (en) * | 1995-06-30 | 1999-11-23 | Siemens Aktiengesellschaft | Optical distance sensor |
US6731383B2 (en) | 2000-09-12 | 2004-05-04 | August Technology Corp. | Confocal 3D inspection system and process |
US6870609B2 (en) | 2001-02-09 | 2005-03-22 | August Technology Corp. | Confocal 3D inspection system and process |
US6773935B2 (en) | 2001-07-16 | 2004-08-10 | August Technology Corp. | Confocal 3D inspection system and process |
US6882415B1 (en) | 2001-07-16 | 2005-04-19 | August Technology Corp. | Confocal 3D inspection system and process |
US6970287B1 (en) | 2001-07-16 | 2005-11-29 | August Technology Corp. | Confocal 3D inspection system and process |
WO2004068400A2 (en) * | 2003-01-25 | 2004-08-12 | Spiral Scratch Limited | Methods and apparatus for making images including depth information |
WO2004068400A3 (en) * | 2003-01-25 | 2004-12-09 | Spiral Scratch Ltd | Methods and apparatus for making images including depth information |
US11622102B2 (en) | 2009-06-17 | 2023-04-04 | 3Shape A/S | Intraoral scanning apparatus |
US11539937B2 (en) | 2009-06-17 | 2022-12-27 | 3Shape A/S | Intraoral scanning apparatus |
US11671582B2 (en) | 2009-06-17 | 2023-06-06 | 3Shape A/S | Intraoral scanning apparatus |
US11831815B2 (en) | 2009-06-17 | 2023-11-28 | 3Shape A/S | Intraoral scanning apparatus |
US9134126B2 (en) | 2010-06-17 | 2015-09-15 | Dolby International Ab | Image processing device, and image processing method |
DE102011114500A1 (en) | 2011-09-29 | 2013-04-04 | Ludwig-Maximilians-Universität | microscope device |
DE102011114500B4 (en) | 2011-09-29 | 2022-05-05 | Fei Company | microscope device |
WO2013049646A1 (en) | 2011-09-29 | 2013-04-04 | Fei Company | Microscope device |
US9350921B2 (en) | 2013-06-06 | 2016-05-24 | Mitutoyo Corporation | Structured illumination projection with enhanced exposure control |
US11701208B2 (en) | 2014-02-07 | 2023-07-18 | 3Shape A/S | Detecting tooth shade |
US11707347B2 (en) | 2014-02-07 | 2023-07-25 | 3Shape A/S | Detecting tooth shade |
US11723759B2 (en) | 2014-02-07 | 2023-08-15 | 3Shape A/S | Detecting tooth shade |
Also Published As
Publication number | Publication date |
---|---|
US5381236A (en) | 1995-01-10 |
GB9102903D0 (en) | 1991-03-27 |
AU1195892A (en) | 1992-09-07 |
JP2973332B2 (en) | 1999-11-08 |
EP0571431B1 (en) | 1995-12-27 |
JPH06505096A (en) | 1994-06-09 |
DE69207176T2 (en) | 1996-07-04 |
DE69207176D1 (en) | 1996-02-08 |
EP0571431A1 (en) | 1993-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP0571431B1 (en) | An optical sensor | |
US4645347A (en) | Three dimensional imaging device | |
US5305092A (en) | Apparatus for obtaining three-dimensional volume data of an object | |
US6611344B1 (en) | Apparatus and method to measure three dimensional data | |
JP3481631B2 (en) | Apparatus and method for determining a three-dimensional shape of an object using relative blur in an image due to active illumination and defocus | |
EP0380904A1 (en) | Solid state microscope | |
US6909509B2 (en) | Optical surface profiling systems | |
EP0244781A2 (en) | Method and apparatus of using a two beam interference microscope for inspection of integrated circuits and the like | |
US20030072011A1 (en) | Method and apparatus for combining views in three-dimensional surface profiling | |
CN112469361B (en) | Apparatus, method and system for generating dynamic projection patterns in confocal cameras | |
WO2002082009A1 (en) | Method and apparatus for measuring the three-dimensional surface shape of an object using color informations of light reflected by the object | |
US7369309B2 (en) | Confocal microscope | |
CA2089079C (en) | Machine vision surface characterization system | |
CA2360936A1 (en) | Automatic on-the-fly focusing for continuous image acquisition in high-resolution microscopy | |
US20120120232A1 (en) | Shape measuring device, observation device, and image processing method | |
US6765606B1 (en) | Three dimension imaging by dual wavelength triangulation | |
CA2334225C (en) | Method and device for opto-electrical acquisition of shapes by axial illumination | |
US6556307B1 (en) | Method and apparatus for inputting three-dimensional data | |
CN108253905B (en) | Vertical color confocal scanning method and system | |
JPH11132748A (en) | Multi-focal point concurrent detecting device, stereoscopic shape detecting device, external appearance inspecting device, and its method | |
WO1990009560A1 (en) | Distance gauge | |
JP3321866B2 (en) | Surface shape detecting apparatus and method | |
KR101523336B1 (en) | apparatus for examining pattern image of semiconductor wafer | |
US10948284B1 (en) | Optical profilometer with color outputs | |
EP0890822A2 (en) | A triangulation method and system for color-coded optical profilometry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AT AU BB BG BR CA CH DE DK ES FI GB HU JP KP KR LK LU MG MW NL NO PL RO RU SD SE US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE BF BJ CF CG CH CI CM DE DK ES FR GA GB GN GR IT LU MC ML MR NL SE SN TD TG |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) |
Free format text: OAPI PATENT (BF,BJ,CF,CG,CL,CM,GA,GN,ML,MR,SN,TD,TG), AT,AU,BB,BG,BR,CA,DE,DK,FI,HU,KP,KR,LK,LU,MG,MW,NL,NO,PL,RO,RU,SD,SE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 08104084 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1992904077 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1992904077 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
NENP | Non-entry into the national phase |
Ref country code: CA |
|
WWG | Wipo information: grant in national office |
Ref document number: 1992904077 Country of ref document: EP |