WO2000028371A1 - Method and apparatus for determining optical distance - Google Patents

Method and apparatus for determining optical distance

Info

Publication number
WO2000028371A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
view
pixel
depth
focus lens
Prior art date
Application number
PCT/US1999/025689
Other languages
French (fr)
Inventor
Charles D. Melville
Michael Tidwell
Richard S. Johnston
Joel S. Kollin
Original Assignee
University Of Washington
Priority date
Filing date
Publication date
Application filed by University Of Washington filed Critical University Of Washington
Priority to CA002347253A (CA2347253C)
Priority to EP99961558A (EP1129383A4)
Priority to JP2000581496A (JP2002529792A)
Priority to AU18110/00A (AU758750B2)
Publication of WO2000028371A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/10 Beam splitting or combining systems
    • G02B 27/14 Beam splitting or combining systems operating by reflection only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • G01C 3/08 Use of electric radiation detectors
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N 13/236 Image signal generators using stereoscopic image cameras using a single 2D image sensor using varifocal lenses or mirrors

Abstract

Apparent distance of a pixel within an optical field of view F is determined. Incoming light is scanned along a raster pattern to direct light for a select pixel onto a light distance detector (36). The distance is sampled for each pixel or for a group of pixels. The light distance detector includes a concentric set of ring sensors (42-50). The larger the spot of light corresponding to the pixel, the more rings are impinged. The diameter of the spot varies with the distance at which the light originated (e.g., light source or object from which light was reflected), with more distant sources producing smaller spots. Alternatively, a variable focus lens (VFL) (62/70/80) adjusts the focal length for a given pixel to achieve a standard spot size. The distance at which the light originated correlates to the focal length of the VFL.

Description

METHOD AND APPARATUS FOR DETERMINING OPTICAL DISTANCE
BACKGROUND OF THE INVENTION
This invention relates to methods and apparatus for determining an optical distance, such as a distance of an object within a field of view, and more particularly to a method and apparatus for scanning distances within a field of view.
A conventional camera includes an objective lens and a light detector, such as a photographic film, CCD array or other photosensitive device or structure. Light from a viewing environment enters the camera through the objective lens and impinges on the light detector. The portion of the viewing environment for which light enters is the camera's field of view. Some cameras pass the light to a viewfinder or eyepiece, allowing an operator to select a desired field of view from the background environment. To take a picture or make a recording, the light detector captures frames of the background light from the field of view.
Often the field of view is divided into discrete picture elements or pixels. In conventional digital video cameras the light detector records data for each pixel within the field of view for a given video frame. The data includes color, intensity and the pixel coordinates (i.e., x,y coordinates). Conventional still cameras and video cameras include optics for focusing within the field of view. Thus, an operator can select to focus on a near field object or a far field object. Some cameras even include autofocus devices which automatically adjust the focal length of the objective lens to focus within the field of view.
SUMMARY OF THE INVENTION
According to the invention, an apparent distance of one or more points within an optical field of view is determined. For example, an apparent distance is determined for each pixel, or for one or more groups of pixels, within a field of view. Such distance is also referred to as a depth of view. One advantage of the invention is that pixel data for an object viewed may be recorded, input to a computer and mapped, enabling display of a 3-dimensional model of the object. Another advantage is that an augmented display device or camera device can have variable accommodation.
According to one aspect of the invention, incoming light is scanned along a raster pattern to direct light for a select pixel onto a light distance detector. The distance is sampled for each pixel or for a group of pixels.
According to another aspect of the invention, the light distance detector includes a concentric set of ring sensors. The larger the spot of light corresponding to the pixel, the more rings are impinged. For light entering from a far distance, such as from infinity to about 20 feet, the spot will be small. For light coming from closer distances the spot is larger. The diameter of the spot thus varies with the distance at which the light originated (e.g., light source or object from which light was reflected), with more distant sources producing smaller spots. According to another aspect of the invention, each ring corresponds to a distance. The number of rings impinged determines the distance for the pixel being sampled.
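For intuition, a simplified thin-lens relation (an illustrative sketch added here, not taken from the application; the aperture diameter A and focal length f are assumed parameters, and the detector is assumed to sit at the lens's focal plane so that light from infinity focuses to a point) shows how the spot diameter shrinks with source distance:

```latex
% Illustrative thin-lens blur model (assumptions: aperture diameter A, focal
% length f, detector at the focal plane so light from infinity focuses to a point).
\[
  \frac{1}{v} = \frac{1}{f} - \frac{1}{z}
  \quad\Longrightarrow\quad
  v = \frac{f z}{z - f},
  \qquad
  b = A\,\frac{v - f}{v} = \frac{A f}{z},
\]
% so the blur-spot diameter b grows as the source distance z decreases:
% distant light converges to a small spot, nearby light to a larger one.
```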
According to an alternative aspect of the invention, a variable focus lens (VFL) is included in the light path. For a given pixel to be sampled, the focal length of the VFL is varied to achieve a small spot size. The distance at which the light originated correlates to the resulting focal length of the VFL. Although distance is sampled for each pixel or for a group of pixels, light intensity and color may also be sampled to record a digital image of a field of view, such as for a camera implementation.
These and other aspects and advantages of the invention will be better understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a diagram of a conventional image detection apparatus;
Fig. 2 is a diagram of an apparatus for scanning optical distance within a field of view according to an embodiment of this invention;
Fig. 3 is a diagram of the light detector of Fig. 2;
Fig. 4 is a diagram of the light detector of Fig. 3 with an impinging spot of light;
Fig. 5 is a diagram of an electro-mechanically variable focus lens for a lensing system of Fig. 2 according to an embodiment of this invention;
Fig. 6 is a diagram of an alternative variable focus lens embodiment for the lensing system of Fig. 2;
Fig. 7 is a diagram of another alternative variable focus lens embodiment for the lensing system of Fig. 2;
Fig. 8 is a diagram of a plurality of cascaded lenses for the lensing system of Fig. 2 according to an embodiment of this invention;
Fig. 9 is a block diagram of a feedback control scheme for detecting light distance according to an embodiment of this invention; and
Fig. 10 is a diagram of an image recording apparatus according to an embodiment of this invention.
DESCRIPTION OF SPECIFIC EMBODIMENTS
Overview
Referring to Fig. 1, in a conventional image detection apparatus 10, background light from a field of view F impinges on an objective lens 14 which converges the light toward a light detector 16. In a digital camera the light detector 16 may be a charge-coupled device (CCD), which also serves as a viewfinder. Light from objects within the field of view F, such as a first object 18 (e.g., a tree) and a second object 20 (e.g., a bird), is captured to record an image of the field of view or a part thereof.
Referring to Fig. 2, an apparatus 30 detects optical distance (i.e., depth of view) for objects 18, 20 in the field of view F according to an embodiment of this invention. The apparatus 30 includes an objective lens 14, a scanning system 32, a lensing system 34 and a light distance detector 36. Background light 12 from the field of view F, including light reflected from the objects 18, 20, enters the apparatus 30 at the objective lens 14. The light is directed to the scanning system 32, which scans the background light along two axes to select at any given time a pixel area of the field of view to be analyzed. Light originating within the select pixel area is directed into the lensing system 34, which converges the light onto the light distance detector 36. Preferably only light originating from a single, select pixel area is focused onto the light distance detector 36. In an alternative embodiment the size of the area being measured for distance may vary to include multiple pixels. The size of the field portion measured for distance is determined by the size of the mirror surfaces on scanners 38, 40 within the scanning system 32, the location of those mirror surfaces relative to the objective lens 14 and the lensing system 34, and the location of the lensing system 34 relative to the light distance detector 36.
During operation, the scanning system 32 periodically scans along a prescribed scanning pattern, such as a raster pattern. For scanning a two-dimensional raster pattern, a horizontal scanner 38 scans along a horizontal axis and a vertical scanner 40 scans along a vertical axis. A sample is taken at the light distance detector for multiple points along each given horizontal scanning line. Such a sample, for example, corresponds to a pixel. The light distance detector signal 35 corresponds to the depth of view of the light sample. In some embodiments a table of correlation data is stored in memory 37. A controller 43 compares the light distance detector signal 35 to entries in the table to derive the depth of view for the light sample. The determined depth of view is read from the memory 37 and stored as the depth of view for the pixel that was sampled. Thus, a distance (i.e., depth of view) is determined for each pixel within the field of view.
In some embodiments the distance is stored in memory together with the pixel coordinates (i.e., field of view coordinates) for later retrieval. Light intensity and color also may be detected and stored, as for a camera or other recording implementation.
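As a rough sketch of the per-pixel sampling and table lookup described above (hypothetical code; the calibration values, the signal range, and the helper read_detector_signal are assumptions, not details from the application):

```python
import bisect

# Hypothetical calibration table (memory 37): detector signal level -> depth of view.
# Signal values and depths are illustrative only.
CALIBRATION = [(0.10, 20.0), (0.25, 10.0), (0.50, 5.0), (0.80, 2.0), (1.00, 0.5)]
SIGNALS = [s for s, _ in CALIBRATION]

def depth_from_signal(signal: float) -> float:
    """Controller-43-style lookup: match the detector signal to a table entry."""
    i = min(bisect.bisect_left(SIGNALS, signal), len(CALIBRATION) - 1)
    return CALIBRATION[i][1]

def scan_frame(width: int, height: int, read_detector_signal) -> dict:
    """Raster-scan the field of view, sampling a depth for each pixel (x, y)."""
    depth_map = {}
    for y in range(height):              # vertical scanner 40 steps line by line
        for x in range(width):           # horizontal scanner 38 sweeps each line
            signal = read_detector_signal(x, y)   # light distance detector signal 35
            depth_map[(x, y)] = depth_from_signal(signal)
    return depth_map
```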
Light Distance Detector
Referring to Fig. 3, a light distance detector 36 according to one embodiment of this invention includes concentrically positioned light detection sensors 42-50 that form a set of concentric rings. The number of rings and the radial increment may vary depending on the distance resolution desired. Light 52 from the select pixel region is converged by the lensing system 34 onto the light distance detector 36. Referring to Fig. 4, such light 52 forms a spot 54, preferably centered at the center of the detector 36. The smaller the spot 54, the farther the focal source of the light 52 for the select pixel. For light originating approximately 20 feet or farther from the system 30, the light waves are essentially flat and focus down to a common point size. Accordingly, light at such distances is not differentiated (i.e., resolved). Light originating from zero to approximately 20 feet from the system 30, however, is differentiated by identifying which ring sensors detect light. In the example illustrated in Fig. 4, the light spot 54 encompasses sensors 42-48. A specific distance corresponds to activation of such sensors 42-48.
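A minimal sketch of this ring-count mapping (illustrative only; the ring numbering and the distance assigned to each ring count are assumptions for the example, not values from the application):

```python
# Outermost ring reached by the spot -> approximate source distance (feet).
# Five concentric ring sensors 42-50; the distances are illustrative, not from the patent.
RING_DEPTH_FEET = {1: 20.0, 2: 12.0, 3: 7.0, 4: 3.0, 5: 1.0}

def depth_from_rings(rings_hit: set[int]) -> float:
    """Rings are numbered 1 (innermost, sensor 42) to 5 (outermost, sensor 50).

    The larger the spot, the more rings it covers, and the closer the source.
    Light from roughly 20 feet to infinity focuses near a point and is not resolved.
    """
    if not rings_hit:
        raise ValueError("no ring sensor detected light")
    return RING_DEPTH_FEET[max(rings_hit)]

# Example from Fig. 4: the spot covers sensors 42-48 (rings 1-4).
print(depth_from_rings({1, 2, 3, 4}))  # -> 3.0 (illustrative)
```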
An alternative method for detecting the optical distance for pixel light is to modify the focal length of the lensing system 34 until a spot of a desired standard size is achieved. For example, the focal length may be varied until the spot size encompasses only sensors 42 and 44. Alternatively, only sensor 42 may define the standard spot size, or sensors 42-46, or some other prescribed subset of sensors 42-50 may define the prescribed spot size. Following is a description of a lensing system which can vary its focal length.
Lensing System with Variable Focal Length
To vary the focal length, the lensing system 34 includes a variable focus lens (VFL). In some embodiments the VFL has its focus varied by controlling the shape or thickness of the lens. In other embodiments the VFL has its focus varied by varying the index of refraction of the lens. Fig. 5 shows an electro-mechanically variable focus lens (VFL) 60 which changes its shape. A central portion 62 of the VFL 60 is constructed of a resonant piezoelectric crystalline quartz. In operation, a pair of transparent conductive electrodes 64 provides an electric field that deforms the piezoelectric material in a known manner. Such deformation changes the thickness of the central portion 62 along its optical axis to effectively change the focus of the VFL 60. Because the VFL 60 is a resonant device, its focal length varies periodically in a very predictable pattern. By controlling the time when a light pulse enters the resonant lens, the effective focal position of the VFL 60 can be controlled.
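If the resonant focal-length variation is modeled as sinusoidal (an assumed model for illustration; the center focal length, modulation depth and resonant frequency below are invented numbers, not values from the application), the sampling time that yields a desired effective focal length can be computed roughly as follows:

```python
import math

def sample_time_for_focal_length(f_target: float,
                                 f_center: float = 25e-3,    # assumed center focal length (m)
                                 f_amplitude: float = 2e-3,  # assumed modulation depth (m)
                                 resonant_hz: float = 20e3   # assumed resonant frequency (Hz)
                                 ) -> float:
    """Model the resonant VFL 60 as f(t) = f_center + f_amplitude*sin(2*pi*resonant_hz*t)
    and return a time within one resonance period at which the lens passes through
    the requested focal length, i.e. the moment to admit the light pulse."""
    x = (f_target - f_center) / f_amplitude
    if not -1.0 <= x <= 1.0:
        raise ValueError("requested focal length is outside the resonant sweep range")
    period = 1.0 / resonant_hz
    return (math.asin(x) / (2 * math.pi * resonant_hz)) % period
```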
In some applications, it may be undesirable to selectively delay pulses of light according to the resonant frequency of the VFL 60. In such cases, the VFL 60 is designed to be nonresonant at the frequencies of interest, yet fast enough to focus for each image pixel.
In an alternative embodiment, the variable focus lens is formed from a material that changes its index of refraction in response to an electric field or other input. For example, the lens material may be an electrooptic or acoustooptic material. In the preferred embodiment, the central portion 62 (see Fig. 5) is formed from lithium niobate, which is both electrooptic and acoustooptic. The central portion 62 thus exhibits an index of refraction that depends upon an applied electric field or acoustic energy. In operation, the electrodes 64 apply an electric field to control the index of refraction of the lithium niobate central portion 62. In another embodiment a quartz lens includes a transparent indium tin oxide coating that forms the electrode 64.
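For context, the index change available from the linear electro-optic (Pockels) effect in lithium niobate is commonly written as below (a textbook relation added for illustration, not a formula from the application; n_e and r_33 denote the extraordinary index and the relevant electro-optic coefficient):

```latex
% Linear electro-optic (Pockels) index change for a field E applied along the
% crystal's optic axis (textbook relation, shown for illustration only).
\[
  \Delta n_e \;\approx\; -\tfrac{1}{2}\, n_e^{3}\, r_{33}\, E ,
\]
% so the field set up by electrodes 64 shifts the refractive index, and hence
% the focal length, of the lithium niobate central portion 62.
```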
In another embodiment shown in Fig. 6, a lens 70 includes a compressible cylindrical center 72 having a gradient index of refraction as a function of its radius. A cylindrical piezoelectric transducer 74 forms an outer shell that surrounds the cylindrical center 72. When an electric field is applied to the transducer 74, the transducer 74 compresses the center 72. This compression deforms the center 72, thereby changing the gradient of the index of refraction. The changed gradient index changes the focal length of the center 72.
In another embodiment shown in Fig. 7 the variable focus element is a semiconductor device 80 that has an index of refraction that depends upon the free carrier concentration in a transmissive region 82. Applying either a forward or reverse voltage to the device 80 through a pair of electrodes 84 produces either a current that increases the free-carrier concentration or a reverse bias that depletes the free carrier concentration. Since the index of refraction depends upon the free carrier concentration, the applied voltage can control the index of refraction. Memory 86 and control electronics 88 may be used to control the index of refraction.
In still another embodiment, shown in Fig. 8, a plurality of lenses 90-92 are cascaded in series. One or more piezoelectric positioners 94-96 move one or more of the respective lenses 90-92 along the light path, changing the focal distance of the light beam. By changing the relative positions of the lenses with respect to each other, the curvature of the light varies.
According to one control approach, the lensing system 34 continuously varies its focal length as needed to maintain a constant spot size. Referring to Fig. 9, the light distance detector 36 and lensing system 34 are coupled in a feedback loop. The output of the light distance detector 36 is fed to focal control electronics 100. The focal control electronics 100 vary the focal length of a VFL 102 to maintain a constant spot size (e.g., the prescribed standard spot size previously described). The focal length at any given sample time correlates to the light distance (i.e., depth of view) for such sample. According to another control approach, the lensing system performs a sweep of the focal length range of the VFL during each light sample to be measured. During the sweep the spot 54 (see Fig. 4) will achieve its smallest size. The focal length at such time is used to define the light distance.
According to these control techniques, the precise light distance for any given sample is determined from the focal length of the lensing system 34 at the time such sample is taken. One of ordinary skill in the art will appreciate that a specific distance can be derived from the focal length using the various optical parameters (e.g., magnification factors, relative positions of components) of a system 30 embodiment.
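A sketch of the second control approach, sweeping the VFL and taking the focal length at the smallest spot (the helpers set_focal_length, read_spot_size and focal_length_to_distance are hypothetical stand-ins for system-specific hardware and calibration):

```python
def measure_depth_by_sweep(focal_lengths, set_focal_length, read_spot_size,
                           focal_length_to_distance):
    """Sweep the VFL 102 over its focal-length range for one light sample,
    record the spot size on detector 36 at each step, and convert the focal
    length giving the smallest spot into a depth of view."""
    best_f, best_spot = None, float("inf")
    for f in focal_lengths:
        set_focal_length(f)          # focal control electronics 100
        spot = read_spot_size()      # e.g., number of ring sensors 42-50 impinged
        if spot < best_spot:
            best_f, best_spot = f, spot
    return focal_length_to_distance(best_f)
```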
Scanning System
In one embodiment, the scanning system 32 includes a resonant scanner for performing horizontal scanning and a galvanometer for performing vertical scanning. The scanner serving as the horizontal scanner receives a drive signal having a horizontal scanning frequency. Similarly, the galvanometer serving as the vertical scanner receives a drive signal having a vertical scanning frequency. Preferably, the horizontal scanner has a resonant frequency corresponding to the horizontal scanning frequency. In other embodiments the vertical scanner also is a resonant scanner.
One embodiment of a resonant scanner includes a mirror driven by a drive circuit (e.g., electromagnetic drive circuit or piezoelectric actuator) to oscillate at a high frequency about an axis of rotation. The drive circuit moves the mirror responsive to a drive signal which defines the frequency of motion.
Referring to Fig. 2, background light 12 impinges on the mirror 39 of one scanner 38, then is reflected to another scanner 40, where its mirror 41 deflects the light toward the lensing system 34. As the scanner mirrors 39, 41 move, different portions (e.g., pixel areas) of the background field of view are directed toward the lensing system 34 and light distance detector 36.
In alternative embodiments, the scanning system 32 instead includes acousto-optical deflectors, electro-optical deflectors, or rotating polygons to perform the horizontal and vertical light deflection. In some embodiments, two of the same type of scanning device are used. In other embodiments different types of scanning devices are used for the horizontal scanner and the vertical scanner.
Image Capturing System
Referring to Fig. 10, an image capturing system 150 is shown in which image data is obtained and stored for each pixel within the field of view F for a single still frame or for multiple video image frames. The system 150 operates in the same manner as described for the system 30 of Fig. 2, and like parts performing like functions are given the same part numbers. In addition to detecting light distance, however, light intensity and light color are also detected for each pixel within the field of view.
Accordingly, a light intensity sensor 152 is included along with a color sensor 154. One of ordinary skill in the art will appreciate that the sensors 152, 154 and 36 may be combined into a common device, or that the color sensing and intensity sensing can be achieved with a common device. Further, rather than color detection, gray scales may be detected for black-and-white (monochromatic) viewing.
For each pixel in the field of view, image data is obtained and stored in memory storage 156. The image data includes the pixel coordinates, the determined light distance, the light intensity and the light color. Such image data may be recalled and displayed at display device 158 to replay the captured image frame(s). A controller 160 coordinates the field of view scanning and the image replay.
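One way to picture the stored per-pixel record (a hypothetical layout; the field names and types are assumptions, not specified by the application):

```python
from dataclasses import dataclass

@dataclass
class PixelRecord:
    """Per-pixel image data kept in memory storage 156 for replay on display 158."""
    x: int            # pixel coordinates within the field of view
    y: int
    depth: float      # determined light distance (depth of view)
    intensity: float  # from light intensity sensor 152
    color: tuple      # from color sensor 154, e.g. (r, g, b); gray scale if monochrome

frame = [PixelRecord(x=10, y=4, depth=7.5, intensity=0.8, color=(120, 200, 90))]
```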
Although preferred embodiments of the invention have been illustrated and described, various alternatives, modifications and equivalents may be used. Therefore, the foregoing description should not be taken as limiting the scope of the inventions which are defined by the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A method for determining depth of view for a sample area within an optical field of view, comprising the steps of: receiving light from the optical field of view F; converging a portion of the received light; detecting a pattern of the converged light corresponding to the sample area; and identifying from the detected pattern a depth of view for the sample area.
2. The method of claim 1, wherein the step of detecting comprises the steps of positioning a plurality of concentric light sensors 42-50 along an expected path of the converged light, and receiving the converged light with at least one of the plurality of concentric light sensors; and wherein the step of identifying comprises the steps of identifying which of the at least one of the plurality of concentric light sensors received impinging light, and matching to a corresponding depth of view the at least one of the plurality of concentric light sensors which received impinging light.
3. The method of claim 1 or 2, further comprising the step of storing data correlated to varying depths of view; and wherein the step of identifying comprises looking up the depth of view for the identified pattern from the stored data.
4. A method for determining depth of view for each one of a plurality of sample areas within an optical field of view, comprising the steps of: receiving light from the optical field of view; scanning the received light along a predetermined pattern; converging a portion of the scanned light; respectively detecting over time a pattern of the converged light corresponding to each one of the plurality of sample areas; and respectively identifying from the respective detected pattern a depth of view for each said one of the plurality of sample areas.
5. The method of claim 4, wherein the step of detecting comprises the step of detecting, for a given sample area of the plurality of sample areas, the converged light impinging upon at least one of a plurality of concentric light sensors 42-50; and wherein the step of identifying comprises the step of identifying which of the at least one of the plurality of concentric light sensors received impinging light for the given sample area, wherein the depth of view for the given sample area corresponds to the identified at least one of the plurality of concentric light sensors which received impinging light.
6. The method of claim 1, 2, 3, 4 or 5, wherein the step of identifying comprises the step of varying the focal length of a variable focus lens 62/70, the focal length corresponding to the determined depth of view.
7. The method of claim 6, wherein the step of converging comprises converging a portion of the scanned light with the variable focus lens.
8. The method of claim 4, 5, 6 or 7, in which each one of the plurality of sample areas is a pixel.
9. The method of claim 8, in which the step of identifying further comprises the step of generating a signal indicative of the depth of view, and further comprising the step of storing the indication of depth of view for each pixel.
10. The method of claim 9, further comprising the step of storing pixel coordinates for each pixel.
11. The method of claim 9, further comprising the steps of: respectively detecting over time the intensity of the converged light corresponding to each pixel; and storing the detected light intensity for each pixel.
12. The method of claim 9, further comprising the steps of: respectively detecting over time the color of the converged light corresponding to each pixel; and storing the detected color for each pixel.
13. An apparatus for determining depth of view for a sample area within an optical field of view, comprising: a first lens 14 receiving light from the optical field of view and directing the light along a light path; a scanner 32 receiving the directed light for redirecting at least a portion of the light; a second lens 34 receiving the redirected light and converging the redirected light; and a light detector 36 which receives the converged light and generates a signal 35 indicative of the depth of view for the sample area corresponding to the converged light.
14. The apparatus of claim 13, wherein the scanner comprises a mirror 38 which moves along a predetermined scanning path, wherein at a given time during the scanning path the mirror redirects light for a select pixel within the field of view, the select pixel changing with time during a scanning cycle, and wherein the light detector receives converged light for the select pixel, the select pixel corresponding to the sample area.
15. The apparatus of claim 13 or 14, in which the light detector comprises a plurality of concentric light sensors 42-50, and wherein the number of light sensors of the plurality of concentric light sensors which detect the converged light is indicative of the depth of view for the sample area.
16. The apparatus of claim 13, 14, or 15, wherein the second lens is a variable focus lens 62/70 which receives a signal 64 to cause a change in focal length of the variable focus lens, wherein the focal length of the variable focus lens which results in a minimal spot size of converged light on the light detector corresponds to the depth of view of the sample area.
17. The apparatus of claim 13, 14, 15 or 16, wherein the second lens is a variable focus lens 62/70 which receives a signal 64 to cause a change in focal length of the variable focus lens, and further comprising a controller 88 receiving the indicative signal, wherein the controller generates a signal for adjusting the focal length of the variable focus lens in response to the indicative signal to maintain the indicative signal constant, and wherein the focal length of the variable focus lens at a select sample time corresponding to the sample area is indicative of the depth of view of the sample area.
18. The apparatus of claim 16, further comprising memory 86 and wherein the light detector also indicates light intensity and light color, wherein light intensity, light color, light depth of view and pixel coordinates are stored in memory for each one of a plurality of pixels selected by the scanner.
19. An apparatus for determining depth of view for pixels within an optical field of view, comprising: a first lens 14 receiving light from the optical field of view and directing the light along a light path; a scanner 32 receiving the directed light for redirecting at least a portion of the light, wherein the scanner comprises a mirror 38 which moves along a predetermined scanning path, wherein at a given time during the scanning path the mirror redirects light for a select pixel within the field of view, the select pixel changing with time during a scanning cycle; a variable focus lens 62/70/80 receiving the redirected light and receiving a signal 64 for controlling focal length of the variable focus lens, the variable focus lens converging the redirected light, the converged light corresponding to the select pixel; a light detector 36 which receives the converged light and generates a signal; and a control circuit 43/88 coupled to the variable focus lens which generates the controlling signal received at the variable focus lens, wherein the focal length of the variable focus lens corresponds to the determined depth of view for the sample area.
20. The apparatus of claim 19, in which the light detector comprises a plurality of concentric light sensors 42-50, wherein the control signal varies the focal length of the variable focus lens during light detection for the select pixel, wherein the focal length of the variable focus lens which causes a smallest spot size at the light detector corresponds to the depth of view for the select pixel.
PCT/US1999/025689 1998-11-09 1999-11-02 Method and apparatus for determining optical distance WO2000028371A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CA002347253A CA2347253C (en) 1998-11-09 1999-11-02 Method and apparatus for determining optical distance
EP99961558A EP1129383A4 (en) 1998-11-09 1999-11-02 Method and apparatus for determining optical distance
JP2000581496A JP2002529792A (en) 1998-11-09 1999-11-02 Method and apparatus for determining optical distance
AU18110/00A AU758750B2 (en) 1998-11-09 1999-11-02 Method and apparatus for determining optical distance

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/188,991 1998-11-09
US09/188,991 US6191761B1 (en) 1998-11-09 1998-11-09 Method and apparatus for determining optical distance

Publications (1)

Publication Number Publication Date
WO2000028371A1 true WO2000028371A1 (en) 2000-05-18

Family

ID=22695443

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/025689 WO2000028371A1 (en) 1998-11-09 1999-11-02 Method and apparatus for determining optical distance

Country Status (7)

Country Link
US (4) US6191761B1 (en)
EP (1) EP1129383A4 (en)
JP (1) JP2002529792A (en)
KR (1) KR100704083B1 (en)
AU (1) AU758750B2 (en)
CA (1) CA2347253C (en)
WO (1) WO2000028371A1 (en)

Families Citing this family (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6281862B1 (en) * 1998-11-09 2001-08-28 University Of Washington Scanned beam display with adjustable accommodation
US6191761B1 (en) * 1998-11-09 2001-02-20 University Of Washington Method and apparatus for determining optical distance
US7053937B1 (en) * 1999-05-21 2006-05-30 Pentax Corporation Three-dimensional image capturing device and recording medium
US7555333B2 (en) 2000-06-19 2009-06-30 University Of Washington Integrated optical scanning image acquisition and display
US8929688B2 (en) 2004-10-01 2015-01-06 University Of Washington Remapping methods to reduce distortions in images
US7298938B2 (en) * 2004-10-01 2007-11-20 University Of Washington Configuration memory for a scanning beam device
US7159782B2 (en) * 2004-12-23 2007-01-09 University Of Washington Methods of driving a scanning beam device to achieve high frame rates
EP2653995B1 (en) * 2004-12-23 2017-03-15 University of Washington Methods of driving a scanning beam device to achieve high frame rates
US7784697B2 (en) 2004-12-23 2010-08-31 University Of Washington Methods of driving a scanning beam device to achieve high frame rates
US7189961B2 (en) 2005-02-23 2007-03-13 University Of Washington Scanning beam device with detector assembly
JP2006279228A (en) * 2005-03-28 2006-10-12 Fuji Xerox Co Ltd Image pickup apparatus
US20060226231A1 (en) * 2005-03-29 2006-10-12 University Of Washington Methods and systems for creating sequential color images
GB0512743D0 (en) * 2005-06-17 2005-10-26 Mbda Uk Ltd Range detection
US7395967B2 (en) * 2005-07-21 2008-07-08 University Of Washington Methods and systems for counterbalancing a scanning beam device
US7312879B2 (en) 2005-08-23 2007-12-25 University Of Washington Distance determination in a scanned beam image capture device
US7680373B2 (en) * 2006-09-13 2010-03-16 University Of Washington Temperature adjustment in scanning beam devices
US9079762B2 (en) 2006-09-22 2015-07-14 Ethicon Endo-Surgery, Inc. Micro-electromechanical device
US7561317B2 (en) * 2006-11-03 2009-07-14 Ethicon Endo-Surgery, Inc. Resonant Fourier scanning
US7447415B2 (en) * 2006-12-15 2008-11-04 University Of Washington Attaching optical fibers to actuator tubes with beads acting as spacers and adhesives
US7738762B2 (en) * 2006-12-15 2010-06-15 University Of Washington Attaching optical fibers to actuator tubes with beads acting as spacers and adhesives
US20080146898A1 (en) * 2006-12-19 2008-06-19 Ethicon Endo-Surgery, Inc. Spectral windows for surgical treatment through intervening fluids
US7713265B2 (en) * 2006-12-22 2010-05-11 Ethicon Endo-Surgery, Inc. Apparatus and method for medically treating a tattoo
US20080151343A1 (en) * 2006-12-22 2008-06-26 Ethicon Endo-Surgery, Inc. Apparatus including a scanned beam imager having an optical dome
US8273015B2 (en) * 2007-01-09 2012-09-25 Ethicon Endo-Surgery, Inc. Methods for imaging the anatomy with an anatomically secured scanner assembly
US8801606B2 (en) 2007-01-09 2014-08-12 Ethicon Endo-Surgery, Inc. Method of in vivo monitoring using an imaging system including scanned beam imaging unit
US8305432B2 (en) 2007-01-10 2012-11-06 University Of Washington Scanning beam device calibration
US7589316B2 (en) * 2007-01-18 2009-09-15 Ethicon Endo-Surgery, Inc. Scanning beam imaging with adjustable detector sensitivity or gain
US20080221388A1 (en) * 2007-03-09 2008-09-11 University Of Washington Side viewing optical fiber endoscope
US20080226029A1 (en) * 2007-03-12 2008-09-18 Weir Michael P Medical device including scanned beam unit for imaging and therapy
US8216214B2 (en) 2007-03-12 2012-07-10 Ethicon Endo-Surgery, Inc. Power modulation of a scanning beam for imaging, therapy, and/or diagnosis
JPWO2008123104A1 (en) * 2007-03-27 2010-07-15 日本電気株式会社 Minute displacement measuring device, minute displacement measuring method, minute displacement measuring program
US7583872B2 (en) * 2007-04-05 2009-09-01 University Of Washington Compact scanning fiber device
US8626271B2 (en) 2007-04-13 2014-01-07 Ethicon Endo-Surgery, Inc. System and method using fluorescence to examine within a patient's anatomy
US7995045B2 (en) 2007-04-13 2011-08-09 Ethicon Endo-Surgery, Inc. Combined SBI and conventional image processor
US7608842B2 (en) * 2007-04-26 2009-10-27 University Of Washington Driving scanning fiber devices with variable frequency drive signals
US20080281159A1 (en) * 2007-05-08 2008-11-13 University Of Washington Coordinating image acquisition among multiple endoscopes
US20080281207A1 (en) * 2007-05-08 2008-11-13 University Of Washington Image acquisition through filtering in multiple endoscope systems
US8212884B2 (en) * 2007-05-22 2012-07-03 University Of Washington Scanning beam device having different image acquisition modes
US8160678B2 (en) 2007-06-18 2012-04-17 Ethicon Endo-Surgery, Inc. Methods and devices for repairing damaged or diseased tissue using a scanning beam assembly
US7558455B2 (en) * 2007-06-29 2009-07-07 Ethicon Endo-Surgery, Inc Receiver aperture broadening for scanned beam imaging
US7982776B2 (en) * 2007-07-13 2011-07-19 Ethicon Endo-Surgery, Inc. SBI motion artifact removal apparatus and method
US20090021818A1 (en) * 2007-07-20 2009-01-22 Ethicon Endo-Surgery, Inc. Medical scanning assembly with variable image capture and display
US8437587B2 (en) * 2007-07-25 2013-05-07 University Of Washington Actuating an optical fiber with a piezoelectric actuator and detecting voltages generated by the piezoelectric actuator
US9125552B2 (en) * 2007-07-31 2015-09-08 Ethicon Endo-Surgery, Inc. Optical scanning module and means for attaching the module to medical instruments for introducing the module into the anatomy
US7983739B2 (en) 2007-08-27 2011-07-19 Ethicon Endo-Surgery, Inc. Position tracking and control for a scanning assembly
US7925333B2 (en) 2007-08-28 2011-04-12 Ethicon Endo-Surgery, Inc. Medical device including scanned beam unit with operational control features
US7522813B1 (en) * 2007-10-04 2009-04-21 University Of Washington Reducing distortion in scanning fiber devices
WO2009057114A2 (en) * 2007-10-31 2009-05-07 Ben Gurion University Of The Negev Research And Development Authority Optical sensor measurement and crosstalk evaluation
US8411922B2 (en) * 2007-11-30 2013-04-02 University Of Washington Reducing noise in images acquired with a scanning beam device
US20090208143A1 (en) * 2008-02-19 2009-08-20 University Of Washington Efficient automated urothelial imaging using an endoscope with tip bending
US20090219606A1 (en) * 2008-03-03 2009-09-03 General Electric Company Device and method
US8050520B2 (en) * 2008-03-27 2011-11-01 Ethicon Endo-Surgery, Inc. Method for creating a pixel image from sampled data of a scanned beam imager
US8332014B2 (en) * 2008-04-25 2012-12-11 Ethicon Endo-Surgery, Inc. Scanned beam device and method using same which measures the reflectance of patient tissue
JP4483983B2 (en) * 2008-06-26 2010-06-16 ソニー株式会社 Image compression apparatus and image compression method
US8743348B2 (en) * 2010-09-03 2014-06-03 Pixart Imaging Inc. Optical distance detection system
US9292973B2 (en) 2010-11-08 2016-03-22 Microsoft Technology Licensing, Llc Automatic variable virtual focus for augmented reality displays
US9304319B2 (en) * 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
CN102749332B (en) * 2011-04-18 2015-08-26 通用电气公司 Optical system and optical detection apparatus and detection method
US9323325B2 (en) 2011-08-30 2016-04-26 Microsoft Technology Licensing, Llc Enhancing an object of interest in a see-through, mixed reality display device
US20130335528A1 (en) * 2012-05-15 2013-12-19 Board Of Regents Of The University Of Texas System Imaging device capable of producing three dimensional representations and methods of use
CN105209084B (en) 2013-03-13 2018-07-03 艾利丹尼森公司 Improved bond properties
CN111025321B (en) * 2019-12-28 2022-05-27 奥比中光科技集团股份有限公司 Variable-focus depth measuring device and measuring method

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5482229A (en) * 1977-12-14 1979-06-30 Canon Inc Automatic focus control camera
US4233503A (en) * 1979-03-05 1980-11-11 Nihon Beru-Haueru Kabushiki Kaisha (Bell & Howell Japan, Ltd.) Automatic focus adjusting system
US4269491A (en) * 1979-09-18 1981-05-26 Nippon Kogaku K.K. Distance information judging circuit for focus detecting apparatus
DE3333830C2 (en) * 1983-09-20 1985-08-01 Ralf 6751 Katzweiler Hinkel Method for laser distance measurement with high resolution for close range
JPS60123812A (en) * 1983-12-09 1985-07-02 Canon Inc Distance measuring device
US4584704A (en) * 1984-03-01 1986-04-22 Bran Ferren Spatial imaging system
FR2572515B1 (en) * 1984-10-25 1993-06-18 Canon Kk POSITION DETECTION DEVICE
JPH0617948B2 (en) * 1985-02-13 1994-03-09 富士写真フイルム株式会社 Optical beam scanning device
US4647193A (en) * 1985-06-10 1987-03-03 Rca Corporation Optical target ranging apparatus
GB8626812D0 (en) * 1986-11-10 1986-12-10 Sira Ltd Surface inspection
US5361115A (en) * 1989-04-21 1994-11-01 Canon Kabushiki Kaisha Camera
EP0408224B1 (en) * 1989-06-29 1995-09-06 The Research Foundation Of State University Of New York Computational methods and electronic camera apparatus for determining distance of objects, rapid autofocusing and obtaining improved focus images
US5355181A (en) * 1990-08-20 1994-10-11 Sony Corporation Apparatus for direct display of an image on the retina of the eye using a scanning laser
LU87896A1 (en) * 1991-03-01 1992-11-16 Wurth Paul Sa Process for treating steel slag, installation for implementing same and slags obtained by the process
US5467104A (en) 1992-10-22 1995-11-14 Board Of Regents Of The University Of Washington Virtual retinal display
US5596339A (en) 1992-10-22 1997-01-21 University Of Washington Virtual retinal display with fiber optic point source
JPH06324285A (en) 1993-05-13 1994-11-25 Olympus Optical Co Ltd Visual display device
US5498868A (en) * 1993-09-02 1996-03-12 Nippondenso Co., Ltd. Optical data reader capable of quickly changing a condensing position of a light beam
JPH07128579A (en) * 1993-10-29 1995-05-19 Canon Inc Detecting method for light of sight, detecting means for line of sight and video device having detecting means for line of sight
US5764290A (en) * 1993-11-26 1998-06-09 Sony Corporation Flange-back adjusting method and apparatus for a video camera using an inner focus lens assembly
US5668631A (en) * 1993-12-20 1997-09-16 Minolta Co., Ltd. Measuring system with improved method of reading image data of an object
JP2935805B2 (en) * 1994-03-23 1999-08-16 ローム株式会社 Auto focus projector
JP2826265B2 (en) * 1994-03-28 1998-11-18 株式会社生体光情報研究所 Tomographic imaging system
US5999840A (en) * 1994-09-01 1999-12-07 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets
US5557444A (en) 1994-10-26 1996-09-17 University Of Washington Miniature optical scanner for a two axis scanning system
US5717453A (en) * 1995-06-07 1998-02-10 Meso Scale Technology, Inc. Three dimensional imaging system
JP2962581B2 (en) * 1995-06-30 1999-10-12 Siemens Aktiengesellschaft Optical distance sensor
JP3286804B2 (en) * 1995-09-14 2002-05-27 キヤノン株式会社 Imaging device
JPH09114397A (en) * 1995-10-19 1997-05-02 Mitsubishi Electric Corp Display device and display equipment
JPH09187038A (en) * 1995-12-27 1997-07-15 Canon Inc Three-dimensional shape extract device
US5694237A (en) 1996-09-25 1997-12-02 University Of Washington Position detection of mechanical resonant scanner mirror
JP3872872B2 (en) * 1997-08-14 2007-01-24 株式会社東芝 Optical apparatus and image forming apparatus
US6191761B1 (en) * 1998-11-09 2001-02-20 University Of Washington Method and apparatus for determining optical distance
US7046838B1 (en) * 1999-03-30 2006-05-16 Minolta Co., Ltd. Three-dimensional data input method and apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5305124A (en) * 1992-04-07 1994-04-19 Hughes Aircraft Company Virtual image display system
US5299063A (en) * 1992-11-10 1994-03-29 Honeywell, Inc. Cross projection visor helmet mounted display
US5754344A (en) * 1995-10-12 1998-05-19 Canon Kabushiki Kaisha Head-mounted stereoscopic image display apparatus
US5701132A (en) * 1996-03-29 1997-12-23 University Of Washington Virtual retinal display with expanded exit pupil
US5903397A (en) * 1998-05-04 1999-05-11 University Of Washington Display with multi-surface eyepiece

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1129383A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2131597A3 (en) * 2008-06-04 2010-09-08 Sony Corporation Image encoding device and image encoding method
US8369630B2 (en) 2008-06-04 2013-02-05 Sony Corporation Image encoding device and image encoding method with distance information

Also Published As

Publication number Publication date
KR100704083B1 (en) 2007-04-05
US20060077121A1 (en) 2006-04-13
EP1129383A4 (en) 2006-04-26
US6492962B2 (en) 2002-12-10
JP2002529792A (en) 2002-09-10
AU1811000A (en) 2000-05-29
KR20010092724A (en) 2001-10-26
US20030016187A1 (en) 2003-01-23
EP1129383A1 (en) 2001-09-05
US6191761B1 (en) 2001-02-20
CA2347253C (en) 2004-03-30
US6977631B2 (en) 2005-12-20
AU758750B2 (en) 2003-03-27
US20010001240A1 (en) 2001-05-17
CA2347253A1 (en) 2000-05-18

Similar Documents

Publication Publication Date Title
US6191761B1 (en) Method and apparatus for determining optical distance
KR910000617B1 (en) Image pick-up apparatus
US5448395A (en) Non-mechanical step scanner for electro-optical sensors
US6359650B1 (en) Electronic camera having a tilt detection function
US6750435B2 (en) Lens focusing device, system and method for use with multiple light wavelengths
US5134474A (en) Method of compensating scattered characteristics of outputs of an infrared detector of multiple-element type
WO1992012499A1 (en) Bar code scanner with a large depth of field
US7583293B2 (en) Apparatus and method for generating multi-image scenes with a camera
US5544252A (en) Rangefinding/autofocusing device of joint transform correlation type and driving method thereof
EP0127451A1 (en) Automatic focus control system for video camera
JP3733228B2 (en) Imaging device with tilt mechanism, method, and storage medium
JP2003149032A (en) Level measuring device
EP0508897A1 (en) Method and system for automatic focussing of a CCD camera using a DCT
JPH0695141B2 (en) Laser radar image forming device
RU2024212C1 (en) Method of infra-red imaging identification of shape of objects
JPH09139806A (en) Image information reader
JP3373319B2 (en) Video scanner device
JPS61121022A (en) Microscopic image pickup device
JPS61247157A (en) Automatic focusing device
JPH11153412A (en) Image pickup device for measurement purpose
JPH09126735A (en) Multiple image forming camera and shape measuring method using this camera
JPH0552658A (en) Infrared radiation image pickup device
JPH08161466A (en) Image pickup device for mobile object
JPH06148503A (en) Focus adjusting device for infrared camera

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref country code: AU

Ref document number: 2000 18110

Kind code of ref document: A

Format of ref document f/p: F

AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG UZ VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2347253

Country of ref document: CA

Ref country code: CA

Ref document number: 2347253

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 18110/00

Country of ref document: AU

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2000 581496

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 1020017005823

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 1999961558

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1999961558

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWP Wipo information: published in national office

Ref document number: 1020017005823

Country of ref document: KR

WWG Wipo information: grant in national office

Ref document number: 18110/00

Country of ref document: AU

WWG Wipo information: grant in national office

Ref document number: 1020017005823

Country of ref document: KR