US20040239949A1 - Method for elementary depth detection in 3D imaging - Google Patents

Method for elementary depth detection in 3D imaging Download PDF

Info

Publication number
US20040239949A1
Authority
US
United States
Prior art keywords
isa
intensity
location
pixels
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/886,176
Inventor
Mark Knighton
David Agabra
William McKinley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/660,809 external-priority patent/US6639684B1/en
Application filed by Individual filed Critical Individual
Priority to US10/886,176 priority Critical patent/US20040239949A1/en
Publication of US20040239949A1 publication Critical patent/US20040239949A1/en
Assigned to BIGFOOT PRODUCTIONS, INC. reassignment BIGFOOT PRODUCTIONS, INC. SECURITY INTEREST Assignors: NEXTENGINE, INC.
Abandoned legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2518: Projection by scanning of the object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G06T 7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light

Abstract

A method and apparatus for obtaining up to one depth measurement for each pixel. An image sensing array (ISA) captures a measurement of intensity on a surface. The intensity maps to depth. Accordingly, from a single pixel, or from a group of pixels without regard to the intensity distribution among the pixels in the group, a depth measurement can be obtained independently of any and/or all data captured by other pixels in the array.

Description

  • This is a continuation of application Ser. No. 09/839,755, filed on Apr. 19, 2001, entitled “Method for Depth Detection in 3D Imaging Providing a Depth Measurement for Each Unitary Group of Pixels” which is a Continuation-in-Part of application Ser. No. 09/660,809, filed Sep. 13, 2000, entitled “Digitizer Using Intensity Gradient to Image Features of Three-Dimensional Objects.”[0001]
  • BACKGROUND
  • 1. Field of the Invention [0002]
  • The invention relates to depth detection. More specifically, the invention relates to using individual pixels or elementary groups of pixels to determine distance from a reference. [0003]
  • 2. Background
  • Various 3D imaging techniques exist for capturing three-dimensional representations of three-dimensional objects. To effect these captures it is necessary to determine depth from a reference point for each point represented. One way this has been done in the past is using a laser that sweeps over the target object. As the laser sweeps, the spot reflects off the object and strikes an image sensing array (“ISA”); where the spot strikes the ISA indicates the depth of the point on the target from which it was reflected. Only a single depth measurement is captured for each capture period using this method. If we assume an ISA has 5,000 pixels, this implies a ratio of 1 to 5,000 in terms of depth measurements per pixel. [0004]
  • Another common technique is to capture a projected pattern and interpret the distortion of the projected pattern. Using this technique, it is necessary to contextualize the distortion with the readings from adjacent pixels in order to identify and interpret a depth measurement for a particular location. While this technique improves the depth measurement per pixel ratio over the laser method discussed above, each measurement is dependent on the values captured by surrounding pixels and the ratio does not approach 1 to 1. [0005]
  • Ultimately, the maximum resolution of a 3D image captured during a period of time is limited by the number of depth measurements that can be derived during that time period from the ISA. Maximum resolution and speed are achieved where each pixel provides a depth measurement for each capture period. [0006]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention is illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “an” or “one” embodiment in this disclosure are not necessarily to the same embodiment, and such references mean at least one. [0007]
  • FIG. 1 is a block diagram of a system of one embodiment of the invention. [0008]
  • FIG. 2 is a block diagram of a subsystem of one embodiment of the invention. [0009]
  • FIG. 3 shows a schematic diagram of an intensity gradient cone of one embodiment of the invention. [0010]
  • DETAILED DESCRIPTION
  • FIG. 1 is a block diagram of a system of one embodiment of the invention. The distributed network 100, such as the Internet, provides an interconnection between a plurality of user nodes 110, a server node 120 and a host 150. Server node 120 may be any conventional server or a collection of servers to handle traffic and requests over the distributed network. User nodes may be discrete computers running a web browser, a corporate network, another server site, or any other node on the distributed network. Host 150 may be a computer (laptop, desktop, hand-held, server, workstation, etc.), an internet appliance or any other device through which data may be forwarded across the distributed network. [0011]
  • The host 150 may communicate over a wired link, such as a universal serial bus (USB), or a wireless link 162 to a digitizer 170. The digitizer 170 may be any of the myriad noncontact digitizers. One suitable digitizer is described in copending patent application Ser. No. 09/660,809, entitled DIGITIZER USING INTENSITY GRADIENT TO IMAGE FEATURES OF THREE-DIMENSIONAL OBJECTS and assigned to the assignee of the instant application. [0012]
  • In one embodiment, digitizer 170 is physically independent of an orientation fixture 180. For user convenience, it is desirable to minimize space permanently allocated to the system and minimize setup time. Most users will not be able to allocate sufficient space to leave the system configured for use at all times. The user will therefore be required to reintroduce some portion of the system prior to each use. The need to swap cables and otherwise rewire serves as a significant deterrent to widespread consumer adoption. [0013]
  • As used herein, “physically independent” means that no mechanical or wired electrical connection must exist between the physically independent units during operation. By way of example and not limitation, two devices coupled together by an electrical signaling wire, either directly or through a host computer, are not physically independent, whereas two devices that have no physical coupling and communicate over a wireless link are deemed “physically independent.” Connection to a common power source, e.g., two outlets in a house, is not deemed to destroy physical independence. [0014]
  • Orientation fixture 180 repositions an object to be digitized by digitizer 170 such that different aspects of the object are exposed relative to the digitizer at different points in time. In one embodiment the orientation fixture 180 is a turntable. One suitable turntable is described in copending application Ser. No. 09/660,810 entitled WIRELESS TURNTABLE and assigned to the assignee of the instant application. The orientation fixture may also be a robotic arm or other robotic device, or may be a turntable in conjunction with a robotic arm or other robotic device. Other mechanisms that are capable of exposing different aspects of an object relative to the digitizer are deemed to be within the ambit of orientation fixtures. [0015]
  • As previously noted, the orientation fixture is physically independent of the digitizer. One premise of the system is relative ease of setup to facilitate wide acceptance. Thus, with the physical independence it is desirable that the digitizer 170 and orientation fixture 180 be able to “find” each other. To that end, the digitizer 170 may be equipped to sweep an area, looking with its sensing apparatus for a feature of the orientation fixture 180. The orientation fixture 180 may include a feature such as indicia, for example, acquisition indicia 188, or may contain some other physically observable structure that permits the digitizer to identify and acquire the orientation fixture 180 without the user introducing or removing a separate reference object. Acquiring the orientation fixture may permit, for example, any of automatic calibration of the digitizer, automatic determination of the relative position of the digitizer and orientation fixture, and determination of the fixture's orientation or condition. In one embodiment, imaging the feature provides an indication of focal distance, as the perspective of the feature varies in a known way with distance. Calibration may be performed by imaging the feature and comparing the results to a set of reference data corresponding to the feature. In this manner the digitizer settings can be automatically optimized to provide the best available accuracy under existing conditions. Alternatively, the calibration can be performed based on a reference target or path entirely within the digitizer. [0016]
  • Alternatively, the orientation fixture may have a localized radiation source 186, which permits the digitizer 170 to sweep and identify the location of the orientation fixture based on the localized radiation from radiation source 186. It is also within the scope and contemplation of the invention to have the orientation fixture 180 position itself relative to the digitizer, such that the orientation fixture controls the acquisition by the digitizer 170 of the orientation fixture 180 and the object to be oriented thereby. In such an embodiment the orientation fixture would likely be a mobile robotic unit. [0017]
  • In one embodiment, the digitizer communicates with the orientation fixture across a wireless link 184 to coordinate the orientation of the object with image capture by the digitizer. The wireless link may be infrared (“IR”), radio frequency (“RF”), optical signaling, or any other mode of wireless communication. In one embodiment the orientation fixture 180 includes a self-contained power source 194 such as a battery. The self-contained power source 194 may also be a solar panel, fuel cell, or any other suitable power source. [0018]
  • In one embodiment of the invention, digitizer 170 captures information about an object positioned by orientation fixture 180 from which a three-dimensional model can be derived. Controller 192 in digitizer 170 controls the coordination between the data capture by digitizer 170 and aspect change by the orientation fixture 180. It is within the scope and contemplation of the invention for the controller to reside in the host, the digitizer, the orientation fixture or in an independent unit. References to the controller herein are deemed to include without limitation all of these options. The digitizer 170 may also include a data analyzer 196 that reviews captured data to find errors, anomalies or other points of interest that warrant further investigation, including possibly rescanning the corresponding area. After any corrective action, the data captured by digitizer 170 is passed to the host 150, which renders the three-dimensional model from the data. The host 150 may perform compression or any other manipulation of the data known in the art. The three-dimensional model may then be sent over distributed network 100 to remote nodes such as user nodes 110 or a server node 120. This provides maximum ease of distribution across the distributed network 100. [0019]
  • In some cases, control of distribution of information captured by the digitizer is desirable, for example, to facilitate administration of user fees. To that end, in one embodiment the digitizer is provided with a hardware interlock 190 which prevents the system from operating without first receiving authorization. Such authorization may be provided by the server node 120 sending authorization data across the distributed network. Alternative locking mechanisms such as software or firmware-based locking mechanisms may also be employed either within the digitizer 170 or the host 150. Further security of the system can be effected by requiring an imaging application 152 on the host 150 to provide a valid digital signature in addition to the authorization data before enabling capture and/or transfer of captured data from the digitizer 170 to the host 150. [0020]
  • Some embodiments of the digitizer 170 may encrypt the captured data prior to sending it to the host 150. In that event, unless the host is able to decrypt the data to render it, it may forward it on to the server node 120 across the distributed network, and subsequent rendering of the image or three-dimensional model would occur on the server node 120. In this manner, the local user does not have access to the data from which the three-dimensional model may be derived unless a key is provided. In still another embodiment, the host 150 may include encryption capabilities and encrypt the rendered image before forwarding it on to the server node 120. Keying information may be provided to the digitizer and/or the host by the server node 120. The server node may maintain keying information and authorization data in a local database 122. Once the three-dimensional data is safely controlled by the server node 120, access to the data may be made available for free or at cost to the user nodes 110 or back to the host 150. [0021]
  • The digitizer may also include a field programmable gate array (“FPGA”) or other reconfigurable logic unit. In such case, the server node may periodically reprogram the FPGA to implement an updated or enhanced algorithm for processing or security purposes, for example, as subsequently developed. [0022]
  • FIG. 2 is a block diagram of a subsystem of one embodiment of the invention. The subsystem of FIG. 2 may be inserted in place of host 150, digitizer 170 and orientation fixture 180 of FIG. 1. Digitizer 70 is coupled to a host 50. This coupling may be by a bus 60 such as the Universal Serial Bus (USB), IEEE 1394 bus, or any other suitable data transfer system. It is also within the scope and contemplation of the invention for the digitizer to communicate with the host via a wireless interconnection. Host 50 may be a personal computer, a workstation, an internet appliance, or any other device that provides sufficient intelligence and processing power to render images from the data obtained by the digitizer. The digitizer 70 captures image data and may forward it to the host 50 for rendering. In this way, the processing on the digitizer 70 may be limited, permitting lower-cost construction. It is also within the scope and contemplation of the invention for the digitizer to render the image and deliver it directly to a distributed network. It is further within the scope and contemplation of the invention for the digitizer to deliver the data to a distributed network for rendering on a remote node. [0023]
  • The digitizer 70 includes a projector to project a stripe of white light through a projection window 74 onto a remote object, such as a person 82 on a turntable 80 remote from the digitizer. The digitizer also contains an image sensing array (ISA) aligned with an image capture window 76 which captures the image of the object 82 within a focal zone. In one embodiment, the ISA is a linear charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) sensor, and the focal zone is a line on the target object. In some embodiments, the digitizer includes a base 72 about which the upper unit, including the projector and the ISA, can rotate in either direction. This permits the focal line to be swept back and forth across a target object through an arc. This sweeping reduces the loss of detail in the captured image that results from shadowing on the object from the perspective of an immobile focal line. The digitizer 70 also includes a wireless interface to communicate with a turntable 80 via a wireless link 84. [0024]
  • Turntable 80 may be the type described in co-pending application entitled WIRELESS TURNTABLE, Ser. No. 09/660,810, assigned to the assignee of the instant application. Via wireless link 84, the digitizer sends commands to the turntable 80 and receives from the turntable indications of the angular position of the turntable surface relative to a home position. When the digitizer is activated, it searches for the turntable 80 by sending a signal to which the turntable 80 is required to respond. If the turntable responds, the digitizer looks for a predetermined pattern that is expected to be present on the turntable surface. For example, the pattern may be concentric circles on the turntable surface. In such case, based on the image captured, the digitizer can both find the turntable and determine its distance from the digitizer. Then, after the response is received, the digitizer sends a “go home” signal to the turntable. In some embodiments, the digitizer sends acceleration and rotation profiles to the turntable to control its rotation. Each profile may be retained in firmware on the digitizer or downloaded from host 50. [0025]
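The paragraph above describes the acquisition handshake only at the behavioral level. The following Python sketch is purely illustrative: the patent specifies no wire protocol, and the `Turntable` class, method names, and profile values here are hypothetical stand-ins for the wireless exchange over link 84.

```python
class Turntable:
    """Hypothetical stand-in for the wireless turntable reached over link 84."""
    def respond(self):
        return True                      # the turntable is required to respond
    def go_home(self):
        print("turntable: rotating to home position")
    def load_profile(self, accel, rotation):
        print(f"turntable: profile accel={accel}, rotation={rotation}")

def acquire_turntable(table, pattern_found, distance_from_pattern):
    """Follow the acquisition sequence described above, step by step."""
    if not table.respond():                       # 1. send signal; await response
        raise RuntimeError("no turntable responded")
    if not pattern_found():                       # 2. look for the concentric circles
        raise RuntimeError("turntable pattern not found")
    dist = distance_from_pattern()                # 3. distance from the imaged pattern
    table.go_home()                               # 4. send the "go home" signal
    table.load_profile(accel=0.5, rotation=1.0)   # 5. motion profiles (placeholders)
    return dist

print(acquire_turntable(Turntable(), lambda: True, lambda: 0.75))
```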
  • Generally speaking, the projection portion of the digitizer 70 is retained in fixed relation to the imaging portion. The projection portion produces a light stripe, as noted previously, on the object 82. By either sweeping the light stripe back and forth through the focal line or by mechanically blocking the stripe at a known rate, the intensity gradient can be created. In one embodiment, the blocking is from 0% to 100% during a cycle. Because the ISA integrates the illumination over time, the outline of a three-dimensional surface is reflected in the data captured by the ISA. This is because protruding features will remain illuminated longer; accordingly, more photons are captured by the ISA corresponding to those features. After repeating this process one stripe at a time, as the object is rotated by turntable 80 or through the course of sweeping the entire digitizer back and forth as it rotates about the base, cost-effective three-dimensional imaging is effected. [0026]
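The time-integration mechanism just described can be sanity-checked numerically. The sketch below is our illustration rather than anything specified in the patent: it sweeps a simulated shutter edge across the projection and accumulates, for each sample point, the fraction of the cycle during which that point remains lit. The accumulated value is monotone in projection angle, which the geometry of FIG. 3 in turn maps to distance.

```python
import numpy as np

def integrated_intensity(theta, theta_a, theta_b, steps=1000):
    """Fraction of one shutter cycle during which each point stays illuminated.

    The shutter edge sweeps linearly from the theta_a edge to the theta_b edge
    of the projection; a point at angle theta is lit until the edge passes it.
    """
    edge = np.linspace(theta_a, theta_b, steps)   # shutter edge angle per time step
    lit = theta[None, :] >= edge[:, None]         # True while a point is still lit
    return lit.sum(axis=0) / steps                # ISA integration, normalized

theta_a, theta_b = 0.1, 0.5                       # illustrative gradient edge angles
samples = np.linspace(theta_a, theta_b, 8)        # points across the focal zone
print(integrated_intensity(samples, theta_a, theta_b))  # monotone in angle
```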
  • In one embodiment, the system operates on the principle that depth data for a three-dimensional object may be calculated from an intensity difference resulting from an intensity gradient projected on the object. Existing image sensing arrays (ISAs) such as linear charge coupled device (CCD) sensors can detect illumination intensity to a high degree of accuracy. Based on this principle, if a light source is placed in fixed relation to the ISA such that the projected light forms an angle with the focal line of the ISA, and a gradient slide, for example, going from dark to light, from left to right, is interposed between the light source and the object, features of the object closer to the ISA are illuminated by greater intensity light than those features further away. Thus, the ISA captures a stripe of the object in which different intensities represent different depths of the object in that focal zone. This general principle works well for uniformly colored objects imaged in an otherwise dark environment, but different coloring and ambient light conditions may cause misinterpretations of the intensity data. However, if the ISA images the same stripe of the object under ambient conditions (e.g., when the light source is not illuminating the object within the focal zone) and images again when the object is illuminated by a uniform light (e.g., with no gradient (flat gradient)), these possible misinterpretations can be avoided. [0027]
  • Particularly, the ratio (VG1−VA)/(VG2−VA) yields a differential that can be mapped to depth of the object. In the differential, VG1 is the value from the ISA at a point resulting from the gradient exposure, VA is the value from the ambient exposure at that point, and VG2 is the value at the point from a second gradient exposure such as the uniform light (flat gradient) or a second gradient created as described further below. The differential is computed for each point in the focal zone. Moreover, this differential also normalizes the effect of color variations and ambient light conditions. Notably, the differential is also substantially independent of the intensity of the light source. Unfortunately, as a practical matter, changing slides and/or turning the light source on and off rapidly enough to permit digitization of many possible target objects is both expensive and problematic. [0028]
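As a concrete illustration of the differential just defined, the sketch below (array names are ours) assumes three registered exposures of the same stripe: a gradient frame, a second (flat or opposite) gradient frame, and an ambient frame. Subtracting the ambient frame removes background light, and the ratio cancels per-pixel surface color and source intensity, which is why the differential can be computed pixel by pixel with no reference to neighbors.

```python
import numpy as np

def depth_differential(v_g1, v_g2, v_a, eps=1e-6):
    """Per-pixel (V_G1 - V_A) / (V_G2 - V_A), independent of neighboring pixels."""
    num = v_g1.astype(np.float64) - v_a           # gradient exposure, ambient removed
    den = v_g2.astype(np.float64) - v_a           # second exposure, ambient removed
    return num / np.maximum(den, eps)             # guard pixels the light never reached

v_g1 = np.array([120, 150, 200], dtype=np.uint16)   # illustrative raw ISA values
v_g2 = np.array([200, 200, 220], dtype=np.uint16)
v_a = np.array([20, 20, 20], dtype=np.uint16)
print(depth_differential(v_g1, v_g2, v_a))          # one differential per pixel
```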
  • However, by taking advantage of the fact that the ISA integrates over time, the same effect may be created mechanically using a shutter which causes 0% to 100% of the light to illuminate the target object within the focal zone during the cycle. Moreover, by overdriving the shutter, the white light condition and ambient condition can be created. Specifically, if the imaging time of the CCD is 5 milliseconds, in an initial 5 milliseconds the shutter does not impinge on the light source, thereby allowing the image sensing array to image the fully illuminated object. During the next 5 milliseconds, the shutter passes from 0% to 100% blockage of the light, thereby creating the intensity gradient within the focal zone. During the next 5 milliseconds, the shutter continues to drive so that the light is entirely blocked and the ambient condition image is obtained. The processing of each of these images (including the creation of the differential) may be offloaded to an attached host as discussed in greater detail below. [0029]
  • An intensity gradient may alternatively be created by sweeping the light through the focal zone. For example, by sweeping a light stripe from left to right through the focal zone, the ambient light image may be captured before the light enters the zone. A first gradient is captured from the first entry of the light into the zone until the light is entirely within the zone. A second gradient is captured as the light translates out of the zone to the right. The second gradient is the opposite of the first gradient and is not flat as in the fully illuminated case. An analogous set of images may be captured as the light sweeps back from right to left. One advantage of sweeping the light is that two gradients are generated as the light moves from right to left and two gradients are generated as the light moves from left to right. Thus, the sweeping can be performed at half speed without a reduction in imaging performance. [0030]
  • The differential may take the same form as discussed above. Alternatively, the differential may be computed as X1/(X1+X2), where X1=VG1−VA and X2=VG2−VA. To reduce noise sensitivity, the larger magnitude gradient should be selected for the numerator of the ratio. Color intensity is given by X1+X2. [0031]
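A sketch of this alternative form follows; it is our illustration, with the larger-magnitude gradient placed in the numerator as the text recommends, and the per-pixel color intensity returned alongside the differential.

```python
import numpy as np

def symmetric_differential(v_g1, v_g2, v_a):
    """Compute X1/(X1+X2) per pixel, choosing the larger gradient as numerator."""
    x1 = v_g1.astype(np.float64) - v_a            # first gradient, ambient removed
    x2 = v_g2.astype(np.float64) - v_a            # opposite gradient, ambient removed
    color = x1 + x2                               # color intensity, per the text
    hi = np.maximum(x1, x2)                       # larger magnitude for the numerator
    diff = np.divide(hi, color, out=np.zeros_like(color), where=color > 0)
    return diff, color

diff, color = symmetric_differential(
    np.array([180.0, 60.0]), np.array([40.0, 190.0]), np.array([20.0, 20.0]))
print(diff, color)                                # differential plus color intensity
```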
  • FIG. 3 shows a schematic diagram of an intensity gradient cone of one embodiment of the invention. This example has been simplified to include only a single capture of one gradient, as might be applicable to uniformly shaped and colored objects in a dark environment. This example, of course, could be expanded to cover non-uniform cases as discussed above using intensity differentials. A light source is located a distance L along a line normal to a line of sight of a linear ISA. The light source projects a gradient having a minimum intensity IA and maximum intensity IB. The angle θA corresponds to the angle between the minimum intensity edge of the projected gradient and the line normal to the ISA line of sight. θB corresponds to the angle defined by the maximum intensity edge of the projected gradient and the normal line. θ corresponds to the angle defined by the normal line and a line from the gradient origin to the point on the target for which distance is to be determined. Between IA and IB, intensity is proportional to the angle θ and is given by I(θ) = (IB−IA)*[(θ−θA)/(θB−θA)] + IA. [0032]
  • Thus, angle as a function of intensity is given by θ(I) = (θB−θA)*[(I−IA)/(IB−IA)] + θA. The distance D is then given by D = L*tan θ. Therefore, distance as a function of intensity measured at the ISA is given by D = L*tan{(θB−θA)*[(I−IA)/(IB−IA)] + θA}. In one embodiment, the light source is fixed relative to the ISA and the minimum and maximum intensities are known or can be calibrated at run time. This allows intensity derived during capture to be mapped easily to distance with limited computational complexity. [0033]
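The closed-form mapping can be written directly in code. The sketch below is a minimal rendering of the equations above; the baseline, edge angles, and calibrated intensity bounds are illustrative values, not figures from the patent.

```python
import numpy as np

def distance_from_intensity(i, baseline, theta_a, theta_b, i_a, i_b):
    """D = L * tan((theta_B - theta_A) * (I - I_A)/(I_B - I_A) + theta_A)."""
    theta = (theta_b - theta_a) * (i - i_a) / (i_b - i_a) + theta_a
    return baseline * np.tan(theta)

baseline = 0.30                                    # light source 30 cm off the ISA axis
theta_a, theta_b = np.radians(10), np.radians(40)  # gradient edge angles
i_a, i_b = 50.0, 250.0                             # calibrated min/max intensities
intensities = np.array([50.0, 150.0, 250.0])       # measured per-pixel values
print(distance_from_intensity(intensities, baseline, theta_a, theta_b, i_a, i_b))
```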
  • In one embodiment, depth measurements are determined on a pixel-by-pixel basis. Because the intensity captured by a single pixel is independent of the intensity captured by other pixels, and intensity maps directly to depth, it is possible to achieve one depth measurement per pixel. Moreover, it would be possible to obtain a depth measurement from an image sensor having a single sensing element. In an alternative embodiment, adjacent groups of pixels may be treated as an element, with the aggregate captured intensity used to determine the depth measurement without regard to the energy distribution among the pixels in the group. This yields one measurement per elementary group of pixels. [0034]
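For the elementary-group variant, aggregation is a simple sum. The sketch below (names are ours) collapses a captured scan line into groups of k adjacent pixels, yielding one aggregate intensity, and hence one depth measurement, per group, regardless of how the energy is distributed within each group.

```python
import numpy as np

def group_intensities(scanline, k):
    """Sum a scan line into elementary groups of k pixels (length divisible by k)."""
    return scanline.reshape(-1, k).sum(axis=1)

scanline = np.arange(12, dtype=np.float64)   # one captured line of 12 pixel values
print(group_intensities(scanline, 4))        # 3 aggregates -> 3 depth measurements
```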
  • As indicated, it is possible to calculate the depth data directly from the intensity information. However, the speed and processing power required are reduced when a lookup table (LUT) based on a prior calibration is employed to derive depth data based on the differentials. This also allows non-idealities of the physical equipment to be accounted for in the selected LUT entry, based on the prior calibration. Accordingly, one embodiment of the invention maintains a LUT and indexes into the LUT based on the differential. [0035]
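A minimal sketch of the LUT path follows. The table contents here are fabricated placeholders; in practice the calibration step would fill each entry with a measured distance, absorbing the optical non-idealities the text mentions, so that run-time conversion reduces to a clamp, a quantize, and an index.

```python
import numpy as np

BITS = 10                                   # quantize differentials into 2**10 bins
# Calibration would populate this from known targets; placeholder ramp here:
LUT = np.linspace(0.05, 0.30, 2 ** BITS)    # bin index -> distance in meters

def depth_from_differential(diff):
    """Clamp the differential to [0, 1), quantize it, and index the LUT."""
    idx = np.clip((diff * (2 ** BITS)).astype(int), 0, 2 ** BITS - 1)
    return LUT[idx]

print(depth_from_differential(np.array([0.0, 0.5, 0.999])))  # three depth values
```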
  • In the foregoing specification, the invention has been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes can be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. [0036]

Claims (18)

What is claimed is:
1. A method comprising:
capturing an intensity at a location on a surface in a single pixel of an image sensing array (ISA); and
converting the intensity into a measurement of distance to the location relative to a reference point independently of data from other pixels of the ISA and independent of time of flight of light reflected from the location to the single pixel.
2. The method of claim 1 wherein the ISA is a linear image sensor.
3. The method of claim 2 wherein the linear image sensor is one of a linear charge coupled device (CCD) and a photo diode array.
4. The method of claim 1 further comprising:
occluding incoming light to the single pixel incrementally to generate an intensity gradient over the location.
5. The method of claim 1 wherein converting the intensity into a measurement of distance further comprises:
retrieving a distance value from a look up table based on a differential intensity value.
6. The method of claim 5, wherein distance values in the look up table are generated during a calibration of the ISA.
7. The method of claim 1, further comprising:
capturing the intensity at a predefined interval.
8. A method comprising:
capturing an intensity at a location on a surface in an elementary group of pixels on an image sensing array (ISA) without regard to intensity distribution within the group; and
converting the intensity into a measurement of distance to the location independently of data from other pixels on the ISA and independently of time of flight of light reflected from the location to the elementary group of pixels.
9. The method of claim 8 wherein the ISA is a linear image sensor.
10. The method of claim 9 wherein the linear image sensor is one of a linear charge coupled device (CCD) and a photo diode array.
11. The method of claim 8 further comprising:
occluding incoming light to the elementary group of pixels incrementally to generate an intensity gradient over the location.
12. The method of claim 8 wherein converting the intensity into a measurement of distance further comprises:
retrieving a distance value from a look-up table based on a differential intensity value.
13. The method of claim 12, wherein distance values in the look-up table are generated during a calibration of the ISA.
14. The method of claim 8, further comprising:
capturing the intensity at a predefined interval.
15. A method comprising:
capturing a spectral energy distribution returned from a location on a surface in a single pixel of an ISA; and
converting the spectral energy distribution into a measurement of distance to the location relative to a reference point independently of data from other pixels of the ISA and independent of time of flight of light reflected from the location to the single pixel.
16. A method comprising:
altering one of a spatial and optical relationship between an image sensing array (ISA) and a surface;
observing a variation of an electrical signal at a single pixel on the ISA responsive to the alteration; and
converting the variation to a measure of distance to a location on the surface relative to a reference point, independently of data from other pixels of the ISA and independent of time of flight of light reflected from the location to the single pixel.
17. A method comprising:
altering one of a spatial and optical relationship between an image sensing array (ISA) and a surface; and
observing a variation of an electrical signal at an elementary group of pixels on the ISA without regard to variations in electrical signals within the group responsive to the alteration.
18. A method comprising:
capturing an intensity at a location on a surface in a single pixel of a linear image sensing array (ISA); and
converting the intensity into a measurement of distance to the location relative to a reference point independently of data from other pixels of the linear ISA.
US10/886,176 2000-09-13 2004-07-06 Method for elementary depth detection in 3D imaging Abandoned US20040239949A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/886,176 US20040239949A1 (en) 2000-09-13 2004-07-06 Method for elementary depth detection in 3D imaging

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US09/660,809 US6639684B1 (en) 2000-09-13 2000-09-13 Digitizer using intensity gradient to image features of three-dimensional objects
US09/839,755 US6856407B2 (en) 2000-09-13 2001-04-19 Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels
US10/886,176 US20040239949A1 (en) 2000-09-13 2004-07-06 Method for elementary depth detection in 3D imaging

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/839,755 Continuation US6856407B2 (en) 2000-09-13 2001-04-19 Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels

Publications (1)

Publication Number Publication Date
US20040239949A1 true US20040239949A1 (en) 2004-12-02

Family

ID=27098192

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/839,755 Expired - Fee Related US6856407B2 (en) 2000-09-13 2001-04-19 Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels
US10/886,176 Abandoned US20040239949A1 (en) 2000-09-13 2004-07-06 Method for elementary depth detection in 3D imaging

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/839,755 Expired - Fee Related US6856407B2 (en) 2000-09-13 2001-04-19 Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels

Country Status (8)

Country Link
US (2) US6856407B2 (en)
EP (1) EP1317651A2 (en)
JP (1) JP2004509396A (en)
CN (1) CN1474932A (en)
AU (1) AU2001273360A1 (en)
CA (1) CA2422263A1 (en)
TW (1) TW531688B (en)
WO (1) WO2002023918A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060028400A1 (en) * 2004-08-03 2006-02-09 Silverbrook Research Pty Ltd Head mounted display with wave front modulator
TWI498526B (en) * 2013-06-05 2015-09-01 Nat Univ Chung Cheng Environment depth measurement method and its image acquisition device

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7337093B2 (en) * 2001-09-07 2008-02-26 Purdue Research Foundation Systems and methods for collaborative shape and design
US20030090530A1 (en) * 2001-09-07 2003-05-15 Karthik Ramani Systems and methods for collaborative shape design
US20040184653A1 (en) * 2003-03-20 2004-09-23 Baer Richard L. Optical inspection system, illumination apparatus and method for use in imaging specular objects based on illumination gradients
CN100394141C (en) * 2004-12-28 2008-06-11 陈胜勇 Method and equipment for realizes structured light in high performance based on uniqueness in field
WO2011085225A1 (en) * 2010-01-08 2011-07-14 Wake Forest University Health Sciences Delivery system
US8742309B2 (en) 2011-01-28 2014-06-03 Aptina Imaging Corporation Imagers with depth sensing capabilities
KR101983402B1 (en) 2011-03-07 2019-05-28 웨이크 포리스트 유니버시티 헬스 사이언시즈 Delivery system
US10015471B2 (en) 2011-08-12 2018-07-03 Semiconductor Components Industries, Llc Asymmetric angular response pixels for single sensor stereo
US8786730B2 (en) 2011-08-18 2014-07-22 Microsoft Corporation Image exposure using exclusion regions
US9554115B2 (en) * 2012-02-27 2017-01-24 Semiconductor Components Industries, Llc Imaging pixels with depth sensing capabilities
US8855404B2 (en) 2012-08-27 2014-10-07 The Boeing Company Methods and systems for inspecting a workpiece
WO2015015718A1 (en) * 2013-07-31 2015-02-05 Panasonic Intellectual Property Corporation of America Sensor assembly
US9291447B2 (en) 2014-07-09 2016-03-22 Mitutoyo Corporation Method for controlling motion of a coordinate measuring machine
US9578311B2 (en) 2014-10-22 2017-02-21 Microsoft Technology Licensing, Llc Time of flight depth camera
US10742961B2 (en) * 2015-09-02 2020-08-11 Industrial Technology Research Institute Depth sensing apparatus with self-calibration and self-calibration method thereof
TWI620926B (en) 2016-11-04 2018-04-11 財團法人工業技術研究院 Workpiece surface detection method and system using the same
US10929994B2 (en) * 2016-12-07 2021-02-23 Electronics And Telecommunications Research Institute Image processing device configured to generate depth map and method of operating the same
US10122997B1 (en) 2017-05-03 2018-11-06 Lowe's Companies, Inc. Automated matrix photo framing using range camera input
US10885622B2 (en) 2018-06-29 2021-01-05 Photogauge, Inc. System and method for using images from a commodity camera for object scanning, reverse engineering, metrology, assembly, and analysis

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4490617A (en) * 1979-11-26 1984-12-25 European Electronic Systems Limited Optical width measuring system using two cameras
US5444537A (en) * 1992-10-27 1995-08-22 Matsushita Electric Works, Ltd. Method for shape detection and apparatus therefor
US5621529A (en) * 1995-04-05 1997-04-15 Intelligent Automation Systems, Inc. Apparatus and method for projecting laser pattern with reduced speckle noise
US5799082A (en) * 1995-11-07 1998-08-25 Trimble Navigation Limited Secure authentication of images
US5831621A (en) * 1996-10-21 1998-11-03 The Trustees Of The University Of Pennsylvania Positional space solution to the next best view problem
US6040910A (en) * 1998-05-20 2000-03-21 The Penn State Research Foundation Optical phase-shift triangulation technique (PST) for non-contact surface profiling
US6057909A (en) * 1995-06-22 2000-05-02 3Dv Systems Ltd. Optical ranging camera
US6100517A (en) * 1995-06-22 2000-08-08 3Dv Systems Ltd. Three dimensional camera

Family Cites Families (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3636250A (en) 1964-02-26 1972-01-18 Andrew V Haeff Apparatus for scanning and reproducing a three-dimensional representation of an object
US4089608A (en) 1976-10-21 1978-05-16 Hoadley Howard W Non-contact digital contour generator
US4590608A (en) 1980-05-30 1986-05-20 The United States Of America As Represented By The Secretary Of The Army Topographic feature extraction using sensor array system
US4443705A (en) * 1982-10-01 1984-04-17 Robotic Vision Systems, Inc. Method for locating points on a three-dimensional surface using light intensity variations
US4564295A (en) 1983-03-07 1986-01-14 New York Institute Of Technology Apparatus and method for projection moire topography
US4657394A (en) 1984-09-14 1987-04-14 New York Institute Of Technology Apparatus and method for obtaining three dimensional surface contours
US4641972A (en) 1984-09-14 1987-02-10 New York Institute Of Technology Method and apparatus for surface profilometry
US4724525A (en) 1984-12-12 1988-02-09 Moore Special Tool Co., Inc. Real-time data collection apparatus for use in multi-axis measuring machine
US4737032A (en) 1985-08-26 1988-04-12 Cyberware Laboratory, Inc. Surface mensuration sensor
US4705401A (en) 1985-08-12 1987-11-10 Cyberware Laboratory Inc. Rapid three-dimensional surface digitizer
JPH0615968B2 (en) 1986-08-11 1994-03-02 伍良 松本 Three-dimensional shape measuring device
US4846577A (en) 1987-04-30 1989-07-11 Lbp Partnership Optical means for making measurements of surface contours
GB8716369D0 (en) 1987-07-10 1987-08-19 Travis A R L Three-dimensional display device
US5315512A (en) 1989-09-01 1994-05-24 Montefiore Medical Center Apparatus and method for generating image representations of a body utilizing an ultrasonic imaging subsystem and a three-dimensional digitizer subsystem
DE3941144C2 (en) 1989-12-13 1994-01-13 Zeiss Carl Fa Coordinate measuring device for the contactless measurement of an object
US5067817A (en) 1990-02-08 1991-11-26 Bauer Associates, Inc. Method and device for noncontacting self-referencing measurement of surface curvature and profile
DE4007500A1 (en) 1990-03-09 1991-09-12 Zeiss Carl Fa METHOD AND DEVICE FOR CONTACTLESS MEASUREMENT OF OBJECT SURFACES
JP2892430B2 (en) 1990-03-28 1999-05-17 株式会社日立製作所 Method and apparatus for displaying physical quantity
EP0488987B1 (en) 1990-11-26 1996-01-31 Michael Dr. Truppe Method for representing moving bodies
US5131844A (en) 1991-04-08 1992-07-21 Foster-Miller, Inc. Contact digitizer, particularly for dental applications
US5231470A (en) 1991-09-06 1993-07-27 Koch Stephen K Scanning system for three-dimensional object digitizing
US5377011A (en) 1991-09-06 1994-12-27 Koch; Stephen K. Scanning system for three-dimensional object digitizing
DE4134546A1 (en) 1991-09-26 1993-04-08 Steinbichler Hans METHOD AND DEVICE FOR DETERMINING THE ABSOLUTE COORDINATES OF AN OBJECT
US5175601A (en) 1991-10-15 1992-12-29 Electro-Optical Information Systems High-speed 3-D surface measurement surface inspection and reverse-CAD system
FR2688642B3 (en) 1992-02-17 1994-06-03 Galram Technology Ind Ltd DEVICE AND METHOD FOR FORMING A HIGH RESOLUTION IMAGE OF A THREE-DIMENSIONAL OBJECT.
US5216817A (en) 1992-03-18 1993-06-08 Colgate-Palmolive Company Digitizer measuring system
US5636025A (en) 1992-04-23 1997-06-03 Medar, Inc. System for optically measuring the surface contour of a part using more fringe techniques
US5432622A (en) 1992-05-29 1995-07-11 Johnston; Gregory E. High-resolution scanning apparatus
US5307292A (en) 1992-06-24 1994-04-26 Christopher A. Brown Method of quantifying the topographic structure of a surface
AT399647B (en) 1992-07-31 1995-06-26 Truppe Michael ARRANGEMENT FOR DISPLAYING THE INTERIOR OF BODIES
US5337149A (en) 1992-11-12 1994-08-09 Kozah Ghassan F Computerized three dimensional data acquisition apparatus and method
US5414647A (en) 1992-11-23 1995-05-09 Ford Motor Company Non-contact method and system for building CAD models by integrating high density data scans
US5611147A (en) 1993-02-23 1997-03-18 Faro Technologies, Inc. Three dimensional coordinate measuring apparatus
DE4313860A1 (en) * 1993-04-28 1994-11-03 Ralf Lampalzer Optical sensor for shape recognition of three-dimensional objects
JP3057960B2 (en) 1993-06-22 2000-07-04 トヨタ自動車株式会社 Evaluation equipment for three-dimensional workpieces
US5999641A (en) 1993-11-18 1999-12-07 The Duck Corporation System for manipulating digitized image objects in three dimensions
US5661667A (en) 1994-03-14 1997-08-26 Virtek Vision Corp. 3D imaging using a laser projector
US5471303A (en) 1994-04-29 1995-11-28 Wyko Corporation Combination of white-light scanning and phase-shifting interferometry for surface profile measurements
GB9413870D0 (en) 1994-07-09 1994-08-31 Vision 1 Int Ltd Digitally-networked active-vision camera
US5531520A (en) 1994-09-01 1996-07-02 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets including anatomical body data
US5747822A (en) 1994-10-26 1998-05-05 Georgia Tech Research Corporation Method and apparatus for optically digitizing a three-dimensional object
US5617645A (en) 1995-05-02 1997-04-08 William R. W. Wick Non-contact precision measurement system
JPH0969978A (en) 1995-08-30 1997-03-11 Sanyo Electric Co Ltd Image pickup device
JP4031841B2 (en) 1995-09-05 2008-01-09 株式会社コダックデジタルプロダクトセンター Digital camera
US5689446A (en) 1995-11-03 1997-11-18 Amfit, Inc. Foot contour digitizer
US5646733A (en) 1996-01-29 1997-07-08 Medar, Inc. Scanning phase measuring method and system for an object at a vision station
US6044170A (en) 1996-03-21 2000-03-28 Real-Time Geometry Corporation System and method for rapid shape digitizing and adaptive mesh generation
US5771310A (en) 1996-12-30 1998-06-23 Shriners Hospitals For Children Method and apparatus for recording three-dimensional topographies
US5988862A (en) 1996-04-24 1999-11-23 Cyra Technologies, Inc. Integrated system for quickly and accurately imaging and modeling three dimensional objects
US5870220A (en) 1996-07-12 1999-02-09 Real-Time Geometry Corporation Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation
CA2263763C (en) 1996-08-23 2006-01-10 Her Majesty The Queen, In Right Of Canada, As Represented By The Minister Of Agriculture And Agri-Food Canada Method and apparatus for using image analysis to determine meat and carcass characteristics
US5864640A (en) 1996-10-25 1999-01-26 Wavework, Inc. Method and apparatus for optically scanning three dimensional objects using color information in trackable patches
US5805289A (en) 1997-07-07 1998-09-08 General Electric Company Portable measurement system using image and point measurement devices
IL121267A0 (en) 1997-07-09 1998-01-04 Yeda Res & Dev Method and device for determining the profile of an object
US5910845A (en) 1997-12-02 1999-06-08 Brown; Thomas Mattingly Peripheral viewing optical scanner for three dimensional surface measurement
US6141753A (en) 1998-02-10 2000-10-31 Fraunhofer Gesellschaft Secure distribution of digital representations
AU3991799A (en) * 1998-05-14 1999-11-29 Metacreations Corporation Structured-light, triangulation-based three-dimensional digitizer
CA2306515A1 (en) 2000-04-25 2001-10-25 Inspeck Inc. Internet stereo vision, 3d digitizing, and motion capture camera
CN1227891C (en) 2000-09-13 2005-11-16 内克斯坦金公司 Image system monitored or controlled to ensure fidelity of files captured

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4490617A (en) * 1979-11-26 1984-12-25 European Electronic Systems Limited Optical width measuring system using two cameras
US5444537A (en) * 1992-10-27 1995-08-22 Matsushita Electric Works, Ltd. Method for shape detection and apparatus therefor
US5621529A (en) * 1995-04-05 1997-04-15 Intelligent Automation Systems, Inc. Apparatus and method for projecting laser pattern with reduced speckle noise
US6057909A (en) * 1995-06-22 2000-05-02 3Dv Systems Ltd. Optical ranging camera
US6091905A (en) * 1995-06-22 2000-07-18 3Dv Systems Ltd. Telecentric 3D camera and method
US6100517A (en) * 1995-06-22 2000-08-08 3Dv Systems Ltd. Three dimensional camera
US5799082A (en) * 1995-11-07 1998-08-25 Trimble Navigation Limited Secure authentication of images
US5831621A (en) * 1996-10-21 1998-11-03 The Trustees Of The University Of Pennsylvania Positional space solution to the next best view problem
US6040910A (en) * 1998-05-20 2000-03-21 The Penn State Research Foundation Optical phase-shift triangulation technique (PST) for non-contact surface profiling

Also Published As

Publication number Publication date
AU2001273360A1 (en) 2002-03-26
CA2422263A1 (en) 2002-03-21
US20020033955A1 (en) 2002-03-21
TW531688B (en) 2003-05-11
CN1474932A (en) 2004-02-11
WO2002023918A3 (en) 2003-02-06
WO2002023918A2 (en) 2002-03-21
JP2004509396A (en) 2004-03-25
US6856407B2 (en) 2005-02-15
EP1317651A2 (en) 2003-06-11

Similar Documents

Publication Publication Date Title
US6856407B2 (en) Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels
US7358986B1 (en) Digital imaging system having distribution controlled over a distributed network
US9689972B2 (en) Scanner display
US6920242B1 (en) Apparatus and method for point cloud assembly
JP5882264B2 (en) 3D video scanner
US9074878B2 (en) Laser scanner
CN102257353B (en) Device and method for three-dimensional optical measurement of strongly reflective or transparent objects
US20140218480A1 (en) Hand held portable three dimensional scanner
US11506767B2 (en) Method for optically scanning and measuring an environment using a 3D measurement device and near field communication
US20210096359A1 (en) Environmental scanning and image reconstruction thereof
US11523029B2 (en) Artificial intelligence scan colorization
JPH09287913A (en) Apparatus for detecting position of object, method for detecting human body
CN112034485A (en) Reflectivity sensing with time-of-flight camera
US20220120863A1 (en) Three-dimensional scanning and image reconstruction thereof
Rieke-Zapp et al. Structured light 3D scanning
US20230153967A1 (en) Removing reflection from scanned data
EP4181063A1 (en) Markerless registration of image data and laser scan data
KR100241006B1 (en) Apparatus for detecting surface defects of thick plate using image processing
WO2022097133A1 (en) Generation of a temperature map

Legal Events

Date Code Title Description
AS Assignment

Owner name: BIGFOOT PRODUCTIONS, INC., NEW YORK

Free format text: SECURITY INTEREST;ASSIGNOR:NEXTENGINE, INC.;REEL/FRAME:016536/0179

Effective date: 20050905

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION