WO2009078002A1 - 3d camera and methods of gating thereof - Google Patents

3d camera and methods of gating thereof

Info

Publication number
WO2009078002A1
Authority
WO
WIPO (PCT)
Prior art keywords
gate
light
gates
scene
pulse
Prior art date
Application number
PCT/IL2007/001571
Other languages
French (fr)
Inventor
Giora Yahav
Gil Zigelman
Allan C. Entis
Original Assignee
Microsoft International Holdings B.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft International Holdings B.V. filed Critical Microsoft International Holdings B.V.
Priority to EP07849597A priority Critical patent/EP2235563A1/en
Priority to CN2007801023367A priority patent/CN102099703A/en
Priority to PCT/IL2007/001571 priority patent/WO2009078002A1/en
Publication of WO2009078002A1 publication Critical patent/WO2009078002A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/08 - Systems determining position data of a target for measuring distance only
    • G01S17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/18 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used

Definitions

  • the invention relates to methods and apparatus for acquiring 3D images of a scene.
  • 3D optical imaging systems hereinafter referred to as "3D cameras”, that are capable of providing distance measurements to objects and points on objects that they image, are used for many different applications.
  • Among these applications are profile inspection of manufactured goods, CAD verification, robot vision, geographic surveying and imaging objects selectively as a function of distance.
  • Some 3D cameras provide simultaneous measurements to substantially all points of objects in a scene they image.
  • these 3D cameras comprise a light source, such as a laser, which is pulsed or shuttered so that it provides pulses of light for illuminating a scene being imaged and a gated imaging system for imaging light from the light pulses that is reflected from objects in the scene.
  • the gated imaging system comprises a camera having a photosensitive surface, hereinafter referred to as a "photosurface", such as a CCD or CMOS photosurface, and a gating means for gating the camera open and closed, such as an optical shutter or a gated image intensifier.
  • the reflected light is registered on pixels of the photosurface of the camera only if it reaches the camera when the camera is gated open.
  • the scene is generally illuminated with a train of light pulses radiated from the light source. For each radiated light pulse in the train, following an accurately determined delay from the time that the light pulse is radiated, the camera is gated open for a period of time hereinafter referred to as a "gate".
  • Light from the light pulse that is reflected from an object in the scene is imaged on the photosurface of the camera if it reaches the camera during the gate.
  • Since the time elapsed between radiating a light pulse and the gate that follows it is known, the time it took imaged light to travel from the light source to the reflecting object in the scene and back to the camera is known. The elapsed time is used to determine the distance to the object.
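The round-trip timing described above maps to distance in a few lines. The following is an illustrative sketch, not code from the patent; the function name and the example numbers are mine.

```python
# Speed of light in metres per second.
C = 299_792_458.0

def distance_from_delay(round_trip_s: float) -> float:
    """Distance to a reflecting feature given the measured round-trip delay.

    The factor of 2 accounts for the light travelling out to the feature
    and back to the camera.
    """
    return C * round_trip_s / 2.0

# A reflection arriving 100 ns after the pulse was emitted corresponds
# to a feature roughly 15 m from the camera.
print(distance_from_delay(100e-9))
```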
  • the cameras described in these patents use amounts of light registered by pixels in the camera during times at which the camera is gated open to determine distances to features in contiguous slices of a scene.
  • the slice locations and spatial widths are defined by length and timing of gates during which the cameras are gated open relative to pulse lengths and timing of light pulses that are transmitted to illuminate the scene.
  • At least two gates are used to acquire a 3D image of a slice of the scene. Relative to a time at which a light pulse is transmitted to illuminate the scene, a front gate starts at the same time that a long gate begins, and a back gate ends at the same time that the long gate ends.
  • the short gates optionally have a gate width equal to a pulse width of pulses of light used to illuminate the scene, and the long gate has a gate width equal to twice the light pulse width.
  • Amounts of light registered during the at least one short gate by a pixel in the camera that images a feature of the scene are normalized to amounts of light registered by the pixel during the at least one long gate.
  • the normalized amounts of registered light are used to determine a distance to the feature.
  • a total acquisition time for acquiring data for a 3D image of a scene is substantially equal to the number of slices of the scene that are imaged multiplied by a time, a "slice acquisition time", required to acquire 3D data for a single slice.
  • a slice acquisition time is a function of a number of light pulses and gates required to register quantities of light for the various gates sufficient to provide data for determining distances to features of the scene located in the slice.
  • a 3D camera using a pulsed source of illumination and a gated imaging system is described in "Design and Development of a Multi-detecting two Dimensional Ranging Sensor", Measurement Science and Technology 6 (September 1995), pages 1301-1308, by S. Christie et al., and in "Range-gated Imaging for Near Field Target Identification" by Yates et al., SPIE Vol. 2869, pp. 374-385, which are herein incorporated by reference.
  • Another 3D camera is described in U.S. patent 5,081,530 to Medina, which is incorporated herein by reference.
  • a 3D camera described in this patent registers energy in a pulse of light reflected from a target that reaches the camera's imaging system during each gate of a pair of gates. Distance to a target is determined from the ratio of the difference between the amounts of energy registered during each of the two gates to the sum of the amounts of energy registered during each of the two gates.
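The two-gate ratio described above can be sketched as follows. This is an illustrative function of mine, not code from the Medina patent; the useful property is that target reflectivity scales both charges equally and therefore cancels in the ratio.

```python
def gate_pair_ratio(q1: float, q2: float) -> float:
    """Ratio of the difference to the sum of the energies registered
    during the two gates; carries range information independent of
    target reflectivity."""
    return (q1 - q2) / (q1 + q2)

# Doubling both charges (e.g. a brighter target at the same range)
# leaves the ratio unchanged.
```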
  • An aspect of some embodiments of the invention relates to providing a gating procedure for gating a gated 3D camera that provides a relatively short acquisition time for a 3D image of a scene.
  • An aspect of some embodiments of the invention relates to providing a configuration of gates for providing a 3D image of a scene for which, for a given width of a light pulse that illuminates the scene the gates provide 3D data for slices of the scene that have relatively large spatial widths.
  • a method of acquiring a 3D image of a scene comprising: transmitting at least one light pulse at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of sets of gates, each set comprising at least one first, second and third gate having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of the at least one first gate and the at least one second gate are between the start and stop times of the at least one third gate, wherein for at least one of the set of gates, the at least one third gate has a start time equal to about the stop time of the at least one third gate of another of the set of gates; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.
  • the start time of the second gate is substantially equal to the stop time of the first gate.
  • the start time of the first gate is delayed relative to the start time of the third gate by a pulse width of the at least one light pulse.
  • the first and second gates have equal gate widths.
  • the gate width of the first and second gates is equal to half a pulse width of the at least one light pulse.
  • the gate width of the third gate is substantially equal to three pulse widths of the at least one light pulse.
  • the at least one third gate has a start time earlier than the stop time of the at least one third gate of another of the sets of gates.
  • a method of acquiring a 3D image of a scene comprising: transmitting at least one light pulse at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of equal length gates having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start time of at least one first gate is substantially a time half way between the start and stop times of at least one second gate; and using amounts of reflected light imaged on the photosurface during the at least one first gate and the at least one second gate to determine distance to a feature of the scene.
  • the gates have a gate width substantially equal to twice a pulse width of the at least one light pulse.
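The equal-length, half-offset gate timing claimed above can be sketched as follows. This is an illustrative reading of the claim, not the patent's own notation; all names and numbers are mine.

```python
def half_offset_gates(delay: float, pulse_width: float):
    """Two equal-length gates, each twice the pulse width, with the
    first gate starting halfway between the start and stop times of
    the second gate."""
    gate_width = 2.0 * pulse_width
    second = (delay, delay + gate_width)
    first_start = (second[0] + second[1]) / 2.0  # midpoint of second gate
    first = (first_start, first_start + gate_width)
    return first, second
```

For example, with a delay of 10 time units and a pulse width of 1, the second gate spans (10, 12) and the first gate spans (11, 13).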
  • a method of acquiring a 3D image of a scene comprising: transmitting at least one light pulse having a pulse width at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a set of gates comprising at least one first, second and third gates having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of each first gate and each second gate are between the start and stop times of a same third gate; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.
  • the first gate has a start time delayed relative to a start time of the third gate by about the pulse width.
  • the second gate has a stop time that precedes a stop time of the at least one third gate by about a pulse width.
  • the first gate has a gate width equal to about half a pulse width.
  • the second gate has a gate width equal to about half a pulse width.
  • the third gate has a gate width equal to about three pulse widths.
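Taken together, the gate widths and offsets recited above imply the following schedule. This is a sketch under my reading of the claims; the function and variable names are illustrative, and times are measured from the light-pulse emission time.

```python
def gate_set(third_start: float, pulse_width: float):
    """One first/second/third gate set as recited above."""
    w = pulse_width
    third = (third_start, third_start + 3.0 * w)  # three pulse widths wide
    first = (third[0] + w, third[0] + 1.5 * w)    # delayed one pulse width; half-pulse-width gate
    second = (first[1], first[1] + 0.5 * w)       # starts at the first gate's stop time
    return first, second, third
```

With third_start = 0 and pulse_width = 2 the schedule is first = (2, 3), second = (3, 4), third = (0, 6); note the second gate's stop precedes the third gate's stop by exactly one pulse width, consistent with the text above.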
  • a camera useable to acquire a 3D image of a scene comprising: a gateable photosurface; and a controller that gates the photosurface in accordance with an embodiment of the invention.
  • the camera comprises a light source controllable to illuminate the scene with a pulse of light.
  • Fig. 1 schematically illustrates a gated 3D camera being used to acquire a 3D image of a scene;
  • Fig. 2 shows a time-distance graph that illustrates a temporal configuration of gates of the camera and at least one light pulse that illuminates the scene shown in Fig. 1 used to acquire a 3D image of the scene, in accordance with prior art;
  • Fig. 3 shows a time-distance graph that illustrates an ambiguity in determining distance to a feature in a scene, in accordance with prior art;
  • Fig. 4 shows a time-distance graph that illustrates a temporal configuration of gates for removing the ambiguity illustrated in Fig. 3, in accordance with an embodiment of the invention;
  • Fig. 5 schematically shows a scene and slices of the scene for which a gated camera provides 3D data;
  • Fig. 6 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for the plurality of slices of the scene shown in Fig. 5 using the gating configuration illustrated in Fig. 4, in accordance with an embodiment of the invention;
  • Fig. 7 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for a plurality of slices of a scene, in accordance with another embodiment of the invention;
  • Fig. 8 shows a time-distance graph that graphically illustrates another timing configuration of gates used to acquire 3D data for a scene, in accordance with prior art;
  • Fig. 9 shows a time-distance graph that graphically illustrates another timing configuration of gates used to acquire 3D data for a scene, in accordance with prior art;
  • Fig. 10 shows a time-distance graph that graphically illustrates timing of gates configured as shown in Fig. 9 used to acquire 3D data for a plurality of slices of a scene, in accordance with prior art; and
  • Fig. 11 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for a plurality of slices of a scene, in accordance with another embodiment of the invention.
  • Fig. 1 schematically illustrates a gated 3D camera 20 being used to acquire a 3D image of a scene 30 having objects schematically represented by objects 31 and 32.
  • Camera 20, which is represented very schematically, comprises a lens system, represented by a lens 21, and a photosurface 22 having pixels 23 on which the lens system images the scene.
  • Photosurface 22 is "gateable” so that it may be selectively gated on or off to be made sensitive or insensitive respectively to light for desired periods.
  • pixels in the photosurface are independently gateable.
  • Photosurfaces that are gated by shutters are described in US Patents 6,057,909, 6,327,073, 6,331,911 and 6,794,628, the disclosures of which are incorporated herein by reference.
  • Photosurfaces having pixels that are independently gateable are described in PCT Publication WO 00/36372.
  • a “gate” refers to a period during which a photosurface or pixel in the photosurface is gated on and made sensitive to light. For convenience of presentation, it is assumed that all the pixels 23 in photosurface 22 are gated on or off simultaneously.
  • Camera 20 optionally comprises a suitable light source 26, such as for example, a laser or a LED or an array of lasers and/or LEDs, controllable to illuminate scene 30 with pulses of light.
  • a controller 24 controls pulsing of light source 26 and gating of photosurface 22.
  • controller 24 controls light source 26 to emit at least one light pulse, schematically represented by a wavy arrow 40, to illuminate scene 30.
  • Light from each at least one light pulse 40 is reflected by features in scene 30 and some of the reflected light is incident on camera 20 and collected by lens 21. Reflected light from at least one light pulse 40 that reaches camera 20 is schematically represented by wavy arrows 45.
  • controller 24 gates photosurface 22 on at a suitable time relative to a time at which the light pulse is emitted to receive and image reflected light 45 collected by lens 21 on photosurface 22. Amounts of light 45 imaged on pixels 23 of the photosurface are used to determine distances to features of scene 30 that are imaged on the pixels and provide thereby a 3D image of the scene.
  • Fig. 2 shows a time-distance graph 60 that illustrates relationships between timing of a light pulse 40 that illuminates scene 30, a gate of photosurface 22 and amounts of light 45 collected by lens 21 and registered by a pixel 23 in the photosurface that images a feature of scene 30 located at a distance D from camera 20.
  • distance D from camera 20 is indicated along an abscissa 61 and time is indicated along left and right ordinates 62 and 63 respectively.
  • the ordinates are scaled in units proportional to "ct" - time "t” multiplied by the speed of light "c".
  • camera 20 will receive light reflected from the light pulse by the feature for a period having a duration equal to the pulse width Δτpw, measured from a time at which a first photon reflected by the feature from the light pulse reaches the camera.
  • Block arrows 72 extending from time t1(Df) to time t2(Df) and having a length Δτpw, shown along left and right hand time ordinates 62 and 63, schematically represent light reflected by feature 71 that reaches camera 20.
  • A pixel 23 in photosurface 22 (Fig. 1) imaging the feature will register light from the feature.
  • Q0 is an amount of light that would be registered by pixel 23 imaging feature 71, were all the light reflected from pulse 40 by the feature and collected by lens 21 registered by the pixel, irrespective of whether or not the reflected light reached the camera during short gate 80.
  • Q0 is a function of the reflectance of feature 71 and its distance Df from the camera.
  • Q0 is determined by controlling light source 26 to emit at least one light pulse 40 and, for each emitted light pulse 40, gating camera 20 with a long gate so that all light reflected by feature 71 from the light pulse that reaches the camera is imaged on photosurface 22.
  • Camera 20 receives and registers light reflected from light pulse 40 on its photosurface 22 during short gate 80 for any feature of scene 30 located at a distance Df between lower and upper bound distances, DSL and DSU respectively, from the camera.
  • the upper and lower bound distances define a "slice", schematically indicated in graph 60 by a shaded rectangle 83, of scene 30.
  • a long gate corresponding to short gate 80 for determining Q0 for any feature located in slice 83 is schematically represented by a rectangle 90 along right hand time ordinate 63.
  • Light registered by pixel 23 that images feature 71 during long gate 90 is graphically represented by a shaded area 91 in gate 90.
  • long gate 90 optionally has a gate width Δτlg at least equal to a sum of the short gate width Δτsg and twice the pulse width Δτpw of light pulse 40, i.e. Δτlg ≥ Δτsg + 2Δτpw.
  • Gate 90 optionally has a start time tlg1, relative to time t0 at which light pulse 40 is emitted, that is equal to a time that a first photon from a feature in scene 30 having a distance DSL from camera 20 reaches the camera.
  • Gate 90 optionally has a stop time tlg2 equal to a time at which a last photon from pulse 40 reaches camera 20 from a feature of the scene having a distance DSU from camera 20.
  • While long gate 90 provides information sufficient to determine Q0 for any feature in slice 83, it is noted that neither a quantity Q of light 81 registered on pixel 23 that images feature 71 during short gate 80 nor a quantity Q0 of light 91 registered by the pixel during long gate 90 provides information as to which of equations (3) or (4) should be used to determine Df.
  • Df and the complementary distance Df* satisfy a relationship:
  • Fig. 3 schematically shows distance Df and complementary distance Df* for feature 71 in a time-distance graph 75.
  • Amounts of light registered by pixel 23 for distance Df are schematically shown by shaded areas 81 and 91 for short and long gates 80 and 90 respectively, both of which gates are schematically shown along left time ordinate 62.
  • Position of feature 71 for distance Df* is indicated along time-distance line 70 by a circle 71*.
  • Amounts of light registered by pixel 23 for Df* are schematically represented by shaded areas 81* and 91* for gates 80 and 90, which for distance Df* are shown along right hand time ordinate 63.
  • a block arrow labeled 72* schematically represents light from light pulse 40 reflected from feature 71 that reaches camera 20 were the feature to be located at Df*. It is noted that whereas light is registered for distances Df and Df* by pixel 23 at different times during gates 80 and 90, the amounts of light registered during gate 80 are the same in each case, as are the amounts registered during gate 90.
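The ambiguity can be reproduced numerically. In this sketch the gate and arrival times are my own illustrative numbers: two reflected-pulse arrival times symmetric about a short gate deposit identical charge, so the short gate alone cannot distinguish Df from Df*.

```python
def overlap(a0: float, a1: float, b0: float, b1: float) -> float:
    """Length of the overlap of intervals [a0, a1] and [b0, b1]."""
    return max(0.0, min(a1, b1) - max(a0, b0))

pulse_width = 1.0
gate = (5.0, 6.0)  # short gate with width equal to the pulse width

# Registered charge is proportional to the overlap of the reflected
# pulse's arrival interval with the gate.
q_near = overlap(4.5, 4.5 + pulse_width, *gate)  # early arrival (nearer feature)
q_far = overlap(5.5, 5.5 + pulse_width, *gate)   # late arrival (farther feature)

# q_near and q_far are equal: the two candidate distances are indistinguishable.
```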
  • the ambiguity of whether to use equation (3) or equation (4) in determining Df is removed by gating photosurface 22 on following each of at least one light pulse 40 for two short gates, a first, "front” short gate and a second, “back” short gate, optionally having equal gate widths.
  • photosurface 22 is gated on for a front short gate having a gate width Δτfg at a front gate start time, tFg, following a time at which the light pulse is emitted.
  • a shaded area 91 represents light registered by pixel 23 during long gate 90 for distance Df.
  • Shaded areas 103 and 104 in front and back short gates 101 and 102 respectively graphically represent amounts of light registered by pixel 23 for feature 71, located at Df, during the short gates.
  • the same front and back short gates 101 and 102 and long gate 90 are shown along right hand time ordinate 63.
  • Shaded areas 105 and 106 in front and back short gates 101 and 102 shown along right hand time ordinate 63 represent amounts of light registered by pixel 23 during the short gates respectively, assuming that feature 71 were located at Df*.
  • a shaded area 91* represents light registered by pixel 23 during long gate 90 for Df*.
  • At least one light pulse 40 comprises a plurality of light pulses, and each quantity of registered charge QF, QB and Q0 is determined from a train of light pulses 40.
  • controller 24 optionally controls light source 26 to emit a train of light pulses 40.
  • controller 24 gates photosurface 22 on for a short gate 101. A total amount of light registered by each pixel 23 for all short gates 101 is used to provide QF for the pixel.
  • To determine QB, another train of light pulses 40 is emitted and, following a delay of tBg (equation (11)) after an emission time of each light pulse, photosurface 22 is gated on for a back gate 102.
  • a total amount of light registered by each pixel for all the light pulses in the pulse train is used to determine QB.
  • Q0 is similarly determined from a train of light pulses 40 and a long gate 90 for each light pulse in the train, which gate 90 follows the light pulse by a delay of tlg1 (equation (8)).
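The accumulation over pulse trains described above can be sketched as follows; the per-pulse charges and train length are invented for illustration, not taken from the patent.

```python
def accumulate(per_pulse_charge: float, n_pulses: int) -> float:
    """Total charge a pixel registers over a pulse train: one gate
    per emitted pulse, charges summed over the whole train."""
    return per_pulse_charge * n_pulses

q_f = accumulate(0.2, 100)  # front gate 101, repeated for each pulse in a train
q_b = accumulate(0.1, 100)  # back gate 102, a second pulse train
q_0 = accumulate(0.5, 100)  # long gate 90, a third pulse train
```

Repeating the measurement over many pulses raises each registered quantity well above the noise floor before the ratios are formed.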
  • Let a "slice acquisition time", "Ts", be a time during which a scene, such as scene 30, is illuminated by light source 26 with light pulses 40 and camera 20 is gated to acquire values for QF, QB and Q0 for pixels 23 in photosurface 22, providing distances to features in a slice of the scene imaged on the pixels.
  • slices imaged by a gated camera such as camera 20 are relatively narrow, and a scene for which distances are to be determined using the camera typically has features located in a range of distances substantially greater than a range defined by lower and upper bound distances DSL and DSU (Figs. 2-4) of the slice.
  • a plurality of substantially contiguous slices of the scene are imaged using camera 20.
  • the scene is illuminated by at least one light pulse 40 and camera 20 is gated on for front, back and long gates to acquire distances to features in the slice.
  • Let TT represent a total 3D scene acquisition time.
  • Fig. 5 schematically shows scene 30 shown in Fig. 1 and a plurality "N" of slices S1, S2 ... SN that are used by camera 20 to provide a 3D image of the scene, which extends from a range R1 to a range R2.
  • the slices shown in Fig. 5 are represented in graph 95 by shaded rectangles labeled S1, S2 ... SN along time-distance line 70.
  • slices represented by Sn having a larger subscript n are farther from camera 20 than slices Sn having a smaller subscript, and slices whose subscripts differ by 1 are contiguous.
  • the short front and back gates and the long gate that define a given slice Sn are graphically represented by rectangles labeled FGn, BGn and LGn respectively. Gates for adjacent slices Sn are shown on opposite sides of time ordinate 62.
  • a first slice S1 in range R1-R2 defined by gates FG1, BG1 and LG1 is slice 83 shown in Fig. 4, and shaded regions 103, 104 and 91 (Fig. 4) graphically representing light registered by a pixel 23 for feature 71 are shown for gates FG1, BG1 and LG1.
  • a total 3D acquisition time TT (equation (16)) required to acquire a 3D image of a scene by 3D imaging contiguous slices as described above can be too long to provide satisfactory 3D imaging of the scene.
  • an acquisition time TT may be too long for scenes having moving features that displace by distances on the order of a slice width during time TT.
  • a total time TT for acquiring a 3D image of a scene can be reduced by modifying timing of gates used to acquire values for QF, QB and Q0 for slices of the scene, optionally without changing the lengths of the gates.
  • a time delay equal to 2Δτpw temporally separates start times of front gates for contiguous slices, and long gates for the slices overlap (relative to t0) as shown in Fig. 4.
  • a 3D image of a scene can be acquired, in accordance with an embodiment of the invention, in a reduced total acquisition time TT if long gates of adjacent slices are timed so that they do not overlap and, when one long gate of adjacent slices ends, the other begins.
  • For the interstitial slices, the front and back short gates do not provide 3D information. In accordance with an embodiment of the invention, the non-overlapping long gates provide additional information for determining distance to features located in the interstitial slices. For convenience of presentation, slices of a scene for which short front and back gates provide information are referred to as "regular slices".
  • each long gate and its associated front and back short gates provide 3D data for a regular slice and, in addition, 3D data for an interstitial slice. If the front, back and non-overlapping long gates have the same gate widths "SW" as corresponding gates having overlapping long gates shown in Fig. 6 that define a slice of a scene, a regular slice and its adjacent interstitial slice together span a range of about (3/2)SW.
  • each set of gates that define a regular slice of the scene provides 3D data for features in a larger range of distances than a corresponding set of gates in Fig. 6. Therefore, a smaller number of sets of gates having non-overlapping long gates is required to acquire a 3D image of the scene.
  • a slice acquisition time for a regular slice is about the same as a slice acquisition time for a slice defined using overlapping gates.
  • a total 3D acquisition time TT for a scene using non-overlapping long gates in accordance with an embodiment of the invention is therefore generally shorter than a total acquisition time using overlapping long gates.
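The saving described above is simple arithmetic: with non-overlapping long gates, each gate set yields a regular slice plus an interstitial slice, covering about (3/2)·SW of range instead of SW, so roughly 2/3 as many sets span the same scene. The numbers below are illustrative, not from the patent.

```python
import math

def n_gate_sets(r1: float, r2: float, coverage_per_set: float) -> int:
    """Number of gate sets needed to span the range r1..r2."""
    return math.ceil((r2 - r1) / coverage_per_set)

sw = 1.0  # spatial width of one regular slice (illustrative units)
n_overlap = n_gate_sets(0.0, 30.0, sw)            # overlapping long gates: one slice per set
n_non_overlap = n_gate_sets(0.0, 30.0, 1.5 * sw)  # non-overlapping: regular + interstitial slice
```

With the same slice acquisition time per set, the total acquisition time scales with the number of sets, giving the 2/3 ratio stated later in the text.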
  • Fig. 7 schematically shows a timing and distance graph 115 that graphically illustrates timing of short front and back, and non-overlapping long gates, in accordance with an embodiment of the invention.
  • Regular slices of scene 30 that are imaged by camera 20 are graphically represented by shaded rectangles labeled RS1, RS2 ... RSN along time-distance line 70.
  • corresponding front back and long gates of camera 20 that define the slice are graphically represented by rectangles labeled RFG n , RBG n and RLG n respectively.
  • the gates are graphically shown along left hand time ordinate 62, with gates for adjacent regular slices shown on opposite sides of the time ordinate. Relative to a start time for any given long gate RLGn, the corresponding short front and back gates RFGn, RBGn have the same timing as the corresponding front and back gates shown in Fig. 6 relative to their long gate.
  • First regular slice RS1 in graph 115 is, by way of example, identical to slice S1 shown in graph 95 of Fig. 6, and relative to an emission time t0 of a first light pulse 40, the start times, stop times and gate lengths for gates RFG1, RBG1 and RLG1 are the same respectively as for gates FG1, BG1 and LG1 in Fig. 6.
  • Distance to a feature located in a regular slice RSn is determined from amounts of light QF, QB and Q0 registered by a pixel 23 that images the feature during front, back and long gates RFGn, RBGn and RLGn that define the slice, using equations (13) and (14). Amounts of light registered by a pixel that images feature 71 during gates RFG1, RBG1 and RLG1 are graphically represented by shaded areas 103, 104 and 91 respectively in the gates.
  • a long gate RLGn begins when a preceding long gate RLGn-1 ends.
  • regular slices RS n are not contiguous but are separated by interstitial slices.
  • Interstitial slices are graphically represented in Fig. 7 by shaded rectangles labeled ISn,n+1, where the subscripts n, n+1 indicate the regular slices RSn and RSn+1 that bracket the interstitial slice.
  • a pixel 23 that images the feature does not register light reflected by the feature from a light pulse 40 during the front and back short gates of the regular slices that bracket the interstitial slice.
  • 3D information for the feature is not acquired by camera 20 during the front and back gates and distance to the feature cannot be determined using conventional equations such as equations (13) and (14).
  • While the pixel does not register any light reflected from a light pulse 40 by the feature during the short gates, the pixel does register light reflected from a light pulse 40 during long gates of the regular slices that bracket the interstitial slice.
  • a feature of scene 30 located in interstitial slice IS1,2 at a distance Df(121) is schematically represented by a circle labeled 121 along time-distance line 70.
  • Reflected light registered by a pixel 23 that images the feature during long gates RLG1 and RLG2 is graphically indicated by shaded regions 122 and 124 respectively.
  • light registered by the long gates is used to determine distance to the feature.
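One plausible way to use the two long-gate charges is sketched below. The formula is my own, under the assumption that a feature in the interstitial slice splits its reflected pulse between the two adjacent, non-overlapping long gates; the patent's exact equations may differ.

```python
def interstitial_fraction(q_long_n: float, q_long_n1: float) -> float:
    """Fraction of the reflected pulse caught by the later of the two
    bracketing long gates; locates a feature within the interstitial
    slice independently of target reflectivity, which cancels in the
    ratio."""
    return q_long_n1 / (q_long_n + q_long_n1)

# A feature exactly at the boundary between the two long gates' ranges
# splits its light evenly, giving a fraction of 0.5.
```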
  • For a camera such as camera 20, used to determine distances to features of a scene and gated in accordance with an embodiment of the invention, let amounts of light registered by a pixel that images a feature in the scene during gates RFGn, RBGn be represented by QF(n) and QB(n) respectively. Let times at which the gates RFGn and RBGn are turned on be represented by tFg(n) and tBg(n) respectively. Then distance Df to the feature may be determined, in accordance with an embodiment of the invention, using the following general set of conditions and equations. If QF(n) ≠ 0 or QB(n) ≠ 0 (20)
  • adjacent long gates are timed, relative to time t 0 , to "overlap" slightly to remove the ambiguity.
  • a long gate may have a start time earlier than the stop time of an immediately previous long gate by a period equal to or less than about a tenth of a pulse width.
  • to provide the overlap long gates are lengthened by the period of the overlap.
  • camera 20 is gated on for N sets of gates RFGn, RBGn and RLGn, where N = (R2 - R1)/[(3/2)SW]. (27)
  • a slice acquisition time Ts for a regular slice in accordance with an embodiment of the invention is the same as a slice acquisition time for a slice acquired in accordance with the embodiment of the invention illustrated in Fig. 6.
  • a total time, TT, to acquire data for a 3D image of scene 30 in accordance with an embodiment of the invention is therefore 2/3 the time required to 3D image the scene in accordance with the gating shown in Fig. 6.
  • distances to features in a slice of a scene, such as scene 30, are determined by a short, "extended", front gate having a gate width equal to the pulse width Δτpw of light pulses 40 (Fig. 1) that illuminate the scene.
  • the extended front gate and corresponding long gate begin at a same time relative to an emission time t0 of respective light pulses 40 that illuminate the scene and provide light that is reflected from the scene and imaged by camera 20 during the gates.
  • Fig. 8 shows a time-distance graph 200 that illustrates timing of an extended front gate 202 and its corresponding long gate 204 relative to an emission time t0 of a light pulse 40 that illuminates scene 30.
  • Front gate 202 is shown along left hand time ordinate 62 and long gate 204 is shown along right hand time ordinate 63.
  • Gates 202 and 204 begin at a same time relative to time t0.
  • long gate 90 shown in Figs. 2-4 optionally has a gate width equal to 3Δτpw.
  • long gate 204 used with extended front gate 202 optionally has a shorter gate width, 2Δτpw.
  • Extended front gate 202 and its associated long gate 204 define a slice of scene 30 having a spatial width cΔτpw/2.
  • the slice defined by the gates is graphically represented by a shaded rectangle 206 along time-distance line 70 in Fig. 8.
  • Let a time at which extended front gate 202 and its corresponding long gate begin following a light pulse emission time t0 be represented by tsg. Then light acquired by pixels 23 in photosurface 22 during the gates may be used to determine distances Df to features in a slice of the scene having lower and upper distance bounds DSL and DSU, where DSL, DSU and Df satisfy a relationship,
  • Light registered by a pixel 23 that images a feature 71 of scene 30 located in slice 206 during extended front gate 202 and long gate 204 is graphically represented by shaded areas 208 and 210 respectively in the gates. From Fig. 8 it may be seen that for any feature located in slice 206, an amount of light 210 registered by pixel 23 is substantially all the light reflected by feature 71 that reaches camera 20. However, for such a feature, an amount of light 208 registered by the pixel during extended front gate 202 is dependent on the location of the feature in slice 206. Let the amounts of registered light 208 and 210 be represented in symbols by "Q" and "Q0" respectively.
  • since there are relatively few features for which Q/Q0 = 1, ignoring such features does not substantially adversely affect providing a complete 3D image of the slice.
  • Distances to features in slice 206 may also be determined in accordance with prior art using an extended back gate instead of an extended front gate.
  • the extended back gate has a gate width equal to that of an extended front gate but ends at a same time relative to a light pulse emission time t0 as a corresponding long gate, instead of beginning at a same time as the long gate.
  • An extended back gate may be considered a mirror image of an extended front gate.
  • Fig. 9 shows a time-distance graph 220 that illustrates timing of an extended back gate 222 and its corresponding long gate 224 relative to an emission time t0 of a light pulse 40 that illuminates scene 30.
  • Extended back gate 222 is shown along left hand time ordinate 62 and long gate 224 is shown along right hand time ordinate 63.
  • Gates 222 and 224 end at a same time teg relative to time t0.
  • Light reflected from feature 71 registered by pixel 23 that images the feature during extended back and long gates 222 and 224 is represented by shaded areas 226 and 228 respectively.
  • Distance Df to a feature in slice 206 is given by Df = cteg/2 + (cΔτpw)(Q/Q0 - 2)/2, (33) where "Q" and "Q0" represent amounts of registered light 226 and 228 respectively.
  • configurations of an extended front gate 202 (Fig. 8) or an extended back gate 222 (Fig. 9) together with a long gate 204 and 224 respectively may be repeated to acquire distances to features in a plurality of optionally contiguous slices of scene 30.
  • if a range over which distances to features in the scene extend runs from a distance R1 to a distance R2, and a slice has a slice width SW, a plurality of N slices, where N = (R2 - R1)/SW, are needed to acquire a 3D image of the scene.
  • let a "gate acquisition time" ΔTg be required for a pixel 23 to register an amount of light for an extended front gate, extended back gate or a long gate suitable for use in determining a distance Df to a feature of a scene imaged on the pixel.
  • a 3D acquisition time for a slice of the scene may then be written Ts = 2ΔTg.
  • Fig. 10 shows a time-distance graph 240 that illustrates temporal relationships of a plurality of extended back gates and associated long gates that are used to acquire 3D images of slices of a scene, such as scene 30.
  • a first extended back gate 242 and its associated long gate 244 are shown along left time ordinate 62 and define a slice 246. Amounts of light reflected from a light pulse 40 (Fig. 1) from a feature 248 in slice 246 that are registered by a pixel imaging the feature during extended back gate 242 and long gate 244 are shown as shaded areas 249 and 250 respectively in the gates.
  • a second extended back gate 252 and its associated long gate 254 are shown along right time ordinate 63 and define a slice 256.
  • slice 246 in scene 30 is assumed contiguous with slice 256 in the scene and as a result the slices touch at a corner in Fig. 10.
  • Amounts of light reflected from a light pulse 40 (Fig. 1) from a feature 258 in slice 256 that are registered by a pixel 23 imaging the feature during extended back gate 252 and long gate 254 are shown as shaded areas 259 and 260 respectively in the gates.
  • the inventors have realized that if only long gates are used to acquire a plurality of slices of a scene, a total acquisition time, TF, to acquire data for a 3D image of scene 30 may be reduced relative to a total acquisition time required using an extended front or back gate and an associated long gate.
  • Fig. 11 shows a time-distance graph 280 that illustrates temporal relationships of a plurality of long gates used to acquire a 3D image of a scene, in accordance with an embodiment of the invention.
  • a sequence of long gates LG1, LG2, LG3 ... used to acquire a 3D image of scene 30 is shown along left time ordinate 62.
  • gates, hereinafter “odd gates”, labeled with an odd subscript and gates, hereinafter “even gates”, labeled with an even number subscript are shown on opposite sides of left time ordinate 62.
  • Gates LG1 and LG2 are also repeated along right time ordinate 63. Each pair of sequential odd and even gates defines a spatial slice of scene 30.
  • gates LG1 and LG2 define a slice located along time-distance line 70 labeled SL1,2 in Fig. 11.
  • Light registered by pixels 23 of photosurface 22 during gates LG1 and LG2 may be used in accordance with an embodiment of the invention to determine distance to all features located in slice SL1,2.
  • the gate pair (LG3, LG4) defines slice SL3,4 shown in Fig. 11. In an embodiment of the invention, even and odd gates have a same gate width, ΔτLGW = 2Δτpw, (37) and each even and odd gate begins at a time (relative to t0) that is half a gate width, i.e. Δτpw, later than a time at which a preceding odd and even gate respectively begins.
  • distance Df to the feature imaged by the pixel is determined in accordance with the following equations: Df = ctn/2 + (cΔτpw)(Qn+1/Qn)/2 if Qn+1 ≤ Qn, and Df = ctn+1/2 + (cΔτpw)(1 - Qn/Qn+1)/2 if Qn < Qn+1, where tn and tn+1 represent start times of sequential gates LGn and LGn+1 and Qn and Qn+1 amounts of light registered by the pixel during the gates.
  • a total acquisition time for the scene in accordance with an embodiment of the invention is therefore one half the prior art acquisition time given by equation (36).
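For orientation, the long-gate scheduling described above (equation (37): gates of width 2Δτpw, each starting half a gate width, i.e. Δτpw, after the preceding one) can be sketched numerically. The following Python fragment is an illustration only; the function name, the placement of the first and last gates, and the assumption of an ideal square pulse are the editor's, not part of the disclosure:

```python
C = 299_792_458.0  # speed of light, m/s

def long_gate_starts(r1, r2, pulse_width):
    """Illustrative schedule of long-gate start times (seconds, relative to
    pulse emission at t0 = 0) for gates of width 2*pulse_width stepped by
    pulse_width, covering features between distances r1 and r2 (meters)."""
    t_first = 2.0 * r1 / C                 # first photons from the nearest feature
    t_last = 2.0 * r2 / C + pulse_width    # last photons from the farthest feature
    starts = []
    t = t_first
    while t < t_last:
        starts.append(t)
        t += pulse_width                   # half a gate width, per equation (37)
    return starts
```

Each sequential pair of gates in the returned list defines one slice, so covering a range from R1 to R2 takes roughly half as many gate acquisitions as schemes that pair each long gate with an extended front or back gate.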

Abstract

A method of acquiring a 3D image of a scene, the method comprising: transmitting at least one light pulse having a pulse width at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of sets of gates, each set comprising at least one first, second and third gate having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of the at least one first gate and the at least one second gate are between the start and stop times of the at least one third gate, wherein for at least one of the set of gates, the at least one third gate has a start time equal to about the stop time of the at least one third gate of another of the set of gates; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.

Description

3D CAMERA AND METHODS OF GATING THEREOF
FIELD
The invention relates to methods and apparatus for acquiring 3D images of a scene.
BACKGROUND
Three-dimensional (3D) optical imaging systems, hereinafter referred to as "3D cameras", that are capable of providing distance measurements to objects and points on objects that they image, are used for many different applications. Among these applications are profile inspections of manufactured goods, CAD verification, robot vision, geographic surveying and imaging objects selectively as a function of distance.
Some 3D cameras provide simultaneous measurements to substantially all points of objects in a scene they image. Generally, these 3D cameras comprise a light source, such as a laser, which is pulsed or shuttered so that it provides pulses of light for illuminating a scene being imaged and a gated imaging system for imaging light from the light pulses that is reflected from objects in the scene. The gated imaging system comprises a camera having a photosensitive surface, hereinafter referred to as a "photosurface", such as a CCD or CMOS photosurface and a gating means for gating the camera open and closed, such as an optical shutter or a gated image intensifier. The reflected light is registered on pixels of the photosurface of the camera only if it reaches the camera when the camera is gated open. To image a scene and determine distances from the camera to objects in the scene, the scene is generally illuminated with a train of light pulses radiated from the light source. For each radiated light pulse in the train, following an accurately determined delay from the time that the light pulse is radiated, the camera is gated open for a period of time hereinafter referred to as a "gate". Light from the light pulse that is reflected from an object in the scene is imaged on the photosurface of the camera if it reaches the camera during the gate. Since the time elapsed between radiating a light pulse and the gate that follows it is known, the time it took imaged light to travel from the light source to the reflecting object in the scene and back to the camera is known. The time elapsed is used to determine the distance to the object.
In some "gated" 3D cameras, only the timing between light pulses and gates is used to determine distance from the 3D camera to a point in the scene imaged on a pixel of the photosurface of the camera. In others, an amount of light registered by the pixel during the time that the camera is gated open is also used to determine the distance. The accuracy of measurements made with these 3D cameras is a function of the rise and fall times of the light pulses and their flatness, and of how fast the cameras can be gated open and closed. Gated 3D cameras and examples of their uses are found in European Patent EP 1214609 and in US Patents 6,057,909, US 6,091,905, US 6,100,517 and US 6,445,884, the disclosures of which are incorporated herein by reference. The cameras described in these patents use amounts of light registered by pixels in the camera during times at which the camera is gated open to determine distances to features in contiguous slices of a scene. The slice locations and spatial widths are defined by length and timing of gates during which the cameras are gated open relative to pulse lengths and timing of light pulses that are transmitted to illuminate the scene.
Generally, at least two gates, at least one relatively long gate and at least one relatively short, "front" or "back" gate temporally corresponding to the front part or back part respectively of the at least one long gate, are used to acquire a 3D image of a slice of the scene. Relative to a time at which a light pulse is transmitted to illuminate the scene, a front gate starts at a same time that a long gate begins and a back gate ends at a same time as a long gate ends. The short gates optionally have a gate width equal to a pulse width of pulses of light used to illuminate the scene and a long gate has a pulse width equal to twice the light pulse width. Amounts of light registered during the at least one short gate by a pixel in the camera that images a feature of the scene are normalized to amounts of light registered by the pixel during the at least one long gate. The normalized amounts of registered light are used to determine a distance to the feature. A total acquisition time for acquiring data for a 3D image of a scene is substantially equal to a number of slices of the scene that are imaged, times a time, a "slice acquisition time", required to acquire 3D data for a single slice. A slice acquisition time is a function of a number of light pulses and gates required to register quantities of light for the various gates sufficient to provide data for determining distances to features of the scene located in the slice. A 3D camera using a pulsed source of illumination and a gated imaging system is described in "Design and Development of a Multi-detecting two Dimensional Ranging Sensor", Measurement Science and Technology 6 (September 1995), pages 1301-1308, by S. Christie, et al., and in "Range-gated Imaging for Near Field Target Identification", Yates et al, SPIE Vol. 2869, p374 - 385 which are herein incorporated by reference. Another 3D camera is described in U.S. patent 5,081,530 to Medina, which is incorporated herein by reference. 
A 3D camera described in this patent registers energy in a pulse of light reflected from a target that reaches the camera's imaging system during each gate of a pair of gates. Distance to a target is determined from the ratio of the difference between the amounts of energy registered during each of the two gates to the sum of the amounts of energy registered during each of the two gates.
SUMMARY
An aspect of some embodiments of the invention relates to providing a gating procedure for gating a gated 3D camera that provides a relatively short acquisition time for a 3D image of a scene.
An aspect of some embodiments of the invention relates to providing a configuration of gates for providing a 3D image of a scene for which, for a given width of a light pulse that illuminates the scene the gates provide 3D data for slices of the scene that have relatively large spatial widths.
There is therefore provided in accordance with an embodiment of the invention, a method of acquiring a 3D image of a scene, the method comprising: transmitting at least one light pulse at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of sets of gates, each set comprising at least one first, second and third gate having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of the at least one first gate and the at least one second gate are between the start and stop times of the at least one third gate, wherein for at least one of the set of gates, the at least one third gate has a start time equal to about the stop time of the at least one third gate of another of the set of gates; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.
Optionally, the start time of the second gate is substantially equal to the stop time of the first gate. Optionally, the start time of the first gate is delayed relative to the start time of the third gate by a pulse width of the at least one light pulse. Optionally, the first and second gates have equal gate widths. Optionally, the gate width of the first and second gates is equal to half a pulse width of the at least one light pulse. Optionally, the gate width of the third gate is substantially equal to three pulse widths of the at least one light pulse.
In some embodiments of the invention, if amounts of reflected light from a feature imaged during the at least one first, second and third gates of a set of gates are denoted by Q1, Q2 and Q3 respectively and the start time of the at least one first gate by t1, then if Q1 ≠ 0 or Q2 ≠ 0, distance Df to the feature is determined in accordance with Df = ct1/2 + (cΔτ)[(Q1 + Q2)/Q3 - 1]/2 if Q2 < Q1, and Df = ct1/2 + (cΔτ)[1 - (Q1 + Q2)/Q3]/2 if Q1 < Q2, where c is the speed of light and Δτ is the pulse width of the at least one light pulse. Additionally or alternatively, if amounts of reflected light from a feature imaged during the at least one first, second and third gates of a set of gates are denoted by Q1, Q2 and Q3 respectively, the start time of the at least one third gate by t3, and the amount of light from the feature imaged during the at least one third gate of the other set of gates by Q*3, then if Q1 = Q2 = 0, Q3 ≠ 0 and Q*3 ≠ 0, distance Df to the feature is determined in accordance with Df = ct3/2 - (cΔτ)(Q3/(Q3 + Q*3))/2 or, equivalently, Df = ct3/2 - (cΔτ)(1 - Q*3/(Q3 + Q*3))/2, where c is the speed of light and Δτ is the pulse width of the at least one light pulse. In some embodiments of the invention, the at least one third gate has a start time earlier than the stop time of the at least one third gate of the other set of gates by a period that is less than or equal to about one twentieth of the pulse width of the at least one light pulse.
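One way the distance rule just described might be written in code is sketched below. This is an illustration under the editor's assumptions (function and variable names are invented; charges are treated as already-normalized scalars), not the claimed implementation:

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_set(q1, q2, q3, t1, pulse_width):
    """Illustrative sketch of the rule above for the case Q1 != 0 or Q2 != 0:
    t1 is the start time of the first gate (seconds, relative to pulse
    emission) and pulse_width is the light pulse width in seconds."""
    if q1 == 0 and q2 == 0:
        raise ValueError("handled by the Q3/Q*3 rule for adjacent long gates")
    q = q1 + q2
    if q2 <= q1:   # trailing edge of the reflected pulse falls within the gates
        return C * t1 / 2 + (C * pulse_width) * (q / q3 - 1) / 2
    else:          # leading edge of the reflected pulse falls within the gates
        return C * t1 / 2 + (C * pulse_width) * (1 - q / q3) / 2
```

For example, with a 10 ns pulse, t1 = 100 ns, Q1 = 0.5, Q2 = 0.25 and Q3 = 1, the first branch applies and the computed distance is c·48.75 ns, i.e. about 14.6 m.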
There is further provided in accordance with an embodiment of the invention, a method of acquiring a 3D image of a scene, the method comprising: transmitting at least one light pulse at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of equal length gates having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start time of at least one first gate is substantially a time half way between the start and stop times of at least one second gate; and using amounts of reflected light imaged on the photosurface during the at least one first gate and the at least one second gate to determine distance to a feature of the scene.
Optionally, the gates have a gate width substantially equal to twice a pulse width of the at least one light pulse.
Additionally or alternatively, if the start time and amounts of reflected light from the feature imaged during the at least one first gate are denoted respectively by t1 and Q1, and for the at least one second gate by t2 and Q2 respectively, distance Df to the feature is determined in accordance with
Df = (ct1/2) + cΔτ(Q2/Q1)/2 if Q2 < Q1 and Df = (ct2/2) + cΔτ(1 - (Q1/Q2))/2 if Q1 < Q2, where c is the speed of light and Δτ is the pulse width of the at least one light pulse.
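The two-gate rule above (equal-width gates, the second starting halfway through the first) can be sketched as follows; the sketch and its names are the editor's, and the charges are assumed normalized to the full reflected pulse energy:

```python
C = 299_792_458.0  # speed of light, m/s

def distance_two_gates(q1, q2, t1, t2, pulse_width):
    """Illustrative sketch of the two-gate rule: t1, t2 are the gate start
    times (seconds, relative to pulse emission) and q1, q2 the amounts of
    light registered during the first and second gates."""
    if q2 <= q1:
        # reflected pulse lies mostly in the first gate
        return C * t1 / 2 + C * pulse_width * (q2 / q1) / 2
    else:
        # reflected pulse lies mostly in the second gate
        return C * t2 / 2 + C * pulse_width * (1 - q1 / q2) / 2
```

With a 10 ns pulse, gates starting at 100 ns and 110 ns, Q1 = 1 and Q2 = 0.5, this gives c·52.5 ns, roughly 15.7 m.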
There is further provided in accordance with an embodiment of the invention, a method of acquiring a 3D image of a scene, the method comprising: transmitting at least one light pulse having a pulse width at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a set of gates comprising at least one first, second and third gates having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of each first gate and each second gate are between the start and stop times of a same third gate; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.
Optionally, the first gate has a start time delayed relative to a start time of the third gate by about the pulse width. Additionally or alternatively, the second gate has a stop time that precedes a stop time of the at least one third gate by about a pulse width. In some embodiments of the invention, the first gate has a gate width equal to about half a pulse width. In some embodiments of the invention, the second gate has a gate width equal to about half a pulse width. In some embodiments of the invention, the third gate has a gate width equal to about three pulse widths.
Optionally, the first and second gates have gate widths equal to half a pulse width and the third gate has a gate width equal to about three pulse widths wherein, if amounts of reflected light from a feature imaged during the at least one first, second and third gates are denoted by Q1, Q2 and Q3 respectively, and the start time of the first gate by t1, distance Df to the feature is determined in accordance with Df = ct1/2 + (cΔτ)(Q/Q3 - 1)/2 if Q2 < Q1; and Df = ct1/2 + (cΔτ)(1 - Q/Q3)/2 if Q1 < Q2, where c is the speed of light, Δτ is the pulse width of the at least one light pulse and
Q = (Q1 + Q2).
There is further provided in accordance with an embodiment of the invention, a camera useable to acquire a 3D image of a scene comprising: a gateable photosurface; and a controller that gates the photosurface in accordance with an embodiment of the invention. Optionally, the camera comprises a light source controllable to illuminate the scene with a pulse of light.
BRIEF DESCRIPTION OF FIGURES
Non-limiting examples of embodiments of the invention are described below with reference to figures attached hereto that are listed following this paragraph. Identical structures, elements or parts that appear in more than one figure are generally labeled with a same numeral in all the figures in which they appear. Dimensions of components and features shown in the figures are chosen for convenience and clarity of presentation and are not necessarily shown to scale. Fig. 1 schematically illustrates a gated 3D camera being used to acquire a 3D image of a scene;
Fig. 2 shows a time-distance graph that illustrates a temporal configuration of gates of the camera and at least one light pulse that illuminates the scene shown in Fig. 1 used to acquire a 3D image of the scene, in accordance with prior art;
Fig. 3 shows a time-distance graph that illustrates an ambiguity in determining distance to a feature in a scene, in accordance with prior art;
Fig. 4 shows a time-distance graph that illustrates a temporal configuration of gates for removing the ambiguity illustrated in Fig. 3, in accordance with an embodiment of the invention;
Fig. 5 schematically shows a scene and slices of the scene for which a gated camera provides 3D data;
Fig. 6 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for the plurality of slices of the scene shown in Fig. 5 using the gating configuration illustrated in Fig. 4, in accordance with an embodiment of the invention;
Fig. 7 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for a plurality of slices of a scene, in accordance with another embodiment of the invention;
Fig. 8 shows a time-distance graph that graphically illustrates another timing configuration of gates used to acquire 3D data for a scene, in accordance with prior art;
Fig. 9 shows a time-distance graph that graphically illustrates another timing configuration of gates used to acquire 3D data for a scene, in accordance with prior art;
Fig. 10 shows a time-distance graph that graphically illustrates timing of gates configured as shown Fig. 9 used to acquire 3D data for a plurality of slices of a scene, in accordance with prior art; and
Fig. 11 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for a plurality of slices of a scene, in accordance with another embodiment of the invention.
DETAILED DESCRIPTION
Fig. 1 schematically illustrates a gated 3D camera 20 being used to acquire a 3D image of a scene 30 having objects schematically represented by objects 31 and 32.
Camera 20, which is represented very schematically, comprises a lens system, represented by a lens 21, and a photosurface 22 having pixels 23 on which the lens system images the scene. Photosurface 22 is "gateable" so that it may be selectively gated on or off to be made sensitive or insensitive respectively to light for desired periods. Optionally, pixels in the photosurface are independently gateable. Photosurfaces that are gated by shutters are described in US Patents 6,057,909, 6,327,073, 6,331,911 and 6,794,628, the disclosures of which are incorporated herein by reference. Photosurfaces having independently gateable pixels are described in PCT Publication WO 00/36372. A "gate" refers to a period during which a photosurface or pixel in the photosurface is gated on and made sensitive to light. For convenience of presentation, it is assumed that all the pixels 23 in photosurface 22 are gated on or off simultaneously. Camera 20 optionally comprises a suitable light source 26, such as for example, a laser or a LED or an array of lasers and/or LEDs, controllable to illuminate scene 30 with pulses of light. A controller 24 controls pulsing of light source 26 and gating of photosurface 22.
To acquire a 3D image of scene 30, controller 24 controls light source 26 to emit at least one light pulse, schematically represented by a wavy arrow 40, to illuminate scene 30. Light from each at least one light pulse 40 is reflected by features in scene 30 and some of the reflected light is incident on camera 20 and collected by lens 21. Reflected light from at least one light pulse 40 that reaches camera 20 is schematically represented by wavy arrows 45. Following emission of each at least one light pulse 40, controller 24 gates photosurface 22 on at a suitable time relative to a time at which the light pulse is emitted to receive and image reflected light 45 collected by lens 21 on photosurface 22. Amounts of light 45 imaged on pixels 23 of the photosurface are used to determine distances to features of scene 30 that are imaged on the pixels and provide thereby a 3D image of the scene.
Fig. 2 shows a time-distance graph 60 that illustrates relationships between timing of a light pulse 40 that illuminates scene 30, a gate of photosurface 22 and amounts of light 45 collected by lens 21 and registered by a pixel 23 in the photosurface that images a feature of scene 30 located at a distance D from camera 20. In graph 60, distance D from camera 20 is indicated along an abscissa 61 and time is indicated along right and left ordinates 62 and 63 respectively. For convenience, the ordinates are scaled in units proportional to "ct" - time "t" multiplied by the speed of light "c".
A time-distance line 70 in graph 60 shows a relationship between distance D of a feature in scene 30 from camera 20 and a time, in units of ct, at which a photon emitted by light source 26 at an arbitrary time t0 = 0 that is reflected by the feature reaches the camera. Since a round trip time, in units of ct, of a photon from light source 26 to the feature and back is given by the equation ct = 2D, line 70 in graph 60 has a slope equal to 2. If the feature is illuminated by photons in a pulse of light, such as light pulse 40 having a pulse width "Δτpw", camera 20 will receive light reflected from the light pulse by the feature for a period having duration Δτpw from a time at which a first photon reflected by the feature from the light pulse reaches the camera.
For example, assume a feature, represented by a circle 71, of object 31 in scene 30 that is illuminated by a pulse of light 40 emitted at time t0 = 0 is located at a distance Df from camera 20. In graph 60, a circle, also labeled with the numeral 71, located on time-distance line 70 having a D coordinate Df schematically represents the feature. Reflected light from feature 71 first reaches camera 20 at a time t1(Df), which satisfies an equation, t1(Df) = 2Df/c. (1)
Light from light pulse 40 reflected by feature 71 continues to be incident on the camera for a period equal to the pulse width Δτpw of the light pulse 40 until a time t2(Df), t2(Df) = 2Df/c + Δτpw, (2) at which time a last reflected photon from light pulse 40 reaches the camera. Block arrows 72 extending from time t1(Df) to time t2(Df) and having a length Δτpw shown along left and right hand time ordinates 62 and 63 schematically represent light reflected by feature 71 that reaches camera 20.
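Equations (1) and (2) amount to a simple round-trip calculation, sketched below in Python (function and variable names are the editor's):

```python
C = 299_792_458.0  # speed of light, m/s

def arrival_window(d_f, pulse_width):
    """Equations (1) and (2): times (seconds, relative to pulse emission at
    t0 = 0) at which the first and last reflected photons from a feature at
    distance d_f (meters) reach the camera."""
    t1 = 2.0 * d_f / C        # equation (1): round trip of the leading edge
    t2 = t1 + pulse_width     # equation (2): trailing edge arrives one pulse width later
    return t1, t2
```

For a feature 15 m away and a 30 ns pulse, reflected light arrives over roughly 100 ns to 130 ns after emission.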
Assume further that controller 24 gates photosurface 22 (Fig. 1) on for a relatively short gate represented by a rectangle 80 that extends from a start time tsg1 to a stop time tsg2 along left hand time ordinate 62 and has a gate width Δτsgw = tsg2 - tsg1. Optionally, Δτsgw = Δτpw.
If short gate 80 is on, as shown in graph 60, during the period from t1(Df) to t2(Df) for which light 72 reflected by feature 71 from light pulse 40 is incident on camera 20, a pixel 23 (Fig. 1) imaging the feature will register light from the feature. The amount of light 72 registered by the pixel imaging feature 71 is useable to determine distance Df and is graphically represented by a shaded area 81 in rectangle 80. Let the amount of registered light 81 be represented in symbols by "Q". Then distance Df of feature 71 from camera 20 is given by, Df = ctsg1/2 + (cΔτpw)(Q/Q0 - 1)/2 if tsg1 ≤ t2(Df) ≤ tsg2, and (3) Df = ctsg1/2 + (cΔτpw)(1 - Q/Q0)/2 if tsg1 ≤ t1(Df) ≤ tsg2. (4)
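Equations (3) and (4) can be read as producing two candidate distances from the normalized short-gate charge Q/Q0; which one applies depends on whether the trailing or leading edge of the reflected pulse fell inside the gate. A minimal sketch (names are the editor's) computes both:

```python
C = 299_792_458.0  # speed of light, m/s

def candidate_distances(q, q0, t_sg1, pulse_width):
    """Both candidate distances allowed by equations (3) and (4) for a
    normalized short-gate charge q/q0, assuming the short gate width
    equals the pulse width and the gate starts at t_sg1 (seconds)."""
    ratio = q / q0
    d_eq3 = C * t_sg1 / 2 + (C * pulse_width) * (ratio - 1) / 2  # equation (3)
    d_eq4 = C * t_sg1 / 2 + (C * pulse_width) * (1 - ratio) / 2  # equation (4)
    return d_eq3, d_eq4
```

For Q/Q0 = 1 the two candidates coincide; otherwise they are distinct, which is the ambiguity discussed in connection with Fig. 3.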
In equations (3) and (4), Q0 is an amount of light that would be registered by pixel 23 imaging feature 71, were all the light reflected from pulse 40 by the feature and collected by lens 21 registered by the pixel, irrespective of whether or not the reflected light reached the camera during short gate 80. Q0 is a function of reflectance of feature 71 and its distance Df from the camera. Generally, Q0 is determined by controlling light source 26 to emit at least one light pulse 40 and for each emitted light pulse 40, gating camera 20 with a long gate so that all light reflected by feature 71 from the light pulse that reaches the camera is imaged on photosurface 22.
Camera 20 receives and registers light reflected from light pulse 40 on its photosurface 22 during short gate 80 for any feature of scene 30 located at a distance Df between lower and upper bound distances, DSL and DSU respectively, from the camera. The upper and lower bound distances define a "slice", schematically indicated in graph 60 by a shaded rectangle 83, of scene 30. Shaded rectangle 83 has diagonal corners that lie at points along time-distance line 70 having distance coordinates DSL and DSU. DSL, DSU and Df satisfy a relationship, DSL = c(tsg1 - Δτpw)/2 ≤ Df ≤ DSU = ctsg2/2. (5)
Slice 83 has a spatial "slice width" SW = (DSU - DSL) = (ctsg2 - ctsg1 + cΔτpw)/2 = cΔτpw, (6) where the last equality holds under the assumption that short gate 80 has a gate width Δτsgw equal to pulse width Δτpw. Because time-distance line 70 has a slope 2, rectangle 83 has an extent along time ordinate 62 equal to 2SW.
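The slice bounds and width of equations (5) and (6) are straightforward to compute; the sketch below is illustrative (names are the editor's):

```python
C = 299_792_458.0  # speed of light, m/s

def slice_bounds(t_sg1, t_sg2, pulse_width):
    """Equations (5) and (6): lower bound DSL, upper bound DSU and slice
    width SW of the slice imaged through a short gate [t_sg1, t_sg2]."""
    d_sl = C * (t_sg1 - pulse_width) / 2   # equation (5), lower bound
    d_su = C * t_sg2 / 2                   # equation (5), upper bound
    return d_sl, d_su, d_su - d_sl         # equation (6): SW = DSU - DSL
```

With Δτsgw = Δτpw the slice width comes out to cΔτpw, e.g. about 3 m for a 10 ns pulse.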
A long gate corresponding to short gate 80 for determining Q0 for any feature located in slice 83 (i.e. any feature imaged with light reflected from a light pulse 40 by photosurface 22 during gate 80) is schematically represented by a rectangle 90 along right hand time ordinate 63. Light registered by pixel 23 that images feature 71 during long gate 90 is graphically represented by a shaded area 91 in gate 90. To assure that Q0 may be determined for any feature of scene 30 for which light is registered on a pixel 23 during short gate 80, long gate 90 optionally has a gate width Δτlgw at least equal to a sum of short gate width Δτsgw and twice the pulse width Δτpw of light pulse 40, i.e.
Δτlgw = Δτsgw + 2Δτpw = 3Δτpw. (7)
The last equality in equation (7) is true under the assumption that short gate 80 has a gate width Δτsgw equal to pulse width Δτpw of at least one pulse 40. Gate 90 optionally has a start time tlg1 relative to time t0 at which light pulse 40 is emitted that is equal to a time that a first photon from a feature in scene 30 having a distance DSL from camera 20 reaches the camera. Gate 90 optionally has a stop time tlg2 equal to a time at which a last photon from pulse 40 reaches camera 20 from a feature of the scene having a distance DSU from camera 20. Time tlg1 is earlier than a start time of short gate 80 by a period equal to Δτpw and tlg2 is later than a stop time of short gate 80 by Δτpw so that tlg1 = (tsg1 - Δτpw) and tlg2 = (tsg2 + Δτpw). (8)
Whereas long gate 90 provides information sufficient to determine Q0 for any feature in slice 83, neither the quantity Q of light 81 registered on the pixel 23 that images feature 71 during short gate 80 nor the quantity Q0 of light 91 registered by the pixel during long gate 90 indicates which of equations (3) or (4) should be used to determine Df. For any feature located in slice 83, and any set of values Q and Q0 for the feature, there are two different distances for the feature in the slice that will result in the same set of values Q and Q0.
For example, for any Df for feature 71 there is another, complementary distance Df* for the feature, also located in slice 83, for which the pixel 23 that images feature 71 would register the same amounts of light during short and long gates 80 and 90 as are registered respectively for the gates for distance Df. In particular, Df and Df* lie at equal distances from the lower and upper slice bounds respectively, and satisfy a relationship: |Df - DSL| = |DSU - Df*|. (9)
Fig. 3 schematically shows distance Df and complementary distance Df* for feature 71 in a time-distance graph 75. Amounts of light registered by pixel 23 for distance Df are schematically shown by shaded areas 81 and 91 for short and long gates 80 and 90 respectively, both of which gates are schematically shown along left time ordinate 62. The position of feature 71 for distance Df* is indicated along time-distance line 70 by a circle 71*. Amounts of light registered by pixel 23 for Df* are schematically represented by shaded areas 81* and 91* for gates 80 and 90, which for distance Df* are shown along right hand time ordinate 63. A block arrow labeled 72* schematically represents light from light pulse 40 reflected from feature 71 that reaches camera 20 were the feature to be located at Df*. It is noted that whereas light is registered for distances Df and Df* by pixel 23 at different times during gates 80 and 90, the amounts of light registered during gate 80 are the same in each case, as are the amounts registered during gate 90.
In accordance with an embodiment of the invention, the ambiguity as to whether to use equation (3) or equation (4) in determining Df is removed by gating photosurface 22 on, following each of at least one light pulse 40, for two short gates: a first, "front" short gate and a second, "back" short gate, optionally having equal gate widths. For example, to determine distance to features of scene 30 in slice 83 having lower and upper bound distances DSL and DSU, photosurface 22 is gated on for a front short gate having a gate width ΔτFsgw at a front gate start time, tFg, following the time at which the light pulse is emitted. The time tFg is given by the equation, tFg = tsg1 = 2DSL/c + Δτpw. (10)
Similarly, the photosurface is gated on for the back short gate, following each of at least one light pulse 40, at a time tBg defined by an equation, tBg = tFg + ΔτFsgw, (11) where, as noted above, ΔτFsgw is the width of the front gate.
Optionally, both short gates have a gate width, Δτsgw, equal to half the light pulse width, so that Δτsgw = Δτpw/2. (12)
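Equations (7), (8) and (10)-(12) together fix the whole gate schedule for a slice once its lower bound DSL and the pulse width are chosen. A minimal Python sketch of that timing follows; the names are illustrative, and the long gate is placed per equation (8) with tsg1 = tFg:

```python
C = 299_792_458.0  # speed of light in m/s

def gate_schedule(d_sl, pulse_width):
    # Front gate start, equation (10): tFg = 2·DSL/c + Δτpw
    t_fg = 2.0 * d_sl / C + pulse_width
    # Back gate start, equation (11), with Δτsgw = Δτpw/2 per equation (12)
    t_bg = t_fg + pulse_width / 2.0
    # Long gate opens one pulse width before the front gate (equation (8))
    # and has width 3·Δτpw, the long-gate width used throughout the text
    t_lg = t_fg - pulse_width
    lg_width = 3.0 * pulse_width
    return t_fg, t_bg, t_lg, lg_width

# Hypothetical slice whose lower bound is 15 m from the camera, 10 ns pulse.
t_fg, t_bg, t_lg, lg_width = gate_schedule(15.0, 10e-9)
```

This is a sketch of the timing relationships only; an actual controller such as controller 24 would issue these delays per emitted pulse.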
Fig. 4 schematically illustrates how two short gates having gate width Δτsgw = Δτpw/2, in accordance with an embodiment of the invention, remove the ambiguity with respect to equations (3) and (4), and thereby whether a feature, such as feature 71 imaged on a pixel 23, is located at a distance Df or Df*. A long gate 90 and two short gates, a front short gate 101 and a back short gate 102, are shown in Fig. 4 along left hand time ordinate 62 of a time-distance graph 85. A shaded area 91 represents light registered by pixel 23 during long gate 90 for distance Df. Shaded areas 103 and 104 in front and back short gates 101 and 102 respectively graphically represent amounts of light registered by pixel 23 for feature 71, located at Df, during the short gates. The same front and back short gates 101 and 102 and long gate 90 are shown along right hand time ordinate 63. Shaded areas 105 and 106 in front and back short gates 101 and 102 shown along right hand time ordinate 63 represent amounts of light registered by pixel 23 during the short gates respectively, were feature 71 located at Df*. A shaded area 91 represents light registered by pixel 23 during long gate 90 for Df*. From Fig. 4 it is readily seen that if feature 71 is located at Df, as shown for short gates 101 and 102 along left hand time ordinate 62, front gate 101 registers a greater amount of light than back gate 102. On the other hand, if feature 71 is located at Df*, as shown for front and back gates 101 and 102 along right hand time ordinate 63, the front gate registers a smaller amount of light than the back gate. If QF and QB represent amounts of light registered by pixel 23 during front and back short gates 101 and 102 respectively, then Df may be determined in accordance with equation (3) if QB < QF and in accordance with equation (4) if QB > QF, i.e.: Df = ctFg/2 + (cΔτpw)(Q/Q0 - 1)/2 if QB < QF; and (13) Df = ctFg/2 + (cΔτpw)(1 - Q/Q0)/2 if QB > QF. (14)
In equations (13) and (14), optionally, Q = (QF + QB).
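As a sketch of how equations (13) and (14) might be applied per pixel, with Q = QF + QB as above (a minimal illustration with hypothetical names, not the specification's implementation):

```python
C = 299_792_458.0  # speed of light in m/s

def feature_distance(q_f, q_b, q0, t_fg, pulse_width):
    # Resolve the front/back ambiguity: QB < QF means the reflected pulse
    # arrived early, i.e. the feature lies in the near half of the slice.
    q = q_f + q_b
    base = C * t_fg / 2.0
    if q_b < q_f:
        return base + (C * pulse_width) * (q / q0 - 1.0) / 2.0  # equation (13)
    return base + (C * pulse_width) * (1.0 - q / q0) / 2.0      # equation (14)
```

For example, a pixel registering QF = Q0/2 and QB = 0 yields a feature one quarter pulse width in distance, cΔτpw/4, nearer than the slice center ctFg/2.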
Typically, to determine QF, QB and Q0 for each pixel 23 in photosurface 22, the at least one light pulse 40 comprises a plurality of light pulses, and each quantity of registered charge QF, QB and Q0 is determined from a train of light pulses 40. For example, to determine QF, controller 24 optionally controls light source 26 to emit a train of light pulses 40. For each light pulse 40, following a time tFg (equation (10)) after the light pulse is emitted, controller 24 gates photosurface 22 on for a short gate 101. A total amount of light registered by each pixel 23 for all short gates 101 is used to provide QF for the pixel. Similarly, to provide QB, another train of light pulses 40 is emitted and, following a delay of tBg (equation (11)) after the emission time of each light pulse, photosurface 22 is gated on for a back gate 102. A total amount of light registered by each pixel for all the light pulses in the pulse train is used to determine QB. Q0 is similarly determined from a pulse train of light pulses 40 and a long gate 90 for each light pulse in the train, which gate 90 follows the light pulse by a delay of tlg1 (equation (8)).
Let a "slice acquisition time", "TS", be a time during which a scene, such as scene 30, is illuminated by light source 26 with light pulses 40 and camera 20 is gated to acquire values for QF, QB and Q0 for pixels 23 in photosurface 22 to provide distances to features in a slice of the scene imaged on the pixels. Often, slices imaged by a gated camera such as camera 20 are relatively narrow, and a scene for which distances are to be determined using the camera typically has features located in a range of distances substantially greater than the range defined by lower and upper bound distances DSL and DSU (Figs. 2-4) of the slice. Therefore, in general, to determine distances to features in a scene, a plurality of substantially contiguous slices of the scene are imaged using camera 20. For each slice, the scene is illuminated by at least one light pulse 40 and camera 20 is gated on for front, back and long gates to acquire distances to features in the slice. If the range over which distances to features in the scene are determined extends from a distance R1 to a distance R2, and a slice has a slice width SW, a plurality of N slices, where N = (R2 - R1)/SW, (15) are required to provide distances to substantially all features in the scene. A total 3D scene acquisition time TT required to image the N slices is
TT = NTS. (16)
By way of example, Fig. 5 schematically shows scene 30 shown in Fig. 1 and a plurality "N" of slices S1, S2 ... SN that are used by camera 20 to provide a 3D image of the scene, which extends from a range R1 to a range R2. A time-distance graph 95 in Fig. 6 shows timing of the front, back and long gates for the slices relative to time t0 = 0 at which a light pulse 40 is emitted by light source 26 to illuminate scene 30. The slices shown in Fig. 5 are represented in graph 95 by shaded rectangles labeled S1, S2 ... SN along time-distance line 70. For convenience, slices represented by Sn having a larger subscript n are farther from camera 20 than slices having a smaller subscript, and slices whose subscripts differ by 1 are contiguous.
The short front and back gates and the long gate that define a given slice Sn are graphically represented by rectangles labeled FGn, BGn and LGn respectively. Gates for adjacent slices Sn are shown on opposite sides of time ordinate 62. By way of example, a first slice S1 in range R1-R2 defined by gates FG1, BG1 and LG1 is slice 83 shown in Fig. 4, and shaded regions 103, 104 and 91 (Fig. 4) graphically representing light registered by a pixel 23 for feature 71 are shown for gates FG1, BG1 and LG1. To provide the contiguous slices, the front gate of any given slice Sn and the front gate of the next farther contiguous slice Sn+1 are delayed relative to each other by a time equal to 2Δτpw = 2SW/c, the spatial slice width of the slices multiplied by 2 and divided by the speed of light.
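The 2Δτpw spacing of successive front gates can be sketched directly. The helper below is hypothetical, used only to illustrate the delay relationship just described:

```python
def front_gate_times(t_fg1, pulse_width, n_slices):
    # Successive front gates are delayed by 2·Δτpw = 2·SW/c relative to
    # each other, so the N slices tile the range contiguously.
    return [t_fg1 + 2.0 * pulse_width * n for n in range(n_slices)]

# Hypothetical example: first front gate at 100 ns, 10 ns pulses, 3 slices.
starts = front_gate_times(100e-9, 10e-9, 3)
```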
For some applications, a total 3D acquisition time TT (equation (16)) required to acquire a 3D image of a scene by 3D imaging contiguous slices as described above can be too long to provide satisfactory 3D imaging of the scene. For example, an acquisition time TT may be too long for scenes having moving features that displace by distances on the order of a slice width during time TT.
The inventors have determined that a total time TT for acquiring a 3D image of a scene can be reduced by modifying timing of the gates used to acquire values for QF, QB and Q0 for slices of the scene, optionally without changing the lengths of the gates. In particular, the inventors have noted that a time delay equal to 2Δτpw temporally separates start times of front gates for contiguous slices, and that long gates for the slices overlap (relative to t0), as shown in Fig. 6. On the other hand, a 3D image of a scene can be acquired, in accordance with an embodiment of the invention, in a reduced total acquisition time TT if long gates of adjacent slices are timed so that they do not overlap and, when one long gate of adjacent slices ends, the other begins.
However, by timing the long gates so that they do not overlap, front gates for adjacent slices are separated by a time 3Δτpw rather than a time 2Δτpw. As a result, adjacent slices defined by front and back short gates are no longer contiguous, but are separated by "interstitial" slices. For the interstitial slices, the front and back short gates do not provide 3D information. In accordance with an embodiment of the invention, the non-overlapping long gates provide additional information for determining distance to features located in the interstitial slices. For convenience of presentation, slices of a scene for which short front and back gates provide information are referred to as "regular slices".
By configuring gating of a 3D gated camera, such as camera 20, so that long gates provide 3D data for interstitial slices, each long gate and its associated front and back short gates provide 3D data for a regular slice and, in addition, 3D data for an interstitial slice. If the front, back and non-overlapping long gates have the same gate widths as the corresponding gates having overlapping long gates shown in Fig. 6 that define a slice of the scene, a regular slice of the scene in accordance with an embodiment of the invention has the same spatial width SW as a slice in Fig. 6. Since a set of front, back and non-overlapping long gates provides 3D data for an interstitial slice in addition to a regular slice, each set of gates that defines a regular slice of the scene provides 3D data for features in a larger range of distances than a corresponding set of gates in Fig. 6. Therefore, a smaller number of sets of gates having non-overlapping long gates is required to acquire a 3D image of a scene than is required when long gates overlap. However, in accordance with an embodiment of the invention, a slice acquisition time for a regular slice is about the same as a slice acquisition time for a slice defined using overlapping gates. A total 3D acquisition time TT for a scene using non-overlapping long gates, in accordance with an embodiment of the invention, is therefore generally shorter than a total acquisition time for the scene using overlapping long gates.
Fig. 7 schematically shows a timing and distance graph 115 that graphically illustrates timing of short front and back, and non-overlapping long gates, in accordance with an embodiment of the invention. Regular slices of scene 30 that are imaged by camera 20 are graphically represented by shaded rectangles labeled RS1, RS2 ... RSN along time-distance line 70. For a given slice RSn, the corresponding front, back and long gates of camera 20 that define the slice are graphically represented by rectangles labeled RFGn, RBGn and RLGn respectively. The gates are graphically shown along left hand time ordinate 62, with gates for adjacent regular slices shown on opposite sides of the time ordinate. Relative to a start time for any given long gate RLGn, the corresponding short front and back gates RFGn, RBGn have the same start times as front and back gates FGn and BGn (Fig. 6) have relative to a start time of a corresponding long gate LGn.
First regular slice RS1 in graph 115 is, by way of example, identical to slice S1 shown in graph 95 of Fig. 6, and relative to an emission time t0 of a first light pulse 40, the start, stop and gate lengths for gates RFG1, RBG1 and RLG1 are the same respectively as for gates FG1, BG1 and LG1 in Fig. 6. Distance to a feature located in a regular slice RSn is determined from amounts of light QF, QB and Q0 registered by a pixel 23 that images the feature during front, back and long gates RFGn, RBGn and RLGn that define the slice, using equations (13) and (14). Amounts of light registered by a pixel that images feature 71 during gates RFG1, RBG1 and RLG1 are graphically represented by shaded areas 103, 104 and 91 respectively in the gates.
Because, in accordance with an embodiment of the invention, a long gate RLGn begins when the preceding long gate RLGn-1 ends, rather than overlapping the preceding long gate as shown in Fig. 6, regular slices RSn are not contiguous but are separated by interstitial slices. Interstitial slices are graphically represented in Fig. 7 by shaded rectangles labeled ISn,n+1, where the subscripts n, n+1 indicate the regular slices RSn and RSn+1 that bracket the interstitial slice. Assuming that short gates RFGn, RBGn have gate widths equal to Δτpw/2 and that long gates RLGn have gate width equal to 3Δτpw, each interstitial slice has a slice width "ISW" equal to one half the slice width of a regular slice, so that ISW = SW/2 = cΔτpw/2. (17)
For a feature located in an interstitial slice ISn,n+1, a pixel 23 that images the feature does not register light reflected by the feature from a light pulse 40 during the short front and back gates of the regular slices that bracket the interstitial slice. As a result, 3D information for the feature is not acquired by camera 20 during the front and back gates, and distance to the feature cannot be determined using conventional equations such as equations (13) and (14). However, whereas the pixel does not register any light reflected from a light pulse 40 by the feature during the short gates, the pixel does register light reflected from a light pulse 40 during the long gates of the regular slices that bracket the interstitial slice. By way of example, a feature of scene 30 located in interstitial slice IS1,2 at a distance Df(121) is schematically represented by a circle labeled 121 along time-distance line 70. Reflected light registered by a pixel 23 that images the feature during long gates RLG1 and RLG2 is graphically indicated by shaded regions 122 and 124 respectively. In accordance with an embodiment of the invention, light registered during the long gates is used to determine distance to the feature.
Let the light reflected from a light pulse 40 by a feature in an interstitial slice ISn,n+1 of scene 30 and registered during long gates RLGn and RLGn+1 be represented by QL(n) and QL(n+1) respectively. If the times at which long gates RLGn and RLGn+1 begin are represented by tLg(n) and tLg(n+1) respectively, distance Df to the feature is determined, in accordance with an embodiment of the invention, in accordance with equations
Df = ctLg(n+1)/2 - (cΔτpw)(QL(n)/Q0(n))/2; or (18) Df = ctLg(n+1)/2 - (cΔτpw)(1 - QL(n+1)/Q0(n))/2. (19)
In equations (18) and (19), Q0(n) = (QL(n) + QL(n+1)).
For a camera, such as camera 20, used to determine distances to features of a scene and gated in accordance with an embodiment of the invention, let amounts of light registered by a pixel that images a feature in the scene during gates RFGn, RBGn be represented by QF(n) and QB(n) respectively. Let times at which the gates RFGn and RBGn are turned on be represented by tFg(n) and tBg(n) respectively. Then distance Df to the feature may be determined, in accordance with an embodiment of the invention, using the following general set of conditions and equations. If QF(n) ≠ 0 or QB(n) ≠ 0, (20)
Df = ctFg(n)/2 + (cΔτpw)[(QF(n) + QB(n))/QL(n) - 1]/2 if QB(n) < QF(n); and (21)
Df = ctFg(n)/2 + (cΔτpw)[1 - (QF(n) + QB(n))/QL(n)]/2 if QB(n) > QF(n). (22)
If QF(n) = QB(n) = 0 and QL(n) ≠ 0 and QL(n+1) ≠ 0, (23)
Df = ctLg(n+1)/2 - (cΔτpw)(QL(n)/Q0(n))/2; or (24) Df = ctLg(n+1)/2 - (cΔτpw)(1 - QL(n+1)/Q0(n))/2. (25)
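The conditions and equations (20)-(25) can be collected into a single per-pixel decision procedure. The sketch below is illustrative only (argument names are hypothetical); it returns None in the border case discussed next, where only one long gate registers light. The interstitial branch follows equation (18), with tLg(n+1) the start time of long gate RLGn+1:

```python
C = 299_792_458.0  # speed of light in m/s

def general_distance(q_f, q_b, q_l_n, q_l_n1, t_fg_n, t_lg_n1, pulse_width):
    if q_f != 0 or q_b != 0:
        # Condition (20): the feature lies in the regular slice RSn.
        ratio = (q_f + q_b) / q_l_n
        if q_b < q_f:
            return C * t_fg_n / 2.0 + (C * pulse_width) * (ratio - 1.0) / 2.0  # (21)
        return C * t_fg_n / 2.0 + (C * pulse_width) * (1.0 - ratio) / 2.0      # (22)
    if q_l_n != 0 and q_l_n1 != 0:
        # Condition (23): the feature lies in the interstitial slice ISn,n+1.
        q0 = q_l_n + q_l_n1
        return C * t_lg_n1 / 2.0 - (C * pulse_width) * (q_l_n / q0) / 2.0      # per (18)
    return None  # border case: only one long gate registered light
```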
It is noted that for features located at a distance exactly at the border between a regular slice and an adjacent interstitial slice, light is registered by a pixel 23 imaging the feature only during the long gate that defines the regular slice. Furthermore, there is an ambiguity as to whether the amount of light registered by the pixel corresponds to the feature being located at the border between the regular slice and the following adjacent interstitial slice or the preceding adjacent interstitial slice. Generally, such a situation arises only for a very few features in a scene and in some embodiments of the invention features for which this occurs are ignored.
In some embodiments of the invention, adjacent long gates are timed, relative to time t0, to "overlap" slightly to remove the ambiguity. For example, a long gate may have a start time earlier than the stop time of the immediately previous long gate by a period equal to or less than about a tenth of a pulse width. Optionally, to provide the overlap, the long gates are lengthened by the period of the overlap.
From the above discussion and Fig. 7 it is seen that by gating a 3D camera in accordance with an embodiment of the invention, for each set of gates RFGn, RBGn and RLGn, 3D data is acquired for a regular slice RSn of scene 30 having the same width SW as a slice Sn shown in Fig. 6. However, in addition to acquiring 3D data for a regular slice, a set of gates RFGn, RBGn and RLGn together with long gate RLGn+1 provides 3D information for an interstitial slice ISn,n+1 of the scene. For a sufficiently large number N of sets of gates RFGn, RBGn and RLGn, each set of gates may be considered to acquire 3D data for a regular slice and an interstitial slice. Since each regular slice RSn has a slice width SW and each interstitial slice ISn,n+1 has a width ISW = SW/2, a set of gates RFGn, RBGn and RLGn may be considered to acquire 3D data for an "enlarged slice" having an enlarged slice width,
ESW = SW + ISW = (3/2)SW. (26)
To acquire a 3D image of scene 30 in accordance with an embodiment of the invention, camera 20 is gated on for N′ sets of gates RFGn, RBGn and RLGn, where N′ = (R2 - R1)/ESW = (R2 - R1)/[(3/2)SW]. (27) By comparing N′ to the number of slices, N, defined by equation (15) above, it is seen that N′ = (2/3)N. (28)
Assume that a slice acquisition time TS for a regular slice, in accordance with an embodiment of the invention, is the same as a slice acquisition time for a slice acquired in accordance with the gating illustrated in Fig. 6. A total time, TT, to acquire data for a 3D image of scene 30 in accordance with an embodiment of the invention is therefore 2/3 the time required to 3D image the scene in accordance with the gating shown in Fig. 6.
In some prior art 3D gating algorithms, distances to features in a slice of a scene, such as scene 30, are determined using a short, "extended", front gate having a gate width equal to the pulse width Δτpw of light pulses 40 (Fig. 1) that illuminate the scene. The extended front gate and a corresponding long gate begin at the same time relative to an emission time t0 of respective light pulses 40 that illuminate the scene and provide light that is reflected from the scene and imaged by camera 20 during the gates.
By way of example, Fig. 8 shows a time-distance graph 200 that illustrates timing of an extended front gate 202 and its corresponding long gate 204 relative to an emission time t0 of a light pulse 40 that illuminates scene 30. Front gate 202 is shown along left hand time ordinate 62 and long gate 204 is shown along right hand time ordinate 63. Gates 202 and 204 begin at the same time relative to time t0. Whereas long gate 90 shown in Figs. 2-4 optionally has a gate width equal to 3Δτpw, long gate 204 used with extended front gate 202 optionally has a shorter gate width 2Δτpw. Extended front gate 202 and its associated long gate 204 define a slice of scene 30 having a spatial width cΔτpw/2. The slice defined by the gates is graphically represented by a shaded rectangle 206 along time-distance line 70 in Fig. 8.
Let the time at which extended front gate 202 and its corresponding long gate begin following a light pulse emission time t0 be represented by tsg. Then light acquired by pixels 23 in photosurface 22 during the gates may be used to determine distances Df to features in a slice of the scene having lower and upper distance bounds DSL, DSU, where DSL, DSU and Df satisfy a relationship, DSL = ctsg/2 ≤ Df ≤ DSU = [ctsg + cΔτpw]/2. (29)
Slice 206 has a spatial "slice width" SW = (DSU - DSL) = [cΔτpw]/2. (30)
Light registered by a pixel 23 that images a feature 71 of scene 30 located in slice 206 during extended front gate 202 and long gate 204 is graphically represented by shaded areas 208 and 210 respectively in the gates. From Fig. 8 it may be seen that for any feature located in slice 206, an amount of light 210 registered by pixel 23 is substantially all the light reflected by feature 71 that reaches camera 20. However, for such a feature, an amount of light 208 registered by the pixel during extended front gate 202 depends on the location of the feature in slice 206. Let the amounts of registered light 208 and 210 be represented by "Q" and "Q0" respectively. Then distance Df of feature 71 from camera 20 is given by, Df = ctsg/2 + (cΔτpw)(1 - Q/Q0)/2. (31) It is noted that unlike for the configuration of gates shown in Figs. 2 and 3, for the configuration of gates shown in Fig. 8 there is no ambiguity as to the location of a feature, such as feature 71, in slice 206. However, pixel 23 that images the feature will register light during both extended front gate 202 and long gate 204 for features that are not in slice 206 but are located just before slice 206, at distances Df′ that satisfy an equation, ctsg/2 - (cΔτpw)/2 < Df′ < ctsg/2. (32)
For all such features, an amount of light registered during extended front gate 202 is equal to an amount of light registered during long gate 204, the ratio Q/Q0 is equal to 1, and equation (31) provides the same distance Df = ctsg/2. In some prior art methods, to prevent erroneous distance measurements for features outside of slice 206, distances Df are not determined for pixels for which Q/Q0 = 1. In general, since there are relatively few features for which Q/Q0 = 1, ignoring such features does not substantially adversely affect providing a complete 3D image of the slice.
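A sketch of equation (31), including the Q/Q0 = 1 rejection just described (illustrative names, not taken from the specification):

```python
C = 299_792_458.0  # speed of light in m/s

def extended_front_gate_distance(q, q0, t_sg, pulse_width):
    # Q/Q0 == 1 is ambiguous: the feature may lie just in front of the
    # slice (equation (32)), so such pixels are skipped.
    if q == q0:
        return None
    return C * t_sg / 2.0 + (C * pulse_width) * (1.0 - q / q0) / 2.0  # (31)
```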
Distances to features in slice 206 may also be determined in accordance with prior art using an extended back gate instead of an extended front gate. The extended back gate has a gate width equal to that of an extended front gate but ends at the same time, relative to a light pulse emission time t0, as a corresponding long gate, instead of beginning at the same time as the long gate. An extended back gate may be considered a mirror image of an extended front gate. Fig. 9 shows a time-distance graph 220 that illustrates timing of an extended back gate 222 and its corresponding long gate 224 relative to an emission time t0 of a light pulse 40 that illuminates scene 30. Extended back gate 222 is shown along left hand time ordinate 62 and long gate 224 is shown along right hand time ordinate 63. Gates 222 and 224 end at the same time tBg relative to time t0. Extended back gate 222 and its associated long gate 224 define the same slice 206 of scene 30 that extended front gate 202 and long gate 204 shown in Fig. 8 define if tBg = tsg + 2Δτpw. Light reflected from feature 71 registered by pixel 23 that images the feature during extended back and long gates 222 and 224 is represented by shaded areas 226 and 228 respectively. Distance Df to a feature in slice 206 is given by
Df = ctBg/2 + (cΔτpw)(Q/Q0 - 2)/2, (33) where "Q" and "Q0" represent amounts of registered light 226 and 228 respectively. As in the case for gate configurations similar to that shown in Fig. 6, configurations of an extended front gate 202 (Fig. 8) or an extended back gate 222 (Fig. 9), together with a long gate 204 or 224 respectively, may be repeated to acquire distances to features in a plurality of optionally contiguous slices of scene 30. If a range over which distances to features in the scene are determined extends from a distance R1 to a distance R2, and a slice has a slice width SW, a plurality of N slices, where
N = (R2 - R1)/SW, (34)
are needed to acquire a 3D image of the scene. Let a "gate acquisition time" ΔTg be the time required for a pixel 23 to register an amount of light for an extended front gate, extended back gate or a long gate suitable for use in determining a distance Df to a feature of a scene imaged on the pixel. A 3D acquisition time for a slice of the scene may then be written
TS = 2ΔTg, (35) and a total acquisition time for the scene be written TT = NTS = 2NΔTg. (36)
Fig. 10 shows a time-distance graph 240 that illustrates temporal relationships of a plurality of extended back gates and associated long gates that are used to acquire 3D images of slices of a scene, such as scene 30.
A first extended back gate 242 and its associated long gate 244 are shown along left time ordinate 62 and define a slice 246. Amounts of light reflected from a light pulse 40 (Fig. 1) from a feature 248 in slice 246 that are registered by a pixel imaging the feature during extended back gate 242 and long gate 244 are shown as shaded areas 249 and 250 respectively in the gates. A second extended back gate 252 and its associated long gate 254 are shown along right time ordinate 63 and define a slice 256. By way of example, slice 246 in scene 30 is assumed contiguous with slice 256 in the scene, and as a result the slices touch at a corner in Fig. 10.
Amounts of light reflected from a light pulse 40 (Fig. 1) from a feature 258 in slice 256 that are registered by a pixel 23 imaging the feature during extended back gate 252 and long gate 254 are shown as shaded areas 259 and 260 respectively in the gates.
The inventors have realized that if only long gates are used to acquire a plurality of slices of a scene, a total acquisition time, TT, to acquire data for a 3D image of scene 30 may be reduced relative to the total acquisition time required using an extended front or back gate and an associated long gate.
Fig. 11 shows a time-distance graph 280 that illustrates temporal relationships of a plurality of long gates used to acquire a 3D image of a scene, in accordance with an embodiment of the invention. A sequence of long gates LG1, LG2, LG3 ... used to acquire a 3D image of scene 30 is shown along left time ordinate 62. For clarity of presentation, gates labeled with an odd subscript, hereinafter "odd gates", and gates labeled with an even subscript, hereinafter "even gates", are shown on opposite sides of left time ordinate 62. Gates LG1 and LG2 are also repeated along right time ordinate 63. Each pair of sequential odd and even gates defines a spatial slice of scene 30. For example, gates LG1 and LG2 define a slice located along time-distance line 70 labeled SL1,2 in Fig. 11. Light registered by pixels 23 of photosurface 22 during gates LG1 and LG2 may be used, in accordance with an embodiment of the invention, to determine distance to all features located in slice SL1,2. Similarly, the gate pair (LG3, LG4) defines slice SL3,4 shown in Fig. 11. In an embodiment of the invention, even and odd gates have the same gate width,
LGW = 2Δτpw, (37) and each even and odd gate begins at a time (relative to t0) that is half a gate width, i.e. Δτpw, later than the time at which the preceding odd or even gate respectively begins. As a result, a slice SLn,n+1 defined by two long gates has a spatial width, SW = cΔτpw. (38)
Features that are located in slices SL1,2 and SL3,4 and are imaged by camera 20 are schematically indicated by circles 281, 282 and 283 along time-distance line 70. Amounts of light registered by pixels 23 that respectively image features 281, 282 and 283 during the gates are graphically indicated by shaded regions in the respective gates. Amounts of light registered for feature 282 are shown in gates LG1 and LG2 along right hand time ordinate 63.
In accordance with an embodiment of the invention, if a pixel 23 registers amounts of light Qn and Qn+1 during gates LGn and LGn+1 respectively, distance Df to the feature imaged by the pixel is determined in accordance with the following equations,
Df = (ctn/2) + cΔτpw(Qn+1/Qn)/2 if Qn+1 < Qn; (39) Df = (ctn+1/2) + cΔτpw(1 - Qn/Qn+1)/2 if Qn < Qn+1. (40)
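A sketch of equations (39) and (40) for a pair of overlapping long gates; the names are illustrative, with t_n1 denoting tn+1 = tn + Δτpw:

```python
C = 299_792_458.0  # speed of light in m/s

def long_gate_pair_distance(q_n, q_n1, t_n, t_n1, pulse_width):
    if q_n1 < q_n:
        # Equation (39): the pulse returns during the first half of LGn.
        return C * t_n / 2.0 + (C * pulse_width) * (q_n1 / q_n) / 2.0
    # Equation (40): the pulse returns during the first half of LGn+1.
    return C * t_n1 / 2.0 + (C * pulse_width) * (1.0 - q_n / q_n1) / 2.0
```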
From the above discussion and Fig. 11, it is seen that every two long gates provide 3D information for a slice of a scene having spatial width equal to cΔτpw, and that η long gates LGn provide 3D information for features having distances Df in a range from R1 to R2 for which (R2 - R1) = ηcΔτpw/2, so that η = 2(R2 - R1)/cΔτpw. (41)
Assuming a gate acquisition time ΔTg, a total acquisition time for a scene bounded by distances R1 and R2 may therefore be written,
TT = ηΔTg = 2(R2 - R1)ΔTg/cΔτpw. (42)
A total acquisition time for the scene, in accordance with an embodiment of the invention is therefore one half the prior art acquisition time given by equation (36).
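The factor-of-two claim follows directly from equations (36) and (42). A small numeric check under assumed values (helper names are illustrative, not from the specification):

```python
C = 299_792_458.0  # speed of light in m/s

def total_time_extended(r1, r2, pulse_width, gate_time):
    # Equation (36): TT = 2·N·ΔTg with N = (R2 - R1)/SW and SW = c·Δτpw/2
    n = (r2 - r1) / (C * pulse_width / 2.0)
    return 2.0 * n * gate_time

def total_time_long_only(r1, r2, pulse_width, gate_time):
    # Equation (42): TT = η·ΔTg with η = 2(R2 - R1)/(c·Δτpw), equation (41)
    eta = 2.0 * (r2 - r1) / (C * pulse_width)
    return eta * gate_time

# Hypothetical scene from 10 m to 40 m, 10 ns pulses, 1 ms per gate.
ratio = total_time_long_only(10.0, 40.0, 10e-9, 1e-3) / total_time_extended(
    10.0, 40.0, 10e-9, 1e-3)
```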
It is further noted that for the "interstitial slice" gating method in accordance with an embodiment of the invention schematically illustrated in Fig. 7, as the number of slices used to provide a 3D image of a scene bounded by distances R1 and R2 increases, the total number of gates required to image the scene approaches that given by equation (41). As a result, for the same scene, a total 3D acquisition time using the gating configuration schematically illustrated in Fig. 7 approaches that using the gating configuration schematically illustrated in Fig. 11.
In the description and claims of the present application, each of the verbs "comprise", "include" and "have", and conjugates thereof, is used to indicate that the object or objects of the verb are not necessarily an exhaustive listing of members, components, elements or parts of the subject or subjects of the verb.
The invention has been described with reference to embodiments thereof that are provided by way of example and are not intended to limit the scope of the invention. The described embodiments comprise different features, not all of which are required in all embodiments of the invention. Some embodiments of the invention utilize only some of the features or possible combinations of the features. Variations of embodiments of the described invention and embodiments of the invention comprising different combinations of features than those noted in the described embodiments will occur to persons of the art. The scope of the invention is limited only by the following claims.

Claims

1. A method of acquiring a 3D image of a scene, the method comprising: transmitting at least one light pulse having a pulse width at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of sets of gates, each set comprising at least one first, second and third gate having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of the at least one first gate and the at least one second gate are between the start and stop times of the at least one third gate, wherein for at least one of the sets of gates, the at least one third gate has a start time equal to about the stop time of the at least one third gate of another of the sets of gates; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.
2. A method according to claim 1 wherein the start time of the second gate is substantially equal to the stop time of the first gate.
3. A method according to claim 2 wherein the start time of the first gate is delayed relative to the start time of the third gate by the pulse width of the at least one light pulse.
4. A method according to claim 3 wherein the first and second gates have equal gate widths.
5. A method according to claim 4 wherein the gate width of the first and second gates is equal to half a pulse width of the at least one light pulse.
6. A method according to claim 5 wherein the gate width of the third gate is substantially equal to three pulse widths of the at least one light pulse.
7. A method according to claim 6 wherein, if amounts of reflected light from a feature imaged during the at least one first, second and third gates of a set of gates are denoted by Q1, Q2, Q3 respectively and the start time of the at least one first gate by t1, then if Q1 ≠ 0 or Q2 ≠ 0, distance Df to the feature is determined in accordance with
Df = ct1/2 + (cΔτ)[(Q1 + Q2)/Q3 - 1]/2 if Q1 > Q2 and
Df = ct1/2 + (cΔτ)[1 - (Q1 + Q2)/Q3]/2 if Q1 < Q2, where c is the speed of light and Δτ is the pulse width of the at least one light pulse.
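The branch formulas above can be checked for self-consistency with a simple rectangular-pulse model. The sketch below is illustrative only: the gate timing follows claims 3-6 (first gate opening one pulse width after the third, first and second gates each half a pulse width wide), and all function names and numeric values are assumptions, not the patent's implementation.

```python
# Sketch of the three-gate distance recovery, assuming a rectangular light
# pulse of width dt and the gate timing of claims 3-6: first gate opens at t1,
# first and second gates are each dt/2 wide, the third gate captures the
# whole pulse. Names and numbers are illustrative.

C = 3.0e8  # speed of light, m/s

def distance_three_gates(q1, q2, q3, t1, dt):
    """Recover feature distance from charges Q1, Q2, Q3."""
    q = q1 + q2
    if q1 >= q2:   # pulse arrives before t1: its tail overlaps the short gates
        return C * t1 / 2 + (C * dt) * (q / q3 - 1.0) / 2
    else:          # pulse arrives after t1: its head overlaps the short gates
        return C * t1 / 2 + (C * dt) * (1.0 - q / q3) / 2

def forward_model(df, t1, dt):
    """Charges a rectangular pulse reflected from distance df would deposit."""
    ta = 2.0 * df / C                      # arrival time of the reflected pulse
    def overlap(gs, ge):                   # overlap of [ta, ta+dt] with [gs, ge]
        return max(0.0, min(ta + dt, ge) - max(ta, gs))
    q1 = overlap(t1, t1 + dt / 2)
    q2 = overlap(t1 + dt / 2, t1 + dt)
    q3 = dt                                # third gate captures the whole pulse
    return q1, q2, q3

# Round trip: model a feature at 9.3 m, then recover its distance.
t1, dt = 60e-9, 10e-9
q1, q2, q3 = forward_model(9.3, t1, dt)
print(round(distance_three_gates(q1, q2, q3, t1, dt), 6))
```

The round trip (forward model, then inversion) returning the modeled distance is what makes the branch conditions Q1 > Q2 / Q1 < Q2 consistent with the two formulas.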
8. A method according to claim 6 or claim 7 wherein, if amounts of reflected light from a feature imaged during the at least one first, second and third gates of a set of gates are denoted by Q1, Q2, Q3 respectively, the start time of the at least one third gate by t3, and the amount of light from the feature imaged by the at least one third gate of the other set of gates by Q*3, then if Q1 = Q2 = 0, Q3 ≠ 0 and Q*3 ≠ 0, distance Df to the feature is determined in accordance with
Df = ct3/2 - (cΔτ)(Q*3/(Q3 + Q*3))/2 or
Df = ct3/2 - (cΔτ)(1 - Q3/(Q3 + Q*3))/2, where c is the speed of light and Δτ is the pulse width of the at least one light pulse.
9. A method according to any of the preceding claims wherein the at least one third gate has a start time earlier than the stop time of the at least one third gate of the other set of gates by a period that is less than or equal to about one twentieth of the pulse width of the at least one light pulse.
10. A method of acquiring a 3D image of a scene, the method comprising: transmitting at least one light pulse at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of equal length gates having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start time of at least one first gate is substantially a time half way between the start and stop times of at least one second gate; and using amounts of reflected light imaged on the photosurface during the at least one first gate and the at least one second gate to determine distance to a feature of the scene.
11. A method according to claim 10 wherein the gates have a gate width substantially equal to twice a pulse width of the at least one light pulse.
12. A method according to claim 10 or claim 11 wherein, if the start time and amount of reflected light from the feature imaged during the at least one first gate are denoted respectively by t1 and Q1, and for the at least one second gate by t2 and Q2 respectively, distance Df to the feature is determined in accordance with Df = (ct1/2) + cΔτ(Q2/Q1)/2 if Q2 ≤ Q1 and Df = (ct2/2) + cΔτ(1 - (Q1/Q2))/2 if Q1 ≤ Q2, where c is the speed of light and Δτ is the pulse width of the at least one light pulse.
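The two-gate recovery can likewise be sketched. The geometry assumed here is two gates of width twice the pulse width whose openings are one pulse width apart (one reading of claim 10's half-overlap); function names and numeric values are illustrative, not from the patent.

```python
# Sketch of the two half-overlapping equal gates: gate width 2*dt, second
# gate opening dt after the first. Rectangular pulse assumed; names and
# numbers are illustrative.

C = 3.0e8  # speed of light, m/s

def distance_two_gates(q1, q2, t1, t2, dt):
    """Recover distance from the charge ratio of two overlapping gates."""
    if q2 <= q1:
        return C * t1 / 2 + (C * dt) * (q2 / q1) / 2
    return C * t2 / 2 + (C * dt) * (1.0 - q1 / q2) / 2

def charges(df, t1, t2, dt):
    """Charges a rectangular pulse from distance df deposits in each gate."""
    ta = 2.0 * df / C
    def overlap(gs, ge):
        return max(0.0, min(ta + dt, ge) - max(ta, gs))
    return overlap(t1, t1 + 2 * dt), overlap(t2, t2 + 2 * dt)

# Round trip: model a feature at 9.9 m, then recover its distance.
t1, dt = 60e-9, 10e-9
t2 = t1 + dt
q1, q2 = charges(9.9, t1, t2, dt)
print(round(distance_two_gates(q1, q2, t1, t2, dt), 6))
```

The two branches meet continuously at Q1 = Q2, where both give Df = ct2/2, which is the consistency check the claimed pair of formulas must satisfy.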
13. A method of acquiring a 3D image of a scene, the method comprising: transmitting at least one light pulse having a pulse width at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a set of gates comprising at least one first, second and third gates having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of each first gate and each second gate are between the start and stop times of a same third gate; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.
14. A method according to claim 13 wherein the first gate has a start time delayed relative to a start time of the third gate by about the pulse width.
15. A method according to claim 13 or claim 14 wherein the second gate has a stop time that precedes a stop time of the at least one third gate by about a pulse width.
16. A method according to any of claims 13-14 wherein the first gate has a gate width equal to about half a pulse width.
17. A method according to any of claims 13-16 wherein the second gate has a gate width equal to about half a pulse width.
18. A method according to any of claims 13-17 wherein the third gate has a gate width equal to about three pulse widths.
19. A method according to claim 14 wherein the first and second gates have gate widths equal to half a pulse width and the third gate has a gate width equal to about three pulse widths, and wherein, if amounts of reflected light from a feature imaged during the at least one first, second and third gates are denoted by Q1, Q2, Q3 respectively, and the start time of the first gate by t1, distance Df to the feature is determined in accordance with
Df = ct1/2 + (cΔτ)(Q/Q3 - 1)/2 if Q2 < Q1; and
Df = ct1/2 + (cΔτ)(1 - Q/Q3)/2 if Q1 < Q2, where c is the speed of light, Δτ is the pulse width of the at least one light pulse and
Q = (Q1 + Q2).
20. A camera useable to acquire a 3D image of a scene comprising: a gateable photosurface; and a controller that gates the photosurface in accordance with any of claims 1-19.
21. A camera according to claim 20 and comprising a light source controllable to illuminate the scene with a pulse of light.
PCT/IL2007/001571 2007-12-19 2007-12-19 3d camera and methods of gating thereof WO2009078002A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP07849597A EP2235563A1 (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof
CN2007801023367A CN102099703A (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof
PCT/IL2007/001571 WO2009078002A1 (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IL2007/001571 WO2009078002A1 (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof

Publications (1)

Publication Number Publication Date
WO2009078002A1 true WO2009078002A1 (en) 2009-06-25

Family

ID=39797928

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2007/001571 WO2009078002A1 (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof

Country Status (3)

Country Link
EP (1) EP2235563A1 (en)
CN (1) CN102099703A (en)
WO (1) WO2009078002A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9826214B2 (en) * 2014-09-08 2017-11-21 Microsoft Technology Licensing, Llc. Variable resolution pixel
US9874630B2 (en) * 2015-01-30 2018-01-23 Microsoft Technology Licensing, Llc Extended range gated time of flight camera
US10708577B2 (en) * 2015-12-16 2020-07-07 Facebook Technologies, Llc Range-gated depth camera assembly

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003054579A1 (en) * 2001-12-06 2003-07-03 Astrium Gmbh Method and device for producing 3d range images
US20070091175A1 (en) * 1998-09-28 2007-04-26 3Dv Systems Ltd. 3D Vision On A Chip

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100524015C (en) * 1995-06-22 2009-08-05 3Dv系统有限公司 Method and apparatus for generating range subject distance image
WO2001018563A1 (en) * 1999-09-08 2001-03-15 3Dv Systems, Ltd. 3d imaging system
US7236235B2 (en) * 2004-07-06 2007-06-26 Dimsdale Engineering, Llc System and method for determining range in 3D imaging systems
EP1659418A1 (en) * 2004-11-23 2006-05-24 IEE INTERNATIONAL ELECTRONICS &amp; ENGINEERING S.A. Method for error compensation in a 3D camera
CN100337122C (en) * 2005-03-25 2007-09-12 浙江大学 Pulse modulation type three-dimensional image-forming method and system containing no scanning device
CN100462737C (en) * 2006-06-29 2009-02-18 哈尔滨工业大学 Distance gate type laser 3D imaging radar system


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8264581B2 (en) 2008-07-17 2012-09-11 Microsoft International Holdings B.V. CMOS photogate 3D camera system having improved charge sensing cell and pixel geometry
US8890952B2 (en) 2008-07-29 2014-11-18 Microsoft Corporation Imaging system
US8681321B2 (en) 2009-01-04 2014-03-25 Microsoft International Holdings B.V. Gated 3D camera
US9641825B2 (en) 2009-01-04 2017-05-02 Microsoft International Holdings B.V. Gated 3D camera
US20150109414A1 (en) * 2013-10-17 2015-04-23 Amit Adam Probabilistic time of flight imaging
KR20160071390A (en) * 2013-10-17 2016-06-21 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Probabilistic time of flight imaging
CN105723238A (en) * 2013-10-17 2016-06-29 微软技术许可有限责任公司 Probabilistic time of flight imaging
US10063844B2 (en) * 2013-10-17 2018-08-28 Microsoft Technology Licensing, Llc. Determining distances by probabilistic time of flight imaging
KR102233419B1 (en) 2013-10-17 2021-03-26 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Probabilistic time of flight imaging

Also Published As

Publication number Publication date
EP2235563A1 (en) 2010-10-06
CN102099703A (en) 2011-06-15

Similar Documents

Publication Publication Date Title
EP2235563A1 (en) 3d camera and methods of gating thereof
US20220128663A1 (en) Methods and apparatus for array based lidar systems with reduced interference
JP6910010B2 (en) Distance measuring device
CN102147553B (en) Method for fast gating photosurface, method for determining distance to the feature of field and photo camera
KR101992511B1 (en) 3d zoom imager
JP5647118B2 (en) Imaging system
KR102233419B1 (en) Probabilistic time of flight imaging
US8593507B2 (en) Rolling camera system
US8681321B2 (en) Gated 3D camera
JP2022505772A (en) Time-of-flight sensor with structured light illumination
KR102559910B1 (en) A system for characterizing the environment around the vehicle
JP2021532648A (en) Hybrid time-of-flight imager module
EP3227721B1 (en) Distance measuring device and method for determining a distance
CN101446641B (en) Distance measurement system and distance measurement method
JP6526178B2 (en) Imaging system for monitoring a field of view and method for monitoring a field of view
JP2022551427A (en) Method and apparatus for determining distance to scene
WO2021065138A1 (en) Distance measurement device and control method
CN112470035A (en) Distance information acquisition device, distance information acquisition method, and program
CN104049258B (en) A kind of extraterrestrial target stereoscopic imaging apparatus and method
KR102656399B1 (en) Time-of-flight sensor with structured light illuminator
KR20220155362A (en) Apparatus and method for acquiring image data
CN116964486A (en) Door control camera, sensing system for vehicle and lamp for vehicle

Legal Events

Date Code Title Description
WWE WIPO information: entry into national phase
Ref document number: 200780102336.7
Country of ref document: CN
121 EP: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 07849597
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
WWE WIPO information: entry into national phase
Ref document number: 5196/DELNP/2010
Country of ref document: IN
Ref document number: 2007849597
Country of ref document: EP