US20150253428A1 - Determining positional information for an object in space - Google Patents

Determining positional information for an object in space

Info

Publication number
US20150253428A1
Authority
US
United States
Prior art keywords
light
region
determining
emission
light source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/212,485
Inventor
David Holz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ultrahaptics IP Two Ltd
LMI Liquidating Co LLC
Original Assignee
Leap Motion Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US14/212,485 priority Critical patent/US20150253428A1/en
Application filed by Leap Motion Inc filed Critical Leap Motion Inc
Priority to US14/280,018 priority patent/US9679215B2/en
Assigned to LEAP MOTION, INC. reassignment LEAP MOTION, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HOLZ, David
Publication of US20150253428A1 publication Critical patent/US20150253428A1/en
Assigned to TRIPLEPOINT CAPITAL LLC reassignment TRIPLEPOINT CAPITAL LLC SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LEAP MOTION, INC.
Assigned to THE FOUNDERS FUND IV, LP reassignment THE FOUNDERS FUND IV, LP SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LEAP MOTION, INC.
Priority to US15/392,920 priority patent/US9778752B2/en
Priority to US15/696,086 priority patent/US10691219B2/en
Assigned to LEAP MOTION, INC. reassignment LEAP MOTION, INC. TERMINATION OF SECURITY AGREEMENT Assignors: THE FOUNDERS FUND IV, LP, AS COLLATERAL AGENT
Assigned to LEAP MOTION, INC. reassignment LEAP MOTION, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: TRIPLEPOINT CAPITAL LLC
Assigned to Ultrahaptics IP Two Limited reassignment Ultrahaptics IP Two Limited ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LMI LIQUIDATING CO., LLC.
Assigned to LMI LIQUIDATING CO., LLC. reassignment LMI LIQUIDATING CO., LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LEAP MOTION, INC.
Assigned to LMI LIQUIDATING CO., LLC reassignment LMI LIQUIDATING CO., LLC SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Ultrahaptics IP Two Limited
Assigned to TRIPLEPOINT CAPITAL LLC reassignment TRIPLEPOINT CAPITAL LLC SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LMI LIQUIDATING CO., LLC
Priority to US16/908,643 priority patent/US11493998B2/en
Priority to US17/862,212 priority patent/US11720180B2/en
Priority to US18/209,259 priority patent/US20230325005A1/en

Classifications

    • CPC G01S: Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves (section G: Physics; class G01: Measuring; Testing)
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/36: Systems determining position data of a target, for measuring distance only, using transmission of continuous waves (whether amplitude-, frequency-, or phase-modulated, or unmodulated) with phase comparison between the received signal and the contemporaneously transmitted signal
    • G01S 17/003: Bistatic lidar systems; multistatic lidar systems
    • G01S 17/026
    • G01S 17/04: Systems determining the presence of a target
    • G01S 17/42: Simultaneous measurement of distance and other co-ordinates
    • G01S 17/48: Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves (indirect determination of position data)
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G01S 7/4815: Constructional features, e.g. arrangements of optical elements, of transmitters alone, using multiple transmitters
    • G01S 7/483: Details of pulse systems (details of systems according to group G01S 17/00)

Definitions

  • Embodiments of the disclosed technology generally relate to distance measurement and, more particularly, to determining a position and/or depth of an object or features of an object surface in space.
  • Determining the distance of objects from an observation point, or tracking objects in three dimensions is important for many applications, including, e.g., surveying, navigation, surveillance, focus finding in photography, and motion capture for gesture- and movement-controlled video gaming, to name just a few.
  • Ultrasonic and laser rangefinders are based on time-of-flight measurements: they emit a sound or light pulse toward the object, capture the reflected pulse (or echo), and measure the round-trip time of the pulse, from which the distance to the object can be calculated. This method generally works well over large distances where precision finer than a few meters is not required.
  • Embodiments hereof provide an approach to distance or depth measurements that utilizes a directed light source with time-variable direction to sweep or scan a spatial region of interest, and a light detector to measure light reflected by object(s) that might exist within the region of interest in a way that temporally resolves the different directions.
  • the light source scans the spatial region periodically (e.g., like a flash light waving back and forth over the region of interest), such that each direction of the illuminating emission provides information about object(s) that might exist in the region being scanned.
  • light direction can correspond to, and/or can include information about, a particular phase or point within the emission cycle.
  • the direction of the illuminating emission thus provides information about one or more of a presence, location, or other perceivable property of the object(s).
  • A camera or other light sensor can be synchronized with the light source to capture light diffusely reflected (e.g., scattered) off objects in the spatial region.
  • Positional information can be extracted from analyzing the captured light.
  • positional information can be extracted from the time (or phase within the cycle) at which the light is received.
  • the direction of the illuminating emission, and thus the direction of the object's location viewed from the source/sensor arrangement, can be calculated.
  • positional information can comprise relative distances between the camera, the object, and the receiver.
  • An embodiment uses two such source/sensor arrangements disposed at different locations, each scanning the same plane, thereby enabling the location of an object within that plane to be determined from the intersection of the two scanning directions that result in respective reflection signals.
  • a third source/sensor arrangement is disposed in a different plane such that the emission of the third source/sensor arrangement scans along an axis orthogonal to the two co-planar source/sensor arrangements, thereby enabling a three-dimensional spatial region to be scanned for the location of the object.
  • the distances to the object within the spatial region can be determined by triangulation based upon the relative distances between the object and the various source/sensor arrangements.
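  • As a purely illustrative sketch (the coordinate frame, function names, and example angles below are assumptions, not taken from this disclosure), two co-planar source/sensor arrangements at known positions can locate an object in the scanned plane by intersecting the two emission directions that produced reflection signals:

    import numpy as np

    def intersect_rays(p1, theta1, p2, theta2):
        """Intersect two 2-D scan rays (illustrative helper, not from the patent).

        p1, p2         -- known positions of the two source/sensor arrangements
        theta1, theta2 -- emission directions (radians from the x-axis) at which
                          each arrangement observed a reflection signal
        Returns the (x, y) intersection, i.e. the estimated object location.
        """
        d1 = np.array([np.cos(theta1), np.sin(theta1)])
        d2 = np.array([np.cos(theta2), np.sin(theta2)])
        # Solve p1 + t1*d1 == p2 + t2*d2 for t1, t2.
        A = np.column_stack((d1, -d2))
        t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
        return np.asarray(p1, float) + t[0] * d1

    # Example: arrangements 0.2 m apart, each reporting the scan angle at which
    # its reflection signal peaked.
    print(intersect_rays((0.0, 0.0), np.radians(60), (0.2, 0.0), np.radians(110)))
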
  • an object of interest is equipped with a retro-reflector, and the specular reflection of the illuminating emission is detected with a photo-sensor co-located with the light source.
  • an object may be equipped, for tracking purposes, with a light source and sensor, and its location may be computed from reflection signals that it receives from retro-reflectors at fixed spatial locations.
  • The term "directed light source" refers to a light source that emits light with a directional character, i.e., along a primary direction.
  • The light emission may have any divergence, ranging from highly collimated to wide-angle, and can often be characterized by a Gaussian profile.
  • The primary direction of the emission is the central or average direction, or the direction of brightest emission; these coincide for a Gaussian profile.
  • Some implementations include a directed light source with time-variable direction, comprising an arrangement of one or more light-emitting devices, optionally in conjunction with any associated optics or other components, that emits light, at any moment in time, in a primary direction that changes as a function of time.
  • In some embodiments, variable emission direction is achieved by an arrangement of two or more light-emitting devices, at least two of which emit in different directions, that are electronically controlled to change activation levels among two or more activation states (e.g., turn on and turn off) sequentially, periodically, or according to a pattern (or combinations thereof), such that the combined contributions of the devices emit radiation over a complete control cycle, or emission cycle, in accordance with a scan pattern.
  • Individual light-emitting devices define an emission point, and collectively, the light-emitting devices of the light source define an emission region.
  • the light-emitting devices can be arranged and/or spaced according to implementation (e.g., based upon individual emission divergences, area coverage, interference avoidance, other application specific needs, and so forth) such that the emissions collectively cover at least some desired subset of the entire spatial region of interest.
  • a single light-emitting device can be used; the emission direction can be varied directly by moving (mechanically, optically, electrically, or combinations thereof) the light-emitting device, and/or indirectly by moving a light-deflecting optic, such as a prism or mirror to scan the region of interest.
  • scanning encompasses directing light, e.g., as achieved by moving one or more individual light-emitting devices and/or associated optics, as well as making discrete changes in emission direction, e.g., by activating light-emitting devices pointing in different directions.
  • two or more activation states of two or more light sources can be at least partially concurrent, or completely discrete.
  • the scan pattern enables light to sweep a region of interest, by changing the emission direction continuously or discretely as a function of time.
  • the emission can change between discontiguous directions (e.g., in accordance with a random, pseudo-random, or programmed scan pattern).
  • a scan pattern provides a continuous or discontiguous spatial, temporal, or spatio-temporal change in light at the region of interest.
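  • As a minimal illustrative sketch (the emitter identifiers, directions, and quadrant assignment are assumptions invented for this example), an electronically scanned source steps through a fixed ordering of emitters once per emission cycle, so that each phase of the cycle corresponds to a known primary direction:

    # Illustrative only: four hypothetical emitters aimed at four sub-regions.
    EMITTERS = {          # emitter id -> primary direction (degrees, in-plane)
        "A": 30.0,
        "B": 60.0,
        "C": 120.0,
        "D": 150.0,
    }
    ORDERING = ["A", "B", "C", "D"]   # one complete emission cycle

    def scan(cycles=2):
        """Yield (cycle, phase_index, emitter_id, direction) for a periodic scan."""
        for cycle in range(cycles):
            for phase, emitter in enumerate(ORDERING):
                # In hardware, this is where the drive current for `emitter`
                # would be switched on (and the previous emitter switched off).
                yield cycle, phase, emitter, EMITTERS[emitter]

    for step in scan():
        print(step)
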
  • Implementation-specific needs can be met using any or combinations of individual light-emitting devices, such as light-emitting diodes (LEDs), lasers, incandescent or fluorescent light bulbs, or any other devices emitting light in one or more suitable frequency ranges, such as the optical, infrared (IR), or ultraviolet (UV) regime.
  • light scattered or reflected by objects illuminated by the light source is captured using a light detector (or multiple detectors), suited to the application, that measures a property, comprising any of an intensity, color, polarization, other property or combinations thereof, of the incoming light as a function of time; each detector defines a detection point, and when alone or combined with multiple light-sensing elements, defines a detection region.
  • the light detector can be synchronized with the light source so that the timing of the various control states of the light source (corresponding, for example, to different emission compositions, directions, other controllable aspects) is known relative to the timing of the measurements at the detector, enabling associating the direction of the illuminating emission with each point in time of the measured signal.
  • Various embodiments utilize a digital camera with a one- or two-dimensional electronic pixelated sensor, which can be non-coplanar or coplanar, such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) sensor used as a photo-sensor array.
  • the camera may acquire a temporal sequence of captured frames, at a frame rate faster than the scan rate of the light source, such that multiple frames are acquired during an emission cycle of the source and each frame corresponds to a different control state of the illuminating emission.
  • the frames can be processed by a computer, or other suitable processing device(s), which can analyze the frames to determine a phase of a signal formed by the pixel output in response to the changing light conditions experienced by the sensor within each cycle for each pixel, e.g., by means of a discrete Fourier transform.
  • the signal comprises a photocurrent generated in each light-sensor pixel that is processed continuously by an associated phase-detecting circuit. Accordingly, embodiments can enable determining the direction of the illuminating emission from information extracted from the signal.
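  • As a rough sketch of the frame-based variant described above (the array shapes, single-tone signal model, and use of NumPy are assumptions for illustration), a per-pixel discrete Fourier transform over the frame sequence yields the phase within the emission cycle at which each pixel sees its intensity peak:

    import numpy as np

    def pixel_phase(frames, frames_per_cycle):
        """Per-pixel phase within the emission cycle at which reflected intensity
        peaks, estimated via a discrete Fourier transform over time (illustrative).

        frames -- array of shape (n_frames, H, W), with n_frames a multiple of
                  frames_per_cycle, acquired in sync with the light source.
        Returns phases as fractions of the emission cycle, in [0, 1).
        """
        spectrum = np.fft.fft(frames, axis=0)
        k = frames.shape[0] // frames_per_cycle   # bin at the scan frequency
        phase = np.angle(spectrum[k])             # radians, one value per pixel
        # A cosine peaks where its phase argument is zero, so the peak position
        # within the cycle is minus the spectral phase, wrapped to [0, 1).
        return (-phase / (2 * np.pi)) % 1.0

    # Synthetic check: 4x4 sensor, 20 frames covering 2 cycles of 10 states,
    # object illuminated most directly 30% of the way through each cycle.
    t = np.arange(20)
    frames = 1 + np.cos(2 * np.pi * (t / 10 - 0.3))[:, None, None] * np.ones((20, 4, 4))
    print(pixel_phase(frames, frames_per_cycle=10)[0, 0])   # ~0.3
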
  • a camera can include a lens and/or other optics that focus light received from a particular direction onto a particular region of the sensor (corresponding to a pixel or group of pixels); thus, the position within the sensor at which the reflected light is received is related to the direction from which the light comes.
  • embodiments provide a method for obtaining positional information about an object within a region of interest.
  • The method may include (a) activating sources directed to portions of the region of interest according to an ordering of points, such that each point in the ordering directs electromagnetic radiation of at least one source to one of the portions of the region of interest (e.g., the NE quadrant, SE quadrant, etc.); (b) capturing a portion of the electromagnetic radiation reflected by an object; (c) forming a signal over time of at least one property (e.g., intensity, color, polarization, other property, or combinations thereof) of the captured electromagnetic radiation; (d) determining from the signal at least one point in the ordering in which a dominant contributor to the captured electromagnetic radiation was activated; (e) determining an identity for the dominant contributor from the point in the ordering; (f) determining from the identity of the dominant contributor a portion of the region of interest to which the electromagnetic radiation from the dominant contributor was directed; and (g) determining positional information for the object based at least in part upon the portion of the region of interest.
  • activating sources includes activating illumination sources (e.g., LEDs) directed in different directions according to a known order, such that each point in the order is associated with a particular direction.
  • the ordering can be fixed or can vary from scan instance to scan instance.
  • more than one source can be activated at a time to form a cross-fade or other illumination profile. From the signal, it may then be determined where in the order the greatest contributor—or contributors—of the captured light was activated, and, based thereon, the identity of the greatest contributor(s) and the direction in which the light was emitted may be determined. Based on the direction in which the illumination is emitted, the positional information of the object may be determined.
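  • The following sketch (the argmax criterion, quadrant labels, and all names are assumptions for illustration) walks through steps (d) through (g) above for a signal with one sample per point in the ordering:

    import numpy as np

    # Hypothetical setup: four sources fired in a known order, and a captured
    # signal holding one intensity value per point in the ordering (for a
    # camera, e.g. the summed pixel intensity of each data frame).
    ORDERING = ["A", "B", "C", "D"]
    PORTION = {"A": "NE quadrant", "B": "SE quadrant",
               "C": "SW quadrant", "D": "NW quadrant"}

    def locate(signal):
        point = int(np.argmax(signal))    # (d) point of the dominant contribution
        emitter = ORDERING[point]         # (e) identity of the dominant contributor
        portion = PORTION[emitter]        # (f) portion of the region it illuminates
        return emitter, portion           # (g) coarse positional information

    print(locate([0.02, 0.71, 0.09, 0.01]))   # -> ('B', 'SE quadrant')
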
  • Capturing the reflected light may include capturing data frames (e.g., sensor states resulting from capturing illumination of a scan, or image resulting from a capture with a sensor array of a camera) of the region.
  • the direction of the reflected light relative to the camera may be determined, and the positional information may be further based in part thereon.
  • the data frames may be captured at a rate exceeding a scan rate associated with the illuminating emission.
  • the direction of the illuminating emission associated with the light reflected by the object may be ascertained by determining an intensity peak across a temporal sequence of data frames for at least one pixel corresponding to the object within the data frames.
  • the intensity peak in turn, may be determined, e.g., from a Fourier transform on the temporal sequence of data frames.
  • the electromagnetic radiation is retro-reflected by the object.
  • a retro-reflector may be physically associated with the object.
  • the region may be scanned periodically, according to a same or a different order of activation.
  • the point in the ordering corresponding to capture of the reflected electromagnetic radiation corresponds to a phase within an emission cycle.
  • Determining the direction of the illuminating electromagnetic radiation associated with the electromagnetic radiation reflected by the object may include determining the point in the cycle where the captured radiation is greatest. This point in the cycle may be detected using a phase-detector circuit or equivalent.
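  • A digital stand-in for such a phase detector (an assumption for illustration; a hardware phase-detector circuit would perform the equivalent operation in analog electronics): correlating the received samples with quadrature references at the scan frequency yields the phase of the intensity maximum within the cycle:

    import numpy as np

    def cycle_phase(samples, samples_per_cycle):
        """Phase (fraction of the emission cycle) at which the received intensity
        peaks, via I/Q correlation at the scan frequency (illustrative)."""
        t = np.arange(len(samples))
        w = 2 * np.pi * t / samples_per_cycle
        s = np.asarray(samples, float)
        i_part = np.sum(s * np.cos(w))
        q_part = np.sum(s * np.sin(w))
        return (np.arctan2(q_part, i_part) / (2 * np.pi)) % 1.0

    # Reflection peaking 65% of the way through each 50-sample cycle:
    t = np.arange(200)
    print(cycle_phase(1 + np.cos(2 * np.pi * (t / 50 - 0.65)), 50))   # ~0.65
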
  • Scanning the region may involve sequentially operating a plurality of light-emitting devices each directing light to a different portion of the region of interest. In some embodiments, only a subset of the plurality of light-emitting devices are operated so as to reduce a resolution of the scan.
  • Sequentially operating the plurality of devices may comprise sequentially causing the devices to emit pulses of light; successive pulses may overlap temporally. Sequentially operating the plurality of devices may, alternatively or additionally, involve driving each device according to a time-variable intensity having an intensity peak, the peaks occurring sequentially for the plurality of light-emitting devices.
  • Light emitted by the plurality of light-emitting devices may overlap spatially and temporally; in this case, determining a direction of the illuminating emission associated with the light reflected by the object may include determining an effective primary direction of the overlapping illumination.
  • the “effective primary direction” is the direction along which the intensity maximum resulting from the spatiotemporal overlap between illuminations from two or more sources with different individual primary directions lies.
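  • One plausible way to estimate such an effective primary direction (an assumed approach for illustration, not a method prescribed by this disclosure) is to interpolate between the two individual primary directions according to the relative reflected amplitudes attributable to each emitter:

    def effective_direction(dir_a, amp_a, dir_b, amp_b):
        """Intensity-weighted average of two neighboring primary directions
        (degrees); amp_a and amp_b are the reflected amplitudes attributable to
        emitters A and B during their overlapping activation (illustrative)."""
        return (amp_a * dir_a + amp_b * dir_b) / (amp_a + amp_b)

    # Object midway between two emitters aimed at 40 and 60 degrees:
    print(effective_direction(40.0, 1.0, 60.0, 1.0))   # 50.0
    # Object closer to the 40-degree beam reflects more of that emitter's light:
    print(effective_direction(40.0, 3.0, 60.0, 1.0))   # 45.0
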
  • scanning the region includes moving (e.g., rotating or translating) a light-emitting device, or moving a deflecting optic and/or a screen used in conjunction with a light-emitting device.
  • Other embodiments pertain to a system for obtaining positional information about an object within a region of interest.
  • the system may include a directed light source with variable direction for scanning the region with an illuminating light, a detector for capturing light reflected by the object, and circuitry for (i) determining a time of capture of the reflected light and, based thereon, an associated direction of the illuminating light, and (ii) deriving the positional information about the object at least in part from the direction of the illuminating light.
  • the system also includes a retro-reflector affixed to the object.
  • the directed light source includes a plurality of light-emitting devices (e.g., LEDs) emitting light in a respective plurality of different primary directions. Further, the directed light source may include a controller for sequentially operating the plurality of light-emitting devices.
  • the light-emitting devices may be arranged such that their respective primary directions intersect at a common center. For example, the light-emitting devices may be affixed to an arcuate surface, facets of a polygonal surface, or facets of a polyhedral surface.
  • the plurality of light-emitting devices may include a plurality of light emitters and a plurality of associated deflecting optics for deflecting light emitted by the emitters into the different primary directions.
  • the directed light source comprises one or more moving light-emitting devices.
  • the directed light source includes one or more light-emitting devices and a moving deflecting optic and/or a moving screen having a perforation therein.
  • the detector includes a camera for imaging the region; the camera may include a lens and a sensor (e.g., a CCD or MEMS sensor), or a light-sensing device and a scanning mirror.
  • the detector is co-located with the light source, and the system further includes a retro-reflector affixed to the object.
  • the directed light source may include a controller for varying the emission direction so as to periodically scan the region; this controller may be synchronized with the circuitry determining the time of capture of the reflected light.
  • the circuitry causes the detector to be read out at a rate exceeding the scan rate of the directed light source.
  • the circuitry includes a phase-detector circuit for determining a phase within an emission cycle corresponding to a maximum intensity of the captured light, and/or a digital processor configured for performing a Fourier transform on the captured light to thereby determine a phase within an emission cycle corresponding to a maximum intensity of the captured light.
  • embodiments provide a method for determining depth associated with one or more objects within a region of interest.
  • the method includes (a) scanning the region with an illuminating light beam having a temporally variable beam direction so as to illuminate the object(s), (b) acquiring a temporal sequence of images of the region while the region is being scanned, each image corresponding to an instantaneous direction of the illuminating light beam and at least one of the images capturing light reflected by the illuminated object(s); and (c) based at least in part on the instantaneous direction of the light beam in the image(s) capturing light reflected by the object(s), determining a depth associated with the object(s).
  • Multiple of the images acquired during a single scan of the region may capture light reflected by the object(s); the method may include determining a depth profile of the object(s) based thereon.
  • a further aspect relates to a method for locating an object within a region.
  • the method involves, (a) using a light source affixed to the object, scanning the region with an illuminating light beam having a temporally variable beam direction; (b) using a sensor co-located with the light source, capturing reflections of the illuminating beam from a plurality of retro-reflectors fixedly positioned at known locations; (c) based on times of capture of the reflections, determining associated directions of the illuminating light beam; and (d) locating the object relative to the known locations of the retro-reflectors based at least in part on the directions of the illuminating light beam.
  • the object is located in a two-dimensional region based on reflections from at least three retro-reflectors.
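  • A least-squares sketch of this localization (the assumption that the device's orientation is known, so the beam bearings are absolute, and all names below are illustrative choices, not taken from this disclosure):

    import numpy as np

    def locate_from_bearings(reflectors, bearings):
        """Position of a device that recorded the bearing (radians) of its
        scanning beam at the instants retro-reflections were captured from
        reflectors at known positions (illustrative sketch).

        reflectors -- (N, 2) array of known retro-reflector positions
        bearings   -- (N,) array of beam directions toward those reflectors
        """
        r = np.asarray(reflectors, float)
        th = np.asarray(bearings, float)
        # Each bearing constrains the device position p to the line through the
        # reflector along the beam:
        #   sin(th)*p_x - cos(th)*p_y = sin(th)*r_x - cos(th)*r_y
        A = np.column_stack((np.sin(th), -np.cos(th)))
        b = np.sin(th) * r[:, 0] - np.cos(th) * r[:, 1]
        p, *_ = np.linalg.lstsq(A, b, rcond=None)
        return p

    # Device actually at (1, 1), observing three wall-mounted reflectors:
    refl = [(0, 3), (4, 3), (4, 0)]
    angles = [np.arctan2(y - 1.0, x - 1.0) for x, y in refl]
    print(locate_from_bearings(refl, angles))   # ~[1. 1.]
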
  • the device may include a light source for scanning the region with an illuminating light beam having a temporally variable beam direction; a sensor co-located with the light source for capturing reflections of the illuminating beam from the plurality of retro-reflectors; and circuitry for determining, from times of capture of the reflections, directions of the illuminating light beam associated therewith, and for locating the object relative to the retro-reflectors based at least in part on the directions.
  • the object may, for example, be a mobile device.
  • embodiments provide a computer-implemented method for conducting machine control.
  • the method involves scanning a region of space, the scanning including (i) directing at least one light emission from a vantage point of a vantage region to a region of space, (ii) detecting a reflectance of the at least one light emission, and (iii) determining that the detected reflectance indicates a presence of an object in the region of space. Further, the method includes determining one or more object attributes of the object; analyzing the one or more object attributes to determine a potential control surface of the object; determining whether one or more control-surface attribute changes in the potential control surface indicate control information; and, if so, responding to the indicated control information according to response criteria.
  • the first light emission is directed to the region of space according to a first scan pattern
  • determining that the detected reflectance indicates a presence of an object comprises directing a second light emission to the region of space according to a second scan pattern.
  • Directing the second emission may involve scanning to a refined scan pattern, e.g., so as to capture surface detail about the object.
  • determining object attributes of the object comprises determining positional information of at least a portion of the object. Analyzing the object attributes to determine a potential control surface of the object may include determining based at least in part upon the positional information whether a portion of the object provides control information. Determining whether one or more control-surface attribute changes in the potential control surface indicate control information may include determining whether control-surface attribute changes in the potential control surface indicate an engagement gesture. Responding to the indicated control information according to response criteria may include determining a command to a user interface based at least in part upon the engagement gesture.
  • determining object attributes of the object comprises determining dynamic information of at least a portion of the object, physical information of at least a portion of the object, optical and/or radio properties of at least a portion of the object, and/or chemical properties of at least a portion of the object.
  • directing the emission includes scanning across an entryway; determining that the detected reflectance indicates a presence includes detecting an object comprising a person seeking entrance, and conducting a second scanning to a refined scan pattern of the person; determining whether control-surface attribute changes in the potential control surface indicate control information includes determining whether the control-surface attribute changes indicate a vein pattern of a hand of the person; and responding to the indicated control information according to response criteria comprises permitting the person to enter when the vein pattern matches a stored vein pattern of an individual authorized to enter.
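  • A skeletal, entirely hypothetical outline of this scan / detect / interpret / respond flow; every class, function, and threshold below is a placeholder invented for illustration, standing in for functionality the disclosure describes only abstractly:

    from dataclasses import dataclass

    @dataclass
    class ScanResult:
        reflectance: float   # aggregate detected reflectance
        positions: list      # positional information for detected portions

    def indicates_object(result, threshold=0.1):
        return result.reflectance > threshold

    def control_surface_changed(prev, curr, min_motion=0.05):
        """Treat sufficient motion of the tracked portion as control information."""
        return prev is not None and abs(curr - prev) > min_motion

    def machine_control_loop(scan_coarse, scan_refined, respond, n_iterations=5):
        prev = None
        for _ in range(n_iterations):
            coarse = scan_coarse()                  # direct emission, detect reflectance
            if not indicates_object(coarse):        # presence of an object?
                continue
            refined = scan_refined()                # refined scan for surface detail
            pos = refined.positions[0]              # object attribute: position
            if control_surface_changed(prev, pos):  # attribute change -> control info
                respond(pos)                        # respond per response criteria
            prev = pos

    # Tiny synthetic demo: a tracked portion drifting across the region.
    track = iter([0.0, 0.02, 0.10, 0.25, 0.40])
    machine_control_loop(
        scan_coarse=lambda: ScanResult(0.5, []),
        scan_refined=lambda: ScanResult(0.5, [next(track)]),
        respond=lambda p: print("control input at", p),
    )
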
  • a method for obtaining positional information includes (a) scanning the region with an illuminating emission having a temporally variable emission direction; (b) capturing light reflected by the object and, based on a time of capture of the reflected light, determining an associated direction of the illuminating emission; and (c) deriving the positional information about the object at least in part from the direction of the illuminating emission.
  • the positional information may be determined further based at least in part on a geometric relationship between the light source and the detector.
  • the positional information may include a distance, a depth, or a position of the object or a surface feature thereof.
  • the positional information comprises a depth profile of the object.
  • the method may involve periodically repeating steps (a) through (c) so as to update the positional information to track movement of the object.
  • these and other aspects of the disclosed technology may enable machines, computers, automata, and/or other types of intelligent devices to obtain information about objects present and events and/or actions (such as user gestures, signals, or other motions conveying commands and information to the device) occurring in a monitored region of interest.
  • FIG. 1 illustrates an exemplary machine sensory and control system in embodiments
  • FIGS. 2A-2E illustrate exemplary emission components of the machine sensory and control system of FIG. 1, in accordance with various embodiments;
  • FIGS. 3A-3E illustrate exemplary detector components of the machine sensory and control system of FIG. 1, in accordance with various embodiments;
  • FIGS. 4A-1 and 4A-2 schematically illustrate an exemplary system for conducting variation determination with emitters operated in discrete modes;
  • FIGS. 4B-1, 4B-2, and 4B-3 schematically illustrate an exemplary system for conducting variation determination with emitters operated in continuously variable modes;
  • FIG. 5A is a schematic diagram of an exemplary system for conducting variation determination with a pixelated sensor in accordance with various embodiments
  • FIG. 5B illustrates an exemplary control scheme for the light source depicted in FIG. 5A in accordance with one embodiment;
  • FIG. 5D illustrates the computation of depth for the geometric configuration of the system depicted in FIG. 5A, in accordance with one embodiment;
  • FIG. 5E illustrates an exemplary control scheme for the light source depicted in FIG. 5A that exploits different scan frequencies in accordance with an alternative embodiment;
  • FIG. 5F illustrates a spectral reflection signal detected with the camera depicted in FIG. 5A resulting from the light-source control scheme of FIG. 5E;
  • FIGS. 6A-6C illustrate different exemplary light-source configurations in accordance with various embodiments
  • FIGS. 7A and 7B illustrate embodiments in which light source and camera are, respectively, integrated into the same unit or provided as separate devices;
  • FIG. 8A illustrates an object located midway between the primary directions of two discrete light emitters with overlapping illumination regions
  • FIG. 8B shows the resulting reflection intensity in accordance with one embodiment
  • FIG. 8C illustrates an object located closer to one of two primary directions of two discrete light emitters with overlapping illumination regions
  • FIG. 8D shows the resulting reflection intensity in accordance with one embodiment
  • FIG. 8E illustrates a control scheme with temporal overlap between the pulses emitted by different discrete light emitters having overlapping illumination regions
  • FIGS. 8F and 8G show the resulting reflection intensities in accordance with various embodiments
  • FIG. 8H illustrates the intensity variation of a pulsed light source in accordance with various embodiments
  • FIG. 9A illustrates a light source with only two light emitters producing overlapping illumination regions
  • FIG. 9B illustrates a control scheme for the light source with temporal overlap in accordance with one embodiment
  • FIGS. 10A-10G illustrate moving light sources in accordance with various embodiments
  • FIGS. 11A-11N illustrate different exemplary light sources for scanning a three-dimensional region in accordance with various embodiments
  • FIGS. 12A and 12B are flow charts illustrating methods for depth determination in accordance with various embodiments.
  • FIG. 13A is a schematic diagram of an exemplary system for depth determination that uses light sources producing different light characteristics in accordance with various embodiments;
  • FIG. 13B is a schematic diagram of an exemplary system for depth determination based on retro-reflection off an object in accordance with various embodiments
  • FIG. 13C is a schematic diagram of an exemplary system for determining the two-dimensional position of an object using retro-reflection in accordance with various embodiments
  • FIG. 13D is a schematic diagram illustrating how an object equipped with a light source can be localized based on retro-reflections off multiple stationary reflectors in accordance with various embodiments;
  • FIG. 14 is a block diagram of an exemplary computational facility for determining positional information in accordance with various embodiments
  • FIG. 15 illustrates an exemplary task environment in which various embodiments of the disclosed technology may be utilized
  • FIG. 16 illustrates a variation determination system in accordance with various embodiments
  • FIGS. 17A-17D illustrate predictive information including a model in accordance with various embodiments.
  • FIGS. 18A and 18B illustrate virtual constructs implementing an engagement target with which a control object interacts in accordance with various embodiments.
  • Sensing distance, depth, and/or position of objects automatically can enable machines to be controlled by salient properties of the objects or object motion.
  • the object(s) may generally be any inanimate or animate objects, and may have complex surfaces and/or change in position or shape over time. Certain embodiments provide improved accuracy in positional and/or depth determination, thereby supporting object and/or object-surface recognition, as well as object-change, event, and/or action recognition, and/or combinations thereof.
  • FIG. 1 illustrates an exemplary machine sensory and control system in embodiments.
  • a motion-sensing and controller system provides for detecting that some variation(s) in one or more portions of interest of a user (or other object) has occurred, for determining that an interaction with one or more machines corresponds to the variation(s), for determining whether the interaction should occur, and, if so, for affecting the interaction.
  • the machine sensory and control system typically includes a portion-detection system, a variation-determination system, an interaction system, and an application-control system.
  • One embodiment of detection system 100 includes an emission module 102, a detection module 104, a controller 106, a signal-processing module 108, and a machine-control module interface 110.
  • the emission module 102 illuminates one or more objects of interest 112 (e.g., the user's finger or some other control object) within an area of interest 114 .
  • The emission module 102 includes one or more emitter(s) 120A, 120B (e.g., LEDs or other devices emitting light in the IR, visible, or other spectrum regions, or combinations thereof; radio and/or other electromagnetic-signal-emitting devices) that are controllable via emitter parameters (e.g., frequency, activation state, firing sequences and/or patterns, and so forth) by the controller 106.
  • The emitters 120A, 120B can be individual elements coupled with materials and/or devices 122.
  • For example, a light-emitting element 120A, 120B may be combined with a lens 122A (see FIG. 2A), a multi-lens 122B (see FIG. 2B), an image-directing film (IDF) 122C (see FIG. 2C), a liquid lens, multiple such elements or combinations thereof, and/or others, with varying or variable optical properties to direct the emission.
  • As illustrated by FIG. 2D, one or more arrays 120D of emissive elements (combined on a die or otherwise) may be used with or without the addition of devices 122 for directing the emission, and positioned within an emission region 200 (see FIG. 2E) according to one or more emitter parameters, e.g., statically mounted (e.g., fixed, parallel, orthogonal or forming other angles with a work surface, one another, or a display or other presentation mechanism), dynamically mounted (e.g., pivotable, rotatable and/or translatable), embedded (e.g., within a machine or machinery under control), or otherwise coupleable using an interface (e.g., wired or wireless).
  • structured lighting techniques can provide improved surface-feature-capture capability by casting illumination according to a reference pattern onto the object.
  • Image-capture techniques described in further detail herein can be applied to capture and analyze differences in the reference pattern and the pattern as reflected by the object.
  • the detection system 100 may omit the emission module 102 altogether (e.g., in favor of ambient lighting).
  • The detection module 104 includes one or more capture device(s) 130A, 130B (e.g., devices sensitive to visible light or other electromagnetic radiation) that are controllable via the controller 106.
  • The capture device(s) 130A, 130B can comprise one or more individual image-capture elements 130A or arrays of image-capture elements 130A (e.g., pixel arrays, CMOS or CCD photo-sensor arrays, or other imaging arrays), individual photosensitive elements 130B or arrays of photosensitive elements 130B (e.g., photodiodes, photo sensors, single-detector arrays, multi-detector arrays, or other configurations of photo-sensitive elements), or combinations thereof.
  • other existing/emerging detection mechanisms and/or some combination thereof can also be utilized in accordance with the requirements of a particular implementation.
  • Capture device(s) 130A, 130B can each define a particular vantage point 300 from which objects 112 within the area of interest 114 are sensed, and can be positioned within a detection region 302 (see FIG. 3A) according to one or more detector parameters (either statically (e.g., fixed, parallel, orthogonal or forming other angles with a work surface, one another, or a display or other presentation mechanism) or dynamically (e.g., pivotably, rotatably, and/or translatably); and mounted, embedded (e.g., within a machine or machinery under control), or otherwise coupleable using a wired or wireless interface).
  • Capture devices 130A, 130B can be coupled with devices and/or materials (such as, e.g., lenses 310A (see FIG. 3A), multi-lenses 310B (see FIG. 3B), image-directing film (IDF) 310C (see FIG. 3C), liquid lenses, combinations thereof, and/or others) with varying or variable optical properties for directing the reflectance to the capture device 130A, 130B for controlling or adjusting resolution, sensitivity, and/or contrast.
  • Capture devices 130A, 130B can be designed or adapted to operate in the IR, visible, or other spectrum regions, or combinations thereof; or alternatively operable in conjunction with radio- and/or other electromagnetic-signal-emitting devices in various applications.
  • Capture devices 130A, 130B can be organized in arrays 320, in which the image-capture device(s) can be interleaved by row (see, e.g., FIG. 3D), column, or according to a pattern, or can be otherwise addressable individually or in groups.
  • Capture devices 130A, 130B can capture one or more images for sensing objects 112 and capturing information about the object (e.g., position, motion, and so forth).
  • Particular vantage points of capture devices 130A, 130B can be directed to the area of interest 114 so that fields of view 330 of the capture devices at least partially overlap. Overlap in the fields of view 330 (see, e.g., FIG. 3E) provides capability to employ stereoscopic vision techniques, including those known in the art, to obtain information from a plurality of images captured substantially contemporaneously.
  • Controller 106 comprises control logic (implemented in hardware, software, or combinations thereof) to conduct selective activation/de-activation of emitter(s) 120A, 120B in on-off or other activation states or combinations thereof (and/or to control active directing devices) to produce emissions of (e.g., spatiotemporally) varying intensities in accordance with a scan pattern which can be directed to scan the area of interest 114.
  • Controller 106 can further comprise control logic (implemented in hardware, software, or combinations thereof) to conduct selection, activation, and control of capture device(s) 130A, 130B (and/or to control associated active directing devices) to capture images or otherwise sense differences in reflectance or other illumination.
  • Signal-processing module 108 determines whether captured images and/or sensed differences in reflectance and/or other sensor-perceptible phenomena indicate a possible presence of one or more objects of interest 112, such as control objects 113; the presence of such objects, and/or variations thereof (e.g., in position, shape, etc.), can be used as input to a machine controller via the machine- and application-control module interface 110.
  • The variation of one or more portions of interest of a user or control object can correspond to a variation of one or more attributes (e.g., position, motion, appearance, surface patterns) of a user's hand or finger(s) 113, points of interest on the hand, a facial portion, etc., or other control objects (e.g., styli, tools), and so on (or some combination thereof) that is detectable by, or directed at, but otherwise occurs independently of the operation of the machine sensory and control system.
  • the system may be configurable to “observe” ordinary user locomotion (e.g., motion, translation, expression, flexing, deformation, and so on), locomotion directed at controlling one or more machines (e.g., gesturing, intentionally system-directed facial contortion, and so forth), and/or attributes thereof (e.g., rigidity, deformation, fingerprints, veins, pulse rates, and/or other biometric parameters); see, e.g., U.S. Provisional Patent Application Ser. No. 61/952,843 (filed on Mar. 13, 2014), the entire disclosure of which is hereby incorporated by reference.
  • the system provides for detecting that some variation(s) in one or more portions of interest (e.g., fingers, fingertips, or other control surface portions) of a user has occurred, for determining that an interaction with one or more machines corresponds to the variation(s), for determining whether the interaction should occur, and, if so, for at least one of initiating, conducting, continuing, discontinuing, and/or modifying the interaction (and/or a corresponding or related interaction).
  • FIGS. 4A-1 and 4A-2 illustrate an exemplary system for conducting distance/depth determination with emitters operated in discrete modes.
  • the system includes an emitting source 402 comprising a number of (in the depicted exemplary embodiment, four) emitting elements (e.g., emitters A, B, C, D) to provide a directed source having a variable direction that illuminates a region of interest 414 including an object of interest 412 .
  • the source 402 can scan a spatial region according to a suitable scan pattern by activating and de-activating select ones of the emitting elements according to a known ordering.
  • the region may be divided into a number of sub-regions (e.g., four quadrants), which are then illuminated by select activation of emitting elements.
  • Emissions that intercept the object 412 are reflected, and a portion of the reflection is captured by a receiver 404 (e.g., a single detector element, any arrangement—e.g., an array, line, etc.—of individual detector elements, a camera or camera-like device comprising a pixelated sensor, or other arrangements of detector elements or combinations thereof).
  • A signal over time of at least one property (e.g., intensity, polarization, frequency, or the like) of the captured electromagnetic radiation is formed.
  • a point in the ordering in which a dominant contributor to the captured electromagnetic radiation was activated can be determined from the signal.
  • the point in the ordering corresponding to the time of receipt of the reflection allows an inference of the direction of the illuminating emission and, thus, of the sub-region in which the reflecting object is located.
  • An identity for the dominant contributor can be determined from the point in the ordering.
  • a portion of the region of interest (e.g., sub-region) to which the electromagnetic radiation from the dominant contributor was directed can be determined from the identity of the dominant contributor.
  • Positional information for the object (e.g., the sub-region in which at least a portion of the object is located) can be determined based upon the portion of the region of interest.
  • multiple illuminating emissions having different properties may be used contemporaneously or sequentially and distinguished using suitable filters with the detector(s).
  • Multiple sources and detectors may be used to scan the region of interest along different planes so as to provide three-dimensional information, as illustrated by FIG. 13A and described in further detail below. For example, a cubic volume may be scanned along two perpendicular planes, each divided into four quadrants. Localizing an object in one of the four quadrants for both scans allows identifying in which one of eight sub-cubes the object is located.
  • the spatial resolution of object localization achieved in this example embodiment corresponds directly to the spatial resolution of the scan pattern; the more sub-regions there are, the more precise is the determined object location.
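  • A toy illustration (the sign-based quadrant encoding is an assumption) of how two orthogonal quadrant scans narrow an object down to one of eight sub-cubes:

    def subcube(xy_quadrant, xz_quadrant):
        """Combine the quadrant reported by an XY-plane scan with the quadrant
        reported by an XZ-plane scan into one of eight sub-cubes (octants).
        Quadrants are encoded as (sign_of_first_axis, sign_of_second_axis) with
        entries +1 or -1; the shared x sign must agree (illustrative only)."""
        sx1, sy = xy_quadrant
        sx2, sz = xz_quadrant
        if sx1 != sx2:
            raise ValueError("inconsistent x sign between the two scans")
        return (sx1, sy, sz)

    # Object reported in the +x/+y quadrant of one scan and +x/-z of the other:
    print(subcube((+1, +1), (+1, -1)))   # -> (1, 1, -1), one of eight octants
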
  • The sources A, B, C, D may be operated in a discrete, binary mode, i.e., turned on and off one at a time. This results in a binary signal at the receiver 404, with intensity or amplitude levels of varying degree; reflected illumination from the emitter that provides the most direct illumination of the object 412, in this case emitter B, results in the point of greatest magnitude in the intensity vs. time signal.
  • FIGS. 4B-1 through 4B-3 illustrate an exemplary system for conducting distance/depth determination with emitters operated in continuously variable modes.
  • The system may use the same or similar emitter and receiver hardware as the discrete-mode embodiment described above with reference to FIGS. 4A-1 and 4A-2, but the emitters A, B, C, D are turned on and off gradually, reaching their respective intensity maxima according to a continuum. Since the emitters are energized and de-energized continuously, their active, energized states may temporally overlap, using techniques such as described in further detail with reference to FIGS. 8E-8G and 9A-9B below.
  • one emitter remains energized while the next emitter is energized to create a “cross-fade.”
  • the reflected emissions detected at the receiver may result in a continuously varying signal with a (global) maximum corresponding to the most direct illumination (which may be either a single maximum, as shown in FIG. 4B-2 , or the largest one of a number of local maxima, as shown in FIG. 4B-3 ).
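  • A small sketch (the parabolic refinement is an assumed choice, not prescribed here) of recovering the most-direct-illumination phase from such a continuously varying receiver signal:

    import numpy as np

    def peak_phase(signal):
        """Phase (fraction of the emission cycle, in [0, 1)) of the global
        intensity maximum of one cycle of receiver samples, refined with a
        three-point parabolic fit around the peak (illustrative)."""
        s = np.asarray(signal, float)
        n = len(s)
        i = int(np.argmax(s))                 # global maximum (possibly the
                                              # largest of several local maxima)
        y0, y1, y2 = s[(i - 1) % n], s[i], s[(i + 1) % n]
        denom = y0 - 2 * y1 + y2
        offset = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
        return ((i + offset) % n) / n

    # Cross-faded emitters producing a smooth hump peaking 37% into the cycle:
    t = np.linspace(0, 1, 100, endpoint=False)
    print(peak_phase(np.exp(-((t - 0.37) ** 2) / 0.01)))   # ~0.37
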
  • FIG. 5A conceptually illustrates, in more detail, an exemplary system 500 for distance or depth sensing in accordance with various embodiments.
  • the system 500 includes a plurality of integral, non-integral, and/or communicatively coupled elements: a light source 502 that emits light toward a region of interest 504 , a controller 506 for operating the light source 502 so as to scan the region 504 , a pixelated sensor array 508 (e.g., a camera or other sensor array, typically including associated control hardware) for capturing a portion of the emitted light as reflected from an object 510 , and a sensor-data processing facility 512 for processing the sensor data acquired by the sensor array 508 .
  • the various system components may be provided in a more distributed or more integrated manner.
  • the controller 506 and sensor-data processing facility 512 may each be implemented in hardware, software, or a combination of both. Further, they may be implemented as separate components (as depicted), or integrated into a single device.
  • a suitably programmed general-purpose computer interfacing with the sensor array 508 serves to process the sensor data and determine positional information associated therewith.
  • the computer may also store a program module that provides the functionality of the light-source controller 506 .
  • the controller 506 and/or the sensor-data processing facility 512 may be implemented with dedicated electronic circuitry, such as, e.g., digital signal processors (DSPs), field programmable gate arrays (FPGAs), or application-specific integrated circuits (ASICs), etc.
  • the sensor-data processing facility 512 is provided in the form of custom hardware that is integrated with the sensor array 508 into a single device.
  • The light source 502 may include a plurality of individually controllable light-emitting devices 520, 522, 524, 526, 528 (such as, e.g., LEDs), disposed, e.g., on a semispherical surface 530.
  • Each individual light-emitting device defines an emission point, and collectively, the light-emitting devices of the light source define an emission region.
  • The light-emitting devices 520, 522, 524, 526, 528 may be powered by a dedicated driver circuit 514, or, alternatively or in addition, via a network link directly by the controller 506.
  • The controller 506 may set the drive currents to alter the activation states of these devices 520, 522, 524, 526, 528 (e.g., turn on and off, vary intensity, etc.) at periodic (or other) intervals, e.g., cycling through them in various orderings according to a scan pattern to emit light in different primary directions (indicated in FIG. 5A by dashed lines) into the region 504. While light source 502 is depicted with five devices, embodiments can be created using virtually any number of light-emitting devices (not all shown in FIG. 5A for clarity's sake) to satisfy implementation-specific criteria or needs.
  • FIG. 5B illustrates an exemplary control scheme for the light-emitting devices 520, 522, 524, 526, 528, involving periodic pulsed operation of each device and uniform spacing of the pulses from all devices throughout the emission cycle.
  • The scan rate, i.e., the number of emission cycles per second, may be selected based on the anticipated range of velocities of the object 510 and/or the reference frame in movable/moving sensor/detector configurations, depth of view of the region of interest, expected distance(s) to object(s) anticipated to appear within the region of interest, power consumption, desired resolution of tracking of object(s), the application receiving the result, other configuration-dependent criteria, or combinations thereof.
  • scan rate can be selected such that multiple cycles can occur before the object 510 (or reference frame) moves appreciably.
  • FIG. 5B illustrates one embodiment of an emission cycle
  • An emission cycle can be advantageously apportioned into phases, such that a phase (φ) corresponds to an activation state of one or more light-emitting devices comprising emitter 502.
  • Changing the activation state of the light-emitting devices can advantageously sweep the region of interest.
  • the light emitters can be operated individually or in groups, in any order, and according to any scan pattern.
  • light sources can be illuminated to different levels of brightness based upon an activation state. For example, a light emitter can be switched to a first activation state (e.g., partially illuminated), and subsequently switched to a second setting (e.g., more brightly illuminated). Third, fourth, and additional illumination levels can be provided for in some embodiments.
  • Some embodiments provide for pulsing of illumination source(s) at different intervals, brightness levels, polarizations, coherence properties and so forth based upon activation states.
  • the brightness of each device can be varied continuously, e.g., in a linear, quadratic, exponential, or logarithmic fashion.
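  • by way of a non-limiting illustration, the emission-cycle control described above may be sketched as follows; this is a minimal Python sketch assuming a hypothetical driver object exposing set_intensity(device, level), and the scan pattern, phase duration, and brightness values are illustrative only rather than part of any particular embodiment.

      # Minimal sketch of an emission-cycle controller. The driver interface
      # set_intensity(device, level) is a hypothetical placeholder.
      import itertools
      import time

      def run_emission_cycles(driver, num_devices=5, phase_duration_s=0.001,
                              scan_pattern=None, brightness=1.0):
          """Cycle the light-emitting devices through activation states."""
          if scan_pattern is None:
              scan_pattern = list(range(num_devices))   # simple left-to-right sweep
          for active in itertools.cycle(scan_pattern):  # repeat emission cycles
              for device in range(num_devices):
                  level = brightness if device == active else 0.0
                  driver.set_intensity(device, level)   # change activation state
              time.sleep(phase_duration_s)              # hold this phase of the cycle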
  • the sensor array 508 can comprise a pixelated sensor 540 , such as an array constructed of photodiodes, phototransistors, photovoltaic cells, photoresistors and/or other types of photo-sensing devices capable of converting light into an intensity-dependent current and/or voltage.
  • a conventional CCD or CMOS camera is used, and the image sensor of the camera is periodically first exposed to light and then read, resulting in a series of data frames. (Alternatively, in “rolling-shutter” cameras, the sensor is exposed and read sequentially in multiple segments (or even individual pixels), such that the exposure of one segment (or pixel) can temporally overlap with the reading of a previously exposed segment (or pixel).)
  • the sensor array facilitates random access to segments or pixels.
  • the rate at which frames are read determines the temporal resolution of image acquisition.
  • the image sensor 540 is read using custom electronic circuitry.
  • each individual sensor can be associated with a dedicated analog circuit that continuously measures the photo-induced current or voltage.
  • the sensor array 508 and sensor-data processing facility 512 can be operated to capture and analyze data frames at a rate that temporally resolves the discrete phases of the light source 502 .
  • the sensor array 508 can acquire data, for the depicted embodiment, at five frames per emission cycle, such that each data frame corresponds to one of the states within the emission cycle.
  • the sensor array 508 can be synchronized (broadly understood) with the light source 502 , e.g., based on a synchronization signal received from the controller 506 or a signal from a separate timer fed to both the controller 506 and the sensor array 508 , such that each data frame can be associated with a direction α of the illuminating emission.
  • Synchronization can comprise acquisition of a data frame occurring substantially contemporaneously with the activation of one or more light sources in a particular state forming an emission of light, or at a known period of time thereafter.
  • the sensor array can be read at a rate above or below the rate of change in the state of the light source.
  • the sensor can be read at a frame rate above the rate at which the light source state changes, which prevents a decrease in the angular resolution of the sweep because a single frame does not, in this case, integrate multiple states of the light source.
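  • as a sketch of the synchronization just described, the following Python fragment associates a captured data frame with the emission phase that was active during its exposure; it assumes (hypothetically) that frames and phases are counted from a common synchronization event and that the frame rate is an integer multiple of the phase rate.

      def frame_to_phase(frame_index, frames_per_cycle, phases_per_cycle):
          """Return the emission-phase index active while this frame was exposed."""
          position_in_cycle = frame_index % frames_per_cycle
          frames_per_phase = frames_per_cycle // phases_per_cycle
          return position_in_cycle // frames_per_phase

      # Example: 10 frames per emission cycle and 5 phases -> frame 7 falls in phase 3.
      assert frame_to_phase(7, frames_per_cycle=10, phases_per_cycle=5) == 3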
  • sensor array 508 further comprises an imaging optic 542 (e.g., a lens, image-directing film, liquid or variable lenses, other optically active material, or combinations thereof) to focus light onto the sensor 540 .
  • calibration techniques can be employed to correct for variations in components (e.g., deviations in lens optical characteristics, smudges, dust, manufacturing tolerances, burned out emitters and so forth), environment (e.g., temperature, ambient light, vibration, and so forth), usage (e.g., in home, underwater, outdoors, factory environment, highway, and so forth), and/or configurations (flat, 3D, coplanar, non-coplanar, circular, and so forth) of sensor array 508 .
  • incoming light approaching sensor array 508 at the angle β relative to the optical axis 544 of the lens 542 can be mapped sub-optimally to the pixels of the sensor 540 due in part to such variations.
  • nominal parameters of the lens shape and position and orientation relative to the sensor can be used to compute a coarse mapping function, and subsequent calibration techniques can refine that function to account for deviations from the nominal parameters.
  • Various calibration methods are well-known to persons of skill in the art.
  • for example, a well-defined test pattern (e.g., a checkerboard pattern) can be imaged to determine the actual mapping between incident light directions and sensor pixels.
  • calibration information can be stored in a library, look-up table or other suitable storage mechanism for data for use by the sensor-data processing facility 512 .
  • sensor array 508 comprises a micro-electro-mechanical system (MEMS) camera.
  • MEMS camera can scan (e.g., by raster-scan, or other scan pattern) the region of interest using a scanning mirror in conjunction with a single photodiode (or other light-sensing device).
  • Light striking the scanning mirror of the MEMS camera from different directions is captured at different times during a scan.
  • the scanning mirror directs the captured light to the photodiode which converts the light into a signal representing the region scanned.
  • analysis of the signal can provide blobs corresponding to objects in the region of interest captured by the system.
  • the scanning mirror is controlled based at least in part upon control information provided by a secondary imaging or sensory system.
  • a first imaging pass can identify one or more portions of the region of interest to which the scanning mirror is then directed to conduct a more refined scan.
  • other image recognition techniques in addition to, or instead of, blob tracking are readily provided for by embodiments.
  • the MEMS camera scans the region of interest at a rate much faster than the light source changes state, e.g., once, twice and so forth for each discrete direction of the illuminating emission, thereby enabling the scan to provide substantially the same information as a complete image frame acquired by a conventional camera.
  • location information for an object can be determined in part based on the state (e.g., temporally changing direction, composition, and so forth) of the light that strikes the object (or the particular surface region) and produces a reflection that can be detected by the sensor array 508 .
  • an object 510 is illuminated most directly by light from light emitter 526 at phase φ 4 (see FIG. 5B ) within the cycle, but—due to its location within the region 504 —does not receive as large a quantity of illumination from the other light emitters 520 , 522 , 524 , and 528 .
  • the measured time-dependent intensity at pixels that receive light reflected and/or scattered by the object 510 can exhibit a relative peak (corresponding to the reflection) as received at the sensor array 508 , as illustrated by the received signal's peak at φ 4 in FIG. 5C .
  • the emissions of the light emitter(s) can possess any of a variety of profiles, characteristics, and properties.
  • intensity distribution of the emission peaks along the primary direction and falls off to all sides, asymptotically reaching zero at large distances from the optical axis.
  • Light travelling in different primary directions can overlap spatially, including on the surface of the object that scatters light towards the sensor.
  • some or all of the sensor 540 can detect light originating from multiple emitters throughout an emission cycle. A reflecting object can appear brightest to the sensor 540 when the object receives maximal illumination.
  • FIG. 8B illustrates an embodiment in which the intensity of a reflection signal from the object 510 can vary as a function of time if a plurality of emissions spatially overlap such that the object 510 is not only illuminated by emitter 526 , but also by the neighboring emitters 524 , 528 .
  • a reflection peak may be determined, e.g., directly within the time-domain intensity signal captured by the sensor for each pixel.
  • the frequencies to which the sensor array 508 is sensitive can be tailored to the emission frequencies of the light source 502 , e.g., by selecting a sensor material that absorbs primarily in that range, or by equipping the sensor with a suitable color filter that strongly absorbs frequencies other than those emitted by the light source 502 .
  • the signal is filtered to extract signal components that occur with the scan frequency.
  • sensor array 508 comprises a conventional camera read out frame by frame to a general-purpose computer for image-processing
  • the computer can apply a discrete Fourier transform to corresponding pixels of a time series of image frames that spans multiple emission cycles (e.g., using any of the well-known fast Fourier transform (FFT) algorithms), resulting in a complex-valued frequency-domain intensity signal for each pixel.
  • the peak of this signal will occur at the scan frequency for all pixels, but the associated phase will vary between pixels, depending on the phase within the emission cycle at which the pixel receives illumination.
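  • the per-pixel frequency-domain analysis may be sketched as follows; this is an illustrative NumPy implementation (not the code of any embodiment) that assumes frames is a (num_frames, height, width) array spanning an integer number of emission cycles, with frame_rate and scan_rate given in Hz.

      import numpy as np

      def phase_at_scan_frequency(frames, frame_rate, scan_rate):
          spectrum = np.fft.rfft(frames, axis=0)           # per-pixel FFT over time
          freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / frame_rate)
          scan_bin = np.argmin(np.abs(freqs - scan_rate))  # bin nearest the scan frequency
          amplitude = np.abs(spectrum[scan_bin])           # reflection strength per pixel
          phase = np.angle(spectrum[scan_bin])             # phase within the emission cycle
          return amplitude, phase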
  • custom read-out circuitry can be employed, enabling the output signal of each pixel to be analyzed with an analog or digital phase detector, for example, that is driven by a reference signal whose frequency equals the scan frequency of the light source 502 ; the reference signal may be provided to the circuit, e.g., by the controller 506 of the light source.
  • Suitable phase detectors are well-known to those of skill in the art, and can be implemented without undue experimentation. If a type-II phase detector is used, the output is a constant voltage proportional to the phase difference between the reference signal and the measured signal (of the same frequency). More detail on phase-detector circuits is available in the literature, e.g., in Wolaver, Dan H.
  • discriminating between reflection signals and background light and/or identifying reflection peaks comprises some form of thresholding, (e.g., based on signal amplitude or intensity). Thresholding can be implemented in software, hardware, or a combination of both. In some embodiments, the detector is dormant until a reflection signal is detected; i.e., signal processing is triggered only upon detection of a threshold amount or intensity of light.
  • the illumination is emitted in different directions at different times (or phases) within an emission cycle. Accordingly, for a detected reflection signal, the corresponding direction of the illumination can be determined from the point in the scan pattern at which light is being emitted at the time of detection.
  • illumination can be emitted simultaneously in multiple directions from multiple sources, and the various emissions can be pulsed at different frequencies to facilitate discrimination between the different directions.
  • the light-emitting devices 520 , 522 , 524 , 526 , 528 can be operated continuously in a pulsed fashion, each at a different frequency; this is illustrated in FIG. 5E .
  • the reflection signal received at each pixel may be Fourier-transformed to determine, based on the frequency of the signal, from which of the light-emitting devices 520 , 522 , 524 , 526 , 528 the detected light originates.
  • each pixel can be associated with a direction of the illumination and, based on the position of the pixel within the sensor array, the direction of the reflected light. For example, if the object 510 lies in the path of light from light-emitting device 526 , the Fourier-transformed signal, shown in FIG. 5F , would peak at the scan frequency of that device.
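  • one possible sketch of this frequency-based discrimination is shown below; the list emitter_freqs_hz of distinct pulse frequencies assigned to the light-emitting devices is a hypothetical input, and the sketch simply selects the emitter whose pulse frequency dominates the pixel's spectrum.

      import numpy as np

      def dominant_emitter(pixel_signal, frame_rate, emitter_freqs_hz):
          spectrum = np.abs(np.fft.rfft(pixel_signal))
          freqs = np.fft.rfftfreq(len(pixel_signal), d=1.0 / frame_rate)
          bins = [np.argmin(np.abs(freqs - f)) for f in emitter_freqs_hz]
          magnitudes = [spectrum[b] for b in bins]
          return int(np.argmax(magnitudes))  # index of the emitter illuminating this pixel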
  • discrimination between multiple illuminating emissions simultaneously emitted in different directions can be accomplished based on different light characteristics, such as color or wavelength, intensity, polarization, etc.
  • different color filters may be applied to the light captured by a camera to identify, based on the color(s) detected, from which emitters the light originates.
  • the depth of an object 510 (or a surface point thereof) within the monitored region 504 may be determined by triangulation from the direction of the illumination that strikes the object 510 and the direction of the reflected light that is captured by the sensor array 508 , the former being inferable from the time at which the light is received (e.g., the acquisition time, within the emission cycle, of the particular frame that captures the reflection) or, alternatively, the frequency of the detected signal, and the latter being inferable from the position of the object within the image (or the orientation of the scanning mirror in an embodiment with a MEMS camera). This is conceptually illustrated in FIG. 5A .
  • the light source 502 and sensor array 508 are arranged such that the diameter of the light source 502 and the camera lens 542 lie in the same plane.
  • Light emitter 526 emits an illuminating emission 550 at an angle α
  • the sensor array 508 receives a reflected emission 552 at an angle β , both relative to a normal to that plane.
  • the illuminating and reflected emissions 550 , 552 together with a line connecting the center C 1 of the light source 502 to the center C 2 of the lens 542 form a triangle, isolated for greater clarity in FIG. 5D .
  • the depth of the object 510 within the imaged region, i.e., the perpendicular distance y of the intersection point P of the light emissions 550 , 552 (which is the point on the surface of the object 510 that intercepts the illuminating emission 550 and is detected in the image as the origin of the reflected emission 552 ) from the baseline connecting C 1 and C 2 , can be computed from the angles α, β and the known length d of the baseline as y = d / (tan α + tan β).
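  • the relation above may be evaluated, for illustration only, as in the following sketch, which assumes that α (illumination) and β (reflection) are measured from the normal to the baseline C 1 -C 2 and are expressed in radians.

      import math

      def depth_from_angles(alpha, beta, baseline_d):
          """Perpendicular distance y of the intersection point P from the baseline."""
          return baseline_d / (math.tan(alpha) + math.tan(beta))

      # Example: a 0.1 m baseline with both angles at 45 degrees gives a depth of 0.05 m.
      print(depth_from_angles(math.radians(45), math.radians(45), 0.1))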
  • the above analysis can be repeated for each such reflection to determine associated depths of the object(s).
  • a depth value is determined for each pixel in the image.
  • the light-emitting devices 520 , 522 , 524 , 526 , 528 are advantageously arranged such that the primary directions of their respective emissions all intersect at the center C 1 of the light source 502 , allowing the above formula to be applied straightforwardly to light emitted from any of the devices 520 , 522 , 524 , 526 , 528 .
  • the same advantage can be achieved if the light-emitting devices 520 , 522 , 524 , 526 , 528 are arranged on the facets of a regular polygon 600 (or a portion thereof), as shown in FIG.
  • in FIG. 6A , a planar arrangement of light emitters 610 in combination with individual light-deflecting optics 612 (e.g., lenses) for each emitter 610 can be configured, by choosing suitable deflection angles via the orientations of the optics, to effect a common intersection point.
  • the light emissions in different directions need not be uniformly distributed over a complete half-circle, but may have varying angular spacing and/or cover an arc (i.e., a portion of a circle) that extends over more or less than 180°, depending on the desired angular extent of the scan field.
  • FIG. 6C shows, for example, an arrangement of light emitters 620 on an arc segment 622 spanning about 120°; the base of the arc is advantageously displaced from the plane of the camera lens 642 such that the center C 1 of the circle falls in that plane.
  • Depth information can, in principle, be derived for any arrangement of light-emitting devices as long as their positions and orientations relative to each other and relative to the camera are known; the particular arrangement may, however, affect the computational complexity (e.g., in terms of the number of equations to be solved) of the depth determination.
  • a defined, fixed geometric relationship between the light-emitting devices of the light source can be achieved by integrating them with, mounting them on, or otherwise attaching them to a common rigid structure, e.g., made of plastic, metal, or any other suitable rigid material.
  • a fixed geometric relationship also exists between the light source and the camera (or other light detector).
  • the light source 700 and detector 702 may be mounted together in a single unit 704 , and may receive power, and/or communicate with other devices (such as the image-processing facility 512 ), via cables 706 or wirelessly.
  • the unit 704 may be of any size, shape, or material; in various embodiments, it is integrated with another device, such as a television, automobile, or computer.
  • FIG. 7B shows another implementation, in which the light source 700 and detector 702 are maintained as separate units (one or both of which may be integrated into another device); the two units may be located at fixed, known spatial positions so that they, nonetheless, bear a fixed geometric relationship.
  • the light source and light detector are movable relative to each other, either arbitrarily or consistently with one or more well-defined degrees of freedom.
  • the light source and detector may be integrated into a flexible or bendable pad or other structure, which fixes their distance along the surface of that structure, but allows their relative orientation and free-space distance to change.
  • the information deficit arising from such unknown parameters of the geometric arrangement may be cured by using multiple sweeping light sources (i.e., multiple directed light sources with variable emission directions) and/or multiple detectors that, collectively, enable solving for the unknown parameters along with the desired depth information associated with the monitored region and/or objects.
  • Embodiments include: (1) a photo-detector embedded in a device with emitters; (2) a photo-detector embedded in a second device with emitters directed to a photo-detector embedded in a first device; (3) a photo-detector in a separate unit from all emitters; (4) a photo-detector fixed to an object being tracked; (5) photo-detector(s) and emitter(s) implemented together as VCSELs (vertical-cavity surface-emitting lasers), such as, for example, the SPIE Vol. 4652 chip-scale integration by Vixar, a company located at 2950 Xenium Lane, Suite 104, Plymouth, Minn.; and (6) other configurations including combinations of the described embodiments.
  • Example layouts of emitters are illustrated by FIGS. 11E-11N and the accompanying description herein below. In further embodiments, described in more detail below, one of the light source or the detector is affixed to the object to be tracked.
  • the depth resolution that can be achieved with systems in accordance herewith can vary with the angular resolution of the scan.
  • the angular resolution relates directly to the angular spacing between the illuminating emissions generated by adjacent light-emitting devices.
  • the angular resolution of the scan is effectively increased by exploiting spatial overlap between adjacent emissions to interpolate between their two primary directions.
  • the intensity reaches its maximum when emitter 524 is turned on; the object 800 may, in this configuration, still receive and reflect light from emitter 526 , but at a much lower intensity; this is shown in FIG. 8D .
  • the relative intensities of reflection signals resulting from the operation of the different light emitters can be used to determine the angular position of the object 800 relative to the light source with a precision that exceeds the angular resolution of the primary emission directions.
  • the individual light-emitting devices are operated such that their respective emissions overlap not only spatially, but also temporally.
  • the light emitters may be turned on and off gradually (e.g., by varying their respective drive currents along a continuum), with each emitter being activated before its predecessor in the cycle is completely extinguished to provide a “cross-fade”; an exemplary control scheme is illustrated in FIG. 8E .
  • the spatiotemporal overlap in activation of light emitters results in a continuous intensity distribution. Depending on the degree of temporal overlap, this intensity distribution can have a single maximum 810 , as shown in FIG.
  • the time-varying intensity can be analyzed to ascertain not only the global maximum 814 , but also (or alternatively) the two local maxima 812 or minima 816 flanking it, which can increase the accuracy of the maximum determination.
  • Techniques for determining maxima and minima vary widely, but in one embodiment, a derivative of the intensity vs. time (or distance) can be computed at points in time. When the derivative is zero, a maximum or minimum is indicated.
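  • a minimal sketch of such a derivative-based search is given below; it locates extrema via sign changes of the discrete derivative and adds parabolic interpolation between samples, a common refinement that is not mandated by the embodiments described here.

      import numpy as np

      def refined_extrema(intensity):
          """Return fractional sample indices of maxima/minima of a sampled signal."""
          d = np.diff(intensity)
          extrema = []
          for i in range(1, len(d)):
              if d[i - 1] == 0 or d[i - 1] * d[i] < 0:      # derivative crosses zero
                  y0, y1, y2 = intensity[i - 1], intensity[i], intensity[i + 1]
                  denom = y0 - 2 * y1 + y2
                  offset = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
                  extrema.append(i + offset)                 # sub-sample extremum location
          return extrema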
  • the temporal resolution of read-out exceeds the temporal resolution corresponding to the different emission directions, e.g., multiple frames are acquired for each discrete light pulse.
  • any asymmetry in the signal may be exploited to increase the angular resolution of the scan in a similar fashion as explained above with respect to FIGS. 8A-8D .
  • the temporal overlap between two adjacent light emitters may result, depending on the extent of spatial overlap between their emissions, in a bimodal angular distribution of the emitted light with varying relative magnitudes of the two peaks (which lie along the primary directions), or in a single peak that shifts from the primary direction of the first emission to the primary direction of the second emission. If the light emitters are turned on in the order in which they are arranged (e.g., from left to right or from right to left) they effectively create a continuous sweep, as the angle at which the intensity peaks varies continuously with time from the first emitter to the last emitter.
  • filters can be employed in conjunction with emitters of different characteristics (e.g., frequency, wavelength, polarization, phase, angular frequency, etc.) to provide improved discrimination between overlapping firing emitters.
  • FIG. 9A illustrates an embodiment in which a continuously sweeping light emission is created by a light source 900 that has two light emitters 902 , 904 .
  • the two emitters 902 , 904 are controlled such that the sum of their central emission intensities is constant, but the intensity of each individual emitter's output varies between that constant value and zero; the intensities as functions of time may, e.g., be the squares of a cosine and a sine, respectively, as shown in FIG. 9B .
  • this control scheme can produce an angular intensity distribution with a single maximum that oscillates back and forth between the two primary directions.
  • the direction where the maximum lies can be determined straightforwardly by setting the derivative of the angular intensity distribution with respect to the angle to zero and solving for the angle.
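  • the cross-fade behavior can be illustrated numerically as in the sketch below; the Gaussian angular profiles, the ±10° primary directions, and the 12° width are assumptions made purely for illustration, with the drive intensities following cos² and sin² of the phase so that their sum remains constant.

      import numpy as np

      def effective_direction(phase, theta1=-10.0, theta2=10.0, sigma=12.0):
          angles = np.linspace(-30.0, 30.0, 601)             # candidate directions, degrees
          w1, w2 = np.cos(phase) ** 2, np.sin(phase) ** 2    # constant-sum drive intensities
          profile = (w1 * np.exp(-(angles - theta1) ** 2 / (2 * sigma ** 2)) +
                     w2 * np.exp(-(angles - theta2) ** 2 / (2 * sigma ** 2)))
          return angles[np.argmax(profile)]                  # effective primary direction

      # With sufficient spatial overlap, sweeping the phase from 0 to pi/2 moves the
      # aggregate maximum continuously from theta1 toward theta2.
      print([round(effective_direction(p), 1) for p in np.linspace(0, np.pi / 2, 5)])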
  • the relationship between the effective aggregate primary direction and the time or phase within the emission cycle can be theoretically determined for any number of spatio-temporally overlapping light emissions with discrete individual primary directions.
  • the system may be calibrated to empirically determine the functional relationship between effective primary direction and time/phase; the calibration may involve placing an object sequentially at multiple known angles relative to the light source, acquiring frames during one or more emission cycles for each of these locations, and identifying the phase within each cycle that causes a detectable reflection signal.
  • each light emitter may step through a series of discrete illumination settings.
  • the illumination of objects within the region of interest by different emitters can overlap in time (“cross-fade”) such that, while one light emitter is still illuminated, the next one may begin illumination (e.g., at a low dimming setting).
  • three or more light emitters are illuminated concurrently at different settings, but, in other embodiments, only one light emitter is configured at the maximum setting at any given time.
  • Some embodiments can vary overall angular resolution and/or accuracy achieved by the system 100 , at least in part, by varying the number of discrete dimming levels.
  • a continuous sweep is achieved with a light source that includes a rotating or otherwise moving light-emitting device.
  • the device 1000 may, for example, be mounted on a support structure 1002 (e.g., a bar or wheel) that rotates continuously in the same direction ( FIG. 10A ), or back and forth between two angular boundaries ( FIG. 10B ).
  • as shown in FIGS. 10A and 10B , the rotating light source 1004 may be combined with two or more cameras 1006 , 1008 facing in opposite directions to facilitate a full 360° scan of the space surrounding the light source 1004 and cameras 1006 , 1008 .
  • two light-emitting devices 1010 , 1012 may be mounted on opposite ends of the support structure 1002 to continuously scan the half space imaged by a camera 1014 while the light source undergoes a full 360° rotation.
  • moving light sources can generally be implemented with one, two, or more light-emitting devices in a variety of suitable configurations.
  • a moving light source may utilize a moving deflecting optic in conjunction with a stationary light emitter.
  • the deflecting optic is a lens 1020 , a prism 1022 , or a semicircular slab 1024 with a high index of refraction.
  • the slab 1024 may be arranged relative to the light-emitting device 1026 such that the light enters the slab 1024 through its curved boundary and exits it always at the center point 1028 . This configuration results in a scanning that originates from the same point, simplifying the computation of depth values. Computational simplicity is also aided, in various embodiments, by utilizing a narrow, highly collimated light source, e.g., a laser diode.
  • FIG. 10G shows an exemplary embodiment.
  • the screen 1030 transmits light through a hole, slot, or other perforation 1032 while blocking light transmission in other directions, thereby “cutting” a narrower window out of the light from the emitter.
  • This light emission moves (e.g., rotates) as the perforation moves along with the screen; thus, the light emitter and moving screen together form a moving light source.
  • Additional embodiments of moving light sources may occur to people of skill in the art, and are to be considered within the scope of the disclosed technology.
  • the angular resolution of the scan depends on the temporal resolution of image acquisition, i.e., the rate at which information is read from the camera.
  • the frame rate can often be increased at the cost of decreasing the spatial resolution; for example, by reading out only every other pixel, the frame rate can be doubled.
  • the pixel-read-out density and frame rate are set such that the lateral and depth resolution achieved by the system are approximately the same.
  • in embodiments employing low or moderate angular resolutions of the scan, and having short or medium distances between the object of interest and the light source and detector, the emission of light in a certain primary direction and the detection of its reflection by the detector can be considered to occur practically simultaneously, i.e., before the direction changes and/or within the same frame. Consequently, the travel time, or time of flight, of light from the source to the object and then to the detector can be neglected in these cases. For example, if the object is no more than one hundred meters away from the light source and detector, the time of flight is less than a microsecond.
  • the reflection can, for practical purposes, be considered as received instantaneously.
  • the time of flight can be taken into consideration to accurately determine the emission direction that corresponds to a particular reflection signal from the time of receipt of that signal. For instance, if the emission direction changes, and a new image frame is acquired, every nanosecond, a total travel distance of only ten meters results in a shift of 33 frames between emission of the light and capture of the reflection.
  • a reflection received by the detector should be associated with the emission direction 33 frames (or 33 ns) prior.
  • a synching event (e.g., pause in scan, burst of energy in scan by activating all, most, many emitters contemporaneously, etc.) can be used to synch up emitter scan cycle and detector scan cycle.
  • a synching event includes a time delay between cycles having a time period of sufficient duration that any emitted energy will have returned.
  • the time period needed for energy emitted to return is typically not known a priori in embodiments in which the distance of the object from the light source and detector is the very parameter to be measured.
  • this problem is addressed by determining the distance iteratively: in a first iteration, the position of the object, and thus its distance from the source and detector, may be determined based on the assumption that the time of flight is zero. The distance thus determined is then used to compute a round-trip travel time adjustment, determine a corrected emission direction, and re-compute the object position. This process may be repeated for a fixed number of iterations, or until a positional accuracy is achieved or some other convergence criterion is satisfied; in some embodiments, one or two iterations suffice for the desired accuracy of the positional measurement.
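  • the iterative correction may be sketched as below; direction_for_time() (the emission direction active at a given time) and triangulate() (position and distance from the emission and reflection directions) are hypothetical placeholders for the embodiment's own routines.

      C = 299_792_458.0  # speed of light in m/s

      def locate_with_tof(receipt_time, reflection_direction,
                          direction_for_time, triangulate, iterations=2):
          travel_time = 0.0                               # first pass: assume zero time of flight
          position = None
          for _ in range(iterations):
              emission_time = receipt_time - travel_time  # when the detected light actually left
              emission_direction = direction_for_time(emission_time)
              position, distance = triangulate(emission_direction, reflection_direction)
              travel_time = 2.0 * distance / C            # approximate round trip source-object-detector
          return position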
  • while FIG. 5A implies that the light source 502 , sensor array 508 , and object 510 all lie in the same plane, e.g., the horizontal plane, this need not necessarily be the case, as shown by FIG. 15 herein below.
  • the object 510 may lie above or below the horizontal plane, and may nonetheless be illuminated by the light emissions during a horizontal sweep of that plane due to the inherent vertical extent of the light emitted.
  • the sensor array 508 may capture a two-dimensional image of the region 504 , mapping all light incident upon the detector under a particular angle relative to the horizontal scan plane to the same row of the image sensor. For example, a reflection signal received from an object location within the scan plane may strike a sensor row within that same plane, whereas a reflection received from an object location below the scan plane may strike the sensor pixels in a row above the plane, and vice versa. From the horizontal position of the reflection in the data frames (i.e., the column of the image sensor) and the direction of the illumination corresponding to this reflection, the depth of the object can be calculated, e.g., as explained with respect to FIG. 5A . The depth value in conjunction with the two-dimensional location of the reflection in the image, in turn, allows computing the location of the reflection point on the surface of the object in a plane parallel to the image plane at the determined depth. Thus, the three-dimensional position of the object is computed.
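  • for illustration, the final back-projection might look like the following sketch, which assumes a simple pinhole model with a focal length and principal point (cx, cy) obtained from calibration, and treats the computed depth as measured along the camera's optical axis.

      def pixel_to_3d(col, row, depth, focal_length_px, cx, cy):
          """Recover a 3-D surface point from a pixel location and its depth."""
          x = (col - cx) * depth / focal_length_px   # lateral offset within the scan plane
          y = (row - cy) * depth / focal_length_px   # vertical offset from the scan plane
          z = depth
          return (x, y, z)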
  • an intensity threshold may be used to define the emission diameter as the full width of the emission portion that covers intensities down to the threshold value (e.g., down to half the maximum intensity).
  • embodiments can provide the ability to accommodate situations wherein the emission intensity falls off on one or more sides of a “peak” defining a horizontal scan plane (e.g., according to a Gaussian or similar vertical intensity distribution). The light practically sweeps a three-dimensional slice of a thickness corresponding to that diameter.
  • the light source 502 may be equipped with means for deliberately increasing the vertical emission diameter and/or divergence, such as a cylindrical lens or other diverging optic, and/or with means for narrowing the horizontal divergence without affecting the vertical divergence, such as a vertical slot.
  • the monitored region is expanded, or the accuracy of positional determinations therein is increased, by utilizing multiple intersecting or parallel scan planes, or by scanning a three-dimensional spatial volume with a beam whose primary direction can be varied both azimuthally and attitudinally.
  • FIG. 11A shows a light source that includes two mutually perpendicular linear arrangements 1100 , 1102 of discrete light-emitting devices 1104 , which may be positioned on a planar or curved surface.
  • the rows correspond to the equator and meridian, respectively, of a semispherical surface 1106 , as shown in FIG. 11B .
  • FIG. 11C illustrates another embodiment, in which discrete light-emitting devices 1104 are distributed uniformly over a semispherical surface 1106 (or as smaller or larger surface portion of a sphere).
  • the light-emitting devices 1120 are mounted on the faces of a regular polyhedron 1110 .
  • the light sources shown in FIGS. 11C and 11D are three-dimensional generalizations of the light sources depicted in FIGS. 4A , 5A and 6A .
  • FIGS. 11E and 11F illustrate sources 1104 comprising two or more lighting elements 1104 a disposed on surface 1112 along with photo-detectors 1103 .
  • Surface 1112 can be flat, curved, or a combination of multiple surface portions.
  • photo-detectors 1103 receive reflectance of emissions of energy emitted by sources 1104 co-located on surface 1112 as described in further detail with reference to FIGS. 4A-1 , 4A-2 , and 13A .
  • more than one “scanner” unit comprising surface 1112 , sources 1104 , and photo-detectors 1103 can be implemented together such that photo-detectors 1103 of a first surface 1112 receive reflectance of emissions from sources 1104 of a second surface 1112 and vice versa.
  • source 1104 includes a plurality of photo-emitting elements which can be individually activated to emit electromagnetic energy in various directions sequentially, randomly or according to a pattern.
  • the “firing order” indicated by the numerals assigned to photo-emitting elements 1104 a is illustrative of one exemplary firing order used in some embodiments.
  • FIGS. 11G-11H illustrate different arrangements of sensors and emitters configured on a single surface 1112 .
  • FIGS. 11I-11J illustrate sensor-and-emitter pair configurations on a single surface 1112 .
  • FIGS. 11K-11N illustrate a variety of other configurations of sensors and emitters on a single surface 1112 .
  • moving light sources as described, e.g., with respect to FIGS.
  • a light-emitting device may be mounted on a sphere with two rotational degrees of freedom, or combined with a lens that can rotate about two mutually perpendicular axes lying in the plane of the lens (i.e., perpendicular to its optical axis) and intersecting at its center.
  • Some embodiments can determine depth within approximately millimeters or micrometers of accuracy; other embodiments may determine positional information within approximately centimeters of accuracy.
  • the disclosed technology is, however, not limited to any particular accuracy (or other performance parameter), and certain attributes of the disclosed technology may be adjusted to increase or decrease accuracy and/or performance, e.g., as applications of the disclosed technology require. For example, in embodiments that utilize light sources with multiple light-emitting devices generating emissions in a finite number of discrete directions, a coarse scan can be achieved by skipping some light sources when situations call for less accuracy, and a more accurate, fine-grained scan can be achieved by operating the light source with a larger number of light-emitting devices.
  • a coarse scan of an entire field of view is conducted to locate one or more objects, and then followed by a comparatively fine-grained scan limited to a portion of the field of view in which the object has been located.
  • the fine-grained scan may serve to identify detailed features of the object(s), thereby enabling different objects (e.g., hands of different human users, different pets walking across the field of view, etc.) to be distinguished.
  • FIGS. 12A and 12B summarize two methods in accordance herewith, which correspond to two respective classes of applications.
  • FIG. 12A shows a method 1200 that can be used to determine depth values for all pixels of an image capturing a region of interest.
  • the method 1200 involves repeatedly sweeping the region of interest with an emission (action 1202 ), and capturing one-dimensional or, more typically, two-dimensional images of the region at a rate that resolves the different directions of the emission at the desired accuracy (action 1204 ).
  • (Capturing images herein means that light from different locations within the region corresponding to different directions of incidence on the image sensor is resolved in some manner (e.g., spatially with a pixelated sensor, or temporally via a raster scan), regardless of whether the image data is read out, stored, analyzed, and/or computationally represented frame by frame or, e.g., pixel by pixel.) Further, the method includes determining, for each pixel, the maximum intensity of the detected reflection during an emission cycle and the corresponding phase within the cycle (action 1206 ) (which may, in some cases, be the only phase for which a reflection signal is detected at all).
  • this is done separately for every emission cycle so that any movements of objects within the region are detected at the scan rate; in other embodiments, the signals for multiple successive emission cycles are averaged to improve accuracy at the cost of responsiveness.
  • the direction of the emission to which this peak is attributable is determined for each pixel (action 1208 ), and based on the angles of the illuminating and reflecting emissions, the depth is computed (e.g., as explained with respect to FIG. 5C ) and associated with the respective pixel (action 1210 ). This way, a complete depth profile of the image is obtained.
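  • actions 1206-1210 can be sketched, for one emission cycle, roughly as follows; cycle_frames is assumed to have shape (phases, height, width), and phase_to_alpha() and pixel_to_beta() stand in for the embodiment's own calibration lookups mapping a phase to an illumination angle and a pixel to a reflection angle, both in radians.

      import numpy as np

      def depth_map_for_cycle(cycle_frames, phase_to_alpha, pixel_to_beta, baseline_d):
          peak_phase = np.argmax(cycle_frames, axis=0)      # action 1206: phase of maximum reflection
          height, width = peak_phase.shape
          depth = np.zeros((height, width))
          for r in range(height):
              for c in range(width):
                  alpha = phase_to_alpha(peak_phase[r, c])  # action 1208: illuminating direction
                  beta = pixel_to_beta(r, c)                # direction of the reflected light
                  depth[r, c] = baseline_d / (np.tan(alpha) + np.tan(beta))  # action 1210
          return depth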
  • This may serve to measure not only the (e.g., three-dimensional) position of an object as a whole, or the positions of multiple objects (including objects bounding the monitored space, such as walls), but also to determine, e.g., the shape of an object's surface and/or any changes thereto over time.
  • the process may be repeated to track changes of the object location, shape, etc.
  • the object is a person's hand, whose position, orientation, and articulation are tracked, e.g., for purposes of gesture recognition.
  • FIG. 12B illustrates a method that may be more appropriate for tracking the position(s) of one or more objects within a spatial region that is large compared to the dimensions of the object(s).
  • This method 1220 likewise includes scanning the entire region repeatedly with a light beam (action 1222 ) and acquiring images of the region (action 1224 ).
  • the method 1220 deviates in the analysis of the image data. Rather than determining depths for all pixels, it involves identifying object(s) of interest in the (one- or two-dimensional) camera images ( 1226 ), and computing depth values only for one or more pixels of the identified object(s) (actions 1228 , 1230 , 1232 ).
  • the object(s) may be identified in the images in any of a number of ways, e.g., using conventional edge-detection or patch-detection techniques, template matching against a database of object images, and/or foreground/background discrimination based on the intensity of the measured reflection integrated over an entire emission cycle (which will generally be greater for objects in the foreground, due to the decrease in intensity with the square of the distance).
  • depth may then be determined using the same procedure as described above for method 1200 , i.e., by determining of the phase within the emission cycle for which the reflection intensity is maximized (action 1228 ), inferring the corresponding illuminating-beam direction therefrom (action 1230 ), and computing the depth from the angles of the illuminating and reflected light (action 1232 ).
  • this method 1220 reduces the computational resources required to process the images.
  • certain embodiments described heretofore utilize a camera that provides one- or two-dimensional frames imaging light reflected from objects in a region of interest, which can be supplemented with depth information employing the methods described above.
  • certain alternative embodiments do not require the use of a multi-element camera, but can localize objects using only a single photo-sensitive element (or multiple individual elements in different locations) in conjunction with a directed light source with variable beam direction.
  • the light source may scan a spatial region according to a suitable scan pattern. For instance, the region may be divided into a number of subregions (e.g., four quadrants), which are then sequentially illuminated. When the illuminating light strikes an object, a reflection signal is detected by the light-sensitive element.
  • the time of receipt of the reflection signal allows an inference of the direction of the illuminating emission and, thus, of the subregion in which the reflecting object is located.
  • multiple illuminating emissions having different properties, e.g., different color may be used simultaneously and distinguished using suitable filters with the detector(s).
  • Multiple light sources and detectors may be used to scan the region of interest along different planes so as to provide three-dimensional information. For example, a cubic volume may be scanned along two perpendicular planes, each divided into four quadrants. Localizing an object in one of the four quadrants for both scans allows identifying in which one of eight sub-cubes the object is located.
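  • a toy sketch of combining the two quadrant measurements into a sub-cube index is given below; the bit encoding of the quadrant indices is an assumption chosen only to make the example concrete.

      def locate_subcube(quadrant_plane_a, quadrant_plane_b):
          """Combine quadrant indices (0-3) from two perpendicular scan planes.

          Assumed encoding: bit 0 of each quadrant index is the half-space along the
          axis shared by both planes, bit 1 the half-space along that plane's own axis.
          """
          shared_bit = quadrant_plane_a & 1          # should agree with quadrant_plane_b & 1
          second_bit_a = (quadrant_plane_a >> 1) & 1
          second_bit_b = (quadrant_plane_b >> 1) & 1
          return shared_bit | (second_bit_a << 1) | (second_bit_b << 2)  # sub-cube 0-7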
  • the spatial resolution of object localization achieved in this manner corresponds directly to the spatial resolution of the scan pattern; the more subregions there are, the more precise is the determined object location.
  • when multiple light sources are used, the light emanating therefrom may be distinguished based on its spectral properties (color, frequency, wavelength), intensity, polarization, the (angular) frequency or repetition rate of the scan pattern, the frequency of a binary flash pattern, a temporal intensity modulation pattern, or some other optical property or pattern. For example, if two light sources scan the region of interest periodically, but each at its own repetition rate, a detector may Fourier transform the overall signal it receives to identify peaks at two frequencies corresponding to the two repetition rates. Similarly, if two light sources flash at different frequencies, a Fourier transform and/or suitable frequency filters facilitate discriminating between them and/or extracting a signal corresponding to one or the other light source.
  • Staged filtering allows first extracting the signal from a particular light source and then (following demodulation) extracting a signal at the scan frequency (i.e., repetition rate).
  • the detectors may be equipped with suitable spectral or polarization filters to differentiate between signals from the different sources.
  • An exemplary embodiment is illustrated in FIG. 13A , where an object 1300 may intersect light emitted from two light sources 1302 , 1304 that produce light having different characteristics (say, emit light of different colors).
  • a number of light detectors (which may be individual photosensitive elements, or combinations thereof such as cameras) are placed in various locations to capture reflections off the object, each detector being adapted, e.g., via a suitable filter, to receive light only from one of the light sources 1302 , 1304 .
  • detector/filter combination 1306 captures light originating from light source 1302
  • detector/filter combinations 1308 , 1309 capture light originating from light source 1304 .
  • a particular light source may have one or more associated detectors for receiving light from that source.
  • multiple light sources that emit light with the same properties may share one or more associated detectors.
  • the embodiments described above all take advantage of diffuse reflections of the emission, i.e., of light scattered in all directions. This allows the camera or other sensor to capture reflections off any object within its field of view, regardless of the angle of incidence on the object surface and the angle relative to the surface normal under which the detector appears from the point of incidence (which is the angle of reflection for the portion of the reflected light that is captured).
  • specular reflection is used instead.
  • an object of interest may be specially equipped with a reflector (i.e., mirror), or the surface properties of the object may inherently cause specular reflection (e.g., for smooth, metallic surfaces).
  • a plurality of detectors may be employed, e.g., distributed over a bounding surface (e.g., defined by one or more walls) of the region of interest.
  • multiple light sources may be used, e.g., to scan the region of interest along different planes or in different directions.
  • the light sources may differ in their light and scanning properties (e.g., color, polarization, angular frequency of the scan, etc. as described above) to enable the detectors to distinguish between signals originating from different light sources.
  • the object of interest may be equipped with a retro-reflector, i.e., a reflector that reflects light back into the direction it came from, with minimal scattering.
  • retro-reflectors are well-known to those of skill in the art; one example is a corner reflector, which includes three mutually perpendicular mirror surfaces arranged like three interior walls of a cube that have a common corner point.
  • the reflected light can be captured by a light detector that is co-located with the light source.
  • the detector is co-located with the light source if it is sufficiently close to the light source to intercept the reflected light, i.e., if its distance from the optical axis is smaller than the beam-profile radius of the reflected beam at the light source.
  • (the beam radius may be defined, e.g., based on the maximum allowable intensity fall-off from the central peak intensity, and this maximum allowable fall-off may, in turn, depend on the sensitivity of the detector.)
  • the allowable distance of the detector from the light source increases with the divergence of the beam and with the distance of the object of interest.
  • the light detector may be a single photodiode, phototransistor, photovoltaic cell, photoresistor, or other kind of photo-sensing device, or an array or irregular arrangement of multiple such devices.
  • multiple light-sensing cells are arranged in a ring around the light source to capture a large portion of the reflected light.
  • the sensor since the reflection of interest comes, at any point in time, only from one direction (which is the direction of the illuminating beam), the sensor need not resolve different directions of incoming light; consequently, there is no need for a focusing optic or for a pixel-wise read-out of the sensor.
  • where the detector comprises more than one light sensor, the detected light can be integrated over all of the sensors to yield an aggregate signal. (Of course, this does not exclude the use of a regular camera with a pixelated sensor and lens.)
  • FIG. 13B illustrates how a light source 1310 and co-located detector 1312 may be used to track a retro-reflecting object 1314 .
  • the light source 1310 generates a beam 1316 of variable direction to sweep a region of interest 1318 , which is, in this case, a region where the object of interest 1314 is generally expected.
  • the variable beam direction may be achieved with multiple cyclically operated light-emitting devices with fixed beam directions, with one or more moving light emitters, with a fixed light emitter and moving deflector, or generally with any of the light sources described above.
  • the light source 1310 may be operated to repeatedly scan the entire region 1318 , preferably at a rate that is fast compared to the rate of motion of the object 1314 .
  • the light source may scan only a subregion around the previously measured location, or a projected expected location, of the object 1314 .
  • the object 1314 has a retro-reflector 1320 (or multiple retro-reflectors 1320 , in which case orientation information can be discerned as well) integrated therewith or attached thereto such that, whenever the beam 1316 strikes the retro-reflector 1320 , a reflection signal is sent back to and measured by the detector 1312 .
  • Control and detector circuitry 1322 (implemented in a single device or multiple intercommunicating devices) associated with the light source 1310 and the detector 1312 correlates the measured signal with the direction of the illuminating beam 1316 from which it originates.
  • the circuitry 1322 may, for example, include a controller for operating the light source 1310 and an analog phase-detecting circuit that receives a clock signal of the scan frequency as a reference, and measures the timing of the detected reflection relative thereto.
  • the direction, measured from the light source 1310 and detector 1312 , at which the object 1314 occurs can be determined as a function of time.
  • if the object is constrained to move along a known trajectory, the intersection of the measured direction with that trajectory uniquely determines the position of the object.
  • additional information may be obtained from a second light-source/detector pair, hereinafter referred to as a “scanner.”
  • the position of an object 1330 that moves in two dimensions may be determined from two two-dimensional scans simultaneously performed by two scanners 1332 , 1334 .
  • Each of the scanners 1332 , 1334 determines a line along which the object is located during a given emission cycle.
  • the intersection 1336 of the two lines specifies the position of the object 1330 .
  • each line corresponds to the primary direction of the beam that strikes the object 1330
  • the two beams need not strike the object 1330 at the same time, and consequently need not, and typically do not, intersect each other at the current object location.
  • the two scanners 1332 , 1334 may operate independently from each other, and even at different scan frequencies, as long as they track the object relative to a common spatio-temporal reference frame.
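  • the intersection computation for FIG. 13C can be sketched as follows; each scanner is assumed to be at a known two-dimensional position and to report a bearing angle toward the object relative to a shared reference direction, with angles in radians and positions in a common frame.

      import math

      def intersect_bearings(p1, theta1, p2, theta2):
          d1 = (math.cos(theta1), math.sin(theta1))    # unit direction measured by scanner 1
          d2 = (math.cos(theta2), math.sin(theta2))    # unit direction measured by scanner 2
          denom = d1[0] * d2[1] - d1[1] * d2[0]        # 2-D cross product; zero if rays are parallel
          if abs(denom) < 1e-12:
              return None
          dx, dy = p2[0] - p1[0], p2[1] - p1[1]
          t = (dx * d2[1] - dy * d2[0]) / denom        # distance along the first ray
          return (p1[0] + t * d1[0], p1[1] + t * d1[1])

      # Example: scanners at (0, 0) and (1, 0) measuring 45° and 135° place the object at (0.5, 0.5).
      print(intersect_bearings((0, 0), math.radians(45), (1, 0), math.radians(135)))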
  • the embodiment illustrated in FIGS. 13B and 13C can straightforwardly be extended to more than two scanners and/or three-dimensional scan patterns, e.g., to take redundant measurements for improved accuracy and/or to allow tracking an object in three dimensions.
  • Object-tracking methods that utilize specular reflection in accordance herewith may be used, e.g., in various industrial contexts. For instance, they may serve to track the positions of robots moving across a manufacturing floor or a distribution and mailing center.
  • One or more scanners may be mounted, e.g., onto the walls or ceiling, to sweep a light beam thereacross and capture reflections from retro-reflectors affixed to the robots.
  • the roles of the scanner and the retro-reflector are reversed, and one or more retro-reflectors are fixed in space to serve as reference locations for a moving object equipped with a scanner.
  • three retro-reflectors 1340 at known locations suffice to uniquely determine the location and/or orientation of the object 1342 from measurements of the three angles γ 1 , γ 2 , γ 3 enclosed between the lines of sight connecting the object 1342 to respective pairs of retro-reflectors 1340 , as illustrated in FIG. 13D .
  • This configuration may find its application, for example, in the triangulation of mobile device positions.
  • An inexpensive scanner may be integrated into a cell phone or other mobile device, and three or more retro-reflectors may be mounted, e.g., on buildings, cell towers, or other permanent structures.
  • the scanner may continuously sweep its surroundings, preferably with light outside the visible range (e.g., IR light). Based on the reflection signals received from three (or more) retro-reflectors at known locations, the position of the mobile device can be computed.
  • the reflectors may alter some property of the light, such as polarization, a spectral property (e.g., wavelength/frequency, or color), amplitude modulation, etc.
  • the different reflectors may have different physical properties (e.g., shape, size, color), reflective properties, surface characteristics (e.g., patterns), etc. to effect the different light properties.
  • the retro-reflectors are positioned at known, elevated heights. Assuming that the scanner is on or near the ground, this allows a determination of the distance between the scanner and retro-reflector based on the attitudinal (elevation) angle of the reflector as measured from the scanner. With directional and distance information available, the position of the mobile device can be inferred from only two reflection signals. Alternatively, the distances may be used to verify and/or increase the accuracy of the computed device location.
  • time of flight techniques or Doppler techniques can be used in conjunction with the techniques described herein.
  • the light source may be controlled to emit, for a number of emission cycles, only one pulse in one direction per emission cycle, allowing a determination of the delay between the time of emission and the time of receipt of any corresponding reflection. This procedure may be repeated for multiple beam directions until a reflection signal is detected.
  • time-of-flight measurements may be used to supplement the triangulation-based depth/distance determination. For example, if the camera is read out much faster than the illuminating beam changes direction, i.e., a sequence of multiple images is acquired for each pulse, the time delay between the emission of the pulse and the receipt of the reflection can be inferred readily from the number of the image within the sequence that captures the reflection.
  • the light emitters such as the one depicted in FIG.
  • the light source that emits directly towards an object is also the one closest to the object.
  • the minimum travel time (or phase difference between pulse emission and detection) among the pulses emitted during an emission cycle therefore generally coincides with the maximum intensity.
  • the relationship between the direction of illumination and brightness of the reflection may, in some cases, not hold, e.g., due to surface irregularities of the object that cause different reflectivity for light incident from different directions.
  • a travel-time-based determination of the direction of the illuminating beam along which the object is located may, in these scenarios, be more accurate or reliable than an intensity-based determination. Measurements of the Doppler shift of light (or other radiation) reflected by a moving object can provide additional information about the motion of the object, which may supplement information obtained with methods described herein.
  • the various embodiments described herein generally utilize some type of control and/or computational facility (hereinafter “computer”) for operating the light source and detector and analyzing the captured signals or images.
  • This computer may be provided in various form factors, and may be implemented in one or more dedicated, application-specific devices (e.g., a DSP or ASIC) designed or selected for use with the light source and/or detector, or integrated into another, stand-alone computing device (such as a personal computer, tablet computer, or a smart phone), depending on the application context.
  • some or all of the functionality of the computer is integrated into the light source (e.g., into the support structure onto which the light-emitting devices are mounted) and/or incorporated into or affixed to the detector.
  • the computer may include digital circuitry (e.g., a computer processor and memory) and/or analog circuitry (e.g., an analog phase detector).
  • FIG. 14 shows, in a simplified block diagram, an exemplary computer 1400 for determining a distance to an object in accordance with an embodiment of the disclosed technology.
  • the computer 1400 may include a processor 1402 , memory 1404 (e.g., RAM, ROM, and/or flash memory), one or more interfaces 1406 for the light source and detector, and/or one or more user input/output devices 1408 (e.g., a display (optionally touch-enabled), speakers, a keyboard, and/or a mouse), as well as one or more buses 1409 over which these components communicate.
  • the computer 1400 may also include other removable/non-removable, volatile/nonvolatile computer storage media, such as a solid-state or magnetic hard disk, an optical drive, flash memory, random-access memory, read-only memory, or any other similar type of storage medium.
  • the processor 1402 may be a general-purpose microprocessor, microcontroller, digital-signal processor, or any other type of computational engine.
  • the interface 1406 may include hardware and/or software that enables communication between the computer 1400 and the light source and/or detector.
  • the interface 1406 may include one or more data ports (such as USB ports) to which devices may be connected, as well as hardware and/or software signal processors to modify sent or received data signals (e.g., to reduce noise or reformat data).
  • the interface 1406 also transmits control signals to, e.g., activate or deactivate attached devices, to control camera settings (frame rate, image quality, sensitivity, zoom level, etc.), or the like. Such signals may be transmitted, e.g., in response to control signals from the processor 1402 , which may in turn be generated in response to user input or other detected events.
  • the memory 1404 may be used to store instructions to be executed by processor 1402 as well as input and/or output data associated with execution of the instructions. These instructions, illustrated as a group of modules, control the operation of the processor 1402 and its interaction with the other hardware components.
  • an operating system 1410 directs the execution of low-level, basic system functions such as memory allocation, file management, and operation of mass storage devices.
  • the operating system may be or include a variety of operating systems, such as WINDOWS, LINUX, OS/X, iOS, Android or any other type of operating system.
  • the memory 1404 may store control modules 1412, 1414 for operating the light source and the camera (or other detector), an image-analysis module 1416 for analyzing the image data received from the detector (e.g., to determine the intensity maximum of the time-varying signal for each pixel, to identify objects of interest and determine the associated direction of the incident beam, etc.), and a triangulation module 1418 that computes depth and/or distance values based on the measured image data and the corresponding control state of the light source.
  • the instructions may be implemented in any programming language, including, for example, C, C++, JAVA, Fortran, Basic, Pascal, or low-level assembler languages.
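  • As a purely illustrative sketch (the function names, array shapes, and the mapping from phase to emission angle are assumptions rather than details of the embodiments above), the image-analysis and triangulation modules might cooperate roughly as follows: a discrete Fourier transform yields, for each pixel, the phase within the emission cycle at which the reflection peaked, and that phase, together with the pixel's viewing angle and the source-to-camera baseline, is triangulated into a depth value.

      import numpy as np

      def emission_phase_per_pixel(frames):
          # frames: array of shape (n_frames, height, width) covering one
          # emission cycle. The phase of the fundamental DFT bin indicates
          # where in the cycle each pixel saw its brightest illumination.
          spectrum = np.fft.rfft(frames, axis=0)
          phase = np.angle(spectrum[1])                # fundamental frequency bin
          return (phase + 2 * np.pi) % (2 * np.pi)     # wrap to [0, 2*pi)

      def triangulate_depth(phase, pixel_angle, baseline_m, phase_to_angle):
          # phase_to_angle: calibration mapping a phase within the emission
          # cycle to the beam's emission angle (radians) at the light source.
          alpha = phase_to_angle(phase)   # angle at the light source
          beta = pixel_angle              # angle at the camera, from pixel position
          gamma = np.pi - alpha - beta    # angle at the object
          # Law of sines: perpendicular distance of the object from the baseline.
          return baseline_m * np.sin(alpha) * np.sin(beta) / np.sin(gamma)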
  • Embodiments of the disclosed technology may be used to map out a room or similarly sized area in order to determine its dimensions and/or precisely locate room walls as well as objects, people, or other things in the room.
  • This information may be used by a computer, television, or other device or machine in the room to improve the experience of a user of the device by, for example, allowing the user to interact with the device based on the room dimensions.
  • the device may adjust a property of its output (e.g., a sound level, sound distribution, brightness, or user-interface perspective) based on objects in the room or the position of the user. Further embodiments can be used to track the motion of objects in a field of view, optionally in conjunction with other mobile-tracking systems.
  • Object tracking may be employed, for example, to recognize gestures or to allow the user to interact with a computationally rendered environment; see, e.g., U.S. Patent Application Ser. No. 61/752,725 (filed on Jan. 15, 2013) and Ser. No. 13/742,953 (filed on Jan. 16, 2013), the entire disclosures of which are hereby incorporated by reference.
  • FIG. 15 illustrates an exemplary task environment in which a human operator 1500 of a machine 1502 interacts with the machine 1502 via motions and gestures.
  • the machine may be communicatively coupled to and receive information about the motions and gestures from one or more light-source/detector pairs 1504 .
  • three light-source/detector pairs 1504 are used to scan the region of interest horizontally and vertically.
  • different numbers and arrangements of light sources and detectors may be employed in other embodiments.
  • a region of space in front of or near the machine 1502 may be scanned by directing one or more light emissions from the vantage point(s) or region(s) of the light source(s) to the region of space, detecting any reflectance of the light emission from an object or objects within the region, and, if a reflectance is detected, inferring therefrom the presence of an object (or objects) in the region of space.
  • an exemplary embodiment of a variation-determination system 1600 comprises a model-management module 1602 that provides functionality to build, modify, and/or customize one or more models to recognize variations in objects, positions, motions, and/or attribute states or changes therein based on sensory information obtained from a suitable detection system, such as system 100 shown in FIG. 1 .
  • a motion capture and sensory analyzer 1604 finds motions (e.g., translational, rotational), conformations, and presence of objects within sensory information provided by detection system 100 .
  • the findings of motion capture and sensory analyzer 1604 serve as input of sensed (e.g., observed) information from the environment with which model refiner 1606 can update predictive information (e.g., models, model portions, model attributes, and so forth).
  • the model refiner 1606 may update one or more models 1608 (or portions thereof) from sensory information (e.g., images, scans, other sensory-perceptible phenomena) and environmental information (i.e., context, noise, and so forth), enabling a model analyzer 1610 to recognize object, position, motion, and/or attribute information that might be useful in controlling a machine.
  • Model refiner 1606 employs an object library 1612 to manage objects including one or more models 1608 (e.g., of user portions (e.g., hand, face), other control objects (e.g., styli, tools), or the like; see, e.g., the models depicted in the accompanying figures), model components (e.g., shapes, 2D model portions that sum to 3D, outlines and/or outline portions (e.g., closed curves)), and attributes (e.g., attach points, neighbors, sizes (e.g., length, width, depth), rigidity/flexibility, torsional rotation, degrees of freedom of motion, and others).
  • While illustrated with reference to a particular embodiment in which models, model components, and attributes are co-located within a common object library 1612, it should be understood that these objects may be maintained separately in some embodiments.
  • Object attributes may include (but are not limited to) the presence or absence of the object; positional attributes such as the (e.g., one-, two-, or three-dimensional) location and/or orientation of the object (or locations and/or orientations of various parts thereof); dynamic attributes characterizing translational, rotational, or other forms of motion of the object (e.g., one-, two-, or three-dimensional momentum or angular momentum); physical attributes (e.g., structural or mechanical attributes such as appearance, shape, structure, conformation, articulation, deformation, flow/dispersion (for liquids), elasticity); optical properties or, more generally, properties affecting or indicative of interaction with electromagnetic radiation of any wavelength (e.g., color, translucence, opaqueness, reflectivity, absorptivity); and/or even chemical properties (as inferred, e.g., from optical properties) (such as material properties and composition).
  • scanning the region involves multiple emission cycles.
  • the region may (but need not) be scanned in accordance with different scan patterns.
  • an initial emission cycle may serve to detect an object, while a subsequent, more refined scan pattern may serve to capture surface detail about the object, to determine positional information for at least a portion of the object, or to determine other kinds of object attributes.
  • Multiple sequential emission cycles may also serve to detect changes in any of the object attributes, e.g., due to motion or deformation; for such differential object-attribute determinations, the same or similar scan patterns are typically used throughout the cycles.
  • the object attributes may be analyzed to identify a potential control surface of the object.
  • FIG. 17A illustrates predictive information including a model 1700 of a control object constructed from one or more model subcomponents 1702 , 1703 selected and/or configured to represent at least a portion of a surface of control object 112 , a virtual surface portion 1706 and one or more attributes 1708 .
  • Other components, not shown in FIG. 17A for clarity's sake, can be included in predictive information 1710, such as models (e.g., of user portions (hand, face) or other objects (styli, tools)), model components (e.g., shapes, such as 2D model portions that sum to 3D), and model-component attributes (e.g., degrees of freedom of motion, torsional rotation, attach points, neighbors, size (length, width, depth), rigidity/flexibility, and others).
  • the model subcomponents 1702 , 1703 can be selected from a set of radial solids, which can reflect at least a portion of a control object 112 in terms of one or more of structure, motion characteristics, conformational characteristics, other types of characteristics, and/or combinations thereof.
  • radial solids include a contour and a surface defined by a set of points having a fixed distance from the closest corresponding point on the contour.
  • Another radial solid embodiment includes a set of points a fixed distance from corresponding points on a contour along a line normal thereto.
  • computational technique(s) for defining the radial solid include finding the closest point on the contour to an arbitrary point and then projecting outward by the length of the radius of the solid.
  • such projection can be a vector normal to the contour at the closest point.
  • An example radial solid (e.g., 1702) and another type of radial solid (e.g., 1703) are depicted in FIG. 17A; other types of radial solids can be identified based on the foregoing teachings.
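  • For illustration only, the radial-solid idea can be realized compactly when the contour is a straight segment (yielding a capsule-like solid); in the Python sketch below, the segment-based contour, radius, and function names are assumptions chosen for the example.

      import numpy as np

      def closest_point_on_segment(p, a, b):
          # Closest point to p on a contour represented by the segment a-b.
          ab = b - a
          t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
          return a + t * ab

      def radial_solid_contains(p, a, b, radius):
          # A point lies on or inside the radial solid if it is within
          # `radius` of the closest corresponding point on the contour.
          q = closest_point_on_segment(p, a, b)
          return np.linalg.norm(p - q) <= radius

      def project_to_surface(p, a, b, radius):
          # Surface point obtained by projecting outward from the closest
          # contour point by the length of the radius, along the direction
          # from that contour point to p.
          q = closest_point_on_segment(p, a, b)
          n = (p - q) / np.linalg.norm(p - q)
          return q + radius * n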
  • updating predictive information to observed information comprises selecting one or more sets of points 1750 in space surrounding or bounding the control object within a field of view of one or more image-capture device(s).
  • points 1750 can be determined using one or more sets of lines 1752 A, 1752 B, 1752 C, and 1752 D originating at vantage point(s) associated with the image-capture device(s) (e.g., FIG. 1 : 130 A, 130 B) and determining therefrom one or more intersection point(s) defining a bounding region (i.e., region formed by lines 1752 A, 1752 B, 1752 C, and 1752 D) surrounding a cross-section of the control object.
  • the bounding region can be used to define a virtual surface ( FIG. 17A : 1706 ) to which model subcomponents 1702 , 1703 , and 1754 can be compared.
  • the virtual surface 1706 can include a visible portion 1760 A and a non-visible “inferred” portion 1760 B.
  • Virtual surfaces 1706 can include straight portions and/or curved surface portions of one or more virtual solids (i.e., model portions) determined by model refiner 1606 .
  • model refiner 1606 determines to model subcomponent 1754 of an object portion (which happens to be a finger in this illustration) using a virtual solid, here an ellipse, or any of a variety of 3D shapes (e.g., ellipsoid, sphere, or custom shape) and/or 2D slice(s) that are added together to form a 3D volume.
  • the ellipse equation (1) is solved for θ, subject to the constraints that: (1) (x_C, y_C) must lie on the centerline determined from the four tangents 1752 A, 1752 B, 1752 C, 1752 D (i.e., centerline 1756 of FIG. 17B); and (2) a is fixed at the assumed value a_0.
  • the ellipse equation can either be solved for θ analytically or solved using an iterative numerical solver (e.g., a Newtonian solver as is known in the art).
  • The parameters A_1, B_1, G_1, H_1, v_{A2}, v_{AB}, v_{B2}, w_{A2}, w_{AB}, and w_{B2} used in equations (7)-(15) are defined as shown in equations (1)-(4).
  • Q_8 = 4 A_1^2 n^2 v_{B2}^2 + 4 v_{B2} B_1^2 (1 - n^2 v_{A2}) - \left( G_1 (1 - n^2 v_{A2}) w_{B2} + n^2 v_{B2} w_{A2} + 2 H_1 v_{B2} \right)^2   (7)
  • Q_7 = -\left( 2 \left( 2 n^2 v_{AB} w_{A2} + 4 H_1 v_{AB} + 2 G_1 n^2 v_{AB} w_{B2} + 2 G_1 (1 - n^2 v_{A2}) w_{AB} \right) \right) \cdots
  • For each real root θ, the corresponding values of (x_C, y_C) and b can be readily determined.
  • zero or more solutions will be obtained; for example, in some instances, three solutions can be obtained for a typical configuration of tangents.
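  • The root-finding step admits a generic sketch; the NumPy fragment below assumes the coefficients Q_8 through Q_0 have already been assembled (e.g., from expressions such as equation (7) above) and simply extracts the numerically real roots, each of which corresponds to one candidate solution.

      import numpy as np

      def real_roots(Q):
          # Q = [Q8, Q7, ..., Q1, Q0]: polynomial coefficients in descending
          # order. np.roots returns all complex roots; keep those that are
          # real to within a numerical tolerance.
          roots = np.roots(Q)
          return roots[np.abs(roots.imag) < 1e-9].real

      # Each real root yields a candidate ellipse, from which (x_C, y_C) and b
      # are recovered and implausible candidates are discarded.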
  • a model builder 1614 and model updater 1616 provide functionality to define, build, and/or customize model(s) 1608 using one or more components in object library 1612 .
  • model refiner 1606 updates and refines the model, bringing the predictive information of the model in line with observed information from the detection system 100.
  • Model subcomponents 1702 , 1703 , 1754 can be scaled, sized, selected, rotated, translated, moved, or otherwise re-ordered to enable portions of the model corresponding to the virtual surface(s) to conform within the points 1750 in space.
  • Model refiner 1606 employs a variation detector 1608 to substantially continuously determine differences between sensed information and predictive information and provide to model refiner 1606 a variance useful to adjust the model 1608 accordingly.
  • Variation detector 1608 and model refiner 1606 are further enabled to correlate among model portions to preserve continuity with characteristic information of a corresponding object being modeled, continuity in motion, and/or continuity in deformation, conformation and/or torsional rotations.
  • when the control object morphs, conforms, and/or translates, motion information reflecting such motion(s) is included in the observed information.
  • Points in space can be recomputed based on the new observation information.
  • the model subcomponents can be scaled, sized, selected, rotated, translated, moved, or otherwise re-ordered to enable portions of the model corresponding to the virtual surface(s) to conform to the set of points in space.
  • motion(s) of the control object can be rigid transformations, in which case points on the virtual surface(s) remain at the same distance(s) from one another through the motion.
  • Motion(s) can be non-rigid transformations, in which points on the virtual surface(s) can vary in distance(s) from one another during the motion.
  • observation information can be used to adjust (and/or recompute) predictive information, thereby enabling “tracking” the control object.
  • the control object can be tracked by determining whether a rigid transformation or a non-rigid transformation occurs.
  • when a rigid transformation occurs, a transformation matrix is applied to each point of the model uniformly.
  • when a non-rigid transformation occurs, an error indication can be determined, and an error-minimization technique such as described herein above can be applied.
  • rigid transformations and/or non-rigid transformations can be composed.
  • One example composition embodiment includes applying a rigid transformation to predictive information. Then an error indication can be determined, and an error minimization technique such as described herein above can be applied.
  • determining a transformation can include calculating a rotation matrix that provides a reduced RMSD (root mean squared deviation) between two paired sets of points.
  • One embodiment can include using the Kabsch algorithm to produce a rotation matrix.
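  • The Kabsch algorithm referred to above is standard; a compact NumPy version is sketched below, assuming P and Q are N x 3 arrays of corresponding model and observed points.

      import numpy as np

      def kabsch_rotation(P, Q):
          # Rotation matrix minimizing the RMSD between the centered point sets.
          P0 = P - P.mean(axis=0)
          Q0 = Q - Q.mean(axis=0)
          H = P0.T @ Q0                              # covariance matrix
          U, S, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
          D = np.diag([1.0, 1.0, d])
          return Vt.T @ D @ U.T

      def rmsd(P, Q):
          return np.sqrt(np.mean(np.sum((P - Q) ** 2, axis=1)))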
  • one or more force lines can be determined from one or more portions of a virtual surface.
  • predictive information can include collision information concerning two or more capsuloids.
  • the collision information can be based at least in part on a relationship between neighboring capsuloids, each having one or more attributes (e.g., determined minima and/or maxima of intersection angles between capsuloids).
  • determining a relationship between a first capsuloid having a first set of attributes and a second capsuloid having a second set of attributes includes detecting and resolving conflicts between first attributes and second attributes.
  • a conflict can include a capsuloid having one type of angle value with a neighbor having a second type of angle value incompatible with the first type of angle value. Attempts to attach a capsuloid to a neighboring capsuloid having attributes such that the combination would exceed what is allowed in the observed information, or to pair incompatible angles, lengths, shapes, or other such attributes, can be removed from the predictive information without further consideration.
  • predictive information can be artificially constrained to capsuloids positioned in a subset of the observed information—thereby enabling creation of a “lean model.”
  • capsuloid 1702 could be used to denote the portion of the observed information without addition of capsuloids 1703.
  • connections can be made using artificial constructs to link together capsuloids of a lean model.
  • the predictive information can be constrained to a subset of topological information about the observed information representing the control object to form a lean model.
  • a lean model can be associated with a full predictive model. The lean model (or topological information, or properties described above) can be extracted from the predictive model to form a constraint. Then, the constraint can be imposed on the predictive information, thereby enabling the predictive information to be constrained in one or more of behavior, shape, total (system) energy, structure, orientation, compression, shear, torsion, other properties, and/or combinations thereof.
  • the observed can include components reflecting portions of the control object which are occluded from view of the device (“occlusions” or “occluded components”).
  • the predictive information can be “fit” to the observed as described herein above with the additional constraint(s) that some total property of the predictive information (e.g., potential energy) be minimized or maximized (or driven to lower or higher value(s) through iteration or solution). Properties can be derived from nature, properties of the control object being viewed, others, and/or combinations thereof.
  • a deformation of the predictive information subcomponent 1760 can be allowed subject to an overall permitted value of compression, deformation, flexibility, others, and/or combinations thereof.
  • a “friction constraint” is applied on the model 1700 . For example, if fingers of a hand being modeled are close together (in position or orientation), corresponding portions of the model will have more “friction”. The more friction a model subcomponent has in the model, the less the subcomponent moves in response to new observed information. Accordingly, the model is enabled to mimic the way portions of the hand that are physically close together move together, and move less overall.
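  • A minimal sketch of such a friction weighting (the update rule and parameter range are illustrative assumptions, not prescribed above) blends each subcomponent's previous state with its newly observed state in proportion to how free of friction it is.

      import numpy as np

      def apply_friction(previous_pos, observed_pos, friction):
          # friction in [0, 1]: 0 = follow the new observation freely,
          # 1 = effectively pinned to the previous state. Fingers detected
          # close together could be assigned higher friction so that their
          # model subcomponents move together and move less overall.
          previous_pos = np.asarray(previous_pos, dtype=float)
          observed_pos = np.asarray(observed_pos, dtype=float)
          return previous_pos + (1.0 - friction) * (observed_pos - previous_pos)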
  • an environmental filter 1620 reduces extraneous noise in sensed information received from the detection system 100 using environmental information to eliminate extraneous elements from the sensory information.
  • Environmental filter 1620 employs contrast enhancement, subtraction of a difference image from an image, software filtering, and background subtraction (using background information provided by objects-of-interest determiner 1622; see below) to enable model refiner 1606 to build, refine, manage, and maintain model(s) 1608 of objects of interest from which control inputs can be determined.
  • a model analyzer 1610 determines that a reconstructed shape of a sensed object portion matches an object model in an object library, and interprets the reconstructed shape (and/or variations thereon) as user input. Model analyzer 1610 provides output in the form of object, position, motion, and attribute information to an interaction system 1630 .
  • the interaction system 1630 includes an interaction-interpretation module 1632 that provides functionality to recognize command and other information from object, position, motion and attribute information obtained from variation system 1600 .
  • An interaction-interpretation module 1632 embodiment comprises a recognition engine 1634 to recognize command information such as command inputs (i.e., gestures and/or other command inputs (e.g., speech, and so forth)), related information (i.e., biometrics), environmental information (i.e., context, noise, and so forth) and other information discernable from the object, position, motion, and attribute information that might be useful in controlling a machine.
  • Recognition engine 1634 employs gesture properties 1636 (e.g., path, velocity, acceleration, and so forth), control objects determined from the object, position, motion, and attribute information by an objects-of-interest determiner 1622 and optionally one or more virtual constructs 1638 (see e.g., FIGS. 18A and 18B : 1800 and 1820 ) to recognize variations in control-object presence or motion indicating command information, related information, environmental information, and other information discernable from the object, position, motion, and attribute information that might be useful in controlling a machine.
  • virtual constructs 1800, 1820 implement an engagement target with which a control object 112 interacts, enabling the machine sensory and control system to discern variations in the control object (i.e., motions into, out of, or relative to virtual construct 1800, 1820) as indicating control or other useful information.
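  • As one illustrative (and hypothetical) realization, a planar engagement target reduces to a signed-distance test: a control object crossing from one side of the virtual plane to the other is interpreted as virtual "contact", and crossing back as breaking contact.

      import numpy as np

      class PlanarEngagementTarget:
          def __init__(self, point_on_plane, normal):
              self.p0 = np.asarray(point_on_plane, dtype=float)
              self.n = np.asarray(normal, dtype=float)
              self.n /= np.linalg.norm(self.n)
              self._last_side = None

          def signed_distance(self, point):
              return float(np.dot(np.asarray(point, dtype=float) - self.p0, self.n))

          def update(self, point):
              # Returns "contact", "release", or None depending on whether the
              # control object crossed the virtual plane since the last update.
              side = self.signed_distance(point) >= 0.0
              event = None
              if self._last_side is not None and side != self._last_side:
                  event = "contact" if not side else "release"
              self._last_side = side
              return event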
  • a gesture trainer 1640 and gesture-properties extractor 1642 provide functionality to define, build, and/or customize gesture properties 1636 .
  • a context determiner 1634 and object-of-interest determiner 1622 provide functionality to determine, from the object, position, motion, and attribute information, objects of interest (e.g., control objects, or other objects to be modeled and analyzed) and/or objects not of interest (e.g., background), based upon a detected context. For example, when the context is determined to be an identification context, a human face will be determined to be an object of interest to the system and will be determined to be a control object. On the other hand, when the context is determined to be a fingertip control context, the fingertips will be determined to be object(s) of interest and will be determined to be control objects whereas the user's face will be determined not to be an object of interest (i.e., background).
  • similarly, when the context is determined to be a tool-based control context, the tool tip will be determined to be an object of interest and a control object, whereas the user's fingertips might be determined not to be objects of interest (i.e., background).
  • Background objects can be included in the environmental information provided to environmental filter 1620 of model-management module 1602 .
  • a virtual environment manager 1646 provides creation, selection, modification, and de-selection of one or more virtual constructs 1800, 1820 (see FIGS. 18A and 18B), e.g., virtual objects defined in space such that variations in real objects relative to the virtual construct (i.e., virtual “contact” with the virtual construct, breaking of virtual contact, motion relative to a construct portion, and so forth), when detected, can be interpreted for control or other purposes.
  • Interaction-interpretation module 1632 provides as output the command information, related information, and other information discernable from the object, position, motion, and attribute information that might be useful in controlling a machine from recognition engine 1634 to an application control system 1650 .
  • an application control system 1650 includes a control module 1652 that provides functionality to determine and authorize commands based upon the command and other information obtained from interaction system 1630 .
  • a control module 1652 embodiment comprises a command engine 1654 to determine whether to issue command(s) and what command(s) to issue based upon the command information, related information, and other information discernable from the object, position, motion, and attribute information, as received from the interaction-interpretation module 1632 .
  • Command engine 1654 employs command/control repository 1656 (e.g., application commands, OS commands, commands to the machine sensory and control system, miscellaneous commands) and related information indicating context received from the interaction-interpretation module 1632 to determine one or more commands corresponding to the gestures, context, and so forth indicated by the command information.
  • engagement gestures can be mapped to one or more controls, or a control-less screen location, of a presentation device associated with a machine under control.
  • Controls can include embedded controls (e.g., sliders, buttons, and other control objects in an application), or environmental-level controls (e.g., windowing controls, scrolls within a window, and other controls affecting the control environment).
  • controls may be displayed using 2D presentations (e.g., a cursor, cross-hairs, icon, graphical representation of the control object, or other displayable object) on display screens and/or presented in 3D forms using holography, projectors, or other mechanisms for creating 3D presentations, or may be audible (e.g., mapped to sounds, or other mechanisms for conveying audible information) and/or touchable via haptic techniques.
  • an authorization engine 1658 employs biometric profiles 1660 (e.g., users, identification information, privileges, and so forth) and biometric information received from the interaction-interpretation module 1632 to determine whether commands and/or controls determined by the command engine 1654 are authorized.
  • a command builder 1662 and biometric profile builder 1660 provide functionality to define, build, and/or customize command/control repository 1656 and biometric profiles 1660.
  • Selected authorized commands are provided to machine(s) under control (i.e., “client”) via interface layer 1664 .
  • Commands/controls to the virtual environment (i.e., interaction control) and commands/controls to the emission/detection systems (i.e., sensory control), e.g., to emission module 102 and/or detection module 104, are provided as appropriate.
  • when the control object is a hand, analysis of the hand's shape and configuration may determine the positions of the fingertips, which may constitute the relevant control surfaces.
  • changes in control attributes of the identified control surface(s), such as positional changes of the fingertips, may be analyzed to determine whether they are indicative of control information.
  • this may serve to discriminate between deliberate motions intended to provide control input and hand jitter or other inevitable motions. Such discrimination may be based, e.g., on the scale and speed of motion, similarity of the motions to pre-defined motion patterns stored in a library, and/or consistency with deliberate motions as characterized using machine learning algorithms or other approaches.
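  • One very simple discrimination rule of the kind described (the thresholds below are made-up illustrative values; a practical system could instead use pattern libraries or machine-learning approaches as noted above) rejects motions whose spatial extent or speed is too small to be deliberate.

      import numpy as np

      def is_deliberate(track, dt, min_extent_m=0.02, min_speed_mps=0.05):
          # track: sequence of 3D fingertip positions sampled every dt seconds.
          track = np.asarray(track, dtype=float)
          extent = np.linalg.norm(track.max(axis=0) - track.min(axis=0))
          speeds = np.linalg.norm(np.diff(track, axis=0), axis=1) / dt
          return extent >= min_extent_m and np.median(speeds) >= min_speed_mps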
  • a hand gesture or other motion is analyzed relative to a programmatically defined engagement target (e.g., a plane, curved surface (whether open or closed), point, line, or volume whose position and location in space is well-defined and which need generally not coincide with a physical surface) to determine whether the change in the control surface is indicative of an engagement gesture.
  • if a particular detected motion is determined to correspond to control information, an appropriate response action is taken, generally in accordance with and/or based on response criteria, such as the context in which the control information was received (e.g., the particular software application active at the time, the user accessing the system, an active security level, etc.).
  • the response may involve issuing a command (e.g., open a new document upon a “click,” or shift the displayed content in response to a scrolling motion) to a user interface based on the detected gesture or motion.
  • a machine sensory and controller system 1810 can be embodied as one or more standalone units 1810 coupleable via an interface (e.g., wired or wireless), embedded (e.g., within a machine 1812, 1814 or machinery under control), or combinations thereof.
  • the system is used for security purposes and directs light emission at an entryway to a secure room or space. If, during the scan across the entryway, a reflection is detected, this may indicate the presence of a person seeking entrance. A second scan may then be conducted, according to a more refined scan pattern, to obtain more detailed information about the person. For example, a vein pattern of the person's hand may be identified. Vein patterns of authorized users may be stored in a database, allowing the system to check whether the detected person is authorized to enter the secure space. In other words, the control information in this case is authentication information.
  • the system may respond by permitting the person to enter (e.g., by automatically opening a mechanical barrier, temporarily interrupting laser beams crossing the entryway, or by some other means).
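  • The authorization flow just described can be summarized in an illustrative Python sketch; the feature representation of a vein pattern, the similarity measure, the threshold, and the door-control hook are all assumptions for the example rather than details of the embodiment.

      import numpy as np

      def authorize_entry(detected_pattern, stored_patterns, threshold=0.9):
          # detected_pattern: feature vector derived from the refined scan of
          # the person's hand; stored_patterns: {user_id: feature vector} for
          # users authorized to enter the secure space.
          a = np.asarray(detected_pattern, dtype=float)
          for user_id, reference in stored_patterns.items():
              b = np.asarray(reference, dtype=float)
              similarity = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
              if similarity >= threshold:
                  return user_id   # e.g., trigger opening of the mechanical barrier
          return None              # no match: entry not permitted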
  • Certain embodiments of depth-sensing positioning and tracking systems in accordance herewith may also be mounted on automobiles or other mobile platforms to provide information as to the outside environment (e.g., the positions of other automobiles) to other systems within the platform.
  • embodiments of the disclosed technology may be employed in a variety of application areas, including, without limitation, consumer applications including interfaces for computer systems, laptops, tablets, televisions, game consoles, set top boxes, telephone devices and/or interfaces to other devices; medical applications including controlling devices for performing robotic surgery, medical imaging systems and applications such as CT, ultrasound, x-ray, MRI or the like, laboratory test and diagnostics systems and/or nuclear medicine devices and systems; prosthetics applications including interfaces to devices providing assistance to persons under handicap, disability, recovering from surgery, and/or other infirmity; defense applications including interfaces to aircraft operational controls, navigation systems control, on-board entertainment systems control and/or environmental systems control; automotive applications including interfaces to automobile operational systems control, navigation systems control, on-board entertainment systems control and/or environmental systems control; security applications including monitoring secure areas for suspicious activity or unauthorized personnel; manufacturing and/or process applications including interfaces to assembly robots, automated test apparatus, work conveyance devices such as conveyors, and/or other factory floor systems and devices, genetic sequencing machines, semiconductor fabrication related machinery, chemical

Abstract

Systems and methods for locating objects within a region of interest involve, in various embodiments, scanning the region with light of temporally variable direction and detecting reflections from objects therein; positional information about the objects can then be inferred from the resulting reflections.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to and the benefit of, and incorporates herein by reference in their entireties, U.S. Provisional Application Nos. 61/792,025, 61/800,327 and 61/801,479, each filed on Mar. 15, 2013.
  • TECHNICAL FIELD
  • Embodiments of the disclosed technology generally relate to distance measurement and, more particularly, to determining a position and/or depth of an object or features of an object surface in space.
  • BACKGROUND
  • Determining the distance of objects from an observation point, or tracking objects in three dimensions, is important for many applications, including, e.g., surveying, navigation, surveillance, focus finding in photography, and motion capture for gesture- and movement-controlled video gaming, to name just a few. A variety of approaches to distance determination exist. For example, ultrasonic and laser rangefinders are based on time-of-flight measurements, i.e., they emit a sound or light pulse toward the object, capture a reflected signal pulse (or echo), and measure the roundtrip time for the pulse, from which the distance of the object can be calculated. This method generally works well over large distances where precision beyond a few meters is not required. However, since the accuracy of the measurement depends heavily on recording the precise times of broadcast and capture, which is challenging, in particular, for very fast-moving waves (e.g., light waves), this approach is—absent expensive high-speed equipment—generally unsuitable for more precise determinations and/or determinations made over shorter distances. A need therefore exists for alternative methods for determining the distance, depth, and/or three-dimensional position of an object or surface feature thereof.
  • SUMMARY
  • Embodiments hereof provide an approach to distance or depth measurements that utilizes a directed light source with time-variable direction to sweep or scan a spatial region of interest, and a light detector to measure light reflected by object(s) that might exist within the region of interest in a way that temporally resolves the different directions. In one embodiment, the light source scans the spatial region periodically (e.g., like a flash light waving back and forth over the region of interest), such that each direction of the illuminating emission provides information about object(s) that might exist in the region being scanned. For example, light direction can correspond to, and/or can include information about, a particular phase or point within the emission cycle. The direction of the illuminating emission thus provides information about one or more of a presence, location, or other perceivable property of the object(s). For example, in some embodiments, a camera—or other light sensor—can be synchronized with the light source to capture light diffusely reflected (e.g., scattered) off objects in the spatial region. Positional information can be extracted from analyzing the captured light. In an embodiment, positional information can be extracted from the time (or phase within the cycle) at which the light is received. The direction of the illuminating emission, and thus the direction of the object's location viewed from the source/sensor arrangement, can be calculated. For example, positional information can comprise relative distances between the camera, the object, and the receiver.
  • An embodiment uses two such source/sensor arrangements disposed in different locations, each scanning the same plane, thereby enabling the location of an object within that plane to be determined from the intersection of the two scanning directions that result in respective reflection signals. In a yet further embodiment, a third source/sensor arrangement is disposed in a different plane such that the emission of the third source/sensor arrangement scans along an axis orthogonal to the two co-planar source/sensor arrangements, thereby enabling a three-dimensional spatial region to be scanned for the location of the object. The distances to the object within the spatial region can be determined by triangulation based upon the relative distances between the object and the various source/sensor arrangements.
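  • For illustration, locating the object from the two in-plane scanning directions amounts to intersecting two rays; in the Python sketch below, the scanner positions and the angle convention are assumptions chosen for the example.

      import math

      def intersect_rays(p1, theta1, p2, theta2):
          # Each source/sensor arrangement reports the beam direction (theta,
          # in radians, measured in the common scan plane) at which its
          # reflection signal occurred; the object lies where the rays cross.
          x1, y1 = p1
          x2, y2 = p2
          d1 = (math.cos(theta1), math.sin(theta1))
          d2 = (math.cos(theta2), math.sin(theta2))
          denom = d1[0] * d2[1] - d1[1] * d2[0]
          if abs(denom) < 1e-12:
              return None  # parallel scan directions: no unique intersection
          t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
          return (x1 + t * d1[0], y1 + t * d1[1])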
  • In further embodiments, an object of interest is equipped with a retro-reflector, and the specular reflection of the illuminating emission is detected with a photo-sensor co-located with the light source.
  • In yet further embodiments, an object may be equipped, for tracking purposes, with a light source and sensor, and its location may be computed from reflection signals that it receives from retro-reflectors at fixed spatial locations.
  • Among other aspects, in an embodiment, directed light source refers to a light source that emits light in accordance with a directional nature (primary direction). The light emission may have any divergence, ranging from highly collimated to wide-angle, and can often be characterized with a Gaussian profile. In an embodiment, the primary direction of the emission comprises the central or average direction, or the direction of brightest emission, which coincide for a Gaussian profile. Some implementations include a directed light source with time-variable direction, comprising an arrangement of one or more light-emitting devices, optionally in conjunction with any associated optics or other components, that emits light, at any moment in time, in a primary direction that changes as a function of time.
  • In some embodiments, variable emission direction is achieved by an arrangement of two or more light-emitting devices, at least two of which emit in different directions, that are electronically controlled to change activation levels among two or more activation states (e.g., turn on and turn off) sequentially, periodically, or according to a pattern (or combinations thereof), such that the combination of the contributions from the two or more light sources emits radiation during a complete control cycle, or emission cycle, in accordance with a scan pattern.
  • Individual light-emitting devices define an emission point, and collectively, the light-emitting devices of the light source define an emission region. The light-emitting devices can be arranged and/or spaced according to implementation (e.g., based upon individual emission divergences, area coverage, interference avoidance, other application specific needs, and so forth) such that the emissions collectively cover at least some desired subset of the entire spatial region of interest. In alternative embodiments, a single light-emitting device can be used; the emission direction can be varied directly by moving (mechanically, optically, electrically, or combinations thereof) the light-emitting device, and/or indirectly by moving a light-deflecting optic, such as a prism or mirror to scan the region of interest.
  • In embodiments, scanning encompasses directing light, e.g., as achieved by moving one or more individual light-emitting devices and/or associated optics, as well as making discrete changes in emission direction, e.g., by activating light-emitting devices pointing in different directions. In embodiments, two or more activation states of two or more light sources can be at least partially concurrent, or completely discrete. In some embodiments, the scan pattern enables light to sweep a region of interest, by changing the emission direction continuously or discretely as a function of time. For example, in some embodiments, the emission can change between discontiguous directions (e.g., in accordance with a random, pseudo-random, or programmed scan pattern). In embodiments, a scan pattern provides a continuous or discontiguous spatial, temporal, or spatio-temporal change in light at the region of interest. Implementation-specific needs can be met using any or combinations of individual light-emitting devices, such as light-emitting diodes (LEDs), lasers, incandescent or fluorescent light bulbs, or any other devices emitting light in one or more suitable frequency ranges, such as the optical, infrared (IR), or ultraviolet (UV) regime.
  • In an embodiment, light scattered or reflected by objects illuminated by the light source is captured using a light detector (or multiple detectors), suited to the application, that measures a property, comprising any of an intensity, color, polarization, other property or combinations thereof, of the incoming light as a function of time; each detector defines a detection point, and when alone or combined with multiple light-sensing elements, defines a detection region. In embodiments, the light detector can be synchronized with the light source so that the timing of the various control states of the light source (corresponding, for example, to different emission compositions, directions, other controllable aspects) is known relative to the timing of the measurements at the detector, enabling associating the direction of the illuminating emission with each point in time of the measured signal. Various embodiments utilize a digital camera with a one- or two-dimensional electronic pixelated sensor, which can be non-coplanar or coplanar, such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) sensor used as a photo-sensor array. The camera may acquire a temporal sequence of captured frames, at a frame rate faster than the scan rate of the light source, such that multiple frames are acquired during an emission cycle of the source and each frame corresponds to a different control state of the illuminating emission. The frames can be processed by a computer, or other suitable processing device(s), which can analyze the frames to determine a phase of a signal formed by the pixel output in response to the changing light conditions experienced by the sensor within each cycle for each pixel, e.g., by means of a discrete Fourier transform. In alternative embodiments, the signal comprises a photocurrent generated in each light-sensor pixel that is processed continuously by an associated phase-detecting circuit. Accordingly, embodiments can enable determining the direction of the illuminating emission from information extracted from the signal. In an embodiment, a camera can include a lens and/or other optics that focus light received from a particular direction onto a particular region of the sensor (corresponding to a pixel or group of pixels); thus, the position within the sensor at which the reflected light is received is related to the direction from which the light comes.
  • In one aspect, embodiments provide a method for obtaining positional information about an object within a region of interest. The method may include (a) activating sources directed to portions of the region of interest according to an ordering of points, such that each point in the ordering directs electromagnetic radiation of at least one source to one of the portions of the region of interest (e.g., the NE quadrant, SE quadrant, etc.); (b) capturing a portion of the electromagnetic radiation reflected by an object; (c) forming a signal over time of at least one property (e.g., intensity, color, polarization, other property or combinations thereof) of the captured electromagnetic radiation; (d) determining from the signal at least one point in the ordering in which a dominant contributor to the captured electromagnetic radiation was activated; (e) determining an identity for the dominant contributor from the point in the ordering; (f) determining from the identity of the dominant contributor, a portion of the region of interest to which the electromagnetic radiation from the dominant contributor was directed; and (g) determining positional information for the object based at least in part upon the portion of the region of interest. In one embodiment, activating sources includes activating illumination sources (e.g., LEDs) directed in different directions according to a known order, such that each point in the order is associated with a particular direction. The ordering can be fixed or can vary from scan instance to scan instance. Further, more than one source can be activated at a time to form a cross-fade or other illumination profile. From the signal, it may then be determined where in the order the greatest contributor—or contributors—of the captured light was activated, and, based thereon, the identity of the greatest contributor(s) and the direction in which the light was emitted may be determined. Based on the direction in which the illumination is emitted, the positional information of the object may be determined.
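  • As a hypothetical sketch of steps (d) through (g), if the captured signal is sampled once per point in the activation ordering, the index of the strongest sample identifies the dominant contributor, which in turn identifies the portion of the region it illuminates; all names below are illustrative.

      import numpy as np

      def positional_info_from_signal(signal, ordering, source_to_portion):
          # signal: captured intensity, one sample per point in the ordering.
          # ordering: sequence of source identifiers in activation order.
          # source_to_portion: maps a source identifier to the portion of the
          # region of interest (e.g., "NE quadrant") that it illuminates.
          peak_index = int(np.argmax(signal))          # (d) point in the ordering
          dominant_source = ordering[peak_index]       # (e) identity of contributor
          return source_to_portion[dominant_source]    # (f), (g) illuminated portion

      # Example:
      # positional_info_from_signal(
      #     [0.1, 0.9, 0.2, 0.1],
      #     ["led_N", "led_E", "led_S", "led_W"],
      #     {"led_N": "N half", "led_E": "E half", "led_S": "S half", "led_W": "W half"})
      # -> "E half"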
  • Capturing the reflected light may include capturing data frames (e.g., sensor states resulting from capturing illumination of a scan, or image resulting from a capture with a sensor array of a camera) of the region. The direction of the reflected light relative to the camera may be determined, and the positional information may be further based in part thereon. The data frames may be captured at a rate exceeding a scan rate associated with the illuminating emission. The direction of the illuminating emission associated with the spectral region reflected by the object may be ascertained by determining an intensity peak across a temporal sequence of data frames for at least one pixel corresponding to the object within the data frame. The intensity peak, in turn, may be determined, e.g., from a Fourier transform on the temporal sequence of data frames. In some embodiments, the electromagnetic radiation is retro-reflected by the object. A retro-reflector may be physically associated with the object.
  • The region may be scanned periodically, according to a same or a different order of activation. The point in the ordering corresponding to capture of the reflected electromagnetic radiation corresponds to a phase within an emission cycle. Determining the direction of the illuminating electromagnetic radiation associated with the electromagnetic radiation reflected by the object may include determining the point in the cycle where the captured radiation is greatest. This point in the cycle may be detected using a phase-detector circuit or equivalent. Scanning the region may involve sequentially operating a plurality of light-emitting devices each directing light to a different portion of the region of interest. In some embodiments, only a subset of the plurality of light-emitting devices are operated so as to reduce a resolution of the scan. Sequentially operating the plurality of devices may comprise sequentially causing the devices to emit pulses of light; successive pulses may overlap temporally. Sequentially operating the plurality of devices may, alternatively or additionally, involve driving each device according to a time-variable intensity having an intensity peak, the peaks occurring sequentially for the plurality of light-emitting devices. In some embodiments, light emitted by the plurality of light-emitting devices overlap spatially and temporally; in this case, determining a direction of the illuminating emission associated with the light reflected by the object may include determining an effective primary direction of the overlapping illumination. The “effective primary direction” is the direction along which the intensity maximum resulting from the spatiotemporal overlap between illuminations from two or more sources with different individual primary directions lies. In various embodiments, scanning the region includes moving (e.g., rotating or translating) a light-emitting device, or moving a deflecting optic and/or a screen used in conjunction with a light-emitting device.
  • In another aspect, embodiments pertain to a system for obtaining positional information about an object within a region of interest. The system may include a directed light source with variable direction for scanning the region with an illuminating light, a detector for capturing light reflected by the object, and circuitry for (i) determining a time of capture of the reflected light and, based thereon, an associated direction of the illuminating light, and (ii) deriving the positional information about the object at least in part from the direction of the illuminating light. In some embodiments, the system also includes a retro-reflector affixed to the object.
  • In some embodiments, the directed light source includes a plurality of light-emitting devices (e.g., LEDs) emitting light in a respective plurality of different primary directions. Further, the directed light source may include a controller for sequentially operating the plurality of light-emitting devices. The light-emitting devices may be arranged such that their respective primary directions intersect at a common center. For example, the light-emitting devices may be affixed to an arcuate surface, facets of a polygonal surface, or facets of a polyhedral surface. (An “arcuate surface” is herein understood to mean a segment of a spherical surface (e.g., a semispherical surface) or a surface having at least one cross-section shaped like a segment of a circle (e.g., a semicircle).) The plurality of light-emitting devices may include a plurality of light emitters and a plurality of associated deflecting optics for deflecting light emitted by the emitters into the different primary directions. In some embodiments, the directed light source comprises one or more moving light-emitting devices. In other embodiments, the directed light source includes one or more light-emitting devices and a moving deflecting optic and/or a moving screen having a perforation therein.
  • In various embodiments, the detector includes a camera for imaging the region; the camera may include a lens and a sensor (e.g., a CCD or MEMS sensor), or a light-sensing device and a scanning mirror. In certain embodiments, the detector is co-located with the light source, and the system further includes a retro-reflector affixed to the object. The directed light source may include a controller for varying the emission direction so as to periodically scan the region; this controller may be synchronized with the circuitry determining the time of capture of the reflected light. The circuitry causes the detector to be read out at a rate exceeding the scan rate of the directed light source. In some embodiments, the circuitry includes a phase-detector circuit for determining a phase within an emission cycle corresponding to a maximum intensity of the captured light, and/or a digital processor configured for performing a Fourier transform on the captured light to thereby determine a phase within an emission cycle corresponding to a maximum intensity of the captured light.
  • In another aspect, embodiments provide a method for determining depth associated with one or more objects within a region of interest. The method includes (a) scanning the region with an illuminating light beam having a temporally variable beam direction so as to illuminate the object(s), (b) acquiring a temporal sequence of images of the region while the region is being scanned, each image corresponding to an instantaneous direction of the illuminating light beam and at least one of the images capturing light reflected by the illuminated object(s); and (c) based at least in part on the instantaneous direction of the light beam in the image(s) capturing light reflected by the object(s), determining a depth associated with the object(s). Multiple of the images acquired during a single scan of the region may capture light reflected by the object(s); the method may include determining a depth profile of the object(s) based thereon.
  • A further aspect relates to a method for locating an object within a region. In various embodiments, the method involves, (a) using a light source affixed to the object, scanning the region with an illuminating light beam having a temporally variable beam direction; (b) using a sensor co-located with the light source, capturing reflections of the illuminating beam from a plurality of retro-reflectors fixedly positioned at known locations; (c) based on times of capture of the reflections, determining associated directions of the illuminating light beam; and (d) locating the object relative to the known locations of the retro-reflectors based at least in part on the directions of the illuminating light beam. In some embodiments, the object is located in a two-dimensional region based on reflections from at least three retro-reflectors.
  • Yet another aspect pertains to a device, affixed to an object of interest, for locating the object within a region relative to a plurality of retro-reflectors fixedly positioned at known locations. The device may include a light source for scanning the region with an illuminating light beam having a temporally variable beam direction; a sensor co-located with the light source for capturing reflections of the illuminating beam from the plurality of retro-reflectors; and circuitry for determining, from times of capture of the reflections, directions of the illuminating light beam associated therewith, and for locating the object relative to the retro-reflectors based at least in part on the directions. The object may, for example, be a mobile device.
  • In a further aspect, embodiments provide a computer-implemented method for conducting machine control. The method involves scanning a region of space, the scanning including (i) directing at least one light emission from a vantage point of a vantage region to a region of space, (ii) detecting a reflectance of the at least one light emission, and (iii) determining that the detected reflectance indicates a presence of an object in the region of space. Further, the method includes determining one or more object attributes of the object; analyzing the one or more object attributes to determine a potential control surface of the object; determining whether one or more control-surface attribute changes in the potential control surface indicate control information; and, if so, responding to the indicated control information according to response criteria. In some embodiments, the first light emission is directed to the region of space according to a first scan pattern, and determining that the detected reflectance indicates a presence of an object comprises directing a second light emission to the region of space according to a second scan pattern. Directing the second emission may involve scanning to a refined scan pattern, e.g., so as to capture surface detail about the object.
  • In some embodiments, determining object attributes of the object comprises determining positional information of at least a portion of the object. Analyzing the object attributes to determine a potential control surface of the object may include determining based at least in part upon the positional information whether a portion of the object provides control information. Determining whether one or more control-surface attribute changes in the potential control surface indicate control information may include determining whether control-surface attribute changes in the potential control surface indicate an engagement gesture. Responding to the indicated control information according to response criteria may include determining a command to a user interface based at least in part upon the engagement gesture. In various embodiments, determining object attributes of the object comprises determining dynamic information of at least a portion of the object, physical information of at least a portion of the object, optical and/or radio properties of at least a portion of the object, and/or chemical properties of at least a portion of the object.
  • In certain embodiments, directing the emission includes scanning across an entryway; determining that the detected reflectance indicates a presence includes detecting an object comprising a person seeking entrance, and conducting a second scanning to a refined scan pattern of the person; determining whether control-surface attribute changes in the potential control surface indicate control information includes determining whether the control-surface attribute changes indicate a vein pattern of a hand of the person; and responding to the indicated control information according to response criteria comprises permitting the person to enter when the vein pattern matches a stored vein pattern of an individual authorized to enter.
  • In a further aspect, a method for obtaining positional information includes (a) scanning the region with an illuminating emission having a temporally variable emission direction; (b) capturing light reflected by the object and, based on a time of capture of the reflected light, determining an associated direction of the illuminating emission; and (c) deriving the positional information about the object at least in part from the direction of the illuminating emission. The positional information may be determined further based at least in part on a geometric relationship between the light source and the detector. The positional information may include a distance, a depth, or a position of the object or a surface feature thereof. In some embodiments, the positional information comprises a depth profile of the object. The method may involve periodically repeating steps (a) through (c) so as to update the positional information to track movement of the object.
  • Advantageously, these and other aspects of the disclosed technology may enable machines, computers, automata, and/or other types of intelligent devices to obtain information about objects present and events and/or actions (such as user gestures, signals, or other motions conveying commands and information to the device) occurring in a monitored region of interest. These and other advantages and features of various embodiments of the disclosed technology will become more apparent through reference to the following description, the accompanying drawings, and the claims. It is to be understood that the features of the various embodiments described herein are not mutually exclusive and can exist in various combinations and permutations.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the following description, various embodiments of the disclosed technology are described with reference to the following drawings, in which:
  • FIG. 1 illustrates an exemplary machine sensory and control system in embodiments;
  • FIGS. 2A-2E illustrate exemplary emission components of the machine sensory and control system of FIG. 1, in accordance with various embodiments;
  • FIGS. 3A-3E illustrate exemplary detector components of the machine sensory and control system of FIG. 1, in accordance with various embodiments;
  • FIGS. 4A-1 and 4A-2 schematically illustrate an exemplary system for conducting variation determination with emitters operated in discrete modes;
  • FIGS. 4B-1, 4B-2, and 4B-3 schematically illustrate an exemplary system for conducting variation determination with emitters operated in continuously variable modes;
  • FIG. 5A is a schematic diagram of an exemplary system for conducting variation determination with a pixelated sensor in accordance with various embodiments;
  • FIG. 5B illustrates an exemplary control scheme for the light source depicted in FIG. 5A in accordance with one embodiment;
  • FIG. 5C illustrates a reflection signal detected with the camera depicted in FIG. 5A, resulting from the light-source control scheme of FIG. 5B, in accordance with one embodiment;
  • FIG. 5D illustrates the computation of depth for the geometric configuration of the system depicted in FIG. 5A, in accordance with one embodiment;
  • FIG. 5E illustrates an exemplary control scheme for the light source depicted in FIG. 5A that exploits different scan frequencies in accordance with an alternative embodiment;
  • FIG. 5F illustrates a spectral reflection signal detected with the camera depicted in FIG. 5A resulting from the light-source control scheme of FIG. 5E;
  • FIGS. 6A-6C illustrate different exemplary light-source configurations in accordance with various embodiments;
  • FIGS. 7A and 7B illustrate embodiments in which light source and camera are, respectively, integrated into the same unit or provided as separate devices;
  • FIG. 8A illustrates an object location mid-way between the primary directions of two discrete light emitters with overlapping illumination regions, and FIG. 8B shows the resulting reflection intensity in accordance with one embodiment;
  • FIG. 8C illustrates an object located closer to one of two primary directions of two discrete light emitters with overlapping illumination regions, and FIG. 8D shows the resulting reflection intensity in accordance with one embodiment;
  • FIG. 8E illustrates a control scheme with temporal overlap between the pulses emitted by different discrete light emitters having overlapping illumination regions, and FIGS. 8F and 8G show the resulting reflection intensities in accordance with various embodiments;
  • FIG. 8H illustrates the intensity variation of a pulsed light source in accordance with various embodiments;
  • FIG. 9A illustrates a light source with only two light emitters producing overlapping illumination regions, and FIG. 9B illustrates a control scheme for the light source with temporal overlap in accordance with one embodiment;
  • FIGS. 10A-10G illustrate moving light sources in accordance with various embodiments;
  • FIGS. 11A-11N illustrate different exemplary light sources for scanning a three-dimensional region in accordance with various embodiments;
  • FIGS. 12A and 12B are flow charts illustrating methods for depth determination in accordance with various embodiments;
  • FIG. 13A is a schematic diagram of an exemplary system for depth determination that uses light sources producing different light characteristics in accordance with various embodiments;
  • FIG. 13B is a schematic diagram of an exemplary system for depth determination based on retro-reflection off an object in accordance with various embodiments;
  • FIG. 13C is a schematic diagram of an exemplary system for determining the two-dimensional position of an object using retro-reflection in accordance with various embodiments;
  • FIG. 13D is a schematic diagram illustrating how an object equipped with a light source can be localized based on retro-reflections off multiple stationary reflectors in accordance with various embodiments;
  • FIG. 14 is a block diagram of an exemplary computational facility for determining positional information in accordance with various embodiments;
  • FIG. 15 illustrates an exemplary task environment in which various embodiments of the disclosed technology may be utilized;
  • FIG. 16 illustrates a variation determination system in accordance with various embodiments;
  • FIGS. 17A-17D illustrate predictive information including a model in accordance with various embodiments; and
  • FIGS. 18A and 18B illustrate virtual constructs implementing an engagement target with which a control object interacts in accordance with various embodiments.
  • DETAILED DESCRIPTION
  • Described herein are various embodiments of methods and systems for determining the distance, depth, and/or position of objects (and surface features of objects) in space relative to a physical reference (such as a light source and detector, as described in detail below). Sensing distance, depth, and/or position of objects automatically (e.g., programmatically) can enable machines to be controlled by salient properties of the objects or object motion. The object(s) may generally be any inanimate or animate objects, and may have complex surfaces and/or change in position or shape over time. Certain embodiments provide improved accuracy in positional and/or depth determination, thereby supporting object and/or object-surface recognition, as well as object-change, event, and/or action recognition, and/or combinations thereof.
  • FIG. 1 illustrates an exemplary machine sensory and control system in embodiments. In one embodiment, a motion-sensing and controller system provides for detecting that some variation(s) in one or more portions of interest of a user (or other object) has occurred, for determining that an interaction with one or more machines corresponds to the variation(s), for determining whether the interaction should occur, and, if so, for affecting the interaction. The machine sensory and control system (MSCS) typically includes a portion-detection system, a variation-determination system, an interaction system, and an application-control system.
  • As FIG. 1 shows, one embodiment of the detection system 100 includes an emission module 102, a detection module 104, a controller 106, a signal-processing module 108, and a machine-control module interface 110. The emission module 102 illuminates one or more objects of interest 112 (e.g., the user's finger or some other control object) within an area of interest 114. In one embodiment, the emission module 102 includes one or more emitter(s) 120A, 120B (e.g., LEDs or other devices emitting light in the IR, visible, or other spectrum regions, or combinations thereof; radio and/or other electromagnetic signal emitting devices) that are controllable via emitter parameters (e.g., frequency, activation state, firing sequences and/or patterns, and so forth) by the controller 106. However, other existing/emerging emission mechanisms and/or some combination thereof can also be utilized in accordance with the requirements of a particular implementation. The emitters 120A, 120B can be individual elements coupled with materials and/or devices 122. For instance, a light-emitting element 120A, 120B may be combined with a lens 122A (see FIG. 2A), multi-lens 122B (see FIG. 2B), image-directing film (IDF) 122C (see FIG. 2C), liquid lens, multiple such elements or combinations thereof, and/or others, with varying or variable optical properties to direct the emission. Further, as shown in FIG. 2D, one or more arrays 120D of emissive elements (combined on a die or otherwise) may be used with or without the addition of devices 122 for directing the emission, and positioned within an emission region 200 (see FIG. 2A) according to one or more emitter parameters (e.g., statically mounted (e.g., fixed, parallel, orthogonal, or forming other angles with a work surface, one another, or a display or other presentation mechanism), dynamically mounted (e.g., pivotable, rotatable, and/or translatable), embedded (e.g., within a machine or machinery under control), or otherwise coupleable using an interface (e.g., wired or wireless)). In some embodiments, illustrated in FIG. 2E, structured lighting techniques can provide improved surface-feature-capture capability by casting illumination according to a reference pattern onto the object. Image-capture techniques described in further detail herein can be applied to capture and analyze differences between the reference pattern and the pattern as reflected by the object. In yet further embodiments, the detection system 100 may omit the emission module 102 altogether (e.g., in favor of ambient lighting).
  • With renewed reference to FIG. 1 and further reference to FIGS. 3A-3E, in one embodiment, the detection module 104 includes one or more capture device(s) 130A, 130B (e.g., devices sensitive to visible light or other electromagnetic radiation) that are controllable via the controller 106. The capture device(s) 130A, 130B can comprise one or more individual image-capture elements 130A or arrays of image-capture elements 130A (e.g., pixel arrays, CMOS or CCD photo sensor arrays, or other imaging arrays) or individual photosensitive elements 130B or arrays of photosensitive elements 130B (e.g., photodiodes, photo sensors, single detector arrays, multi-detector arrays, or other configurations of photo sensitive elements), or combinations thereof. However, other existing/emerging detection mechanisms and/or some combination thereof can also be utilized in accordance with the requirements of a particular implementation.
  • Capture device(s) 130A, 130B can each define a particular vantage point 300 from which objects 112 within the area of interest 114 are sensed, and can be positioned within a detection region 302 (see FIG. 3A) according to one or more detector parameters (either statically (e.g., fixed, parallel, orthogonal or forming other angles with a work surface, one another, or a display or other presentation mechanism) or dynamically (e.g., pivotably, rotatably, and/or translatably); and mounted, embedded (e.g., within a machine or machinery under control), or otherwise coupleable using a wired or wireless interface). Capture devices 130A, 130B can be coupled with devices and/or materials (such as, e.g., lenses 310A (see FIG. 3A), multi-lenses 310B (see FIG. 3B), image-directing film (IDF) 310C (see FIG. 3C), liquid lenses, combinations thereof, and/or others) with varying or variable optical properties for directing the reflectance to the capture device 130A, 130B for controlling or adjusting resolution, sensitivity, and/or contrast. Capture devices 130A, 130B can be designed or adapted to operate in the IR, visible, or other spectrum regions, or combinations thereof; or alternatively operable in conjunction with radio- and/or other electromagnetic-signal-emitting devices in various applications. Multiple capture devices 130A, 130B can be organized in arrays 320, in which the image capture device(s) can be interleaved by row (see, e.g., FIG. 3D), column, or according to a pattern, or can be otherwise addressable individually or in groups. In an embodiment, capture devices 130A, 130B can capture one or more images for sensing objects 112 and capturing information about the object (e.g., position, motion, and so forth). In embodiments comprising more than one capture device, particular vantage points of capture devices 130A, 130B can be directed to area of interest 114 so that fields of view 330 of the capture devices at least partially overlap. Overlap in the fields of view 330 (see, e.g., FIG. 3E) provides capability to employ stereoscopic vision techniques, including those known in the art, to obtain information from a plurality of images captured substantially contemporaneously.
  • While illustrated with reference to a particular embodiment in which control of emission module 102 and detection module 104 are co-located within a common controller 106, it should be understood that these control functions may, in alternative embodiments, be implemented in separate hardware components, or may each be distributed over a plurality of components. Controller 106 comprises control logic (implemented in hardware, software, or combinations thereof) to conduct selective activation/de-activation of emitter(s) 120A, 120B in on-off or other activation states or combinations thereof (and/or to control active directing devices) to produce emissions of (e.g., spatiotemporally) varying intensities in accordance with a scan pattern which can be directed to scan the area of interest 114. Controller 106 can further comprise control logic (implemented in hardware, software, or combinations thereof) to conduct selection, activation, and control of capture device(s) 130A, 130B (and/or to control associated active directing devices) to capture images or otherwise sense differences in reflectance or other illumination. Signal-processing module 108 determines whether captured images and/or sensed differences in reflectance and/or other sensor-perceptible phenomena indicate a possible presence of one or more objects of interest 112, such as control objects 113; the presence of such objects, and/or variations thereof (e.g., in position, shape, etc.), can be used as input to a machine controller via the machine- and application-control module interface 110.
  • In various embodiments, the variation of one or more portions of interest of a user or control object can correspond to a variation of one or more attributes (e.g., position, motion, appearance, surface patterns) of a user's hand or finger(s) 113, points of interest on the hand, a facial portion, etc., or other control objects (e.g., styli, tools), and so on (or some combination thereof) that is detectable by, or directed at, but otherwise occurs independently of the operation of the machine sensory and control system. Thus, for example, the system may be configurable to “observe” ordinary user locomotion (e.g., motion, translation, expression, flexing, deformation, and so on), locomotion directed at controlling one or more machines (e.g., gesturing, intentionally system-directed facial contortion, and so forth), and/or attributes thereof (e.g., rigidity, deformation, fingerprints, veins, pulse rates, and/or other biometric parameters); see, e.g., U.S. Provisional Patent Application Ser. No. 61/952,843 (filed on Mar. 13, 2014), the entire disclosure of which is hereby incorporated by reference. In one embodiment, the system provides for detecting that some variation(s) in one or more portions of interest (e.g., fingers, fingertips, or other control surface portions) of a user has occurred, for determining that an interaction with one or more machines corresponds to the variation(s), for determining whether the interaction should occur, and, if so, for at least one of initiating, conducting, continuing, discontinuing, and/or modifying the interaction (and/or a corresponding or related interaction).
  • FIGS. 4A-1 and 4A-2 illustrate an exemplary system for conducting distance/depth determination with emitters operated in discrete modes. As shown in FIG. 4A-1, the system includes an emitting source 402 comprising a number of (in the depicted exemplary embodiment, four) emitting elements (e.g., emitters A, B, C, D) to provide a directed source having a variable direction that illuminates a region of interest 414 including an object of interest 412. The source 402 can scan a spatial region according to a suitable scan pattern by activating and de-activating select ones of the emitting elements according to a known ordering. For instance, the region may be divided into a number of sub-regions (e.g., four quadrants), which are then illuminated by select activation of emitting elements. Emissions that intercept the object 412 are reflected, and a portion of the reflection is captured by a receiver 404 (e.g., a single detector element, any arrangement (such as an array or line) of individual detector elements, a camera or camera-like device comprising a pixelated sensor, or other arrangements of detector elements or combinations thereof). When the illumination strikes the object 412, a reflection is detected by the receiver 404, which forms a signal of a property (e.g., intensity, polarization, frequency, or the like) of the reflection received over time. A point in the ordering in which a dominant contributor to the captured electromagnetic radiation was activated can be determined from the signal. The point in the ordering corresponding to the time of receipt of the reflection allows an inference of the direction of the illuminating emission and, thus, of the sub-region in which the reflecting object is located. An identity for the dominant contributor can be determined from the point in the ordering. A portion of the region of interest (e.g., a sub-region) to which the electromagnetic radiation from the dominant contributor was directed can be determined from the identity of the dominant contributor. Positional information for the object (e.g., the sub-region in which at least a portion of the object is located) can be determined based upon that portion of the region of interest. In some implementations, multiple illuminating emissions having different properties, e.g., different frequencies (e.g., colors), may be used contemporaneously or sequentially and distinguished using suitable filters with the detector(s). Multiple sources and detectors may be used to scan the region of interest along different planes so as to provide three-dimensional information, as illustrated by FIG. 13A and described in further detail below. For example, a cubic volume may be scanned along two perpendicular planes, each divided into four quadrants. Localizing an object in one of the four quadrants for both scans allows identifying in which one of eight sub-cubes the object is located. As will be apparent to one of skill in the art, the spatial resolution of object localization achieved in this example embodiment corresponds directly to the spatial resolution of the scan pattern; the more sub-regions there are, the more precise is the determined object location. As illustrated in FIG. 4A-2, the sources A, B, C, D may be operated in a discrete, binary mode, i.e., turned on and off one at a time. This results in a binary signal at the receiver 404, with intensity or amplitude levels of varying degree; reflected illumination from the emitter that provides the most direct illumination of the object 412, in this case emitter B, results in the point of greatest magnitude in an intensity vs. time signal. A point of greatest magnitude that changes position within the ordering (e.g., occurs relative to activation of emitter D in the illustrated example) in successive captures (e.g., frames) of receiver 404 indicates a variation in the position or shape of the object being tracked. Patterns of such variations can be detected and cross-correlated to recognize gestures made, for example, by a tracked hand.
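  • By way of illustration only, the following Python sketch shows how the point of greatest magnitude in the receiver signal of FIG. 4A-2 can be mapped back to the dominant emitter and hence to the illuminated sub-region; the emitter ordering, sub-region labels, and intensity values are hypothetical, and a real implementation would read samples from the receiver 404 hardware.

    # Hypothetical sketch: infer the illuminated sub-region from one discrete-mode scan.
    # Assumes one receiver intensity sample per emitter activation slot, in a known ordering.
    EMITTER_ORDER = ["A", "B", "C", "D"]                       # activation ordering within one scan
    SUBREGION_OF = {"A": "quadrant 1", "B": "quadrant 2",
                    "C": "quadrant 3", "D": "quadrant 4"}      # emitter -> illuminated sub-region

    def locate_subregion(intensities):
        """intensities: one reflectance reading per activation slot, same order as EMITTER_ORDER."""
        slot = max(range(len(intensities)), key=lambda i: intensities[i])  # point of greatest magnitude
        dominant = EMITTER_ORDER[slot]                         # identity of the dominant contributor
        return dominant, SUBREGION_OF[dominant]

    # Example: the strongest reflection occurs while emitter B is active.
    print(locate_subregion([0.12, 0.87, 0.22, 0.09]))          # ('B', 'quadrant 2')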
  • FIGS. 4B-1 through 4B-3 illustrate an exemplary system for conducting distance/depth determination with emitters operated in continuously variable modes. The system may have the same or similar emitter and receiver hardware as the discrete-mode embodiment described above with reference to FIGS. 4A-1 and 4A-2, but the emitters A, B, C, D are turned on and off gradually, reaching their respective intensity maxima according to a continuum. Since the emitters are energized and de-energized continuously, their active, energized states may temporally overlap using techniques such as described in further detail with reference to FIGS. 8E-8G and 9A-9B below. In some embodiments, one emitter remains energized while the next emitter is energized to create a “cross-fade.” The reflected emissions detected at the receiver may result in a continuously varying signal with a (global) maximum corresponding to the most direct illumination (which may be either a single maximum, as shown in FIG. 4B-2, or the largest one of a number of local maxima, as shown in FIG. 4B-3).
  • FIG. 5A conceptually illustrates, in more detail, an exemplary system 500 for distance or depth sensing in accordance with various embodiments. The system 500 includes a plurality of integral, non-integral, and/or communicatively coupled elements: a light source 502 that emits light toward a region of interest 504, a controller 506 for operating the light source 502 so as to scan the region 504, a pixelated sensor array 508 (e.g., a camera or other sensor array, typically including associated control hardware) for capturing a portion of the emitted light as reflected from an object 510, and a sensor-data processing facility 512 for processing the sensor data acquired by the sensor array 508. The various system components may be provided in a more distributed or more integrated manner. The controller 506 and sensor-data processing facility 512 may each be implemented in hardware, software, or a combination of both. Further, they may be implemented as separate components (as depicted), or integrated into a single device. In some embodiments, a suitably programmed general-purpose computer interfacing with the sensor array 508 serves to process the sensor data and determine positional information associated therewith. The computer may also store a program module that provides the functionality of the light-source controller 506. Alternatively, the controller 506 and/or the sensor-data processing facility 512 may be implemented with dedicated electronic circuitry, such as, e.g., digital signal processors (DSPs), field programmable gate arrays (FPGAs), or application-specific integrated circuits (ASICs), etc. In some embodiments, the sensor-data processing facility 512 is provided in the form of custom hardware that is integrated with the sensor array 508 into a single device.
  • As depicted, the light source 502 may include a plurality of individually controllable light-emitting devices 520, 522, 524, 526, 528 (such as, e.g., LEDs), disposed, e.g., on a semispherical surface 530. Each individual light-emitting device defines an emission point, and collectively, the light-emitting devices of the light source define an emission region. The light-emitting devices 520, 522, 524, 526, 528 may be powered by a dedicated driver circuit 514, or, alternatively or in addition, via a network link directly by the controller 506. The controller 506 may set the drive currents to alter the activation states of these devices 520, 522, 524, 526, 528 (e.g., turn on and off, vary intensity, etc.) at periodic (or other) intervals, e.g., cycling through them in various orderings according to a scan pattern to emit light in different primary directions (indicated in FIG. 5A by dashed lines) into the region 504. While the light source 502 is depicted with five devices, embodiments can be created using virtually any number of light-emitting devices (not shown in FIG. 5A for the sake of clarity) to satisfy implementation-specific criteria or needs.
  • FIG. 5B illustrates an exemplary control scheme for the light-emitting devices 520, 522, 524, 526, 528, involving periodic pulsed operation of each device and uniform spacing of the pulses from all devices throughout the emission cycle. The scan rate (i.e., the number of emission cycles per second) may be selected based on the anticipated range of velocities of the object 510 and/or the reference frame in movable/moving sensor/detector configurations, the depth of view of the region of interest, the expected distance(s) to object(s) anticipated to appear within the region of interest, power consumption, the desired resolution of tracking of object(s), the application receiving the result, other configuration-dependent criteria, or combinations thereof. For example, the scan rate can be selected such that multiple cycles can occur before the object 510 (or reference frame) moves appreciably. While FIG. 5B illustrates one embodiment of an emission cycle, it will be appreciated that other emission-cycle embodiments, using different numbers of light sources, cycle parameters, and activation states (not shown by FIG. 5B for the sake of clarity), are readily apparent to one of skill in the art based on the principles and practices revealed herein. As shown by FIG. 5B, an emission cycle can be advantageously apportioned into phases, such that a phase (Φ) corresponds to an activation state of one or more of the light-emitting devices comprising the light source 502. Changing the activation state of the light-emitting devices (e.g., turning on and off, varying intensity, and so forth) can advantageously sweep the region of interest. In embodiments, the light emitters can be operated individually or in groups, in any order, and according to any scan pattern. In some embodiments, light sources can be illuminated to different levels of brightness based upon an activation state. For example, a light emitter can be switched to a first activation state (e.g., partially illuminated) and subsequently switched to a second setting (e.g., more brightly illuminated). Third, fourth, and additional illumination levels can be provided for in some embodiments. Some embodiments provide for pulsing of illumination source(s) at different intervals, brightness levels, polarizations, coherence properties, and so forth based upon activation states. In yet further embodiments, the brightness of each device can be varied continuously, e.g., in a linear, quadratic, exponential, or logarithmic fashion.
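  • The following Python sketch illustrates one possible way to step a set of light-emitting devices through per-phase activation states over an emission cycle; the set_level() driver hook, the number of emitters, and the dwell time are assumptions for illustration and do not correspond to any particular hardware interface.

    # Hypothetical sketch of a phase-based emission cycle; set_level() is an assumed driver hook.
    import time

    NUM_EMITTERS = 5                  # e.g., devices 520, 522, 524, 526, 528
    PHASE_DURATION_S = 0.001          # dwell time per phase; together with NUM_EMITTERS, sets the scan rate

    def activation_state(phase):
        """Simple scan pattern: one emitter at full drive per phase, all others off."""
        return [1.0 if i == phase else 0.0 for i in range(NUM_EMITTERS)]

    def run_scan(set_level, cycles=1):
        """set_level(emitter_index, level) applies a normalized drive level in [0, 1]."""
        for _ in range(cycles):
            for phase in range(NUM_EMITTERS):                  # phases within one emission cycle
                for i, level in enumerate(activation_state(phase)):
                    set_level(i, level)
                time.sleep(PHASE_DURATION_S)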
  • With reference again to FIG. 5A, the sensor array 508 can comprise a pixelated sensor 540, such as an array constructed of photodiodes, phototransistors, photovoltaic cells, photoresistors, and/or other types of photo-sensing devices capable of converting light into an intensity-dependent current and/or voltage. In some embodiments, a conventional CCD or CMOS camera is used, and the image sensor of the camera is periodically first exposed to light and then read, resulting in a series of data frames. (Alternatively, in “rolling-shutter” cameras, the sensor is exposed and read sequentially in multiple segments (or even individual pixels), such that the exposure of one segment (or pixel) can temporally overlap with the reading of a previously exposed segment (or pixel).) In an embodiment, the sensor array facilitates random access to segments or pixels. The rate at which frames are read determines the temporal resolution of image acquisition. In other embodiments, the image sensor 540 is read using custom electronic circuitry. For example, in one embodiment, each individual sensor element can be associated with a dedicated analog circuit that continuously measures the photo-induced current or voltage.
  • The sensor array 508 and sensor-data processing facility 512 can be operated to capture and analyze data frames at a rate that temporally resolves the discrete phases of the light source 502. For example, using a frame-based read-out architecture, the sensor array 508 can acquire data, for the depicted embodiment, at five frames per emission cycle, such that each data frame corresponds to one of the states within the emission cycle. The sensor array 508 can be synchronized (broadly understood) with the light source 502, e.g., based on a synchronization signal received from the controller 506 or a signal from a separate timer fed to both the controller 506 and the sensor array 508, such that each data frame can be associated with a direction α of the illuminating emission. Synchronization can comprise acquisition of a data frame occurring substantially contemporaneously with the activation of one or more light sources in a particular state forming an emission of light, or at a known period of time thereafter. In embodiments, the sensor array can be read at a rate above or below the rate of change in the state of the light source. In one embodiment, the sensor is read at a frame rate above the rate at which the light source state changes, which prevents a decrease in the angular resolution of the sweep because a single frame does not, in this case, integrate multiple states of the light source.
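  • As one hedged illustration of such synchronization, the Python sketch below associates a frame's acquisition timestamp with the emission phase, and hence the primary direction, active at that time, assuming a shared start time, a fixed phase duration, and illustrative primary directions; all numeric values are placeholders.

    # Hypothetical sketch: map a frame timestamp to the emission phase and direction active at that time.
    NUM_PHASES = 5
    PHASE_DURATION_S = 0.001
    PRIMARY_DIRECTION_DEG = [-40.0, -20.0, 0.0, 20.0, 40.0]    # assumed primary directions per phase

    def phase_for_frame(frame_timestamp_s, cycle_start_s):
        elapsed = frame_timestamp_s - cycle_start_s
        return int(elapsed / PHASE_DURATION_S) % NUM_PHASES

    def direction_for_frame(frame_timestamp_s, cycle_start_s):
        return PRIMARY_DIRECTION_DEG[phase_for_frame(frame_timestamp_s, cycle_start_s)]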
  • In one embodiment, the sensor array 508 further comprises an imaging optic 542 (e.g., a lens, image-directing film, liquid or variable lenses, other optically active material, or combinations thereof) to focus light onto the sensor 540. Again with reference to FIG. 5A, in the illustrated embodiment, calibration techniques can be employed to correct for variations in components (e.g., deviations in lens optical characteristics, smudges, dust, manufacturing tolerances, burned-out emitters, and so forth), environment (e.g., temperature, ambient light, vibration, and so forth), usage (e.g., in the home, underwater, outdoors, in a factory environment, on a highway, and so forth), and/or configurations (flat, 3D, coplanar, non-coplanar, circular, and so forth) of the sensor array 508. For example, incoming light approaching the sensor array 508 at the angle β relative to the optical axis 544 of the lens 542 (i.e., a normal through the center of the lens) can be mapped sub-optimally to the pixels of the sensor 540 due in part to such variations. Accordingly, in an embodiment, nominal parameters of the lens shape and position and orientation relative to the sensor can be used to compute a coarse mapping function, and subsequent calibration techniques can refine that function to account for deviations from the nominal parameters. Various calibration methods are well-known to persons of skill in the art. In one embodiment, for example, a well-defined test pattern (e.g., a checkerboard pattern) of known dimensions is imaged, and a function that maps the known original pattern onto the measured pattern in the image is determined. In an embodiment, calibration information can be stored in a library, look-up table, or other suitable data storage mechanism for use by the sensor-data processing facility 512.
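  • A calibration of this kind could, for example, be sketched in Python using the OpenCV library (an assumption; the text does not prescribe any particular tool): a checkerboard of known dimensions is imaged several times, and the recovered camera matrix and distortion coefficients refine the nominal mapping from incoming angles to sensor pixels.

    # Hypothetical calibration sketch using OpenCV; parameter values are illustrative only.
    import cv2
    import numpy as np

    def calibrate_from_checkerboard(gray_images, pattern_size=(9, 6), square_size_m=0.025):
        objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size_m
        obj_points, img_points = [], []
        for img in gray_images:
            found, corners = cv2.findChessboardCorners(img, pattern_size)
            if found:
                obj_points.append(objp)
                img_points.append(corners)
        # The returned intrinsics and distortion coefficients refine the coarse angle-to-pixel mapping.
        _, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
            obj_points, img_points, gray_images[0].shape[::-1], None, None)
        return camera_matrix, dist_coeffs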
  • In some embodiments, the sensor array 508 comprises a micro-electro-mechanical system (MEMS) camera. The MEMS camera can scan (e.g., by raster scan or another scan pattern) the region of interest using a scanning mirror in conjunction with a single photodiode (or other light-sensing device). Light striking the scanning mirror of the MEMS camera from different directions is captured at different times during a scan. The scanning mirror directs the captured light to the photodiode, which converts the light into a signal representing the region scanned. In an embodiment, analysis of the signal can provide blobs corresponding to objects in the region of interest captured by the system. In some embodiments, the scanning mirror is controlled based at least in part upon control information provided by a secondary imaging or sensory system. For example, a first imaging pass can identify one or more portions of the region of interest to which the scanning mirror is then directed to conduct a more refined scan. Of course, other image-recognition techniques in addition to, or instead of, blob tracking are readily provided for by embodiments. In some embodiments, the MEMS camera scans the region of interest at a rate much faster than the light source changes state, e.g., once, twice, and so forth for each discrete direction of the illuminating emission, thereby enabling the scan to provide substantially the same information as a complete image frame acquired by a conventional camera.
  • In embodiments, location information for an object (or a particular surface region thereof) can be determined in part based on the state (e.g., temporally changing direction, composition, and so forth) of the light that strikes the object (or the particular surface region) and produces a reflection that can be detected by the sensor array 508. As illustrated in FIG. 5A, an object 510 is illuminated most directly by light from light emitter 526 at phase φ4 (see FIG. 5B) within the cycle, but, due to its location within the region 504, does not receive as large a quantity of illumination from the other light emitters 520, 522, 524, and 528. In an embodiment of the illustrated configuration, the measured time-dependent intensity at pixels that receive light reflected and/or scattered by the object 510 can exhibit a relative peak (corresponding to the reflection) at phase φ4 of the received signal, as illustrated in FIG. 5C.
  • Although depicted schematically in FIG. 5A as a single ray for the sake of clarity, in embodiments the emissions of the light emitter(s) can possess any of a variety of profiles, characteristics, and properties. In one embodiment, the intensity distribution of the emission peaks along the primary direction and falls off to all sides, asymptotically reaching zero at large distances from the optical axis. Light travelling in different primary directions can overlap spatially, including on the surface of the object that scatters light towards the sensor. In embodiments, some or all of the sensor 540 can detect light originating from multiple emitters throughout an emission cycle. A reflecting object can appear brightest to the sensor array 508 when the object receives maximal illumination. Location information about the object 510 as viewed from the sensor array 508 corresponds to, and can be determined from, the maximum intensity of the reflected light at the sensor and the phase within the emission cycle during which that light was emitted. FIG. 8B illustrates an embodiment in which the intensity of a reflection signal from the object 510 can vary as a function of time if a plurality of emissions spatially overlap such that the object 510 is illuminated not only by emitter 526, but also by the neighboring emitters 524, 528.
  • When the illumination of the region of interest by the light source 502 significantly exceeds any illumination by ambient light, a reflection peak may be determined, e.g., directly within the time-domain intensity signal captured by the sensor for each pixel. Further, to increase the signal-to-noise ratio (i.e., the intensity ratio between light originating from the light source 502 and ambient light), the frequencies to which the sensor array 508 is sensitive can be tailored to the emission frequencies of the light source 502, e.g., by selecting a sensor material that absorbs primarily in that range, or by equipping the sensor with a suitable color filter that strongly absorbs frequencies other than those emitted by the light source 502. Typically, however, the signal is filtered to extract signal components that occur with the scan frequency. This can be accomplished in several ways, depending on the type of sensor array 508 and image-processing facility 512 that is employed. For example, if sensor array 508 comprises a conventional camera read out frame by frame to a general-purpose computer for image-processing, the computer can apply a discrete Fourier transform to corresponding pixels of a time series of image frames that spans multiple emission cycles (e.g., using any of the well-known fast Fourier transform (FFT) algorithms), resulting in a complex-valued frequency-domain intensity signal for each pixel. The peak of this signal will occur at the scan frequency for all pixels, but the associated phase will vary between pixels, depending on the phase within the emission cycle at which the pixel receives illumination.
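  • As a minimal sketch of such filtering, assuming a stack of frames spanning several emission cycles and a known scan frequency, the per-pixel magnitude and phase at the scan frequency can be extracted with a Fourier transform along the time axis (NumPy is used here purely for illustration):

    # Sketch: per-pixel Fourier analysis of a frame stack at the known scan frequency.
    import numpy as np

    def phase_at_scan_frequency(frames, frame_rate_hz, scan_frequency_hz):
        """frames: array of shape (num_frames, height, width) spanning several emission cycles."""
        spectrum = np.fft.rfft(frames, axis=0)                  # FFT along the time axis, per pixel
        freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / frame_rate_hz)
        k = int(np.argmin(np.abs(freqs - scan_frequency_hz)))   # frequency bin nearest the scan frequency
        magnitude = np.abs(spectrum[k])                         # strength of the reflection component
        phase = np.angle(spectrum[k])                           # phase within the emission cycle, per pixel
        return magnitude, phase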
  • In an embodiment, custom read-out circuitry can be employed, enabling the output signal of each pixel to be analyzed with, for example, an analog or digital phase detector that is driven by a reference signal whose frequency equals the scan frequency of the light source 502; the reference signal may be provided to the circuit, e.g., by the controller 506 of the light source. Suitable phase detectors are well-known to those of skill in the art, and can be implemented without undue experimentation. If a type-II phase detector is used, the output is a constant voltage proportional to the phase difference between the reference signal and the measured signal (of the same frequency). More detail on phase-detector circuits is available in the literature, e.g., in Wolaver, Dan H. (1991), Phase-Locked Loop Circuit Design, Prentice Hall, ISBN 0-13-662743-9. In some embodiments, discriminating between reflection signals and background light and/or identifying reflection peaks comprises some form of thresholding (e.g., based on signal amplitude or intensity). Thresholding can be implemented in software, hardware, or a combination of both. In some embodiments, the detector is dormant until a reflection signal is detected; i.e., signal processing is triggered only upon detection of a threshold amount or intensity of light.
  • In the embodiments described above, the illumination is emitted in different directions at different times (or phases) within an emission cycle. Accordingly, for a detected reflection signal, the corresponding direction of the illumination can be determined from the point in the scan pattern ordering that light is emitted at the time of detection. In alternative embodiments, illumination can be emitted simultaneously in multiple directions from multiple sources, and the various emissions can be pulsed at different frequencies to facilitate discrimination between the different directions. For example, with reference to FIG. 5A, the light-emitting devices 520, 522, 524, 526, 528 can be operated continuously in a pulsed fashion, each at a different frequency; this is illustrated in FIG. 5E. In the image acquired by the sensor array 508, the reflection signal received at each pixel may be Fourier-transformed to determine, based on the frequency of the signal, from which of the light-emitting devices 520, 522, 524, 526, 528 the detected light originates. This way, each pixel can be associated with a direction of the illumination and, based on the position of the pixel within the sensor array, the direction of the reflected light. For example, if the object 510 lies in the path of light from light-emitting device 526, the Fourier-transformed signal, shown in FIG. 5F, would peak at the scan frequency of that device. Alternatively or additionally, discrimination between multiple illuminating emissions simultaneously emitted in different directions can be accomplished based on different light characteristics, such as color or wavelength, intensity, polarization, etc. For instance, different color filters may be applied to the light captured by a camera to identify, based on the color(s) detected, from which emitters the light originates.
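  • For the frequency-based discrimination just described, a hedged Python sketch might compare, for a single pixel's time series, the spectral power near each emitter's assumed pulse frequency and report the emitter whose frequency dominates; the frequencies below are invented for illustration.

    # Sketch: identify which simultaneously pulsed emitter dominates one pixel's reflection signal.
    import numpy as np

    EMITTER_PULSE_HZ = {520: 1000.0, 522: 1200.0, 524: 1400.0, 526: 1600.0, 528: 1800.0}  # assumed

    def dominant_emitter(pixel_series, frame_rate_hz):
        spectrum = np.abs(np.fft.rfft(pixel_series))
        freqs = np.fft.rfftfreq(len(pixel_series), d=1.0 / frame_rate_hz)
        best, best_power = None, -1.0
        for emitter_id, f in EMITTER_PULSE_HZ.items():
            power = spectrum[int(np.argmin(np.abs(freqs - f)))]   # spectral power nearest f
            if power > best_power:
                best, best_power = emitter_id, power
        return best            # reference numeral of the emitter whose pulse frequency dominates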
  • The depth of an object 510 (or a surface point thereof) within the monitored region 504 may be determined by triangulation from the direction of the illumination that strikes the object 510 and the direction of the reflected light that is captured by the sensor array 508, the former being inferable from the time at which the light is received (e.g., the acquisition time, within the emission cycle, of the particular frame that captures the reflection) or, alternatively, the frequency of the detected signal, and the latter being inferable from the position of the object within the image (or the orientation of the scanning mirror in an embodiment with a MEMS camera). This is conceptually illustrated in FIG. 5A. In the depicted embodiment, the light source 502 and sensor array 508 are arranged such that the diameter of the light source 502 and the camera lens 542 lie in the same plane. Light emitter 526 emits an illuminating emission 550 at an angle α, and the sensor array 508 receives a reflected emission 552 at an angle β, both relative to a normal to that plane. The illuminating and reflected emissions 550, 552, together with a line connecting the center C1 of the light source 502 to the center C2 of the lens 542, form a triangle, isolated for greater clarity in FIG. 5D. The depth of the object 510 within the imaged region, i.e., the perpendicular distance y of the intersection point P of the light emissions 550, 552 (which is the point on the surface of the object 510 that intercepts the illuminating emission 550 and is detected in the image as the origin of the reflected emission 552) from the baseline connecting C1 and C2, can be computed from the angles α, β and the known length d of the baseline as:

  • y = d / (tan(α) + tan(β))
  • For an extended object or multiple objects, the above analysis can be repeated for each such reflection to determine associated depths of the object(s). In some embodiments, a depth value is determined for each pixel in the image.
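  • A short worked example of the triangulation formula, with purely illustrative numbers (a 10 cm baseline, a 30° emission angle, and a 25° reception angle), is given below in Python:

    # Worked example of y = d / (tan(alpha) + tan(beta)); the numbers are illustrative only.
    import math

    def depth(d_baseline_m, alpha_deg, beta_deg):
        return d_baseline_m / (math.tan(math.radians(alpha_deg)) + math.tan(math.radians(beta_deg)))

    print(round(depth(0.10, 30.0, 25.0), 4))   # ~0.0958 m from the baseline connecting C1 and C2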
  • In the embodiment depicted in FIG. 5A, the light-emitting devices 520, 522, 524, 526, 528 are advantageously arranged such that the primary directions of their respective emissions all intersect at the center C1 of the light source 502, allowing the above formula to be applied straightforwardly to light emitted from any of the devices 520, 522, 524, 526, 528. As will be readily apparent to one of skill in the art, the same advantage can be achieved if the light-emitting devices 520, 522, 524, 526, 528 are arranged on the facets of a regular polygon 600 (or a portion thereof), as shown in FIG. 6A, or in any other manner that preserves a common intersection between the emissions from different directions, with the intersection point located in the plane of the lens. For example, as shown in FIG. 6B, a planar arrangement of light emitters 610 in combination with individual light-deflecting optics 612 (e.g., lenses) for each emitter 610 can be configured, by choosing suitable deflection angles via the orientations of the optics, to effect a common intersection point. Further, the light emissions in different directions need not be uniformly distributed over a complete half-circle, but may have varying angular spacing and/or cover an arc (i.e., a portion of a circle) that extends over more or less than 180°, depending on the desired angular extent of the scan field. FIG. 6C shows, for example, an arrangement of light emitters 620 on an arc segment 622 spanning about 120°; the base of the arc is advantageously displaced from the plane of the camera lens 642 such that the center C1 of the circle falls in that plane.
  • Of course, the common intersection point of the emissions travelling in different directions serves merely as a computational convenience. Depth information can, in principle, be derived for any arrangement of light-emitting devices as long as their positions and orientations relative to each other and relative to the camera are known; the particular arrangement may, however, affect the computational complexity (e.g., in terms of the number of equations to be solved) of the depth determination. For example, a defined, fixed geometric relationship between the light-emitting devices of the light source can be achieved by integrating them with, mounting them on, or otherwise attaching them to a common rigid structure, e.g., made of plastic, metal, or any other suitable rigid material.
  • In some embodiments, a fixed geometric relationship also exists between the light source and the camera (or other light detector). For instance, as shown in FIG. 7A, the light source 700 and detector 702 may be mounted together in a single unit 704, and may receive power, and/or communicate with other devices (such as the image-processing facility 512), via cables 706 or wirelessly. The unit 704 may be of any size, shape, or material; in various embodiments, it is integrated with another device, such as a television, automobile, or computer. FIG. 7B shows another implementation, in which the light source 700 and detector 702 are maintained as separate units (one or both of which may be integrated into another device); the two units may be located at fixed, known spatial positions so that they, nonetheless, bear a fixed geometric relationship.
  • Alternatively, in some embodiments, the light source and light detector are movable relative to each other, either arbitrarily or consistently with one or more well-defined degrees of freedom. For example, the light source and detector may be integrated into a flexible or bendable pad or other structure, which fixes their distance along the surface of that structure, but allows their relative orientation and free-space distance to change. The information deficit arising from such unknown parameters of the geometric arrangement may be cured by using multiple sweeping light sources (i.e., multiple directed light sources with variable emission directions) and/or multiple detectors that, collectively, enable solving for the unknown parameters along with the desired depth information associated with the monitored region and/or objects. Embodiments include: (1) a photo-detector embedded in a device with emitters; (2) a photo-detector embedded in a second device with emitters directed to a photo-detector embedded in a first device; (3) a photo-detector as a separate unit from all emitters; (4) a photo-detector fixed to an object being tracked; (5) photo-detector(s) and emitter(s) implemented together as VCSELs (vertical-cavity surface-emitting lasers), such as, for example, the chip-scale integration described in SPIE Vol. 4652 by Vixar, a company located at 2950 Xenium Lane, Suite 104, Plymouth, Minn.; and (6) other configurations including combinations of the described embodiments. Example layouts of emitters are illustrated by FIGS. 11E-11N and the accompanying description herein below. In further embodiments, described in more detail below, one of the light source or the detector is affixed to the object to be tracked.
  • The depth resolution that can be achieved with systems in accordance herewith can vary with the angular resolution of the scan. With light sources that emit light in a finite number of discrete principal directions (such as the light source 502 depicted in FIG. 5A), and assuming that the light detector temporally resolves all of these directions, the angular resolution relates directly to the angular spacing between the illuminating emissions generated by adjacent light-emitting devices. Thus, the larger the number of light-emitting devices is for a given angular coverage of the scan, the better is, in general, the angular resolution. In some embodiments, however, the angular resolution of the scan is effectively increased by exploiting spatial overlap between adjacent emissions to interpolate between their two primary directions. With reference to FIGS. 5B and 8A, consider, for example, an object 800 located at an angle mid-way between the primary emission directions of two adjacent light emitters 524, 526 whose diverging light emissions overlap. In this case, the time-varying intensity of the reflection from the object 800 (resulting from the control scheme of FIG. 5B) does not have a single maximum per cycle, but is equally bright when the object 800 is illuminated by light emitter 524 as when it is illuminated by light emitter 526, as is illustrated in FIG. 8B. By contrast, if the object 800 is located closer to the primary direction of light emitter 524, as shown in FIG. 8C, the intensity reaches its maximum when emitter 524 is turned on; the object 800 may, in this configuration, still receive and reflect light from emitter 526, but at a much lower intensity, as shown in FIG. 8D. The closer the object 800 is to the primary direction of light emitted from either light emitter, the more pronounced, generally, is the reflection of light from that emitter, and the dimmer is any reflection of light from the neighboring light emitter. Thus, the relative intensities of reflection signals resulting from the operation of the different light emitters can be used to determine the angular position of the object 800 relative to the light source with a precision that exceeds the angular resolution of the primary emission directions.
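  • One simple (though not the only possible) interpolation rule consistent with this observation is an intensity-weighted average of the two adjacent primary directions, sketched below in Python with invented angles and intensities:

    # Sketch: interpolate the angular position of an object between two adjacent primary directions
    # from the relative reflection intensities; the weighted average is one simple choice among several.
    def interpolated_angle(angle_a_deg, intensity_a, angle_b_deg, intensity_b):
        total = intensity_a + intensity_b
        if total == 0:
            raise ValueError("no reflection detected from either emitter")
        return (angle_a_deg * intensity_a + angle_b_deg * intensity_b) / total

    # Equal intensities (cf. FIG. 8B) place the object mid-way between the two primary directions;
    # a dominant reflection from emitter 524 (cf. FIG. 8D) pulls the estimate toward its direction.
    print(interpolated_angle(20.0, 0.5, 40.0, 0.5))   # 30.0
    print(interpolated_angle(20.0, 0.8, 40.0, 0.2))   # 24.0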
  • In some embodiments, the individual light-emitting devices are operated such that their respective emissions overlap not only spatially, but also temporally. For example, the light emitters may be turned on and off gradually (e.g., by varying their respective drive currents along a continuum), with each emitter being activated before its predecessor in the cycle is completely extinguished to provide a “cross-fade”; an exemplary control scheme is illustrated in FIG. 8E. At an individual pixel, the spatiotemporal overlap in activation of light emitters results in a continuous intensity distribution. Depending on the degree of temporal overlap, this intensity distribution can have a single maximum 810, as shown in FIG. 8F (for an object location along the primary direction of light emitter 526), or multiple local maxima 812 corresponding to the different pulses, with a global maximum 814 corresponding to the pulse that is emitted directly towards the reflecting object, as shown in FIG. 8G. In this case, the time-varying intensity can be analyzed to ascertain not only the global maximum 814, but also (or alternatively) the two local maxima 812 or minima 816 flanking it, which can increase the accuracy of the maximum determination. Techniques for determining maxima and minima vary widely, but in one embodiment, a derivative of the intensity vs. time (or distance) can be computed at points in time. When the derivative is zero, a maximum or minimum is indicated. (In some embodiments, the temporal resolution of read-out exceeds the temporal resolution corresponding to the different emission directions, e.g., multiple frames are acquired for each discrete light pulse.) In both cases, any asymmetry in the signal may be exploited to increase the angular resolution of the scan in a similar fashion as explained above with respect to FIGS. 8A-8D.
  • The temporal overlap between two adjacent light emitters may result, depending on the extent of spatial overlap between their emissions, in a bimodal angular distribution of the emitted light with varying relative magnitudes of the two peaks (which lie along the primary directions), or in a single peak that shifts from the primary direction of the first emission to the primary direction of the second emission. If the light emitters are turned on in the order in which they are arranged (e.g., from left to right or from right to left) they effectively create a continuous sweep, as the angle at which the intensity peaks varies continuously with time from the first emitter to the last emitter. By exploiting varying and overlapping intensity levels in this manner, the angular resolution of the scan can be increased in embodiments beyond that provided inherently by the different primary directions associated with the individual light emitters. In an embodiment, filters can be employed in conjunction with emitters of different characteristics (e.g., frequency, wavelength, polarization, phase, angular frequency, etc.) to provide improved discrimination between overlapping firing emitters.
  • FIG. 9A illustrates an embodiment in which a continuously sweeping light emission is created by a light source 900 that has two light emitters 902, 904. In one embodiment, the two emitters 902, 904 are controlled such that the sum of their central emission intensities is constant, but the intensity of each individual emitter's output varies between that constant value and zero; the intensities as functions of time may, e.g., be the squares of a cosine and a sine, respectively, as shown in FIG. 9B. With sufficient spatial overlap between the two emissions (e.g., chosen such that the intensity decreases by less than 50% from the source center to the midpoint between the two primary directions), this control scheme can produce an angular intensity distribution with a single maximum that oscillates back and forth between the two primary directions. For any point in time, the direction where the maximum lies can be determined straightforwardly by setting the derivative of the angular intensity distribution with respect to the angle to zero and solving for the angle. In the same manner, the relationship between the effective aggregate primary direction and the time or phase within the emission cycle can be theoretically determined for any number of spatio-temporally overlapping light emissions with discrete individual primary directions. Alternatively, the system may be calibrated to empirically determine the functional relationship between effective primary direction and time/phase; the calibration may involve placing an object sequentially at multiple known angles relative to the light source, acquiring frames during one or more emission cycles for each of these locations, and identifying the phase within each cycle that causes a detectable reflection signal.
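  • The following Python sketch models this two-emitter cross-fade under stated assumptions: each emission is approximated by a Gaussian lobe about its primary direction (the primary directions and lobe width are invented), the drive intensities vary as cos² and sin² of the scan phase so that their sum is constant, and the single aggregate maximum is located numerically rather than by solving the derivative analytically.

    # Sketch of the FIG. 9A-9B cross-fade; lobe shape, widths, and directions are assumptions.
    import numpy as np

    THETA_1_DEG, THETA_2_DEG = -15.0, 15.0     # assumed primary directions of emitters 902 and 904
    LOBE_WIDTH_DEG = 25.0                      # assumed angular spread (sufficient spatial overlap)

    def lobe(theta_deg, center_deg):
        return np.exp(-0.5 * ((theta_deg - center_deg) / LOBE_WIDTH_DEG) ** 2)

    def effective_direction(phase_rad):
        thetas = np.linspace(-60.0, 60.0, 2401)
        intensity = (np.cos(phase_rad) ** 2 * lobe(thetas, THETA_1_DEG) +
                     np.sin(phase_rad) ** 2 * lobe(thetas, THETA_2_DEG))
        return thetas[int(np.argmax(intensity))]   # angle of the single aggregate maximum

    # Sweeping the phase moves the effective primary direction between the two emitters.
    for p in (0.0, np.pi / 4, np.pi / 2):
        print(round(effective_direction(p), 1))    # -15.0, 0.0, 15.0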
  • Of course, the intensity of the light emitters need not necessarily vary along a continuum. Instead, each light emitter may step through a series of discrete illumination settings. The illumination of objects within the region of interest by different emitters can overlap in time (“cross-fade”) such that, while one light emitter is still illuminated, the next one may begin illumination (e.g., at a low dimming setting). In some embodiments, three or more light emitters are illuminated concurrently at different settings, but, in other embodiments, only one light emitter is configured at the maximum setting at any given time. Some embodiments can vary the overall angular resolution and/or accuracy achieved by the system 500, at least in part, by varying the number of discrete dimming levels.
  • In some embodiments, a continuous sweep is achieved with a light source that includes a rotating or otherwise moving light-emitting device. As shown in FIGS. 10A and 10B, the device 1000 may, for example, be mounted on a support structure 1002 (e.g., a bar or wheel) that rotates continuously in the same direction (FIG. 10A), or back and forth between two angular boundaries (FIG. 10B). In the former case, the rotating light source 1004 may be combined with two or more cameras 1006, 1008 facing in opposite directions to facilitate a full 360° scan of the space surrounding the light source 1004 and cameras 1006, 1008. Alternatively, as illustrated in FIG. 10C, two light-emitting devices 1010, 1012 may be mounted on opposite ends of the support structure 1002 to continuously scan the half space imaged by a camera 1014 while the light source undergoes a full 360° rotation. As will be readily apparent to one of skill in the art, moving light sources can generally be implemented with one, two, or more light-emitting devices in a variety of suitable configurations.
  • Further, instead of moving (e.g., rotating) the light-emitting device itself, a moving light source may utilize a moving deflecting optic in conjunction with a stationary light emitter. In various embodiments, illustrated in FIGS. 10D-10F, the deflecting optic is a lens 1020, a prism 1022, or a semicircular slab 1024 with a high index of refraction. In the embodiment of FIG. 10F, the slab 1024 may be arranged relative to the light-emitting device 1026 such that the light enters the slab 1024 through its curved boundary and exits it always at the center point 1028. This configuration results in a scanning that originates from the same point, simplifying the computation of depth values. Computational simplicity is also aided, in various embodiments, by utilizing a narrow, highly collimated light source, e.g., a laser diode.
  • Yet another approach to generating a continuously moving illumination involves the use of a moving screen in front of a light-emitting device that emits over a substantial spatial angle (or, in some embodiments, isotropically in many directions); FIG. 10G shows an exemplary embodiment. The screen 1030 transmits light through a hole, slot, or other perforation 1032 while blocking light transmission in other directions, thereby “cutting” a narrower window out of the light from the emitter. This light emission moves (e.g., rotates) as the perforation moves along with the screen; thus, the light emitter and moving screen together form a moving light source. Additional embodiments of moving light sources may occur to people of skill in the art, and are to be considered within the scope of the disclosed technology.
  • In some embodiments providing a spatially continuous sweep across the region of interest (whether implemented by a moving light source or by a number of discrete stationary emitters with gradually varying and temporally overlapping intensity), the angular resolution of the scan, and thus the depth resolution, depends on the temporal resolution of image acquisition, i.e., the rate at which information is read from the camera. For conventional cameras, the frame rate can often be increased at the cost of decreasing the spatial resolution; for example, by reading out only every other pixel, the frame rate can be doubled. In some embodiments, the pixel-read-out density and frame rate are set such that the lateral and depth resolution achieved by the system are approximately the same.
  • In some embodiments that employ relatively low or moderate scan rates and low or moderate angular scan resolutions (as determined by the density of directions in which light can be emitted, by the read-out rate of the camera, or by other limiting factors), and in which the distances between the object of interest, the light source, and the detector are short or moderate, the emission of light in a certain primary direction and the detection of its reflection by the detector can be considered to occur practically simultaneously, i.e., before the direction changes and/or within the same frame. Consequently, the travel time, or time of flight, of light from the source to the object and then to the detector can be neglected in these cases. For example, if the object is no more than one hundred meters away from the light source and detector, the time of flight is less than a microsecond. Accordingly, as long as emissions of light having different directions are separated in time by more than, say, a microsecond, the reflection can, for practical purposes, be considered as received instantaneously. In some embodiments contemplating higher scan rates and angular resolution and/or larger distances, however, the time of flight can be taken into consideration to accurately determine the emission direction that corresponds to a particular reflection signal from the time of receipt of that signal. For instance, if the emission direction changes, and a new image frame is acquired, every nanosecond, a total travel distance of only ten meters results in a shift of 33 frames between emission of the light and capture of the reflection. Thus, in order to accurately determine the position of the object, a reflection received by the detector should be associated with the emission direction 33 frames (or 33 ns) prior. A synching event (e.g., a pause in the scan, or a burst of energy produced by activating all, most, or many emitters contemporaneously) can be used to synchronize the emitter scan cycle with the detector scan cycle. In one embodiment, illustrated by FIG. 8H, a synching event includes a time delay between cycles of sufficient duration that any emitted energy will have returned.
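  • The frame-shift arithmetic in the example above can be sketched in a few lines; the one-frame-per-nanosecond read-out is the example rate from the text.

```python
C = 299_792_458.0      # speed of light, m/s
FRAME_PERIOD = 1e-9    # one frame / emission direction per nanosecond (the example rate above)

round_trip = 10.0      # meters travelled from source to object to detector
frame_shift = round(round_trip / C / FRAME_PERIOD)   # ~33 frames between emission and capture
```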
  • Of course, the time period needed for emitted energy to return is typically not known a priori in embodiments in which the distance of the object from the light source and detector is the very parameter to be measured. In some embodiments, this problem is addressed by determining the distance iteratively: in a first iteration, the position of the object, and thus its distance from the source and detector, may be determined based on the assumption that the time of flight is zero. The distance thus determined is then used to compute a round-trip travel time adjustment, determine a corrected emission direction, and re-compute the object position. This process may be repeated for a fixed number of iterations, or until a desired positional accuracy is achieved or some other convergence criterion is satisfied; in some embodiments, one or two iterations suffice for the desired accuracy of the positional measurement.
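  • A minimal sketch of this iterative correction follows; the triangulate(angle) routine and the simple linear scan description are hypothetical placeholders for the angle-based position determination, not names from the disclosure.

```python
C = 299_792_458.0  # speed of light, m/s

def emission_angle_at(t_emit, scan):
    """Primary emission direction (radians) at time t_emit for a hypothetical
    linear scan described by {'t0': cycle start, 'rate': radians per second}."""
    return (t_emit - scan['t0']) * scan['rate']

def locate_with_tof_correction(t_receipt, scan, triangulate, n_iter=2):
    """Iteratively correct the emission direction for round-trip travel time.

    triangulate(angle) stands in for the angle-based position determination
    (e.g., the geometry of FIG. 5A) and is assumed to return (position,
    distance).  The first pass assumes zero time of flight.
    """
    position, distance = triangulate(emission_angle_at(t_receipt, scan))
    for _ in range(n_iter):
        tof = 2.0 * distance / C                 # round trip: source -> object -> detector
        t_emit = t_receipt - tof                 # the reflection belongs to an earlier emission
        position, distance = triangulate(emission_angle_at(t_emit, scan))
    return position
```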
  • In the above discussion, the operation of the described embodiments is, for ease of explanation, illustrated for two-dimensional space; the disclosed technology is, however, not limited to two dimensions. Other embodiments extending the principles discussed herein to three dimensions will be readily apparent to those of skill in the art. For example, while FIG. 5A implies that the light source 502, sensor array 508, and object 510 all lie in the same plane, e.g., the horizontal plane, this need not necessarily be the case, as shown by FIG. 15 herein below. The object 510, for example, may lie above or below the horizontal plane, and may nonetheless be illuminated by the light emissions during a horizontal sweep of that plane due to the inherent vertical extent of the light emitted. The sensor array 508 may capture a two-dimensional image of the region 504, mapping all light incident upon the detector at a particular angle relative to the horizontal scan plane to the same row of the image sensor. For example, a reflection signal received from an object location within the scan plane may strike a sensor row within that same plane, whereas a reflection received from an object location below the scan plane may strike the sensor pixels in a row above the plane, and vice versa. From the horizontal position of the reflection in the data frames (i.e., the column of the image sensor) and the direction of the illumination corresponding to this reflection, the depth of the object can be calculated, e.g., as explained with respect to FIG. 5A. The depth value in conjunction with the two-dimensional location of the reflection in the image, in turn, allows computing the location of the reflection point on the surface of the object in a plane parallel to the image plane at the determined depth. Thus, the three-dimensional position of the object is computed.
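  • For illustration, once a depth has been computed for a pixel, the corresponding three-dimensional reflection point can be obtained by back-projecting the pixel through an assumed pinhole camera model; the intrinsics fx, fy, cx, cy are hypothetical calibration values, not values given in the disclosure.

```python
import numpy as np

def pixel_to_3d(px, py, depth, fx, fy, cx, cy):
    """Back-project an image pixel to a 3-D point at the triangulated depth,
    using an assumed pinhole model (fx, fy, cx, cy come from a separate
    camera calibration)."""
    x = (px - cx) / fx * depth
    y = (py - cy) / fy * depth
    return np.array([x, y, depth])
```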
  • In an embodiment, an intensity threshold may be used to define the emission diameter as the full width of the emission portion that covers intensities down to the threshold value (e.g., down to half the maximum intensity). Thereby, embodiments can provide the ability to accommodate situations wherein the emission intensity falls off on one or more sides of a “peak” defining a horizontal scan plane (e.g., according to a Gaussian or similar vertical intensity distribution). The light practically sweeps a three-dimensional slice of a thickness corresponding to that diameter. To expand the spatial volume covered by the scan, the light source 502 may be equipped with means for deliberately increasing the vertical emission diameter and/or divergence, such as a cylindrical lens or other diverging optic, and/or with means for narrowing the horizontal divergence without affecting the vertical divergence, such as a vertical slot.
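  • As one concrete example, for a Gaussian vertical intensity profile the emission diameter down to a given threshold can be computed as follows; the Gaussian profile is an assumption used only for illustration.

```python
import math

def emission_width(sigma, threshold=0.5):
    """Full width of a Gaussian emission profile down to `threshold` times the
    peak intensity; threshold = 0.5 gives the familiar full width at half
    maximum, 2 * sigma * sqrt(2 * ln 2)."""
    return 2.0 * sigma * math.sqrt(2.0 * math.log(1.0 / threshold))
```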
  • In alternative embodiments, the monitored region is expanded, or the accuracy of positional determinations therein is increased, by utilizing multiple intersecting or parallel scan planes, or by scanning a three-dimensional spatial volume with a beam whose primary direction can be varied both azimuthally and attitudinally. FIG. 11A, for instance, shows a light source that includes two mutually perpendicular linear arrangements 1100, 1102 of discrete light-emitting devices 1104, which may be positioned on a planar or curved surface. In some embodiments, the rows correspond to the equator and meridian, respectively, of a semispherical surface 1106, as shown in FIG. 11B. FIG. 11C illustrates another embodiment, in which discrete light-emitting devices 1104 are distributed uniformly over a semispherical surface 1106 (or a smaller or larger surface portion of a sphere). In FIG. 11D, the light-emitting devices 1120 are mounted on the faces of a regular polyhedron 1110. (Dashed lines indicate edges or light emitters that are hidden from view.) The light sources shown in FIGS. 11C and 11D are three-dimensional generalizations of the light sources depicted in FIGS. 4A, 5A and 6A. FIGS. 11E and 11F illustrate sources 1104 comprising two or more lighting elements 1104 a disposed on surface 1112 along with photo-detectors 1103. Surface 1112 can be flat, curved, or a combination of multiple surface portions. In one embodiment, photo-detectors 1103 receive reflectance of emissions of energy emitted by sources 1104 co-located on surface 1112, as described in further detail with reference to FIGS. 4A-1, 4A-2, and 13A. In an alternative embodiment, more than one "scanner" unit comprising surface 1112, sources 1104, and photo-detectors 1103 can be implemented together such that photo-detectors 1103 of a first surface 1112 receive reflectance of emissions from sources 1104 of a second surface 1112 and vice versa. In FIG. 11E, source 1104 includes a plurality of photo-emitting elements which can be individually activated to emit electromagnetic energy in various directions sequentially, randomly or according to a pattern. Thus, the "firing order" indicated by the numerals assigned to photo-emitting elements 1104 a is illustrative of one exemplary firing order used in some embodiments. FIGS. 11G-11H illustrate different arrangements of sensors and emitters configured on a single surface 1112. FIGS. 11I-11J illustrate sensor-and-emitter pair configurations on a single surface 1112. FIGS. 11K-11N illustrate a variety of other configurations of sensors and emitters on a single surface 1112. Of course, moving light sources as described, e.g., with respect to FIGS. 10A-10G, can similarly be generalized to three-dimensional structures. For instance, a light-emitting device may be mounted on a sphere with two rotational degrees of freedom, or combined with a lens that can rotate about two mutually perpendicular axes lying in the plane of the lens (i.e., perpendicular to its optical axis) and intersecting at its center.
  • Various embodiments of systems for determining distance, depth, or other positional information associated with objects have heretofore been described. As will be readily appreciated by those of skill in the art, the performance of the various systems—in terms of accuracy and precision or resolution of the positional determinations—varies greatly, depending on a number of system parameters including, but not limited to, the type (e.g., discrete vs. continuous) and scan rate of the light source, the number of discrete light-emitting devices (if applicable), the beam divergence, the existence and/or extent of spatial and/or temporal overlap of emissions of different sources, the read-out scheme and rate of the camera, the spatial resolution of the camera, and/or the type of data analysis. Some embodiments can determine depth within approximately millimeters or micrometers of accuracy; other embodiments may determine positional information within approximately centimeters of accuracy. The disclosed technology is, however, not limited to any particular accuracy (or other performance parameter), and certain attributes of the disclosed technology may be adjusted to increase or decrease accuracy and/or performance, e.g., as applications of the disclosed technology require. For example, in embodiments that utilize light sources with multiple light-emitting devices generating emissions in a finite number of discrete directions, a coarse scan can be achieved by skipping some light sources when situations call for less accuracy, and a more accurate, fine-grained scan can be achieved by operating the light source with a larger number of light-emitting devices. In some embodiments, a coarse scan of an entire field of view is conducted to locate one or more objects, and then followed by a comparatively fine-grained scan limited to a portion of the field of view in which the object has been located. The fine-grained scan may serve to identify detailed features of the object(s), thereby enabling different objects (e.g., hands of different human users, different pets walking across the field of view, etc.) to be distinguished.
  • FIGS. 12A and 12B summarize two methods in accordance herewith, which correspond to two respective classes of applications. FIG. 12A shows a method 1200 that can be used to determine depth values for all pixels of an image capturing a region of interest. The method 1200 involves repeatedly sweeping the region of interest with an emission (action 1202), and capturing one-dimensional or, more typically, two-dimensional images of the region at a rate that resolves the different directions of the emission at the desired accuracy (action 1204). (Capturing images herein means that light from different locations within the region corresponding to different directions of incidence on the image sensor is resolved in some manner (e.g., spatially with a pixelated sensor, or temporally via a raster scan), regardless of whether the image data is read out, stored, analyzed, and/or computationally represented frame by frame or, e.g., pixel by pixel.) Further, the method includes determining, for each pixel, the maximum intensity of the detected reflection during an emission cycle and the corresponding phase within the cycle (action 1206) (which may, in some cases, be the only phase for which a reflection signal is detected at all). In some embodiments, this is done separately for every emission cycle so that any movements of objects within the region are detected at the scan rate; in other embodiments, the signals for multiple successive emission cycles are averaged to improve accuracy at the cost of responsiveness. From the phase of peak intensity, the direction of the emission to which this peak is attributable is determined for each pixel (action 1208), and based on the angles of the illuminating and reflecting emissions, the depth is computed (e.g., as explained with respect to FIG. 5C) and associated with the respective pixel (action 1210). This way, a complete depth profile of the image is obtained. This may serve to measure not only the (e.g., three-dimensional) position of an object as a whole, or the positions of multiple objects (including objects bounding the monitored space, such as walls), but also to determine, e.g., the shape of an object's surface and/or any changes thereto over time. The process may be repeated to track changes of the object location, shape, etc. For example, in some embodiments, the object is a person's hand, whose position, orientation, and articulation are tracked, e.g., for purposes of gesture recognition.
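  • The per-pixel processing of actions 1206-1210 might be sketched as follows; frames is a stack of images acquired over one emission cycle, and phase_to_angle and angle_to_depth stand in for the calibration and triangulation steps (hypothetical callables, not names used in the disclosure).

```python
import numpy as np

def depth_map_from_sweep(frames, phases, phase_to_angle, angle_to_depth):
    """Build a per-pixel depth map from one emission cycle.

    frames:  array of shape (n_phases, H, W) of reflection intensities
    phases:  phase within the emission cycle at which each frame was acquired
    phase_to_angle(phase):      assumed mapping from cycle phase to emission direction
    angle_to_depth(angle, pix): assumed triangulation (cf. FIG. 5C) from the emission
                                direction and the pixel's viewing direction
    """
    peak_idx = np.argmax(frames, axis=0)                  # action 1206: phase of maximum intensity
    depth = np.zeros(frames.shape[1:])
    for py, px in np.ndindex(*frames.shape[1:]):
        angle = phase_to_angle(phases[peak_idx[py, px]])  # action 1208: attribute a direction
        depth[py, px] = angle_to_depth(angle, (px, py))   # action 1210: triangulate the depth
    return depth
```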
  • FIG. 12B illustrates a method that may be more appropriate for tracking the position(s) of one or more objects within a spatial region that is large compared to the dimensions of the object(s). This method 1220 likewise includes scanning the entire region repeatedly with a light beam (action 1222) and acquiring images of the region (action 1224). However, the method 1220 deviates in the analysis of the image data. Rather than determining depths for all pixels, it involves identifying object(s) of interest in the (one- or two-dimensional) camera images (action 1226), and computing depth values only for one or more pixels of the identified object(s) (actions 1228, 1230, 1232). The object(s) may be identified in the images in any of a number of ways, e.g., using conventional edge-detection or patch-detection techniques, template matching against a database of object images, and/or foreground/background discrimination based on the intensity of the measured reflection integrated over an entire emission cycle (which will generally be greater for objects in the foreground, due to the decrease in intensity with the square of the distance). For pixels of interest, depth may then be determined using the same procedure as described above for method 1200, i.e., by determining the phase within the emission cycle for which the reflection intensity is maximized (action 1228), inferring the corresponding illuminating-beam direction therefrom (action 1230), and computing the depth from the angles of the illuminating and reflected light (action 1232). Advantageously, this method 1220 reduces the computational resources required to process the images.
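  • A corresponding sketch of method 1220, using cycle-integrated intensity as one possible stand-in for the object-identification step; the brightest-fraction cutoff is an assumption made for illustration.

```python
import numpy as np

def depths_for_objects(frames, phases, phase_to_angle, angle_to_depth, fg_fraction=0.05):
    """Compute depth only for likely-foreground pixels (method 1220).

    Foreground pixels are picked here as the brightest fraction of the
    cycle-integrated intensity image, which is only one possible heuristic
    for the object-identification step (action 1226)."""
    total = frames.sum(axis=0)                            # intensity integrated over the cycle
    mask = total >= np.quantile(total, 1.0 - fg_fraction)
    peak_idx = np.argmax(frames, axis=0)
    depth = np.full(total.shape, np.nan)
    for py, px in zip(*np.nonzero(mask)):
        angle = phase_to_angle(phases[peak_idx[py, px]])  # action 1230
        depth[py, px] = angle_to_depth(angle, (px, py))   # action 1232
    return depth
```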
  • Further, while certain embodiments described heretofore utilize a camera that provides one- or two-dimensional frames imaging light reflected from objects in a region of interest, which can be supplemented with depth information employing the methods described above, certain alternative embodiments do not require the use of a multi-element camera, but can localize objects using only a single photo-sensitive element (or multiple individual elements in different locations) in conjunction with a directed light source with variable beam direction. The light source may scan a spatial region according to a suitable scan pattern. For instance, the region may be divided into a number of subregions (e.g., four quadrants), which are then sequentially illuminated. When the illuminating light strikes an object, a reflection signal is detected by the light-sensitive element. The time of receipt of the reflection signal allows an inference of the direction of the illuminating emission and, thus, of the subregion in which the reflecting object is located. (Alternatively, multiple illuminating emissions having different properties, e.g., different color, may be used simultaneously and distinguished using suitable filters with the detector(s).) Multiple light sources and detectors may be used to scan the region of interest along different planes so as to provide three-dimensional information. For example, a cubic volume may be scanned along two perpendicular planes, each divided into four quadrants. Localizing an object in one of the four quadrants for both scans allows identifying in which one of eight sub-cubes the object is located. As will be apparent to one of skill in the art, the spatial resolution of object localization achieved in this manner corresponds directly to the spatial resolution of the scan pattern; the more subregions there are, the more precisely the object location can be determined.
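  • A minimal sketch of this coarse localization, assuming a sequential quadrant scan with a fixed dwell time per subregion and a particular quadrant numbering; both are assumptions made for illustration.

```python
def subregion_from_time(t_receipt, cycle_start, dwell, n_subregions=4):
    """Map the time a reflection is received to the subregion that was being
    illuminated, for a sequential scan with `dwell` seconds per subregion
    (time of flight neglected, as discussed above)."""
    return int((t_receipt - cycle_start) // dwell) % n_subregions

def subcube_from_quadrants(q_xy, q_xz):
    """Combine quadrant indices from two perpendicular scans into one of eight
    sub-cubes.  Quadrants are assumed to be numbered q = x_half + 2*other_half;
    the x information is redundant between the scans and serves as a check."""
    x_a, y_half = q_xy % 2, q_xy // 2
    x_b, z_half = q_xz % 2, q_xz // 2
    if x_a != x_b:
        raise ValueError("inconsistent scans")
    return (x_a, y_half, z_half)
```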
  • In embodiments that utilize multiple light sources, the light emanating therefrom may be distinguished based on its spectral properties (color, frequency, wavelength), intensity, polarization, the (angular) frequency or repetition rate of the scan pattern, the frequency of a binary flash pattern, a temporal intensity modulation pattern, or some other optical property or pattern. For example, if two light sources scan the region of interest periodically, but each at its own repetition rate, a detector may Fourier transform the overall signal it receives to identify peaks at two frequencies corresponding to the two repetition rates. Similarly, if two light sources flash at different frequencies, a Fourier transform and/or suitable frequency filters facilitate discriminating between them and/or extracting a signal corresponding to one or the other light source. Staged filtering allows first extracting the signal from a particular light source and then (following demodulation) extracting a signal at the scan frequency (i.e., repetition rate). Alternatively or additionally, if the different light sources emit different colors or polarizations, the detectors may be equipped with suitable spectral or polarization filters to differentiate between signals from the different sources. An exemplary embodiment is illustrated in FIG. 13A, where an object 1300 may intersect light emitted from two light sources 1302, 1304 that produce light having different characteristics (say, emit light of different colors). A number of light detectors (which may be individual photosensitive elements, or combinations thereof such as cameras) are placed in various locations to capture reflections off the object, each detector being adapted, e.g., via a suitable filter, to receive light only from one of the light sources 1302, 1304. Thus, as illustrated, detector/filter combination 1306 captures light originating from light source 1302, whereas detector/ filter combinations 1308, 1309 capture light originating from light source 1304. Of course, many more light sources and/or detectors may be used. A particular light source may have one or more associated detectors for receiving light from that source. Further, multiple light sources that emit light with the same properties (e.g., color) may share one or more associated detectors. The embodiments described above all take advantage of diffuse reflections of the emission, i.e., of light scattered in all directions. This allows the camera or other sensor to capture reflections off any object within its field of view, regardless of the angle of incidence on the object surface and the angle relative to the surface normal under which the detector appears from the point of incidence (which is the angle of reflection for the portion of the reflected light that is captured). In certain alternative embodiments, specular reflection is used instead. For that purpose, an object of interest may be specially equipped with a reflector (i.e., mirror), or the surface properties of the object may inherently cause specular reflection (e.g., for smooth, metallic surfaces). Since, for specular reflection, the angle of reflection equals the angle of incidence, illuminating light that “sweeps” across the reflector generally results in a sweeping reflection. In order to increase the likelihood of capturing a signal from this moving reflection, a plurality of detectors may be employed, e.g., distributed over a bounding surface (e.g., defined by one or more walls) of the region of interest. 
Further, as with diffuse-reflector embodiments, multiple light sources may be used, e.g., to scan the region of interest along different planes or in different directions. The light sources may differ in their light and scanning properties (e.g., color, polarization, angular frequency of the scan, etc., as described above) to enable the detectors to distinguish between signals originating from different light sources.
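  • One way to realize the repetition-rate discrimination described above is a simple power-spectrum comparison; the band width and the sampling arrangement below are illustrative assumptions rather than details from the disclosure.

```python
import numpy as np

def source_powers(signal, sample_rate, rep_rates, bandwidth=0.5):
    """Estimate how much of a detector signal is attributable to each light
    source, given their distinct scan repetition rates (in Hz), by summing
    the power spectrum in a small band around each rate."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return {rate: spectrum[np.abs(freqs - rate) < bandwidth].sum()
            for rate in rep_rates}
```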
  • Alternatively, to ensure the continuous detection of the reflected light, the object of interest may be equipped with a reflector that reflects light back in the direction it came from, with minimal scattering. Various types of retro-reflectors are well-known to those of skill in the art; one example is a corner reflector, which includes three mutually perpendicular mirror surfaces arranged like three interior walls of a cube that have a common corner point. In retro-reflector embodiments, the reflected light can be captured by a light detector that is co-located with the light source. For purposes hereof, the detector is co-located with the light source if it is sufficiently close to the light source to intercept the reflected light, i.e., if its distance from the optical axis is smaller than the beam-profile radius of the reflected beam at the light source. (The beam radius may be defined, e.g., based on the maximum allowable intensity fall-off from the central peak intensity, and this maximum allowable fall-off may, in turn, depend on the sensitivity of the detector.) In general, the more divergent the beam and the more distant the object of interest, the greater the allowable distance between the detector and the light source. The light detector may be a single photodiode, phototransistor, photovoltaic cell, photoresistor, or other kind of photo-sensing device, or an array or irregular arrangement of multiple such devices. In some embodiments, multiple light-sensing cells are arranged in a ring around the light source to capture a large portion of the reflected light. Importantly, since the reflection of interest comes, at any point in time, only from one direction (which is the direction of the illuminating beam), the sensor need not resolve different directions of incoming light; consequently, there is no need for a focusing optic or for a pixel-wise read-out of the sensor. Instead, to the extent the detector comprises more than one light sensor, the detected light is integrated over all of the sensors to yield an aggregate signal. (Of course, this does not exclude the use of a regular camera with a pixelated sensor and lens.)
  • FIG. 13B illustrates how a light source 1310 and co-located detector 1312 may be used to track a retro-reflecting object 1314. As with the diffuse-reflection embodiments described above, the light source 1310 generates a beam 1316 of variable direction to sweep a region of interest 1318, which is, in this case, a region where the object of interest 1314 is generally expected. The variable beam direction may be achieved with multiple cyclically operated light-emitting devices with fixed beam directions, with one or more moving light emitters, with a fixed light emitter and moving deflector, or generally with any of the light sources described above. The light source 1310 may be operated to repeatedly scan the entire region 1318, preferably at a rate that is fast compared to the rate of motion of the object 1314. Alternatively, the light source may scan only a subregion around the previously measured location, or a projected expected location, of the object 1314. The object 1314 has a retro-reflector 1320 (or multiple retro-reflectors 1320, in which case orientation information can be discerned as well) integrated therewith or attached thereto such that, whenever the beam 1316 strikes the retro-reflector 1320, a reflection signal is sent back to and measured by the detector 1312. Control and detector circuitry 1322 (implemented in a single device or multiple intercommunicating devices) associated with the light source 1310 and the detector 1312 correlates the measured signal with the direction of the illuminating beam 1316 from which it originates. The circuitry 1322 may, for example, include a controller for operating the light source 1310 and an analog phase-detecting circuit that receives a clock signal of the scan frequency as a reference, and measures the timing of the detected reflection relative thereto. Thus, the direction, measured from the light source 1310 and detector 1312, at which the object 1314 appears can be determined as a function of time.
  • If the movement of the object 1314 is confined to a known trajectory (e.g., by means of rails), the intersection of the measured direction with that trajectory uniquely determines the position of the object. Otherwise, additional information may be obtained from a second light-source/detector pair, hereinafter referred to as a "scanner." For instance, as shown in FIG. 13C, the position of an object 1330 that moves in two dimensions may be determined from two two-dimensional scans simultaneously performed by two scanners 1332, 1334. Each of the scanners 1332, 1334 determines a line along which the object is located during a given emission cycle. The intersection 1336 of the two lines specifies the position of the object 1330. Note that, although each line corresponds to the primary direction of the beam that strikes the object 1330, the two beams need not strike the object 1330 at the same time, and consequently need not, and typically do not, intersect each other at the current object location. In fact, the two scanners 1332, 1334 may operate independently from each other, and even at different scan frequencies, as long as they track the object relative to a common spatio-temporal reference frame. Of course, the embodiment illustrated in FIGS. 13B and 13C can straightforwardly be extended to more than two scanners and/or three-dimensional scan patterns, e.g., to take redundant measurements for improved accuracy and/or to allow tracking an object in three dimensions. Object-tracking methods that utilize specular reflection in accordance herewith may be used, e.g., in various industrial contexts. For instance, they may serve to track the positions of robots moving across a manufacturing floor or a distribution and mailing center. One or more scanners may be mounted, e.g., onto the walls or ceiling, to sweep a light beam thereacross and capture reflections from retro-reflectors affixed to the robots.
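  • The intersection 1336 of the two bearing lines can be computed directly; the sketch below assumes known 2-D scanner positions and bearings expressed in a common reference frame.

```python
import numpy as np

def intersect_bearings(p1, theta1, p2, theta2):
    """Object position from two scanners at known positions p1, p2 and the
    bearings theta1, theta2 (radians) they measured: the intersection of the
    rays p_i + t_i * (cos theta_i, sin theta_i)."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    t = np.linalg.solve(np.column_stack([d1, -d2]),
                        np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1
```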
  • In some embodiments employing specular-reflection techniques, the roles of the scanner and the retro-reflector are reversed, and one or more retro-reflectors are fixed in space to serve as reference locations for a moving object equipped with a scanner. In two dimensions, three retro-reflectors 1340 at known locations suffice to uniquely determine the location and/or orientation of the object 1342 from measurements of the three angles γ1, γ2, γ3 enclosed between the lines of sight connecting the object 1342 to respective pairs of retro-reflectors 1340, as illustrated in FIG. 13D. This configuration may find its application, for example, in the triangulation of mobile device positions. An inexpensive scanner may be integrated into a cell phone or other mobile device, and three or more retro-reflectors may be mounted, e.g., on buildings, cell towers, or other permanent structures. The scanner may continuously sweep its surroundings, preferably with light outside the visible range (e.g., IR light). Based on the reflection signals received from three (or more) retro-reflectors at known locations, the position of the mobile device can be computed. To facilitate distinguishing the reflection signals received from different retro-reflectors, the reflectors may alter some property of the light, such as its polarization, a spectral property (e.g., wavelength/frequency, or color), an amplitude modulation, etc. The different reflectors may have different physical properties (e.g., shape, size, color), reflective properties, surface characteristics (e.g., patterns), etc. to effect the different light properties. In some embodiments, the retro-reflectors are positioned at known, elevated heights. Assuming that the scanner is on or near the ground, this allows a determination of the distance between the scanner and a retro-reflector based on the altitudinal (elevation) angle of the reflector as measured from the scanner. With directional and distance information available, the position of the mobile device can be inferred from only two reflection signals. Alternatively, the distances may be used to verify and/or increase the accuracy of the computed device location.
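  • A numerical (least-squares) sketch of this three-angle resection is shown below; the assignment of γ1, γ2, γ3 to particular reflector pairs and the use of scipy's solver are assumptions made for illustration rather than details from the disclosure.

```python
import numpy as np
from scipy.optimize import least_squares

def angle_between(p, a, b):
    """Angle at observer position p enclosed by the lines of sight to a and b."""
    u, v = np.asarray(a, float) - p, np.asarray(b, float) - p
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def locate_by_resection(reflectors, gammas, initial_guess):
    """Recover a 2-D scanner position from the angles (gamma1, gamma2, gamma3)
    measured between pairs of retro-reflectors at known positions.  The pairing
    below, ((r1, r2), (r2, r3), (r1, r3)), is an illustrative assumption."""
    r1, r2, r3 = (np.asarray(r, float) for r in reflectors)
    pairs = [(r1, r2), (r2, r3), (r1, r3)]

    def residuals(p):
        return [angle_between(p, a, b) - g for (a, b), g in zip(pairs, gammas)]

    return least_squares(residuals, np.asarray(initial_guess, float)).x
```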
  • In some embodiments, time of flight techniques or Doppler techniques can be used in conjunction with the techniques described herein. For example, during an initial phase, the light source may be controlled to emit, for a number of emission cycles, only one pulse in one direction per emission cycle, allowing a determination of the delay between the time of emission and the time of receipt of any corresponding reflection. This procedure may be repeated for multiple beam directions until a reflection signal is detected.
  • In embodiments where the light source sends out discrete pulses, and the pulse rate and dimensions of the monitored space are such that the reflection of each pulse is detected before the next pulse is emitted, allowing each image frame to be associated straightforwardly with the direction of the illuminating beam, time-of-flight measurements may be used to supplement the triangulation-based depth/distance determination. For example, if the camera is read out much faster than the illuminating beam changes direction, i.e., a sequence of multiple images is acquired for each pulse, the time delay between the emission of the pulse and the receipt of the reflection can be inferred readily from the number of the image within the sequence that captures the reflection.
  • Furthermore, at significantly higher camera read-out rates, slight time-of-flight differences between pulses from different light emitters reflected by the same object may be resolved; such time-of-flight differences may result from slightly different distances between the various emitters and the object. For instance, two light-emitting devices may emit pulses of light at t=10 ns and t=11 ns, respectively. The first pulse may arrive at the camera sensor at t=10.5 ns while the second pulse may arrive at t=11.6 ns, revealing that the travel time of light from the second emitter is 0.1 ns longer (corresponding to an additional distance of 3 cm). For certain advantageous geometric arrangements of the light emitters, such as the one depicted in FIG. 5A, the light source that emits directly towards an object is also the one closest to the object. Thus, if the pulse emitted directly toward the object results in the brightest reflection signal—as it ordinarily does—the minimum travel time (or phase difference between pulse emission and detection), among the pulses emitted during an emission cycle, generally coincides with the maximum intensity. However, the relationship between the direction of illumination and brightness of the reflection may, in some cases, not hold, e.g., due to surface irregularities of the object that cause different reflectivity for light incident from different directions. A travel-time-based determination of the direction of the illuminating beam along which the object is located may, in these scenarios, be more accurate or reliable than an intensity-based determination. Measurements of the Doppler shift of light (or other radiation) reflected by a moving object can provide additional information about the motion of the object, which may supplement information obtained with methods described herein.
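  • The 0.1 ns example translates into distance as follows.

```python
C = 299_792_458.0  # speed of light, m/s

def extra_path_length(t_emit_1, t_arrive_1, t_emit_2, t_arrive_2):
    """One-way travel distance of the second pulse in excess of the first,
    from the emission and arrival timestamps."""
    dt = (t_arrive_2 - t_emit_2) - (t_arrive_1 - t_emit_1)
    return C * dt

# The example from the text: pulses emitted at 10 ns and 11 ns, received at
# 10.5 ns and 11.6 ns, differ by 0.1 ns of travel time, i.e. about 3 cm.
print(extra_path_length(10e-9, 10.5e-9, 11e-9, 11.6e-9))  # ~0.03 m
```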
  • The various embodiments described herein generally utilize some type of control and/or computational facility (hereinafter “computer”) for operating the light source and detector and analyzing the captured signals or images. This computer may be provided in various form factors, and may be implemented in one or more dedicated, application-specific devices (e.g., a DSP or ASIC) designed or selected for use with the light source and/or detector, or integrated into another, stand-alone computing device (such as a personal computer, tablet computer, or a smart phone), depending on the application context. In one embodiment, some or all of the functionality of the computer is integrated into the light source (e.g., into the support structure onto which the light-emitting devices are mounted) and/or incorporated into or affixed to the detector. The computer may include digital circuitry (e.g., a computer processor and memory) and/or analog circuitry (e.g., an analog phase detector).
  • FIG. 14 shows, in a simplified block diagram, an exemplary embodiment of a computer 1400 for determining a distance to an object in accordance with an embodiment of the disclosed technology. The computer 1400 may include a processor 1402, memory 1404 (e.g., RAM, ROM, and/or flash memory), one or more interfaces 1406 for the light source and detector, and/or one or more user input/output devices 1408 (e.g., a display (optionally touch-enabled), speakers, a keyboard, and/or a mouse), as well as one or more buses 1409 over which these components communicate. The computer 1400 may also include other removable/non-removable, volatile/nonvolatile computer storage media, such as a solid-state or magnetic hard disk, an optical drive, flash memory, random-access memory, read-only memory, or any other similar type of storage medium. The processor 1402 may be a general-purpose microprocessor, microcontroller, digital-signal processor, or any other type of computational engine. The interface 1406 may include hardware and/or software that enables communication between the computer 1400 and the light source and/or detector. For example, the interface 1406 may include one or more data ports (such as USB ports) to which devices may be connected, as well as hardware and/or software signal processors to modify sent or received data signals (e.g., to reduce noise or reformat data). In some embodiments, the interface 1406 also transmits control signals to, e.g., activate or deactivate attached devices, to control camera settings (frame rate, image quality, sensitivity, zoom level, etc.), or the like. Such signals may be transmitted, e.g., in response to control signals from the processor 1402, which may in turn be generated in response to user input or other detected events.
  • The memory 1404 may be used to store instructions to be executed by processor 1402 as well as input and/or output data associated with execution of the instructions. These instructions, illustrated as a group of modules, control the operation of the processor 1402 and its interaction with the other hardware components. Typically, an operating system 1410 directs the execution of low-level, basic system functions such as memory allocation, file management, and operation of mass storage devices. The operating system may be or include a variety of operating systems, such as WINDOWS, LINUX, OS/X, iOS, Android, or any other type of operating system. At a higher level, the memory 1404 may store control modules 1412, 1414 for operating the light source and the camera (or other detector), an image-analysis module 1416 for analyzing the image data received from the detector (e.g., to determine the intensity maximum of the time-varying signal for each pixel, to identify objects of interest and determine the associated direction of the incident beam, etc.), and a triangulation module 1418 that computes depth and/or distance values based on the measured image data and the corresponding control state of the light source. In general, the instructions may be implemented in any programming language, including, for example, C, C++, JAVA, Fortran, Basic, Pascal, or low-level assembler languages.
  • Embodiments of the disclosed technology may be used to map out a room or similarly sized area in order to determine its dimensions and/or precisely locate room walls as well as objects, people, or other things in the room. This information may be used by a computer, television, or other device or machine in the room to improve the experience of a user of the device by, for example, allowing the user to interact with the device based on the room dimensions. The device may adjust a property of its output (e.g., a sound level, sound distribution, brightness, or user-interface perspective) based on objects in the room or the position of the user. Further embodiments can be used to track the motion of objects in a field of view, optionally in conjunction with other mobile-tracking systems. Object tracking may be employed, for example, to recognize gestures or to allow the user to interact with a computationally rendered environment; see, e.g., U.S. Patent Application Ser. No. 61/752,725 (filed on Jan. 15, 2013) and Ser. No. 13/742,953 (filed on Jan. 16, 2013), the entire disclosures of which are hereby incorporated by reference.
  • FIG. 15 illustrates an exemplary task environment in which a human operator 1500 of a machine 1502 interacts with the machine 1502 via motions and gestures. The machine may be communicatively coupled to and receive information about the motions and gestures from one or more light-source/detector pairs 1504. For example, as shown, three light-source/detector pairs 1504 are used to scan the region of interest horizontally and vertically. Of course, different numbers and arrangements of light sources and detectors (whether provided in source/detector pairs 1504 or in configurations where the light sources and detectors are located independently) may be employed in other embodiments. To conduct machine control, a region of space in front of or near the machine 1502 (or, alternatively, in other designated portions of the room) may be scanned by directing one or more light emissions from the vantage point(s) or region(s) of the light source(s) to the region of space, detecting any reflectance of the light emission from an object or objects within the region, and, if a reflectance is detected, inferring therefrom the presence of an object (or objects) in the region of space.
  • For example and with reference to FIG. 16, an exemplary embodiment of a variation-determination system 1600 comprises a model-management module 1602 that provides functionality to build, modify, and/or customize one or more models to recognize variations in objects, positions, motions, and/or attribute states or changes therein based on sensory information obtained from a suitable detection system, such as system 100 shown in FIG. 1. A motion capture and sensory analyzer 1604 finds motions (e.g., translational, rotational), conformations, and presence of objects within sensory information provided by detection system 100. The findings of the motion capture and sensory analyzer 1604 serve as sensed (e.g., observed) information about the environment, with which model refiner 1606 can update predictive information (e.g., models, model portions, model attributes, and so forth).
  • The model refiner 1606 may update one or more models 1608 (or portions thereof) from sensory information (e.g., images, scans, other sensory-perceptible phenomena) and environmental information (i.e., context, noise, and so forth), enabling a model analyzer 1610 to recognize object, position, motion, and/or attribute information that might be useful in controlling a machine. Model refiner 1606 employs an object library 1612 to manage objects including one or more models 1608 (e.g., of user portions (e.g., hand, face), other control objects (e.g., styli, tools) or the like) (see, e.g., the models depicted in FIGS. 17A and 17B), and/or model components (e.g., shapes, 2D model portions that sum to 3D, outlines and/or outline portions (e.g., closed curves), attributes (e.g., attach points, neighbors, sizes (e.g., length, width, depth), rigidity/flexibility, torsional rotation, degrees of freedom of motion, and others), and so forth) useful to define and update models 1608 and model attributes. While illustrated with reference to a particular embodiment in which models, model components, and attributes are co-located within a common object library 1612, it should be understood that these objects may be maintained separately in some embodiments.
  • With the model-management module 1602, one or more object attributes may be determined based on the detected light. Object attributes may include (but are not limited to) the presence or absence of the object; positional attributes such as the (e.g., one-, two-, or three-dimensional) location and/or orientation of the object (or locations and/or orientations of various parts thereof); dynamic attributes characterizing translational, rotational, or other forms of motion of the object (e.g., one-, two-, or three-dimensional momentum or angular momentum); physical attributes (e.g., structural or mechanical attributes such as appearance, shape, structure, conformation, articulation, deformation, flow/dispersion (for liquids), elasticity); optical properties or, more generally, properties affecting or indicative of interaction with electromagnetic radiation of any wavelength (e.g., color, translucence, opaqueness, reflectivity, absorptivity); and/or even chemical properties (as inferred, e.g., from optical properties) (such as material properties and composition).
  • In some embodiments, scanning the region involves multiple emission cycles. During different emission cycles, the region may (but need not) be scanned in accordance with different scan patterns. For example, an initial emission cycle may serve to detect an object, and during a subsequent cycle, a more refined scan pattern may serve to capture surface detail about the object, to determine positional information for at least a portion of the object, or to determine other kinds of object attributes. Multiple sequential emission cycles may also serve to detect changes in any of the object attributes, e.g., due to motion or deformation; for such differential object-attribute determinations, the same or similar scan patterns are typically used throughout the cycles. The object attributes may be analyzed to identify a potential control surface of the object.
  • FIG. 17A illustrates predictive information including a model 1700 of a control object constructed from one or more model subcomponents 1702, 1703 selected and/or configured to represent at least a portion of a surface of control object 112, a virtual surface portion 1706 and one or more attributes 1708. Other components, not shown in FIG. 17A for the sake of clarity, can be included in predictive information 1710, such as models (of user portions (hand, face) or objects (styli, tools)), model components (shapes, e.g., 2D model portions that sum to 3D), model-component attributes (e.g., degrees of freedom of motion, torsional rotation, attach points, neighbors, size (length, width, depth), rigidity/flexibility), and others. In an embodiment, the model subcomponents 1702, 1703 can be selected from a set of radial solids, which can reflect at least a portion of a control object 112 in terms of one or more of structure, motion characteristics, conformational characteristics, other types of characteristics, and/or combinations thereof. In one embodiment, radial solids include a contour and a surface defined by a set of points having a fixed distance from the closest corresponding point on the contour. Another radial solid embodiment includes a set of points a fixed distance from corresponding points on a contour along a line normal thereto. In an embodiment, computational technique(s) for defining the radial solid include finding the closest point on the contour to an arbitrary point, then projecting outward from that closest point by the length of the radius of the solid. In an embodiment, such projection can be a vector normal to the contour at the closest point. An example radial solid (e.g., 1702) includes a "capsuloid", i.e., a capsule-shaped solid including a cylindrical body and semi-spherical ends. Another type of radial solid (e.g., 1703) includes a sphere. Other types of radial solids can be identified based on the foregoing teachings.
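  • As one possible concrete realization of a radial solid, a capsuloid can be represented by a line-segment contour and a radius; the membership test below is an illustrative sketch, not an implementation taken from the disclosure.

```python
import numpy as np

def point_in_capsuloid(point, seg_a, seg_b, radius):
    """Membership test for a capsuloid-type radial solid: the set of points
    within `radius` of the closest point on the segment seg_a-seg_b (the
    contour), i.e., a cylinder with semi-spherical end caps."""
    p, a, b = (np.asarray(v, float) for v in (point, seg_a, seg_b))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab)) <= radius
```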
  • In an embodiment and with reference to FIG. 17B, updating the predictive information to match the observed information comprises selecting one or more sets of points 1750 in space surrounding or bounding the control object within a field of view of one or more image-capture device(s). As shown, points 1750 can be determined using one or more sets of lines 1752A, 1752B, 1752C, and 1752D originating at vantage point(s) associated with the image-capture device(s) (e.g., FIG. 1: 130A, 130B) and determining therefrom one or more intersection point(s) defining a bounding region (i.e., the region formed by lines 1752A, 1752B, 1752C, and 1752D) surrounding a cross-section of the control object. The bounding region can be used to define a virtual surface (FIG. 17A: 1706) to which model subcomponents 1702, 1703, and 1754 can be compared. The virtual surface 1706 can include a visible portion 1760A and a non-visible "inferred" portion 1760B. Virtual surfaces 1706 can include straight portions and/or curved surface portions of one or more virtual solids (i.e., model portions) determined by model refiner 1606.
  • For example and according to one embodiment illustrated by FIG. 17B, model refiner 1606 determines to model subcomponent 1754 of an object portion (here, a finger) using a virtual solid, in this illustration an ellipse, although any of a variety of 3D shapes (e.g., ellipsoid, sphere, or custom shape) and/or 2D slice(s) that are added together to form a 3D volume may be used. Accordingly, beginning with the generalized equation (1) for an ellipse, with (x, y) being the coordinates of a point on the ellipse, (xC, yC) the center, a and b the axes, and θ the rotation angle, the coefficients C1, C2 and C3 are defined in terms of these parameters, as shown:
  • $$C_1 x^2 + C_2 xy + C_3 y^2 - (2C_1 x_c + C_2 y_c)\,x - (2C_3 y_c + C_2 x_c)\,y + (C_1 x_c^2 + C_2 x_c y_c + C_3 y_c^2 - 1) = 0$$
    $$C_1 = \frac{\cos^2\theta}{a^2} + \frac{\sin^2\theta}{b^2}\qquad C_2 = -2\cos\theta\sin\theta\left(\frac{1}{a^2} - \frac{1}{b^2}\right)\qquad C_3 = \frac{\sin^2\theta}{a^2} + \frac{\cos^2\theta}{b^2}\qquad(1)$$
  • The ellipse equation (1) is solved for θ, subject to the constraints that: (1) (xC, yC) must lie on the centerline determined from the four tangents 1752A, 1752B, 1752C, 1752D (i.e., centerline 1756 of FIG. 17B); and (2) a is fixed at the assumed value a0. The ellipse equation can either be solved for θ analytically or solved using an iterative numerical solver (e.g., a Newtonian solver as is known in the art). An analytic solution can be obtained by writing an equation for the distances to the four tangent lines given a yC position, then solving for the value of yC that corresponds to the desired radius parameter a=a0. Accordingly, equations (2) describe the four tangent lines in the x-y plane (of the slice); the coefficients Ai, Bi and Di (for i=1 to 4) are determined from the tangent lines 1752A, 1752B, 1752C, 1752D identified in an image slice as described above.

  • $$A_1 x + B_1 y + D_1 = 0$$
    $$A_2 x + B_2 y + D_2 = 0$$
    $$A_3 x + B_3 y + D_3 = 0$$
    $$A_4 x + B_4 y + D_4 = 0\qquad(2)$$
  • Four column vectors r13, r23, r14 and r24 are obtained from the coefficients Ai, Bi and Di of equations (2) according to equations (3), in which the "\" operator denotes matrix left division, which is defined for a square matrix M and a column vector v such that M\v=r, where r is the column vector that satisfies Mr=v:
  • $$r_{13} = \begin{bmatrix} A_1 & B_1 \\ A_3 & B_3 \end{bmatrix} \backslash \begin{bmatrix} -D_1 \\ -D_3 \end{bmatrix}\qquad r_{23} = \begin{bmatrix} A_2 & B_2 \\ A_3 & B_3 \end{bmatrix} \backslash \begin{bmatrix} -D_2 \\ -D_3 \end{bmatrix}\qquad r_{14} = \begin{bmatrix} A_1 & B_1 \\ A_4 & B_4 \end{bmatrix} \backslash \begin{bmatrix} -D_1 \\ -D_4 \end{bmatrix}\qquad r_{24} = \begin{bmatrix} A_2 & B_2 \\ A_4 & B_4 \end{bmatrix} \backslash \begin{bmatrix} -D_2 \\ -D_4 \end{bmatrix}\qquad(3)$$
  • Four-component vectors G and H are defined in equations (4) from the vectors of tangent coefficients A, B, and D and the scalar quantities p and q, which are in turn defined using the column vectors r13, r23, r14 and r24 from equations (3).

  • $$c_1 = (r_{13} + r_{24})/2$$
    $$c_2 = (r_{14} + r_{23})/2$$
    $$\delta_1 = c_{21} - c_{11}$$
    $$\delta_2 = c_{22} - c_{12}$$
    $$p = \delta_1/\delta_2$$
    $$q = c_{11} - c_{12}\,p$$
    $$G = Ap + B$$
    $$H = Aq + D\qquad(4)$$
  • Six scalar quantities vA2, vAB, vB2, wA2, wAB, and wB2 are defined by equation (5) in terms of the components of vectors G and H of equation (4).
  • $$v = \begin{bmatrix} G_2^2 & G_3^2 & G_4^2 \\ (G_2 H_2)^2 & (G_3 H_3)^2 & (G_4 H_4)^2 \\ H_2^2 & H_3^2 & H_4^2 \end{bmatrix} \backslash \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}\qquad w = \begin{bmatrix} G_2^2 & G_3^2 & G_4^2 \\ (G_2 H_2)^2 & (G_3 H_3)^2 & (G_4 H_4)^2 \\ H_2^2 & H_3^2 & H_4^2 \end{bmatrix} \backslash \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$$
    $$v_{A2} = (v_1 A_1)^2 + (v_2 A_2)^2 + (v_3 A_3)^2$$
    $$v_{AB} = (v_1 A_1 B_1)^2 + (v_2 A_2 B_2)^2 + (v_3 A_3 B_3)^2$$
    $$v_{B2} = (v_1 B_1)^2 + (v_2 B_2)^2 + (v_3 B_3)^2$$
    $$w_{A2} = (w_1 A_1)^2 + (w_2 A_2)^2 + (w_3 A_3)^2$$
    $$w_{AB} = (w_1 A_1 B_1)^2 + (w_2 A_2 B_2)^2 + (w_3 A_3 B_3)^2$$
    $$w_{B2} = (w_1 B_1)^2 + (w_2 B_2)^2 + (w_3 B_3)^2\qquad(5)$$
  • Using the parameters defined in equations (1)-(5), solving for θ is accomplished by solving the eighth-degree polynomial equation (6) for t, where the coefficients Qi (for i=0 to 8) are defined as shown in equations (7)-(15).

  • $$0 = Q_8 t^8 + Q_7 t^7 + Q_6 t^6 + Q_5 t^5 + Q_4 t^4 + Q_3 t^3 + Q_2 t^2 + Q_1 t + Q_0\qquad(6)$$
  • The parameters A1, B1, G1, H1, vA2, vAB, vB2, wA2, wAB, and wB2 used in equations (7)-(15) are defined as shown in equations (2)-(5). The parameter n is the assumed semi-major axis (in other words, a0). Once the real roots t are known, the possible values of θ are given by θ = arctan(t).
  • $$Q_8 = 4A_1^2 n^2 v_{B2}^2 + 4v_{B2}B_1^2(1 - n^2 v_{A2}) - \big(G_1(1 - n^2 v_{A2})w_{B2} + n^2 v_{B2}w_{A2} + 2H_1 v_{B2}\big)^2\qquad(7)$$
    $$Q_7 = -\big(2\big(2n^2 v_{AB}w_{A2} + 4H_1 v_{AB} + 2G_1 n^2 v_{AB}w_{B2} + 2G_1(1 - n^2 v_{A2})w_{AB}\big)\big)\big(G_1(1 - n^2 v_{A2})w_{B2} + n^2 v_{B2}w_{A2} + 2H_1 v_{B2}\big) - 8A_1 B_1 n^2 v_{B2}^2 + 16A_1^2 n^2 v_{AB}v_{B2} + \big(4\big(2A_1 B_1(1 - n^2 v_{A2}) + 2B_1^2 n^2 v_{AB}\big)\big)v_{B2} + 8B_1^2(1 - n^2 v_{A2})v_{AB}\qquad(8)$$
    $$Q_6 = -\big(2\big(2H_1 v_{B2} + 2H_1 v_{A2} + n^2 v_{A2}w_{A2} + n^2 v_{B2}(-2w_{AB} + w_{B2}) + G_1(-n^2 v_{B2} + 1)w_{B2} + 4G_1 n^2 v_{AB}w_{AB} + G_1(1 - n^2 v_{A2})v_{A2}\big)\big)\times\big(G_1(1 - n^2 v_{A2})w_{B2} + n^2 v_{B2}w_{A2} + 2H_1 v_{B2}\big) - \big(2n^2 v_{AB}w_{A2} + 4H_1 v_{AB} + 2G_1 n^2 v_{AB}w_{B2} + 2G_1(1 - n^2 v_{A2})w_{AB}\big)^2 + 4B_1^2 n^2 v_{B2}^2 - 32A_1 B_1 n^2 v_{AB}v_{B2} + 4A_1^2 n^2(2v_{A2}v_{B2} + 4v_{AB}^2) + 4A_1^2 n^2 v_{B2}^2 + \big(4\big(A_1^2(1 - n^2 v_{A2}) + 4A_1 B_1 n^2 v_{AB} + B_1^2(-n^2 v_{B2} + 1) + B_1^2(1 - n^2 v_{A2})\big)\big)v_{B2} + \big(8\big(2A_1 B_1(1 - n^2 v_{A2}) + 2B_1^2 n^2 v_{AB}\big)\big)v_{AB} + 4B_1^2(1 - n^2 v_{A2})v_{A2}\qquad(9)$$
    $$Q_5 = -\big(2\big(4H_1 v_{AB} + 2G_1(-n^2 v_{B2} + 1)w_{AB} + 2G_1 n^2 v_{AB}v_{A2} + 2n^2 v_{AB}(-2w_{AB} + w_{B2})\big)\big)\big(G_1(1 - n^2 v_{A2})w_{B2} + n^2 v_{B2}w_{A2} + 2H_1 v_{B2}\big) - \big(2\big(2H_1 v_{B2} + 2H_1 v_{A2} + n^2 v_{A2}w_{A2} + n^2 v_{B2}(-2w_{AB} + w_{B2}) + G_1(-n^2 v_{B2} + 1)w_{B2} + 4G_1 n^2 v_{AB}w_{AB} + G_1(1 - n^2 v_{A2})v_{A2}\big)\big)\times\big(2n^2 v_{AB}w_{A2} + 4H_1 v_{AB} + 2G_1 n^2 v_{AB}w_{B2} + 2G_1(1 - n^2 v_{A2})w_{AB}\big) + 16B_1^2 n^2 v_{AB}v_{B2} - 8A_1 B_1 n^2(2v_{A2}v_{B2} + 4v_{AB}^2) + 16A_1^2 n^2 v_{A2}v_{AB} - 8A_1 B_1 n^2 v_{B2}^2 + 16A_1^2 n^2 v_{AB}v_{B2} + \big(4\big(2A_1^2 n^2 v_{AB} + 2A_1 B_1(-n^2 v_{B2} + 1) + 2A_1 B_1(1 - n^2 v_{A2}) + 2B_1^2 n^2 v_{AB}\big)\big)v_{B2} + \big(8\big(A_1^2(1 - n^2 v_{A2}) + 4A_1 B_1 n^2 v_{AB} + B_1^2(-n^2 v_{B2} + 1) + B_1^2(1 - n^2 v_{A2})\big)\big)v_{AB} + \big(4\big(2A_1 B_1(1 - n^2 v_{A2}) + 2B_1^2 n^2 v_{AB}\big)\big)v_{A2}\qquad(10)$$
    $$Q_4 = \big(4\big(A_1^2(-n^2 v_{B2}) + A_1^2(1 - n^2 v_{A2}) + 4A_1 B_1 n^2 v_{AB} + B_1^2(-n^2 v_{B2} + 1)\big)\big)v_{B2} + \big(8\big(2A_1^2 n^2 v_{AB} + 2A_1 B_1(-n^2 v_{B2} + 1) + 2A_1 B_1(1 - n^2 v_{A2}) + 2B_1^2 n^2 v_{AB}\big)\big)v_{AB} + \big(4\big(A_1^2(1 - n^2 v_{A2}) + 4A_1 B_1 n^2 v_{AB} + B_1^2(-n^2 v_{B2} + 1) + B_1^2(1 - n^2 v_{A2})\big)\big)v_{A2} + 4B_1^2 n^2(2v_{A2}v_{B2} + 4v_{AB}^2) - 32A_1 B_1 n^2 v_{A2}v_{AB} + 4A_1^2 n^2 v_{A2}^2 + 4B_1^2 n^2 v_{AB}v_{B2} + 4A_1^2 n^2(2v_{A2}v_{B2} + 4v_{AB}^2) - \big(2\big(G_1(-n^2 v_{B2} + 1)v_{A2} + n^2 v_{A2}(-2w_{AB} + w_{B2}) + 2H_1 v_{A2}\big)\big)\big(G_1(1 - n^2 v_{A2})w_{B2} + n^2 v_{B2}w_{A2} + 2H_1 v_{B2}\big) - \big(2\big(4H_1 v_{AB} + 2G_1(-n^2 v_{B2} + 1)w_{AB} + 2G_1 n^2 v_{AB}v_{A2} + 2n^2 v_{AB}(-2w_{AB} + w_{B2})\big)\big)\times\big(2n^2 v_{AB}w_{A2} + 4H_1 v_{AB} + 2G_1 n^2 v_{AB}w_{B2} + 2G_1(1 - n^2 v_{A2})w_{AB}\big) - \big(2H_1 v_{B2} + 2H_1 v_{A2} + n^2 v_{A2}w_{A2} + n^2 v_{B2}(-2w_{AB} + w_{B2}) + G_1(-n^2 v_{B2} + 1)w_{B2} + 4G_1 n^2 v_{AB}w_{AB} + G_1(1 - n^2 v_{A2})v_{A2}\big)^2\qquad(11)$$
    $$Q_3 = -\big(2\big(G_1(-n^2 v_{B2} + 1)v_{A2} + n^2 v_{A2}(-2w_{AB} + w_{B2}) + 2H_1 v_{A2}\big)\big)\big(2n^2 v_{AB}w_{A2} + 4H_1 v_{AB} + 2G_1 n^2 v_{AB}w_{B2} + 2G_1(1 - n^2 v_{A2})w_{AB}\big) - \big(2\big(4H_1 v_{AB} + 2G_1(-n^2 v_{B2} + 1)w_{AB} + 2G_1 n^2 v_{AB}v_{A2} + 2n^2 v_{AB}(-2w_{AB} + w_{B2})\big)\big)\times\big(2H_1 v_{B2} + 2H_1 v_{A2} + n^2 v_{A2}w_{A2} + n^2 v_{B2}(-2w_{AB} + w_{B2}) + G_1(-n^2 v_{B2} + 1)w_{B2} + 4G_1 n^2 v_{AB}w_{AB} + G_1(1 - n^2 v_{A2})v_{A2}\big) + 16B_1^2 n^2 v_{A2}v_{AB} - 8A_1 B_1 n^2 v_{A2}^2 + 16B_1^2 n^2 v_{AB}v_{B2} - 8A_1 B_1 n^2(2v_{A2}v_{B2} + 4v_{AB}^2) + 16A_1^2 n^2 v_{A2}v_{AB} + \big(4\big(2A_1^2 n^2 v_{AB} + 2A_1 B_1(-n^2 v_{B2} + 1)\big)\big)v_{B2} + \big(8\big(A_1^2(-n^2 v_{B2} + 1) + A_1^2(1 - n^2 v_{A2}) + 4A_1 B_1 n^2 v_{AB} + B_1^2(-n^2 v_{B2} + 1)\big)\big)v_{AB} + \big(4\big(2A_1^2 n^2 v_{AB} + 2A_1 B_1(-n^2 v_{B2} + 1) + 2A_1 B_1(1 - n^2 v_{A2}) + 2B_1^2 n^2 v_{AB}\big)\big)v_{A2}\qquad(12)$$
    $$Q_2 = 4A_1^2(-n^2 v_{B2} + 1)v_{B2} + \big(8\big(2A_1^2 n^2 v_{AB} + 2A_1 B_1(-n^2 v_{B2} + 1)\big)\big)v_{AB} + \big(4\big(A_1^2(-n^2 v_{B2} + 1) + A_1^2(1 - n^2 v_{A2}) + 4A_1 B_1 n^2 v_{AB} + B_1^2(-n^2 v_{B2} + 1)\big)\big)v_{A2} + 4B_1^2 n^2 v_{A2}^2 + 4B_1^2 n^2(2v_{A2}v_{B2} + 4v_{AB}^2) - 32A_1 B_1 n^2 v_{A2}v_{AB} + 4A_1^2 n^2 v_{A2}^2 - \big(2\big(G_1(-n^2 v_{B2} + 1)v_{A2} + n^2 v_{A2}(-2w_{AB} + w_{B2}) + 2H_1 v_{A2}\big)\big)\times\big(2H_1 v_{B2} + 2H_1 v_{A2} + n^2 v_{A2}w_{A2} + n^2 v_{B2}(-2w_{AB} + w_{B2}) + G_1(-n^2 v_{B2} + 1)w_{B2} + 4G_1 n^2 v_{AB}w_{AB} + G_1(1 - n^2 v_{A2})v_{A2}\big) - \big(4H_1 v_{AB} + 2G_1(-n^2 v_{B2} + 1)w_{AB} + 2G_1 n^2 v_{AB}v_{A2} + 2n^2 v_{AB}(-2w_{AB} + w_{B2})\big)^2\qquad(13)$$
    $$Q_1 = 8A_1^2(-n^2 v_{B2} + 1)v_{AB} + \big(4\big(2A_1^2 n^2 v_{AB} + 2A_1 B_1(-n^2 v_{B2} + 1)\big)\big)v_{A2} + 16B_1^2 n^2 v_{A2}v_{AB} - 8A_1 B_1 n^2 v_{A2}^2 - \big(2\big(G_1(-n^2 v_{B2} + 1)v_{A2} + n^2 v_{A2}(-2w_{AB} + w_{B2}) + 2H_1 v_{A2}\big)\big)\big(4H_1 v_{AB} + 2G_1(-n^2 v_{B2} + 1)w_{AB} + 2G_1 n^2 v_{AB}v_{A2} + 2n^2 v_{AB}(-2w_{AB} + w_{B2})\big)\qquad(14)$$
    $$Q_0 = 4A_1^2(-n^2 v_{B2} + 1)v_{A2} - \big(G_1(-n^2 v_{B2} + 1)v_{A2} + n^2 v_{A2}(-2w_{AB} + w_{B2}) + 2H_1 v_{A2}\big)^2 + 4B_1^2 n^2 v_{A2}^2\qquad(15)$$
  • In this exemplary embodiment, equations (6)-(15) have at most three real roots; thus, for any four tangent lines, there are at most three possible ellipses that are tangent to all four lines and that satisfy the a=a0 constraint. (In some instances, there may be fewer than three real roots.) For each real root θ, the corresponding values of (xC, yC) and b can be readily determined. Depending on the particular inputs, zero or more solutions will be obtained; for example, in some instances, three solutions can be obtained for a typical configuration of tangents. Each solution is completely characterized by the parameters {θ, a=a0, b, (xC, yC)}.
  • Alternatively, or additionally, referring to FIG. 16, a model builder 1614 and model updater 1616 provide functionality to define, build, and/or customize model(s) 1608 using one or more components in object library 1612. Once built, model refiner 1606 updates and refines the model, bringing the predictive information of the model in line with observed information from the detection system 102.
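  • To make the root-finding step above concrete, the following is a minimal sketch (not the patent's implementation) of recovering candidate solutions numerically once the coefficients Q0 through Q8 of equations (7)-(15) have been evaluated; numpy's general-purpose root finder stands in for whatever closed-form or iterative solver an implementation might use, and recovering (xC, yC) and b from each root is left to the caller.

```python
import numpy as np

def candidate_roots(Q, tol=1e-9):
    """Q = [Q0, Q1, ..., Q8], lowest-order coefficient first.  Returns the
    real roots of the resulting polynomial; each real root corresponds to one
    candidate ellipse (via theta) under the a = a0 constraint."""
    roots = np.roots(Q[::-1])            # np.roots expects highest order first
    return sorted(r.real for r in roots if abs(r.imag) < tol)
```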
  • The model subcomponents 1702, 1703, 1754 can be scaled, sized, selected, rotated, translated, moved, or otherwise re-ordered to enable portions of the model corresponding to the virtual surface(s) to conform within the points 1750 in space. Model refiner 1606 employs a variation detector 1608 to substantially continuously determine differences between sensed information and predictive information and provide to model refiner 1606 a variance useful to adjust the model 1608 accordingly. Variation detector 1608 and model refiner 1606 are further enabled to correlate among model portions to preserve continuity with characteristic information of a corresponding object being modeled, continuity in motion, and/or continuity in deformation, conformation and/or torsional rotations.
  • In an embodiment, when the control object morphs, conforms, and/or translates, motion information reflecting such motion(s) is included in the observed information. Points in space can be recomputed based on the new observation information. The model subcomponents can be scaled, sized, selected, rotated, translated, moved, or otherwise re-ordered to enable portions of the model corresponding to the virtual surface(s) to conform to the set of points in space.
  • In an embodiment, motion(s) of the control object can be rigid transformations, in which case points on the virtual surface(s) remain at the same distance(s) from one another through the motion. Motion(s) can be non-rigid transformations, in which points on the virtual surface(s) can vary in distance(s) from one another during the motion. In an embodiment, observation information can be used to adjust (and/or recompute) predictive information, thereby enabling “tracking” the control object. In embodiments, the control object can be tracked by determining whether a rigid transformation or a non-rigid transformation occurs. In an embodiment, when a rigid transformation occurs, a transformation matrix is applied to each point of the model uniformly. Otherwise, when a non-rigid transformation occurs, an error indication can be determined, and an error-minimization technique such as described herein above can be applied. In an embodiment, rigid transformations and/or non-rigid transformations can be composed. One example composition embodiment includes applying a rigid transformation to predictive information; an error indication can then be determined, and an error-minimization technique such as described herein above can be applied. In an embodiment, determining a transformation can include calculating a rotation matrix that provides a reduced RMSD (root mean squared deviation) between two paired sets of points. One embodiment can include using the Kabsch algorithm to produce a rotation matrix. In an embodiment and by way of example, one or more force lines can be determined from one or more portions of a virtual surface.
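  • The Kabsch step mentioned above can be sketched as follows; this is the standard SVD-based formulation (a generic sketch, not code from the patent), assuming two already-paired (N, 3) point sets P and Q.

```python
import numpy as np

def kabsch_rotation(P, Q):
    """Return the 3x3 rotation matrix minimizing the RMSD between the paired
    point sets P and Q (both shaped (N, 3)), per the Kabsch algorithm."""
    P0 = P - P.mean(axis=0)                  # remove translation
    Q0 = Q - Q.mean(axis=0)
    H = P0.T @ Q0                            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

When the motion is classified as a rigid transformation, a matrix obtained this way (together with the translation between centroids) can be applied uniformly to every point of the model, as described above.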
  • Collisions
  • In an embodiment, predictive information can include collision information concerning two or more capsuloids. By means of illustration, several possible fits of predicted information to observed information can be removed from consideration based upon a determination that these potential solutions would result in collisions of capsuloids. In an embodiment, a relationship between neighboring capsuloids, each having one or more attributes (e.g., determined minima and/or maxima of intersection angles between capsuloids), can be determined. In an embodiment, determining a relationship between a first capsuloid having a first set of attributes and a second capsuloid having a second set of attributes includes detecting and resolving conflicts between first attributes and second attributes. For example, a conflict can include a capsuloid having one type of angle value with a neighbor having a second type of angle value incompatible with the first type of angle value. Attempts to attach a capsuloid with a neighboring capsuloid having attributes such that the combination will exceed what is allowed in the observed—or to pair incompatible angles, lengths, shapes, or other such attributes—can be removed from the predicted information without further consideration.
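  • As one hedged illustration of the conflict detection described above (the data layout and thresholds are hypothetical, not from the patent), a candidate fit could be discarded when the intersection angle between neighboring capsuloid axes falls outside the determined minimum/maximum range:

```python
import numpy as np

def axis_angle(a, b):
    """Intersection angle (radians) between two capsuloid axis vectors."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))

def prune_conflicting_fits(fits, min_angle, max_angle):
    """Keep only fits whose neighboring capsuloids meet at compatible angles.
    Each fit is a list of (position, axis) tuples for consecutive capsuloids."""
    return [fit for fit in fits
            if all(min_angle <= axis_angle(fit[i][1], fit[i + 1][1]) <= max_angle
                   for i in range(len(fit) - 1))]
```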
  • Lean Model
  • In an embodiment, predictive information can be artificially constrained to capsuloids positioned in a subset of the observed information—thereby enabling creation of a “lean model.” For example, as illustrated in FIG. 17A, capsuloid 1702 could be used to denote the portion of the observed information without addition of capsuloids 1703. In yet a further embodiment, connections can be made using artificial constructs to link together capsuloids of a lean model. In another embodiment, the predictive information can be constrained to a subset of topological information about the observed information representing the control object to form a lean model. In an embodiment, a lean model can be associated with a full predictive model. The lean model (or topological information, or properties described above) can be extracted from the predictive model to form a constraint. Then, the constraint can be imposed on the predictive information, thereby enabling the predictive information to be constrained in one or more of behavior, shape, total (system) energy, structure, orientation, compression, shear, torsion, other properties, and/or combinations thereof.
  • Occlusions
  • In an embodiment, the observed can include components reflecting portions of the control object which are occluded from view of the device (“occlusions” or “occluded components”). In one embodiment, the predictive information can be “fit” to the observed as described herein above with the additional constraint(s) that some total property of the predictive information (e.g., potential energy) be minimized or maximized (or driven to lower or higher value(s) through iteration or solution). Properties can be derived from nature, properties of the control object being viewed, others, and/or combinations thereof. In another embodiment, as shown by FIGS. 17C and 17D, a deformation of the predictive information subcomponent 1760 can be allowed subject to an overall permitted value of compression, deformation, flexibility, others, and/or combinations thereof.
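  • A minimal sketch of such a constrained fit, under the assumption that the model supplies a residual function against the observed information and a scalar “energy” function (both callables are placeholders, and scipy's general-purpose minimizer stands in for whatever solver an implementation might use):

```python
import numpy as np
from scipy.optimize import minimize

def fit_with_energy_penalty(residual_fn, energy_fn, x0, weight=0.1):
    """Fit model parameters x to observations while also driving a total
    property of the predictive information (e.g., potential energy) toward a
    minimum, which helps fill in occluded portions plausibly."""
    def cost(x):
        r = np.asarray(residual_fn(x))
        return float(r @ r + weight * energy_fn(x))
    return minimize(cost, x0).x
```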
  • Friction
  • In an embodiment, a “friction constraint” is applied on the model 1700. For example, if fingers of a hand being modeled are close together (in position or orientation), corresponding portions of the model will have more “friction”. The more friction a model subcomponent has in the model, the less the subcomponent moves in response to new observed information. Accordingly, the model is enabled to mimic the way portions of the hand that are physically close together move together, and move less overall.
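  • One way to picture the friction constraint (an illustrative sketch; the specific gain law is an assumption, not the patent's formulation): scale each subcomponent's update toward the new observation by a gain that shrinks as its assigned friction grows.

```python
def apply_observation(position, observed, friction):
    """Blend a model subcomponent toward a newly observed position.  A larger
    friction value (e.g., for fingers that are close together) damps the
    update, so tightly grouped subcomponents move less and more coherently."""
    gain = 1.0 / (1.0 + friction)        # friction >= 0, so gain is in (0, 1]
    return position + gain * (observed - position)
```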
  • With renewed reference to FIG. 16A, an environmental filter 1620 reduces extraneous noise in sensed information received from the detection system 100 using environmental information to eliminate extraneous elements from the sensory information. Environmental filter 1620 employs contrast enhancement, subtraction of a difference image from an image, software filtering, and background subtraction (using background information provided by objects-of-interest determiner 1622 (see below)) to enable model refiner 1606 to build, refine, manage, and maintain model(s) 1608 of objects of interest from which control inputs can be determined.
  • A model analyzer 1610 determines that a reconstructed shape of a sensed object portion matches an object model in an object library, and interprets the reconstructed shape (and/or variations thereon) as user input. Model analyzer 1610 provides output in the form of object, position, motion, and attribute information to an interaction system 1630.
  • The interaction system 1630 includes an interaction-interpretation module 1632 that provides functionality to recognize command and other information from object, position, motion and attribute information obtained from variation system 1600. An interaction-interpretation module 1632 embodiment comprises a recognition engine 1634 to recognize command information such as command inputs (i.e., gestures and/or other command inputs (e.g., speech, and so forth)), related information (i.e., biometrics), environmental information (i.e., context, noise, and so forth) and other information discernable from the object, position, motion, and attribute information that might be useful in controlling a machine. Recognition engine 1634 employs gesture properties 1636 (e.g., path, velocity, acceleration, and so forth), control objects determined from the object, position, motion, and attribute information by an objects-of-interest determiner 1622 and optionally one or more virtual constructs 1638 (see e.g., FIGS. 18A and 18B: 1800 and 1820) to recognize variations in control-object presence or motion indicating command information, related information, environmental information, and other information discernable from the object, position, motion, and attribute information that might be useful in controlling a machine. With reference to FIGS. 18A and 18B, virtual constructs 1800, 1820 implement an engagement target with which a control object 112 interacts—enabling the machine sensory and control system to discern variations in the control object (i.e., motions into, out of, or relative to virtual constructs 1800, 1820) as indicating control or other useful information. Returning to FIG. 16, a gesture trainer 1640 and gesture-properties extractor 1642 provide functionality to define, build, and/or customize gesture properties 1636.
  • A context determiner 1634 and object-of-interest determiner 1622 provide functionality to determine from the object, position, motion, and attribute information objects of interest (e.g., control objects, or other objects to be modeled and analyzed) and/or objects not of interest (e.g., background), based upon a detected context. For example, when the context is determined to be an identification context, a human face will be determined to be an object of interest to the system and will be determined to be a control object. On the other hand, when the context is determined to be a fingertip control context, the fingertips will be determined to be object(s) of interest and will be determined to be control objects whereas the user's face will be determined not to be an object of interest (i.e., background). Further, when the context is determined to be a stylus (or other tool) held in the fingers of the user, the tool tip will be determined to be an object of interest and a control object whereas the user's fingertips might be determined not to be objects of interest (i.e., background). Background objects can be included in the environmental information provided to environmental filter 1620 of model-management module 1602.
  • A virtual environment manager 1646 provides creation, selection, modification, and de-selection of one or more virtual constructs 1800, 1820 (see FIGS. 18A and 18B). In some embodiments, virtual constructs (e.g., a virtual object defined in space such that variations in real objects relative to the virtual construct, when detected, can be interpreted for control or other purposes) are used to determine variations (i.e., virtual “contact” with the virtual construct, breaking of virtual contact, motion relative to a construct portion, and so forth) to be interpreted as engagements, dis-engagements, motions relative to the construct(s), and so forth, enabling the system to interpret pinches, pokes and grabs, and so forth. Interaction-interpretation module 1632 provides as output the command information, related information, and other information discernable from the object, position, motion, and attribute information that might be useful in controlling a machine from recognition engine 1634 to an application control system 1650.
  • Further with reference to FIG. 16, an application control system 1650 includes a control module 1652 that provides functionality to determine and authorize commands based upon the command and other information obtained from interaction system 1630.
  • A control module 1652 embodiment comprises a command engine 1654 to determine whether to issue command(s) and what command(s) to issue based upon the command information, related information, and other information discernable from the object, position, motion, and attribute information, as received from the interaction-interpretation module 1632. Command engine 1654 employs command/control repository 1656 (e.g., application commands, OS commands, commands to the machine sensory and control system, miscellaneous commands) and related information indicating context received from the interaction-interpretation module 1632 to determine one or more commands corresponding to the gestures, context, and so forth indicated by the command information. For example, engagement gestures can be mapped to one or more controls, or a control-less screen location, of a presentation device associated with a machine under control. Controls can include embedded controls (e.g., sliders, buttons, and other control objects in an application), or environmental-level controls (e.g., windowing controls, scrolls within a window, and other controls affecting the control environment). In embodiments, controls may be displayed using 2D presentations (e.g., a cursor, cross-hairs, icon, graphical representation of the control object, or other displayable object) on display screens and/or presented in 3D forms using holography, projectors, or other mechanisms for creating 3D presentations, or may be audible (e.g., mapped to sounds, or other mechanisms for conveying audible information) and/or touchable via haptic techniques.
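  • A toy sketch of the repository lookup described above (the gesture names, contexts, and commands are hypothetical placeholders): the command engine keys into the command/control repository with the recognized gesture and the context reported by the interaction-interpretation module.

```python
# Hypothetical command/control repository: (gesture, context) -> command.
COMMAND_REPOSITORY = {
    ("pinch", "editing"):  "zoom",
    ("swipe", "browsing"): "scroll",
    ("poke",  "dialog"):   "click",
}

def resolve_command(gesture, context):
    """Return the command mapped to this gesture in this context, or None."""
    return COMMAND_REPOSITORY.get((gesture, context))
```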
  • Further, an authorization engine 1658 employs biometric profiles 1660 (e.g., users, identification information, privileges, and so forth) and biometric information received from the interaction-interpretation module 1632 to determine whether commands and/or controls determined by the command engine 1654 are authorized. A command builder 1662 and biometric profile builder 1660 provide functionality to define, build, and/or customize command/control repository 1656 and biometric profiles 1660.
  • Selected authorized commands are provided to machine(s) under control (i.e., “client”) via interface layer 1664. Commands/controls to the virtual environment (i.e., interaction control) are provided to virtual environment manager 1646. Commands/controls to the emission/detection systems (i.e., sensory control) are provided to emission module 102 and/or detection module 104 as appropriate.
  • For example, if the control object is a hand, analysis of the hand's shape and configuration (which may be the object attributes of interest) may determine the positions of the fingertips, which may constitute the relevant control surfaces. Furthermore, changes in control attributes of the identified control surface(s), such as positional changes of the fingertips, may be analyzed to determine whether they are indicative of control information. In hand-gesture-based machine control, for instance, this may serve to discriminate between deliberate motions intended to provide control input and hand jitter or other inevitable motions. Such discrimination may be based, e.g., on the scale and speed of motion, similarity of the motions to pre-defined motion patterns stored in a library, and/or consistency with deliberate motions as characterized using machine learning algorithms or other approaches.
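  • As a rough sketch of the scale-and-speed discrimination mentioned above (the thresholds are illustrative assumptions, not values from the patent):

```python
import numpy as np

def is_deliberate(tip_positions, frame_dt, min_path=0.02, min_speed=0.05):
    """Treat a fingertip trajectory (an (N, 3) array of positions, in meters)
    as deliberate control input only if it covers enough distance at a
    sufficient average speed; shorter, slower motion is dismissed as jitter."""
    steps = np.linalg.norm(np.diff(tip_positions, axis=0), axis=1)
    path_len = float(steps.sum())
    duration = frame_dt * max(len(steps), 1)
    return path_len >= min_path and (path_len / duration) >= min_speed
```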
  • Further, in some embodiments, as illustrated with reference to FIGS. 18A and 18B, a hand gesture or other motion is analyzed relative to a programmatically defined engagement target (e.g., a plane, curved surface (whether open or closed), point, line, or volume whose position and location in space is well-defined and which need generally not coincide with a physical surface) to determine whether the change in the control surface is indicative of an engagement gesture. For example, if the fingertip pierces an engagement surface, this may be interpreted as a click event, or if one or more fingertips or the entire hand moves substantially parallel to an engagement surface defined relative to a display screen, this may be interpreted as a scrolling gesture. If a particular detected motion (or, more generally, change in object attributes) corresponds to control information, an appropriate response action is taken, generally in accordance with and/or based on response criteria, such as the context in which the control information was received (e.g., the particular software application active at the time, the user accessing the system, an active security level, etc.). The response may involve issuing a command (e.g., open a new document upon a “click,” or shift the displayed content in response to a scrolling motion) to a user interface based on the detected gesture or motion. As illustrated in FIGS. 18A and 18B, a machine sensory and controller system 1810 can be embodied as standalone unit(s) 1810 coupleable via an interface (e.g., wired or wireless), embedded (e.g., within a machine 1812, 1814 or machinery under control), or combinations thereof.
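  • For a planar engagement target, the piercing test described above reduces to a signed-distance check between successive frames (a minimal sketch; the variable names and the “click” interpretation are illustrative):

```python
import numpy as np

def pierces_engagement_plane(prev_tip, curr_tip, plane_point, plane_normal):
    """Return True when a fingertip crosses the engagement plane between two
    successive frames, which a recognizer might interpret as a click event."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d_prev = float(np.dot(prev_tip - plane_point, n))
    d_curr = float(np.dot(curr_tip - plane_point, n))
    return d_prev > 0 >= d_curr          # sign change = surface pierced
```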
  • To provide yet another concrete example of machine control in accordance herewith, in one embodiment, the system is used for security purposes and directs light emission at an entryway to a secure room or space. If, during the scan across the entryway, a reflection is detected, this may indicate the presence of a person seeking entrance. A second scan may then be conducted, according to a more refined scan pattern, to obtain more detailed information about the person. For example, a vein pattern of the person's hand may be identified. Vein patterns of authorized users may be stored in a database, allowing the system to check whether the detected person is authorized to enter the secure space. In other words, the control information in this case is authentication information. If a match of the detected vein pattern with a stored pattern is found, the system may respond by permitting the person to enter (e.g., by automatically opening a mechanical barrier, temporarily interrupting laser beams crossing the entryway, or by some other means). Certain embodiments of depth-sensing positioning and tracking systems in accordance herewith may also be mounted on automobiles or other mobile platforms to provide information as to the outside environment (e.g., the positions of other automobiles) to other systems within the platform. In general, embodiments of the disclosed technology may be employed in a variety of application areas, including, without limitation, consumer applications including interfaces for computer systems, laptops, tablets, televisions, game consoles, set-top boxes, telephone devices and/or interfaces to other devices; medical applications including controlling devices for performing robotic surgery, medical imaging systems and applications such as CT, ultrasound, x-ray, MRI or the like, laboratory test and diagnostics systems and/or nuclear medicine devices and systems; prosthetics applications including interfaces to devices providing assistance to persons under handicap, disability, recovering from surgery, and/or other infirmity; defense applications including interfaces to aircraft operational controls, navigation systems control, on-board entertainment systems control and/or environmental systems control; automotive applications including interfaces to automobile operational systems control, navigation systems control, on-board entertainment systems control and/or environmental systems control; security applications including monitoring secure areas for suspicious activity or unauthorized personnel; manufacturing and/or process applications including interfaces to assembly robots, automated test apparatus, work conveyance devices such as conveyors, and/or other factory floor systems and devices, genetic sequencing machines, semiconductor fabrication related machinery, chemical process machinery and/or the like; and/or combinations thereof.
  • Certain embodiments of the disclosed technology are described above. It is, however, expressly noted that the disclosed technology is not limited to those embodiments. Rather, variations, additions, modifications, and other implementations of what is described herein, as will occur to those of ordinary skill in the art, are deemed within the spirit and scope of the disclosed technology. For example, it may be appreciated that the techniques, devices, and systems described herein with reference to embodiments employing light waves may be equally applicable to methods and systems employing other types of radiant energy waves, such as acoustical energy or the like. Moreover, it is to be understood that the features of the various embodiments described herein are not mutually exclusive and can exist in various combinations and permutations, even if such combinations or permutations are not made express herein, without departing from the spirit and scope of the disclosed technology. Accordingly, the scope of the disclosed technology is not intended to be limited by the preceding illustrative description.

Claims (63)

What is claimed is:
1. A method for obtaining positional information about an object within a region of interest, the method comprising:
(a) activating sources directed to portions of the region of interest according to an ordering of points, such that each point in the ordering directs electromagnetic radiation of at least one source to one of the portions of the region of interest;
(b) capturing a portion of the electromagnetic radiation reflected by an object;
(c) forming a signal over time of at least one property of the captured electromagnetic radiation;
(d) determining from the signal, at least one point in the ordering in which a dominant contributor to the captured electromagnetic radiation was activated;
(e) determining an identity for the dominant contributor from the point in the ordering;
(f) determining from the identity of the dominant contributor, a portion of the region of interest to which the electromagnetic radiation from the dominant contributor was directed; and
(g) determining positional information for the object based at least in part upon the portion of the region of interest.
2. The method of claim 1, wherein capturing electromagnetic radiation reflected by the object comprises capturing data frames of the region with a pixelated sensor.
3. The method of claim 2, further comprising determining a direction of the reflected electromagnetic radiation relative to the sensor, the positional information further being based in part on the direction of the reflected electromagnetic radiation.
4. The method of claim 2, wherein the data frames are captured at a rate exceeding a scan rate associated with the illuminating electromagnetic radiation.
5. The method of claim 2, wherein determining a direction of the illuminating electromagnetic radiation associated with the electromagnetic radiation reflected by the object comprises determining an intensity peak across a temporal sequence of data frames for at least one pixel corresponding to the object within the data frame.
6. The method of claim 5, wherein determining the intensity peak comprises performing a Fourier transform on the temporal sequence of data frames for the at least one pixel.
7. The method of claim 1, wherein the electromagnetic radiation is retro-reflected by the object.
8. The method of claim 7, further comprising physically associating the object with a retro-reflector.
9. The method of claim 1, wherein the region is scanned periodically.
10. The method of claim 1, wherein the point in the ordering corresponding to capture of the reflected electromagnetic radiation corresponds to a phase within an emission cycle.
11. The method of claim 10, wherein determining the direction of the illuminating electromagnetic radiation associated with the electromagnetic radiation reflected by the object comprises determining the point in the cycle where the captured radiation is greatest.
12. The method of claim 11, wherein the point in the cycle is detected using a phase-detector circuit.
13. The method of claim 1, wherein scanning the region comprises sequentially operating a plurality of light-emitting devices emitting light to respective different portions of the region of interest.
14. The method of claim 13, further comprising operating only a subset of the plurality of light-emitting devices so as to reduce a resolution of the scan.
15. The method of claim 13, wherein sequentially operating the plurality of devices comprises sequentially causing the devices to emit pulses of light.
16. The method of claim 15, wherein successive pulses overlap temporally.
17. The method of claim 13, wherein sequentially operating the plurality of devices comprises driving each device according to a time-variable intensity having an intensity peak, the peaks occurring sequentially for the plurality of light-emitting devices.
18. The method of claim 17, wherein light emitted by the plurality of light-emitting devices overlaps spatially and temporally, determining a direction of the illuminating light associated with the light reflected by the object comprising determining an effective primary direction of the overlapping illuminating light.
19. The method of claim 1, wherein scanning the region comprises moving a light-emitting device.
20. The method of claim 1, wherein scanning the region comprises moving at least one of a deflecting optic or a screen used in conjunction with a light-emitting device.
21. The method of claim 1, wherein the positional information comprises at least one of a distance, a depth, or a position of at least one of the object or a surface feature thereof.
22. The method of claim 1, wherein the positional information comprises a depth profile of the object.
23. The method of claim 1, wherein the positional information is determined further based at least in part on a geometric relationship between the source and the detector.
24. The method of claim 1, further comprising periodically repeating steps (a) through (c) so as to update the positional information to track movement of the object.
25. A system for obtaining positional information about an object within a region of interest, the system comprising:
a directed light source with variable direction for scanning the region with an illuminating light;
a detector for capturing light reflected by the object, and
circuitry for (i) determining a time of capture of the reflected light and, based thereon, an associated direction of the illuminating light, and (ii) deriving the positional information about the object at least in part from the direction of the illuminating light.
26. The system of claim 25, wherein the directed light source comprises a plurality of light-emitting devices emitting light in a respective plurality of different primary directions.
27. The system of claim 26, wherein the directed light source further comprises a controller for sequentially operating the plurality of light-emitting devices.
28. The system of claim 26, wherein the light-emitting devices comprise light-emitting diodes.
29. The system of claim 26, wherein the light-emitting devices are arranged such that their respective primary directions intersect at a common center.
30. The system of claim 29, wherein the light-emitting devices are affixed to at least one of an arcuate surface, facets of a polygonal surface, or facets of a polyhedral surface.
31. The system of claim 28, wherein the plurality of light-emitting devices comprises a plurality of light emitters and a plurality of associated deflecting optics for deflecting light emitted by the emitters into the different primary directions.
32. The system of claim 27, wherein the directed light source comprises at least one moving light-emitting device.
33. The system of claim 27, wherein the directed light source comprises at least one light-emitting device and at least one of a moving deflecting optic or a moving screen having a perforation therein.
34. The system of claim 27, wherein the detector comprises a camera for imaging the region.
35. The system of claim 34, wherein the camera comprises a lens and a sensor.
36. The system of claim 35, wherein the sensor comprises at least one of a CCD sensor or a MEMS sensor.
37. The system of claim 34, wherein the camera comprises a light-sensing device and a scanning mirror.
38. The system of claim 25, wherein the detector is co-located with the light source, the system further comprising a retro-reflector affixed to the object.
39. The system of claim 25, wherein the directed light source comprises a controller for varying the emission direction so as to periodically scan the region.
40. The system of claim 39, wherein the controller is synchronized with the circuitry.
41. The system of claim 39, wherein the circuitry causes the detector to be read out at a rate exceeding the scan rate of the directed light source.
42. The system of claim 39, wherein the circuitry comprises a phase-detector circuit for determining a phase within an emission cycle corresponding to a maximum intensity of the captured light.
43. The system of claim 39, wherein the circuitry comprises a digital processor configured for performing a Fourier transform on the captured light to thereby determine a phase within an emission cycle corresponding to a maximum intensity of the captured light.
44. The system of claim 25, further comprising a retro-reflector affixed to the object.
45. A method for determining depth associated with at least one object within a region of interest, the method comprising:
scanning the region with an illuminating light beam having a temporally variable beam direction so as to illuminate the at least one object;
acquiring a temporal sequence of images of the region while the region is being scanned, each image corresponding to an instantaneous direction of the illuminating light beam, at least one of the images capturing light reflected by the at least one illuminated object; and
based at least in part on the instantaneous direction of the light beam in the at least one image capturing light reflected by the at least one object, determining a depth associated with the at least one object.
46. The method of claim 45, wherein multiple ones of the images acquired during a single scan of the region capture light reflected by the at least one object, the method comprising determining a depth profile of the at least one object based thereon.
47. A method for locating an object within a region, the method comprising:
using a light source affixed to the object, scanning the region with an illuminating light beam having a temporally variable beam direction;
using a sensor co-located with the light source, capturing reflections of the illuminating beam from a plurality of retro-reflectors fixedly positioned at known locations;
based on times of capture of the reflections, determining associated directions of the illuminating light beam; and
locating the object relative to the known locations of the retro-reflectors based at least in part on the directions of the illuminating light beam.
48. The method of claim 47, wherein the object is located in a two-dimensional region based on reflections from at least three retro-reflectors.
49. A device, affixed to an object of interest, for locating the object within a region relative to a plurality of retro-reflectors fixedly positioned at known locations, the device comprising:
a light source for scanning the region with an illuminating light beam having a temporally variable beam direction;
a sensor co-located with the light source for capturing reflections of the illuminating beam from the plurality of retro-reflectors;
circuitry for determining, from times of capture of the reflections, directions of the illuminating light beam associated therewith, and for locating the object relative to the retro-reflectors based at least in part on the directions.
50. The device of claim 49, wherein the object is a mobile device.
51. A computer-implemented method for conducting machine control, the method comprising:
scanning a region of space, the scanning including (i) directing at least one light emission from a vantage point of a vantage region to a region of space, (ii) detecting a reflectance of the at least one light emission, and (iii) determining that the detected reflectance indicates a presence of an object in the region of space;
determining one or more object attributes of the object;
analyzing the one or more object attributes to determine a potential control surface of the object;
determining that control-surface attribute changes in the potential control surface indicate control information; and
responding to the indicated control information according to response criteria.
52. A computer-implemented method according to claim 51, wherein the first light emission is directed to the region of space according to a first scan pattern, and wherein determining that the detected reflectance indicates a presence of an object comprises directing a second light emission to the region of space according to a second scan pattern.
53. A computer-implemented method according to claim 52, wherein directing the second emission comprises:
scanning to a refined scan pattern.
54. A computer-implemented method according to claim 53, wherein scanning to the refined scan pattern includes capturing surface detail about the object.
55. A computer-implemented method according to claim 52, wherein determining object attributes of the object comprises:
determining positional information of at least a portion of the object.
56. A computer-implemented method according to claim 55, wherein analyzing the object attributes to determine a potential control surface of the object comprises:
determining based at least in part upon the positional information whether a portion of the object provides control information.
57. A computer-implemented method according to claim 56, wherein determining whether one or more control-surface attribute changes in the potential control surface indicate control information comprises:
determining whether control-surface attribute changes in the potential control surface indicate an engagement gesture.
58. A computer-implemented method according to claim 57, wherein responding to the indicated control information according to response criteria comprises:
determining a command to a user interface based at least in part upon the engagement gesture.
59. A computer-implemented method according to claim 52, wherein determining object attributes of the object comprises:
determining dynamic information of at least a portion of the object.
60. A computer-implemented method according to claim 52, wherein determining object attributes of the object comprises:
determining physical information of at least a portion of the object.
61. A computer-implemented method according to claim 52, wherein determining object attributes of the object comprises:
determining at least one of optical or radio properties of at least a portion of the object.
62. A computer-implemented method according to claim 52, wherein determining object attributes of the object comprises:
determining chemical properties of at least a portion of the object.
63. A computer-implemented method according to claim 51, wherein:
directing the emission includes scanning across an entryway;
determining that the detected reflectance indicates a presence includes detecting an object comprising a person seeking entrance, and conducting a second scanning to a refined scan pattern of the person;
determining whether control-surface attribute changes in the potential control surface indicate control information includes determining whether the control-surface attribute changes indicate a vein pattern of a hand of the person; and
responding to the indicated control information according to response criteria comprises permitting the person to enter when the vein pattern matches a stored vein pattern of an individual authorized to enter.
US14/212,485 2012-01-17 2014-03-14 Determining positional information for an object in space Abandoned US20150253428A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US14/212,485 US20150253428A1 (en) 2013-03-15 2014-03-14 Determining positional information for an object in space
US14/280,018 US9679215B2 (en) 2012-01-17 2014-05-16 Systems and methods for machine control
US15/392,920 US9778752B2 (en) 2012-01-17 2016-12-28 Systems and methods for machine control
US15/696,086 US10691219B2 (en) 2012-01-17 2017-09-05 Systems and methods for machine control
US16/908,643 US11493998B2 (en) 2012-01-17 2020-06-22 Systems and methods for machine control
US17/862,212 US11720180B2 (en) 2012-01-17 2022-07-11 Systems and methods for machine control
US18/209,259 US20230325005A1 (en) 2012-01-17 2023-06-13 Systems and methods for machine control

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361800327P 2013-03-15 2013-03-15
US201361801479P 2013-03-15 2013-03-15
US201361792025P 2013-03-15 2013-03-15
US14/212,485 US20150253428A1 (en) 2013-03-15 2014-03-14 Determining positional information for an object in space

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/154,730 Continuation-In-Part US9501152B2 (en) 2012-01-17 2014-01-14 Free-space user interface and control using virtual constructs

Related Child Applications (3)

Application Number Title Priority Date Filing Date
US13/724,357 Continuation-In-Part US9070019B2 (en) 2012-01-17 2012-12-21 Systems and methods for capturing motion in three-dimensional space
US14/250,758 Continuation-In-Part US20140307920A1 (en) 2012-01-17 2014-04-11 Systems and methods for tracking occluded objects in three-dimensional space
US14/280,018 Continuation-In-Part US9679215B2 (en) 2012-01-17 2014-05-16 Systems and methods for machine control

Publications (1)

Publication Number Publication Date
US20150253428A1 true US20150253428A1 (en) 2015-09-10

Family

ID=51568932

Family Applications (6)

Application Number Title Priority Date Filing Date
US14/214,605 Active US9702977B2 (en) 2013-03-15 2014-03-14 Determining positional information of an object in space
US14/212,485 Abandoned US20150253428A1 (en) 2012-01-17 2014-03-14 Determining positional information for an object in space
US15/625,856 Active US9927522B2 (en) 2013-03-15 2017-06-16 Determining positional information of an object in space
US15/936,185 Active US10585193B2 (en) 2013-03-15 2018-03-26 Determining positional information of an object in space
US16/799,598 Active 2034-05-04 US11693115B2 (en) 2013-03-15 2020-02-24 Determining positional information of an object in space
US18/197,961 Pending US20230288563A1 (en) 2013-03-15 2023-05-16 Determining positional information of an object in space

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/214,605 Active US9702977B2 (en) 2013-03-15 2014-03-14 Determining positional information of an object in space

Family Applications After (4)

Application Number Title Priority Date Filing Date
US15/625,856 Active US9927522B2 (en) 2013-03-15 2017-06-16 Determining positional information of an object in space
US15/936,185 Active US10585193B2 (en) 2013-03-15 2018-03-26 Determining positional information of an object in space
US16/799,598 Active 2034-05-04 US11693115B2 (en) 2013-03-15 2020-02-24 Determining positional information of an object in space
US18/197,961 Pending US20230288563A1 (en) 2013-03-15 2023-05-16 Determining positional information of an object in space

Country Status (2)

Country Link
US (6) US9702977B2 (en)
WO (1) WO2014200589A2 (en)

Cited By (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140285818A1 (en) * 2013-03-15 2014-09-25 Leap Motion, Inc. Determining positional information of an object in space
US20140300544A1 (en) * 2013-04-04 2014-10-09 Funai Electric Co., Ltd. Projector and Electronic Device Having Projector Function
US20150251089A1 (en) * 2014-03-07 2015-09-10 Sony Corporation Information processing apparatus, information processing system, information processing method, and program
US20150254947A1 (en) * 2014-03-07 2015-09-10 Sony Corporation Information processing apparatus, information processing system, information processing method, and program
US20160057340A1 (en) * 2014-08-22 2016-02-25 Samsung Electronics Co., Ltd. Depth detecting apparatus and method, and gesture detecting apparatus and gesture detecting method
US20160076878A1 (en) * 2014-09-17 2016-03-17 Canon Kabushiki Kaisha Depth value measurement using illumination by pixels
US20160323236A1 (en) * 2013-12-16 2016-11-03 Inbubbles Inc. Space Time Region Based Communications
US20170046815A1 (en) * 2015-08-12 2017-02-16 Boe Technology Group Co., Ltd. Display Device, Display System and Resolution Adjusting Method
US20170052024A1 (en) * 2015-08-21 2017-02-23 Adcole Corporation Optical profiler and methods of use thereof
US20170061205A1 (en) * 2012-01-17 2017-03-02 Leap Motion, Inc. Enhanced Contrast for Object Detection and Characterization By Optical Imaging Based on Differences Between Images
CN106548455A (en) * 2015-09-17 2017-03-29 三星电子株式会社 For adjusting the apparatus and method of the brightness of image
US20170109920A1 (en) * 2015-10-20 2017-04-20 Samsung Electronics Co., Ltd. Method and apparatus rendering caustics
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US9697643B2 (en) 2012-01-17 2017-07-04 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US20170199272A1 (en) * 2014-07-03 2017-07-13 Sharp Kabushiki Kaisha Optical reflection sensor and electronic device
US20170242110A1 (en) * 2016-02-22 2017-08-24 Keyence Corporation Optical Safety System
US20170287153A1 (en) * 2016-03-31 2017-10-05 Mda Information Systems Llc Method and apparatus for imaging the silhouette of an object occluding a light source using a synthetic aperature
US20180107108A1 (en) * 2015-04-30 2018-04-19 Sony Corporation Image processing apparatus, image processing method, and program
US10015412B2 (en) * 2016-09-06 2018-07-03 The Trustees For The Time Being Of Junior Barnes Family Trust Video capturing system and method for imaging cyclically moving objects
US20180189591A1 (en) * 2016-05-23 2018-07-05 Sony Corporation Electronic apparatus, method of controlling electronic apparatus, and program
CN108472013A (en) * 2016-01-22 2018-08-31 奥林巴斯株式会社 The working procedure of ultrasound observation apparatus, the working method of ultrasound observation apparatus and ultrasound observation apparatus
US20180253608A1 (en) * 2017-03-03 2018-09-06 Magna Electronics Inc. Trailer angle detection system for vehicle
US10084979B2 (en) * 2016-07-29 2018-09-25 International Business Machines Corporation Camera apparatus and system, method and recording medium for indicating camera field of view
US10114467B2 (en) 2015-11-30 2018-10-30 Photopotech LLC Systems and methods for processing image information
US10145671B2 (en) * 2016-03-31 2018-12-04 Topcon Positioning Systems, Inc. Three dimensional laser measuring system and method
US20190056218A1 (en) * 2017-08-17 2019-02-21 Carl Zeiss Industrielle Messtechnik Gmbh Method and apparatus for determining at least one of dimensional or geometric characteristics of a measurement object
US20190071123A1 (en) * 2017-09-07 2019-03-07 Ford Global Technologies, Llc Hitch assist system featuring trailer location identification
US20190108647A1 (en) * 2017-10-10 2019-04-11 The Boeing Company Systems and methods for 3d cluster recognition for relative tracking
US20190129038A1 (en) * 2017-10-27 2019-05-02 Osram Opto Semiconductors Gmbh Monitoring System for a Mobile Device and Method for Monitoring Surroundings of a Mobile Device
WO2019096369A1 (en) * 2017-11-14 2019-05-23 Alterface Holdings Tracking of a user device
US10306156B2 (en) * 2015-11-30 2019-05-28 Photopotech LLC Image-capture device
US10362296B2 (en) * 2017-08-17 2019-07-23 Microsoft Technology Licensing, Llc Localized depth map generation
US10386468B2 (en) * 2015-03-05 2019-08-20 Hanwha Techwin Co., Ltd. Photographing apparatus and method
US20190347399A1 (en) * 2018-05-09 2019-11-14 Shape Matrix Geometric Instruments, LLC Methods and Apparatus for Encoding Passwords or Other Information
US10482575B2 (en) * 2017-09-28 2019-11-19 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
US20190392601A1 (en) * 2018-02-23 2019-12-26 Librestream Technologies Inc. Image Processing System for Inspecting Object Distance and Dimensions Using a Hand-Held Camera with a Collimated Laser
US20200096640A1 (en) * 2018-09-26 2020-03-26 Qualcomm Incorporated Multi-phase active light depth system
US20200132547A1 (en) * 2018-10-30 2020-04-30 Variable, Inc. System and method for spectral interpolation using multiple illumination sources
CN111200709A (en) * 2019-01-08 2020-05-26 英属开曼群岛商麦迪创科技股份有限公司 Method for setting light source of camera system, camera system and vehicle
US10691219B2 (en) 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US20200209984A1 (en) * 2018-12-28 2020-07-02 Texas Instruments Incorporated Optical positioning systems and methods
US20200271473A1 (en) * 2018-02-14 2020-08-27 Tusimple, Inc. Lane Marking Localization
US10778877B2 (en) 2015-11-30 2020-09-15 Photopotech LLC Image-capture device
WO2020214201A1 (en) 2019-04-17 2020-10-22 Carnegie Mellon University Agile depth sensing using triangulation light curtains
US20200408883A1 (en) * 2017-07-11 2020-12-31 Robert Bosch Gmbh Lidar device for situation-dependent scanning of solid angles
US10885810B2 (en) 2014-07-11 2021-01-05 Shape Matrix Geometric Instruments, LLC Shape-matrix geometric instrument
US20210052329A1 (en) * 2018-03-30 2021-02-25 Koninklijke Philips N.V. Monitoring of moving objects in an operation room
US10970373B2 (en) * 2018-08-06 2021-04-06 Lg Electronics Inc. Mobile terminal and method for controlling the same
WO2021091105A1 (en) 2019-11-06 2021-05-14 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
US11009356B2 (en) * 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization and fusion
WO2021146117A1 (en) * 2020-01-13 2021-07-22 Sony Interactive Entertainment Inc. Event driven sensor (eds) tracking of light emitting diode (led) array
US11081516B2 (en) * 2018-08-10 2021-08-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Display screen, electronic device and method for three-dimensional feature recognition
US11093031B2 (en) * 2017-06-28 2021-08-17 Trumpf Photonic Components Gmbh Display apparatus for computer-mediated reality
US11099653B2 (en) 2013-04-26 2021-08-24 Ultrahaptics IP Two Limited Machine responsiveness to dynamic user movements and gestures
US20210286455A1 (en) * 2020-03-12 2021-09-16 Beijing Xiaomi Mobile Software Co., Ltd. Electronic equipment, method for controlling electronic equipment, and storage medium
US11143749B2 (en) * 2014-05-23 2021-10-12 Signify Holding B.V. Object detection system and method
US11212456B2 (en) * 2018-12-21 2021-12-28 Sony Group Corporation Synchronized projection and image capture
US11217009B2 (en) 2015-11-30 2022-01-04 Photopotech LLC Methods for collecting and processing image information to produce digital assets
US11282288B2 (en) 2019-11-20 2022-03-22 Shape Matrix Geometric Instruments, LLC Methods and apparatus for encoding data in notched shapes
US20220113380A1 (en) * 2020-10-14 2022-04-14 Argo AI, LLC Multi-detector lidar systems and methods for mitigating range aliasing
US11353962B2 (en) 2013-01-15 2022-06-07 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US20220198669A1 (en) * 2019-04-02 2022-06-23 Koninklijke Philips N.V. Segmentation and view guidance in ultrasound imaging and associated devices, systems, and methods
US11412133B1 (en) * 2020-06-26 2022-08-09 Amazon Technologies, Inc. Autonomously motile device with computer vision
US11493998B2 (en) 2012-01-17 2022-11-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US11567578B2 (en) 2013-08-09 2023-01-31 Ultrahaptics IP Two Limited Systems and methods of free-space gestural interaction
US11573095B2 (en) 2017-08-22 2023-02-07 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
US11654375B2 (en) * 2019-08-07 2023-05-23 Universal City Studios Llc Systems and methods for detecting specular surfaces
US20230230251A1 (en) * 2019-11-14 2023-07-20 Samsung Electronics Co., Ltd. Image processing apparatus and method
US11709364B1 (en) 2019-05-22 2023-07-25 Meta Platforms Technologies, Llc Addressable crossed line projector for depth camera assembly
US11733524B1 (en) * 2018-02-01 2023-08-22 Meta Platforms Technologies, Llc Depth camera assembly based on near infra-red illuminator
US11740705B2 (en) 2013-01-15 2023-08-29 Ultrahaptics IP Two Limited Method and system for controlling a machine according to a characteristic of a control object
US11747135B2 (en) 2015-02-13 2023-09-05 Carnegie Mellon University Energy optimized imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor
US11753003B2 (en) * 2017-04-13 2023-09-12 Zoox, Inc. Surface normal determination for LIDAR range samples by detecting probe pulse stretching
US11775033B2 (en) 2013-10-03 2023-10-03 Ultrahaptics IP Two Limited Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US11778159B2 (en) 2014-08-08 2023-10-03 Ultrahaptics IP Two Limited Augmented reality with motion sensing
US11810322B2 (en) 2020-04-09 2023-11-07 Tusimple, Inc. Camera pose estimation techniques

Families Citing this family (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2208354A4 (en) 2007-10-10 2010-12-22 Gerard Dirk Smits Image projector with reflected light tracking
US20140168372A1 (en) * 2012-12-17 2014-06-19 Eminent Electronic Technology Corp. Ltd. Sensing apparatus and sensing method for generating three-dimensional image information
US9721383B1 (en) 2013-08-29 2017-08-01 Leap Motion, Inc. Predictive information for free space gesture control and communication
US9996638B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Predictive information for free space gesture control and communication
JP6214783B2 (en) * 2014-02-26 2017-10-18 フィリップス ライティング ホールディング ビー ヴィ Estimating the position of the light source of a lighting fixture from the light reach
US9810913B2 (en) 2014-03-28 2017-11-07 Gerard Dirk Smits Smart head-mounted projection system
US9785247B1 (en) 2014-05-14 2017-10-10 Leap Motion, Inc. Systems and methods of tracking moving hands and recognizing gestural interactions
DE102014109687B4 (en) * 2014-07-10 2020-03-19 Carl Zeiss Microscopy Gmbh Position determination of an object in the beam path of an optical device
US9377533B2 (en) 2014-08-11 2016-06-28 Gerard Dirk Smits Three-dimensional triangulation and time-of-flight based tracking systems and methods
KR102062840B1 (en) * 2014-10-31 2020-02-11 삼성전자주식회사 APPARATUS FOR DETECTING POSITION OF OBJECT using Binocular Parallax AND OPERATING METHOD THEREOF
TWI524758B (en) 2014-12-09 2016-03-01 財團法人工業技術研究院 Electronic apparatus and method for incremental pose estimation and photographing thereof
US10043282B2 (en) 2015-04-13 2018-08-07 Gerard Dirk Smits Machine vision for ego-motion, segmenting, and classifying objects
DE102015107517B3 (en) 2015-05-13 2016-06-23 Carl Zeiss Ag Apparatus and method for image acquisition with increased depth of field
EP3095710B1 (en) 2015-05-20 2018-01-03 Goodrich Lighting Systems GmbH Dynamic exterior aircraft light unit and method of operating a dynamic exterior aircraft light unit
US10368414B2 (en) 2015-05-26 2019-07-30 Signify Holding B.V. Determining the position of a portable device relative to a luminaire
KR102311688B1 (en) 2015-06-17 2021-10-12 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9753126B2 (en) 2015-12-18 2017-09-05 Gerard Dirk Smits Real time position sensing of objects
US9813673B2 (en) 2016-01-20 2017-11-07 Gerard Dirk Smits Holographic video capture and telepresence system
US10761195B2 (en) 2016-04-22 2020-09-01 OPSYS Tech Ltd. Multi-wavelength LIDAR system
DE102016116311A1 (en) 2016-05-02 2017-11-02 Carl Zeiss Microscopy Gmbh Angle selective lighting
JP2018031607A (en) 2016-08-23 2018-03-01 ソニーセミコンダクタソリューションズ株式会社 Distance measuring device, electronic device, and method for controlling distance measuring device
US10067230B2 (en) 2016-10-31 2018-09-04 Gerard Dirk Smits Fast scanning LIDAR with dynamic voxel probing
WO2018125850A1 (en) * 2016-12-27 2018-07-05 Gerard Dirk Smits Systems and methods for machine perception
CN108460824B (en) * 2017-02-20 2024-04-02 北京三星通信技术研究有限公司 Method, device and system for determining stereoscopic multimedia information
US10445893B2 (en) * 2017-03-10 2019-10-15 Microsoft Technology Licensing, Llc Dot-based time of flight
KR102592139B1 (en) 2017-03-13 2023-10-23 옵시스 테크 엘티디 Eye-Safe Scanning LIDAR System
US10511828B2 (en) * 2017-03-29 2019-12-17 Intel Corporation Camera platforms with rolling light projection
WO2018209096A2 (en) 2017-05-10 2018-11-15 Gerard Dirk Smits Scan mirror systems and methods
US11048329B1 (en) 2017-07-27 2021-06-29 Emerge Now Inc. Mid-air ultrasonic haptic interface for immersive computing environments
KR102218679B1 (en) 2017-07-28 2021-02-23 옵시스 테크 엘티디 VCSEL Array LIDAR Transmitter with Small Angle Divergence
US10871570B2 (en) 2017-09-14 2020-12-22 Everysight Ltd. System and method for position and orientation tracking
CN111164459A (en) * 2017-09-28 2020-05-15 索尼半导体解决方案公司 Apparatus and method
WO2019079750A1 (en) 2017-10-19 2019-04-25 Gerard Dirk Smits Methods and systems for navigating a vehicle including a novel fiducial marker system
JP7388720B2 (en) 2017-11-15 2023-11-29 オプシス テック リミテッド Noise-adaptive solid-state LIDAR system
WO2019148214A1 (en) 2018-01-29 2019-08-01 Gerard Dirk Smits Hyper-resolved, high bandwidth scanned lidar systems
CN111919137A (en) 2018-04-01 2020-11-10 欧普赛斯技术有限公司 Noise adaptive solid state LIDAR system
KR102622714B1 (en) * 2018-04-08 2024-01-08 디티에스, 인코포레이티드 Ambisonic depth extraction
US20200041614A1 (en) * 2018-08-03 2020-02-06 OPSYS Tech Ltd. Distributed Modular Solid-State LIDAR System
US11158074B1 (en) 2018-10-02 2021-10-26 Facebook Technologies, Llc Depth sensing using temporal coding
US10901092B1 (en) 2018-10-02 2021-01-26 Facebook Technologies, Llc Depth sensing using dynamic illumination with range extension
US10896516B1 (en) 2018-10-02 2021-01-19 Facebook Technologies, Llc Low-power depth sensing using dynamic illumination
US10839536B2 (en) * 2018-10-02 2020-11-17 Facebook Technologies, Llc Depth sensing using grid light patterns
WO2020161871A1 (en) * 2019-02-07 2020-08-13 マクセル株式会社 Composite reception/emission apparatus
CN109872345B (en) * 2019-02-27 2022-08-26 中国科学院光电技术研究所 Single target tracking method under dark background
JP2022526998A (en) 2019-04-09 2022-05-27 オプシス テック リミテッド Solid-state LIDAR transmitter with laser control
EP3977159A4 (en) 2019-05-30 2023-03-01 Opsys Tech Ltd. Eye-safe long-range lidar system using actuator
CN113924506A (en) 2019-06-10 2022-01-11 欧普赛斯技术有限公司 Eye-safe long-range solid-state LIDAR system
CN111104641B (en) * 2019-12-10 2023-07-21 重庆大学 Method for identifying crystal grains by computer in three-dimensional space
US11126267B2 (en) 2019-12-19 2021-09-21 Giantplus Technology Co., Ltd Tactile feedback device and operation method thereof
CN110978058B (en) * 2019-12-24 2022-10-11 复旦大学 Pose measurement and kinematics model correction method suitable for industrial robot
US11635802B2 (en) * 2020-01-13 2023-04-25 Sony Interactive Entertainment Inc. Combined light intensity based CMOS and event detection sensor for high speed predictive tracking and latency compensation in virtual and augmented reality HMD systems
WO2021174227A1 (en) 2020-02-27 2021-09-02 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array
US11508077B2 (en) 2020-05-18 2022-11-22 Samsung Electronics Co., Ltd. Method and apparatus with moving object detection
US11474690B2 (en) * 2020-08-14 2022-10-18 VTouch Co., Ltd. Method, system and non-transitory computer-readable recording medium for non-contact control
US11450103B2 (en) * 2020-10-05 2022-09-20 Crazing Lab, Inc. Vision based light detection and ranging system using dynamic vision sensor

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8432377B2 (en) * 2007-08-30 2013-04-30 Next Holdings Limited Optical touchscreen with improved illumination
US20130271397A1 (en) * 2012-04-16 2013-10-17 Qualcomm Incorporated Rapid gesture re-engagement
US20140002365A1 (en) * 2012-06-28 2014-01-02 Intermec Ip Corp. Dual screen display for mobile computing device
US20150193669A1 (en) * 2011-11-21 2015-07-09 Pixart Imaging Inc. System and method based on hybrid biometric detection

Family Cites Families (453)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2665041A (en) 1952-01-09 1954-01-05 Daniel J Maffucci Combination form for washing woolen socks
US4175862A (en) 1975-08-27 1979-11-27 Solid Photography Inc. Arrangement for sensing the geometric characteristics of an object
US4894551A (en) * 1987-06-05 1990-01-16 Anima Corporation Sectional form measuring apparatus
US4879659A (en) 1987-11-24 1989-11-07 Bowlin William P Log processing systems
US4876455A (en) * 1988-02-25 1989-10-24 Westinghouse Electric Corp. Fiber optic solder joint inspection system
US4893223A (en) * 1989-01-10 1990-01-09 Northern Telecom Limited Illumination devices for inspection systems
DE8915535U1 (en) * 1989-03-02 1990-10-25 Fa. Carl Zeiss, 7920 Heidenheim, De
JPH076782B2 (en) 1989-03-10 1995-01-30 工業技術院長 Object shape measuring method and apparatus
US5134661A (en) 1991-03-04 1992-07-28 Reinsch Roger A Method of capture and analysis of digitized image data
US5282067A (en) 1991-10-07 1994-01-25 California Institute Of Technology Self-amplified optical pattern recognition system
DE4201934A1 (en) 1992-01-24 1993-07-29 Siemens Ag Interactive computer system e.g. mouse with hand gesture controlled operation - has 2 or 3 dimensional user surface that allows one or two hand input control of computer
US6184326B1 (en) 1992-03-20 2001-02-06 Fina Technology, Inc. Syndiotactic polypropylene
US7983817B2 (en) 1995-06-07 2011-07-19 Automotive Technologies International, Inc. Method and arrangement for obtaining information about vehicle occupants
JP3244798B2 (en) 1992-09-08 2002-01-07 株式会社東芝 Moving image processing device
WO1994017636A1 (en) 1993-01-29 1994-08-04 Bell Communications Research, Inc. Automatic tracking camera control system
WO1994026057A1 (en) 1993-04-29 1994-11-10 Scientific Generics Limited Background separation for still and moving images
US5454043A (en) 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
JPH0795561A (en) 1993-09-21 1995-04-07 Sony Corp Displayed object explanation system
US5659475A (en) 1994-03-17 1997-08-19 Brown; Daniel M. Electronic air traffic control system for use in airport towers
US5594469A (en) 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US5900863A (en) 1995-03-16 1999-05-04 Kabushiki Kaisha Toshiba Method and apparatus for controlling computer without touching input device
JP3737537B2 (en) 1995-03-22 2006-01-18 帝人ファイバー株式会社 Deterioration detection method for illumination means for image processing
IL114838A0 (en) 1995-08-04 1996-11-14 Spiegel Ehud Apparatus and method for object tracking
US5574511A (en) 1995-10-18 1996-11-12 Polaroid Corporation Background replacement for an image
US5742263A (en) 1995-12-18 1998-04-21 Telxon Corporation Head tracking system for a head mounted display system
JPH09259278A (en) 1996-03-25 1997-10-03 Matsushita Electric Ind Co Ltd Image processor
JP3662376B2 (en) * 1996-05-10 2005-06-22 浜松ホトニクス株式会社 Internal characteristic distribution measuring method and apparatus
US6002808A (en) 1996-07-26 1999-12-14 Mitsubishi Electric Information Technology Center America, Inc. Hand gesture control system
US6184926B1 (en) 1996-11-26 2001-02-06 Ncr Corporation System and method for detecting a human face in uncontrolled environments
JP3438855B2 (en) 1997-01-23 2003-08-18 横河電機株式会社 Confocal device
US6492986B1 (en) 1997-06-02 2002-12-10 The Trustees Of The University Of Pennsylvania Method for human face shape and motion estimation based on integrating optical flow and deformable models
US6075895A (en) 1997-06-20 2000-06-13 Holoplex Methods and apparatus for gesture recognition based on templates
US6252598B1 (en) 1997-07-03 2001-06-26 Lucent Technologies Inc. Video hand image computer interface
US6263091B1 (en) 1997-08-22 2001-07-17 International Business Machines Corporation System and method for identifying foreground and background portions of digitized images
US6720949B1 (en) 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
US6072494A (en) 1997-10-15 2000-06-06 Electric Planet, Inc. Method and apparatus for real-time gesture recognition
US6195104B1 (en) 1997-12-23 2001-02-27 Philips Electronics North America Corp. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
US6181343B1 (en) 1997-12-23 2001-01-30 Philips Electronics North America Corp. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
US6031161A (en) 1998-02-04 2000-02-29 Dekalb Genetics Corporation Inbred corn plant GM9215 and seeds thereof
US6154558A (en) 1998-04-22 2000-11-28 Hsieh; Kuan-Hong Intention identification method
JP2000023038A (en) 1998-06-30 2000-01-21 Toshiba Corp Image extractor
US6493041B1 (en) 1998-06-30 2002-12-10 Sun Microsystems, Inc. Method and apparatus for the detection of motion in video
US7036094B1 (en) 1998-08-10 2006-04-25 Cybernet Systems Corporation Behavior recognition system
US6950534B2 (en) 1998-08-10 2005-09-27 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
JP4016526B2 (en) 1998-09-08 2007-12-05 富士ゼロックス株式会社 3D object identification device
US6501515B1 (en) 1998-10-13 2002-12-31 Sony Corporation Remote control system
US6594632B1 (en) 1998-11-02 2003-07-15 Ncr Corporation Methods and apparatus for hands-free operation of a voice recognition system
US7483049B2 (en) 1998-11-20 2009-01-27 Aman James A Optimizations for live event, real-time, 3D object tracking
WO2000034919A1 (en) 1998-12-04 2000-06-15 Interval Research Corporation Background estimation and segmentation based on range and color
US6204852B1 (en) 1998-12-09 2001-03-20 Lucent Technologies Inc. Video hand image three-dimensional computer interface
US6147678A (en) 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom
US6578203B1 (en) 1999-03-08 2003-06-10 Tazwell L. Anderson, Jr. Audio/video signal distribution system for head mounted displays
WO2000070558A1 (en) 1999-05-18 2000-11-23 Sanyo Electric Co., Ltd. Dynamic image processing method and device and medium
US6804656B1 (en) 1999-06-23 2004-10-12 Visicu, Inc. System and method for providing continuous, expert network critical care services from a remote location(s)
US6346933B1 (en) 1999-09-21 2002-02-12 Seiko Epson Corporation Interactive display presentation system
US6734911B1 (en) 1999-09-30 2004-05-11 Koninklijke Philips Electronics N.V. Tracking camera using a lens that generates both wide-angle and narrow-angle views
JP4332964B2 (en) 1999-12-21 2009-09-16 ソニー株式会社 Information input / output system and information input / output method
US6738424B1 (en) 1999-12-27 2004-05-18 Objectvideo, Inc. Scene model generation from video for use in video processing
US6771294B1 (en) 1999-12-29 2004-08-03 Petri Pulli User interface
GB2358098A (en) 2000-01-06 2001-07-11 Sharp Kk Method of segmenting a pixelled image
US6674877B1 (en) 2000-02-03 2004-01-06 Microsoft Corporation System and method for visually tracking occluded objects in real time
US6965113B2 (en) 2000-02-10 2005-11-15 Evotec Ag Fluorescence intensity multiple distributions analysis: concurrent determination of diffusion times and molecular brightness
US6798628B1 (en) 2000-11-17 2004-09-28 Pass & Seymour, Inc. Arc fault circuit detector having two arc fault detection levels
US6463402B1 (en) 2000-03-06 2002-10-08 Ralph W. Bennett Infeed log scanning for lumber optimization
CA2406124A1 (en) 2000-04-21 2001-11-22 Lawrence E. Albertelli Wide-field extended-depth doubly telecentric catadioptric optical system for digital imaging
US6417970B1 (en) 2000-06-08 2002-07-09 Interactive Imaging Systems Two stage optical system for head mounted display
JP4040825B2 (en) * 2000-06-12 2008-01-30 富士フイルム株式会社 Image capturing apparatus and distance measuring method
JP5042437B2 (en) 2000-07-05 2012-10-03 スマート テクノロジーズ ユーエルシー Camera-based touch system
US7227526B2 (en) 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
US6850872B1 (en) 2000-08-30 2005-02-01 Microsoft Corporation Facial image processing methods and systems
US6901170B1 (en) 2000-09-05 2005-05-31 Fuji Xerox Co., Ltd. Image processing device and recording medium
US20020105484A1 (en) 2000-09-25 2002-08-08 Nassir Navab System and method for calibrating a monocular optical see-through head-mounted display system for augmented reality
JP4483067B2 (en) 2000-10-24 2010-06-16 沖電気工業株式会社 Target object extraction image processing device
US8042740B2 (en) 2000-11-24 2011-10-25 Metrologic Instruments, Inc. Method of reading bar code symbols on objects at a point-of-sale station by passing said objects through a complex of stationary coplanar illumination and imaging planes projected into a 3D imaging volume
US6774869B2 (en) 2000-12-22 2004-08-10 Board Of Trustees Operating Michigan State University Teleportal face-to-face system
WO2002071333A2 (en) 2001-03-08 2002-09-12 Université Joseph Fourier Quantitative analysis, visualization and movement correction in dynamic processes
US7542586B2 (en) 2001-03-13 2009-06-02 Johnson Raymond C Touchless identification system for monitoring hand washing or application of a disinfectant
DE50207722D1 (en) 2001-03-20 2006-09-14 Thomson Licensing Element for combined symmetrization and homogenization of a radiation bundle
US6814656B2 (en) 2001-03-20 2004-11-09 Luis J. Rodriguez Surface treatment disks for rotary tools
US7009773B2 (en) 2001-05-23 2006-03-07 Research Foundation Of The University Of Central Florida, Inc. Compact microlenslet arrays imager
US6919880B2 (en) 2001-06-01 2005-07-19 Smart Technologies Inc. Calibrating camera offsets to facilitate object position determination using triangulation
US8035612B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
US20030053659A1 (en) 2001-06-29 2003-03-20 Honeywell International Inc. Moving object assessment system and method
US20030053658A1 (en) 2001-06-29 2003-03-20 Honeywell International Inc. Surveillance system and methods regarding same
US20030123703A1 (en) 2001-06-29 2003-07-03 Honeywell International Inc. Method for monitoring a moving object and system regarding same
US20040125228A1 (en) 2001-07-25 2004-07-01 Robert Dougherty Apparatus and method for determining the range of remote objects
US6999126B2 (en) 2001-09-17 2006-02-14 Mazzapica C Douglas Method of eliminating hot spot in digital photograph
US9052777B2 (en) 2001-11-02 2015-06-09 Neonode Inc. Optical elements with alternating reflective lens facets
US7213707B2 (en) 2001-12-11 2007-05-08 Walgreen Co. Product shipping and display carton
US6804654B2 (en) 2002-02-11 2004-10-12 Telemanager Technologies, Inc. System and method for providing prescription services using voice recognition
US7215828B2 (en) 2002-02-13 2007-05-08 Eastman Kodak Company Method and system for determining image orientation
WO2003071410A2 (en) 2002-02-15 2003-08-28 Canesta, Inc. Gesture recognition system using depth perceptive sensors
JP2003256814A (en) 2002-02-27 2003-09-12 Olympus Optical Co Ltd Substrate checking device
AU2003213780B2 (en) 2002-03-08 2008-03-13 Quantum Interface, Llc Electric device control apparatus
DE10213643A1 (en) 2002-03-27 2003-10-09 Geka Brush GmbH Cosmetics unit
US7120297B2 (en) 2002-04-25 2006-10-10 Microsoft Corporation Segmented layered image system
US20030210262A1 (en) 2002-05-10 2003-11-13 Tripath Imaging, Inc. Video microscopy system and multi-view virtual slide viewer capable of simultaneously acquiring and displaying various digital views of an area of interest located on a microscopic slide
US7646372B2 (en) 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US8570378B2 (en) 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US7760248B2 (en) 2002-07-27 2010-07-20 Sony Computer Entertainment Inc. Selective sound source listening in conjunction with computer interactive processing
US7046924B2 (en) 2002-11-25 2006-05-16 Eastman Kodak Company Method and computer program product for determining an area of importance in an image using eye monitoring information
US7400344B2 (en) 2002-12-19 2008-07-15 Hitachi Kokusai Electric Inc. Object tracking method and object tracking apparatus
GB2398469B (en) 2003-02-12 2005-10-26 Canon Europa Nv Image processing apparatus
JP2004246252A (en) 2003-02-17 2004-09-02 Takenaka Komuten Co Ltd Apparatus and method for collecting image information
US7257237B1 (en) 2003-03-07 2007-08-14 Sandia Corporation Real time markerless motion tracking using linked kinematic chains
US7532206B2 (en) 2003-03-11 2009-05-12 Smart Technologies Ulc System and method for differentiating between pointers used to contact touch surface
US7738725B2 (en) 2003-03-19 2010-06-15 Mitsubishi Electric Research Laboratories, Inc. Stylized rendering using a multi-flash camera
US7665041B2 (en) 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
DE602004006190T8 (en) 2003-03-31 2008-04-10 Honda Motor Co., Ltd. Device, method and program for gesture recognition
JP2007501942A (en) 2003-05-19 2007-02-01 マイクロ−エプシロン・メステヒニク・ゲーエムベーハー・ウント・コンパニー・カー・ゲー Optical test method and optical test apparatus for optically controlling the quality of an object preferably having a circular edge
US8072470B2 (en) 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
DE10326035B4 (en) 2003-06-10 2005-12-22 Hema Electronic Gmbh Method for adaptive error detection on a structured surface
DE602004032540D1 (en) 2003-06-17 2011-06-16 Univ Brown METHOD AND DEVICE FOR MODEL-BASED DETECTION OF A STRUCTURE IN PROJECTION DATA
US7606417B2 (en) 2004-08-16 2009-10-20 Fotonation Vision Limited Foreground/background segmentation in digital images with differential exposure calculations
US7244233B2 (en) 2003-07-29 2007-07-17 Ntd Laboratories, Inc. System and method for utilizing shape analysis to assess fetal abnormality
US7633633B2 (en) 2003-08-29 2009-12-15 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Position determination that is responsive to a retro-reflective object
WO2005041579A2 (en) 2003-10-24 2005-05-06 Reactrix Systems, Inc. Method and system for processing captured image information in an interactive video display system
GB2407635B (en) 2003-10-31 2006-07-12 Hewlett Packard Development Co Improvements in and relating to camera control
JP4162095B2 (en) 2003-12-11 2008-10-08 ストライダー ラブス,インコーポレイテッド A technique for predicting the surface of a shielded part by calculating symmetry.
US7217913B2 (en) * 2003-12-18 2007-05-15 Micron Technology, Inc. Method and system for wavelength-dependent imaging and detection using a hybrid filter
US7663689B2 (en) 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
US7184022B2 (en) 2004-01-16 2007-02-27 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Position determination and motion tracking
US7707039B2 (en) 2004-02-15 2010-04-27 Exbiblio B.V. Automatic modification of web pages
US8872914B2 (en) 2004-02-04 2014-10-28 Acushnet Company One camera stereo system
US7812860B2 (en) 2004-04-01 2010-10-12 Exbiblio B.V. Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
CN100450147C (en) 2004-02-18 2009-01-07 松下电器产业株式会社 Method and device of image correction
JP4419603B2 (en) 2004-02-25 2010-02-24 日本電気株式会社 Driving method of liquid crystal display device
DE102004015785B4 (en) 2004-03-25 2012-06-06 Sikora Ag Method for determining the dimension of a cross-section of a flat cable or a sector conductor
WO2005104010A2 (en) 2004-04-15 2005-11-03 Gesture Tek, Inc. Tracking bimanual movements
JP4751032B2 (en) 2004-04-22 2011-08-17 株式会社森精機製作所 Displacement detector
US7308112B2 (en) 2004-05-14 2007-12-11 Honda Motor Co., Ltd. Sign based human-machine interaction
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
JP2008505383A (en) 2004-06-29 2008-02-21 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Personal gesture signature
JP4916096B2 (en) 2004-07-01 2012-04-11 イビデン株式会社 Optical communication device
US20060017720A1 (en) 2004-07-15 2006-01-26 Li You F System and method for 3D measurement and surface reconstruction
US7576767B2 (en) 2004-07-26 2009-08-18 Geo Semiconductors Inc. Panoramic vision system and method
US7542040B2 (en) 2004-08-11 2009-06-02 The United States Of America As Represented By The Secretary Of The Navy Simulated locomotion method and apparatus
EP1645944B1 (en) 2004-10-05 2012-08-15 Sony France S.A. A content-management interface
US7706571B2 (en) 2004-10-13 2010-04-27 Sarnoff Corporation Flexible layer tracking with weak online appearance model
GB2419433A (en) 2004-10-20 2006-04-26 Glasgow School Of Art Automated Gesture Recognition
WO2006057768A2 (en) 2004-11-24 2006-06-01 Battelle Memorial Institute Optical system for cell imaging
CN101198964A (en) 2005-01-07 2008-06-11 格斯图尔泰克股份有限公司 Creating 3D images of objects by illuminating with infrared patterns
US7598942B2 (en) 2005-02-08 2009-10-06 Oblong Industries, Inc. System and method for gesture based control system
CN101208723A (en) 2005-02-23 2008-06-25 克雷格·萨默斯 Automatic scene modeling for the 3D camera and 3D video
US7715589B2 (en) 2005-03-07 2010-05-11 Massachusetts Institute Of Technology Occluding contour detection and storage for digital photography
JP4678487B2 (en) 2005-03-15 2011-04-27 オムロン株式会社 Image processing system, image processing apparatus and method, recording medium, and program
US8185176B2 (en) 2005-04-26 2012-05-22 Novadaq Technologies, Inc. Method and apparatus for vasculature visualization with applications in neurosurgery and neurology
US20060239921A1 (en) 2005-04-26 2006-10-26 Novadaq Technologies Inc. Real time vascular imaging during solid organ transplant
US20090309710A1 (en) 2005-04-28 2009-12-17 Aisin Seiki Kabushiki Kaisha Vehicle Vicinity Monitoring System
JP2006323212A (en) 2005-05-19 2006-11-30 Konica Minolta Photo Imaging Inc Lens unit and imaging apparatus having the same
US7613363B2 (en) 2005-06-23 2009-11-03 Microsoft Corp. Image superresolution through edge extraction and contrast enhancement
KR101334081B1 (en) 2005-07-08 2013-11-29 일렉트로 싸이언티픽 인더스트리이즈 인코포레이티드 Achieving convergent light rays emitted by planar array of light sources
US20070018966A1 (en) 2005-07-25 2007-01-25 Blythe Michael M Predicted object location
CA2622744C (en) 2005-09-16 2014-09-16 Flixor, Inc. Personalizing a video
US8057408B2 (en) 2005-09-22 2011-11-15 The Regents Of The University Of Michigan Pulsed cavitational ultrasound therapy
DE102005047160B4 (en) 2005-09-30 2007-06-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus, methods and computer program for determining information about a shape and / or a position of an ellipse in a graphic image
US7643158B2 (en) * 2005-10-04 2010-01-05 Motion Analysis Corporation Method for synchronizing the operation of multiple devices for generating three dimensional surface models of moving objects
US8050461B2 (en) 2005-10-11 2011-11-01 Primesense Ltd. Depth-varying light fields for three dimensional sensing
US9046962B2 (en) 2005-10-31 2015-06-02 Extreme Reality Ltd. Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region
US7570732B2 (en) 2005-11-09 2009-08-04 Dexela Limited Methods and apparatus for obtaining low-dose imaging
US20070130547A1 (en) 2005-12-01 2007-06-07 Navisense, Llc Method and system for touchless user interface control
CN100502463C (en) 2005-12-14 2009-06-17 浙江工业大学 Method for collecting characteristics in telecommunication flow information video detection
DE102005061557B3 (en) 2005-12-22 2007-11-22 Siemens Ag Imaging apparatus and method for operating an imaging device
US7466790B2 (en) 2006-03-02 2008-12-16 General Electric Company Systems and methods for improving a resolution of an image
US7834780B2 (en) 2006-03-20 2010-11-16 Tektronix, Inc. Waveform compression and display
JP4797752B2 (en) 2006-03-31 2011-10-19 株式会社デンソー Operating object extraction device for moving objects
JP2007271876A (en) 2006-03-31 2007-10-18 Denso Corp Speech recognizer and program for speech recognition
EP2030171A1 (en) 2006-04-10 2009-03-04 Avaworks Incorporated Do-it-yourself photo realistic talking head creation system and method
US20080018598A1 (en) 2006-05-16 2008-01-24 Marsden Randal J Hands-free computer access for medical and dentistry applications
US7676169B2 (en) 2006-05-22 2010-03-09 Lexmark International, Inc. Multipath toner patch sensor for use in an image forming device
US8086971B2 (en) 2006-06-28 2011-12-27 Nokia Corporation Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
EP1879149B1 (en) 2006-07-10 2016-03-16 Fondazione Bruno Kessler Method and apparatus for tracking a number of objects or object parts in image sequences
US8180114B2 (en) 2006-07-13 2012-05-15 Northrop Grumman Systems Corporation Gesture recognition interface system with vertical display
US8589824B2 (en) * 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
US9696808B2 (en) 2006-07-13 2017-07-04 Northrop Grumman Systems Corporation Hand-gesture recognition method
US20080030429A1 (en) 2006-08-07 2008-02-07 International Business Machines Corporation System and method of enhanced virtual reality
US20080064954A1 (en) 2006-08-24 2008-03-13 Baylor College Of Medicine Method of measuring propulsion in lymphatic structures
KR100783552B1 (en) 2006-10-11 2007-12-07 삼성전자주식회사 Input control method and device for mobile phone
US20110025818A1 (en) 2006-11-07 2011-02-03 Jonathan Gallmeier System and Method for Controlling Presentations and Videoconferences Using Hand Motions
US8102465B2 (en) 2006-11-07 2012-01-24 Fujifilm Corporation Photographing apparatus and photographing method for photographing an image by controlling light irradiation on a subject
EP2081494B1 (en) 2006-11-16 2018-07-11 Vanderbilt University System and method of compensating for organ deformation
US7605686B2 (en) 2006-11-16 2009-10-20 Motorola, Inc. Alerting system for a communication device
EP2100454B1 (en) 2006-11-20 2019-10-30 Axis AB Wireless network camera systems
JP5073273B2 (en) * 2006-11-21 2012-11-14 スタンレー電気株式会社 Perspective determination method and apparatus
SE0602545L (en) 2006-11-29 2008-05-30 Tobii Technology Ab Eye tracking illumination
US7840031B2 (en) 2007-01-12 2010-11-23 International Business Machines Corporation Tracking a range of body movement based on 3D captured image streams of a user
US7971156B2 (en) 2007-01-12 2011-06-28 International Business Machines Corporation Controlling resource access based on user gesturing in a 3D captured image stream of the user
WO2008087652A2 (en) * 2007-01-21 2008-07-24 Prime Sense Ltd. Depth mapping using multi-beam illumination
KR20080073933A (en) 2007-02-07 2008-08-12 삼성전자주식회사 Object tracking method and apparatus, and object pose information calculating method and apparatus
CA2717485A1 (en) 2007-03-02 2008-09-12 Organic Motion System and method for tracking three dimensional objects
JP2008227569A (en) 2007-03-08 2008-09-25 Seiko Epson Corp Photographing device, electronic device, photography control method and photography control program
JP4605170B2 (en) 2007-03-23 2011-01-05 株式会社デンソー Operation input device
JP2008250774A (en) 2007-03-30 2008-10-16 Denso Corp Information equipment operation device
TWI433052B (en) 2007-04-02 2014-04-01 Primesense Ltd Depth mapping using projected patterns
US20100201880A1 (en) 2007-04-13 2010-08-12 Pioneer Corporation Shot size identifying apparatus and method, electronic apparatus, and computer program
JP4854582B2 (en) 2007-04-25 2012-01-18 キヤノン株式会社 Image processing apparatus and image processing method
US20080291160A1 (en) 2007-05-09 2008-11-27 Nintendo Co., Ltd. System and method for recognizing multi-axis gestures based on handheld controller accelerometer outputs
US20080278589A1 (en) 2007-05-11 2008-11-13 Karl Ola Thorn Methods for identifying a target subject to automatically focus a digital camera and related systems, and computer program products
US8229134B2 (en) 2007-05-24 2012-07-24 University Of Maryland Audio camera using microphone arrays for real time capture of audio images and method for jointly processing the audio images with video images
US7940985B2 (en) 2007-06-06 2011-05-10 Microsoft Corporation Salient object detection
US20090002489A1 (en) 2007-06-29 2009-01-01 Fuji Xerox Co., Ltd. Efficient tracking multiple objects through occlusion
JP2009031939A (en) 2007-07-25 2009-02-12 Advanced Telecommunication Research Institute International Image processing apparatus, method and program
JP4929109B2 (en) 2007-09-25 2012-05-09 株式会社東芝 Gesture recognition apparatus and method
US8144233B2 (en) 2007-10-03 2012-03-27 Sony Corporation Display control device, display control method, and display control program for superimposing images to create a composite image
US20090093307A1 (en) 2007-10-08 2009-04-09 Sony Computer Entertainment America Inc. Enhanced game controller
US8139110B2 (en) 2007-11-01 2012-03-20 Northrop Grumman Systems Corporation Calibration of a gesture recognition interface system
US8288968B2 (en) 2007-11-08 2012-10-16 Lite-On It Corporation Lighting system arranged with multiple light units where each of adjacent light units having light beams overlap each other
US8777875B2 (en) 2008-07-23 2014-07-15 Otismed Corporation System and method for manufacturing arthroplasty jigs having improved mating accuracy
WO2009085233A2 (en) 2007-12-21 2009-07-09 21Ct, Inc. System and method for visually tracking with occlusions
US20120204133A1 (en) 2009-01-13 2012-08-09 Primesense Ltd. Gesture-Based User Interface
US8319832B2 (en) 2008-01-31 2012-11-27 Denso Corporation Input apparatus and imaging apparatus
US8270669B2 (en) 2008-02-06 2012-09-18 Denso Corporation Apparatus for extracting operating object and apparatus for projecting operating hand
US8555207B2 (en) 2008-02-27 2013-10-08 Qualcomm Incorporated Enhanced input using recognized gestures
CA2714534C (en) 2008-02-28 2018-03-20 Kenneth Perlin Method and apparatus for providing input to a processor, and a sensor pad
DE102008000479A1 (en) 2008-03-03 2009-09-10 Amad - Mennekes Holding Gmbh & Co. Kg Plug-in device with strain relief
US8073203B2 (en) 2008-04-15 2011-12-06 Cyberlink Corp. Generating effects in a webcam application
US8970374B2 (en) * 2008-04-17 2015-03-03 Shilat Optronics Ltd Intrusion warning system
EP2283375B1 (en) 2008-04-18 2014-11-12 Eidgenössische Technische Hochschule (ETH) Travelling-wave nuclear magnetic resonance method
US8249345B2 (en) 2008-06-27 2012-08-21 Mako Surgical Corp. Automatic image segmentation using contour propagation
WO2010007662A1 (en) 2008-07-15 2010-01-21 イチカワ株式会社 Heat-resistant cushion material for forming press
US8131063B2 (en) 2008-07-16 2012-03-06 Seiko Epson Corporation Model-based object image processing
JP2010033367A (en) 2008-07-29 2010-02-12 Canon Inc Information processor and information processing method
US20100027845A1 (en) 2008-07-31 2010-02-04 Samsung Electronics Co., Ltd. System and method for motion detection based on object trajectory
DE102008040949B4 (en) 2008-08-01 2012-03-08 Sirona Dental Systems Gmbh Optical projection grating, measuring camera with optical projection grating and method for producing an optical projection grating
US8520979B2 (en) 2008-08-19 2013-08-27 Digimarc Corporation Methods and systems for content processing
TW201009650A (en) 2008-08-28 2010-03-01 Acer Inc Gesture guide system and method for controlling computer system by gesture
US20100053209A1 (en) 2008-08-29 2010-03-04 Siemens Medical Solutions Usa, Inc. System for Processing Medical Image data to Provide Vascular Function Information
US20100053151A1 (en) 2008-09-02 2010-03-04 Samsung Electronics Co., Ltd In-line mediation for manipulating three-dimensional content on a display device
DE102008045387B4 (en) * 2008-09-02 2017-02-09 Carl Zeiss Ag Apparatus and method for measuring a surface
TWI425203B (en) 2008-09-03 2014-02-01 Univ Nat Central Apparatus for scanning hyper-spectral image and method thereof
JP4613994B2 (en) 2008-09-16 2011-01-19 ソニー株式会社 Dynamic estimation device, dynamic estimation method, program
WO2010032268A2 (en) 2008-09-19 2010-03-25 Avinash Saxena System and method for controlling graphical objects
US9030564B2 (en) 2008-10-10 2015-05-12 Qualcomm Incorporated Single camera tracker
CN101729808B (en) 2008-10-14 2012-03-28 Tcl集团股份有限公司 Remote control method for television and system for remotely controlling television by same
WO2010045406A2 (en) 2008-10-15 2010-04-22 The Regents Of The University Of California Camera system with autonomous miniature camera and light source assembly and method for image enhancement
US8744122B2 (en) 2008-10-22 2014-06-03 Sri International System and method for object detection from a moving platform
CN201332447Y (en) 2008-10-22 2009-10-21 康佳集团股份有限公司 Television for controlling or operating game through gesture change
DE102008052928A1 (en) 2008-10-23 2010-05-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device, method and computer program for detecting a gesture in an image, and device, method and computer program for controlling a device
US20100121189A1 (en) 2008-11-12 2010-05-13 Sonosite, Inc. Systems and methods for image presentation for medical examination and interventional procedures
TW201020896A (en) 2008-11-19 2010-06-01 Nat Applied Res Laboratories Method of gesture control
US8502787B2 (en) 2008-11-26 2013-08-06 Panasonic Corporation System and method for differentiating between intended and unintended user input on a touchpad
EP2193825B1 (en) 2008-12-03 2017-03-22 Alcatel Lucent Mobile device for augmented reality applications
KR101215987B1 (en) 2008-12-22 2012-12-28 한국전자통신연구원 Apparatus for separating foreground from background and method thereof
US8289162B2 (en) 2008-12-22 2012-10-16 Wimm Labs, Inc. Gesture-based user interface for a wearable portable device
US8379987B2 (en) 2008-12-30 2013-02-19 Nokia Corporation Method, apparatus and computer program product for providing hand segmentation for gesture analysis
US8290208B2 (en) 2009-01-12 2012-10-16 Eastman Kodak Company Enhanced safety during laser projection
JP2012515966A (en) 2009-01-26 2012-07-12 ズッロ・テクノロジーズ・(2009)・リミテッド Device and method for monitoring the behavior of an object
US7996793B2 (en) 2009-01-30 2011-08-09 Microsoft Corporation Gesture recognizer system architecture
JP4771183B2 (en) 2009-01-30 2011-09-14 株式会社デンソー Operating device
US8624962B2 (en) 2009-02-02 2014-01-07 Ydreams-Informatica, S.A. Systems and methods for simulating three-dimensional virtual interactions from two-dimensional camera images
EP2391972B1 (en) 2009-02-02 2015-05-27 Eyesight Mobile Technologies Ltd. System and method for object recognition and tracking in a video stream
US9569001B2 (en) 2009-02-03 2017-02-14 Massachusetts Institute Of Technology Wearable gestural interface
KR20110116201A (en) 2009-02-05 2011-10-25 디지맥 코포레이션 Television-based advertising and distribution of tv widgets for the cell phone
KR100992411B1 (en) 2009-02-06 2010-11-05 (주)실리콘화일 Image sensor capable of judging proximity of a subject
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
GB2467932A (en) 2009-02-19 2010-08-25 Sony Corp Image processing device and method
US20100216508A1 (en) 2009-02-23 2010-08-26 Augusta Technology, Inc. Systems and Methods for Driving an External Display Device Using a Mobile Phone Device
JP2010204730A (en) 2009-02-27 2010-09-16 Seiko Epson Corp System of controlling device in response to gesture
US20100245289A1 (en) * 2009-03-31 2010-09-30 Miroslav Svajda Apparatus and method for optical proximity sensing and touch input control
JP4840620B2 (en) 2009-04-30 2011-12-21 株式会社デンソー In-vehicle electronic device operation device
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
JP5522757B2 (en) 2009-05-12 2014-06-18 コーニンクレッカ フィリップス エヌ ヴェ Camera, system having camera, method of operating camera, and method of deconvolving recorded image
WO2010134512A1 (en) 2009-05-20 2010-11-25 株式会社 日立メディコ Medical image diagnosis device and region-of-interest setting method therefor
US8269175B2 (en) 2009-05-22 2012-09-18 Motorola Mobility Llc Electronic device with sensing assembly and method for detecting gestures of geometric shapes
TWI395483B (en) 2009-05-25 2013-05-01 Visionatics Inc Motion object detection method using adaptive background model and computer program product thereof
US8112719B2 (en) 2009-05-26 2012-02-07 Topseed Technology Corp. Method for controlling gesture-based remote control system
US20100302357A1 (en) 2009-05-26 2010-12-02 Che-Hao Hsu Gesture-based remote control system
JP2011010258A (en) 2009-05-27 2011-01-13 Seiko Epson Corp Image processing apparatus, image display system, and image extraction device
EP2436001B1 (en) * 2009-05-27 2018-02-14 Analog Devices, Inc. Position measurement systems using position sensitive detectors
US8009022B2 (en) 2009-05-29 2011-08-30 Microsoft Corporation Systems and methods for immersive interaction with virtual objects
US8693724B2 (en) 2009-05-29 2014-04-08 Microsoft Corporation Method and system implementing user-centric gesture control
US8418085B2 (en) 2009-05-29 2013-04-09 Microsoft Corporation Gesture coach
US20100309097A1 (en) 2009-06-04 2010-12-09 Roni Raviv Head mounted 3d display
KR20100136649A (en) 2009-06-19 2010-12-29 삼성전자주식회사 Method for embodying user interface using a proximity sensor in portable terminal and apparatus thereof
CN101930610B (en) 2009-06-26 2012-05-02 思创影像科技股份有限公司 Method for detecting moving object by using adaptable background model
US20110007072A1 (en) 2009-07-09 2011-01-13 University Of Central Florida Research Foundation, Inc. Systems and methods for three-dimensionally modeling moving objects
US9131142B2 (en) 2009-07-17 2015-09-08 Nikon Corporation Focusing device and camera
JP5771913B2 (en) 2009-07-17 2015-09-02 株式会社ニコン Focus adjustment device and camera
KR20110010906A (en) 2009-07-27 2011-02-08 삼성전자주식회사 Apparatus and method for controlling an electronic machine using user interaction
US8428368B2 (en) 2009-07-31 2013-04-23 Echostar Technologies L.L.C. Systems and methods for hand gesture control of an electronic device
WO2011024193A2 (en) 2009-08-20 2011-03-03 Natarajan Kannan Electronically variable field of view (fov) infrared illuminator
JP5614014B2 (en) 2009-09-04 2014-10-29 ソニー株式会社 Information processing apparatus, display control method, and display control program
US8341558B2 (en) 2009-09-16 2012-12-25 Google Inc. Gesture recognition on computing device correlating input to a template
US8681124B2 (en) 2009-09-22 2014-03-25 Microsoft Corporation Method and system for recognition of user gesture interaction with passive surface video displays
US9507411B2 (en) 2009-09-22 2016-11-29 Facebook, Inc. Hand tracker for device with display
JP2011081453A (en) 2009-10-02 2011-04-21 Toshiba Corp Apparatus and method for reproducing video
US8547327B2 (en) 2009-10-07 2013-10-01 Qualcomm Incorporated Proximity object tracker
CA2777566C (en) 2009-10-13 2014-12-16 Recon Instruments Inc. Control systems and methods for head-mounted information systems
GB2474536B (en) 2009-10-13 2011-11-02 Pointgrab Ltd Computer vision gesture based control of a device
US9400548B2 (en) 2009-10-19 2016-07-26 Microsoft Technology Licensing, Llc Gesture personalization and profile roaming
KR101613555B1 (en) 2009-10-26 2016-04-19 엘지전자 주식회사 Mobile terminal
US20110107216A1 (en) 2009-11-03 2011-05-05 Qualcomm Incorporated Gesture-based user interface
US8843857B2 (en) 2009-11-19 2014-09-23 Microsoft Corporation Distance scalable no touch computing
KR101092909B1 (en) 2009-11-27 2011-12-12 (주)디스트릭트홀딩스 Gesture Interactive Hologram Display Apparatus and Method
KR101688655B1 (en) 2009-12-03 2016-12-21 엘지전자 주식회사 Controlling power of devices which is controllable with user's gesture by detecting presence of user
WO2011069157A2 (en) 2009-12-04 2011-06-09 Next Holdings Limited Methods and systems for position detection
KR101373285B1 (en) 2009-12-08 2014-03-11 한국전자통신연구원 A mobile terminal having a gesture recognition function and an interface system using the same
GB0921461D0 (en) 2009-12-08 2010-01-20 Qinetiq Ltd Range based sensing
EP2337323A1 (en) 2009-12-17 2011-06-22 NTT DoCoMo, Inc. Method and an apparatus for performing interaction between a mobile device and a screen
KR101307341B1 (en) 2009-12-18 2013-09-11 한국전자통신연구원 Method and apparatus for motion capture of dynamic object
US8232990B2 (en) 2010-01-05 2012-07-31 Apple Inc. Working with 3D objects
US8866791B2 (en) 2010-01-06 2014-10-21 Apple Inc. Portable electronic device having mode dependent user input controls
US8631355B2 (en) 2010-01-08 2014-01-14 Microsoft Corporation Assigning gesture dictionaries
US9019201B2 (en) 2010-01-08 2015-04-28 Microsoft Technology Licensing, Llc Evolving universal gesture sets
US9268404B2 (en) 2010-01-08 2016-02-23 Microsoft Technology Licensing, Llc Application gesture interpretation
US8933884B2 (en) 2010-01-15 2015-01-13 Microsoft Corporation Tracking groups of users in motion capture system
US9335825B2 (en) 2010-01-26 2016-05-10 Nokia Technologies Oy Gesture control
RU2422878C1 (en) 2010-02-04 2011-06-27 Владимир Валентинович Девятков Method of controlling television using multimodal interface
KR101184460B1 (en) 2010-02-05 2012-09-19 연세대학교 산학협력단 Device and method for controlling a mouse pointer
US8659658B2 (en) 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
US20110213664A1 (en) 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
TW201133358A (en) 2010-03-18 2011-10-01 Hon Hai Prec Ind Co Ltd System and method for detecting objects in a video image
CN102201121A (en) 2010-03-23 2011-09-28 鸿富锦精密工业(深圳)有限公司 System and method for detecting article in video scene
CN102822773A (en) 2010-03-24 2012-12-12 惠普开发有限公司 Gesture mapping for display device
EP2369433A1 (en) 2010-03-24 2011-09-28 ABB Research Ltd. Computer-based method and device for automatically providing control parameters for a plurality of coal mills supplying coal powder to a plant
CA2735325C (en) 2010-03-25 2015-01-20 User Interface In Sweden Ab System and method for gesture detection and feedback
EP2372512A1 (en) 2010-03-30 2011-10-05 Harman Becker Automotive Systems GmbH Vehicle user interface unit for a vehicle electronic device
JP2011210139A (en) 2010-03-30 2011-10-20 Sony Corp Image processing apparatus and method, and program
US20110251896A1 (en) 2010-04-09 2011-10-13 Affine Systems, Inc. Systems and methods for matching an advertisement to a video
CN201859393U (en) 2010-04-13 2011-06-08 任峰 Three-dimensional gesture recognition box
US20130038694A1 (en) 2010-04-27 2013-02-14 Sanjay Nichani Method for moving object detection using an image sensor and structured light
WO2011134083A1 (en) 2010-04-28 2011-11-03 Ryerson University System and methods for intraoperative guidance feedback
CN102236412A (en) 2010-04-30 2011-11-09 宏碁股份有限公司 Three-dimensional gesture recognition system and vision-based gesture recognition method
US9539510B2 (en) 2010-04-30 2017-01-10 Microsoft Technology Licensing, Llc Reshapable connector with variable rigidity
US9110557B2 (en) 2010-05-04 2015-08-18 Timocco Ltd. System and method for tracking and mapping an object to a target
US20110289455A1 (en) 2010-05-18 2011-11-24 Microsoft Corporation Gestures And Gesture Recognition For Manipulating A User-Interface
US8457353B2 (en) 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US8396252B2 (en) 2010-05-20 2013-03-12 Edge 3 Technologies Systems and related methods for three dimensional gesture recognition in vehicles
US20110299737A1 (en) 2010-06-04 2011-12-08 Acer Incorporated Vision-based hand movement recognition system and method thereof
EP2395413B1 (en) 2010-06-09 2018-10-03 The Boeing Company Gesture-based human machine interface
JP2011257337A (en) * 2010-06-11 2011-12-22 Seiko Epson Corp Optical position detection device and display device with position detection function
US8670029B2 (en) 2010-06-16 2014-03-11 Microsoft Corporation Depth camera illuminator with superluminescent light-emitting diode
US20110314427A1 (en) 2010-06-18 2011-12-22 Samsung Electronics Co., Ltd. Personalization using custom gestures
US8416187B2 (en) 2010-06-22 2013-04-09 Microsoft Corporation Item navigation using motion-capture data
DE102010030616A1 (en) 2010-06-28 2011-12-29 Robert Bosch Gmbh Method and device for detecting a disturbing object in a camera image
US20120050007A1 (en) 2010-08-24 2012-03-01 Babak Forutanpour Methods and apparatus for interacting with an electronic device application by moving an object in the air over an electronic device display
US20130241832A1 (en) 2010-09-07 2013-09-19 Zrro Technologies (2009) Ltd. Method and device for controlling the behavior of virtual objects on a display
US8842084B2 (en) 2010-09-08 2014-09-23 Telefonaktiebolaget L M Ericsson (Publ) Gesture-based object manipulation methods and devices
WO2012040086A1 (en) 2010-09-20 2012-03-29 Kopin Corporation Miniature communications gateway for head mounted display
IL208600A (en) 2010-10-10 2016-07-31 Rafael Advanced Defense Systems Ltd Network-based real time registered augmented reality for mobile devices
CN101951474A (en) 2010-10-12 2011-01-19 冠捷显示科技(厦门)有限公司 Television technology based on gesture control
US20120092254A1 (en) * 2010-10-14 2012-04-19 Chee Heng Wong Proximity sensor with motion detection
IL208910A0 (en) 2010-10-24 2011-02-28 Rafael Advanced Defense Sys Tracking and identification of a moving object from a moving sensor using a 3d model
CN102053702A (en) 2010-10-26 2011-05-11 南京航空航天大学 Dynamic gesture control system and method
US8817087B2 (en) 2010-11-01 2014-08-26 Robert Bosch Gmbh Robust video-based handwriting and gesture recognition for in-car applications
JP5521995B2 (en) * 2010-11-18 2014-06-18 セイコーエプソン株式会社 Optical position detection device and device with position detection function
US9244606B2 (en) 2010-12-20 2016-01-26 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
KR101587962B1 (en) 2010-12-22 2016-01-28 한국전자통신연구원 Motion capture apparatus and method
US8929609B2 (en) 2011-01-05 2015-01-06 Qualcomm Incorporated Method and apparatus for scaling gesture recognition to physical dimensions of a user
US8570320B2 (en) 2011-01-31 2013-10-29 Microsoft Corporation Using a three-dimensional environment model in gameplay
SG182880A1 (en) 2011-02-01 2012-08-30 Univ Singapore A method and system for interaction with micro-objects
CN103347437B (en) 2011-02-09 2016-06-08 苹果公司 Gaze detection in 3D mapping environment
US8723789B1 (en) 2011-02-11 2014-05-13 Imimtek, Inc. Two-dimensional method and system enabling three-dimensional user interaction with a device
US10216866B2 (en) 2011-02-25 2019-02-26 Smiths Heimann Gmbh Image reconstruction based on parametric models
US20120223959A1 (en) 2011-03-01 2012-09-06 Apple Inc. System and method for a touchscreen slider with toggle control
US8600107B2 (en) 2011-03-31 2013-12-03 Smart Technologies Ulc Interactive input system and method
US9117147B2 (en) 2011-04-29 2015-08-25 Siemens Aktiengesellschaft Marginal space learning for multi-person tracking over mega pixel imagery
US8457355B2 (en) 2011-05-05 2013-06-04 International Business Machines Corporation Incorporating video meta-data in 3D models
US8686943B1 (en) * 2011-05-13 2014-04-01 Imimtek, Inc. Two-dimensional method and system enabling three-dimensional user interaction with a device
US8810640B2 (en) 2011-05-16 2014-08-19 Ut-Battelle, Llc Intrinsic feature-based pose measurement for imaging motion compensation
US8842163B2 (en) 2011-06-07 2014-09-23 International Business Machines Corporation Estimation of object properties in 3D world
US20120320080A1 (en) 2011-06-14 2012-12-20 Microsoft Corporation Motion based virtual object navigation
US8773512B1 (en) 2011-06-30 2014-07-08 Aquifi, Inc. Portable remote control device enabling three-dimensional user interaction with at least one appliance
US9086794B2 (en) 2011-07-14 2015-07-21 Microsoft Technology Licensing, Llc Determining gestures on context based menus
US8891868B1 (en) 2011-08-04 2014-11-18 Amazon Technologies, Inc. Recognizing gestures captured by video
TW201310389A (en) 2011-08-19 2013-03-01 Vatics Inc Motion object detection method using image contrast enhancement
CN103636191B (en) 2011-08-23 2016-11-02 松下电器产业株式会社 Three-dimensional image pickup device, lens control device
US8830302B2 (en) 2011-08-24 2014-09-09 Lg Electronics Inc. Gesture-based user interface method and apparatus
EP2755115A4 (en) 2011-09-07 2015-05-06 Nitto Denko Corp Method for detecting motion of input body and input device using same
WO2013036621A1 (en) 2011-09-09 2013-03-14 Thales Avionics, Inc. Controlling vehicle entertainment systems responsive to sensed passenger gestures
JP5624530B2 (en) 2011-09-29 2014-11-12 株式会社東芝 Command issuing device, method and program
US20130097566A1 (en) 2011-10-17 2013-04-18 Carl Fredrik Alexander BERGLUND System and method for displaying items on electronic devices
US8235529B1 (en) 2011-11-30 2012-08-07 Google Inc. Unlocking a screen using eye tracking information
AU2011253910B2 (en) 2011-12-08 2015-02-26 Canon Kabushiki Kaisha Method, apparatus and system for tracking an object in a sequence of images
WO2013103410A1 (en) 2012-01-05 2013-07-11 California Institute Of Technology Imaging surround systems for touch-free display control
US8942419B1 (en) 2012-01-06 2015-01-27 Google Inc. Position estimation using predetermined patterns of light sources
US20150097772A1 (en) 2012-01-06 2015-04-09 Thad Eugene Starner Gaze Signal Based on Physical Characteristics of the Eye
US9230171B2 (en) 2012-01-06 2016-01-05 Google Inc. Object outlining to initiate a visual search
US20150084864A1 (en) 2012-01-09 2015-03-26 Google Inc. Input Method
US9501152B2 (en) 2013-01-15 2016-11-22 Leap Motion, Inc. Free-space user interface and control using virtual constructs
US9679215B2 (en) * 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US8638989B2 (en) 2012-01-17 2014-01-28 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
CN104145276B (en) 2012-01-17 2017-05-03 厉动公司 Enhanced contrast for object detection and characterization by optical imaging
US20130182079A1 (en) 2012-01-17 2013-07-18 Ocuspec Motion capture using cross-sections of an object
US10691219B2 (en) * 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US9070019B2 (en) 2012-01-17 2015-06-30 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US8693731B2 (en) 2012-01-17 2014-04-08 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging
US20140307920A1 (en) 2013-04-12 2014-10-16 David Holz Systems and methods for tracking occluded objects in three-dimensional space
US9213822B2 (en) 2012-01-20 2015-12-15 Apple Inc. Device, method, and graphical user interface for accessing an application in a locked device
KR101709844B1 (en) 2012-02-15 2017-02-23 애플 인크. Apparatus and method for mapping
KR101905648B1 (en) 2012-02-27 2018-10-11 삼성전자 주식회사 Apparatus and method for shooting a moving picture of camera device
TWI456486B (en) 2012-03-06 2014-10-11 Acer Inc Electronic apparatus and method for controlling the same
WO2013136053A1 (en) 2012-03-10 2013-09-19 Digitaloptics Corporation Miniature camera module with mems-actuated autofocus
CN108469899B (en) 2012-03-13 2021-08-10 视力移动技术有限公司 Method of identifying an aiming point or area in a viewing space of a wearable display device
US9122354B2 (en) 2012-03-14 2015-09-01 Texas Instruments Incorporated Detecting wave gestures near an illuminated surface
US9218723B2 (en) 2012-03-20 2015-12-22 Intralot S.A.—Integrated Lottery Systems and Services Methods and systems for a gesture-controlled lottery terminal
US8942881B2 (en) 2012-04-02 2015-01-27 Google Inc. Gesture-based automotive controls
TWI464640B (en) 2012-04-03 2014-12-11 Wistron Corp Gesture sensing apparatus and electronic system having gesture input function
US20130300831A1 (en) 2012-05-11 2013-11-14 Loren Mavromatis Camera scene fitting of real world scenes
US9671566B2 (en) 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US9697418B2 (en) 2012-07-09 2017-07-04 Qualcomm Incorporated Unsupervised movement detection and gesture recognition
US9726883B2 (en) 2012-07-27 2017-08-08 Nissan Motor Co., Ltd Three-dimensional object detection device and foreign matter detection device
US9124778B1 (en) 2012-08-29 2015-09-01 Nomi Corporation Apparatuses and methods for disparity-based tracking and analysis of objects in a region of interest
US10839227B2 (en) 2012-08-29 2020-11-17 Conduent Business Services, Llc Queue group leader identification
US8836768B1 (en) 2012-09-04 2014-09-16 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
JP6186689B2 (en) 2012-09-26 2017-08-30 セイコーエプソン株式会社 Video display system
EP2915025B8 (en) 2012-11-01 2021-06-02 Eyecam, Inc. Wireless wrist computing and control device and method for 3d imaging, mapping, networking and interfacing
US9386298B2 (en) 2012-11-08 2016-07-05 Leap Motion, Inc. Three-dimensional image sensors
US9285893B2 (en) 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
WO2014078414A1 (en) 2012-11-13 2014-05-22 Joseph Wu Chemically defined production of cardiomyocytes from pluripotent stem cells
JP6058978B2 (en) 2012-11-19 2017-01-11 サターン ライセンシング エルエルシーSaturn Licensing LLC Image processing apparatus, image processing method, photographing apparatus, and computer program
WO2014083953A1 (en) 2012-11-27 2014-06-05 ソニー株式会社 Display device, display method, and computer program
US9081571B2 (en) 2012-11-29 2015-07-14 Amazon Technologies, Inc. Gesture detection management for an electronic device
US10912131B2 (en) 2012-12-03 2021-02-02 Samsung Electronics Co., Ltd. Method and mobile terminal for controlling bluetooth low energy device
KR101448749B1 (en) 2012-12-10 2014-10-08 현대자동차 주식회사 System and method for object image detecting
US9274608B2 (en) 2012-12-13 2016-03-01 Eyesight Mobile Technologies Ltd. Systems and methods for triggering actions based on touch-free gesture detection
US8761448B1 (en) 2012-12-13 2014-06-24 Intel Corporation Gesture pre-processing of video stream using a markered region
US9733713B2 (en) 2012-12-26 2017-08-15 Futurewei Technologies, Inc. Laser beam based gesture control interface for mobile devices
US20140189579A1 (en) 2013-01-02 2014-07-03 Zrro Technologies (2009) Ltd. System and method for controlling zooming and/or scrolling
US9465461B2 (en) 2013-01-08 2016-10-11 Leap Motion, Inc. Object detection and tracking with audio and optical signals
US9459697B2 (en) 2013-01-15 2016-10-04 Leap Motion, Inc. Dynamic, free-space user interactions for machine control
US9720504B2 (en) 2013-02-05 2017-08-01 Qualcomm Incorporated Methods for system engagement via 3D object detection
US20140240215A1 (en) 2013-02-26 2014-08-28 Corel Corporation System and method for controlling a user interface utility using a vision system
US20140240225A1 (en) 2013-02-26 2014-08-28 Pointgrab Ltd. Method for touchless control of a device
GB201303707D0 (en) 2013-03-01 2013-04-17 Tosas Bautista Martin System and method of interaction for mobile devices
US9056396B1 (en) 2013-03-05 2015-06-16 Autofuss Programming of a robotic arm using a motion capture system
US20140253785A1 (en) 2013-03-07 2014-09-11 Mediatek Inc. Auto Focus Based on Analysis of State or State Change of Image Content
JP6037901B2 (en) 2013-03-11 2016-12-07 日立マクセル株式会社 Operation detection device, operation detection method, and display control data generation method
WO2014140827A2 (en) 2013-03-14 2014-09-18 Eyesight Mobile Technologies Ltd. Systems and methods for proximity sensor and image sensor based gesture detection
US9702977B2 (en) * 2013-03-15 2017-07-11 Leap Motion, Inc. Determining positional information of an object in space
US8954340B2 (en) 2013-03-15 2015-02-10 State Farm Mutual Automobile Insurance Company Risk evaluation based on vehicle operator behavior
US9766709B2 (en) 2013-03-15 2017-09-19 Leap Motion, Inc. Dynamic user interactions for display control
KR102037930B1 (en) 2013-03-15 2019-10-30 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
US10509533B2 (en) 2013-05-14 2019-12-17 Qualcomm Incorporated Systems and methods of generating augmented reality (AR) objects
US10137361B2 (en) 2013-06-07 2018-11-27 Sony Interactive Entertainment America Llc Systems and methods for using reduced hops to generate an augmented virtual reality scene within a head mounted system
US9908048B2 (en) 2013-06-08 2018-03-06 Sony Interactive Entertainment Inc. Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display
WO2014208087A1 (en) 2013-06-27 2014-12-31 Panasonic Intellectual Property Corporation of America Motion sensor device having plurality of light sources
US9239950B2 (en) 2013-07-01 2016-01-19 Hand Held Products, Inc. Dimensioning system
US9857876B2 (en) 2013-07-22 2018-01-02 Leap Motion, Inc. Non-linear motion capture using Frenet-Serret frames
JP2015027015A (en) 2013-07-29 2015-02-05 Sony Corporation Information presentation device and information processing system
GB201314984D0 (en) 2013-08-21 2013-10-02 Sony Comp Entertainment Europe Head-mountable apparatus and systems
US9261966B2 (en) 2013-08-22 2016-02-16 Sony Corporation Close range natural user interface system and method of operation thereof
US8922590B1 (en) 2013-10-01 2014-12-30 Myth Innovations, Inc. Augmented reality interface and method of use
CN105579929B (en) 2013-10-29 2019-11-05 Intel Corporation Human-computer interaction based on gesture
US9546776B2 (en) 2013-10-31 2017-01-17 General Electric Company Customizable modular luminaire
US9402018B2 (en) 2013-12-17 2016-07-26 Amazon Technologies, Inc. Distributing processing for imaging processing
US20150205358A1 (en) 2014-01-20 2015-07-23 Philip Scott Lyren Electronic Device with Touchless User Interface
US20150205400A1 (en) 2014-01-21 2015-07-23 Microsoft Corporation Grip Detection
US9311718B2 (en) 2014-01-23 2016-04-12 Microsoft Technology Licensing, Llc Automated content scrolling
EP3116616B1 (en) 2014-03-14 2019-01-30 Sony Interactive Entertainment Inc. Gaming device with volumetric sensing
CN106255916B (en) 2014-03-14 2019-08-09 Sony Interactive Entertainment Inc. Method and system for tracking a head-mounted display (HMD) and calibration for HMD headband adjustments
US20160062571A1 (en) 2014-09-02 2016-03-03 Apple Inc. Reduced size user interface
US9984505B2 (en) 2014-09-30 2018-05-29 Sony Interactive Entertainment Inc. Display of text information on a head-mounted display

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8432377B2 (en) * 2007-08-30 2013-04-30 Next Holdings Limited Optical touchscreen with improved illumination
US20150193669A1 (en) * 2011-11-21 2015-07-09 Pixart Imaging Inc. System and method based on hybrid biometric detection
US20130271397A1 (en) * 2012-04-16 2013-10-17 Qualcomm Incorporated Rapid gesture re-engagement
US20140002365A1 (en) * 2012-06-28 2014-01-02 Intermec Ip Corp. Dual screen display for mobile computing device

Cited By (142)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11782516B2 (en) 2012-01-17 2023-10-10 Ultrahaptics IP Two Limited Differentiating a detected object from a background using a gaussian brightness falloff pattern
US10699155B2 (en) 2012-01-17 2020-06-30 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US10410411B2 (en) 2012-01-17 2019-09-10 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US10366308B2 (en) 2012-01-17 2019-07-30 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US10691219B2 (en) 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US11720180B2 (en) 2012-01-17 2023-08-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US11493998B2 (en) 2012-01-17 2022-11-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US9934580B2 (en) 2012-01-17 2018-04-03 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US20170061205A1 (en) * 2012-01-17 2017-03-02 Leap Motion, Inc. Enhanced Contrast for Object Detection and Characterization By Optical Imaging Based on Differences Between Images
US9778752B2 (en) 2012-01-17 2017-10-03 Leap Motion, Inc. Systems and methods for machine control
US9741136B2 (en) 2012-01-17 2017-08-22 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US11308711B2 (en) 2012-01-17 2022-04-19 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9697643B2 (en) 2012-01-17 2017-07-04 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US9652668B2 (en) * 2012-01-17 2017-05-16 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US10565784B2 (en) 2012-01-17 2020-02-18 Ultrahaptics IP Two Limited Systems and methods for authenticating a user according to a hand of the user moving in a three-dimensional (3D) space
US11740705B2 (en) 2013-01-15 2023-08-29 Ultrahaptics IP Two Limited Method and system for controlling a machine according to a characteristic of a control object
US11874970B2 (en) 2013-01-15 2024-01-16 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11353962B2 (en) 2013-01-15 2022-06-07 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11693115B2 (en) 2013-03-15 2023-07-04 Ultrahaptics IP Two Limited Determining positional information of an object in space
US9927522B2 (en) 2013-03-15 2018-03-27 Leap Motion, Inc. Determining positional information of an object in space
US9702977B2 (en) * 2013-03-15 2017-07-11 Leap Motion, Inc. Determining positional information of an object in space
US10585193B2 (en) 2013-03-15 2020-03-10 Ultrahaptics IP Two Limited Determining positional information of an object in space
US20140285818A1 (en) * 2013-03-15 2014-09-25 Leap Motion, Inc. Determining positional information of an object in space
US9606637B2 (en) * 2013-04-04 2017-03-28 Funai Electric Co., Ltd. Projector and electronic device having projector function
US20140300544A1 (en) * 2013-04-04 2014-10-09 Funai Electric Co., Ltd. Projector and Electronic Device Having Projector Function
US11099653B2 (en) 2013-04-26 2021-08-24 Ultrahaptics IP Two Limited Machine responsiveness to dynamic user movements and gestures
US11567578B2 (en) 2013-08-09 2023-01-31 Ultrahaptics IP Two Limited Systems and methods of free-space gestural interaction
US11775033B2 (en) 2013-10-03 2023-10-03 Ultrahaptics IP Two Limited Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US9973466B2 (en) * 2013-12-16 2018-05-15 Inbubbles Inc. Space time region based communications
US11706184B2 (en) 2013-12-16 2023-07-18 Inbubbles Inc. Space time region based communications
US20160323236A1 (en) * 2013-12-16 2016-11-03 Inbubbles Inc. Space Time Region Based Communications
US11140120B2 (en) 2013-12-16 2021-10-05 Inbubbles Inc. Space time region based communications
US9672808B2 (en) * 2014-03-07 2017-06-06 Sony Corporation Information processing apparatus, information processing system, information processing method, and program
US10238964B2 (en) * 2014-03-07 2019-03-26 Sony Corporation Information processing apparatus, information processing system, and information processing method
US10088907B2 (en) 2014-03-07 2018-10-02 Sony Corporation Information processing apparatus and information processing method
US20150251089A1 (en) * 2014-03-07 2015-09-10 Sony Corporation Information processing apparatus, information processing system, information processing method, and program
US20150254947A1 (en) * 2014-03-07 2015-09-10 Sony Corporation Information processing apparatus, information processing system, information processing method, and program
US11143749B2 (en) * 2014-05-23 2021-10-12 Signify Holding B.V. Object detection system and method
US20170199272A1 (en) * 2014-07-03 2017-07-13 Sharp Kabushiki Kaisha Optical reflection sensor and electronic device
US10885810B2 (en) 2014-07-11 2021-01-05 Shape Matrix Geometric Instruments, LLC Shape-matrix geometric instrument
US11778159B2 (en) 2014-08-08 2023-10-03 Ultrahaptics IP Two Limited Augmented reality with motion sensing
US9699377B2 (en) * 2014-08-22 2017-07-04 Samsung Electronics Co., Ltd. Depth detecting apparatus and method, and gesture detecting apparatus and gesture detecting method
US20160057340A1 (en) * 2014-08-22 2016-02-25 Samsung Electronics Co., Ltd. Depth detecting apparatus and method, and gesture detecting apparatus and gesture detecting method
US9664507B2 (en) * 2014-09-17 2017-05-30 Canon Kabushiki Kaisha Depth value measurement using illumination by pixels
US20160076878A1 (en) * 2014-09-17 2016-03-17 Canon Kabushiki Kaisha Depth value measurement using illumination by pixels
US11747135B2 (en) 2015-02-13 2023-09-05 Carnegie Mellon University Energy optimized imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor
US10386468B2 (en) * 2015-03-05 2019-08-20 Hanwha Techwin Co., Ltd. Photographing apparatus and method
US20180107108A1 (en) * 2015-04-30 2018-04-19 Sony Corporation Image processing apparatus, image processing method, and program
US10444617B2 (en) * 2015-04-30 2019-10-15 Sony Corporation Image processing apparatus and image processing method
US20170046815A1 (en) * 2015-08-12 2017-02-16 Boe Technology Group Co., Ltd. Display Device, Display System and Resolution Adjusting Method
US10032251B2 (en) * 2015-08-12 2018-07-24 Boe Technology Group Co., Ltd Display device, display system and resolution adjusting method
US20170052024A1 (en) * 2015-08-21 2017-02-23 Adcole Corporation Optical profiler and methods of use thereof
CN106548455A (en) * 2015-09-17 2017-03-29 Apparatus and method for adjusting the brightness of an image
US20170109920A1 (en) * 2015-10-20 2017-04-20 Samsung Electronics Co., Ltd. Method and apparatus rendering caustics
US10127711B2 (en) * 2015-10-20 2018-11-13 Samsung Electronics Co., Ltd. Method and apparatus rendering caustics
US10114467B2 (en) 2015-11-30 2018-10-30 Photopotech LLC Systems and methods for processing image information
US11217009B2 (en) 2015-11-30 2022-01-04 Photopotech LLC Methods for collecting and processing image information to produce digital assets
US10778877B2 (en) 2015-11-30 2020-09-15 Photopotech LLC Image-capture device
US11699243B2 (en) 2015-11-30 2023-07-11 Photopotech LLC Methods for collecting and processing image information to produce digital assets
US10306156B2 (en) * 2015-11-30 2019-05-28 Photopotech LLC Image-capture device
CN108472013A (en) * 2016-01-22 2018-08-31 Olympus Corporation Ultrasound observation apparatus, operating method of ultrasound observation apparatus, and operating program of ultrasound observation apparatus
US10317516B2 (en) * 2016-02-22 2019-06-11 Keyence Corporation Optical safety system
US20170242110A1 (en) * 2016-02-22 2017-08-24 Keyence Corporation Optical Safety System
US20170287153A1 (en) * 2016-03-31 2017-10-05 Mda Information Systems Llc Method and apparatus for imaging the silhouette of an object occluding a light source using a synthetic aperture
US10145671B2 (en) * 2016-03-31 2018-12-04 Topcon Positioning Systems, Inc. Three dimensional laser measuring system and method
US10152802B2 (en) * 2016-03-31 2018-12-11 Radiant Geospatial Solutions Llc Method and apparatus for imaging the silhouette of an object occluding a light source using a synthetic aperture
EP3467765A4 (en) * 2016-05-23 2020-02-26 Sony Corporation Electronic device, control method for electronic device, and program
US10565462B2 (en) * 2016-05-23 2020-02-18 Sony Corporation Electronic apparatus, method of controlling electronic apparatus, and program
US20180189591A1 (en) * 2016-05-23 2018-07-05 Sony Corporation Electronic apparatus, method of controlling electronic apparatus, and program
EP3467765A1 (en) * 2016-05-23 2019-04-10 Sony Corporation Electronic device, control method for electronic device, and program
US10958851B2 (en) * 2016-07-29 2021-03-23 International Business Machines Corporation Camera apparatus for indicating camera field of view
US10084979B2 (en) * 2016-07-29 2018-09-25 International Business Machines Corporation Camera apparatus and system, method and recording medium for indicating camera field of view
US20180338101A1 (en) * 2016-07-29 2018-11-22 International Business Machines Corporation Camera apparatus and system, method and recording medium for indicating camera field of view
US10630909B2 (en) * 2016-07-29 2020-04-21 International Business Machines Corporation Camera apparatus and system, method and recording medium for indicating camera field of view
US20200007731A1 (en) * 2016-07-29 2020-01-02 International Business Machines Corporation Camera apparatus and system, method and recording medium for indicating camera field of view
US10015412B2 (en) * 2016-09-06 2018-07-03 The Trustees For The Time Being Of Junior Barnes Family Trust Video capturing system and method for imaging cyclically moving objects
US10706291B2 (en) * 2017-03-03 2020-07-07 Magna Electronics Inc. Trailer angle detection system for vehicle
US20180253608A1 (en) * 2017-03-03 2018-09-06 Magna Electronics Inc. Trailer angle detection system for vehicle
US11753003B2 (en) * 2017-04-13 2023-09-12 Zoox, Inc. Surface normal determination for LIDAR range samples by detecting probe pulse stretching
US11093031B2 (en) * 2017-06-28 2021-08-17 Trumpf Photonic Components Gmbh Display apparatus for computer-mediated reality
US20200408883A1 (en) * 2017-07-11 2020-12-31 Robert Bosch Gmbh Lidar device for situation-dependent scanning of solid angles
US10458784B2 (en) * 2017-08-17 2019-10-29 Carl Zeiss Industrielle Messtechnik Gmbh Method and apparatus for determining at least one of dimensional or geometric characteristics of a measurement object
US20190056218A1 (en) * 2017-08-17 2019-02-21 Carl Zeiss Industrielle Messtechnik Gmbh Method and apparatus for determining at least one of dimensional or geometric characteristics of a measurement object
US10362296B2 (en) * 2017-08-17 2019-07-23 Microsoft Technology Licensing, Llc Localized depth map generation
US11573095B2 (en) 2017-08-22 2023-02-07 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
US11874130B2 (en) 2017-08-22 2024-01-16 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
US20190071123A1 (en) * 2017-09-07 2019-03-07 Ford Global Technologies, Llc Hitch assist system featuring trailer location identification
US11338851B2 (en) * 2017-09-07 2022-05-24 Ford Global Technologies, Llc Hitch assist system featuring trailer location identification
US11138692B2 (en) * 2017-09-28 2021-10-05 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
US11790490B2 (en) * 2017-09-28 2023-10-17 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
US20220092741A1 (en) * 2017-09-28 2022-03-24 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
US10482575B2 (en) * 2017-09-28 2019-11-19 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
US20190108647A1 (en) * 2017-10-10 2019-04-11 The Boeing Company Systems and methods for 3d cluster recognition for relative tracking
US20190129038A1 (en) * 2017-10-27 2019-05-02 Osram Opto Semiconductors Gmbh Monitoring System for a Mobile Device and Method for Monitoring Surroundings of a Mobile Device
WO2019096369A1 (en) * 2017-11-14 2019-05-23 Alterface Holdings Tracking of a user device
BE1025874B1 (en) * 2017-11-14 2019-08-06 Alterface Holdings Tracking of a user device
US11733524B1 (en) * 2018-02-01 2023-08-22 Meta Platforms Technologies, Llc Depth camera assembly based on near infra-red illuminator
US20210278221A1 (en) * 2018-02-14 2021-09-09 Tusimple, Inc. Lane marking localization and fusion
US20200271473A1 (en) * 2018-02-14 2020-08-27 Tusimple, Inc. Lane Marking Localization
US20210278232A1 (en) * 2018-02-14 2021-09-09 Tusimple, Inc. Lane marking localization
US11740093B2 (en) * 2018-02-14 2023-08-29 Tusimple, Inc. Lane marking localization and fusion
US11009365B2 (en) * 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization
US11852498B2 (en) * 2018-02-14 2023-12-26 Tusimple, Inc. Lane marking localization
US11009356B2 (en) * 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization and fusion
US20190392601A1 (en) * 2018-02-23 2019-12-26 Librestream Technologies Inc. Image Processing System for Inspecting Object Distance and Dimensions Using a Hand-Held Camera with a Collimated Laser
US11232578B2 (en) * 2018-02-23 2022-01-25 Librestream Technologies Inc Image processing system for inspecting object distance and dimensions using a hand-held camera with a collimated laser
US20210052329A1 (en) * 2018-03-30 2021-02-25 Koninklijke Philips N.V. Monitoring of moving objects in an operation room
US20190347399A1 (en) * 2018-05-09 2019-11-14 Shape Matrix Geometric Instruments, LLC Methods and Apparatus for Encoding Passwords or Other Information
US20190347398A1 (en) * 2018-05-09 2019-11-14 Shape Matrix Geometric Instruments, LLC Methods and Apparatus for Encoding Passwords or Other Information
US10872139B2 (en) * 2018-05-09 2020-12-22 Shape Matrix Geometric Instruments, LLC Methods and apparatus for encoding passwords or other information
US10878075B2 (en) * 2018-05-09 2020-12-29 Shape Matrix Geometric Instruments, LLC Methods and apparatus for encoding passwords or other information
US10970373B2 (en) * 2018-08-06 2021-04-06 Lg Electronics Inc. Mobile terminal and method for controlling the same
US11081516B2 (en) * 2018-08-10 2021-08-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Display screen, electronic device and method for three-dimensional feature recognition
US20200096640A1 (en) * 2018-09-26 2020-03-26 Qualcomm Incorporated Multi-phase active light depth system
US10732285B2 (en) * 2018-09-26 2020-08-04 Qualcomm Incorporated Multi-phase active light depth system
US20200132547A1 (en) * 2018-10-30 2020-04-30 Variable, Inc. System and method for spectral interpolation using multiple illumination sources
US10746599B2 (en) * 2018-10-30 2020-08-18 Variable, Inc. System and method for spectral interpolation using multiple illumination sources
US11212456B2 (en) * 2018-12-21 2021-12-28 Sony Group Corporation Synchronized projection and image capture
US20200209984A1 (en) * 2018-12-28 2020-07-02 Texas Instruments Incorporated Optical positioning systems and methods
US11272086B2 (en) * 2019-01-08 2022-03-08 Yu-Sian Jiang Camera system, vehicle and method for configuring light source of camera system
CN111200709A (en) * 2019-01-08 2020-05-26 英属开曼群岛商麦迪创科技股份有限公司 Method for setting light source of camera system, camera system and vehicle
US20220198669A1 (en) * 2019-04-02 2022-06-23 Koninklijke Philips N.V. Segmentation and view guidance in ultrasound imaging and associated devices, systems, and methods
WO2020214201A1 (en) 2019-04-17 2020-10-22 Carnegie Mellon University Agile depth sensing using triangulation light curtains
EP3956631A4 (en) * 2019-04-17 2022-12-28 Carnegie Mellon University Agile depth sensing using triangulation light curtains
US11709364B1 (en) 2019-05-22 2023-07-25 Meta Platforms Technologies, Llc Addressable crossed line projector for depth camera assembly
US11654375B2 (en) * 2019-08-07 2023-05-23 Universal City Studios Llc Systems and methods for detecting specular surfaces
EP3977168A4 (en) * 2019-11-06 2022-08-03 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
WO2021091105A1 (en) 2019-11-06 2021-05-14 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
US20230230251A1 (en) * 2019-11-14 2023-07-20 Samsung Electronics Co., Ltd. Image processing apparatus and method
US11900610B2 (en) * 2019-11-14 2024-02-13 Samsung Electronics Co., Ltd. Image processing apparatus and method
US11282288B2 (en) 2019-11-20 2022-03-22 Shape Matrix Geometric Instruments, LLC Methods and apparatus for encoding data in notched shapes
US11797082B2 (en) 2020-01-13 2023-10-24 Sony Interactive Entertainment Inc. Event driven sensor (EDS) tracking of light emitting diode (LED) array
EP4090910A4 (en) * 2020-01-13 2024-03-20 Sony Interactive Entertainment Inc Event driven sensor (eds) tracking of light emitting diode (led) array
WO2021146117A1 (en) * 2020-01-13 2021-07-22 Sony Interactive Entertainment Inc. Event driven sensor (eds) tracking of light emitting diode (led) array
US11340696B2 (en) 2020-01-13 2022-05-24 Sony Interactive Entertainment Inc. Event driven sensor (EDS) tracking of light emitting diode (LED) array
US20210286455A1 (en) * 2020-03-12 2021-09-16 Beijing Xiaomi Mobile Software Co., Ltd. Electronic equipment, method for controlling electronic equipment, and storage medium
US11861114B2 (en) * 2020-03-12 2024-01-02 Beijing Xiaomi Mobile Software Co., Ltd. Electronic equipment, method for controlling electronic equipment, and storage medium
US11810322B2 (en) 2020-04-09 2023-11-07 Tusimple, Inc. Camera pose estimation techniques
US11412133B1 (en) * 2020-06-26 2022-08-09 Amazon Technologies, Inc. Autonomously motile device with computer vision
US11822018B2 (en) * 2020-10-14 2023-11-21 Lg Innotek Co., Ltd. Multi-detector LiDAR systems and methods for mitigating range aliasing
US20220113380A1 (en) * 2020-10-14 2022-04-14 Argo AI, LLC Multi-detector lidar systems and methods for mitigating range aliasing

Also Published As

Publication number Publication date
US10585193B2 (en) 2020-03-10
US9927522B2 (en) 2018-03-27
US9702977B2 (en) 2017-07-11
US20190018141A1 (en) 2019-01-17
US11693115B2 (en) 2023-07-04
US20230288563A1 (en) 2023-09-14
WO2014200589A3 (en) 2015-03-19
WO2014200589A2 (en) 2014-12-18
WO2014200589A9 (en) 2015-02-05
US20170285169A1 (en) 2017-10-05
US20200249352A1 (en) 2020-08-06
US20140285818A1 (en) 2014-09-25

Similar Documents

Publication Publication Date Title
US20150253428A1 (en) Determining positional information for an object in space
US9911240B2 (en) Systems and method of interacting with a virtual object
US9778752B2 (en) Systems and methods for machine control
US10452151B2 (en) Non-tactile interface systems and methods
US11435788B2 (en) Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US8971565B2 (en) Human interface electronic device
CN105593786B (en) Object position determination
US20170186182A1 (en) Method and system for generating light pattern using polygons
US10691219B2 (en) Systems and methods for machine control
US9348419B2 (en) Method for synchronizing operation of systems
US9851574B2 (en) Mirror array display system
KR20210053980A (en) Depth detection using grid light patterns
US11029408B2 (en) Distance-imaging system and method of distance imaging
US11720180B2 (en) Systems and methods for machine control
US20230325005A1 (en) Systems and methods for machine control
EP3104209B1 (en) Method and system for generating light pattern using polygons

Legal Events

Date Code Title Description
AS Assignment

Owner name: LEAP MOTION, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HOLZ, DAVID;REEL/FRAME:035328/0253

Effective date: 20140320

AS Assignment

Owner name: TRIPLEPOINT CAPITAL LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:LEAP MOTION, INC.;REEL/FRAME:036644/0314

Effective date: 20150918

AS Assignment

Owner name: THE FOUNDERS FUND IV, LP, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:LEAP MOTION, INC.;REEL/FRAME:036796/0151

Effective date: 20150918

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: LEAP MOTION, INC., CALIFORNIA

Free format text: TERMINATION OF SECURITY AGREEMENT;ASSIGNOR:THE FOUNDERS FUND IV, LP, AS COLLATERAL AGENT;REEL/FRAME:047444/0567

Effective date: 20181101

AS Assignment

Owner name: LEAP MOTION, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:TRIPLEPOINT CAPITAL LLC;REEL/FRAME:049337/0130

Effective date: 20190524

AS Assignment

Owner name: ULTRAHAPTICS IP TWO LIMITED, UNITED KINGDOM

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LMI LIQUIDATING CO., LLC.;REEL/FRAME:051580/0165

Effective date: 20190930

Owner name: LMI LIQUIDATING CO., LLC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEAP MOTION, INC.;REEL/FRAME:052914/0871

Effective date: 20190930

AS Assignment

Owner name: LMI LIQUIDATING CO., LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:ULTRAHAPTICS IP TWO LIMITED;REEL/FRAME:052848/0240

Effective date: 20190524

AS Assignment

Owner name: TRIPLEPOINT CAPITAL LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:LMI LIQUIDATING CO., LLC;REEL/FRAME:052902/0571

Effective date: 20191228