WO2014186304A1 - Interference reduction for TOF systems - Google Patents

Interference reduction for TOF systems

Info

Publication number
WO2014186304A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
periods
light source
emit
tof
Prior art date
Application number
PCT/US2014/037753
Other languages
French (fr)
Inventor
Andrew Payne
Cyrus Bamji
Dawson Yee
Barry Thompson
Zhanping Xu
Brock Roland
Larry Prather
Travis Perry
Mike Fenton
Sunil Acharya
Algird Gudaitis
Matthew Morris
Original Assignee
Microsoft Corporation
Priority date
Filing date
Publication date
Application filed by Microsoft Corporation
Priority to EP14734582.1A (EP2997395B1)
Priority to CN201480028101.8A (CN105264401B)
Publication of WO2014186304A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/4865: Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S 17/36: Systems determining position data of a target, for measuring distance only, using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated, with phase comparison between the received signal and the contemporaneously transmitted signal
    • G01S 17/87: Combinations of systems using electromagnetic waves other than radio waves
    • G01S 17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S 7/493: Extracting wanted echo signals

Definitions

  • IR remote control signals typically have a wavelength of about 940nm and typically have a carrier frequency between 10 kHz and 100 kHz, and even more specifically between 30 kHz and 60 kHz. For an even more specific example, many IR remote control signals have a carrier frequency of about 36 kHz (this is not to be confused with the actual frequency of the IR light itself).
  • a time-of-flight (TOF) camera, which can also be referred to as a TOF system, may be located in close proximity to (e.g., within the same room as) one or more of the aforementioned consumer electronic devices (e.g., a television, a set top box and/or a media player) that is/are configured to be remotely controlled by a handheld remote control device.
  • a TOF camera may be part of a gaming console that is within the same room as a television, a set top box and/or a DVD player, which can also be referred to as other systems.
  • Such a TOF camera typically operates by illuminating a target with a modulated IR light source and detecting IR light that reflects off the target and is incident on an image pixel detector array of the TOF camera.
  • the IR light source is usually modulated at a relatively high carrier frequency (e.g., about 100 MHz, which is within the radio frequency range) during integration and is typically switched off between frames or captures and during readout. While the carrier frequency of the modulated IR light source is typically well above the carrier frequency of remote control signals, transitions from times during which the light source does not emit the RF modulated light to times during which the light source emits RF modulated light, and vice versa, can produce lower frequency content that can interfere with the remote control signals.
  • a low frequency (LF) power envelope associated with the modulated IR light produced by the TOF camera may interfere with remote control signals intended to control another system (e.g., a television) within close proximity to the TOF camera.
  • a vast majority of the interference produced by the TOF camera will not correspond to a valid remote control command, and thus, will be rejected by an IR receiver of the other system (e.g., the television) that is intended to be controlled by remote control signals.
  • the interference produced by the TOF camera may be significant enough to prevent a user from being able to actually remotely control the other system (e.g., the television) that is within close proximity to the TOF camera.
  • a TOF camera can render a remote control device inoperative. Due to relatively poor optical bandpass characteristics of IR receivers of televisions, or other systems, such interference problems may occur even where the IR wavelength used by a TOF camera differs from the IR wavelength used by a remote control device. For example, such interference problems may occur even where the wavelength of the IR light used by the TOF camera is about 860 nm and the wavelength of the IR light used by a remote control device is about 940 nm. Further, it is noted that a TOF camera may also cause similar interference problems with other systems that receive and respond to wireless IR signals, such as, but not limited to, systems that include wireless IR headphones and three-dimensional (3D) television systems that include active shutter 3D glasses (see the sketch below).
  • 3D: three-dimensional
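  • The following is a minimal numeric sketch (not from the patent; all timing values and names are assumptions chosen for illustration) of how gating an RF modulated IR source on and off produces envelope energy in the carrier band used by remote controls. It models only the low frequency power envelope; the ~100 MHz carrier itself is abstracted away.

        import numpy as np

        fs = 1_000_000                          # envelope sample rate: 1 MHz
        integration_us, readout_us = 100, 150   # assumed period lengths, microseconds
        n_cycles = 200

        on = np.ones(integration_us)            # light emitting (1 sample per microsecond)
        off = np.zeros(readout_us)              # light off during readout
        envelope = np.tile(np.concatenate([on, off]), n_cycles)

        spectrum = np.abs(np.fft.rfft(envelope)) / len(envelope)
        freqs = np.fft.rfftfreq(len(envelope), d=1 / fs)

        band = (freqs >= 30e3) & (freqs <= 60e3)  # typical remote-control carrier band
        print("envelope energy in 30-60 kHz band:", float(np.sum(spectrum[band] ** 2)))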
  • Certain embodiments disclosed are directed to time-of-flight (TOF) systems, and methods for use therewith, that substantially reduce interference that the TOF system may cause to at least one other system that is configured to wirelessly receive and respond to IR light signals.
  • Some such embodiments involve emitting IR light having a low frequency (LF) power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, wherein at least a portion of IR light being emitted is radio frequency (RF) modulated IR light, and thus, includes an RF component.
  • RF: radio frequency
  • Such embodiments can also involve detecting at least a portion of the emitted RF modulated IR light that has reflected off one or more objects.
  • a TOF system can produce depth images in dependence on results of the detecting, as well as update an application in dependence on the depth images.
  • an LF power envelope is the LF average power delivered over time by a signal.
  • a TOF system can be configured to obtain a separate depth image corresponding to each of a plurality of frame periods, wherein each frame period is followed by an inter-frame period, each frame period includes at least two integration periods, and each integration period is followed by a readout period.
  • IR light can be emitted during each of the integration periods, to enable depth images to be produced. Additionally, to reduce how often there are transitions between times during which IR light is being emitted and times during which IR light is not being emitted, and thereby reduce the frequency content associated with such transitions, the IR light can also be emitted during the readout periods between pairs of the integration periods within each frame period.
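  • A minimal sketch of the emission schedule just described (the period lengths, the number of integration periods, and the function name are assumptions): within a frame, the light also stays on during the readout periods that fall between integration periods, so an on/off transition occurs only once per frame rather than once per integration period.

        from typing import List, Tuple

        def frame_schedule(n_integrations: int = 4,
                           integration_us: int = 100,
                           readout_us: int = 150) -> List[Tuple[str, int, bool]]:
            """Return (period_name, duration_us, light_on) tuples for one frame."""
            schedule = []
            for i in range(n_integrations):
                schedule.append(("integration", integration_us, True))
                # Keep emitting during interior readouts; only the final readout,
                # which borders the inter-frame period, turns the light off.
                schedule.append(("readout", readout_us, i < n_integrations - 1))
            return schedule

        for name, dur, light_on in frame_schedule():
            print(f"{name:12s} {dur:4d} us  light {'ON' if light_on else 'OFF'}")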
  • in order to decrease a gain level of an automatic gain control (AGC) circuit for use with an IR light receiver of at least one other system configured to wirelessly receive and respond to IR light signals, and thereby make the IR light receiver of the at least one other system less sensitive to interference from the TOF system, IR light can be emitted during the readout periods between pairs of the integration periods within each frame period, as well as during at least a portion of the inter-frame periods between pairs of frames (as the AGC sketch below illustrates). This can be in addition to the IR light that is emitted during the integration periods.
  • AGC: automatic gain control
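  • The following sketches the AGC intuition using an assumed first-order AGC model, not any specific receiver: a receiver's gain climbs toward a target level while ambient IR is low, making it more sensitive to TOF bursts; keeping some IR emission going during readout and inter-frame periods holds the measured level up, and therefore holds the gain, and the susceptibility to interference, down.

        def agc_gain(levels, target=1.0, gain=10.0, rate=0.05,
                     g_min=1.0, g_max=100.0):
            """Iterate a simple first-order AGC; returns the gain trajectory."""
            history = []
            for level in levels:
                error = target - gain * level      # positive when the input is weak
                gain = min(g_max, max(g_min, gain + rate * error))
                history.append(gain)
            return history

        quiet = [0.01] * 200   # TOF fully dark between frames: receiver gain climbs
        lit = [0.5] * 200      # baseline IR kept on: receiver gain stays low
        print("gain after quiet gaps :", round(agc_gain(quiet)[-1], 1))  # climbs toward ~100
        print("gain after baseline IR:", round(agc_gain(lit)[-1], 1))    # settles near 2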
  • IR light may be emitted by producing a drive signal including an RF component and having a LF power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, and driving at least one light source with the drive signal including the RF component.
  • the LF power envelope can be shaped by ramping up pulse amplitudes of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse amplitudes of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
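  • A sketch of the amplitude-ramp shaping described above. The 50 us ramp length matches the guidance given later in this text; the raised-cosine profile and the function name are assumptions, not a shape mandated by the patent.

        import numpy as np

        def shaped_envelope(on_us: int, ramp_us: int = 50, samples_per_us: int = 4):
            """LF power envelope with ramped turn-on and turn-off edges."""
            n_on = on_us * samples_per_us
            n_ramp = ramp_us * samples_per_us
            # Raised-cosine ramp from 0 to 1; each RF pulse's amplitude would be
            # scaled by this envelope at its position in time.
            ramp_up = 0.5 * (1 - np.cos(np.pi * np.arange(n_ramp) / n_ramp))
            return np.concatenate([ramp_up, np.ones(n_on), ramp_up[::-1]])

        env = shaped_envelope(on_us=400)   # 50 us up, 400 us full power, 50 us down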
  • the LF power envelope can be shaped by ramping up pulse duty cycles of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse duty cycles of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
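  • A companion sketch for the duty-cycle variant (all values and names assumed): the pulse amplitude stays constant, but the fraction of each RF period during which the light is on ramps from zero up to the operating duty cycle at turn-on, and back down at turn-off. Because the average emitted power per RF period is proportional to the duty cycle, this profile is also the burst's LF power envelope; the gap-based variants in the next two bullets shape the envelope analogously.

        import numpy as np

        def duty_cycle_profile(n_periods: int = 500, n_ramp: int = 50,
                               operating_duty: float = 0.5):
            """Per-RF-period duty cycle across one burst, with ramped edges."""
            duty = np.full(n_periods, operating_duty)
            edge = np.linspace(0.0, operating_duty, n_ramp)
            duty[:n_ramp] = edge           # ramp duty cycle up at turn-on
            duty[-n_ramp:] = edge[::-1]    # ramp duty cycle down at turn-off
            return duty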
  • the LF power envelope can be shaped by ramping down temporal gaps between pulses or pulse trains of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping up temporal gaps between pulses or pulse trains of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
  • the LF power envelope can be shaped by ramping down how often gaps occur between pulses or pulse trains of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping up how often gaps occur between pulses or pulse trains of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
  • any of the aforementioned ramping up preferably occurs over a time period of at least 50 μs, and any of the aforementioned ramping down preferably also occurs over a time period of at least 50 μs. Time permitting, the ramping up and ramping down may occur over longer periods of time.
  • embodiments of the present technology can be used to reduce the adverse effects that TOF systems may have on other systems that are configured to wirelessly receive and respond to IR light signals, while preserving correct TOF operation.
  • Such embodiments preferably do not degrade, or only minimally degrade, the performance of TOF systems. Additionally, such embodiments preferably do not increase, or only minimally increase, power usage by the TOF system.
  • FIGS. 1A and 1B illustrate an example embodiment of a tracking system with a user playing a game.
  • FIG. 2A illustrates an example embodiment of a capture device that may be used as part of the tracking system.
  • FIG. 2B illustrates an exemplary embodiment of a TOF camera that may be part of the capture device of FIG. 2A.
  • FIG. 3 illustrates an example embodiment of a computing system that may be used to track user behavior and update an application based on the user behavior.
  • FIG. 4 illustrates another example embodiment of a computing system that may be used to track user behavior and update an application based on the tracked user behavior.
  • FIG. 5 illustrates an exemplary depth image.
  • FIG. 6 depicts exemplary data in an exemplary depth image.
  • FIG. 7 illustrates exemplary timing and amplitude details associated with two exemplary frames of a signal for use with a TOF system.
  • FIG. 8 illustrates an exemplary LF frequency power spectrum associated with the signal shown in FIG. 7.
  • FIG. 9 illustrates how the LF power spectrum of drive and IR light signals can be shaped, in accordance with an embodiment, to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals.
  • FIG. 10 illustrates how the LF power spectrum of drive and IR light signals can be shaped, in accordance with another embodiment, to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals.
  • FIG. 11 illustrates how the LF power spectrum of drive and IR light signals can be shaped, in accordance with still another embodiment, to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals.
  • FIG. 12 illustrates an embodiment that reduces how often there are transitions from times during which IR light signals are emitted to times during which the IR light signals are not emitted, and vice versa, to thereby reduce certain frequency content associated with such transitions.
  • FIG. 13 illustrates an embodiment that combines the embodiments of FIGS. 9 and 12.
  • FIG. 14 illustrates an embodiment that decreases a gain level set by an automatic gain control (AGC) circuit associated with a receiver of another system in close proximity to a TOF system, and thereby, makes the receiver of the other system less susceptible to interference from the TOF system.
  • FIG. 15 illustrates an additional technique for smoothing out LF power envelopes in accordance with an embodiment.
  • FIG. 16 is a high level flow diagram that is used to summarize methods according to various embodiments of the present technology.
  • Certain embodiments of the present technology disclosed herein are directed to TOF systems, and methods for use therewith, that substantially reduce interference that a TOF system may cause to at least one other system (e.g., a television, a set top box, a DVD player, IR headphones and/or active 3D glasses) that is configured to wirelessly receive and respond to IR light signals.
  • FIGS. 1A and 1B illustrate an example embodiment of a tracking system 100 with a user 118 playing a boxing video game.
  • the tracking system 100 may be used to recognize, analyze, and/or track a human target such as the user 118 or other objects within range of the tracking system 100.
  • the tracking system 100 includes a computing system 112 and a capture device 120.
  • the capture device 120 can be used to obtain depth images and color images (also known as RGB images) that can be used by the computing system 112 to identify one or more users or other objects, as well as to track motion and/or other user behaviors.
  • the tracked motion and/or other user behavior can be used to update an application.
  • a user can manipulate game characters or other aspects of the application by using movement of the user's body and/or objects around the user, rather than (or in addition to) using controllers, remotes, keyboards, mice, or the like.
  • a video game system can update the position of images displayed in a video game based on the new positions of the objects or update an avatar based on motion of the user.
  • the computing system 112 may be a computer, a gaming system or console, or the like.
  • the computing system 112 may include hardware components and/or software components such that computing system 112 may be used to execute applications such as gaming applications, non-gaming applications, or the like.
  • computing system 112 may include a processor such as a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions stored on a processor readable storage device for performing the processes described herein.
  • the capture device 120 may include, for example, a camera that may be used to visually monitor one or more users, such as the user 118, such that gestures and/or movements performed by the one or more users may be captured, analyzed, and tracked to perform one or more controls or actions within the application and/or animate an avatar or on-screen character, as will be described in more detail below.
  • the tracking system 100 may be connected to an audiovisual device 116 such as a television, a monitor, a high-definition television (HDTV), or the like that may provide game or application visuals and/or audio to a user such as the user 118.
  • the computing system 112 may include a video adapter such as a graphics card and/or an audio adapter such as a sound card that may provide audiovisual signals associated with the game application, non-game application, or the like.
  • the audiovisual device 116 may receive the audiovisual signals from the computing system 112 and may then output the game or application visuals and/or audio associated with the audiovisual signals to the user 118.
  • the audiovisual device 116 may be connected to the computing system 112 via, for example, an S-Video cable, a coaxial cable, an HDMI cable, a DVI cable, a VGA cable, a component video cable, or the like.
  • the tracking system 100 may be used to recognize, analyze, and/or track a human target such as the user 118.
  • the user 118 may be tracked using the capture device 120 such that the gestures and/or movements of user 118 may be captured to animate an avatar or on-screen character and/or may be interpreted as controls that may be used to affect the application being executed by computing system 112.
  • the user 118 may move his or her body to control the application and/or animate the avatar or on-screen character.
  • the application executing on the computing system 112 may be a boxing game that the user 118 is playing.
  • the computing system 112 may use the audiovisual device 116 to provide a visual representation of a boxing opponent 138 to the user 118.
  • the computing system 112 may also use the audiovisual device 116 to provide a visual representation of a player avatar 140 that the user 118 may control with his or her movements.
  • the user 118 may throw a punch in physical space to cause the player avatar 140 to throw a punch in game space.
  • the computing system 112 and the capture device 120 recognize and analyze the punch of the user 118 in physical space such that the punch may be interpreted as a game control of the player avatar 140 in game space and/or the motion of the punch may be used to animate the player avatar 140 in game space.
  • Other movements by the user 118 may also be interpreted as other controls or actions and/or used to animate the player avatar, such as controls to bob, weave, shuffle, block, jab, or throw a variety of different power punches.
  • some movements may be interpreted as controls that may correspond to actions other than controlling the player avatar 140.
  • the player may use movements to end, pause, or save a game, select a level, view high scores, communicate with a friend, etc.
  • the player may use movements to select the game or other application from a main user interface.
  • a full range of motion of the user 118 may be available, used, and analyzed in any suitable manner to interact with an application.
  • the human target such as the user 118 may have an object.
  • the user of an electronic game may be holding the object such that the motions of the player and the object may be used to adjust and/or control parameters of the game.
  • the motion of a player holding a racket may be tracked and utilized for controlling an on-screen racket in an electronic sports game.
  • the motion of a player holding an object may be tracked and utilized for controlling an on-screen weapon in an electronic combat game.
  • Objects not held by the user can also be tracked, such as objects thrown, pushed or rolled by the user (or a different user) as well as self-propelled objects.
  • other games can also be implemented.
  • the tracking system 100 may further be used to interpret target movements as operating system and/or application controls that are outside the realm of games.
  • virtually any controllable aspect of an operating system and/or application may be controlled by movements of the target such as the user 118.
  • FIG. 2A illustrates an example embodiment of the capture device 120 that may be used in the tracking system 100.
  • the capture device 120 may be configured to capture video with depth information including a depth image that may include depth values via any suitable technique including, for example, time-of-flight, structured light, stereo image, or the like.
  • the capture device 120 may organize the depth information into "Z layers", or layers that may be perpendicular to a Z axis extending from the depth camera along its line of sight.
  • the capture device 120 may include an image camera component 222.
  • the image camera component 222 may be a depth camera that may capture a depth image of a scene.
  • the depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel in the 2-D pixel area may represent a depth value such as a distance in, for example, centimeters, millimeters, or the like of an object in the captured scene from the camera.
  • 2-D: two-dimensional
  • the image camera component 222 may include an infra-red (IR) light component 224, a three-dimensional (3-D) camera 226, and an RGB camera 228 that may be used to capture the depth image of a scene.
  • IR: infra-red
  • 3-D: three-dimensional
  • the IR light component 224 of the capture device 120 may emit an infrared light onto the scene and may then use sensors (not specifically shown in FIG. 2A) to detect the backscattered light from the surface of one or more targets and objects in the scene using, for example, the 3-D camera 226 and/or the RGB camera 228.
  • pulsed IR light may be used such that the time between an outgoing light pulse and a corresponding incoming light pulse may be measured and used to determine a physical distance from the capture device 120 to a particular location on the targets or objects in the scene. Additionally or alternatively, the phase of the outgoing light wave may be compared to the phase of the incoming light wave to determine a phase shift. The phase shift may then be used to determine a physical distance from the capture device to a particular location on the targets or objects. Additional details of an exemplary TOF type of 3-D camera 226 are described below with reference to FIG. 2B.
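  • The phase-to-distance relation underlying such an analysis is standard continuous-wave TOF physics (not specific to this patent): a phase shift Δφ at modulation frequency f_mod corresponds to a round-trip delay of Δφ/(2π·f_mod), i.e. a one-way distance of c·Δφ/(4π·f_mod).

        import math

        C = 299_792_458.0                       # speed of light, m/s

        def phase_to_distance(dphi_rad: float, f_mod_hz: float = 100e6) -> float:
            """One-way distance implied by a measured phase shift."""
            return C * dphi_rad / (4 * math.pi * f_mod_hz)

        print(phase_to_distance(math.pi / 2))   # ~0.375 m at 100 MHz
        print(C / (2 * 100e6))                  # unambiguous range: ~1.5 m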
  • TOF analysis may be used to indirectly determine a physical distance from the capture device 120 to a particular location on the targets or objects by analyzing the intensity of the reflected beam of light over time via various techniques including, for example, shuttered light pulse imaging.
  • the capture device 120 may use a structured light to capture depth information.
  • in such an analysis, patterned light (i.e., light displayed as a known pattern such as a grid pattern, a stripe pattern, or a different pattern) may be projected onto the scene; upon striking the surface of one or more targets or objects in the scene, the pattern may become deformed in response.
  • Such a deformation of the pattern may be captured by, for example, the 3-D camera 226 and/or the RGB camera 228 and may then be analyzed to determine a physical distance from the capture device to a particular location on the targets or objects.
  • the IR light component 224 is displaced from the cameras 226 and 228 so that triangulation can be used to determine distance from the cameras 226 and 228.
  • the capture device 120 will include a dedicated IR sensor to sense the IR light.
  • the capture device 120 may include two or more physically separated cameras that may view a scene from different angles to obtain visual stereo data that may be resolved to generate depth information.
  • Other types of depth image sensors can also be used to create a depth image.
  • the capture device 120 may further include a microphone 230.
  • the microphone 230 may include a transducer or sensor that may receive and convert sound into an electrical signal. According to one embodiment, the microphone 230 may be used to reduce feedback between the capture device 120 and the computing system 112 in the target recognition, analysis, and tracking system 100. Additionally, the microphone 230 may be used to receive audio signals (e.g., voice commands) that may also be provided by the user to control applications such as game applications, non-game applications, or the like that may be executed by the computing system 112.
  • the capture device 120 may further include a processor 232 that may be in operative communication with the image camera component 222.
  • the processor 232 may include a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions including, for example, instructions for receiving a depth image, generating the appropriate data format (e.g., frame) and transmitting the data to computing system 112.
  • the capture device 120 may further include a memory component 234 that may store the instructions that may be executed by the processor 232, images or frames of images captured by the 3-D camera and/or RGB camera, or any other suitable information, images, or the like.
  • the memory component 234 may include random access memory (RAM), read only memory (ROM), cache, Flash memory, a hard disk, or any other suitable storage component.
  • RAM: random access memory
  • ROM: read only memory
  • the memory component 234 may be a separate component in communication with the image capture component 222 and the processor 232.
  • the memory component 234 may be integrated into the processor 232 and/or the image capture component 222.
  • the capture device 120 may be in communication with the computing system 112 via a communication link 236.
  • the communication link 236 may be a wired connection including, for example, a USB connection, a Firewire connection, an Ethernet cable connection, or the like and/or a wireless connection such as a wireless 802.11b, g, a, or n connection.
  • the computing system 112 may provide a clock to the capture device 120 that may be used to determine when to capture, for example, a scene via the communication link 236.
  • the capture device 120 provides the depth images and color images captured by, for example, the 3-D camera 226 and/or the RGB camera 228 to the computing system 112 via the communication link 236.
  • the depth images and color images are transmitted at 30 frames per second.
  • the computing system 112 may then use the model, depth information, and captured images to, for example, control an application such as a game or word processor and/or animate an avatar or on-screen character.
  • Computing system 112 includes gestures library 240, structure data 242, depth image processing and object reporting module 244 and application 246.
  • Depth image processing and object reporting module 244 uses the depth images to track motion of objects, such as the user and other objects. To assist in the tracking of the objects, depth image processing and object reporting module 244 uses gestures library 240 and structure data 242.
  • Structure data 242 includes structural information about objects that may be tracked. For example, a skeletal model of a human may be stored to help understand movements of the user and recognize body parts. Structural information about inanimate objects may also be stored to help recognize those objects and help understand movement.
  • Gestures library 240 may include a collection of gesture filters, each comprising information concerning a gesture that may be performed by the skeletal model (as the user moves). The data captured by the cameras 226, 228 and the capture device 120 in the form of the skeletal model and movements associated with it may be compared to the gesture filters in the gesture library 240 to identify when a user (as represented by the skeletal model) has performed one or more gestures. Those gestures may be associated with various controls of an application. Thus, the computing system 112 may use the gestures library 240 to interpret movements of the skeletal model and to control application 246 based on the movements. As such, gestures library may be used by depth image processing and object reporting module 244 and application 246.
  • Application 246 can be a video game, productivity application, etc.
  • depth image processing and object reporting module 244 will report to application 246 an identification of each object detected and the location of the object for each frame.
  • Application 246 will use that information to update the position or movement of an avatar or other images in the display.
  • FIG. 2B illustrates an example embodiment of a TOF type of 3-D camera 226, which can also be referred to as a TOF camera 226, or more generally can be referred to as a TOF system 226.
  • the TOF system 226 is shown as including a driver 260 that drives a light source 250.
  • the light source 250 can be the IR light component 224 shown in FIG. 2A, or can be one or more other light emitting element. More generally, the light source 250 can include one or more light emitting elements, such as, but not limited to, laser diodes or light emitting diodes (LEDs).
  • a laser diode can include one or more vertical-cavity surface-emitting lasers (VCSELs) or edge emitting lasers, but is not limited thereto.
  • VCSELs: vertical-cavity surface-emitting lasers
  • a TOF system may include multiple light sources, e.g., a first light source including one or more laser diodes, and a second light source including one or more LEDs. While it is likely that such light emitting elements emit IR light, light of alternative wavelengths can alternatively be emitted by the light emitting elements. Unless stated otherwise, it is assumed that the light source 250 emits IR light.
  • the TOF system 226 is also shown as including a clock signal generator 262, which produces a clock signal that is provided to the driver 260. Additionally, the TOF system 226 is shown as including a microprocessor 264 that can control the clock signal generator 262 and/or the driver 260. The TOF system 226 is also shown as including an image pixel detector array 268, readout circuitry 270 and memory 266.
  • the image pixel detector array 268 might include, e.g., 320 x 240 image pixel detectors, but is not limited thereto. Each image pixel detector can be, e.g., a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor, but is not limited thereto.
  • CMOS: complementary metal-oxide-semiconductor
  • CCD: charge-coupled device
  • each image pixel detector can have its own dedicated readout circuit, or readout circuitry can be shared by many image pixel detectors.
  • the components of the TOF system 226 shown within the block 280 are implemented in a single integrated circuit (IC), which can also be referred to as a single TOF chip.
  • the driver 260 can produce a radio frequency (RF) modulated drive signal in dependence on a clock signal received from clock signal generator 262.
  • the driver 260 can include, for example, one or more buffers, amplifiers and/or modulators, but is not limited thereto.
  • the clock signal generator 262 can include, for example, one or more reference clocks and/or voltage controlled oscillators, but is not limited thereto.
  • the microprocessor 264 which can be part of a microcontroller unit, can be used to control the clock signal generator 262 and/or the driver 260.
  • the microprocessor 264 can access waveform information stored in the memory 266 in order to produce an RF modulated drive signal in accordance with various embodiments described herein.
  • the TOF system 226 can include its own memory 266 and microprocessor 264, as shown in FIG. 2B.
  • the processor 232 and/or memory 234 of the capture device 120 can be used to control aspects of the TOF system 226.
  • in response to being driven by an RF modulated drive signal, the light source 250 emits RF modulated light, which can also be referred to as an RF modulated light signal.
  • a carrier frequency of the RF modulated drive signal and the RF modulated light can be in a range from about 5 MHz to many hundreds of MHz, but for illustrative purposes will be assumed to be about 100 MHz.
  • the light emitted by the light source 250 is transmitted through an optional lens or light shaping diffuser 252 towards a target object (e.g., a user 118).
  • each individual image pixel detector of the array 268 produces an integration value indicative of a magnitude and a phase of detected RF modulated light originating from the light source that has reflected off the object and is incident on the image pixel detector.
  • integration values, or more generally TOF information, enable distances (Z) to be determined, and collectively enable depth images to be produced.
  • optical energy from the light source 250 and detected optical energy signals are synchronized to each other such that a phase difference, and thus a distance Z, can be measured from each image pixel detector.
  • the readout circuitry 270 converts analog integration values generated by the image pixel detector array 268 into digital readout signals, which are provided to the microprocessor 264 and/or the memory 266, and which can be used to produce depth images.
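  • One common way (not necessarily the approach claimed in this patent) to turn the digital readout values into depth is four-phase demodulation: integrate each pixel at 0, 90, 180 and 270 degrees relative to the emitted modulation, then recover the phase with atan2 and apply the phase-to-distance relation given earlier. Function name and the raw values are illustrative.

        import math

        C = 299_792_458.0

        def four_phase_depth(a0, a90, a180, a270, f_mod_hz=100e6):
            """Depth in meters from four quadrature integration values of one pixel."""
            dphi = math.atan2(a90 - a270, a0 - a180)  # phase of the reflected light
            if dphi < 0:
                dphi += 2 * math.pi                   # fold into [0, 2*pi)
            return C * dphi / (4 * math.pi * f_mod_hz)

        print(four_phase_depth(120, 180, 80, 20))     # illustrative raw values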
  • FIG. 3 illustrates an example embodiment of a computing system that may be the computing system 112 shown in FIGS. 1A-2B used to track motion and/or animate (or otherwise update) an avatar or other on-screen object displayed by an application.
  • the computing system such as the computing system 112 described above with respect to FIGS. 1A-2B may be a multimedia console, such as a gaming console.
  • the multimedia console 300 has a central processing unit (CPU) 301 having a level 1 cache 302, a level 2 cache 304, and a flash ROM (Read Only Memory) 306.
  • the level 1 cache 302 and a level 2 cache 304 temporarily store data and hence reduce the number of memory access cycles, thereby improving processing speed and throughput.
  • the CPU 301 may be provided having more than one core, and thus, additional level 1 and level 2 caches 302 and 304.
  • the flash ROM 306 may store executable code that is loaded during an initial phase of a boot process when the multimedia console 300 is powered ON.
  • a graphics processing unit (GPU) 308 and a video encoder/video codec (coder/decoder) 314 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 308 to the video encoder/video codec 314 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 340 for transmission to a television or other display.
  • a memory controller 310 is connected to the GPU 308 to facilitate processor access to various types of memory 312, such as, but not limited to, a RAM (Random Access Memory).
  • the multimedia console 300 includes an I/O controller 320, a system management controller 322, an audio processing unit 323, a network interface 324, a first USB host controller 326, a second USB controller 328 and a front panel I/O subassembly 330 that are preferably implemented on a module 318.
  • the USB controllers 326 and 328 serve as hosts for peripheral controllers 342(l)-342(2), a wireless adapter 348, and an external memory device 346 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc.).
  • the network interface 324 and/or wireless adapter 348 provide access to a network (e.g., the Internet, home network, etc.) and may be any of a wide variety of various wired or wireless adapter components including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like.
  • System memory 343 is provided to store application data that is loaded during the boot process.
  • a media drive 344 is provided and may comprise a DVD/CD drive, Blu-Ray drive, hard disk drive, or other removable media drive, etc.
  • the media drive 344 may be internal or external to the multimedia console 300.
  • Application data may be accessed via the media drive 344 for execution, playback, etc. by the multimedia console 300.
  • the media drive 344 is connected to the I/O controller 320 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394).
  • the system management controller 322 provides a variety of service functions related to assuring availability of the multimedia console 300.
  • the audio processing unit 323 and an audio codec 332 form a corresponding audio processing pipeline with high fidelity and stereo processing. Audio data is carried between the audio processing unit 323 and the audio codec 332 via a communication link.
  • the audio processing pipeline outputs data to the A/V port 340 for reproduction by an external audio player or device having audio capabilities.
  • the front panel I/O subassembly 330 supports the functionality of the power button 350 and the eject button 352, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the multimedia console 300.
  • a system power supply module 336 provides power to the components of the multimedia console 300.
  • a fan 338 cools the circuitry within the multimedia console 300.
  • the CPU 301, GPU 308, memory controller 310, and various other components within the multimedia console 300 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures.
  • bus architectures can include a Peripheral Component Interconnect (PCI) bus, PCI-Express bus, etc.
  • application data may be loaded from the system memory 343 into memory 312 and/or caches 302, 304 and executed on the CPU 301.
  • the application may present a graphical user interface that provides a consistent user experience when navigating to different media types available on the multimedia console 300.
  • applications and/or other media contained within the media drive 344 may be launched or played from the media drive 344 to provide additional functionalities to the multimedia console 300.
  • the multimedia console 300 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the multimedia console 300 allows one or more users to interact with the system, watch movies, or listen to music. However, with the integration of broadband connectivity made available through the network interface 324 or the wireless adapter 348, the multimedia console 300 may further be operated as a participant in a larger network community.
  • a set amount of hardware resources are reserved for system use by the multimedia console operating system. These resources may include a reservation of memory (e.g., 16MB), CPU and GPU cycles (e.g., 5%), networking bandwidth (e.g., 8 Kbps), etc. Because these resources are reserved at system boot time, the reserved resources do not exist from the application's view.
  • the memory reservation preferably is large enough to contain the launch kernel, concurrent system applications and drivers.
  • the CPU reservation is preferably constant such that if the reserved CPU usage is not used by the system applications, an idle thread will consume any unused cycles.
  • lightweight messages generated by the system applications (e.g., popups) are displayed by using a GPU interrupt to schedule code to render a popup into an overlay.
  • the amount of memory required for an overlay depends on the overlay area size and the overlay preferably scales with screen resolution. Where a full user interface is used by the concurrent system application, it is preferable to use a resolution independent of application resolution. A scaler may be used to set this resolution such that the need to change frequency and cause a TV resynch is eliminated.
  • after the multimedia console 300 boots and system resources are reserved, concurrent system applications execute to provide system functionalities.
  • the system functionalities are encapsulated in a set of system applications that execute within the reserved system resources described above.
  • the operating system kernel identifies threads that are system application threads versus gaming application threads.
  • the system applications are preferably scheduled to run on the CPU 301 at predetermined times and intervals in order to provide a consistent system resource view to the application. The scheduling is to minimize cache disruption for the gaming application running on the console.
  • a multimedia console application manager controls the gaming application audio level (e.g., mute, attenuate) when system applications are active.
  • Input devices are shared by gaming applications and system applications.
  • the input devices are not reserved resources, but are to be switched between system applications and the gaming application such that each will have a focus of the device.
  • the application manager preferably controls the switching of the input stream, without the gaming application's knowledge, and a driver maintains state information regarding focus switches.
  • the cameras 226, 228 and capture device 120 may define additional input devices for the console 300 via USB controller 326 or other interface.
  • FIG. 4 illustrates another example embodiment of a computing system 420 that may be the computing system 112 shown in FIGS. 1A-2B used to track motion and/or animate (or otherwise update) an avatar or other on-screen object displayed by an application.
  • the computing system 420 is only one example of a suitable computing system and is not intended to suggest any limitation as to the scope of use or functionality of the presently disclosed subject matter. Neither should the computing system 420 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary computing system 420.
  • the various depicted computing elements may include circuitry configured to instantiate specific aspects of the present disclosure.
  • circuitry used in the disclosure can include specialized hardware components configured to perform function(s) by firmware or switches.
  • the term circuitry can include a general purpose processing unit, memory, etc., configured by software instructions that embody logic operable to perform function(s).
  • an implementer may write source code embodying logic and the source code can be compiled into machine readable code that can be processed by the general purpose processing unit. Since one skilled in the art can appreciate that the state of the art has evolved to a point where there is little difference between hardware, software, or a combination of hardware/software, the selection of hardware versus software to effectuate specific functions is a design choice left to an implementer.
  • Computing system 420 comprises a computer 441, which typically includes a variety of computer readable media.
  • Computer readable media can be any available media that can be accessed by computer 441 and includes both volatile and nonvolatile media, removable and non-removable media.
  • the system memory 422 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 423 and random access memory (RAM) 460.
  • ROM: read only memory
  • RAM: random access memory
  • a basic input/output system 424 (BIOS) containing the basic routines that help to transfer information between elements within computer 441, such as during start-up, is typically stored in ROM 423.
  • BIOS: basic input/output system
  • RAM 460 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 459.
  • FIG. 4 illustrates operating system 425, application programs 426, other program modules 427, and program data 428.
  • the computer 441 may also include other removable/non-removable, volatile/nonvolatile computer storage media.
  • FIG. 4 illustrates a hard disk drive 438 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 439 that reads from or writes to a removable, nonvolatile magnetic disk 454, and an optical disk drive 440 that reads from or writes to a removable, nonvolatile optical disk 453 such as a CD ROM or other optical media.
  • removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
  • the hard disk drive 438 is typically connected to the system bus 421 through a non-removable memory interface such as interface 434, and magnetic disk drive 439 and optical disk drive 440 are typically connected to the system bus 421 by a removable memory interface, such as interface 435.
  • the drives and their associated computer storage media discussed above and illustrated in FIG. 4, provide storage of computer readable instructions, data structures, program modules and other data for the computer 441.
  • hard disk drive 438 is illustrated as storing operating system 458, application programs 457, other program modules 456, and program data 455. Note that these components can either be the same as or different from operating system 425, application programs 426, other program modules 427, and program data 428.
  • Operating system 458, application programs 457, other program modules 456, and program data 455 are given different numbers here to illustrate that, at a minimum, they are different copies.
  • a user may enter commands and information into the computer 441 through input devices such as a keyboard 451 and pointing device 452, commonly referred to as a mouse, trackball or touch pad.
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
  • These and other input devices are often connected to the processing unit 459 through a user input interface 436 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB).
  • the cameras 226, 228 and capture device 120 may define additional input devices for the computing system 420 that connect via user input interface 436.
  • a monitor 442 or other type of display device is also connected to the system bus 421 via an interface, such as a video interface 432.
  • computers may also include other peripheral output devices such as speakers 444 and printer 443, which may be connected through an output peripheral interface 433.
  • Capture Device 120 may connect to computing system 420 via output peripheral interface 433, network interface 437, or other interface.
  • the computer 441 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 446.
  • the remote computer 446 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 441, although only a memory storage device 447 has been illustrated in FIG. 4.
  • the logical connections depicted include a local area network (LAN) 445 and a wide area network (WAN) 449, but may also include other networks.
  • LAN: local area network
  • WAN: wide area network
  • Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • the computer 441 When used in a LAN networking environment, the computer 441 is connected to the LAN 445 through a network interface 437. When used in a WAN networking environment, the computer 441 typically includes a modem 450 or other means for establishing communications over the WAN 449, such as the Internet.
  • the modem 450 which may be internal or external, may be connected to the system bus 421 via the user input interface 436, or other appropriate mechanism.
  • program modules depicted relative to the computer 441, or portions thereof may be stored in the remote memory storage device.
  • FIG. 4 illustrates application programs 448 as residing on memory device 447. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • the capture device 120 provides RGB images (also known as color images) and depth images to the computing system 112.
  • the depth image may be a plurality of observed pixels where each observed pixel has an observed depth value.
  • the depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel in the 2-D pixel area may have a depth value such as a length or distance in, for example, centimeters, millimeters, or the like of an object in the captured scene from the capture device.
  • FIG. 5 illustrates an example embodiment of a depth image that may be received at computing system 112 from capture device 120.
  • the depth image may be an image and/or frame of a scene captured by, for example, the 3- D camera 226 and/or the RGB camera 228 of the capture device 120 described above with respect to FIG. 2A.
  • the depth image may include a human target corresponding to, for example, a user such as the user 118 described above with respect to FIGS. 1A and 1B and one or more non-human targets such as a wall, a table, a monitor, or the like in the captured scene.
  • the depth image may include a plurality of observed pixels where each observed pixel has an observed depth value associated therewith.
  • the depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel at particular x-value and y-value in the 2-D pixel area may have a depth value such as a length or distance in, for example, centimeters, millimeters, or the like of a target or object in the captured scene from the capture device.
  • a depth image can specify, for each of the pixels in the depth image, a pixel location and a pixel depth. Following a segmentation process, each pixel in the depth image can also have a segmentation value associated with it.
  • the pixel location can be indicated by an x-position value (i.e., a horizontal value) and a y-position value (i.e., a vertical value).
  • the pixel depth can be indicated by a z-position value (also referred to as a depth value), which is indicative of a distance between the capture device (e.g., 120) used to obtain the depth image and the portion of the user represented by the pixel.
  • the segmentation value is used to indicate whether a pixel corresponds to a specific user, or does not correspond to a user.
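  • A minimal sketch of the per-pixel record just described (field names are illustrative, not from the patent):

        from dataclasses import dataclass

        @dataclass
        class DepthPixel:
            x: int             # horizontal pixel location (x-position value)
            y: int             # vertical pixel location (y-position value)
            z_mm: int          # depth value: distance from the capture device, in mm
            segmentation: int  # 0 = not a user; otherwise identifies a specific user

        pixel = DepthPixel(x=160, y=120, z_mm=2150, segmentation=1)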
  • the depth image may be colorized or grayscale such that different colors or shades of the pixels of the depth image correspond to and/or visually depict different distances of the targets from the capture device 120.
  • one or more high-variance and/or noisy depth values may be removed and/or smoothed from the depth image; portions of missing and/or removed depth information may be filled in and/or reconstructed; and/or any other suitable processing may be performed on the received depth image.
  • FIG. 6 provides another view/representation of a depth image (not corresponding to the same example as FIG. 5).
  • the view of FIG. 6 shows the depth data for each pixel as an integer that represents the distance of the target to capture device 120 for that pixel.
  • the example depth image of FIG. 6 shows 24x24 pixels; however, it is likely that a depth image of greater resolution would be used.
  • a TOF system may be located in close proximity to (e.g., within the same room as) a consumer electronic device (e.g., a television, a set top box and/or a media player) that is/are configured to be remotely controlled by a handheld remote control device.
  • the capture device 120 can include a TOF system 226 that is located close to the television or display 116.
  • a TOF system may be located in close proximity to other types of systems configured to wirelessly receive and respond to IR light signals, such as, but not limited to, systems that include wireless IR headphones and 3D television systems that include active shutter 3D glasses.
  • the TOF system may operate by illuminating a target (e.g., user 118) with RF modulated IR light and detecting IR light that reflects off the target and is incident on an image pixel detector array of the TOF camera. While the carrier frequency of the RF modulated IR light is typically well above the carrier frequency of remote control signals, abrupt transitions from times during which the light source does not emit the RF modulated light to times during which the light source emits RF modulated light, and vice versa, can produce lower frequency content that can interfere with the remote control signals.
  • low frequency content associated with the modulated IR light produced by the TOF system may interfere with remote control signals intended to control another device (e.g., a television) within the vicinity of the TOF system. While most if not all of the interference produced by the TOF system will not correspond to a valid remote control command (and thus, will be rejected by a remote control receiver of the other device as an invalid command), the interference produced by the TOF system may be significant enough to prevent a user from being able to actually remotely control the other device (e.g., the television or display 116) that is within close proximity to the TOF system.
  • the low frequency content produced by the TOF system can similarly interfere with other types of systems that are configured to wirelessly receive and respond to IR light signals.
  • Certain embodiments of the present technology described below, which are for use with a TOF system, substantially reduce frequency content within the range of frequencies known to be used by remote control devices and/or within one or more other frequency ranges known to be used by other systems configured to wirelessly receive and respond to IR light signals. Accordingly, such embodiments enable other systems to operate in their intended manner when in close proximity to a TOF system. For a more specific example, such embodiments enable consumer electronic devices to be remotely controlled even though they are in close proximity to a TOF system.
  • IR remote control signals typically have a carrier frequency between 10 kHz and 100 kHz, and even more specifically between 30 kHz and 60 kHz.
  • Certain remote control devices, for example, transmit IR remote control signals having a carrier frequency of about 36 kHz (this is not to be confused with the actual frequency of the IR light itself).
  • Other systems utilize IR remote control signals having a carrier frequency of about 455 kHz.
  • Still other systems utilize IR remote control signals having a carrier frequency of about 1 MHz.
  • a consumer electronic device (e.g., a television, set top box or media player) that is controllable by remote control signals includes a remote control receiver that is configured to receive and decode remote control signals within an expected frequency range, examples of which were discussed above.
  • FIG. 7 will first be used to describe a typical RF modulated drive signal generated by a TOF system and a typical RF modulated IR light signal emitted by the TOF system. More specifically, FIG. 7 illustrates exemplary pulse timing and pulse amplitude details associated with two exemplary frames of signals for use with a TOF system.
  • the frames shown in FIG. 7 can be for use with a TOF system that, for example, is configured to obtain a separate depth image corresponding to each of a plurality of frames, which can also be referred to as frame periods.
  • the waveforms shown in FIG. 7 are illustrative of both an RF modulated drive signal used to drive an IR light source, as well as an RF modulated IR light signal produced by (and more specifically, emitted by) the light source being driven by the RF modulated drive signal.
  • each frame period is followed by an inter-frame period, which separates the frame period from the next frame period.
  • the length of each frame period may or may not be the same as the length of each inter-frame period.
  • Each frame period includes at least two integration periods, and each integration period is followed by a readout period.
  • a frame period may include ten integration periods, each of which is followed by a respective one of ten readout periods. Except for the last readout period of a frame period, each of the readout periods separates a pair of the integration periods of the frame period.
  • the frame rate can be, for example, 30 Hz, but is not limited thereto. Where the frame rate is 30 Hz, each frame period plus inter-frame period pair is about 33.33 msec.
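As a quick sanity check of the timing arithmetic above, the sketch below budgets one 30 Hz frame-plus-inter-frame pair across ten integration/readout pairs. Only the 30 Hz rate and the ten-integration example come from the text; the 2 msec and 1 msec period lengths are illustrative assumptions.

```python
FRAME_RATE_HZ = 30
frame_plus_interframe_ms = 1000 / FRAME_RATE_HZ   # ~33.33 msec per pair

n_integrations = 10      # per the ten-integration example above
integration_ms = 2.0     # assumed length of each integration period
readout_ms = 1.0         # assumed length of each readout period

# Each integration period is followed by a readout period, so the frame
# period is ten such pairs; the remainder is the inter-frame period.
frame_period_ms = n_integrations * (integration_ms + readout_ms)
interframe_ms = frame_plus_interframe_ms - frame_period_ms
print(frame_period_ms, round(interframe_ms, 2))   # 30.0 3.33
```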
  • each of the integration periods is shown as including numerous pulses having the same pulse amplitude, and each of the readout periods and inter-frame periods is shown as including no pulses.
  • FIG. 7 is illustrative of an RF modulated drive signal, as well as the RF modulated IR light signal generated by driving an IR light source with the RF modulated drive signal. Assuming that the pulse frequency is about 100 MHz, the pulse frequency is well above the frequency range known to be used by most other systems that are configured to wirelessly receive and respond to IR light signals.
  • the LF power envelope of the RF modulated IR light includes significant frequency content within the frequency ranges known to be used by other systems that are configured to wirelessly receive and respond to IR light signals, examples of which were discussed above. Such frequency content is primarily due to the abrupt transitions from times during which the light source is not driven by the RF modulated drive signal to times during which the light source is driven by the RF modulated drive signal, and vice versa, which occur at the beginning and end of each of the integration periods.
  • the shaded areas labeled 702 in FIG. 7 are illustrative of the LF power envelopes of the RF modulated drive signal and the RF modulated light signal produced using the drive signal.
  • a LF power envelope, as the term is used herein, is the LF average power delivered over time by a signal.
  • FIG. 8 illustrates an exemplary LF frequency power spectrum associated with the signal shown in FIG. 7.
  • as FIG. 8 shows, this LF power spectrum includes significant content within the frequency ranges used by a system (e.g., a television or display) that is configured to wirelessly receive and respond to IR light signals. The power spectrum associated with the signal shown in FIG. 7 will also include a peak at the carrier frequency, e.g., at 100 MHz.
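The following sketch illustrates why the hard-edged envelope of FIG. 7 deposits power into the remote-control band: it models one frame of the on/off LF power envelope (the 100 MHz carrier itself is not represented at this coarse time step) and integrates the FFT power between 10 kHz and 100 kHz. The 1 MHz envelope sample rate and the 2 msec/1 msec period lengths are assumptions for illustration.

```python
import numpy as np

fs = 1_000_000                        # envelope sample rate: 1 MHz (assumed)
t = np.arange(int(0.0333 * fs)) / fs  # one ~33.3 msec frame + inter-frame pair
envelope = np.zeros_like(t)           # 1.0 = light source driven, 0.0 = dark
integ, read = 0.002, 0.001            # assumed 2 msec integration, 1 msec readout
for k in range(10):                   # ten integration periods per frame period
    start = k * (integ + read)
    envelope[(t >= start) & (t < start + integ)] = 1.0

spectrum = np.abs(np.fft.rfft(envelope)) ** 2
freqs = np.fft.rfftfreq(len(envelope), d=1 / fs)
band = (freqs >= 10e3) & (freqs <= 100e3)
print(f"relative power in the 10-100 kHz band: {spectrum[band].sum():.3g}")
```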
  • a first embodiment for smoothing out the edges of the LF power envelopes involves ramping up the pulse amplitudes of an RF component of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light.
  • This embodiment also includes ramping down pulse amplitudes of the RF component of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
  • the RF component of the IR light signal emitted by the light source will also have pulse amplitudes that ramp up and thereafter ramp down.
  • the shaded areas labeled 902 in FIG. 9 are illustrative of the LF power envelopes of the drive signal and the IR light signal produced using the drive signal.
  • the ramping up of the pulse amplitudes should occur over a period of at least 50 μsec and the ramping down of the pulse amplitudes should occur over a period of at least 50 μsec. Time permitting, the ramping up of the pulse amplitudes preferably occurs over a period between approximately 1 msec and 10 msec, and the ramping down similarly preferably occurs over a period between approximately 1 msec and 10 msec, which should ensure a substantial reduction in frequency content between 10 kHz and 100 kHz, which is the frequency range typically used to transmit IR remote control signals. It is noted that FIG. 9 is included for illustrative purposes, but is not drawn to scale, since the ramping up and down of pulse amplitudes will occur over much more than three or four pulses.
  • the microprocessor 264 can control the clock signal generator 262 to produce clock pulses having pulse amplitudes that ramp up and ramp down.
  • the microprocessor 264 can control the driver 260 to produce an RF modulated drive signal that includes pulse amplitudes that ramp up and ramp down.
  • the driver 260 can include a pulse amplitude modulator that is controlled by the microprocessor 264.
  • the microprocessor 264 may access information regarding pulse amplitudes from the memory 266.
  • Such a pulse amplitude modulator can be implemented, e.g., using an amplifier having an adjustable gain that is controlled by the microprocessor 264.
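A minimal sketch of the amplitude-ramping idea follows, assuming per-pulse amplitudes are tabulated the way the microprocessor 264 might read them from the memory 266. The linear ramp shape and the 100 MHz pulse rate are assumptions; the text only requires that the ramps span at least 50 μsec.

```python
import numpy as np

def amplitude_envelope(n_pulses: int, n_ramp: int) -> np.ndarray:
    """Per-pulse amplitudes: linear ramp up, flat top, linear ramp down."""
    amps = np.ones(n_pulses)
    ramp = np.linspace(0.0, 1.0, n_ramp, endpoint=False)
    amps[:n_ramp] = ramp          # ramp up at the start of the burst
    amps[-n_ramp:] = ramp[::-1]   # ramp down at the end of the burst
    return amps

# Assumed 100 MHz pulse rate: a 2 msec integration period is 200,000 pulses,
# and the 50 usec minimum ramp spans 5,000 of them -- which is why FIG. 9,
# showing ramps of only a few pulses, is not drawn to scale.
amps = amplitude_envelope(n_pulses=200_000, n_ramp=5_000)
```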
  • a second embodiment for smoothing out the edges of the LF power envelopes involves ramping up the pulse duty cycles of an RF component of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light.
  • This embodiment also includes ramping down pulse duty cycles of the RF component of the drive signal when transitioning from a time during which the light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
  • the RF component of the light signal emitted by the light source will also have pulse duty cycles that ramp up and thereafter ramp down.
  • the shaded areas labeled 1002 in FIG. 10 are illustrative of the LF power envelopes of the drive signal and the IR light signal produced using the drive signal.
  • the ramping up of the pulse duty cycles should occur over a period of at least 50 μsec and the ramping down of the pulse duty cycles should occur over a period of at least 50 μsec. Time permitting, the ramping up of the pulse duty cycles preferably occurs over a period between approximately 1 msec and 10 msec, and the ramping down similarly preferably occurs over a period between approximately 1 msec and 10 msec, which should ensure a substantial reduction in frequency content between 10 kHz and 100 kHz, which is the frequency range typically used to transmit IR remote control signals. It is noted that FIG. 10 is included for illustrative purposes, but is not drawn to scale, since the ramping up and down of pulse duty cycles will occur over much more than three or four pulses.
  • the microprocessor 264 can control the clock signal generator 262 to produce clock pulses having pulse duty cycles that ramp up and ramp down.
  • the microprocessor 264 can control the driver 260 to produce a drive signal having an RF component that includes pulse duty cycles that ramp up and ramp down.
  • the driver 260 can include a pulse duty cycle modulator that is controlled by the microprocessor 264.
  • the microprocessor 264 may access information regarding pulse duty cycles from the memory 266.
  • Such a pulse duty cycle modulator can be implemented, e.g., using a pulse width modulator. These are just a few examples, which are not meant to be all encompassing.
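The duty-cycle variant can be sketched the same way: every pulse keeps full amplitude, but its on-time (and hence its contribution to the LF average power) ramps. This again assumes a 100 MHz pulse rate and a linear ramp; the 50% nominal duty cycle is also an assumption.

```python
import numpy as np

def duty_cycle_envelope(n_pulses: int, n_ramp: int,
                        nominal_duty: float = 0.5) -> np.ndarray:
    """Per-pulse duty cycles: ramp up to the nominal value, hold, ramp down."""
    duty = np.full(n_pulses, nominal_duty)
    ramp = np.linspace(0.0, nominal_duty, n_ramp, endpoint=False)
    duty[:n_ramp] = ramp          # pulses start narrow and widen
    duty[-n_ramp:] = ramp[::-1]   # then narrow again at the end of the burst
    return duty

# With a 10 nsec pulse period (100 MHz), the commanded on-time per pulse:
on_time_ns = duty_cycle_envelope(n_pulses=200_000, n_ramp=5_000) * 10.0
```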
  • Another embodiment for smoothing out the edges of the LF power envelopes, which is illustrated in FIG. 11, involves ramping down temporal gaps between adjacent pulses or pulse trains of the RF modulated drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light.
  • This embodiment also includes ramping up temporal gaps between adjacent pulses or pulse trains of the RF component of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
  • the RF component of the IR light signal emitted by the light source will also have pulses or pulse trains with temporal gaps that ramp down and thereafter ramp up.
  • the shaded areas labeled 1102 in FIG. 11 are illustrative of the LF power envelopes of the drive signal and the light signal produced using the drive signal.
  • the ramping down of the temporal gaps between adjacent pulses or pulse trains should occur over a period of at least 50 μsec and the ramping up of the temporal gaps between adjacent pulses or pulse trains should occur over a period of at least 50 μsec. Time permitting, the ramping down of the temporal gaps between adjacent pulses or pulse trains preferably occurs over a period between approximately 1 msec and 10 msec, and the ramping up of the temporal gaps between adjacent pulses or pulse trains similarly preferably occurs over a period between approximately 1 msec and 10 msec, which should ensure a substantial reduction in frequency content between 10 kHz and 100 kHz, which is the frequency range typically used to transmit IR remote control signals. It is noted that FIG. 11 is included for illustrative purposes, but is not drawn to scale, since the ramping down and up of temporal gaps between adjacent pulses will occur over much more than three or four pulses.
  • the microprocessor 264 can control the clock signal generator 262 to produce clock pulses having temporal gaps between pulses that ramp down and ramp up. This can be accomplished, e.g., by not outputting certain clock pulses.
  • a gating circuit can be located between the clock signal generator 262 and the driver 260, so that some clock pulses are not provided to the driver 260, to thereby control the temporal gaps between adjacent pulses or pulse trains, and/or how often gaps occur.
  • the gating circuit can be part of or upstream from the driver 260 and can selectively prevent some drive pulses from being provided to the light source 250, to thereby control the temporal gaps between adjacent pulses or pulse trains output by the light source and/or how often gaps occur.
  • the microprocessor 264 can control the clock signal generator 262, the driver 260 and/or such a gating circuit to achieve the ramping up and down of temporal gaps between pulses or pulse trains, and/or how often gaps occur.
  • the microprocessor 264 may access information regarding temporal gaps and/or how often gaps should occur between pulses or pulse trains from the memory 266. These are just a few examples, which are not meant to be all encompassing.
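One way to picture the gap-based approach is as a gating mask over the clock pulses: near the edges of a burst most pulses are suppressed (long and frequent gaps), while toward the middle nearly all pass. The random thinning below is purely an assumption for illustration; the text describes controlling gap length and frequency, not any particular gating schedule.

```python
import numpy as np

def gating_mask(n_pulses: int, n_ramp: int, seed: int = 0) -> np.ndarray:
    """Boolean mask of which clock pulses are passed on to the driver.

    The pass probability ramps 0 -> 1 at the start (gaps become rarer and
    shorter) and 1 -> 0 at the end (gaps become more frequent and longer).
    """
    keep_prob = np.ones(n_pulses)
    keep_prob[:n_ramp] = np.linspace(0.0, 1.0, n_ramp, endpoint=False)
    keep_prob[-n_ramp:] = keep_prob[:n_ramp][::-1].copy()
    return np.random.default_rng(seed).random(n_pulses) < keep_prob

mask = gating_mask(n_pulses=200_000, n_ramp=5_000)
# A gating circuit would simply not forward the pulses where mask is False.
```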
  • the abrupt transitions from times during which a light source emits IR light to times during which no light source emits IR light, and vice versa, produce frequency content that can interfere with the IR remote control signals and/or other IR signals used by other systems.
  • the LF power envelopes associated with the IR light produced by the TOF camera may interfere with one or more other systems (configured to wirelessly receive and respond to IR light signals) that is/are within close proximity to the TOF camera.
  • An embodiment, which will now be described with reference to FIG. 12, reduces how often there are transitions from times during which IR light is emitted to times during which IR light is not emitted, and vice versa, thereby reducing frequency content associated with such transitions.
  • the RF modulated drive and RF modulated light signals are only produced during the integration periods of frame periods, but are not produced during the readout periods of frame periods, as was shown in FIGS. 7 and 9-11.
  • drive and IR light signals are produced during each of the integration periods of each frame period as well as during each of the readout periods between pairs of the integration periods within each frame period, as shown in FIG. 12. This has the effect of reducing how often there are transitions from times during which no light source emits IR light to times during which a light source is driven to emit IR light, and vice versa, and thereby reduces frequency content associated with such transitions.
  • the shaded areas in FIG. 12 are illustrative of the LF power envelopes of the drive and IR light signals produced using such an embodiment.
  • as can be appreciated from FIG. 12, even though there are multiple integration and readout periods per frame period, there is only one LF power envelope per frame period, and thus only one rising edge of the LF power envelope and only one falling edge of the LF power envelope.
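The benefit can be counted directly: coarsely sample one frame period as on/off values, one per period, and count the edges. The three-integration layout below is just a small example (the text mentions frames with as many as ten integration periods).

```python
def count_transitions(timeline):
    """Count on/off edges in a coarse on/off timeline (one value per period)."""
    return sum(a != b for a, b in zip(timeline, timeline[1:]))

# One frame period with three integration (I) and readout (R) periods,
# book-ended by dark inter-frame samples; 1 = IR light emitted, 0 = dark.
#                 --  I  R  I  R  I  R  --
conventional = [0, 1, 0, 1, 0, 1, 0, 0]   # dark during readouts (FIG. 7)
fig12_style  = [0, 1, 1, 1, 1, 1, 0, 0]   # lit through readouts (FIG. 12)
print(count_transitions(conventional), count_transitions(fig12_style))  # 6 2
```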
  • the same light source can be used for emitting IR light during integration periods and readout periods.
  • alternatively, a first light source (e.g., an IR laser diode) can be used to emit IR light during the integration periods, and a second light source (e.g., an IR LED) can be used to emit IR light during the readout periods. Other variations are also possible, and within the scope of an embodiment.
  • FIG. 13 illustrates an embodiment that combines the embodiment described with reference to FIG. 12 with the embodiment described with reference to FIG. 9.
  • the shaded areas labeled 1302 in FIG. 13 are illustrative of the LF power envelopes of the drive signal and the emitted IR light signal produced using such an embodiment.
  • the embodiment described with reference to FIG. 13 achieves the benefits of the embodiment described with reference to FIG. 12 as well as the benefits of the embodiment described with reference to FIG. 9.
  • the embodiment described with reference to FIG. 12 can be combined with one of the embodiments described with reference to FIGS. 10 or 11 to achieve similar substantial reductions in frequency content within the frequency range known to be used by remote control devices and other systems configured to wirelessly receive and respond to IR light signals.
  • the embodiment described with reference to FIG. 9 can likewise be combined with one or more of the embodiments described with reference to FIGS. 10 and 11. It is also possible that the embodiment described with reference to FIG. 12 can be combined with more than one of the embodiments described with reference to FIGS. 9-11.
  • the readout circuitry (e.g., 270 in FIG. 2B) either does not generate readout signals corresponding to reflected light detected by the image pixel detector array 268 during readout periods, or any readout signals corresponding to reflected light detected by the image pixel detector array 268 during readout periods should be ignored by the TOF system.
  • the readout circuitry (e.g., 270 in FIG. 2B) generates, and the TOF system utilizes, the readout signals corresponding to reflected IR light detected by the image pixel detector array 268 during readout periods.
  • Many systems that are configured to wirelessly receive and respond to IR signals include a receiver that has an automatic gain control (AGC) circuit that adjusts the sensitivity of the receiver in dependence on ambient light conditions. More specifically, such AGC circuits usually decrease the gain of a receiver amplifier under high ambient light conditions, which makes the receiver less sensitive; and the AGC circuits usually increase the gain of the receiver amplifier under low ambient light conditions, which makes the receiver more sensitive. The more sensitive the receiver, the less need for a direct line of sight between a sub-system that transmits IR signals (e.g., a remote control device that transmits IR remote control signals) and the receiver, which for example, can be built into a television or set top box.
  • conversely, the less sensitive the receiver, the greater the need for a direct line of sight between the sub-system (e.g., a remote control device that transmits IR remote control signals) and the receiver.
  • the reason for reducing the amplifier gain during high ambient light conditions is that the reduction in gain makes the receiver less sensitive to interference resulting from ambient light.
  • reducing the gain of a receiver amplifier also has the effect of making the receiver less sensitive to interference resulting from the RF component of IR light produced by a TOF system.
  • An embodiment of the present technology, which shall now be described with reference to FIG. 14, purposely reduces the receiver amplifier gain (even during low ambient light conditions) in order to make a receiver (e.g., a remote control receiver) less sensitive to interference resulting from IR light produced by a TOF system.
  • the AGC circuit of a receiver may, for example, have about 50 dB of adjustable gain. Such an AGC circuit automatically varies the gain of a receiver amplifier between its minimum gain (in a bright environment) and its maximum gain (in a dark environment).
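The following toy model shows the AGC behavior this embodiment exploits: receiver gain falls as perceived ambient (or TOF-emitted) IR rises, making the receiver less sensitive to the TOF system's residual interference. The linear control law and the normalized ambient reading are assumptions; only the roughly 50 dB adjustable range comes from the text.

```python
def agc_gain_db(ambient_level: float,
                min_gain_db: float = 0.0,
                max_gain_db: float = 50.0) -> float:
    """Toy AGC control law: gain falls linearly from max (dark) to min (bright).

    ambient_level is a normalized 0.0 (dark) .. 1.0 (bright) reading; the
    ~50 dB adjustable range follows the figure above, the linearity is assumed.
    """
    ambient_level = min(max(ambient_level, 0.0), 1.0)
    return max_gain_db - ambient_level * (max_gain_db - min_gain_db)

print(agc_gain_db(0.1))  # dark room -> high gain, very sensitive receiver
print(agc_gain_db(0.9))  # bright room, or steady TOF IR present -> low gain
```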
  • to purposely decrease the gain level set by such an AGC circuit, the driver (e.g., 260) drives a light source (e.g., 250) such that the TOF system purposely increases the percentage of each frame period during which IR light is emitted, as shown in FIG. 14.
  • the width of the LF power envelope 1402 can be made greater than the width of the frame period by emitting IR light during portions of the adjacent inter-frame periods.
  • the TOF system may emit IR light during at least a portion of inter-frame periods, which is not typically done.
  • this embodiment can be combined with previously described embodiments. For example, in FIG. 14, the pulse amplitudes are ramped up and down in the manner originally described with reference to FIG. 9.
  • the same light source can be used for emitting IR light during the integration periods, the readout periods and portions of the inter-frame periods.
  • alternatively, a first light source (e.g., an IR laser diode) can be used to emit IR light during the integration periods, and a second light source (e.g., an IR LED) can be used to emit IR light during the readout periods and portions of the inter-frame periods. Other variations are also possible, and within the scope of an embodiment.
  • in FIGS. 9-14, the pulse amplitudes within the middle integration period(s) of a frame period were shown as staying the same. However, this need not be the case. For example, the pulse amplitudes may change from one integration period to the next. In order to reduce frequency content resulting from such abrupt, relatively significant changes in pulse amplitudes, it is advantageous to produce RF modulated drive and RF modulated light signals during the readout periods (between such integration periods) and smooth out the changes in the pulse amplitudes, as can be appreciated from FIG. 15. More generally, FIG. 15 illustrates a smoothing out of all transitions of the LF power envelope 1502. The embodiment described with reference to FIG. 15 can be combined with one or more of the previously described embodiments.
  • in addition to ramping pulse amplitudes up and down to smooth out the LF power envelope 1502, ramping up and down of pulse duty cycles, temporal gaps, and/or how often gaps occur between adjacent pulses or pulse trains can be used to smooth out the LF power envelope 1502.
  • The high level flow diagram of FIG. 16 will now be used to summarize methods according to various embodiments of the present technology.
  • Such methods, which are for use by a TOF system, are for substantially reducing interference that the TOF system may cause to a further device that is configured to wirelessly receive and respond to IR signals transmitted by a remote control device that is intended to remotely control the further device (e.g., a television).
  • step 1602 involves emitting IR light having a LF power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, wherein at least a portion of IR light being emitted is RF modulated IR light.
  • Frequency ranges within which frequency content can be reduced using embodiments described herein include, but are not limited to: 10 kHz - 100 kHz frequencies that are used by some systems for sending IR remote control signals; 455 kHz (+/- 10%) and/or 1 MHz (+/- 10%) frequencies that are used by other systems for sending IR remote control signals; 2.3 MHz (+/- 10%) and 2.8 MHz (+/- 10%) frequencies that are used by some systems for transmitting IR signals to wireless IR headphones; and 25 kHz - 30 kHz frequencies that are used by some systems for transmitting IR signals to wireless 3D shutter glasses.
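Collected as data, those ranges might look like the table below, with a small helper for checking whether a given LF spectral component lands where another IR system listens. The band edges for the center-frequency entries apply the +/- 10% tolerances quoted above; the function name is a hypothetical convenience.

```python
# Frequency bands listed above (center +/- 10% where the text gives a center).
PROTECTED_BANDS_HZ = [
    (10e3, 100e3),                  # IR remote controls (general)
    (455e3 * 0.9, 455e3 * 1.1),     # IR remote controls (455 kHz carriers)
    (1e6 * 0.9, 1e6 * 1.1),         # IR remote controls (1 MHz carriers)
    (2.3e6 * 0.9, 2.3e6 * 1.1),     # wireless IR headphones
    (2.8e6 * 0.9, 2.8e6 * 1.1),     # wireless IR headphones
    (25e3, 30e3),                   # wireless 3D shutter glasses
]

def in_protected_band(freq_hz: float) -> bool:
    """True if an envelope spectral component falls where other IR systems listen."""
    return any(lo <= freq_hz <= hi for lo, hi in PROTECTED_BANDS_HZ)

assert in_protected_band(36e3)       # a typical remote-control carrier
assert not in_protected_band(100e6)  # the TOF RF carrier itself
```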
  • step 1604 involves detecting at least a portion of the emitted RF modulated IR light that has reflected off one or more objects.
  • depth images are produced in dependence on results of the detecting.
  • the depth images are used to update an application.
  • depth images can be used to track motion and/or other user behavior, which can be used, e.g., to manipulate a game character or other aspects of the application in response to movement of a user's body and/or objects around the user, rather than (or in addition to) using controllers, remotes, keyboards, mice, or the like.
  • a video game system can update the position of images displayed in a video game based on the new positions of the objects or update an avatar based on motion of the user as detected based on depth images.
  • step 1602 can involve ramping up pulse amplitudes of the RF component of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse amplitudes of the RF component of the drive signal when transitioning from a time during which the light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
  • This will result in similar ramping up and ramping down of pulse amplitudes of the light pulses emitted by the light source.
  • step 1602 can involve ramping up pulse duty cycles of the RF component of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse duty cycles of the RF component of the drive signal when transitioning from a time during which the light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
  • This will result in similar ramping up and ramping down of pulse duty cycles of the light pulses emitted by the light source.
  • step 1602 can involve ramping down temporal gaps and/or how often gaps occur between adjacent pulses or pulse trains of the RF component of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping up temporal gaps and/or how often gaps occur between adjacent pulses or pulse trains of the RF component of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
  • This will result in similar ramping down and ramping up of temporal gaps and/or how often gaps occur between light pulses or pulse trains emitted by the light source.
  • step 1602 can involve emitting IR light during integration periods of each frame period as well as during each of the readout periods between pairs of the integration periods within each frame period. Such an embodiment reduces how often there are transitions from times during which IR light is emitted to times during which IR light is not emitted, and vice versa, thereby reducing frequency content associated with such transitions.
  • IR light can also be emitted during portions of inter-frame periods.
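Pulling the steps of FIG. 16 together, a high-level control-flow sketch might look as follows. Every name here is a hypothetical stand-in; the text specifies the steps, not any particular software interface.

```python
def run_tof_frame(shaper, emitter, detector, depth_engine, app):
    """One frame of the method of FIG. 16 (all collaborators hypothetical)."""
    drive = shaper.shaped_drive_signal()        # step 1602: LF-envelope-shaped
    emitter.emit(drive)                         # RF modulated IR light out
    reflections = detector.read_integrations()  # step 1604: detect reflections
    depth_image = depth_engine.compute(reflections)  # produce a depth image
    app.update(depth_image)                     # e.g., move an avatar
```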

Abstract

Embodiments disclosed herein are directed to time-of-flight (TOF) systems, and methods for use therewith, that substantially reduce interference that the TOF system may cause to at least one other system that is configured to wirelessly receive and respond to IR light signals. Some such embodiments involve emitting IR light having a low frequency (LF) power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system that may be in close proximity to the TOF system. Such embodiments can also involve detecting at least a portion of the emitted RF modulated IR light that has reflected off one or more objects. A TOF system can produce depth images in dependence on results of the detecting, as well as update an application in dependence on the depth images.

Description

INTERFERENCE REDUCTION FOR TOF SYSTEMS
BACKGROUND
[0001] Various consumer electronic devices, such as televisions, set top boxes and media players, are configured to be remotely controlled by handheld remote control devices that transmit modulated infrared (IR) remote control signals. Such IR remote control signals typically have a wavelength of about 940nm and typically have a carrier frequency between 10 kHz and 100 kHz, and even more specifically between 30 kHz and 60 kHz. For an even more specific example, many IR remote control signals have a carrier frequency of about 36 kHz (this is not to be confused with the actual frequency of the IR light itself).
[0002] A time-of-flight (TOF) camera, which can also be referred to as a TOF system, may be located in close proximity to (e.g., within the same room as) one or more of the aforementioned consumer electronic devices (e.g., a television, a set top box and/or a media player) that is/are configured to be remotely controlled by a handheld remote control device. For example, a TOF camera may be part of a gaming console that is within the same room as a television, a set top box and/or a DVD player, which can also be referred to as other systems. Such a TOF camera typically operates by illuminating a target with a modulated IR light source and detecting IR light that reflects off the target and is incident on an image pixel detector array of the TOF camera. The IR light source is usually modulated at a relatively high carrier frequency (e.g., about 100 MHz, which is within the radio frequency range) during integration and is typically switched off between frames or captures and during readout. While the carrier frequency of the modulated IR light source is typically well above the carrier frequency of remote control signals, transitions from times during which the light source does not emit the RF modulated light to times during which the light source emits RF modulated light, and vice versa, can produce lower frequency content that can interfere with the remote control signals. Explained another way, a low frequency (LF) power envelope associated with the modulated IR light, produced by the TOF camera, may interfere with remote control signals intended to control another system (e.g., a television) within close proximity to the TOF camera. A vast majority of the interference produced by the TOF camera will not correspond to a valid remote control command, and thus, will be rejected by an IR receiver of the other system (e.g., the television) that is intended to be controlled by remote control signals. However, the interference produced by the TOF camera may be significant enough to prevent a user from being able to actually remotely control the other system (e.g., the television) that is within close proximity to the TOF camera. This can be frustrating to the user, as they may not be able to adjust the volume, brightness, channel, and/or the like, of the other system (e.g., the television) using the remote control device. In other words, a TOF camera can render a remote control device inoperative. Due to relatively poor optical bandpass characteristics of IR receivers of televisions, or other systems, such interference problems may even occur where the IR wavelength used by a TOF camera differs from the IR wavelength used by a remote control device. For example, such interference problems may even occur where the wavelength of the IR light used by the TOF camera is about 860nm and the IR light used by a remote control device is about 940nm. Further, it is noted that a TOF camera may also cause similar interference problems with other systems that receive and respond to wireless IR signals, such as, but not limited to, systems that include wireless IR headphones and three-dimensional (3D) television systems that include active shutter 3D glasses.
SUMMARY
[0003] Certain embodiments disclosed are directed to time-of-flight (TOF) systems, and methods for use therewith, that substantially reduce interference that the TOF system may cause to at least one other system that is configured to wirelessly receive and respond to IR light signals. Some such embodiments involve emitting IR light having a low frequency (LF) power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, wherein at least a portion of IR light being emitted is radio frequency (RF) modulated IR light, and thus, includes an RF component. Such embodiments can also involve detecting at least a portion of the emitted RF modulated IR light that has reflected off one or more objects. A TOF system can produce depth images in dependence on results of the detecting, as well as update an application in dependence on the depth images. A LF power envelope, as the term is used herein, is the LF average power delivered over time by a signal.
[0004] A TOF system can be configured to obtain a separate depth image corresponding to each of a plurality of frame periods, wherein each frame period is followed by an inter- frame period, each frame period includes at least two integration periods, and each integration period is followed by a readout period. IR light can be emitted during each of the integration periods, to enable depth images to be produced. Additionally, to reduce how often there are transitions from times during which IR light is being emitted and times during which IR light is not being emitted, and thereby reduce frequency content associated with the transitions, the IR light can also be emitted during the readout periods between pairs of the integration periods within each frame period.
[0005] In certain embodiments, in order to decrease a gain level of an automatic gain control (AGC) circuit for use with an IR light receiver of at least one other system configured to wirelessly receive and respond to IR light signals, and thereby make the IR light receiver of the at least one other system less sensitive to interference from the TOF system, IR light can be emitted during the readout periods between pairs of the integration periods within each frame period, as well as during at least a portion of the inter-frame periods between pairs of frames. This can be in addition to the IR light that is emitted during the integration periods.
[0006] IR light may be emitted by producing a drive signal including an RF component and having a LF power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, and driving at least one light source with the drive signal including the RF component.
[0007] In an embodiment, the LF power envelope can be shaped by ramping up pulse amplitudes of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse amplitudes of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
[0008] In an embodiment, the LF power envelope can be shaped by ramping up pulse duty cycles of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse duty cycles of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
[0009] In an embodiment, the LF power envelope can be shaped by ramping down temporal gaps between pulses or pulse trains of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping up temporal gaps between pulses or pulse trains of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
[0010] In an embodiment, the LF power envelope can be shaped by ramping down how often gaps occur between pulses or pulse trains of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping up how often gaps occur between pulses or pulse trains of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
[0011] Any of the aforementioned ramping up preferably occurs over a time period of at least 50 μsec, and any of the aforementioned ramping down preferably also occurs over a time period of at least 50 μsec. Time permitting, the ramping up and ramping down may occur over longer periods of time.
[0012] More generally, embodiments of the present technology can be used to reduce the adverse effects that TOF systems may have on other systems that are configured to wirelessly receive and respond to IR light signals, while preserving correct TOF operation.
Such embodiments preferably do not degrade, or minimally degrade, performance of TOF systems. Additionally, such embodiments preferably do not increase, or minimally increase, power usage by the TOF system.
[0013] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] FIGS. 1A and 1B illustrate an example embodiment of a tracking system with a user playing a game.
[0015] FIG. 2A illustrates an example embodiment of a capture device that may be used as part of the tracking system.
[0016] FIG. 2B illustrates an exemplary embodiment of a TOF camera that may be part of the capture device of FIG. 2A.
[0017] FIG. 3 illustrates an example embodiment of a computing system that may be used to track user behavior and update an application based on the user behavior.
[0018] FIG. 4 illustrates another example embodiment of a computing system that may be used to track user behavior and update an application based on the tracked user behavior.
[0019] FIG. 5 illustrates an exemplary depth image.
[0020] FIG. 6 depicts exemplary data in an exemplary depth image.
[0021] FIG. 7 illustrates exemplary timing and amplitude details associated with two exemplary frames of a signal for use with a TOF system.
[0022] FIG. 8 illustrates an exemplary LF frequency power spectrum associated with the signal shown in FIG. 7.
[0023] FIG. 9 illustrates how the LF power spectrum of drive and IR light signals can be shaped, in accordance with an embodiment, to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals.
[0024] FIG. 10 illustrates how the LF power spectrum of drive and IR light signals can be shaped, in accordance with another embodiment, to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals.
[0025] FIG. 11 illustrates how the LF power spectrum of drive and IR light signals can be shaped, in accordance with still another embodiment, to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals.
[0026] FIG. 12 illustrates an embodiment that reduces how often there are transitions from times during which IR light signals are emitted to times during which the IR light signals are not emitted, and vice versa, to thereby reduce certain frequency content associated with such transitions.
[0027] FIG. 13 illustrates an embodiment that combines the embodiments of FIG. 9 and 12.
[0028] FIG. 14 illustrates an embodiment that decreases a gain level set by an automatic gain control (AGC) circuit associated with a receiver of another system in close proximity to a TOF system, and thereby, makes the receiver of the other system less susceptible to interference from the TOF system.
[0029] FIG. 15 illustrates an additional technique for smoothing out LF power envelopes in accordance with an embodiment.
[0030] FIG. 16 is a high level flow diagram that is used to summarize methods according to various embodiments of the present technology.
DETAILED DESCRIPTION
[0031] Certain embodiments of the present technology disclosed herein are directed to TOF systems, and methods for use therewith, that substantially reduce interference that a TOF system may cause to at least one other system (e.g., a television, a set top box, a DVD player, IR headphones and/or active 3D glasses) that is configured to wirelessly receive and respond to IR light signals. Before providing additional details of such embodiments of the present technology, exemplary details of systems with which embodiments of the present technology can be used will first be described.
[0032] FIGS. 1A and 1B illustrate an example embodiment of a tracking system 100 with a user 118 playing a boxing video game. In an example embodiment, the tracking system 100 may be used to recognize, analyze, and/or track a human target such as the user 118 or other objects within range of the tracking system 100. As shown in FIG. 1A, the tracking system 100 includes a computing system 112 and a capture device 120. As will be described in additional detail below, the capture device 120 can be used to obtain depth images and color images (also known as RGB images) that can be used by the computing system 112 to identify one or more users or other objects, as well as to track motion and/or other user behaviors. The tracked motion and/or other user behavior can be used to update an application. Therefore, a user can manipulate game characters or other aspects of the application by using movement of the user's body and/or objects around the user, rather than (or in addition to) using controllers, remotes, keyboards, mice, or the like. For example, a video game system can update the position of images displayed in a video game based on the new positions of the objects or update an avatar based on motion of the user.
[0033] The computing system 112 may be a computer, a gaming system or console, or the like. According to an example embodiment, the computing system 112 may include hardware components and/or software components such that computing system 112 may be used to execute applications such as gaming applications, non-gaming applications, or the like. In one embodiment, computing system 112 may include a processor such as a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions stored on a processor readable storage device for performing the processes described herein.
[0034] The capture device 120 may include, for example, a camera that may be used to visually monitor one or more users, such as the user 118, such that gestures and/or movements performed by the one or more users may be captured, analyzed, and tracked to perform one or more controls or actions within the application and/or animate an avatar or on-screen character, as will be described in more detail below.
[0035] According to one embodiment, the tracking system 100 may be connected to an audiovisual device 116 such as a television, a monitor, a high-definition television (HDTV), or the like that may provide game or application visuals and/or audio to a user such as the user 118. For example, the computing system 112 may include a video adapter such as a graphics card and/or an audio adapter such as a sound card that may provide audiovisual signals associated with the game application, non-game application, or the like. The audiovisual device 116 may receive the audiovisual signals from the computing system 112 and may then output the game or application visuals and/or audio associated with the audiovisual signals to the user 118. According to one embodiment, the audiovisual device 116 may be connected to the computing system 112 via, for example, an S-Video cable, a coaxial cable, an HDMI cable, a DVI cable, a VGA cable, component video cable, or the like.
[0036] As shown in FIGS. 1A and 1B, the tracking system 100 may be used to recognize, analyze, and/or track a human target such as the user 118. For example, the user 118 may be tracked using the capture device 120 such that the gestures and/or movements of user 118 may be captured to animate an avatar or on-screen character and/or may be interpreted as controls that may be used to affect the application being executed by computing system 112. Thus, according to one embodiment, the user 118 may move his or her body to control the application and/or animate the avatar or on-screen character.
[0037] In the example depicted in FIGS. 1A and 1B, the application executing on the computing system 112 may be a boxing game that the user 118 is playing. For example, the computing system 112 may use the audiovisual device 116 to provide a visual representation of a boxing opponent 138 to the user 118. The computing system 112 may also use the audiovisual device 116 to provide a visual representation of a player avatar 140 that the user 118 may control with his or her movements. For example, as shown in FIG. 1B, the user 118 may throw a punch in physical space to cause the player avatar 140 to throw a punch in game space. Thus, according to an example embodiment, the computing system 112 and the capture device 120 recognize and analyze the punch of the user 118 in physical space such that the punch may be interpreted as a game control of the player avatar 140 in game space and/or the motion of the punch may be used to animate the player avatar 140 in game space.
[0038] Other movements by the user 118 may also be interpreted as other controls or actions and/or used to animate the player avatar, such as controls to bob, weave, shuffle, block, jab, or throw a variety of different power punches. Furthermore, some movements may be interpreted as controls that may correspond to actions other than controlling the player avatar 140. For example, in one embodiment, the player may use movements to end, pause, or save a game, select a level, view high scores, communicate with a friend, etc. According to another embodiment, the player may use movements to select the game or other application from a main user interface. Thus, in example embodiments, a full range of motion of the user 118 may be available, used, and analyzed in any suitable manner to interact with an application.
[0039] In example embodiments, the human target such as the user 118 may have an object. In such embodiments, the user of an electronic game may be holding the object such that the motions of the player and the object may be used to adjust and/or control parameters of the game. For example, the motion of a player holding a racket may be tracked and utilized for controlling an on-screen racket in an electronic sports game. In another example embodiment, the motion of a player holding an object may be tracked and utilized for controlling an on-screen weapon in an electronic combat game. Objects not held by the user can also be tracked, such as objects thrown, pushed or rolled by the user (or a different user) as well as self-propelled objects. In addition to boxing, other games can also be implemented.
[0040] According to other example embodiments, the tracking system 100 may further be used to interpret target movements as operating system and/or application controls that are outside the realm of games. For example, virtually any controllable aspect of an operating system and/or application may be controlled by movements of the target such as the user 118.
[0041] FIG. 2A illustrates an example embodiment of the capture device 120 that may be used in the tracking system 100. According to an example embodiment, the capture device 120 may be configured to capture video with depth information including a depth image that may include depth values via any suitable technique including, for example, time-of-flight, structured light, stereo image, or the like. According to one embodiment, the capture device 120 may organize the depth information into "Z layers", or layers that may be perpendicular to a Z axis extending from the depth camera along its line of sight.
[0042] As shown in FIG. 2A, the capture device 120 may include an image camera component 222. According to an example embodiment, the image camera component 222 may be a depth camera that may capture a depth image of a scene. The depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel in the 2-D pixel area may represent a depth value such as a distance in, for example, centimeters, millimeters, or the like of an object in the captured scene from the camera.
[0043] As shown in FIG. 2A, according to an example embodiment, the image camera component 222 may include an infra-red (IR) light component 224, a three-dimensional (3-D) camera 226, and an RGB camera 228 that may be used to capture the depth image of a scene. For example, in time-of-flight (TOF) analysis, the IR light component 224 of the capture device 120 may emit an infrared light onto the scene and may then use sensors (not specifically shown in FIG. 2A) to detect the backscattered light from the surface of one or more targets and objects in the scene using, for example, the 3-D camera 226 and/or the RGB camera 228. In some embodiments, pulsed IR light may be used such that the time between an outgoing light pulse and a corresponding incoming light pulse may be measured and used to determine a physical distance from the capture device 120 to a particular location on the targets or objects in the scene. Additionally or alternatively, the phase of the outgoing light wave may be compared to the phase of the incoming light wave to determine a phase shift. The phase shift may then be used to determine a physical distance from the capture device to a particular location on the targets or objects. Additional details of an exemplary TOF type of 3-D camera 226 are described below with reference to FIG. 2B.
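For the phase-comparison variant of TOF analysis described above, the distance recovery can be written down directly: a round trip delays the received waveform by Δφ = 2π·f·(2d/c), so d = c·Δφ/(4π·f). Below is a small sketch, assuming a single modulation frequency and an already-measured phase shift; it is an editorial illustration, not the disclosed implementation.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def phase_to_distance_m(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """One-way distance from the emitted-vs-received phase shift.

    The round trip delays the wave by phase/(2*pi*f); halving the implied
    path gives the one-way distance. Unambiguous only within c / (2*f).
    """
    return (C * phase_shift_rad) / (4.0 * math.pi * mod_freq_hz)

# At a 100 MHz modulation frequency, a quarter-cycle shift (pi/2) is ~0.375 m,
# and the unambiguous range is c / (2 * 100e6), roughly 1.5 m.
print(round(phase_to_distance_m(math.pi / 2, 100e6), 3))
```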
[0044] According to another example embodiment, TOF analysis may be used to indirectly determine a physical distance from the capture device 120 to a particular location on the targets or objects by analyzing the intensity of the reflected beam of light over time via various techniques including, for example, shuttered light pulse imaging.
[0045] In another example embodiment, the capture device 120 may use a structured light to capture depth information. In such an analysis, patterned light (i.e., light displayed as a known pattern such as a grid pattern, a stripe pattern, or a different pattern) may be projected onto the scene via, for example, the IR light component 224. Upon striking the surface of one or more targets or objects in the scene, the pattern may become deformed in response. Such a deformation of the pattern may be captured by, for example, the 3-D camera 226 and/or the RGB camera 228 and may then be analyzed to determine a physical distance from the capture device to a particular location on the targets or objects. In some implementations, the IR light component 224 is displaced from the cameras 226 and 228 so that triangulation can be used to determine distance from cameras 226 and 228. In some implementations, the capture device 120 will include a dedicated IR sensor to sense the IR light.
[0046] According to another embodiment, the capture device 120 may include two or more physically separated cameras that may view a scene from different angles to obtain visual stereo data that may be resolved to generate depth information. Other types of depth image sensors can also be used to create a depth image.
[0047] The capture device 120 may further include a microphone 230. The microphone 230 may include a transducer or sensor that may receive and convert sound into an electrical signal. According to one embodiment, the microphone 230 may be used to reduce feedback between the capture device 120 and the computing system 112 in the target recognition, analysis, and tracking system 100. Additionally, the microphone 230 may be used to receive audio signals (e.g., voice commands) that may also be provided by the user to control applications such as game applications, non-game applications, or the like that may be executed by the computing system 112.
[0048] In an example embodiment, the capture device 120 may further include a processor 232 that may be in operative communication with the image camera component 222. The processor 232 may include a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions including, for example, instructions for receiving a depth image, generating the appropriate data format (e.g., frame) and transmitting the data to computing system 112.
[0049] The capture device 120 may further include a memory component 234 that may store the instructions that may be executed by the processor 232, images or frames of images captured by the 3-D camera and/or RGB camera, or any other suitable information, images, or the like. According to an example embodiment, the memory component 234 may include random access memory (RAM), read only memory (ROM), cache, Flash memory, a hard disk, or any other suitable storage component. As shown in FIG. 2A, in one embodiment, the memory component 234 may be a separate component in communication with the image capture component 222 and the processor 232. According to another embodiment, the memory component 234 may be integrated into the processor 232 and/or the image capture component 222.
[0050] As shown in FIG. 2A, the capture device 120 may be in communication with the computing system 112 via a communication link 236. The communication link 236 may be a wired connection including, for example, a USB connection, a Firewire connection, an Ethernet cable connection, or the like and/or a wireless connection such as a wireless 802.11b, g, a, or n connection. According to one embodiment, the computing system 112 may provide a clock to the capture device 120 that may be used to determine when to capture, for example, a scene via the communication link 236. Additionally, the capture device 120 provides the depth images and color images captured by, for example, the 3-D camera 226 and/or the RGB camera 228 to the computing system 112 via the communication link 236. In one embodiment, the depth images and color images are transmitted at 30 frames per second. The computing system 112 may then use the model, depth information, and captured images to, for example, control an application such as a game or word processor and/or animate an avatar or on-screen character.
[0051] Computing system 112 includes gestures library 240, structure data 242, depth image processing and object reporting module 244 and application 246. Depth image processing and object reporting module 244 uses the depth images to track motion of objects, such as the user and other objects. To assist in the tracking of the objects, depth image processing and object reporting module 244 uses gestures library 240 and structure data 242.
[0052] Structure data 242 includes structural information about objects that may be tracked. For example, a skeletal model of a human may be stored to help understand movements of the user and recognize body parts. Structural information about inanimate objects may also be stored to help recognize those objects and help understand movement.
[0053] Gestures library 240 may include a collection of gesture filters, each comprising information concerning a gesture that may be performed by the skeletal model (as the user moves). The data captured by the cameras 226, 228 and the capture device 120 in the form of the skeletal model and movements associated with it may be compared to the gesture filters in the gesture library 240 to identify when a user (as represented by the skeletal model) has performed one or more gestures. Those gestures may be associated with various controls of an application. Thus, the computing system 112 may use the gestures library 240 to interpret movements of the skeletal model and to control application 246 based on the movements. As such, gestures library may be used by depth image processing and object reporting module 244 and application 246.
[0054] Application 246 can be a video game, productivity application, etc. In one embodiment, depth image processing and object reporting module 244 will report to application 246 an identification of each object detected and the location of the object for each frame. Application 246 will use that information to update the position or movement of an avatar or other images in the display.
[0055] FIG. 2B illustrates an example embodiment of a TOF type of 3-D camera 226, which can also be referred to as a TOF camera 226, or more generally can be referred to as a TOF system 226. The TOF system 226 is shown as including a driver 260 that drives a light source 250. The light source 250 can be the IR light component 224 shown in FIG. 2A, or can be one or more other light emitting elements. More generally, the light source 250 can include one or more light emitting elements, such as, but not limited to, laser diodes or light emitting diodes (LEDs). A laser diode can include one or more vertical-cavity surface-emitting lasers (VCSELs) or edge emitting lasers, but is not limited thereto. It is also possible that there are multiple types of light sources, e.g., a first light source including one or more laser diodes, and a second light source including one or more LEDs. While it is likely that such light emitting elements emit IR light, light of alternative wavelengths can alternatively be emitted by the light emitting elements. Unless stated otherwise, it is assumed that the light source 250 emits IR light.
[0056] The TOF system 226 is also shown as including a clock signal generator 262, which produces a clock signal that is provided to the driver 260. Additionally, the TOF system 226 is shown as including a microprocessor 264 that can control the clock signal generator 262 and/or the driver 260. The TOF system 226 is also shown as including an image pixel detector array 268, readout circuitry 270 and memory 266. The image pixel detector array 268 might include, e.g., 320 x 240 image pixel detectors, but is not limited thereto. Each image pixel detector can be, e.g., a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor, but is not limited thereto. Depending upon implementation, each image pixel detector can have its own dedicated readout circuit, or readout circuitry can be shared by many image pixel detectors. In accordance with certain embodiments, the components of the TOF system 226 shown within the block 280 are implemented in a single integrated circuit (IC), which can also be referred to as a single TOF chip.
[0057] The driver 260 can produce a radio frequency (RF) modulated drive signal in dependence on a clock signal received from clock signal generator 262. Accordingly, the driver 260 can include, for example, one or more buffers, amplifiers and/or modulators, but is not limited thereto. The clock signal generator 262 can include, for example, one or more reference clocks and/or voltage controlled oscillators, but is not limited thereto. The microprocessor 264, which can be part of a microcontroller unit, can be used to control the clock signal generator 262 and/or the driver 260. For example, the microprocessor 264 can access waveform information stored in the memory 266 in order to produce an RF modulated drive signal in accordance with various embodiments described herein. The TOF system 226 can include its own memory 266 and microprocessor 264, as shown in FIG. 2B. Alternatively, or additionally, the processor 232 and/or memory 234 of the capture device 120 can be used to control aspects of the TOF system 226.
[0058] In response to being driven by an RF modulated drive signal, the light source 250 emits RF modulated light, which can also be referred to as an RF modulated light signal. For example, a carrier frequency of the RF modulated drive signal and the RF modulated light can be in a range from about 5 MHz to many hundreds of MHz, but for illustrative purposes will be assumed to be about 100 MHz. The light emitted by the light source 250 is transmitted through an optional lens or light shaping diffuser 252 towards a target object (e.g., a user 118). Assuming that there is a target object within the field of view of the TOF camera, a portion of the RF modulated emitted light reflects off the target object, passes through an aperture field stop and lens (collectively 272), and is incident on the image pixel detector array 268 where an image is formed. In some implementations, each individual image pixel detector of the array 268 produces an integration value indicative of a magnitude and a phase of detected RF modulated light originating from the light source that has reflected off the object and is incident on the image pixel detector. Such integration values, or more generally TOF information, enable distances (Z) to be determined, and collectively, enable depth images to be produced. In certain embodiments, optical energy from the light source 250 and detected optical energy signals are synchronized to each other such that a phase difference, and thus a distance Z, can be measured from each image pixel detector. The readout circuitry 270 converts analog integration values generated by the image pixel detector array 268 into digital readout signals, which are provided to the microprocessor 264 and/or the memory 266, and which can be used to produce depth images.
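For illustration only (and not as part of the disclosed apparatus), the phase-to-distance relationship just described can be sketched in a few lines of Python; the 100 MHz carrier frequency follows the example above, and the function name is an assumption:

```python
import numpy as np

C = 299_792_458.0  # speed of light, in m/s

def tof_distance_m(phase_rad: float, f_mod_hz: float = 100e6) -> float:
    # The measured phase shift corresponds to the round trip (2 * Z), so
    # Z = c * phi / (4 * pi * f_mod), where f_mod is the RF carrier frequency.
    return C * phase_rad / (4.0 * np.pi * f_mod_hz)

# Example: a 90-degree phase shift at a 100 MHz carrier is about 0.375 m.
print(tof_distance_m(np.pi / 2))
```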
[0059] FIG. 3 illustrates an example embodiment of a computing system that may be the computing system 112 shown in FIGS. 1A-2B used to track motion and/or animate (or otherwise update) an avatar or other on-screen object displayed by an application. The computing system such as the computing system 112 described above with respect to FIGS. 1A-2B may be a multimedia console, such as a gaming console. As shown in FIG. 3, the multimedia console 300 has a central processing unit (CPU) 301 having a level 1 cache 302, a level 2 cache 304, and a flash ROM (Read Only Memory) 306. The level 1 cache 302 and the level 2 cache 304 temporarily store data and hence reduce the number of memory access cycles, thereby improving processing speed and throughput. The CPU 301 may be provided having more than one core, and thus, additional level 1 and level 2 caches 302 and 304. The flash ROM 306 may store executable code that is loaded during an initial phase of a boot process when the multimedia console 300 is powered ON.
[0060] A graphics processing unit (GPU) 308 and a video encoder/video codec (coder/decoder) 314 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 308 to the video encoder/video codec 314 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 340 for transmission to a television or other display. A memory controller 310 is connected to the GPU 308 to facilitate processor access to various types of memory 312, such as, but not limited to, a RAM (Random Access Memory).
[0061] The multimedia console 300 includes an I/O controller 320, a system management controller 322, an audio processing unit 323, a network interface 324, a first USB host controller 326, a second USB controller 328 and a front panel I/O subassembly 330 that are preferably implemented on a module 318. The USB controllers 326 and 328 serve as hosts for peripheral controllers 342(1)-342(2), a wireless adapter 348, and an external memory device 346 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc.). The network interface 324 and/or wireless adapter 348 provide access to a network (e.g., the Internet, home network, etc.) and may be any of a wide variety of various wired or wireless adapter components including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like.
[0062] System memory 343 is provided to store application data that is loaded during the boot process. A media drive 344 is provided and may comprise a DVD/CD drive, Blu-Ray drive, hard disk drive, or other removable media drive, etc. The media drive 344 may be internal or external to the multimedia console 300. Application data may be accessed via the media drive 344 for execution, playback, etc. by the multimedia console 300. The media drive 344 is connected to the I/O controller 320 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394).
[0063] The system management controller 322 provides a variety of service functions related to assuring availability of the multimedia console 300. The audio processing unit 323 and an audio codec 332 form a corresponding audio processing pipeline with high fidelity and stereo processing. Audio data is carried between the audio processing unit 323 and the audio codec 332 via a communication link. The audio processing pipeline outputs data to the A/V port 340 for reproduction by an external audio player or device having audio capabilities.
[0064] The front panel I/O subassembly 330 supports the functionality of the power button 350 and the eject button 352, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the multimedia console 300. A system power supply module 336 provides power to the components of the multimedia console 300. A fan 338 cools the circuitry within the multimedia console 300.
[0065] The CPU 301, GPU 308, memory controller 310, and various other components within the multimedia console 300 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can include a Peripheral Component Interconnect (PCI) bus, PCI-Express bus, etc.
[0066] When the multimedia console 300 is powered ON, application data may be loaded from the system memory 343 into memory 312 and/or caches 302, 304 and executed on the CPU 301. The application may present a graphical user interface that provides a consistent user experience when navigating to different media types available on the multimedia console 300. In operation, applications and/or other media contained within the media drive 344 may be launched or played from the media drive 344 to provide additional functionalities to the multimedia console 300.
[0067] The multimedia console 300 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the multimedia console 300 allows one or more users to interact with the system, watch movies, or listen to music. However, with the integration of broadband connectivity made available through the network interface 324 or the wireless adapter 348, the multimedia console 300 may further be operated as a participant in a larger network community.
[0068] When the multimedia console 300 is powered ON, a set amount of hardware resources are reserved for system use by the multimedia console operating system. These resources may include a reservation of memory (e.g., 16MB), CPU and GPU cycles (e.g., 5%), networking bandwidth (e.g., 8 Kbps), etc. Because these resources are reserved at system boot time, the reserved resources do not exist from the application's view.
[0069] In particular, the memory reservation preferably is large enough to contain the launch kernel, concurrent system applications and drivers. The CPU reservation is preferably constant such that if the reserved CPU usage is not used by the system applications, an idle thread will consume any unused cycles.

[0070] With regard to the GPU reservation, lightweight messages generated by the system applications (e.g., popups) are displayed by using a GPU interrupt to schedule code to render a popup into an overlay. The amount of memory required for an overlay depends on the overlay area size, and the overlay preferably scales with screen resolution. Where a full user interface is used by the concurrent system application, it is preferable to use a resolution independent of the application resolution. A scaler may be used to set this resolution such that the need to change frequency and cause a TV resynch is eliminated.
[0071] After the multimedia console 300 boots and system resources are reserved, concurrent system applications execute to provide system functionalities. The system functionalities are encapsulated in a set of system applications that execute within the reserved system resources described above. The operating system kernel identifies threads that are system application threads versus gaming application threads. The system applications are preferably scheduled to run on the CPU 301 at predetermined times and intervals in order to provide a consistent system resource view to the application. The scheduling is intended to minimize cache disruption for the gaming application running on the console.
[0072] When a concurrent system application requires audio, audio processing is scheduled asynchronously to the gaming application due to time sensitivity. A multimedia console application manager (described below) controls the gaming application audio level (e.g., mute, attenuate) when system applications are active.
[0073] Input devices (e.g., controllers 342(1) and 342(2)) are shared by gaming applications and system applications. The input devices are not reserved resources, but are to be switched between system applications and the gaming application such that each will have a focus of the device. The application manager preferably controls the switching of the input stream without the gaming application's knowledge, and a driver maintains state information regarding focus switches. The cameras 226, 228 and capture device 120 may define additional input devices for the console 300 via USB controller 326 or other interface.
[0074] FIG. 4 illustrates another example embodiment of a computing system 420 that may be the computing system 112 shown in FIGS. 1A-2B used to track motion and/or animate (or otherwise update) an avatar or other on-screen object displayed by an application. The computing system 420 is only one example of a suitable computing system and is not intended to suggest any limitation as to the scope of use or functionality of the presently disclosed subject matter. Neither should the computing system 420 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary computing system 420. In some embodiments the various depicted computing elements may include circuitry configured to instantiate specific aspects of the present disclosure. For example, the term circuitry used in the disclosure can include specialized hardware components configured to perform function(s) by firmware or switches. In other example embodiments the term circuitry can include a general purpose processing unit, memory, etc., configured by software instructions that embody logic operable to perform function(s). In example embodiments where circuitry includes a combination of hardware and software, an implementer may write source code embodying logic and the source code can be compiled into machine readable code that can be processed by the general purpose processing unit. Since one skilled in the art can appreciate that the state of the art has evolved to a point where there is little difference between hardware, software, or a combination of hardware/software, the selection of hardware versus software to effectuate specific functions is a design choice left to an implementer. More specifically, one of skill in the art can appreciate that a software process can be transformed into an equivalent hardware structure, and a hardware structure can itself be transformed into an equivalent software process. Thus, the selection of a hardware implementation versus a software implementation is one of design choice and left to the implementer.
[0075] Computing system 420 comprises a computer 441, which typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 441 and includes both volatile and nonvolatile media, removable and non-removable media. The system memory 422 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 423 and random access memory (RAM) 460. A basic input/output system 424 (BIOS), containing the basic routines that help to transfer information between elements within computer 441, such as during start-up, is typically stored in ROM 423. RAM 460 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 459. By way of example, and not limitation, FIG. 4 illustrates operating system 425, application programs 426, other program modules 427, and program data 428.
[0076] The computer 441 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 4 illustrates a hard disk drive 438 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 439 that reads from or writes to a removable, nonvolatile magnetic disk 454, and an optical disk drive 440 that reads from or writes to a removable, nonvolatile optical disk 453 such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 438 is typically connected to the system bus 421 through a non-removable memory interface such as interface 434, and magnetic disk drive 439 and optical disk drive 440 are typically connected to the system bus 421 by a removable memory interface, such as interface 435.
[0077] The drives and their associated computer storage media discussed above and illustrated in FIG. 4 provide storage of computer readable instructions, data structures, program modules and other data for the computer 441. In FIG. 4, for example, hard disk drive 438 is illustrated as storing operating system 458, application programs 457, other program modules 456, and program data 455. Note that these components can either be the same as or different from operating system 425, application programs 426, other program modules 427, and program data 428. Operating system 458, application programs 457, other program modules 456, and program data 455 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 441 through input devices such as a keyboard 451 and pointing device 452, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 459 through a user input interface 436 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). The cameras 226, 228 and capture device 120 may define additional input devices for the computing system 420 that connect via user input interface 436. A monitor 442 or other type of display device is also connected to the system bus 421 via an interface, such as a video interface 432. In addition to the monitor, computers may also include other peripheral output devices such as speakers 444 and printer 443, which may be connected through an output peripheral interface 433. The capture device 120 may connect to computing system 420 via output peripheral interface 433, network interface 437, or other interface.

[0078] The computer 441 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 446. The remote computer 446 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 441, although only a memory storage device 447 has been illustrated in FIG. 4. The logical connections depicted include a local area network (LAN) 445 and a wide area network (WAN) 449, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
[0079] When used in a LAN networking environment, the computer 441 is connected to the LAN 445 through a network interface 437. When used in a WAN networking environment, the computer 441 typically includes a modem 450 or other means for establishing communications over the WAN 449, such as the Internet. The modem 450, which may be internal or external, may be connected to the system bus 421 via the user input interface 436, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 441, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 4 illustrates application programs 448 as residing on memory device 447. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
[0080] As explained above, the capture device 120 provides RGB images (also known as color images) and depth images to the computing system 112. The depth image may be a plurality of observed pixels where each observed pixel has an observed depth value. For example, the depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel in the 2-D pixel area may have a depth value such as a length or distance in, for example, centimeters, millimeters, or the like of an object in the captured scene from the capture device.
[0081] FIG. 5 illustrates an example embodiment of a depth image that may be received at computing system 112 from capture device 120. According to an example embodiment, the depth image may be an image and/or frame of a scene captured by, for example, the 3-D camera 226 and/or the RGB camera 228 of the capture device 120 described above with respect to FIG. 2A. As shown in FIG. 5, the depth image may include a human target corresponding to, for example, a user such as the user 118 described above with respect to FIGS. 1A and 1B and one or more non-human targets such as a wall, a table, a monitor, or the like in the captured scene. The depth image may include a plurality of observed pixels where each observed pixel has an observed depth value associated therewith. For example, the depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel at a particular x-value and y-value in the 2-D pixel area may have a depth value such as a length or distance in, for example, centimeters, millimeters, or the like of a target or object in the captured scene from the capture device. In other words, a depth image can specify, for each of the pixels in the depth image, a pixel location and a pixel depth. Following a segmentation process, each pixel in the depth image can also have a segmentation value associated with it. The pixel location can be indicated by an x-position value (i.e., a horizontal value) and a y-position value (i.e., a vertical value). The pixel depth can be indicated by a z-position value (also referred to as a depth value), which is indicative of a distance between the capture device (e.g., 120) used to obtain the depth image and the portion of the user represented by the pixel. The segmentation value is used to indicate whether a pixel corresponds to a specific user, or does not correspond to a user.
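As a minimal sketch of the per-pixel data just described (the record layout and field names are illustrative assumptions, not the capture device's actual format):

```python
from dataclasses import dataclass

@dataclass
class DepthPixel:
    """One observed pixel of a depth image, per the description above."""
    x: int             # horizontal pixel location (x-position value)
    y: int             # vertical pixel location (y-position value)
    z: int             # depth value: distance from the capture device, e.g., in mm
    segmentation: int  # 0 if the pixel corresponds to no user; else a user index

pixel = DepthPixel(x=160, y=120, z=2350, segmentation=1)
```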
[0082] In one embodiment, the depth image may be colorized or grayscale such that different colors or shades of the pixels of the depth image correspond to and/or visually depict different distances of the targets from the capture device 120. Upon receiving the image, one or more high-variance and/or noisy depth values may be removed and/or smoothed from the depth image; portions of missing and/or removed depth information may be filled in and/or reconstructed; and/or any other suitable processing may be performed on the received depth image.
[0083] FIG. 6 provides another view/representation of a depth image (not corresponding to the same example as FIG. 5). The view of FIG. 6 shows the depth data for each pixel as an integer that represents the distance from the target to the capture device 120 for that pixel. The example depth image of FIG. 6 shows 24x24 pixels; however, it is likely that a depth image of greater resolution would be used.
Techniques for reducing IR remote control interference caused by TOF systems
[0084] As mentioned above, a TOF system (e.g., 226) may be located in close proximity to (e.g., within the same room as) a consumer electronic device (e.g., a television, a set top box and/or a media player) that is/are configured to be remotely controlled by a handheld remote control device. For example, referring back to FIGS. 1A and 2A, the capture device 120 can include a TOF system 226 that is located close to the television or display 116. Additionally, or alternatively, a TOF system may be located in close proximity to other types of systems configured to wirelessly receive and respond to IR light signals, such as, but not limited to, systems that include wireless IR headphones and 3D television systems that include active shutter 3D glasses. As also explained above, the TOF system may operate by illuminating a target (e.g., user 118) with RF modulated IR light and detecting IR light that reflects off the target and is incident on an image pixel detector array of the TOF camera. While the carrier frequency of the RF modulated IR light is typically well above the carrier frequency of remote control signals, abrupt transitions from times during which the light source does not emit the RF modulated light to times during which the light source emits RF modulated light, and vice versa, can produce lower frequency content that can interfere with the remote control signals. Explained another way, low frequency content associated with the modulated IR light, produced by the TOF system, may interfere with remote control signals intended to control another device (e.g., a television) within the vicinity of the TOF system. While most if not all of the interference produced by the TOF system will not correspond to a valid remote control command (and thus, will be rejected by a remote control receiver of the other device as an invalid command), the interference produced by the TOF system may be significant enough to prevent a user from being able to actually remotely control the other device (e.g., the television or display 116) that is within close proximity to the TOF system. The low frequency content produced by the TOF system can similarly interfere with other types of systems that are configured to wirelessly receive and respond to IR light signals. Certain embodiments of the present technology described below, which are for use with a TOF system, substantially reduce frequency content within the range of frequencies known to be used by remote control devices and/or within one or more other frequency ranges known to be used by other systems configured to wirelessly receive and respond to IR light signals. Accordingly, such embodiments enable other systems to operate in their intended manner when in close proximity to a TOF system. For a more specific example, such embodiments enable consumer electronic devices to be remotely controlled even though they are in close proximity to a TOF system.
[0085] Most IR remote control signals have a carrier frequency between 10 kHz and 100 kHz, and even more specifically between 30 kHz and 60 kHz. Certain remote control devices, for example, transmit IR remote control signals having a carrier frequency of about 36 kHz (this is not to be confused with the actual frequency of the IR light itself). There are also some systems that utilize IR remote control signals having a carrier frequency of about 455 kHz. Still other systems utilize IR remote control signals having a carrier frequency of about 1 MHz. A consumer electronic device (e.g., television, set top box or media player) that is controllable by remote control signals includes a remote control receiver that is configured to receive and decode remote control signals within an expected frequency range, examples of which were discussed above.
[0086] Before describing various embodiments of the present technology, FIG. 7 will first be used to describe a typical RF modulated drive signal generated by a TOF system and a typical RF modulated IR light signal emitted by the TOF system. More specifically, FIG. 7 illustrates exemplary pulse timing and pulse amplitude details associated with two exemplary frames of signals for use with a TOF system. The frames shown in FIG. 7 can be for use with a TOF system that, for example, is configured to obtain a separate depth image corresponding to each of a plurality of frames, which can also be referred to as frame periods. The waveforms shown in FIG. 7 are illustrative of both an RF modulated drive signal used to drive an IR light source, as well as an RF modulated IR light signal produced by (and more specifically, emitted by) the light source being driven by the RF modulated drive signal.
[0087] As shown in FIG. 7, each frame period is followed by an inter-frame period, which separates the frame period from the next frame period. Depending upon implementation, the length of each frame period may or may not be the same as the length of each inter-frame period. Each frame period includes at least two integration periods, and each integration period is followed by a readout period. For a more specific example, a frame period may include ten integration periods, each of which is followed by a respective one of ten readout periods. Except for the last readout period of a frame period, each of the readout periods separates a pair of the integration periods of the frame period. The frame rate can be, for example, 30 Hz, but is not limited thereto. Where the frame rate is 30 Hz, each frame period plus inter-frame period pair is about 33.33 msec.
[0088] Still referring to FIG. 7, each of the integration periods is shown as including numerous pulses having the same pulse amplitude, and each of the readout periods and inter-frame periods is shown as including no pulses. As mentioned above, FIG. 7 is illustrative of an RF modulated drive signal, as well as the RF modulated IR light signal generated by driving an IR light source with the RF modulated drive signal. Assuming that the pulse frequency is about 100 MHz, the pulse frequency is well above the frequency range known to be used by most other systems that are configured to wirelessly receive and respond to IR light signals. Nevertheless, the LF power envelope of the RF modulated IR light includes significant frequency content within the frequency ranges known to be used by other systems that are configured to wirelessly receive and respond to IR light signals, examples of which were discussed above. Such frequency content is primarily due to the abrupt transitions from times during which the light source is not driven by the RF modulated drive signal to times during which the light source is driven by the RF modulated drive signal, and vice versa, which occur at the beginning and end of each of the integration periods. The shaded areas labeled 702 in FIG. 7 are illustrative of the LF power envelopes of the RF modulated drive signal and the RF modulated light signal produced using the drive signal. An LF power envelope, as the term is used herein, is the LF average power delivered over time by a signal.
[0089] FIG. 8 illustrates an exemplary LF frequency power spectrum associated with the signal shown in FIG. 7. As can be appreciated from FIG. 8, there is significant frequency content within the 10 kHz to 100 kHz frequency range known to be used by remote control devices. It is this spectral energy produced by a TOF system that results, for example, in interference that may prevent a system (e.g., a television or display) within close proximity to the TOF system from being remotely controlled using IR remote control signals. Although not shown in FIG. 8, since FIG. 8 only shows LF content, the power spectrum associated with the signal shown in FIG. 7 will also include a peak at the carrier frequency, e.g., at 100 MHz.
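The presence of such LF content can be reproduced numerically. The Python sketch below models only the rectangular LF power envelope of FIG. 7 (on during integration periods, off during readout and inter-frame periods) and inspects its spectrum in the 10 kHz to 100 kHz band; the sample rate and period lengths are assumptions chosen purely for illustration:

```python
import numpy as np

fs = 10e6                                        # envelope sample rate, Hz (assumed)
frame = np.zeros(int(fs * 33.33e-3))             # one frame plus inter-frame interval
n_int, n_read = int(fs * 1e-3), int(fs * 2e-3)   # assumed period lengths
pos = 0
for _ in range(10):                              # ten integration periods per frame
    frame[pos:pos + n_int] = 1.0                 # abrupt on/off edges, as in FIG. 7
    pos += n_int + n_read

spectrum = np.abs(np.fft.rfft(frame)) / len(frame)
freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
band = (freqs >= 10e3) & (freqs <= 100e3)
print("peak LF content in the 10-100 kHz band:", spectrum[band].max())
```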
[0090] Certain embodiments of the present technology, which are described below, smooth out the edges of the LF power envelopes of the drive and IR light signals. This has the effect of substantially reducing frequency content within the frequency ranges known to be used by remote controlled devices and other systems configured to wirelessly receive and respond to IR light signals.
[0091] A first embodiment for smoothing out the edges of the LF power envelopes, which is illustrated in FIG. 9, involves ramping up the pulse amplitudes of an RF component of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light. This embodiment also includes ramping down pulse amplitudes of the RF component of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light. By driving the light source with the drive signal having pulse amplitudes that ramp up and thereafter ramp down, the RF component of the IR light signal emitted by the light source will also have pulse amplitudes that ramp up and thereafter ramp down. The shaded areas labeled 902 in FIG. 9 are illustrative of the LF power envelopes of the drive signal and the IR light signal produced using the drive signal. The ramping up of the pulse amplitudes should occur over a period of at least 50 μsec, and the ramping down of the pulse amplitudes should occur over a period of at least 50 μsec. Time permitting, the ramping up of the pulse amplitudes preferably occurs over a period between approximately 1 msec and 10 msec, and the ramping down similarly preferably occurs over a period between approximately 1 msec and 10 msec, which should ensure a substantial reduction in frequency content between 10 kHz and 100 kHz, which is the frequency range typically used to transmit IR remote control signals. It is noted that FIG. 9 is included for illustrative purposes, but is not drawn to scale, since the ramping up and down of pulse amplitudes will occur over much more than three or four pulses.
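A minimal sketch of the amplitude-ramping idea follows; the linear ramp profile and pulse counts are assumptions (a 50 μsec ramp at a 100 MHz carrier spans roughly 5,000 pulses), not a mandated implementation:

```python
import numpy as np

def ramped_pulse_amplitudes(n_pulses: int, n_ramp: int) -> np.ndarray:
    # Per-pulse amplitude weights for one integration period: a linear
    # ramp up, a flat middle at full amplitude, and a linear ramp down.
    amps = np.ones(n_pulses)
    ramp = np.linspace(0.0, 1.0, n_ramp, endpoint=False)
    amps[:n_ramp] = ramp
    amps[n_pulses - n_ramp:] = ramp[::-1]
    return amps

# E.g., a 1 msec integration period at 100 MHz with 50 usec ramps at each edge.
weights = ramped_pulse_amplitudes(n_pulses=100_000, n_ramp=5_000)
```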
[0092] There are various different ways to implement the embodiment described with reference to FIG. 9. For example, referring back to FIG. 2B, the microprocessor 264 can control the clock signal generator 262 to produce clock pulses having pulse amplitudes that ramp up and ramp down. Alternatively, the microprocessor 264 can control the driver 260 to produce an RF modulated drive signal that includes pulse amplitudes that ramp up and ramp down. For example, the driver 260 can include a pulse amplitude modulator that is controlled by the microprocessor 264. The microprocessor 264 may access information regarding pulse amplitudes from the memory 266. Such a pulse amplitude modulator can be implemented, e.g., using an amplifier having an adjustable gain that is controlled by the microcontroller. These are just a few examples, which are not meant to be all encompassing.
[0093] A second embodiment for smoothing out the edges of the LF power envelopes, which is illustrated in FIG. 10, involves ramping up the pulse duty cycles of an RF component of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light. This embodiment also includes ramping down pulse duty cycles of the RF component of the drive signal when transitioning from a time during which the light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light. By driving the light source with the drive signal having pulse duty cycles that ramp up and thereafter ramp down, the RF component of the light signal emitted by the light source will also have pulse duty cycles that ramp up and thereafter ramp down. The shaded areas labeled 1002 in FIG. 10 are illustrative of the LF power envelopes of the drive signal and the IR light signal produced using the drive signal. The ramping up of the pulse duty cycles should occur over a period of at least 50 μsec, and the ramping down of the pulse duty cycles should occur over a period of at least 50 μsec. Time permitting, the ramping up of the pulse duty cycles preferably occurs over a period between approximately 1 msec and 10 msec, and the ramping down similarly occurs over a period between approximately 1 msec and 10 msec, which should ensure a substantial reduction in frequency content between 10 kHz and 100 kHz, which is the frequency range typically used to transmit IR remote control signals. It is noted that FIG. 10 is included for illustrative purposes, but is not drawn to scale, since the ramping up and down of pulse duty cycles will occur over much more than three or four pulses.
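A corresponding sketch for the duty-cycle variant (the 50% nominal duty cycle and the linear ramp shape are assumptions); each entry could feed a pulse width modulator, since the average emitted power scales with the duty cycle:

```python
import numpy as np

def ramped_duty_cycles(n_pulses: int, n_ramp: int, d_nominal: float = 0.5) -> np.ndarray:
    # Per-pulse duty cycles for one integration period: ramp from 0 up to
    # d_nominal at the start and back down to 0 at the end.
    duty = np.full(n_pulses, d_nominal)
    ramp = np.linspace(0.0, d_nominal, n_ramp, endpoint=False)
    duty[:n_ramp] = ramp
    duty[n_pulses - n_ramp:] = ramp[::-1]
    return duty
```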
[0094] There are various different ways to implement the embodiment described with reference to FIG. 10. For example, referring back to FIG. 2B, the microprocessor 264 can control the clock signal generator 262 to produce clock pulses having pulse duty cycles that ramp up and ramp down. Alternatively, the microprocessor 264 can control the driver 260 to produce a drive signal having an RF component that includes pulse duty cycles that ramp up and ramp down. For example, the driver 260 can include a pulse duty cycle modulator that is controlled by the microprocessor 264. The microprocessor 264 may access information regarding pulse duty cycles from the memory 266. Such a pulse duty cycle modulator can be implemented, e.g., using a pulse width modulator. These are just a few examples, which are not meant to be all encompassing.
[0095] Another embodiment for smoothing out the edges of the LF power envelopes, which is illustrated in FIG. 11, involves ramping down temporal gaps between adjacent pulses or pulse trains of the RF modulated drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light. This embodiment also includes ramping up temporal gaps between adjacent pulses or pulse trains of the RF component of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light. By driving the light source with the drive signal including an RF component having temporal gaps between adjacent pulses or pulse trains that ramp down and thereafter ramp up, the RF component of the IR light signal emitted by the light source will also have pulses or pulse trains with temporal gaps that ramp down and thereafter ramp up. The shaded areas labeled 1102 in FIG. 11 are illustrative of the LF power envelopes of the drive signal and the light signal produced using the drive signal. The ramping down of the temporal gaps between adjacent pulses or pulse trains should occur over a period of at least 50 μsec, and the ramping up of the temporal gaps between adjacent pulses or pulse trains should occur over a period of at least 50 μsec. Time permitting, the ramping down of the temporal gaps between adjacent pulses or pulse trains preferably occurs over a period between approximately 1 msec and 10 msec, and the ramping up of the temporal gaps between adjacent pulses or pulse trains similarly preferably occurs over a period between approximately 1 msec and 10 msec, which should ensure a substantial reduction in frequency content between 10 kHz and 100 kHz, which is the frequency range typically used to transmit IR remote control signals. It is noted that FIG. 11 is included for illustrative purposes, but is not drawn to scale, since the ramping down and up of temporal gaps between adjacent pulses will occur over much more than three or four pulses. Alternatively, or additionally, there can be a ramping down of how often gaps occur between pulses or pulse trains of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light; and there can be a ramping up of how often gaps occur between pulses or pulse trains of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.
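One way to read the "how often gaps occur" variant is as a pulse-density ramp. The sketch below gates pulses deterministically so that gaps are frequent near the envelope edges and absent in the middle; the thinning rule is an assumption for illustration, not the disclosed gating circuit:

```python
import numpy as np

def pulse_emit_mask(n_pulses: int, n_ramp: int) -> np.ndarray:
    # True = emit the pulse; False = gate it out, leaving a temporal gap.
    # The target pulse density ramps 0 -> 1 over the first n_ramp pulses and
    # 1 -> 0 over the last n_ramp pulses, smoothing the LF power envelope.
    density = np.ones(n_pulses)
    density[:n_ramp] = np.linspace(0.0, 1.0, n_ramp, endpoint=False)
    density[n_pulses - n_ramp:] = density[:n_ramp][::-1]
    # Emit pulse k whenever the cumulative target count crosses an integer.
    target = np.floor(np.cumsum(density))
    return np.diff(target, prepend=0.0) >= 1.0
```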
[0096] There are various different ways to implement the embodiment described with reference to FIG. 11. For example, referring back to FIG. 2B, the microprocessor 264 can control the clock signal generator 262 to produce clock pulses having temporal gaps between pulses that ramp down and ramp up. This can be accomplished, e.g., by not outputting certain clock pulses. Alternatively, a gating circuit can be located between the clock signal generator 262 and the driver 260, so that some clock pulses are not provided to the driver 260, to thereby control the temporal gaps between adjacent pulses or pulse trains, and/or how often gaps occur. In still another embodiment, the gating circuit can be part of or upstream from the driver 260 and can selectively prevent some drive pulses from being provided to the light source 250, to thereby control the temporal gaps between adjacent pulses or pulse trains output by the light source and/or how often gaps occur. The microprocessor 264 can control the clock signal generator 262, the driver 260 and/or such a gating circuit to achieve the ramping up and down of temporal gaps between pulses or pulse trains, and/or how often gaps occur. The microprocessor 264 may access information regarding temporal gaps and/or how often gaps should occur between pulses or pulse trains from the memory 266. These are just a few examples, which are not meant to be all encompassing.
[0097] As mentioned above, the abrupt transitions from times during which a light source emits IR light to times during which no light source emits IR light, and vice versa, produce frequency content that can interfere with the IR remote control signals and/or other IR signals used by other systems. As also mentioned above, explained another way, the LF power envelopes associated with the IR light, produced by the TOF camera, may interfere with one or more other systems (configured to wirelessly receive and respond to IR light signals) that are within close proximity to the TOF camera. An embodiment, which will now be described with reference to FIG. 12, reduces how often there are transitions from times during which IR light is emitted to times during which IR light is not emitted, and vice versa, thereby reducing frequency content associated with such transitions.
[0098] Typically, the RF modulated drive and RF modulated light signals are only produced during the integration periods of frame periods, but are not produced during the readout periods of frame periods, as was shown in FIGS. 7 and 9-11. In accordance with an embodiment, drive and IR light signals are produced during each of the integration periods of each frame period as well as during each of the readout periods between pairs of the integration periods within each frame period, as shown in FIG. 12. This has the effect of reducing how often there are transitions from times during which no light source emits IR light to times during which a light source is driven to emit IR light, and vice versa, and thereby reduces frequency content associated with such transitions. The shaded areas labeled 1202 in FIG. 12 are illustrative of the LF power envelopes of the drive and IR light signals produced using such an embodiment. As can be appreciated from FIG. 12, even though there are multiple integration and readout periods per frame period, there is only one LF power envelope per frame period, and thus only one rising edge of the LF power envelope and only one falling edge of the LF power envelope. In other words, during each frame period, there is only one transition from a time during which no light source emits IR light to a time during which a light source is driven by the drive signal to emit IR light; and there is only one transition from a time during which a light source is driven to emit IR light to a time during which no light source emits IR light. The same light source can be used for emitting IR light during integration periods and readout periods. Alternatively, a first light source (e.g., an IR laser diode) can be used to emit IR light during the integration periods, and a second light source (e.g., an IR LED) can be used to emit IR light during the readout periods. Other variations are also possible, and within the scope of an embodiment.
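The benefit can be seen by counting envelope transitions across one frame's schedule of periods; a toy comparison with an assumed schedule of ten integration periods, each followed by a readout period:

```python
import numpy as np

def interior_edges(emit: np.ndarray) -> int:
    # Off->on plus on->off transitions within the frame's emission schedule.
    return int(np.abs(np.diff(emit)).sum())

fig7_style  = np.array([1, 0] * 10)    # emit during integration periods only
fig12_style = np.ones(20, dtype=int)   # also emit during the readout periods
print(interior_edges(fig7_style))      # 19 transitions inside the frame
print(interior_edges(fig12_style))     # 0: a single LF power envelope per frame
```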
[0099] The rising and falling edges of the LF power envelope 1202 in FIG. 12 are abrupt, which will result in at least some frequency content within the frequency range known to be used by remote controlled devices and other systems configured to wirelessly receive and respond to IR light signals. To further reduce such frequency content, the embodiment just described with reference to FIG. 12 can be combined with at least one of the previously described embodiments discussed with reference to FIGS. 9-11. For example, FIG. 13 illustrates an embodiment that combines the embodiment described with reference to FIG. 12 with the embodiment described with reference to FIG. 9. The shaded areas labeled 1302 in FIG. 13 are illustrative of the LF power envelopes of the drive signal and the emitted IR light signal produced using such an embodiment. As can be appreciated from FIG. 13, there is only one LF power envelope 1302 per frame period, and thus only one rising edge of the LF power envelope and only one falling edge of the LF power envelope. Additionally, the rising and falling edges of the LF power envelope 1302 in FIG. 13 are smoothed out. Thus, the embodiment described with reference to FIG. 13 achieves the benefits of the embodiment described with reference to FIG. 12 as well as the benefits of the embodiment described with reference to FIG. 9. Alternatively, the embodiment described with reference to FIG. 12 can be combined with one of the embodiments described with reference to FIGS. 10 or 11 to achieve similar substantial reductions in frequency content within the frequency range known to be used by remote controlled devices and other systems configured to wirelessly receive and respond to IR light signals.
[00100] The embodiments described above can also be combined in other manners. For example, the embodiment described with reference to FIG. 9 can be combined with one or more of the embodiments described with reference to FIGS. 10 and 11. It is also possible that the embodiment described with reference to FIG. 12 can be combined with more than one of the embodiments described with reference to FIGS. 9-11.
[00101] In accordance with certain embodiments, where IR light signals are also emitted during readout periods, the readout circuitry (e.g., 270 in FIG. 2B) either does not generate readout signals corresponding to reflected light detected by the image pixel detector array 268 during readout periods, or readout signals corresponding to reflected light detected by the image pixel detector array 268 during readout periods should be ignored by the TOF system. In accordance with alternative embodiments, in order to improve the overall electrical efficiency of the TOF system, the readout circuitry (e.g., 270 in FIG. 2B) generates, and the TOF system utilizes, the readout signals corresponding to reflected IR light detected by the image pixel detector array 268 during readout periods.

[00102] Many systems (e.g., televisions or set top boxes) that are configured to wirelessly receive and respond to IR signals include a receiver that has an automatic gain control (AGC) circuit that adjusts the sensitivity of the receiver in dependence on ambient light conditions. More specifically, such AGC circuits usually decrease the gain of a receiver amplifier under high ambient light conditions, which makes the receiver less sensitive; and the AGC circuits usually increase the gain of the receiver amplifier under low ambient light conditions, which makes the receiver more sensitive. The more sensitive the receiver, the less need for a direct line of sight between a sub-system that transmits IR signals (e.g., a remote control device that transmits IR remote control signals) and the receiver, which, for example, can be built into a television or set top box. Conversely, the less sensitive the receiver, the more need for a direct line of sight between the sub-system (e.g., a remote control device that transmits IR remote control signals) and the receiver. The reason for reducing the amplifier gain during high ambient light conditions is that the reduction in gain makes the receiver less sensitive to interference resulting from ambient light. Experiments have shown that reducing the gain of a receiver amplifier also has the effect of making the receiver less sensitive to interference resulting from the RF component of IR light produced by a TOF system. An embodiment of the present technology, which shall now be described with reference to FIG. 14, purposely reduces the receiver amplifier gain (even during low ambient light conditions) in order to make a receiver (e.g., a remote control receiver) less sensitive to interference resulting from IR light produced by a TOF system.
[00103] The AGC circuit of a receiver may, for example, have about 50 dB of adjustable gain. Such an AGC circuit automatically varies the gain of a receiver amplifier between its minimum gain (in a bright environment) and its maximum gain (in a dark environment). In accordance with an embodiment, in order to decrease the sensitivity of an AGC circuit of another system (e.g., a system that is configured to wirelessly receive and respond to IR remote control signals) and thereby make the other system less susceptible to interference from the TOF system, the driver (e.g., 260) of a TOF system drives a light source (e.g., 250) of the TOF system to cause IR light to also be emitted during the readout periods between pairs of the integration periods within each frame period and during at least a portion of the inter-frame periods between pairs of frames. In other words, the TOF system purposely increases the percentage of each frame period during which IR light is emitted, as shown in FIG. 14. This causes an AGC circuit of another system (e.g., a television) within close proximity to the TOF system to reduce its level of gain, making the receiver of the other system less susceptible to interference from the TOF system. As can be appreciated from FIG. 14, the width of the LF power envelope 1402 can be made greater than the width of the frame period by emitting IR light during portions of the adjacent inter-frame periods. In other words, the TOF system may emit IR light during at least a portion of the inter-frame periods, which is not typically done. As can be appreciated from FIG. 14, this embodiment can be combined with previously described embodiments. For example, in FIG. 14, the pulse amplitudes are ramped up and down in the manner originally described with reference to FIG. 9 to smooth out the LF power envelope; and the pulses are also generated during readout periods to reduce the number of LF power envelope transitions as was originally described with reference to FIG. 12. The same light source can be used for emitting IR light during the integration periods, the readout periods and portions of the inter-frame periods. Alternatively, a first light source (e.g., an IR laser diode) can be used to emit IR light during the integration periods, and a second light source (e.g., an IR LED) can be used to emit IR light during the readout periods and portions of the inter-frame periods. Other variations are also possible, and within the scope of an embodiment.
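To make the increase in on-air duty concrete, a back-of-the-envelope comparison (all durations are assumed, illustrative values):

```python
# Assumed illustrative durations, in msec, for one frame plus inter-frame pair.
frame_ms, inter_frame_ms = 20.0, 13.33
integration_only_ms = 10 * 1.0                  # FIG. 7: ten 1 msec integrations
extended_ms = frame_ms + 0.5 * inter_frame_ms   # FIG. 14: whole frame plus half
                                                # of the inter-frame period

total_ms = frame_ms + inter_frame_ms
print("FIG. 7 emission duty:  %.0f%%" % (100 * integration_only_ms / total_ms))
print("FIG. 14 emission duty: %.0f%%" % (100 * extended_ms / total_ms))
```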
[00104] In FIGS. 9-14, the pulse amplitudes within the middle integration period(s) of a frame period were shown as staying the same. However, this need not be the case. For example, the pulse amplitudes may change from one integration period to the next. In order to reduce frequency content resulting from such abrupt, relatively significant changes in pulse amplitudes, it is advantageous to produce RF modulated drive and RF modulated light signals during the readout periods (between such integration periods) and smooth out the changes in the pulse amplitudes, as can be appreciated from FIG. 15. More generally, FIG. 15 illustrates a smoothing out of all transitions of the LF power envelope 1502. The embodiment described with reference to FIG. 15 can be combined with one or more of the previously described embodiments. For example, instead of ramping up and down pulse amplitudes to smooth out the LF power envelope 1502, ramping up and down of pulse duty cycles, temporal gaps and/or how often gaps occur between adjacent pulses or pulse trains can be used to smooth out the LF power envelope 1502.
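A sketch of the FIG. 15 idea, cross-fading between per-integration-period amplitude levels rather than stepping abruptly (the levels, sample counts, and linear cross-fade are all assumptions):

```python
import numpy as np

def smoothed_level_envelope(levels, n_hold: int, n_ramp: int) -> np.ndarray:
    # Hold each amplitude level for n_hold samples and ramp linearly over
    # n_ramp samples between consecutive levels (including from and to zero),
    # so that every transition of the LF power envelope is smoothed.
    seq = [0.0] + list(levels) + [0.0]
    pieces = []
    for prev, cur in zip(seq, seq[1:]):
        pieces.append(np.linspace(prev, cur, n_ramp, endpoint=False))
        pieces.append(np.full(n_hold, cur))
    return np.concatenate(pieces[:-1])  # drop the trailing hold at zero

# Three integration periods whose amplitudes differ from one to the next.
envelope = smoothed_level_envelope([1.0, 0.6, 0.8], n_hold=1000, n_ramp=200)
```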
[00105] The high level flow diagram of FIG. 16 will now be used to summarize methods according to various embodiments of the present technology. Such methods, which are for use by a TOF system, are for substantially reducing interference that the TOF system may cause to a further device that is configured to wirelessly receive and respond to IR signals transmitted by a remote control device that is intended to remotely control the further device (e.g., a television).
[00106] Referring to FIG. 16, step 1602 involves emitting IR light having an LF power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, wherein at least a portion of IR light being emitted is RF modulated IR light. Frequency ranges within which frequency content can be reduced using embodiments described herein include, but are not limited to: 10 kHz - 100 kHz frequencies that are used by some systems for sending IR remote control signals; 455 kHz (+/- 10%) and/or 1 MHz (+/- 10%) frequencies that are used by other systems for sending IR remote control signals; 2.3 MHz (+/- 10%) and 2.8 MHz (+/- 10%) frequencies that are used by some systems for transmitting IR signals to wireless IR headphones; and 25 kHz - 30 kHz frequencies that are used by some systems for transmitting IR signals to wireless 3D shutter glasses.
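Collected as data, the frequency ranges enumerated above might be represented as follows (band edges taken from the text, with +/- 10% expanded into explicit limits; the variable name is an assumption):

```python
# Frequency bands (Hz) within which the shaped LF power envelope aims to
# substantially reduce content, per the list above.
PROTECTED_BANDS_HZ = [
    (10e3, 100e3),               # IR remote control signals (common range)
    (0.9 * 455e3, 1.1 * 455e3),  # ~455 kHz remote control carriers
    (0.9 * 1e6, 1.1 * 1e6),      # ~1 MHz remote control carriers
    (0.9 * 2.3e6, 1.1 * 2.3e6),  # wireless IR headphone systems
    (0.9 * 2.8e6, 1.1 * 2.8e6),  # wireless IR headphone systems
    (25e3, 30e3),                # wireless 3D shutter glasses
]
```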
[00107] Still referring to FIG. 16, step 1604 involves detecting at least a portion of the emitted RF modulated IR light that has reflected off one or more objects. At step 1606, depth images are produced in dependence on results of the detecting. At step 1608, the depth images are used to update an application. For example, depth images can be used to track motion and/or other user behavior, which can be used, e.g., to manipulate a game character or other aspects of the application in response to movement of a user's body and/or objects around the user, rather than (or in addition to) using controllers, remotes, keyboards, mice, or the like. For example, a video game system can update the position of images displayed in a video game based on the new positions of the objects or update an avatar based on motion of the user as detected based on depth images.
[00108] As was discussed above with reference to FIG. 9, step 1602 can involve ramping up pulse amplitudes of the RF component of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse amplitudes of the RF component of the drive signal when transitioning from a time during which the light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light. This will result in similar ramping up and ramping down of pulse amplitudes of the light pulses emitted by the light source.
[00109] As was discussed above with reference to FIG. 10, step 1602 can involve ramping up pulse duty cycles of the RF component of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping down pulse duty cycles of the RF component of the drive signal when transitioning from a time during which the light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light. This will result in similar ramping up and ramping down of pulse duty cycles of the light pulses emitted by the light source.
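A corresponding sketch for duty-cycle ramping, again with illustrative numbers only; the amplitude stays constant, which can be preferable for laser diode drive circuits, while the average emitted power still ramps smoothly:

```python
import numpy as np

def duty_ramped_train(n_pulses=200, samples_per_pulse=20, ramp_pulses=40,
                      min_duty=0.05, max_duty=0.5):
    """Constant-amplitude RF pulse train whose duty cycle ramps up, holds, then
    ramps down; average optical power tracks duty cycle, shaping the LF envelope."""
    duties = np.full(n_pulses, max_duty)
    ramp = np.linspace(min_duty, max_duty, ramp_pulses)
    duties[:ramp_pulses] = ramp            # narrow pulses first, widening
    duties[-ramp_pulses:] = ramp[::-1]     # pulses narrow again at the end
    out = np.zeros(n_pulses * samples_per_pulse)
    for i, duty in enumerate(duties):
        on = int(round(duty * samples_per_pulse))
        out[i * samples_per_pulse : i * samples_per_pulse + on] = 1.0
    return out
```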
[00110] As was discussed above with reference to FIG. 11, step 1602 can involve ramping down temporal gaps and/or how often gaps occur between adjacent pulses or pulse trains of the RF component of a drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a light source is driven by the drive signal to emit IR light, and ramping up temporal gaps and/or how often gaps occur between adjacent pulses or pulse trains of the RF component of the drive signal when transitioning from a time during which a light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light. This will result in similar ramping down and ramping up of temporal gaps and/or how often gaps occur between light pulses or pulse trains emitted by the light source.
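A sketch of gap ramping under the same caveats; here the per-train gap lengths are discretized sample counts chosen only for illustration:

```python
import numpy as np

def gap_ramped_schedule(n_trains=40, train_len=100, ramp_trains=10, max_gap=80):
    """Pulse trains separated by temporal gaps that shrink going into an
    emission window and grow again coming out of it."""
    gaps = np.zeros(n_trains, dtype=int)
    ramp = np.linspace(max_gap, 0, ramp_trains).round().astype(int)
    gaps[:ramp_trains] = ramp            # long gaps first, shrinking to none
    gaps[-ramp_trains:] = ramp[::-1]     # gaps grow again toward the end
    pieces = []
    for gap in gaps:
        pieces.append(np.ones(train_len))   # RF-modulated pulse train (envelope = 1)
        if gap:
            pieces.append(np.zeros(gap))    # temporal gap: no emission
    return np.concatenate(pieces)
```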
[00111] As was discussed above with reference to FIG. 12, step 1602 can involve emitting IR light during integration periods of each frame period as well as during each of the readout periods between pairs of the integration periods within each frame period. Such an embodiment reduces how often there are transitions from times during which IR light is emitted to times during which IR light is not emitted, and vice versa, thereby reducing frequency content associated with such transitions. In certain embodiments, discussed above with reference to FIG. 14, IR light can also be emitted during portions of inter-frame periods.
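In schedule form, with hypothetical durations (actual integration and readout times depend on the sensor), the effect of emitting through the readout periods can be sketched as a 0/1 gate per frame; with emit_during_readout set, the LF envelope has a single rising and a single falling edge per frame rather than one pair per integration period:

```python
import numpy as np

def frame_emission_gate(n_integrations=3, integ_len=1000, readout_len=400,
                        interframe_len=2000, emit_during_readout=True):
    """0/1 emission gate for one frame period (durations are illustrative only)."""
    pieces = []
    for i in range(n_integrations):
        pieces.append(np.ones(integ_len))        # integration period: emit
        between = i < n_integrations - 1
        # Each integration period is followed by a readout period; emitting
        # through the readouts *between* integration periods removes two
        # envelope edges (one falling, one rising) per readout.
        level = 1.0 if (emit_during_readout and between) else 0.0
        pieces.append(np.full(readout_len, level))
    pieces.append(np.zeros(interframe_len))      # inter-frame period: dark
    return np.concatenate(pieces)
```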
[00112] Additional details of various methods of the present technology can be appreciated from the above discussion of FIGS. 9-15.
[00113] Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. It is intended that the scope of the technology be defined by the claims appended hereto.

Claims

1. For use by a time-of-flight (TOF) system that emits and detects infrared (IR) light, a method for substantially reducing interference that the TOF system may cause to at least one other system that is configured to wirelessly receive and respond to IR light signals, the method comprising:
emitting IR light having a low frequency (LF) power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, wherein at least a portion of IR light being emitted is radio frequency (RF) modulated IR light; and
detecting at least a portion of the emitted RF modulated IR light that has reflected off one or more objects.
2. The method of claim 1, wherein the emitting IR light comprises:
emitting IR light having a LF power envelope that is shaped to substantially reduce frequency content within the frequency range from 10 kHz to 100 kHz that is known to be used by other systems for transmitting and receiving remote control signals.
3. The method of claim 1, wherein the emitting IR light comprises:
emitting IR light having a LF power envelope that is shaped to substantially reduce frequency content within at least two frequency ranges below 3 MHz that are known to be used by other systems configured to wirelessly receive and respond to IR light signals.
4. The method of claim 1, 2 or 3, wherein:
the TOF system is configured to obtain a separate depth image corresponding to each of a plurality of frame periods,
each frame period is followed by an inter-frame period,
each frame period includes at least two integration periods, and
each integration period is followed by a readout period; and
the emitting IR light includes emitting IR light during each of the integration periods; and to reduce how often there are transitions between times during which IR light is being emitted and times during which IR light is not being emitted, and thereby reduce frequency content associated with the transitions, the emitting IR light also includes emitting IR light during the readout periods between pairs of the integration periods within each frame period.
5. The method of claim 1, 2 or 3, wherein:
the TOF system is configured to obtain a separate depth image corresponding to each of a plurality of frame periods,
each frame period is followed by an inter-frame period,
each frame period includes at least two integration periods, and
each integration period is followed by a readout period; and
the emitting IR light includes emitting IR light during each of the integration periods; and in order to decrease a gain level of an automatic gain control (AGC) circuit for use with an IR light receiver of at least one other system configured to wirelessly receive and respond to IR light signals, and thereby make the IR light receiver of the at least one other system less sensitive to interference from the TOF system, the emitting IR light also includes emitting IR light during at least a portion of at least one of:
(i) the readout periods between pairs of the integration periods within each frame period, or
(ii) the inter-frame periods between pairs of frames.
6. The method of claim 1, 2 or 3, wherein the emitting IR light includes:
producing a drive signal including an RF component and having a LF power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals; and
driving at least one light source with the drive signal including the RF component.
7. A time-of-flight (TOF) system, comprising:
at least one light source configured to emit infrared (IR) light in response to being driven;
a driver configured to drive the at least one light source to emit IR light having a low frequency (LF) power envelope that is shaped to substantially reduce frequency content within at least one frequency range known to be used by at least one other system configured to wirelessly receive and respond to IR light signals, wherein at least a portion of IR light being emitted is radio frequency (RF) modulated IR light; and
an image pixel detector array configured to detect at least a portion of the emitted RF modulated IR light that has reflected off one or more objects and is incident on the image pixel detector array.
8. The system of claim 7, wherein:
the TOF system is configured to obtain a separate depth image corresponding to each of a plurality of frame periods,
each frame period is followed by an inter-frame period,
each frame period includes at least two integration periods, and
each integration period is followed by a readout period; and
the driver is configured to drive a said light source to emit IR light during each of the integration periods;
the driver is configured to drive a said light source to emit IR light during the readout periods between pairs of the integration periods within each frame period, in order to reduce how often there are transitions between times during which IR light is being emitted and times during which IR light is not being emitted, and thereby reduce frequency content associated with the transitions; and
the light source driven to emit IR light during the readout periods can be the same or different than the light source driven to emit IR light during the integration periods.
9. The system of claim 7, wherein:
the TOF system is configured to obtain a separate depth image corresponding to each of a plurality of frame periods,
each frame period is followed by an inter-frame period,
each frame period includes at least two integration periods, and
each integration period is followed by a readout period; and
the driver is configured to drive a said light source to emit IR light during each of the integration periods;
the driver is configured to drive a said light source to emit IR light during at least a portion of at least one of (i) the readout periods between pairs of the integration periods within each frame period or (ii) the inter-frame periods between pairs of frames, in order to decrease a gain level of an automatic gain control (AGC) circuit for use with an IR light receiver of at least one other system configured to wirelessly receive and respond to IR light signals, and thereby make the IR light receiver of the at least one other system less sensitive to interference from the TOF system; and
the light source that is driven to emit IR light during at least a portion of at least one of (i) the readout periods between pairs of the integration periods within each frame period or (ii) the inter-frame periods between pairs of frames, can be the same as or different than the light source driven to emit IR light during the integration periods.
10. The system of claim 7, 8 or 9 further comprising:
a clock signal generator configured to produce a clock signal that is provided to the driver; and
a processor configured to control at least one of the clock signal generator or the driver; wherein at least one of the processor, the clock signal generator, or the driver is configured to:
ramp up pulse amplitudes or duty cycles of the drive signal when transitioning from a time during which no light source is driven to emit IR light to a time during which a said light source is driven by the drive signal to emit IR light; and
ramp down pulse amplitudes or duty cycles of the drive signal when transitioning from a time during which a said light source is driven by the drive signal to emit IR light to a time during which no light source is driven to emit IR light.

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP14734582.1A EP2997395B1 (en) 2013-05-13 2014-05-13 Interference reduction for tof systems
CN201480028101.8A CN105264401B (en) 2013-05-13 2014-05-13 Interference reduction for TOF system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361822873P 2013-05-13 2013-05-13
US61/822,873 2013-05-13
US14/055,660 US9442186B2 (en) 2013-05-13 2013-10-16 Interference reduction for TOF systems
US14/055,660 2013-10-16

Publications (1)

Publication Number Publication Date
WO2014186304A1 true WO2014186304A1 (en) 2014-11-20

Family

ID=51864564

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/037753 WO2014186304A1 (en) 2013-05-13 2014-05-13 Interference reduction for tof systems

Country Status (4)

Country Link
US (1) US9442186B2 (en)
EP (1) EP2997395B1 (en)
CN (1) CN105264401B (en)
WO (1) WO2014186304A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1647839A2 (en) * 2004-10-18 2006-04-19 Audi Ag Method and distance measuring device for determining the distance between an object and the device
DE102006050303A1 (en) * 2005-12-05 2007-06-14 Cedes Ag Sensor arrangement and sensor device for a sensor arrangement
US20100110280A1 (en) * 2007-02-15 2010-05-06 Chiaki Aoyama Environment recognition apparatus
US20120098964A1 (en) * 2010-10-22 2012-04-26 Mesa Imaging Ag System and Method for Multi TOF Camera Operation Using Phase Hopping

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
02: "SwissRanger SR-3000 Manual", 1 October 2006 (2006-10-01), XP055137827, Retrieved from the Internet <URL:https://aiweb.techfak.uni-bielefeld.de/files/SR3000_manual_V1.03.pdf> [retrieved on 20140902] *
JAMES MURE-DUBOIS ET AL: "Fusion of Time of Flight Camera Point Clouds", Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, 1 October 2008 (2008-10-01), XP055137771, Retrieved from the Internet <URL:http://hal.archives-ouvertes.fr/docs/00/32/67/81/PDF/1569139570.pdf> [retrieved on 20140902] *

Also Published As

Publication number Publication date
US20140333917A1 (en) 2014-11-13
CN105264401B (en) 2017-12-05
CN105264401A (en) 2016-01-20
EP2997395B1 (en) 2020-10-07
EP2997395A1 (en) 2016-03-23
US9442186B2 (en) 2016-09-13

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480028101.8

Country of ref document: CN

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14734582

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2014734582

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE