US20120277001A1 - Manual and Camera-based Game Control - Google Patents

Manual and Camera-based Game Control Download PDF

Info

Publication number
US20120277001A1
Authority
US
United States
Prior art keywords
player
gesture
game
display
head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/220,609
Inventor
Thomas William Lansdale
Charles Robert Griffiths
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US13/220,609 priority Critical patent/US20120277001A1/en
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GRIFFITHS, Charles Robert, LANSDALE, Thomas William
Priority to CN201280020708.2A priority patent/CN103501869A/en
Priority to KR1020137028313A priority patent/KR20140023951A/en
Priority to JP2014508466A priority patent/JP2014523258A/en
Priority to EP12777746.4A priority patent/EP2701817B1/en
Priority to PCT/US2012/034677 priority patent/WO2012148853A2/en
Publication of US20120277001A1 publication Critical patent/US20120277001A1/en
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Abandoned legal-status Critical Current

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/44Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment involving timing of operations, e.g. performing an action within a time slot
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/24Constructional details thereof, e.g. game controllers with detachable joystick handles
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1018Calibration; Key and button assignment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55Details of game data or player data management
    • A63F2300/5546Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/5553Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history user representation in the game field, e.g. avatar
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6045Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/63Methods for processing data by generating or executing the game program for controlling the execution of the game in time
    • A63F2300/638Methods for processing data by generating or executing the game program for controlling the execution of the game in time according to the timing of operation or a time limit

Definitions

  • a game system receives an image stream depicting a player of a game and also receives manual input from a hand operated controller used by the player.
  • the manual input may be used to control an avatar and the image stream is used to recognize gestures of the player and control background objects in the game using the recognized gestures. For example, by using gestures such as head rotations the player's hands are free to continue operating a hand-held controller and the player has increased control and an immersive game play experience.
  • methods for detecting head gestures from image streams in real time are described such as tracking velocity of player's heads and motion through quadrants of an image stream.
  • FIG. 1 is a schematic diagram of a player holding a game controller and sitting before a game apparatus having a depth camera;
  • FIG. 2 is a schematic diagram of a game system incorporating an image capture device, a hand held controller, a computing device and a display;
  • FIG. 3 is a plan view of a hand held controller
  • FIG. 4 is a perspective view of the hand held controller of FIG. 3 ;
  • FIG. 5 is a schematic diagram of a display during game play
  • FIG. 6 is a flow diagram of a method of operation of a game system
  • FIG. 7 is a flow diagram of another method of operation of a game system
  • FIG. 8 is a schematic diagram of a head tracking process
  • FIG. 9 illustrates an exemplary computing-based device in which embodiments of a game system may be implemented.
  • FIG. 1 illustrates an example control system 100 for controlling a computer game.
  • the control system comprises both a hand-held controller and a camera-based control system.
  • Integration is achieved as described herein to enable fine grained control of game systems in a robust, easy to use manner which enhances the player experience.
  • FIG. 1 shows a user 102 playing, in this illustrative example, a two dimensional side-scrolling platformer game. This type of game may be clearly depicted in two dimensional drawings; however, the methods described herein are also applicable to three dimensional games, augmented reality applications and games of other types.
  • camera-based control system 100 can be used to, among other things, determine body pose, bind, recognize, analyze, track, associate to a human target, provide feedback, and/or adapt to aspects of a human target such as user 102 (also referred to herein as a player).
  • one player is depicted for clarity. However, two or more players may also use the control system at the same time.
  • the camera-based control system 100 comprises a computing device 104 .
  • the computing device 104 can be a general purpose computer, gaming system or console, or dedicated image processing device.
  • the computing device 104 can include hardware components and/or software components such that the computing device 104 can be used to execute applications such as gaming applications and/or non-gaming applications.
  • the structure of the computing device 104 is discussed hereinafter with reference to FIG. 9 .
  • the camera-based control system 100 further comprises a capture device 106 .
  • the capture device 106 can be, for example, an image sensor or detector that can be used to visually monitor one or more users (such as user 102 ) such that gestures performed by the one or more users can be captured, analyzed, processed, and tracked to perform one or more controls or actions within a game or application, as described in more detail below.
  • the camera-based control system 100 can further comprise a display device 108 connected to the computing device 104 .
  • the display device 108 can be a television, a monitor, a high-definition television (HDTV), or the like that can provide game or application visuals (and optionally audio) to the user 102 .
  • the user 102 can be tracked using the capture device 106 such that the position, movements and size of user 102 can be interpreted by the computing device 104 (and/or the capture device 106 ) as controls that can be used to affect the application being executed by computing device 104 .
  • the user 102 can move his or her body (or parts of his or her body) to control an executed game or application.
  • the application executing on the computing device 104 is a two dimensional side-scrolling platformer game that the user 102 is playing.
  • the computing device 104 controls the display device 108 to provide a visual representation of a terrain comprising a landscape, tree, and the sun to the user 102 .
  • the computing device 104 also controls the display device 108 to provide a visual representation of a user avatar that the user 102 can control with his or her movements and/or by using a hand held controller 110 .
  • the computing device 104 can comprise a body pose estimator that is arranged to recognize and track different body parts of the user, and map these onto the avatar. In this way, the avatar copies the movements of the user 102 such that if the user 102 , for example walks in physical space, this causes the user avatar to walk in game space.
  • FIG. 2 illustrates a schematic diagram of the capture device 106 that can be used in the camera-based control system 100 of FIG. 1 .
  • the capture device 106 is configured to capture video images with depth information.
  • a capture device can be referred to as a depth camera.
  • the depth information can be in the form of a depth image that includes depth values, i.e. a value associated with each image element of the depth image that is related to the distance between the depth camera and an item or object located at that image element.
  • image element is used to refer to a pixel, group of pixels, voxel, group of voxels or other higher level component of an image.
  • the depth information can be obtained using any suitable technique including, for example, time-of-flight, structured light, stereo image, or the like.
  • the capture device 106 can organize the depth information into “Z layers,” or layers that may be perpendicular to a Z-axis extending from the depth camera along its line of sight.
  • the capture device 106 comprises at least one imaging sensor 200 .
  • the imaging sensor 200 comprises a depth camera 202 arranged to capture a depth image of a scene.
  • the captured depth image can include a two-dimensional (2-D) area of the captured scene where each image element in the 2-D area represents a depth value such as a length or distance of an object in the captured scene from the depth camera 202 .
  • the capture device can also include an emitter 204 arranged to illuminate the scene in such a manner that depth information can be ascertained by the depth camera 202 .
  • the depth camera 202 is an infra-red (IR) time-of-flight camera
  • the emitter 204 emits IR light onto the scene
  • the depth camera 202 is arranged to detect backscattered light from the surface of one or more targets and objects in the scene.
  • pulsed infrared light can be emitted from the emitter 204 such that the time between an outgoing light pulse and a corresponding incoming light pulse can be detected by the depth camera and measured and used to determine a physical distance from the capture device 106 to a location on the targets or objects in the scene.
  • the phase of the outgoing light wave from the emitter 204 can be compared to the phase of the incoming light wave at the depth camera 202 to determine a phase shift.
  • the phase shift can then be used to determine a physical distance from the capture device 106 to a location on the targets or objects.
  • time-of-flight analysis can be used to indirectly determine a physical distance from the capture device 106 to a location on the targets or objects by analyzing the intensity of the reflected beam of light over time via various techniques including, for example, shuttered light pulse imaging.
  • the capture device 106 can use structured light to capture depth information.
  • patterned light (e.g., light displayed as a known pattern such as spot, grid, or stripe pattern, which may also be time-varying) can be projected onto the scene using the emitter 204 . Upon striking the surface of one or more targets or objects in the scene, the pattern becomes deformed.
  • Such a deformation of the pattern can be captured by the depth camera 202 and then be analyzed to determine a physical distance from the capture device 106 to a location on the targets or objects in the scene.
  • the depth camera 202 can be in the form of two or more physically separated cameras that view a scene from different angles, such that visual stereo data is obtained that can be resolved to generate depth information.
  • the emitter 204 can be used to illuminate the scene or can be omitted.
  • the capture device 106 can comprise a video camera, which is referred to as an RGB camera 206 .
  • the RGB camera 206 is arranged to capture sequences of images of the scene at visible light frequencies, and can hence provide images that can be used to augment the depth images.
  • the RGB camera 206 can be used instead of the depth camera 202 .
  • the capture device 106 can also optionally comprise a microphone 207 or microphone array (which can be directional and/or steerable), which is arranged to capture sound information such as voice input from the user and can be used for speech recognition.
  • the capture device 106 shown in FIG. 2 further comprises at least one processor 208 , which is in communication with the imaging sensor 200 (i.e. depth camera 202 and RGB camera 206 in the example of FIG. 2 ), the emitter 204 , and the microphone 207 .
  • the processor 208 can be a general purpose microprocessor, or a specialized signal/image processor.
  • the processor 208 is arranged to execute instructions to control the imaging sensor 200 , emitter 204 and microphone 207 to capture depth images, RGB images, and/or voice signals.
  • the processor 208 can also optionally be arranged to perform processing on these images and signals, as outlined in more detail hereinafter.
  • the capture device 106 shown in FIG. 2 further includes a memory 210 arranged to store instructions for execution by the processor 208 , images or frames of images captured by the depth camera 202 or RGB camera 206 , or any other suitable information, images, or the like.
  • the memory 210 can include random access memory (RAM), read only memory (ROM), cache, Flash memory, a hard disk, or any other suitable storage component.
  • the memory 210 can be a separate component in communication with the processor 208 or integrated into the processor 208 .
  • the capture device 106 also comprises an output interface 212 in communication with the processor 208 and is arranged to provide data to the computing device 104 via a communication link.
  • the communication link can be, for example, a wired connection (such as USB, Firewire, Ethernet or similar) and/or a wireless connection (such as WiFi, Bluetooth or similar).
  • the output interface 212 can interface with one or more communication networks (such as the internet) and provide data to the computing device 104 via these networks.
  • a controller 110 is also provided as part of the capture device.
  • the controller may be a hand held controller as depicted schematically in FIG. 1 or may be integral with another larger device that is not hand held.
  • the controller comprises a plurality of user input devices such as buttons, joysticks, touch pads, switches and enables a player to make input to a game system.
  • User input data is sent from the controller to the computing device 104 by a wired connection and/or a wireless connection.
  • the computing device 104 executes a number of functions relating to the camera-based gesture recognition, such as an optional body pose estimator 214 and a gesture recognition engine 216 .
  • the body pose estimator 214 is arranged to use computer vision techniques to detect and track different body parts of the user.
  • An example of a body pose estimator is given in US patent publication US-2010-0278384-A1 “Human body pose estimation” filed 20 May 2009.
  • the body pose estimator 214 can provide an output to the gesture recognition engine in the form of a time-series of data relating to the user's body pose. This can be in the form of a fully tracked skeletal model of the user, or a more coarse identification of the visible body parts of the user.
  • these time-series sequences can comprise data relating to a time-varying angle between at least two body parts of the user, a rate of change of angle between at least two body parts of the user, a motion velocity for at least one body part of the user, or a combination thereof.
  • the different types of data are known as “features”.
  • the body pose estimator 214 can derive other data sequences (i.e. other features) from the changing pose of the user over time.
  • the gesture recognition engine 216 can utilize input (i.e. features) derived from different sources other than the body pose estimator.
  • Application software 218 can also be executed on the computing device 104 and controlled using the gestures. The application software is arranged to control display of the game at a display 220 .
  • FIG. 3 is a plan view of an example hand held controller 110 . It has a generally winged shape, with each wing or shoulder sized and shaped to be clasped in one hand.
  • the controller comprises a housing supporting a plurality of buttons, switches and joysticks as now described in more detail. However, this is an example only and other types of controller 110 may be used.
  • four digital action buttons 302 are provided on the right face of the controller, comprising a green A button, red B button, blue X button and amber Y button.
  • Two analog joysticks 310 and 312 are provided. These joysticks may also be depressed or clicked in to activate a digital button beneath each joystick.
  • Digital start 306 , back 308 and guide 304 buttons are centrally positioned on the housing. For example, the guide button is used to turn on the controller and access a menu.
  • FIG. 4 is a perspective view of the controller and shows a left bumper 406 and a right bumper 404 , each of which is a button that may be pressed by the user.
  • a left trigger 400 and a right trigger 402 , both of which are analog, are provided on the underside of the controller (visible in FIG. 4 ).
  • a connector 408 may be provided to enable wired connection to the computing device 104 .
  • FIG. 5 is a schematic diagram of a display during game play, for example at display screen 108 .
  • the game is a two dimensional side-scrolling platformer game.
  • a player controls an avatar 500 using a manual controller 110 in order to proceed through an environment which in this example comprises a landscape in daylight due to presence of a sun 504 .
  • a plurality of enemies 502 are shown blocking the path of the avatar 500 .
  • Around some of these enemies, weapons are represented as clouds of danger.
  • the player may advance in the game by avoiding the enemies such as by jumping over them. In this example it is difficult for the player to succeed because of the great number of enemies in the avatar's path.
  • In addition to controlling the avatar 500 using a manual controller, a player is able to control the game system using the camera-based control system.
  • a player may make a particular gesture in order to control the game system. For example, a player may rotate his or her head in order to make the sun 504 set and the enemies 502 sleep. Once the enemies sleep the player may advance the avatar 500 without attack from the enemies.
  • FIG. 6 is a flow diagram of a method of operation of a game system.
  • a game is displayed 600 such as at display screen 108 .
  • An image stream is received 602 depicting at least one player of the game.
  • the image stream is obtained from the image capture system 106 and may comprise depth images and color images.
  • Gesture recognition 606 is carried out on the image stream, for example, using body pose estimator 214 and gesture recognition engine 216 .
  • the output of the gesture recognition process is used to influence 608 the course of the game.
  • manual input from a controller 604 is used to influence the course of the game.
  • the game is displayed 610 and the display continues according to rules of the game in combination with the player's manual input, the image stream and other optional factors such as time, random elements, and other factors.
  • the camera-based control of the game system is used to control only background objects in the avatar's environment.
  • Background objects are any objects in the game without an associated physics model.
  • the avatar 500 and enemies 502 of FIG. 5 are not background objects but the sun 504 is a background object.
  • the camera-based control of the game system is available only at particular states of the game and only using specified gestures.
  • the specified gestures may be gestures which may be performed by a player at the same time as holding a manual controller.
  • a non-exhaustive list of examples of such gestures is: full head rotation, partial head rotation, leg movement, foot movement, knee movement.
  • FIG. 7 is a flow diagram of a method of operation of a game system using both manual and gesture-based control.
  • An avatar is displayed in an environment with at least one background object.
  • the background object may be a sun, a tree, a building, a clock.
  • Manual input is received 702 from a player of the game.
  • the manual input is used to control 704 the avatar.
  • a specified background object such as a sun is highlighted if a particular game state is reached 706 .
  • the game state may be a scripted movement in the game or may be a state which is activated by the player using a manual input at the controller.
  • the sun may change color and/or pulsate. However, this is not essential. Other ways of highlighting the background object may be used.
  • Motion of at least part of the player's body is tracked 708 using the image stream and gesture recognition.
  • Motion of the specified object (such as the sun) is displayed 710 according to the tracked motion of the player's body or body part.
  • the display of the sun in the game moves in a manner matching movement of the player's head. This provides an indication to the player that he or she has control of the background object by moving his or her body or body part.
  • if a specified gesture is detected 712 then the game state is changed 714 by changing the state of at least the specified object. For example, the sun sets and the enemies sleep.
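  • As an illustrative aside (not part of the patent disclosure), the FIG. 7 flow can be sketched in code. The sketch below uses plain Python dictionaries; every state name, field and value is an assumption chosen for the example.

```python
# Hypothetical walk-through of the FIG. 7 flow using plain dictionaries;
# state names and object fields are illustrative, not from the patent.
def fig7_step(world, manual_input, head_position, gesture):
    # 702/704: manual input from the hand held controller drives the avatar.
    world["avatar_x"] += manual_input["move_x"]

    # 706: on reaching the particular game state, highlight the background object.
    if world["state"] == "sun_controllable":
        world["sun"]["highlighted"] = True

        # 708/710: the object follows the tracked motion of the player's head,
        # signalling to the player that he or she controls it.
        world["sun"]["x"] = head_position[0]

        # 712/714: a specified gesture changes the object and the game state,
        # for example the sun sets and the enemies sleep.
        if gesture == "head_rotation":
            world["sun"]["risen"] = False
            world["enemies_awake"] = False
            world["state"] = "night"
    return world


world = {"avatar_x": 0, "state": "sun_controllable",
         "sun": {"x": 100, "risen": True, "highlighted": False},
         "enemies_awake": True}
world = fig7_step(world, {"move_x": 1}, head_position=(150, 40), gesture="head_rotation")
print(world["state"], world["enemies_awake"])   # prints: night False
```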
  • one player operates the game system.
  • the image stream depicts a plurality of players and the players are segmented from the image stream as part of the gesture recognition process.
  • a plurality of players play the game with each player having his or her own game system with camera-based and manual control. Combinations of these approaches are also possible.
  • a head rotation is detected by the gesture recognition system as now described with reference to FIG. 8 .
  • the image stream 800 is divided into four quadrants 802 “cut” corner to corner and meeting in the center.
  • the player's head position is typically in the top quadrant and this is illustrated at 804 in FIG. 8 . If the player's head moves through each of the quadrants in a clockwise direction and in a specified time, then a particular gesture is detected. This is illustrated in FIG. 8 by the head positions 804 , 806 , 808 and 810 .
  • Other ways of detecting a head movement gesture may be used.
  • gestures of the head are detected by monitoring velocity of motion of the head in the image stream and comparing that to a threshold.
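  • The quadrant test of FIG. 8 and the velocity test can both be sketched in a few lines. The sketch below is illustrative only: the quadrant geometry follows the description above, while the time limit and speed threshold values are assumptions.

```python
import math

def quadrant(x, y, width, height):
    """Quadrant of FIG. 8: the image is cut corner to corner into four
    triangles (top, right, bottom, left) meeting at the centre."""
    dx, dy = x - width / 2.0, y - height / 2.0   # image coordinates, y grows downward
    if abs(dy) >= abs(dx):
        return "top" if dy < 0 else "bottom"
    return "right" if dx > 0 else "left"

def clockwise_head_rotation(track, width, height, max_duration_s=2.0):
    """Detect the FIG. 8 gesture: the head visits the top, right, bottom and
    left quadrants in clockwise order within a time limit (the 2 s default is
    an assumption).  `track` is a list of (t_seconds, x, y) head positions."""
    visits = []
    for t, x, y in track:
        q = quadrant(x, y, width, height)
        if not visits or visits[-1][1] != q:
            visits.append((t, q))
    names = [q for _, q in visits]
    wanted = ["top", "right", "bottom", "left"]
    for i in range(len(names) - len(wanted) + 1):
        if names[i:i + len(wanted)] == wanted:
            return visits[i + len(wanted) - 1][0] - visits[i][0] <= max_duration_s
    return False

def fast_head_motion(track, speed_threshold_px_s=400.0):
    """Alternative test from the bullet above: compare the head's velocity in
    the image stream to a threshold (the threshold value is an assumption)."""
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        if t1 > t0 and math.hypot(x1 - x0, y1 - y0) / (t1 - t0) > speed_threshold_px_s:
            return True
    return False

# Example on a 640x480 image stream: the head passes through all four
# quadrants within 1.5 s, so the rotation gesture is detected.
track = [(0.0, 320, 50), (0.5, 560, 240), (1.0, 320, 430), (1.5, 80, 240)]
print(clockwise_head_rotation(track, 640, 480))  # True
print(fast_head_motion(track))                   # True (speeds well above 400 px/s)
```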
  • FIG. 9 illustrates various components of an exemplary computing device 104 which may be implemented as any form of a computing and/or electronic device, and in which embodiments of the above-described game control techniques may be implemented.
  • Computing device 104 comprises one or more processors 902 which may be microprocessors, controllers or any other suitable type of processors for processing computing executable instructions to control a game system.
  • processors 902 may include one or more fixed function blocks (also referred to as accelerators) which implement a part of the game control methods in hardware (rather than software or firmware).
  • the computing-based device 104 also comprises an input interface 904 arranged to receive input from one or more devices, such as the capture device 106 of FIG. 2 and/or the controller of FIG. 3 and FIG. 4 .
  • An output interface 906 is also provided and arranged to provide output to, for example, a display system integral with or in communication with the computing-based device (such as display device 108 or 220 ).
  • the display system may provide a graphical user interface, or other user interface of any suitable type although this is not essential.
  • a communication interface 908 may optionally be provided, which can be arranged to communicate with one or more communication networks (e.g. the internet).
  • Computer-readable media may include, for example, computer storage media such as memory 910 and communications media.
  • Computer storage media, such as memory 910 includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device.
  • communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism.
  • computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se.
  • the computer storage media (memory 910 ) may be distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 908 ).
  • Platform software comprising an operating system 912 or any other suitable platform software may be provided at the computing-based device to enable application software 218 to be executed on the device.
  • the memory 910 can store executable instructions to implement the functionality of the body pose estimator 214 and the gesture recognition engine 216 .
  • the memory 910 can also provide a data store 914 , which can be used to provide storage for data used by the processors 902 when performing the game control techniques, such as for any stance templates, thresholds, parameters, screen space mapping functions, or other data.
  • computer is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the term ‘computer’ includes PCs, servers, mobile telephones, personal digital assistants and many other devices.
  • the methods described herein may be performed by software in machine readable form on a tangible storage medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium.
  • tangible (or non-transitory) storage media include disks, thumb drives, memory etc and do not include propagated signals.
  • the software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
  • a remote computer may store an example of the process described as software.
  • a local or terminal computer may access the remote computer and download a part or all of the software to run the program.
  • the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network).
  • a dedicated circuit such as a DSP, programmable logic array, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Image Analysis (AREA)

Abstract

Manual and camera-based game control is described. In one embodiment, a game system receives an image stream depicting a player of a game and also receives manual input from a hand operated controller used by the player. In an example, the manual input may be used to control an avatar and the image stream is used to recognize gestures of the player and control background objects in the game using the recognized gestures. For example, by using gestures such as head rotations the player's hands are free to continue operating a hand-held controller and the player has increased control and an immersive game play experience. In various embodiments, methods for detecting head gestures from image streams in real time are described such as tracking velocity of player's heads and motion through quadrants of an image stream.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This non-provisional utility application claims priority to U.S. provisional application Ser. No. 61/480,046, entitled “Manual and Camera-based Game Control” and filed on Apr. 28, 2011, which is incorporated herein in its entirety by reference.
  • BACKGROUND
  • Existing video and computer game control systems use hand held controllers which incorporate buttons and joysticks to enable a player to control an avatar or other objects depicted at a game display. Design of these types of hand held controllers seeks to enable fine grained control of game play in robust, easy to use and intuitive manners.
  • More recently, some computer game control systems use voice recognition technology and gesture recognition to enable a player to control a game interface. In this situation gamers have no hand held controller and are able to interact with the game in a straightforward manner without being restricted by physical user input devices such as hand held controllers.
  • The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known game control systems.
  • SUMMARY
  • The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements or delineate the scope of the specification. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
  • Manual and camera-based game control is described. In one embodiment, a game system receives an image stream depicting a player of a game and also receives manual input from a hand operated controller used by the player. In an example, the manual input may be used to control an avatar and the image stream is used to recognize gestures of the player and control background objects in the game using the recognized gestures. For example, by using gestures such as head rotations the player's hands are free to continue operating a hand-held controller and the player has increased control and an immersive game play experience. In various embodiments, methods for detecting head gestures from image streams in real time are described such as tracking velocity of player's heads and motion through quadrants of an image stream.
  • Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
  • DESCRIPTION OF THE DRAWINGS
  • The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
  • FIG. 1 is a schematic diagram of a player holding a game controller and sitting before a game apparatus having a depth camera;
  • FIG. 2 is a schematic diagram of a game system incorporating an image capture device, a hand held controller, a computing device and a display;
  • FIG. 3 is a plan view of a hand held controller;
  • FIG. 4 is a perspective view of the hand held controller of FIG. 3;
  • FIG. 5 is a schematic diagram of a display during game play;
  • FIG. 6 is a flow diagram of a method of operation of a game system;
  • FIG. 7 is a flow diagram of another method of operation of a game system;
  • FIG. 8 is a schematic diagram of a head tracking process;
  • FIG. 9 illustrates an exemplary computing-based device in which embodiments of a game system may be implemented.
  • Like reference numerals are used to designate like parts in the accompanying drawings.
  • DETAILED DESCRIPTION
  • The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
  • Although the present examples are described and illustrated herein as being implemented in a game system for two dimensional side-scrolling platformer games, the system described is provided as an example and not a limitation. As those skilled in the art will appreciate, the present examples are suitable for application in a variety of different types of game systems.
  • Reference is first made to FIG. 1, which illustrates an example control system 100 for controlling a computer game. In this example, the control system comprises both a hand-held controller and a camera-based control system. By integrating both types of control a game player experiences the benefits of both types of control system. Integration is achieved as described herein to enable fine grained control of game systems in a robust, easy to use manner which enhances the player experience. FIG. 1 shows a user 102 playing, in this illustrative example, a two dimensional side-scrolling platformer game. This type of game may be clearly depicted in two dimensional drawings; however, the methods described herein are also applicable to three dimensional games, augmented reality applications and games of other types. In some examples, camera-based control system 100 can be used to, among other things, determine body pose, bind, recognize, analyze, track, associate to a human target, provide feedback, and/or adapt to aspects of a human target such as user 102 (also referred to herein as a player). In this example one player is depicted for clarity. However, two or more players may also use the control system at the same time.
  • The camera-based control system 100 comprises a computing device 104. The computing device 104 can be a general purpose computer, gaming system or console, or dedicated image processing device. The computing device 104 can include hardware components and/or software components such that the computing device 104 can be used to execute applications such as gaming applications and/or non-gaming applications. The structure of the computing device 104 is discussed hereinafter with reference to FIG. 9.
  • The camera-based control system 100 further comprises a capture device 106. The capture device 106 can be, for example, an image sensor or detector that can be used to visually monitor one or more users (such as user 102) such that gestures performed by the one or more users can be captured, analyzed, processed, and tracked to perform one or more controls or actions within a game or application, as described in more detail below.
  • The camera-based control system 100 can further comprise a display device 108 connected to the computing device 104. The display device 108 can be a television, a monitor, a high-definition television (HDTV), or the like that can provide game or application visuals (and optionally audio) to the user 102.
  • In operation, the user 102 can be tracked using the capture device 106 such that the position, movements and size of user 102 can be interpreted by the computing device 104 (and/or the capture device 106) as controls that can be used to affect the application being executed by computing device 104. As a result, the user 102 can move his or her body (or parts of his or her body) to control an executed game or application.
  • In the illustrative example of FIG. 1, the application executing on the computing device 104 is a two dimensional side-scrolling platformer game that the user 102 is playing. In this example, the computing device 104 controls the display device 108 to provide a visual representation of a terrain comprising a landscape, tree, and the sun to the user 102. The computing device 104 also controls the display device 108 to provide a visual representation of a user avatar that the user 102 can control with his or her movements and/or by using a hand held controller 110. For example, the computing device 104 can comprise a body pose estimator that is arranged to recognize and track different body parts of the user, and map these onto the avatar. In this way, the avatar copies the movements of the user 102 such that if the user 102, for example walks in physical space, this causes the user avatar to walk in game space.
  • However, only copying user movements in game space limits the type and complexity of the interaction between the user and the game. For example, many in-game controls are momentary actions or commands, which may be triggered using button presses in traditional gaming systems. Examples of these include actions such as punch, shoot, change weapon, throw, kick, jump, and/or crouch. Such actions or commands may be controlled by recognizing that the user is performing one of these actions and triggering a corresponding in-game action, rather than merely copying the user's movements. In addition, combinations of user inputs at the hand held controller and user inputs via the camera-based control system may be used to control the game apparatus.
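  • One way to picture this is a small dispatch table from recognized player actions to momentary in-game commands. The sketch below is purely illustrative; the gesture and command names are assumptions and do not come from the patent.

```python
# Hypothetical mapping from recognised player actions to momentary in-game
# commands; the gesture and command names are assumptions, not patent text.
GESTURE_COMMANDS = {
    "punch": "attack",
    "jump": "jump",
    "crouch": "crouch",
    "head_rotation": "set_sun",   # a camera-only control of a background object
}

def dispatch(recognised_gesture, trigger_command):
    """Trigger the momentary in-game command for a recognised gesture, if any,
    rather than merely copying the player's movement onto the avatar."""
    command = GESTURE_COMMANDS.get(recognised_gesture)
    if command is None:
        return False
    trigger_command(command)
    return True

dispatch("jump", print)   # prints "jump" once, as a momentary action
```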
  • Reference is now made to FIG. 2, which illustrates a schematic diagram of the capture device 106 that can be used in the camera-based control system 100 of FIG. 1. In the example of FIG. 2 the capture device 106 is configured to capture video images with depth information. Such a capture device can be referred to as a depth camera. The depth information can be in the form of a depth image that includes depth values, i.e. a value associated with each image element of the depth image that is related to the distance between the depth camera and an item or object located at that image element. Note that the term “image element” is used to refer to a pixel, group of pixels, voxel, group of voxels or other higher level component of an image.
  • The depth information can be obtained using any suitable technique including, for example, time-of-flight, structured light, stereo image, or the like. In some examples, the capture device 106 can organize the depth information into “Z layers,” or layers that may be perpendicular to a Z-axis extending from the depth camera along its line of sight.
  • As shown in FIG. 2, the capture device 106 comprises at least one imaging sensor 200. In the example shown in FIG. 2, the imaging sensor 200 comprises a depth camera 202 arranged to capture a depth image of a scene. The captured depth image can include a two-dimensional (2-D) area of the captured scene where each image element in the 2-D area represents a depth value such as a length or distance of an object in the captured scene from the depth camera 202.
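  • As an illustration of this representation (not taken from the patent), a depth image can be held as a 2-D array of per-element depth values, and the "Z layers" mentioned above can be formed by bucketing those values. The resolution, units and layer boundaries below are assumptions.

```python
import numpy as np

# A depth image held as a 2-D array: each image element stores the distance
# from the depth camera 202 to the surface imaged there.  The 320x240
# resolution and millimetre units are assumptions for the example.
depth_image = np.zeros((240, 320), dtype=np.uint16)
depth_image[120, 160] = 2500        # a surface 2.5 m away at the image centre

# "Z layers" (slabs perpendicular to the camera's line of sight) can be formed
# by bucketing the depth values; the layer boundaries below are assumptions.
layer_edges_mm = [0, 1000, 2000, 3000, 4000]
z_layers = np.digitize(depth_image, layer_edges_mm)   # layer index per element
print(z_layers[120, 160])                             # the element falls in layer 3
```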
  • The capture device can also include an emitter 204 arranged to illuminate the scene in such a manner that depth information can be ascertained by the depth camera 202. For example, in the case that the depth camera 202 is an infra-red (IR) time-of-flight camera, the emitter 204 emits IR light onto the scene, and the depth camera 202 is arranged to detect backscattered light from the surface of one or more targets and objects in the scene. In some examples, pulsed infrared light can be emitted from the emitter 204 such that the time between an outgoing light pulse and a corresponding incoming light pulse can be detected by the depth camera and measured and used to determine a physical distance from the capture device 106 to a location on the targets or objects in the scene. Additionally, in some examples, the phase of the outgoing light wave from the emitter 204 can be compared to the phase of the incoming light wave at the depth camera 202 to determine a phase shift. The phase shift can then be used to determine a physical distance from the capture device 106 to a location on the targets or objects. In a further example, time-of-flight analysis can be used to indirectly determine a physical distance from the capture device 106 to a location on the targets or objects by analyzing the intensity of the reflected beam of light over time via various techniques including, for example, shuttered light pulse imaging.
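  • The pulse-timing and phase-shift variants described above reduce to standard time-of-flight relations. The short sketch below illustrates them; the formulas are ordinary physics rather than text from the patent, and the modulation frequency in the example is an assumption.

```python
from math import pi

C = 299_792_458.0  # speed of light in m/s

def distance_from_pulse(round_trip_time_s):
    """Pulsed time-of-flight: light travels to the target and back,
    so the one-way distance is half the round-trip path."""
    return C * round_trip_time_s / 2.0

def distance_from_phase(phase_shift_rad, modulation_hz):
    """Continuous-wave time-of-flight: the phase shift between outgoing and
    incoming light encodes the round-trip distance, modulo one wavelength."""
    wavelength_m = C / modulation_hz
    return (phase_shift_rad / (2.0 * pi)) * wavelength_m / 2.0

print(distance_from_pulse(10e-9))          # a 10 ns round trip: ~1.5 m
print(distance_from_phase(pi / 2, 30e6))   # quarter-cycle shift at 30 MHz: ~1.25 m
```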
  • In another example, the capture device 106 can use structured light to capture depth information. In such a technique, patterned light (e.g., light displayed as a known pattern such as spot, grid, or stripe pattern, which may also be time-varying) can be projected onto the scene using the emitter 204. Upon striking the surface of one or more targets or objects in the scene, the pattern becomes deformed. Such a deformation of the pattern can be captured by the depth camera 202 and then be analyzed to determine a physical distance from the capture device 106 to a location on the targets or objects in the scene.
  • In another example, the depth camera 202 can be in the form of two or more physically separated cameras that view a scene from different angles, such that visual stereo data is obtained that can be resolved to generate depth information. In this case the emitter 204 can be used to illuminate the scene or can be omitted.
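  • For the stereo arrangement, depth can be recovered from the disparity between the two views using the standard pinhole-stereo relation Z = f·B/d. The sketch below illustrates this; the focal length and baseline values are assumptions.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Standard pinhole-stereo relation Z = f * B / d: a feature seen
    `disparity_px` pixels apart in the two views lies at depth Z."""
    return focal_length_px * baseline_m / disparity_px

# Assumed values: 600 px focal length, 7.5 cm baseline between the cameras.
# A 20-pixel disparity then places the surface at 2.25 m from the rig.
print(depth_from_disparity(20.0, 600.0, 0.075))   # 2.25
```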
  • In some examples, in addition or alternative to the depth camera 202, the capture device 106 can comprise a video camera, which is referred to as an RGB camera 206. The RGB camera 206 is arranged to capture sequences of images of the scene at visible light frequencies, and can hence provide images that can be used to augment the depth images. In some examples, the RGB camera 206 can be used instead of the depth camera 202. The capture device 106 can also optionally comprise a microphone 207 or microphone array (which can be directional and/or steerable), which is arranged to capture sound information such as voice input from the user and can be used for speech recognition.
  • The capture device 106 shown in FIG. 2 further comprises at least one processor 208, which is in communication with the imaging sensor 200 (i.e. depth camera 202 and RGB camera 206 in the example of FIG. 2), the emitter 204, and the microphone 207. The processor 208 can be a general purpose microprocessor, or a specialized signal/image processor. The processor 208 is arranged to execute instructions to control the imaging sensor 200, emitter 204 and microphone 207 to capture depth images, RGB images, and/or voice signals. The processor 208 can also optionally be arranged to perform processing on these images and signals, as outlined in more detail hereinafter.
  • The capture device 106 shown in FIG. 2 further includes a memory 210 arranged to store instructions for execution by the processor 208, images or frames of images captured by the depth camera 202 or RGB camera 206, or any other suitable information, images, or the like. In some examples, the memory 210 can include random access memory (RAM), read only memory (ROM), cache, Flash memory, a hard disk, or any other suitable storage component. The memory 210 can be a separate component in communication with the processor 208 or integrated into the processor 208.
  • The capture device 106 also comprises an output interface 212 in communication with the processor 208 and is arranged to provide data to the computing device 104 via a communication link. The communication link can be, for example, a wired connection (such as USB, Firewire, Ethernet or similar) and/or a wireless connection (such as WiFi, Bluetooth or similar). In other examples, the output interface 212 can interface with one or more communication networks (such as the internet) and provide data to the computing device 104 via these networks.
  • A controller 110 is also provided as part of the capture device. The controller may be a hand held controller as depicted schematically in FIG. 1 or may be integral with another larger device that is not hand held. The controller comprises a plurality of user input devices such as buttons, joysticks, touch pads, switches and enables a player to make input to a game system. User input data is sent from the controller to the computing device 104 by a wired connection and/or a wireless connection.
  • The computing device 104 executes a number of functions relating to the camera-based gesture recognition, such as an optional body pose estimator 214 and a gesture recognition engine 216. The body pose estimator 214 is arranged to use computer vision techniques to detect and track different body parts of the user. An example of a body pose estimator is given in US patent publication US-2010-0278384-A1 “Human body pose estimation” filed 20 May 2009. The body pose estimator 214 can provide an output to the gesture recognition engine in the form of a time-series of data relating to the user's body pose. This can be in the form of a fully tracked skeletal model of the user, or a more coarse identification of the visible body parts of the user. For example, these time-series sequences can comprise data relating to a time-varying angle between at least two body parts of the user, a rate of change of angle between at least two body parts of the user, a motion velocity for at least one body part of the user, or a combination thereof. The different types of data (angles between certain body parts, velocities, etc.) are known as “features”. In other examples, the body pose estimator 214 can derive other data sequences (i.e. other features) from the changing pose of the user over time. In further examples, the gesture recognition engine 216 can utilize input (i.e. features) derived from different sources other than the body pose estimator. Application software 218 can also be executed on the computing device 104 and controlled using the gestures. The application software is arranged to control display of the game at a display 220.
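  • The "features" mentioned above (angles between body parts, rates of change, velocities) can be derived from a tracked skeleton with ordinary vector geometry. The sketch below, using NumPy, is illustrative only; the joint positions and frame rate are assumptions.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (radians) at joint b between body segments b->a and b->c."""
    u, v = a - b, c - b
    cos_angle = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

def speeds(positions, dt):
    """Per-frame speed of one body part from an (N, 3) position time series."""
    return np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt

# Example with assumed 3-D joint positions (metres) and a 30 Hz frame rate:
shoulder = np.array([0.0, 1.4, 2.0])
elbow    = np.array([0.3, 1.2, 2.0])
wrist    = np.array([0.5, 1.0, 2.0])
print(np.degrees(joint_angle(shoulder, elbow, wrist)))   # elbow angle in degrees

wrist_track = np.array([[0.50, 1.00, 2.0], [0.52, 1.02, 2.0], [0.55, 1.05, 2.0]])
print(speeds(wrist_track, dt=1 / 30))                    # wrist speed per frame, m/s
```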
  • FIG. 3 is a plan view of an example hand held controller 110. It has a generally winged shape, with each wing or shoulder sized and shaped to be clasped in one hand. The controller comprises a housing supporting a plurality of buttons, switches and joysticks, as now described in more detail. However, this is an example only and other types of controller 110 may be used.
  • Four digital action buttons 302 are provided on the right face of the controller, comprising a green A button, red B button, blue X button and amber Y button. Two analog joysticks 310 and 312 are provided. These joysticks may also be depressed or clicked in to activate a digital button beneath each joystick. Digital start 306, back 308 and guide 304 buttons are centrally positioned on the housing. For example, the guide button is used to turn on the controller and access a menu.
  • FIG. 4 is a perspective view of the controller and shows a left bumper 406 and a right bumper 404, each of which is a button that may be pressed by the user. A left trigger 400 and a right trigger 402, both of which are analog, are provided on the underside of the controller (visible in FIG. 4). A connector 408 may be provided to enable a wired connection to the computing device 104.
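  • Purely as an illustration of the manual inputs just described, one possible controller state record is sketched below; the field names and value ranges are assumptions for the example, not an interface defined by this disclosure.

```python
from dataclasses import dataclass

# Sketch: a snapshot of the manual inputs available on the example controller
# of FIG. 3 and FIG. 4 (digital A/B/X/Y, start/back/guide, bumpers, clickable
# analog sticks and analog triggers). Names and ranges are assumptions.

@dataclass
class ControllerState:
    a: bool = False
    b: bool = False
    x: bool = False
    y: bool = False
    start: bool = False
    back: bool = False
    guide: bool = False
    left_bumper: bool = False
    right_bumper: bool = False
    left_trigger: float = 0.0          # analog, 0.0..1.0
    right_trigger: float = 0.0         # analog, 0.0..1.0
    left_stick: tuple = (0.0, 0.0)     # analog axes, -1.0..1.0
    right_stick: tuple = (0.0, 0.0)
    left_stick_click: bool = False     # digital button beneath each stick
    right_stick_click: bool = False
```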
  • FIG. 5 is a schematic diagram of a display during gameplay, for example at display screen 108. In this example the game is a two dimensional side-scrolling platformer game. A player controls an avatar 500 using a manual controller 110 in order to proceed through an environment, which in this example comprises a landscape in daylight due to the presence of a sun 504. In this example a plurality of enemies 502 are shown blocking the path of the avatar 500. Weapons are represented as clouds of danger around some of these enemies. The player may advance in the game by avoiding the enemies, such as by jumping over them. In this example it is difficult for the player to succeed because of the great number of enemies in the avatar's path.
  • In addition to controlling the avatar 500 using a manual controller, a player is able to control the game system using the camera-based control system. In the example of FIG. 5 a player may make a particular gesture in order to control the game system. For example, a player may rotate his or her head in order to make the sun 504 set and the enemies 502 sleep. Once the enemies sleep the player may advance the avatar 500 without attack from the enemies.
  • FIG. 6 is a flow diagram of a method of operation of a game system. A game is displayed 600 such as at display screen 108. An image stream is received 602 depicting at least one player of the game. For example, the image stream is obtained from the image capture system 106 and may comprise depth images and color images. Gesture recognition 606 is carried out on the image stream, for example, using body pose estimator 214 and gesture recognition engine 216. The output of the gesture recognition process is used to influence 608 the course of the game. In addition, manual input from a controller 604 is used to influence the course of the game. The game is displayed 610 and the display continues according to rules of the game in combination with the player's manual input, the image stream and other optional factors such as time, random elements, and other factors. By enabling hybrid control of the game system by both manual player input and gesture input the player has increased control of the game system and is immersed in the game experience.
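  • A compact sketch of this hybrid control loop is given below. The capture, controller, recognize, game and display interfaces are hypothetical placeholders used only to show how gesture input and manual input can both influence the displayed game on each pass through the loop.

```python
# Sketch of the method of FIG. 6 under assumed interfaces: capture.read()
# yields one image frame, recognize(frame) returns a gesture label or None,
# controller.poll() returns the current manual input, and game/display expose
# simple update and render methods. All names are illustrative assumptions.

def game_loop(game, capture, controller, recognize, display):
    while not game.finished:
        frame = capture.read()              # receive image stream (602)
        gesture = recognize(frame)          # gesture recognition (606)
        manual = controller.poll()          # manual input from controller (604)

        if gesture is not None:
            game.apply_gesture(gesture)     # gesture influences the game (608)
        game.apply_manual_input(manual)     # manual input influences the game
        game.step()                         # game rules, timers, random elements
        display.render(game)                # continue the display (610)
```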
  • In some examples the camera-based control of the game system is used to control only background objects in the avatar's environment. Background objects are any objects in the game without an associated physics model. For example, the avatar 500 and enemies 502 of FIG. 5 are not background objects but the sun 504 is a background object. In some examples, the camera-based control of the game system is available only at particular states of the game and only using specified gestures. For example, the specified gestures may be gestures which may be performed by a player at the same time as holding a manual controller. A non-exhaustive list of examples of such gestures is: full head rotation, partial head rotation, leg movement, foot movement, knee movement.
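  • One simple way to gate camera-based control in this manner is sketched below, assuming hypothetical names for the game-state flag and the gesture labels; the whitelist mirrors the example gestures listed above.

```python
# Sketch: camera-based control is only honoured for specified gestures that can
# be made while holding the controller, and only when the current game state
# exposes a controllable background object. Names are assumptions.

HANDS_FREE_GESTURES = {"full_head_rotation", "partial_head_rotation",
                       "leg_movement", "foot_movement", "knee_movement"}

def gesture_allowed(game_state, gesture):
    """Return True if the recognized gesture may influence the game right now."""
    return (game_state.background_object_active   # e.g. the sun is highlighted
            and gesture in HANDS_FREE_GESTURES)
```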
  • FIG. 7 is a flow diagram of a method of operation of a game system using both manual and gesture-based control. An avatar is displayed in an environment with at least one background object. For example, the background object may be a sun, a tree, a building, a clock. Manual input is received 702 from a player of the game. The manual input is used to control 704 the avatar. A specified background object such as a sun is highlighted if a particular game state is reached 706. The game state may be a scripted movement in the game or may be a state which is activated by the player using a manual input at the controller. To highlight the background object, the sun may change color and/or pulsate. However, this is not essential. Other ways of highlighting the background object may be used. Motion of at least part of the player's body is tracked 708 using the image stream and gesture recognition. Motion of the specified object (such as the sun) is displayed 710 according to the tracked motion of the player's body or body part. For example, the display of the sun in the game moves in a manner matching movement of the player's head. This provides an indication to the player that he or she has control of the background object by moving his or her body or body part. If a specified gesture is detected 712 then the game state is changed 714 by changing the state of at least the specified object. For example, the sun sets and the enemies sleep.
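  • The flow of FIG. 7 could be approximated, purely as an illustrative sketch, by a routine such as the one below; tracker, sun, enemies and the gesture label are hypothetical stand-ins for whatever tracking and rendering interfaces a concrete game system provides.

```python
# Sketch of the background-object phase of FIG. 7: highlight the object (706),
# mirror the player's tracked head motion on it (708, 710), and change the game
# state when the specified gesture is detected (712, 714). Names are assumed.

def map_head_to_screen(head_xy, screen_w=1280, screen_h=720):
    """Map a normalised head position (0..1, 0..1) onto screen coordinates."""
    return (head_xy[0] * screen_w, head_xy[1] * screen_h)

def background_object_phase(game, tracker, sun, enemies, display):
    sun.highlight()                                   # e.g. change colour / pulsate
    while True:
        head = tracker.head_position()                # tracked body part (708)
        sun.move_to(map_head_to_screen(head))         # object mirrors the player (710)
        display.render(game)
        if tracker.gesture_detected("full_head_rotation"):   # specified gesture (712)
            sun.set_below_horizon()                   # change object state (714)
            for enemy in enemies:
                enemy.sleep()
            break
```

Mirroring the player's motion on the object before the gesture completes is what gives the player the feedback, described above, that the background object is under his or her control.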
  • In the examples described above one player operates the game system. However, it is also possible to have a plurality of players playing the game at the same time. In this case the image stream depicts a plurality of players and the players are segmented from the image stream as part of the gesture recognition process. In other examples a plurality of players play the game with each player having his or her own game system with camera-based and manual control. Combinations of these approaches are also possible.
  • In an example, a head rotation is detected by the gesture recognition system as now described with reference to FIG. 8. The image stream 800 is divided into four quadrants 802 “cut” corner to corner and meeting in the center. The player's head position is typically in the top quadrant and this is illustrated at 804 in FIG. 8. If the player's head moves through each of the quadrants in a clockwise direction and in a specified time, then a particular gesture is detected. This is illustrated in FIG. 8 by the head positions 804, 806, 808 and 810. Other ways of detecting a head movement gesture may be used. For example taking a point in real-world space below the player's head and tracking the player's head position relative to that; or detecting horizontal motion of the player's head and assuming correct vertical motion. In other examples gestures of the head are detected by monitoring velocity of motion of the head in the image stream and comparing that to a threshold.
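  • As an illustrative sketch only (the coordinate conventions and timing are assumptions), the quadrant test of FIG. 8 might be implemented as follows: the image is cut corner to corner into top, right, bottom and left triangles meeting at the centre, and a head rotation is reported when the head is seen in all four quadrants in clockwise order within a time limit. The velocity-based variant mentioned above would instead compare the frame-to-frame head displacement divided by the frame interval against a threshold.

```python
import time

# Sketch of the quadrant-based head rotation test of FIG. 8. Assumes image
# coordinates with the origin at the top-left and y growing downwards.

def quadrant(x, y, width, height):
    """Classify a point into the 'top', 'right', 'bottom' or 'left' triangle."""
    nx = (x - width / 2) / (width / 2)
    ny = (y - height / 2) / (height / 2)
    if abs(ny) >= abs(nx):
        return "top" if ny < 0 else "bottom"
    return "right" if nx > 0 else "left"

class HeadRotationDetector:
    CLOCKWISE = ["top", "right", "bottom", "left"]

    def __init__(self, time_limit=2.0):
        self.time_limit = time_limit   # seconds allowed for a full rotation
        self.visited = []
        self.start_time = None

    def update(self, head_x, head_y, width, height, now=None):
        """Feed one frame's head position; return True when a full clockwise
        rotation has been completed within the time limit."""
        now = time.monotonic() if now is None else now
        q = quadrant(head_x, head_y, width, height)
        if not self.visited:
            if q == "top":                         # rotation starts at the top
                self.visited = ["top"]
                self.start_time = now
            return False
        if now - self.start_time > self.time_limit:
            self.visited, self.start_time = [], None   # too slow: start over
            return False
        if q == self.CLOCKWISE[len(self.visited) % 4]:
            self.visited.append(q)                 # next expected quadrant reached
        if len(self.visited) == 4:
            self.visited, self.start_time = [], None
            return True
        return False
```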
  • FIG. 9 illustrates various components of an exemplary computing device 104 which may be implemented as any form of a computing and/or electronic device, and in which embodiments of the above-described game control techniques may be implemented.
  • Computing device 104 comprises one or more processors 902 which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control a game system. In some examples, for example where a system on a chip architecture is used, the processors 902 may include one or more fixed function blocks (also referred to as accelerators) which implement a part of the game control methods in hardware (rather than software or firmware).
  • The computing-based device 104 also comprises an input interface 904 arranged to receive input from one or more devices, such as the capture device 106 of FIG. 2 and/or the controller of FIG. 3 and FIG. 4. An output interface 906 is also provided and arranged to provide output to, for example, a display system integral with or in communication with the computing-based device (such as display device 108 or 220). The display system may provide a graphical user interface, or other user interface of any suitable type although this is not essential. A communication interface 908 may optionally be provided, which can be arranged to communicate with one or more communication networks (e.g. the internet).
  • The computer executable instructions may be provided using any computer-readable media that is accessible by computing based device 104. Computer-readable media may include, for example, computer storage media such as memory 910 and communications media. Computer storage media, such as memory 910, includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Although the computer storage media (memory 910) is shown within the computing-based device 104 it will be appreciated that the storage may be distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 908).
  • Platform software comprising an operating system 912 or any other suitable platform software may be provided at the computing-based device to enable application software 218 to be executed on the device. The memory 910 can store executable instructions to implement the functionality of the body pose estimator 214 and the gesture recognition engine 216. The memory 910 can also provide a data store 914, which can be used to provide storage for data used by the processors 902 when performing the game control techniques, such as for any stance templates, thresholds, parameters, screen space mapping functions, or other data.
  • The term ‘computer’ is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the term ‘computer’ includes PCs, servers, mobile telephones, personal digital assistants and many other devices.
  • The methods described herein may be performed by software in machine readable form on a tangible storage medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. Examples of tangible (or non-transitory) storage media include disks, thumb drives, memory etc and do not include propagated signals. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
  • This acknowledges that software can be a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
  • Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that by utilizing conventional techniques known to those skilled in the art that all, or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.
  • Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
  • Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
  • It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to ‘an’ item refers to one or more of those items.
  • The steps of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the spirit and scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
  • The term ‘comprising’ is used herein to mean including the method blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
  • It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this specification.

Claims (20)

1. A method of controlling a computer game system comprising:
receiving a stream of images from an image capture device depicting at least one player of a game;
recognizing a gesture of the player by analyzing the stream of images;
receiving player manual input from a hand operated controller operated by the player;
displaying a computer game comprising at least two objects at a display being viewed by the at least one player;
controlling the display of one of the objects of the computer game on the basis of the recognized gesture and controlling the display of the other object of the computer game on the basis of the manual input from the controller.
2. A method as claimed in claim 1 wherein displaying the computer game comprises displaying an avatar, as one of the at least two objects, in an environment comprising at least one background object, being the other object, and wherein controlling the display of the computer game comprises using the recognized gesture to control the background object.
3. A method as claimed in claim 2 wherein controlling the display of the background object is achieved without an associated physics model.
4. A method as claimed in claim 1 comprising storing details of specified gestures for use when recognizing the gesture and wherein the specified gestures may be performed by a player at the same time as making manual input at a hand operated controller.
5. A method as claimed in claim 2 wherein controlling the display of the background object using the recognized gesture is only possible at scripted moments of the game or on manual input from the player.
6. A method as claimed in claim 1 wherein displaying the computer game comprises displaying an avatar as one of the objects in an environment and wherein controlling the display of the computer game comprises controlling the avatar using the manual input from the controller and controlling the other object in the environment using the recognized gesture.
7. A method as claimed in claim 1 wherein recognizing the gesture comprises detecting a head of the player and tracking motion of the head in the image stream.
8. A method as claimed in claim 7 comprising dividing each image of the image stream into four quadrants and recognizing a head rotation when the head is detected in each of the four quadrants in turn.
9. A method as claimed in claim 8 wherein recognizing the head rotation comprises using a time limit.
10. A method as claimed in claim 7 which comprises recognizing a head rotation when motion of the head exceeds a specified velocity.
11. A method as claimed in claim 7 wherein recognizing a head rotation comprises tracking only horizontal motion of the head.
12. A computer game system comprising:
an image capture device arranged to receive a stream of images depicting at least one player of a game;
a gesture recognition engine arranged to recognize a gesture of the player by analyzing the stream of images;
an input arranged to receive player manual input from a hand operated controller;
an output arranged to display a computer game comprising at least two objects at a display being viewed by the at least one player;
a processor arranged to control the display of one of the objects of the computer game on the basis of the recognized gesture and to control the display of the other object of the computer game on the basis of the manual input from the controller.
13. A system as claimed in claim 12 wherein the processor is arranged to display an avatar, as one of the at least two objects, in an environment comprising at least one background object being the other object, and wherein controlling the display of the computer game comprises using the recognized gesture to control the background object.
14. A system as claimed in claim 12 wherein the gesture recognition engine and the input arranged to receive player manual input are arranged to operate substantially simultaneously.
15. A system as claimed in claim 12 comprising a memory storing details of specified gestures for use when recognizing the gesture and wherein the specified gestures may be performed by a player at the same time as making manual input at a hand operated controller.
16. A system as claimed in claim 13 wherein the processor is arranged to control the display of the background object using the recognized gesture only at scripted moments of the game or on specified manual input from the player.
17. A system as claimed in claim 12 wherein the gesture recognition engine is arranged to detect a head of the player and to track motion of the head in the image stream.
18. A method of controlling a computer game system comprising:
receiving a stream of images from an image capture device depicting at least one player of a game;
recognizing a gesture of the player by analyzing the stream of images, the gesture not requiring use of at least one of the player's hands;
receiving player manual input from a hand operated controller operated by the player at the same time as recognizing the gesture;
displaying a computer game, comprising an avatar in an environment having at least one background object;
controlling the display of the avatar using the manual input from the controller and controlling the display of the background object in the environment using the recognized gesture.
19. A method as claimed in claim 18 wherein the gesture comprises a head movement and wherein recognizing the gesture comprises monitoring a velocity of the player's head in the stream of images.
20. A method as claimed in claim 18 wherein the gesture comprises a head movement and wherein recognizing the gesture comprises monitoring motion of the player's head in a single direction.