US20070021207A1 - Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method - Google Patents

Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method Download PDF

Info

Publication number
US20070021207A1
US20070021207A1 US11/349,431 US34943106A
Authority
US
United States
Prior art keywords
player
image
players
frame
offense
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/349,431
Inventor
Ned Ahdoot
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/189,176 external-priority patent/US20070021199A1/en
Application filed by Individual filed Critical Individual
Priority to US11/349,431 priority Critical patent/US20070021207A1/en
Publication of US20070021207A1 publication Critical patent/US20070021207A1/en
Abandoned legal-status Critical Current

Links

Images

Classifications

    • A HUMAN NECESSITIES
      • A63 SPORTS; GAMES; AMUSEMENTS
        • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
          • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
            • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
          • A63B2220/00 Measuring of physical parameters relating to sporting activity
            • A63B2220/80 Special sensors, transducers or devices therefor
              • A63B2220/806 Video cameras
          • A63B2244/00 Sports without balls
            • A63B2244/10 Combat sports
        • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F13/20 Input arrangements for video game devices
              • A63F13/21 Input arrangements characterised by their sensors, purposes or types
                • A63F13/213 Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
            • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
              • A63F13/42 Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
                • A63F13/424 Processing input control signals involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
                • A63F13/428 Processing input control signals involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
              • A63F13/44 Processing input control signals involving timing of operations, e.g. performing an action within a time slot
            • A63F13/45 Controlling the progress of the video game
              • A63F13/46 Computing the game score
            • A63F13/55 Controlling game characters or game objects based on the game progress
              • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
            • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
              • A63F13/65 Generating or modifying game content automatically by game devices or servers from real world data, e.g. measurement in live racing competition
              • A63F13/67 Generating or modifying game content adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
            • A63F13/80 Special adaptations for executing a specific game genre or game mode
              • A63F13/833 Hand-to-hand fighting, e.g. martial arts competition
          • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
            • A63F2300/10 Features characterized by input arrangements for converting player-generated signals into game device control signals
              • A63F2300/1087 Input arrangements comprising photodetecting means, e.g. a camera
                • A63F2300/1093 Photodetecting means using visible light
            • A63F2300/60 Methods for processing data by generating or executing the game program
              • A63F2300/69 Involving elements of the real world in the game world, e.g. measurement in live races, real video
            • A63F2300/80 Features specially adapted for executing a specific type of game
              • A63F2300/8029 Fighting without shooting
    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • This disclosure relates generally to games of interactive play between two or more entities, including individuals and computer-simulated opponents; i.e., the invention may be used by two individuals, by an individual and a simulation, or even between two simulations for demonstration purposes. More particularly, it relates to a computer-controlled interactive movement and contact simulation game in which a player mutually interacts with a computer-generated image that responds to the player's movement in real time.
  • the device is responsive to head or hand movements in order to move a dampened substance contained within a confined tube past one or more sensors. Light passing through the tube is interrupted by the movement of the dampened substance.
  • the intended use of the device, as disclosed, is changing the perspective shown on a video display.
  • Goo U.S. Pat. No. 4,817,950 teaches a video game controller for surfboarding simulation, and of particular interest is the use of a unique attitude sensing device to determine the exact position of the surfboard.
  • the attitude sensing device employs a plurality of switch closures to determine the tilt angle of the platform and open and close a plurality of electrical contacts enabling a signal input to a computer control unit.
  • the device is a hand-held, one-dimensional torque feedback device used to manipulate computer generated visual information and associated torque forces.
  • Kosugi et al. U.S. Pat. No. 5,229,756 disclose a combination of components forming an interactive image control apparatus.
  • the main components of the device are a movement detector for detecting movement, a judging device for determining the state of the operator on the basis of the movement signal provided by the movement detector, and a controller that controls the image in accordance with the movement signal and the judgment of the judging device.
  • the movement detector, judging device and the controller cooperate so as to control the image in accordance with the movement of the operator.
  • Kosugi requires that a detection means be attached adjacent to the operator's elbow and knee joints so as to measure the bending angle of the extremity and thus more accurately respond to the operator's movements.
  • the present invention employs a system in which the position of the player is continually monitored.
  • the present invention takes the approach of simulating a combat adversary image while allowing the player to exercise every part of his body in combat with the image; this is the final and most important objective.
  • the present invention fulfills these needs and provides further related advantages as described in the following summary.
  • a best mode embodiment of the present invention provides a method for engaging a player or a pair of players in a motion-related game, including the steps of: attaching plural colored elements onto selected portions of the player(s); processing a video stream from a digital camera to separately identify the positions, velocities and accelerations of the several colored elements over time; providing the data stream from the video recorder to a data processor; calculating the distance between the player and the camera as a function of time; and predicting the motions of the players so as to provide anticipatory motions of a virtual image in response thereto.
  • a primary objective of the present invention is to provide an apparatus and method of use of such apparatus that yields advantages not taught by the prior art.
  • Another objective of the invention is to provide a game for simulated combat between two individuals.
  • a further objective of the invention is to provide a game for simulated combat between an individual and a simulated second player of the game.
  • a further objective of the invention is to provide a game for simulated combat between an individual carrying a sport instrument in hand and simulated offense and defense players of the game.
  • a still further objective of the invention is to provide the virtual image to anticipate and predict the movement of the real player and to change the virtual image accordingly.
  • FIG. 1 is a perspective view of the present invention as seen from behind a projection screen transparent to a camera mounted there behind so as to record the motions of a first player moving in front of the screen, the screen being translucent to the first player;
  • FIG. 2 is a perspective view thereof from the front of the screen showing the first player at left being recorded from the camera mounted behind the screen wherein the player in front of the screen is able to view an image of a second player projected onto the screen from a projector behind the screen;
  • FIG. 3 is a perspective view thereof showing the first and the second players in separate locations with video images of each projected onto a screen at the other player's location;
  • FIG. 4 is a logic diagram of the method of the invention showing event detection and prediction processing steps;
  • FIG. 5 is a continuation of the logic diagram of FIG. 4 showing player offense event processing steps;
  • FIG. 6 is a continuation of the logic diagram of FIG. 4 showing player defense event processing steps.
  • FIG. 7 is a flow chart showing an associative address generator of the invention.
  • one or two players take part in a game involving physical movements.
  • Such games may comprise simulated combat, games of chance, competition, cooperative engagement, and similar subjects.
  • the present invention is ideal for use in contact games of hand-to-hand combat such as karate, aikido, kick-boxing and American style boxing where the players have contact but are not physically intertwined as they are in wrestling, Judo and similar sports.
  • a combat game is described, but such is not meant to limit the range of possible uses of the present invention.
  • a first player 5 engages in simulated combat with a second player's image 7 ′ projected by video projector 40 onto a screen 20 placed in front of the player 5 .
  • the image 7 ′ is computer generated using the same technology as found in game arcades and the like and which is well known in the art.
  • two live players 5 and 7 stand in front of two separate screens 20 and 22 and engage in mutual simulated combat against recorded and projected images 5 ′ and 7 ′ of players 5 and 7 respectively. This avoids physical face-to-face combat where one of the players might receive injury.
  • the images projected onto the screens 20 and 22 are not computer generated but are real-time projections of video recordings taken as shown in FIG. 1 using cameras 10 .
  • player 5 is positioned in front of rear projection screen 20 .
  • One or more video cameras 10 are positioned behind screen 20 .
  • the camera 10 is able to view player 5 through the screen 20 which is transparent from the position of camera 10 , and record the player's movements.
  • a miniature vidicon CCTV camera (not shown) may be mounted on the front of screen 20 , or may be operated through a small hole in the screen 20 .
  • the screen 20 may be supported by a screen stand (not shown) or it may be mounted on a wall 25 as shown in the figures.
  • Simulated image 7 ′ is visible to the player 5 as shown in FIG. 2 .
  • both the camera 10 and the projector 40 are operated at identical rates (frames per second) but each records a frame and blanks for an equal time interlacing the two functions in time so that one is operating when the other is blanking and vice-versa.
  • player 5 , positioned at the front of the screen 20 , sees the projected image 7 ′ while the camera 10 sees player 5 and not the projected image 7 ′.
  • projection screen 20 is transparent to camera 10 mounted behind it so as to enable recording the motions of first player 5 moving in front of screen 20 .
  • screen 20 is translucent to first player 5 so that he sees only the projected image 7 ′ and not the camera 10 or projector 40 .
  • players 5 and 7 each wears colored bands as best seen in FIG. 2 .
  • player 5 has a band 51 secured at his forehead, above each elbow 52 , on each wrist 53 , around the waist 54 , above each knee 55 and on each ankle 56 .
  • Each of these 10 bands is a different color.
  • Further bands may be placed in additional locations on the players, but the 10 bands shown in FIG. 2 , as described, are able to achieve the objectives of the instant invention as will be shown.
  • the image 5 ′ of the player 5 as recorded by camera 10 is converted into a digital electronic primary signal. This primary signal is split into 10 derivative secondary signals by color filtering the primary signal for each of the ten colors.
  • Each of the secondary signals contains three pieces of information with respect to each frame of the video recording: a location “x” (left to right), a location “y” (top to bottom) in the camera's field of view, and finally, a number of pixels “p” subtended by the color in the field of view.
  • each secondary signal is a representation of only the color band to which it has been filtered and all other aspects of the recorded image are invisible, i.e., not present in that secondary signal.
  • each frame of the recorded image yields 30 pieces of information, i.e., for each of the ten bands, an x, y and p value.
  • the x and y information locates the band in the plane of the field of view of the camera, while the p information approximates the location in the “z” direction, i.e., the distance from the camera lens to the band.
  • the z coordinate is approximated by taking the value of p for each band at time zero as corresponding to the nominal distance z; when the numerical value of p drops in a subsequent frame of the recording, the distance z is increased, and when the numerical value of p increases, the value of z is decreased.
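The extraction of x, y and p by color filtering, and the z approximation above, can be sketched as follows. This is a minimal illustration assuming NumPy RGB frames; the color tolerance and the inverse-square distance law (apparent area ~ 1/z²) are assumptions not taken from the patent, which states only the monotonic relationship between p and z.

```python
import numpy as np

def band_xyp(frame_rgb, band_color, tol=30):
    """Color-filter one band out of a frame and return (x, y, p):
    the band's centroid in the image plane and the number of pixels
    it subtends, as described in the text."""
    diff = np.abs(frame_rgb.astype(int) - np.asarray(band_color, dtype=int))
    mask = np.all(diff <= tol, axis=-1)
    p = int(mask.sum())
    if p == 0:
        return None  # band hidden in this frame
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean()), p

def estimate_z(p, p0, z0):
    """Distance from pixel count: p below its time-zero value p0 means
    z has grown; p above p0 means z has shrunk. The square-root law is
    an assumed mapping, not the patent's."""
    return z0 * (p0 / p) ** 0.5
```

A band that fills a quarter of its initial pixel count is thus placed at twice its nominal distance.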
  • Computer 60 processes the locations of all ten bands for each frame of the recording in real time, i.e., there is no appreciable lag between the computer's numeric calculation of the locations of the bands and the actual locations thereof.
  • the player 5 stands facing the screen 20 with feet a comfortable distance apart, legs straight, and arms hanging at the player's sides.
  • Each of the ten colored bands 51 - 56 is visible to the camera 10 and, with a simple set of anatomical rules, the computer 60 is able to compose a model of the player's form that accurately represents the player's physical position and anatomical orientation at that moment, including approximations of arm and leg length, height, and so on.
  • as a band moves, its image on the recording plane of camera 10 moves accordingly, so that the computer 60 is able to plot the motion trajectory of the band in three-space using coordinates x, y and p.
  • when a band disappears, i.e., is hidden behind another part of the player's anatomy, as is the case in FIG. 2 where band 52 on the player's right arm is hidden by his body, the trajectory of the band is approximated taking into account its locus of locations in preceding frames.
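A minimal sketch of that approximation, assuming a constant-velocity extrapolation from the last two observed frames (the patent does not specify the model used):

```python
def extrapolate_hidden(history, steps=1):
    """Approximate an occluded band's position from its locus of
    locations in preceding frames.

    history: per-frame (x, y, z) positions, most recent last.
    With fewer than two observations the last known position is kept.
    """
    if len(history) < 2:
        return history[-1]
    (x1, y1, z1), (x2, y2, z2) = history[-2], history[-1]
    # First difference over the last two frames gives a per-frame velocity.
    return (x2 + steps * (x2 - x1),
            y2 + steps * (y2 - y1),
            z2 + steps * (z2 - z1))
```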
  • the opponent's image 7 ′ is generated and projected onto the screen 20 .
  • trajectories of the player's bands 51 - 56 enable the computer to model the player's motion.
  • the computer is programmed to move the image 7 ′ to attack and defend accordingly.
  • the image 7 ′ is projected with three dimensional realism by any one of the well known techniques for accomplishing this as reported in the art.
  • One technique for accomplishing this is the projection of two orthogonally polarized and slightly separated identical projected images which appear fuzzy to the unaided eye on screen 20 .
  • the image 7 ′ appears in three-dimensional realism. Calibration of the image 7 ′ enables a virtual plane of contact between the player 5 and the image 7 ′ where this plane of contact is in front of the screen 20 . Please see the virtual three-dimensional image shown in FIG. 2 where player 5 is blocking a kick from the image 7 ′ of player 7 .
  • players 5 and 7 stand facing their respective screens 20 , each with feet a comfortable distance apart, legs slightly bent, and arms hanging at their sides.
  • Each of the ten colored bands 51 - 56 on each of the players 5 and 7 is visible to the respective camera 10 , so that the computers 60 are able to compose mathematical models of the positions of each of the players 5 and 7 that accurately represent each player's physical position and anatomical orientation at that moment relative to the other player.
  • the vertical planes represented by the screens 20 and 22 represent the same plane in the combat three-space of the game. Therefore, when one player moves a fist, elbow, knee or foot toward his screen, the computers 60 calculate that motion as projecting toward the other player.
  • the computers 60 calculate contacts between the players in offensive and defensive moves when their respective body parts occupy the same space coordinates.
  • the players initially and nominally stand slightly more than an arm's length away from their screen, i.e., mathematically from their opponent. Points are awarded to each of the players for successful offensive and defensive moves.
  • the images are preferably projected with three-dimensional realism by use of the well known polarized dual images technique, so that each player sees the illusion of the opponent player's image projecting toward him from the screen 20 or 22 .
  • the present disclosure teaches an improved video frame processing method that enables the combative motions between two distant players 5 and 7 , as described above and shown in FIGS. 1-3 , to be calculated and compared with respect to each other.
  • This method is described as follows and is as shown in FIGS. 4-7 .
  • a stream of information from the video recorder frames is processed. Frame by frame, each of the 30 coordinate data elements x, y, and p is recorded, with z being calculated, so that the position of all tracked parts of the players is known for each frame; using a simple physical model of the human body, a mathematical model of each player's position in three-space is determined.
  • the changes in the locations of the player's body parts from frame to frame enable the calculation of the velocity and acceleration of these parts by taking the first and second differentials of the change in position. Furthermore, at each frame, a prediction of the positions, velocities and accelerations of each of the body parts is made. These predictions use data from multiple frames, and the calculations continue until the number of frames is at least equal to a specified set point.
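The first- and second-difference calculation and the per-frame prediction described above can be sketched as follows, using a constant-acceleration step as an assumed predictor (the patent does not give the exact prediction formula):

```python
def first_differences(series, dt):
    """First differences of a per-frame scalar series (one coordinate
    of one band), giving per-frame rates of change."""
    return [(b - a) / dt for a, b in zip(series, series[1:])]

def predict_next(positions, dt):
    """Velocity is the first differential of position, acceleration
    the second; the next-frame position is predicted here with a
    constant-acceleration step."""
    v = first_differences(positions, dt)           # velocities
    a = first_differences(v, dt)                   # accelerations
    v_next = v[-1] + (a[-1] * dt if a else 0.0)
    return positions[-1] + v_next * dt
```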
  • the computer generated image is modified so as to defend against an offensive move by player 5 or to initiate a new offensive move from an inventory of such moves.
  • the logical steps of the present method are shown in FIGS. 4 through 6 and comprise determining incoming offense information, calculating the player's new coordinates, determining whether the defense or offense is complete, and comparing the player's offensive positions against the image's defense moves and vice-versa. Finally, a scoring method is applied: for each motion and counter-motion determination, for both offensive and defensive motions of the players, a score is created and projected onto the screen.
  • each incoming video frame is compared to the previous frame to detect a magnitude of change. Changes surpassing a fixed threshold value trigger further processing at ( 4 ). This occurrence triggers the start of “event detection,” and represents the recognition of a player's initial motion. Frame to frame changes that do not surpass the threshold are counted, discarded and directed to further processing at ( 3 ).
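A sketch of this event-detection loop, assuming the change magnitude is the summed absolute pixel difference between consecutive frames (the patent does not define the metric):

```python
import numpy as np

def scan_for_event(frames, threshold):
    """Compare each incoming frame with the previous one; a change
    magnitude above the fixed threshold starts event detection.
    Sub-threshold frames are counted and discarded, per the text.

    Returns (index of triggering frame, quiet-frame count), or
    (None, quiet-frame count) if no event occurs."""
    quiet = 0
    for i in range(1, len(frames)):
        change = int(np.abs(frames[i].astype(int) - frames[i - 1].astype(int)).sum())
        if change > threshold:
            return i, quiet      # player's initial motion recognized
        quiet += 1               # counted, then discarded
    return None, quiet
```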
  • frame changes are compared with a prior trajectory; if consistency is found, logic moves to ( 6 ), otherwise to ( 1 ). Changes in the position, speed and acceleration of the player are measured each frame. If motion is consistent frame to frame, as per ( 6 ), this indicates that the motion detected at ( 2 ) continues. Frame-to-frame changes in the orientation of each body part suggest body part rotation.
  • a pattern in the player's motions is sought by the system and characterized as a specific stored pattern. This is accomplished by recognizing a prediction area within a selected variance range. Based on newly received input information, an associative memory generator, e.g., an FPGA (see FIG. 7 ), stores player motion habits as an inventory related to the specific player.
  • an “FPGA” is a Field-Programmable Gate Array, a type of logic chip that can be programmed.
  • An FPGA is similar to a PLD but has an order of magnitude more gates. FPGAs are commonly used for prototyping integrated circuit designs and other applications.
  • the physical attributes of the player which were determined after b frames are fed to the associative memory generator ( FIG. 7 ).
  • the output of the generator is fed to a memory address lookup table which provides a memory address of the various predictions.
  • the use of a generator of this type, which relates physical attributes to a memory address, does not burden the processor since no calculations are necessary.
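A software sketch of the associative address generator and lookup chain: attributes map straight to a table address, the prepared lookup table maps that address to prediction memory, and the processor performs no per-frame computation. The deterministic fold and the table size stand in for the FPGA's fixed mapping and are assumptions.

```python
def associative_address(attributes, table_bits=10):
    """Map a tuple of quantized physical attributes directly to a
    table address, FPGA-style: a fixed mapping, not a calculation."""
    addr = 0
    for a in attributes:
        addr = (addr * 31 + int(a)) % (1 << table_bits)
    return addr

# Prepared-in-advance address lookup table and prediction memory.
lookup_table = {}        # associative address -> prediction-memory address
prediction_memory = {}   # prediction-memory address -> prediction record

def store_prediction(attributes, prediction):
    addr = associative_address(attributes)
    slot = lookup_table.setdefault(addr, len(prediction_memory))
    prediction_memory[slot] = prediction

def recall_prediction(attributes):
    slot = lookup_table.get(associative_address(attributes))
    return prediction_memory.get(slot)
```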
  • an event follower processor for player offense waits for event detection from ( 15 ).
  • the player's offense prediction is read, and a defense absolute address is recovered from the lookup table.
  • This address is generated in conjunction with the associative memory generator. The address stores the physical attributes of the player that are quantities representing a degree of expertise.
  • the output from the associative memory generator is the address used at the lookup table which has been previously prepared.
  • the output of the lookup table is the address used by memory holding prediction data.
  • the next frame is considered and processed, calculating the player's new coordinates and amending prior coordinate information.
  • the trajectory is calculated and the image player's defensive moves are predicted.
  • the player's offense and the image's defense predictions are compared for each frame. If the player's actual offense correlates with prediction, logic moves to ( 18 ) and if not correlated, logic moves to ( 21 ). If significant correlation variance is determined, logic moves to ( 1 ).
  • Coordinates in three-space of the positions of body parts of the image and of the player are calculated and when a collision is determined velocity and acceleration vectors of both the player and image are used to determine scores.
  • the score number for player contact with the image's hand is relatively low, while the score for player's contact with the image's face results in a large score number.
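A sketch of such sensitivity-weighted scoring, with the closing speed taken from the velocity vectors at impact; the particular weight values and the linear speed scaling are illustrative assumptions, not from the patent.

```python
def contact_score(struck_part, closing_speed, sensitivity=None):
    """Score a detected contact: the struck body part's sensitivity
    sets the base score (hand low, face high, per the text), scaled
    by the closing speed at the moment of collision."""
    sensitivity = sensitivity or {"hand": 1, "arm": 2, "torso": 5, "face": 10}
    return sensitivity.get(struck_part, 0) * closing_speed
```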
  • at ( 24 ), the system waits for the player's defense command from the event detector processor.
  • the player's defense prediction and the image's offense address from the address lookup table are read. This address is generated in conjunction with the associative memory generator. The address contains the player's physical attributes which represent a degree of expertise. This output is stored in the lookup table memory which is information used to establish prediction.
  • the player's defense trajectory and the image's offense predictions are compared at each frame. If the player's defensive prediction corresponds to the measured actual motion, logic moves to ( 28 ) and if it does not correspond, logic moves to ( 31 ). If correspondence is poor logic moves to ( 1 ).
  • the end of the image's defense is determined and if an end is not found, logic moves to ( 26 ) and ( 27 ). If an end is determined, logic moves to ( 29 ).
  • the image's planned offense is used to provide a score considering the player's actual defense and the image's planned offense motions.
  • the player's trajectory is compared with the image's planned trajectory and scores are determined in accordance with the outcome.
  • the scores are displayed and the end of the player's defensive motion is logged.
  • the player's information is stored in memory for future reference.
  • an imaginary boundary is set around the projected image.
  • the actual motion of the player is compared with this boundary to determine the relative position of the player's hands and feet with respect to the boundary, and scores are determined by the relative positions and sensitivities of the parts of the player's or image's body.
  • the actual speed, accuracy, acceleration and positioning of the player is stored and used to improve the prediction model of the player.

Abstract

A method for engaging a player or a pair of players in a motion related game including the steps of attaching plural colored elements onto selected portions of the players' garments and processing a video stream of each of the players to separately identify the positions, velocities and accelerations of the several colored elements. The method further comprises generation of a combatant competitor image and moving the image in a manner to overcome the player. In a further approach, two players are recorded and their video images are each presented on a screen in front of the other player. The same colored elements are used to enable computer calculations of the fighting proficiency of the players.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This is a continuation-in-part application of U.S. patent application Ser. No. 11/189,176, filed Jul. 25, 2005, which is incorporated herein by reference.
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • Not applicable.
  • THE NAMES OF THE PARTIES TO A JOINT RESEARCH AGREEMENT
  • Not applicable.
  • INCORPORATION-BY-REFERENCE OF MATERIAL SUBMITTED ON A COMPACT DISC
  • Not applicable.
  • REFERENCE TO A “MICROFICHE APPENDIX”
  • Not applicable.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Present Disclosure
  • This disclosure relates generally to games of interactive play between two or more entities including individuals and computer simulated opponents, i.e., the invention may be used by two individuals, an individual and a simulation, and even between two simulations, as for demonstration purposes, and more particularly to a computer controlled interactive movement and contact simulation game in which a player mutually interacts with a computer generated image that responds to the player's movement in real-time.
  • 2. Description of Related Art Including Information Disclosed Under 37 CFR 1.97 and 1.98
  • Invention and use of computer generated, interactive apparatus are known to the public, in that such apparatus are currently employed for a wide variety of uses, including interactive games, exercise equipment, and astronaut training. Ahdoot, U.S. Pat. No. 5,913,727 discloses an interactive contact and simulation game apparatus in which a player and a three dimensional computer generated image interact in simulated physical contact. Alternately two players may interact through the apparatus of the invention. The game apparatus includes a computerized control means generating a simulated image or images of the players, and displaying the images on a large display. A plurality of position sensing and impact generating means are secured to various locations on each of the player's bodies. The position sensing means relay information to the control means indicating the exact position of the player. This is accomplished by the display means generating a moving light signal, invisible to the player, but detected by the position sensing means and relayed to the control means. The control means then responds in real time to the player's position and movements by moving the image in a combat strategy. When simulated contact between the image and the player is determined by the control means, the impact generating means positioned at the point of contact is activated to apply pressure to the player, thus simulating contact. With two players, each player sees his opponent as a simulated image on his display device. Lewis et al. U.S. Pat. No. 5,177,872 discloses a novel device for determining the position of a person or object. The device is responsive to head or hand movements in order to move a dampened substance contained within a confined tube past one or more sensors. Light passing through the tube is interrupted by the movement of the dampened substance. The intended use of the device, as disclosed, is changing the perspective shown on a video display. Goo U.S. Pat. 
No. 4,817,950 teaches a video game controller for surfboarding simulation, and of particular interest is the use of a unique attitude sensing device to determine the exact position of the surfboard. The attitude sensing device employs a plurality of switch closures to determine the tilt angle of the platform and open and close a plurality of electrical contacts enabling a signal input to a computer control unit. Good et al. U.S. Pat. No. 5,185,561 teaches the principles of tactile feedback through the use of a torque motor. As disclosed, the device consists of a hand held, one dimensional torque feedback device used to manipulate computer generated visual information and associated torque forces. Kosugi et al. U.S. Pat. No. 5,229,756 disclose a combination of components forming an interactive image control apparatus. The main components of the device are a movement detector for detecting movement, a judging device for determining the state of the operator on the basis of the movement signal provided by the movement detector, and a controller that controls the image in accordance with the movement signal and the judgment of the judging device. The movement detector, judging device and the controller cooperate so as to control the image in accordance with the movement of the operator. Kosugi requires that a detection means be attached adjacent to the operator's elbow and knee joints so as to measure the bending angle of the extremity and thus more accurately respond to the operator's movements.
  • The present invention employs a system in which the position of the player is continually monitored. Between the simple types of combat games typically found in game arcades, wherein the player's input is via a simple control joystick and punch-buttons, and the very sophisticated and complex artificial-reality types of game wherein a headgear provides a full sensory input structure and a highly instrumented and wired glove allows manual contact on a limited basis with the simulation, there is a need for a fully interactive game. The present invention takes the approach of simulating a combat adversary image while allowing the player to exercise every part of his body in combat with the image. This is the final and most important objective. The present invention fulfills these needs and provides further related advantages as described in the following summary.
  • Our prior art search with abstracts described above teaches interactive game technology, technique and know-how. However, the prior art fails to teach the instant technique featuring simulated “stand-up” combat between two individuals or between an individual and a computer simulation. The present invention fulfills these needs and provides further related advantages as described in the following summary.
  • BRIEF SUMMARY OF THE INVENTION
  • A best mode embodiment of the present invention provides a method for engaging a player or a pair of players in a motion related game including the steps of attaching plural colored elements onto selected portions of the player(s); processing a video stream from a digital camera to separately identify the positions, velocities and accelerations of the several colored elements in time; providing a data stream from the video recorder to a data processor; calculating the distance between the player and the camera as a function of time; and predicting the motions of the players and providing anticipatory motions of a virtual image in response thereto.
  • A primary objective of the present invention is to provide an apparatus and method of use of such apparatus that yields advantages not taught by the prior art.
  • Another objective of the invention is to provide a game for simulated combat between two individuals.
  • A further objective of the invention is to provide a game for simulated combat between an individual and a simulated second player of the game.
  • A further objective of the invention is to provide a game for simulated combat between an individual carrying a sport instrument in hand and simulated offense and defense players of the game.
  • A still further objective of the invention is to enable the virtual image to anticipate and predict the movement of the real player and to change the virtual image accordingly.
  • Other features and advantages of the embodiments of the present invention will become apparent from the following more detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the principles of at least one of the possible embodiments of the invention.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
  • The accompanying drawings illustrate at least one of the best mode embodiments of the present invention. In such drawings:
  • FIG. 1 is a perspective view of the present invention as seen from behind a projection screen transparent to a camera mounted there behind so as to record the motions of a first player moving in front of the screen, the screen being translucent to the first player;
  • FIG. 2 is a perspective view thereof from the front of the screen showing the first player at left being recorded from the camera mounted behind the screen wherein the player in front of the screen is able to view an image of a second player projected onto the screen from a projector behind the screen;
  • FIG. 3 is a perspective view thereof showing the first and the second players in separate locations with video images of each projected onto a screen at the other player's location;
  • FIG. 4 is a logic diagram of the method of the invention showing event detection and prediction processing steps;
  • FIG. 5 is a continuation of the logic diagram of FIG. 4 showing player offense event processing steps;
  • FIG. 6 is a continuation of the logic diagram of FIG. 4 showing player defense event processing steps; and
  • FIG. 7 is a flow chart showing an associative address generator of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The above described drawing figures illustrate the described apparatus and its method of use in at least one of its preferred, best mode embodiments, which is further defined in detail in the following description. Those having ordinary skill in the art may be able to make alterations and modifications to what is described herein without departing from its spirit and scope. Therefore, it should be understood that what is illustrated is set forth only for the purposes of example and that it should not be taken as a limitation on the scope of the present apparatus and method of use.
  • In the present apparatus and method, one or two players take part in a game involving physical movements. Such games may comprise simulated combat, games of chance, competition, cooperative engagement, and similar subjects. However, the present invention is ideal for use in contact games of hand-to-hand combat such as karate, aikido, kick-boxing and American style boxing where the players have contact but are not physically intertwined as they are in wrestling, Judo and similar sports. In this disclosure a combat game is described, but such is not meant to limit the range of possible uses of the present invention. In one embodiment of the instant combat game, as shown in FIG. 2, a first player 5 engages in simulated combat with a second player's image 7′ projected by video projector 40 onto a screen 20 placed in front of the player 5. In this embodiment, the image 7′ is computer generated using the same technology as found in game arcades and the like and which is well known in the art. In an alternate embodiment shown in FIG. 3, two live players 5 and 7 stand in front of two separate screens 20 and 22 and engage in mutual simulated combat against recorded and projected images 5′ and 7′ of players 5 and 7 respectively. This avoids physical face-to-face combat where one of the players might receive injury. In this second approach, the images projected onto the screens 20 and 22 are not computer generated but are real-time projections of video recordings taken as shown in FIG. 1 using cameras 10.
  • In the first approach, shown in FIG. 1, player 5 is positioned in front of rear projection screen 20. One or more video cameras 10, are positioned behind screen 20. The camera 10 is able to view player 5 through the screen 20 which is transparent from the position of camera 10, and record the player's movements. Alternatively, a miniature videcon CCTV camera (not shown) may be mounted on the front of screen 20, or may be operated through a small hole in the screen 20. The screen 20 may be supported by a screen stand (not shown) or it may be mounted on a wall 25 as shown in the figures.
  • Simulated image 7′ is visible to the player 5 as shown in FIG. 2. In an approach where the camera 10 is located behind the screen 20, and the image 7′ is visible on screen 20, in order for the camera 10 not to record the projected image 7′, both the camera 10 and the projector 40 are operated at identical rates (frames per second), but each records a frame and blanks for an equal time, interlacing the two functions in time so that one is operating while the other is blanking and vice-versa. The net result is that player 5, positioned at the front of the screen 20, sees the projected image 7′ while the camera 10 sees player 5 and not the projected image 7′.
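The camera/projector interlacing described above amounts to a simple alternating time-slot scheme. A minimal sketch follows; the 60 frames-per-second rate and the function name are illustrative assumptions, not part of the disclosure.

```python
# Sketch of the camera/projector interlacing: both devices run at the same
# frame rate but on alternating half-periods, so the camera never captures
# the projected image 7'. The 60 fps rate is an assumed, illustrative value.

FPS = 60
HALF = 1.0 / (2 * FPS)   # duration of each active/blanking half-period

def active_device(t):
    """Return which device is in its active half-period at time t (seconds)."""
    slot = int(t / HALF)                 # index of the current half-period
    return "projector" if slot % 2 == 0 else "camera"
```

At any instant exactly one device is active while the other blanks, which is the net result the passage describes: the player sees the image, and the camera sees only the player.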
  • Preferably, projection screen 20 is transparent to camera 10 mounted behind it so as to enable recording the motions of first player 5 moving in front of screen 20. Preferably also, screen 20 is translucent to first player 5 so that he sees only the projected image 7′ and not the camera 10 or projector 40.
  • In both of the above described embodiments, players 5 and 7 each wear colored bands as best seen in FIG. 2. Preferably, player 5 has a band 51 secured at his forehead, above each elbow 52, on each wrist 53, around the waist 54, above each knee 55 and on each ankle 56. Each of these 10 bands is a different color. Further bands may be placed in additional locations on the players, but the 10 bands shown in FIG. 2, as described, are able to achieve the objectives of the instant innovation as will be shown. In the instant method, the image 5′ of the player 5, as recorded by camera 10, is converted into a digital electronic primary signal. This primary signal is split into 10 derivative secondary signals by color filtering the primary signal for each of the ten colors. Each of the secondary signals contains three pieces of information with respect to each frame of the video recording: a location “x” (left to right), a location “y” (top to bottom) in the camera's field of view, and finally, a number of pixels “p” subtended by the color in the field of view. It is noted that each secondary signal is a representation of only the color band to which it has been filtered, and all other aspects of the recorded image are invisible, i.e., not present in that secondary signal. To summarize, then, each frame of the recorded image yields 30 pieces of information, i.e., for each of the ten bands, an x, y and p value. The x and y information locates the band in the plane of the field of view of the camera, while the p information approximates the location in the “z” direction, i.e., the distance from the camera lens to the band. The z coordinate is approximated by taking the value for each band at time zero to be the nominal value of the distance z; when the numerical value of p drops in a subsequent frame of the recording the distance z is increased, and when the numerical value of p increases, the value of z lessens. By rigorous calibration prior to the use of the present invention, a reasonable qualitative approximation of the motion of the bands in the z direction is made from the p count. Computer 60 processes the locations of all ten bands for each frame of the recording in real time, i.e., there is no appreciable lag between the computer's numeric calculation of the locations of the bands and the actual locations thereof.
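The per-band reduction to x, y and p, and the pixel-count depth approximation, can be sketched as follows. The square-root relation between p and z is an assumption for illustration (apparent area falls off roughly as the square of distance); the disclosure itself specifies only that z grows as p drops, subject to calibration.

```python
import numpy as np

def band_xyp(mask):
    """Reduce one color-filtered secondary signal (a boolean mask) to (x, y, p):
    the band's centroid in the camera plane and the pixel count it subtends."""
    ys, xs = np.nonzero(mask)
    p = int(xs.size)                 # pixels subtended by the color
    if p == 0:
        return None                  # band hidden this frame
    return float(xs.mean()), float(ys.mean()), p

def estimate_z(p, p0, z0):
    """Approximate depth from pixel count, calibrated so p0 pixels <-> z0.
    Fewer pixels -> band is farther away; more pixels -> nearer (assumed
    inverse-square area model)."""
    return z0 * (p0 / p) ** 0.5

# A 2x2 blob of one band's color in a small 8x8 frame:
mask = np.zeros((8, 8), dtype=bool)
mask[2:4, 3:5] = True
x, y, p = band_xyp(mask)             # centroid (3.5, 2.5), pixel count 4
```

A band whose pixel count drops to a quarter of its calibrated value is estimated at twice the calibrated distance under this model, matching the qualitative rule in the text.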
  • EXAMPLE 1
  • The player 5 stands facing the screen 20 with feet a comfortable distance apart, legs straight, and arms hanging at the player's sides. Each of the ten colored bands 51-56 is visible to the camera 10, and with a simple set of anatomical rules the computer 60 is able to compose a model of the player's form that accurately represents the player's physical position and anatomical orientation at that moment, including approximations of arm and leg length, height, and so on. When a band moves, its image on the recording plane of camera 10 moves accordingly, so that the computer 60 is able to plot the motion trajectory of the band in three-space using coordinates x, y and p. When a band disappears, i.e., is hidden behind another part of the player's anatomy, as is the case in FIG. 2 where band 52 on the player's right arm is hidden by his body, the trajectory of the band is approximated taking into account its locus of locations in preceding frames.
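Approximating a hidden band's trajectory from its locus in preceding frames might look like the following constant-velocity extrapolation. The two-point linear model is an assumption; a real implementation could fit a longer history of fixes.

```python
def extrapolate(history):
    """Predict a hidden band's next (x, y, z) fix from its last two known
    fixes, assuming constant velocity between frames."""
    (x1, y1, z1), (x2, y2, z2) = history[-2], history[-1]
    return (2 * x2 - x1, 2 * y2 - y1, 2 * z2 - z1)

# Band 52 was last seen moving right and slightly away from the camera:
track = [(10.0, 5.0, 1.0), (12.0, 6.0, 1.1)]
guess = extrapolate(track)   # used in place of a measurement while hidden
```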
  • In the case of a single player 5 with a computer generated virtual opponent the opponent's image 7′ is generated and projected onto the screen 20. As player 5 moves to attack or defend against the image 7′, trajectories of the player's bands 51-56 enable the computer to model the player's motion. The computer is programmed to move the image 7′ to attack and defend accordingly. Preferably, the image 7′ is projected with three dimensional realism by any one of the well known techniques for accomplishing this as reported in the art. One technique for accomplishing this is the projection of two orthogonally polarized and slightly separated identical projected images which appear fuzzy to the unaided eye on screen 20. However, when player 5 wears glasses with polarized lenses also orthogonally polarized, the image 7′ appears in three-dimensional realism. Calibration of the image 7′ enables a virtual plane of contact between the player 5 and the image 7′ where this plane of contact is in front of the screen 20. Please see the virtual three-dimensional image shown in FIG. 2 where player 5 is blocking a kick from the image 7′ of player 7.
  • EXAMPLE 2
  • As shown in FIG. 3, players 5 and 7 stand facing their respective screens 20 and 22, each with feet a comfortable distance apart, legs slightly bent, and arms hanging at their sides. Each of the ten colored bands 51-56 on each of the players 5 and 7 is visible to their respective cameras 10, so that the computers 60 are able to compose mathematical models of the positions of each of the players 5 and 7 that accurately represent each player's physical position and anatomical orientation at that moment relative to the other player. The vertical planes represented by the screens 20 and 22 represent the same plane in the combat three-space of the game. Therefore, when one player moves a fist, elbow, knee or foot toward his screen, the computers 60 calculate that motion as projecting toward the other player. In this manner the computers 60 calculate contacts between players 5 and 7 in offensive and defensive moves when their respective body parts occupy the same space coordinates. As in actual combat, the players initially and nominally stand slightly more than an arm's length away from their screens, i.e., mathematically, from their opponents. Points are awarded to each of the players for successful offensive and defensive moves. As discussed above, the images are preferably projected with three-dimensional realism by use of the well known polarized dual-image technique, so that each player sees the illusion of the opponent player's image projecting toward him from the screen 20 or 22.
  • The present disclosure teaches an improved video frame processing method that enables the combative motions between two distant players 5 and 7, as described above and shown in FIGS. 1-3, to be calculated and compared with respect to each other. This method is described as follows and is shown in FIGS. 4-7. Once the game is initiated, a stream of information from the video recorder frames is processed. Frame by frame, each of the 30 coordinate data elements x, y, and p is recorded, with z being calculated, so that for each frame the position of all parts of the players is known, and using a simple physical model of the human body, a mathematical model of each of the players' positions in three-space is determined. The changes in the locations of the players' body parts from frame to frame enable the calculation of velocity and acceleration of these parts by taking the first and second differences of the change in position. Furthermore, at each frame, a prediction of the positions, velocities and accelerations of each of the body parts is made. These predictions are made using data from multiple frames. These calculations continue until the number of frames is at least equal to a specified set point. Depending on whether the motion in any of the body parts is defensive, i.e., responsive to the opponent's movement, or offensive, i.e., independent of the opponent's movement, the computer generated image is modified so as to defend against an offensive move by player 5 or to initiate a new offensive move from an inventory of such moves.
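The first- and second-difference kinematics described above can be sketched for a single coordinate of one body part; the 30 frames-per-second rate is an illustrative assumption.

```python
DT = 1.0 / 30   # seconds per frame (assumed rate)

def kinematics(x_prev2, x_prev, x_now):
    """Velocity and acceleration as first and second differences of position
    across three consecutive frames, plus a one-frame-ahead prediction."""
    v = (x_now - x_prev) / DT
    a = (x_now - 2 * x_prev + x_prev2) / DT ** 2
    predicted = x_now + v * DT + 0.5 * a * DT ** 2
    return v, a, predicted

# Uniform motion: 0.1 units per frame, so acceleration is ~0 and the
# prediction simply continues the trend.
v, a, nxt = kinematics(1.0, 1.1, 1.2)
```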
  • With respect to two real opponents, the logical steps of the present method are shown in FIGS. 4 through 6 and comprise the determination of incoming offense information, calculation of the player's new coordinates, determination of whether the defense or offense is complete, and calculation of the player's offensive positions as compared to the image's defense moves and vice-versa. Finally, a scoring method is used, and for each of the motion and counter-motion determinations for both offensive and defensive motions of the players, a score is created and projected onto the screen.
  • Referring now to the numerical reference numbers in the logic flow chart shown in FIGS. 4-7, we find at (1) that the game is initiated, whereby all game counters and variables, such as player weight, skill level and expertise, are entered by the players. Counters are initialized. At this time camera auto-focus, zoom and player-position tracking are operating, and data is being taken and stored in memory. At (2) each incoming video frame is compared to the previous frame to detect a magnitude of change. Changes surpassing a fixed threshold value trigger further processing at (4). This occurrence triggers the start of “event detection,” and represents the recognition of a player's initial motion. Frame-to-frame changes that do not surpass the threshold are counted, discarded and directed to further processing at (3).
  • At (3) and (8), counts “a” of frames that do not surpass threshold are compared with a set constant “c.” If a>c, then an offensive action is taken against the player (11). Otherwise the system waits for action to occur.
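Steps (2), (3) and (8) amount to a thresholded frame differencer with a quiet-frame counter. A minimal sketch, with the threshold and the constant c chosen only for illustration:

```python
THRESHOLD = 5.0   # frame-to-frame change magnitude that counts as motion (assumed)
C = 3             # quiet-frame constant "c" (assumed)

def process_frame(change, quiet_frames):
    """One pass through steps (2)/(3)/(8); returns (action, new quiet count)."""
    if change > THRESHOLD:
        return "event_detected", 0       # player motion -> processing at (4)
    quiet_frames += 1                    # counted and discarded at (3)
    if quiet_frames > C:                 # a > c: image takes the offense (11)
        return "image_offense", 0
    return "wait", quiet_frames
```

Driving this once per incoming frame reproduces the flow: sustained stillness eventually provokes an offensive action from the image, while any sufficiently large change starts event detection.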
  • At (4) and (5) frame changes are compared with a prior trajectory, and if consistency is found logic moves to (6), otherwise to (1). Changes in position, speed and acceleration of the player are measured each frame. If motion is consistent frame to frame, as per (6), this indicates that the motion detected at (2) continues. Frame-to-frame changes in the orientation of each body part suggest body-part rotation.
  • At (6) calculated changes are amended to previous trajectory information. At (7) motion is checked to determine if motion has been continuous for “b” frames and if so, logic moves to (9), otherwise back to (4) and (5). During b frames motion is determined to be offensive or defensive.
  • At (9), during initial time periods and between event detection periods, a pattern in the player's motions is sought by the system and characterized as a specific stored pattern. This is accomplished by recognizing a prediction area within a selected variance range. Based on newly received input information, an associative memory generator, e.g., an FPGA (see FIG. 7), stores player motion habits as an inventory related to the specific player. It is noted here that an “FPGA” is a Field-Programmable Gate Array, a type of logic chip that can be programmed. An FPGA is similar to a PLD but has an order of magnitude more gates. FPGAs are commonly used for prototyping integrated circuit designs and other applications.
  • At (17) when an end of an event is characterized by the completion of b frames, if the motion is determined to be offensive logic moves to (10), and if the motion is determined to be defensive logic moves to (11), if neither, the next step is taken. If the motion is a combination of offense and defense then a calculation of likelihood of hit success is established by comparing player's and image's motions. If the player's offense is stronger logic moves to (10), otherwise (11).
  • At (10) and (11) the physical attributes of the player which were determined after b frames are fed to the associative memory generator (FIG. 7). The output of the generator is fed to a memory address lookup table which provides a memory address of the various predictions. The use of a generator of this type, which relates physical attributes to a memory address, does not burden the processor since no calculations are necessary.
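The appeal of the associative memory generator is that the player's quantized attributes are packed directly into an address, so retrieving a prediction costs no arithmetic at run time. A sketch under assumed 4-bit fields and invented table contents:

```python
def attribute_address(expertise, speed_class, move_kind):
    """Pack quantized player attributes into a single table address
    (three assumed 4-bit fields; no per-lookup calculation needed)."""
    return (expertise & 0xF) << 8 | (speed_class & 0xF) << 4 | (move_kind & 0xF)

# Previously prepared lookup: address -> prediction record (contents invented).
prediction_table = {
    attribute_address(3, 2, 1): "expect high kick; image guards left",
    attribute_address(1, 1, 0): "expect slow jab; image counters right",
}

plan = prediction_table[attribute_address(3, 2, 1)]
```

The same packing could be realized directly in FPGA logic, which is consistent with the disclosure's choice of an FPGA for this role.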
  • In FIG. 6 at (14), an event follower processor for player offense waits for event detection from (15). At (15) the player's offense prediction is read, along with recovering a defense absolute address from the lookup table. This address is generated in conjunction with the associative memory generator. The address stores the physical attributes of the player that are quantities representing a degree of expertise. The output from the associative memory generator is the address used at the lookup table, which has been previously prepared. The output of the lookup table is the address used by the memory holding prediction data.
  • At (16) the next frame is considered and processed, calculating the player's new coordinates and amending prior coordinate information. The trajectory is calculated and the image's defensive moves are predicted.
  • At (17) the player's offense and the image's defense predictions are compared for each frame. If the player's actual offense correlates with prediction, logic moves to (18) and if not correlated, logic moves to (21). If significant correlation variance is determined, logic moves to (1).
  • At (18), if an end of image's defense is not determined, logic moves to (16) and (17). If an end is determined, logic moves to (19). At (19) the player's trajectory is compared with the predicted and planned image's trajectory and determines a score. At (20) scores are displayed and the event detection processor is informed of an end of the player's offensive motion. At (21) player information is stored in memory and received at (1) as needed.
  • Coordinates in three-space of the positions of body parts of the image and of the player are calculated, and when a collision is determined, velocity and acceleration vectors of both the player and the image are used to determine scores. As an example, the score number for player contact with the image's hand (the image parrying a player's thrust) is relatively low, while the player's contact with the image's face results in a large score number.
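That scoring rule can be sketched as a sensitivity-weighted function of the closing speed at the moment of collision. The sensitivity values and the linear speed weighting are illustrative assumptions, not taken from the disclosure.

```python
# Assumed per-body-part sensitivities: a parried hand scores low, a face hit high.
SENSITIVITY = {"hand": 1.0, "torso": 5.0, "face": 10.0}

def collision_score(part_hit, closing_speed):
    """Score a confirmed contact from the struck part and the closing speed
    derived from the two bodies' velocity vectors."""
    return SENSITIVITY[part_hit] * closing_speed

low = collision_score("hand", 2.0)    # image parries the player's thrust
high = collision_score("face", 2.0)   # clean hit on a sensitive target
```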
  • In FIG. 7, (24) waits for the player's defense command from the event detector processor. At (25) the player's defense prediction and the image's offense address from the address lookup table are read. This address is generated in conjunction with the associative memory generator. The address contains the player's physical attributes which represent a degree of expertise. This output is stored in the lookup table memory which is information used to establish prediction.
  • At (26) frames are processed in sequence and the player's new coordinates are calculated and updated. The trajectory of the player's moves is calculated and the image's defense is predicted.
  • At (27) the player's defense trajectory and the image's offense predictions are compared at each frame. If the player's defensive prediction corresponds to the measured actual motion, logic moves to (28) and if it does not correspond, logic moves to (31). If correspondence is poor logic moves to (1).
  • At (28) the end of the image's defense is determined and if an end is not found, logic moves to (26) and (27). If an end is determined, logic moves to (29). The image's planned offense is used to provide a score considering the player's actual defense and the image's planned offense motions.
  • At (29) the player's trajectory is compared with the image's planned trajectory and scores are determined in accordance with the outcome. At (30) the scores are displayed and the end of the player's defensive motion is logged. At (31) the player's information is stored in memory for future reference.
  • Preferably, an imaginary boundary is set around the projected image. The actual motion of the player is compared with this boundary to determine the relative position of the player's hands and feet with respect to the boundary, and scores are determined by the relative positions and sensitivities of the parts of the player's or image's body. As play proceeds, the actual speed, accuracy, acceleration and positioning of the player (history information) are stored and used to improve the prediction model of the player.
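The imaginary boundary test reduces to asking whether a hand or foot coordinate lies inside a region enclosing the projected image. A sketch using an axis-aligned box with invented extents:

```python
# Assumed box around the projected image, in the game's three-space units.
BOUNDS = ((-0.5, 0.5),   # x: left/right extent of the image
          (0.0, 2.0),    # y: floor to top of the image
          (0.0, 0.3))    # z: virtual contact depth in front of the screen

def inside_boundary(point):
    """True if an (x, y, z) hand or foot position lies within the boundary,
    i.e., close enough to the image for a contact to be scored."""
    return all(lo <= c <= hi for c, (lo, hi) in zip(point, BOUNDS))
```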
  • The enablements described in detail above are considered novel over the prior art of record and are considered critical to the operation of at least one aspect of one best mode embodiment of the instant invention and to the achievement of the above described objectives. The words used in this specification to describe the instant embodiments are to be understood not only in the sense of their commonly defined meanings, but to include by special definition in this specification: structure, material or acts beyond the scope of the commonly defined meanings. Thus if an element can be understood in the context of this specification as including more than one meaning, then its use must be understood as being generic to all possible meanings supported by the specification and by the word or words describing the element.
  • The definitions of the words or elements of the embodiments of the herein described invention and its related embodiments not described are, therefore, defined in this specification to include not only the combination of elements which are literally set forth, but all equivalent structure, material or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements in the invention and its various embodiments or that a single element may be substituted for two or more elements in a claim.
  • Changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalents within the scope of the invention and its various embodiments. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements. The invention and its various embodiments are thus to be understood to include what is specifically illustrated and described above, what is conceptually equivalent, what can be obviously substituted, and also what essentially incorporates the essential idea of the invention.
  • The scope of this description is to be interpreted only in conjunction with the appended claims and it is made clear, here, that each named inventor believes that the claimed subject matter is what is intended to be patented.
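As an illustrative sketch of the imaginary-boundary scoring described at the start of this section (not part of the claims), a hit can be scored according to the sensitivity of the body region the player's hand or foot enters. The box regions and sensitivity weights below are hypothetical:

```python
# Hypothetical sensitivity weights per body region of the projected image.
SENSITIVITY = {"head": 3, "torso": 2, "arm": 1}

def score_contact(point, regions, sensitivity=SENSITIVITY):
    """regions: {name: ((xmin, ymin, zmin), (xmax, ymax, zmax))} boxes set
    around the projected image. Return the score for the most sensitive
    region containing the 3-D point, or 0 if the point is outside all boxes."""
    best = 0
    for name, (lo, hi) in regions.items():
        if all(l <= p <= h for p, l, h in zip(point, lo, hi)):
            best = max(best, sensitivity.get(name, 0))
    return best
```

A contact that falls inside two overlapping regions is scored at the higher of the two sensitivities, which matches the idea that scores follow the relative positions and sensitivities of body parts.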

Claims (10)

1. A method of playing a motion related hand-to-hand combat type game, between a real player and a virtual player; the method comprising the steps of:
a) identifying portions of the real player with distinct colored elements;
b) positioning the real player in front of a video screen upon which a virtual player image is projected;
c) recording video frames of a real player image and filtering the real player image into separate signals according to the colored elements;
d) reviewing each of the frames in sequence so as to determine time change of positions in 3-space of the portions, and calculating velocity, acceleration and trajectory of each of the portions;
e) moving the virtual player on the video screen, said movement corresponding in defense and offense to the movement of the real player;
f) identifying portions of the virtual player corresponding to the portions of the real player and establishing position, velocity, acceleration and trajectory of the portions of the virtual player to determine when real player and virtual player portions intersect in three-space;
g) assigning scores to the players in accordance with a set of scoring rules; and
h) repeating steps (c) through (g) until one of the players has achieved a set number of points and is therefore declared a winner.
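Steps (c) and (d) of claim 1 can be sketched (illustratively, not as the claimed implementation) as finite differencing over per-frame 3-D positions of a tracked colored element; the frame interval and the intersection radius are assumptions:

```python
FRAME_DT = 1.0 / 30.0  # assumed camera frame interval in seconds

def finite_differences(positions, dt=FRAME_DT):
    """Given a list of (x, y, z) positions, one per frame, return per-frame
    velocity and acceleration vectors by finite differencing."""
    velocities = []
    for p0, p1 in zip(positions, positions[1:]):
        velocities.append(tuple((b - a) / dt for a, b in zip(p0, p1)))
    accelerations = []
    for v0, v1 in zip(velocities, velocities[1:]):
        accelerations.append(tuple((b - a) / dt for a, b in zip(v0, v1)))
    return velocities, accelerations

def intersects(p_real, p_virtual, radius=0.1):
    """Step (f): two tracked portions 'intersect' in three-space when they
    come closer than an assumed contact radius (in meters)."""
    dist = sum((a - b) ** 2 for a, b in zip(p_real, p_virtual)) ** 0.5
    return dist < radius
```

The trajectory of a portion is then simply the sequence of its positions, with the velocity and acceleration estimates attached per frame.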
2. The method of claim 1, utilizing a digital video camera interfaced to a distributed processor to capture real-time 3-D motions of a player, further comprising the steps of:
a) calibrating the system by initially placing the player(s) at a fixed distance from the camera so that the colored elements and bodily signatures of the player are calibrated for real-time 3-D motion detection;
b) continually receiving the camera's real-time electro-optical, auto-focus, and zooming control information along with video camera signals measuring the 3-dimensional positions of the player(s) in motion;
c) while in motion, calculating the depth (z) from the ratio of the total pixel count of the colored elements worn by the player(s) to the total video pixels of the colored elements measured during initial calibration;
d) utilizing a camera that can be commanded to perform auto focus or computer-controlled focus;
e) adjusting the pixel count information of the colored elements and the player(s)' bodily signature based upon the camera's received auto-focus or computer-controlled focus;
f) measuring the trajectory of motion, speed, and acceleration of the player's body parts from the differential changes between the most recent frame and the previous frame, and filtering the images to provide a sharp image and eliminate background noise;
g) measuring differential changes from frame to frame by following the periphery of each colored element and measuring pixel changes;
h) utilizing a computer-controlled camera that is commanded to focus, and stay focused, on a specific moving colored element;
i) utilizing a computer-controlled camera whose zooming is computer controlled;
j) placing the digital camera on a computer-controlled gimbal to follow the player's motions, wherein the pixel count derived from step (c) is further adjusted based upon the 2-D gimbal motions;
k) utilizing the digital camera with infrared sensors to monitor the body temperature of the player.
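The depth estimate of claim 2, step (c) can be sketched as follows. This is an illustration, not the claimed implementation: under a pinhole-camera assumption the apparent area (pixel count) of a colored element scales with 1/z², so depth follows from the square root of the calibration ratio. The calibration distance and counts in the test are hypothetical.

```python
import math

def estimate_depth(pixel_count_now, pixel_count_calibration, calibration_distance):
    """Return the estimated depth z of a colored element from its current
    pixel count and the count recorded at a known calibration distance,
    assuming apparent area falls off as 1/z^2."""
    if pixel_count_now <= 0:
        raise ValueError("element not visible in frame")
    return calibration_distance * math.sqrt(pixel_count_calibration / pixel_count_now)
```

If the element's pixel count drops to a quarter of its calibration value, the element is estimated to be twice the calibration distance away.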
3. The method of claim 1 wherein the computer's further actions are synchronized to the start of a player's motions, or to verbal commands, on a frame-by-frame basis, further comprising the steps of:
a) comparing each incoming frame to the previous frame to detect the magnitude of change; changes in the incoming frames surpassing a threshold lead to further processing, while changes not surpassing the threshold are counted and discarded;
b) continuous incoming frames not surpassing the threshold for a certain period of time ("c" number of frames) are counted, discarded, and lead to an offense motion by the computer-generated image;
c) the voice-activated command or other commands are analyzed and lead to different processing stages, depending upon the nature of the commands.
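The quiet-frame counter of claim 3 can be sketched as below (an illustration only; the threshold, the value of "c", and the scalar change metric are assumptions):

```python
THRESHOLD = 10.0      # assumed frame-change threshold
C_QUIET_FRAMES = 30   # assumed "c" number of quiet frames

def process_frames(change_magnitudes, threshold=THRESHOLD, c=C_QUIET_FRAMES):
    """Return a list of actions: 'process' for each frame whose change
    magnitude passes the threshold, and 'image_offense' whenever c
    consecutive quiet frames accumulate (the computer-generated image
    then initiates an offense motion)."""
    actions = []
    quiet = 0
    for magnitude in change_magnitudes:
        if magnitude > threshold:
            quiet = 0
            actions.append("process")
        else:
            quiet += 1
            if quiet == c:
                actions.append("image_offense")
                quiet = 0
    return actions
```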
4. The method of claim 1 wherein an event detection and prediction distributed digital image processor continually monitors the movement of the player to detect motions that are consistent within a certain time period ("b" number of frames), an event being defined as an offense or a defense motion by the player; and wherein the event detector's algorithm comprises the steps of:
a) comparing each consecutive frame that has passed the threshold to the previous frame to detect the magnitude of change; the changes are added to the previous trajectory of the player's motions;
b) if the received frame count is less than "b" number of frames, repeating the previous step; otherwise going to the next step;
c) at the end of "b" number of frames, determining whether the player's motions indicate an offense aimed at the image's sensitive parts; if yes, going to the player's offensive play (step f), otherwise continuing;
d) at the end of "b" number of frames, determining whether the player's motions indicate a defense, protecting against and dodging the image's offensive moves; if yes, going to the player's defense (step g), otherwise continuing;
e) at the end of "b" number of frames, determining whether the player's motions indicate a combination of offense and defense against the image's body parts; if yes, calculating the likelihood of hit success by comparing the player's and the image's motions; if the player's offense is stronger, going to step (f); if weaker, going to step (g);
f) predicting a player's offense course of motion that is the continuation of the motions detected in "b"; planning a defensive course of motion for the image in conjunction with the player prediction; calculating the player's and the image's final trajectory and coordinates at the end of the predicted or planned period and sending them to the event follower processor (step h); then going to step (a);
g) predicting the player's defense course of motion that is the continuation of the motions detected in "b"; planning an offense course of motion for the image in conjunction with the player prediction; calculating the player's and the image's final trajectory and coordinates at the end of the planned period and sending the results to the event follower processor (step l); then going to step (a);
h) determining whether a new player's offense command has been received from the event detection and prediction processor; if not, going to the next step; otherwise continuing to display the planned image and repeating this step;
i) continually displaying the planned defense or offense motions of the image; getting the next frame, processing the frame, calculating the player's new coordinates, and adding them to the previous coordinates;
j) if at the end of the player's prediction period or the image's planned defense, going to the next step; if not, going to the previous step;
k) comparing the player's offense to the image's defense; calculating and showing scores; going to step (h);
l) determining whether a new player's defense command has been received from the event detection processor; if not, going to the next step; otherwise continuing to display the planned image and repeating this step;
m) continually displaying the planned defense or offense motions of the image; getting the next frame, processing the frame, calculating the player's new coordinates, and adding them to the previous coordinates;
n) if at the end of the player's prediction period or the image's planned offense period, going to the next step; otherwise going to the previous step;
o) comparing the image's offense to the player's defense; calculating and showing scores; going to step (l).
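The classification decision of claim 4, steps (c) through (e) can be sketched as a single branch function. This is a simplification for illustration: the two boolean inputs stand in for the motion analysis over "b" frames, and the strength comparison for the likelihood-of-hit-success calculation.

```python
def classify_event(aimed_at_sensitive_parts, dodging_image_offense,
                   player_strength=0.0, image_strength=0.0):
    """Decide which prediction/plan branch to take at the end of b frames."""
    if aimed_at_sensitive_parts and dodging_image_offense:
        # Step (e): combination of offense and defense -- compare the
        # likelihoods of hit success and take the stronger side's branch.
        return "player_offense" if player_strength > image_strength else "player_defense"
    if aimed_at_sensitive_parts:
        return "player_offense"   # step (c) -> step (f): plan a defensive image motion
    if dodging_image_offense:
        return "player_defense"   # step (d) -> step (g): plan an offensive image motion
    return "none"                 # no event detected in this window
```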
5. The method of claim 1 wherein the degree of the player's speed is decided by adjustment of the "b" number of frames during initialization, and the degree of expertise is decided by classifying predictions and plans.
6. The method of claim 1 further comprising the step of increasing the number of cameras and display monitors to assist the player's view of the image at different angles while turning and facing from one camera to another, wherein:
a) the image processor provides an image 3-D field of play for the player to use as a visual guideline for his/her movements in a field of play, while the image is moved around from one side of the field of play to the other;
b) the image processor detects player positions from the different cameras, decides which camera provides the best detection angle, and displays the image in the relevant field of play to be viewed by the player.
7. The method of claim 1 wherein two local players use two sets of camera(s), two sets of displays, and an image processor that examines individual video pictures from each player and displays the video or planned image of the opponent player.
8. The method of claim 1 wherein two remote players use two sets of camera(s), two sets of displays, and two sets of image processors, further comprising the steps of:
a) each processor examining the videos from its relevant local player;
b) each processor receiving the opponent's image motion information (or the actual opponent's video and other relevant information) via remote transmission facilities on a frame-by-frame basis;
c) each processor displaying an image of the opponent and controlling the image based upon the information received from the opponent's motions;
d) each processor providing scores on each display;
e) one processor determining the winning score, to be displayed on both monitors.
9. A method of playing baseball, tennis, golf, or other related games between a player having a sport instrument in hand and the images of offense and defense players, wherein a ball, the images of the players, and an image of the field are generated by a processor; the method comprising the steps of:
a. identifying the play instrument and portions of the player's body with individual colored elements;
b. planning and playing each computer-generated thrown image ball with a known trajectory, speed, acceleration, and a prediction, simulating a professional player;
c. generating an image of the field of play in 3-D whereby offense and defense actions take place by image offense and image defense players;
d. processing the incoming frames in the vicinity of the time of impact of the player's instrument with the image ball;
e. recording the movements of the player and the instrument as a video image and electronically or optically filtering the image into separate signals according to the colored elements;
f. determining positions in 3-space of the portions of the player and the instrument on each recorded video frame;
g. following the trajectory of the player's body parts and the sport instrument, utilizing the method of claim 4, to calculate changes in trajectory, velocity, and acceleration of the portions of the player's body and the trajectory of the instrument;
h. predicting the trajectory, velocity, and acceleration of the image ball being hit by the player's instrument;
i. moving the image players as a result of the predicted trajectory of the ball and each image's physical location in the field of play at the moment of impact of the instrument with the image ball;
j. calculating the likelihood of success for the image ball to stay within the image 3-D field of play;
k. displaying the predicted trajectory of the ball hit by the player's instrument;
l. displaying the images of the players, playing offense and defense and reacting to the image ball, based upon the positions of the image players and the prediction;
m. comparing the player's prediction and follow-on to the initially planned trajectory (step b), and displaying scores.
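Steps (h) and (j) of claim 9 can be sketched as a ballistic prediction for the image ball after impact plus an in-bounds check. This is an illustration only: gravity-only flight (no drag or spin) and the rectangular, tennis-court-sized field bounds are simplifying assumptions.

```python
G = 9.81  # gravitational acceleration, m/s^2

def landing_point(x0, y0, z0, vx, vy, vz, g=G):
    """Return (x, y) where a ball launched from (x0, y0, z0) with velocity
    (vx, vy, vz) returns to height z = 0, assuming gravity-only flight
    with the z axis pointing up."""
    # Solve z0 + vz*t - g*t^2/2 = 0 for the positive root.
    disc = vz * vz + 2.0 * g * z0
    t = (vz + disc ** 0.5) / g
    return x0 + vx * t, y0 + vy * t

def in_field(x, y, x_max=23.77, y_max=10.97):
    """Step (j): does the predicted landing point stay within the image 3-D
    field of play? (Hypothetical rectangular bounds.)"""
    return 0.0 <= x <= x_max and 0.0 <= y <= y_max
```

The image players of step (i) would then be moved toward the predicted landing point while the displayed ball follows the same trajectory.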
10. A method of playing a motion related hand-to-hand combat type game between a first real player and second real player; the method comprising the steps of:
a) identifying portions of the players with distinct colored elements;
b) positioning the players in front of separate video screens upon which an image of the first player is projected and viewable by the second player, and an image of the second player is projected and viewable by the first player;
c) recording video frames of images of combat movement between the players and filtering each of the images into respective separate signals according to the colored elements;
d) reviewing each of the frames of each of the players in sequence so as to determine position changes from frame to frame, and calculating velocity, acceleration and trajectory of each of the portions of each of the players;
e) identifying portions of the players in contact to determine when the portions of the players virtually intersect;
f) assigning scores to the players in accordance with a set of scoring rules; and
g) repeating steps (c) through (f) until one of the players has achieved a set number of points and is therefore declared a winner.
US11/349,431 2005-07-25 2006-02-06 Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method Abandoned US20070021207A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/349,431 US20070021207A1 (en) 2005-07-25 2006-02-06 Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/189,176 US20070021199A1 (en) 2005-07-25 2005-07-25 Interactive games with prediction method
US11/349,431 US20070021207A1 (en) 2005-07-25 2006-02-06 Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/189,176 Continuation-In-Part US20070021199A1 (en) 2005-07-25 2005-07-25 Interactive games with prediction method

Publications (1)

Publication Number Publication Date
US20070021207A1 true US20070021207A1 (en) 2007-01-25

Family

ID=46325234

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/349,431 Abandoned US20070021207A1 (en) 2005-07-25 2006-02-06 Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method

Country Status (1)

Country Link
US (1) US20070021207A1 (en)



Patent Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4375674A (en) * 1980-10-17 1983-03-01 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Kinesimetric method and apparatus
US4503506A (en) * 1981-08-05 1985-03-05 Westinghouse Electric Corp. Apparatus for mapping and identifying an element within a field of elements
US4542291A (en) * 1982-09-29 1985-09-17 Vpl Research Inc. Optical flex sensor
US4563617A (en) * 1983-01-10 1986-01-07 Davidson Allen S Flat panel television/display
US4843568A (en) * 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
USRE33559E (en) * 1986-11-13 1991-03-26 James Fallacaro System for enhancing audio and/or visual presentation
US4736097A (en) * 1987-02-02 1988-04-05 Harald Philipp Optical motion sensor
US4988981A (en) * 1987-03-17 1991-01-29 Vpl Research, Inc. Computer data entry and manipulation apparatus and method
US4988981B1 (en) * 1987-03-17 1999-05-18 Vpl Newco Inc Computer data entry and manipulation apparatus and method
US4817950A (en) * 1987-05-08 1989-04-04 Goo Paul E Video game control unit and attitude sensor
US5045687A (en) * 1988-05-11 1991-09-03 Asaf Gurner Optical instrument with tone signal generating means
US5288078A (en) * 1988-10-14 1994-02-22 David G. Capper Control interface apparatus
US4925189A (en) * 1989-01-13 1990-05-15 Braeunig Thomas F Body-mounted video game exercise device
US5229756A (en) * 1989-02-07 1993-07-20 Yamaha Corporation Image control apparatus
US5130794A (en) * 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5177872A (en) * 1990-10-05 1993-01-12 Texas Instruments Incorporated Method and apparatus for monitoring physical positioning of a user
US5185561A (en) * 1991-07-23 1993-02-09 Digital Equipment Corporation Torque motor as a tactile feedback device in a computer system
US5442168A (en) * 1991-10-15 1995-08-15 Interactive Light, Inc. Dynamically-activated optical instrument for producing control signals having a self-calibration means
US5495576A (en) * 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5317489A (en) * 1993-09-22 1994-05-31 Sal Delli Gatti Illuminated apparatus for playing a game of horseshoes
US5490784A (en) * 1993-10-29 1996-02-13 Carmein; David E. E. Virtual reality system with enhanced sensory apparatus
US5616078A (en) * 1993-12-28 1997-04-01 Konami Co., Ltd. Motion-controlled video entertainment system
US5577981A (en) * 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
US5412554A (en) * 1994-04-21 1995-05-02 Lee; Deng-Ran Compound lamp shade frame
US5524637A (en) * 1994-06-29 1996-06-11 Erickson; Jon W. Interactive system for measuring physiological exertion
US5583795A (en) * 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US5913727A (en) * 1995-06-02 1999-06-22 Ahdoot; Ned Interactive movement and contact simulation game

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9390501B2 (en) * 2007-05-24 2016-07-12 Pillar Vision, Inc. Stereoscopic image capture with performance outcome prediction in sporting environments
US8869072B2 (en) * 2009-01-30 2014-10-21 Microsoft Corporation Gesture recognizer system architecture
US20110285620A1 (en) * 2009-01-30 2011-11-24 Microsoft Corporation Gesture recognizer system architecture
US9280203B2 (en) 2009-01-30 2016-03-08 Microsoft Technology Licensing, Llc Gesture recognizer system architecture
WO2010107577A3 (en) * 2009-03-20 2011-01-06 Microsoft Corporation Virtual object manipulation
US9256282B2 (en) 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US20100241998A1 (en) * 2009-03-20 2010-09-23 Microsoft Corporation Virtual object manipulation
US20100306685A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation User movement feedback via on-screen avatars
US20110039624A1 (en) * 2009-08-15 2011-02-17 Miodrag Potkonjak Cyber-physical game
US11103787B1 (en) 2010-06-24 2021-08-31 Gregory S. Rabin System and method for generating a synthetic video stream
DE102010040699A1 (en) 2010-09-14 2012-03-15 Otto-Von-Guericke-Universität Magdeburg Medizinische Fakultät Apparatus for determining anticipation skill of athletes in sport activities, has projection device and video camera that are connected with data processing system to which display screen is connected
US8674980B2 (en) 2010-10-29 2014-03-18 Au Optronics Corp. Three-dimensional image interactive system and position-bias compensation method of the same
US8808090B2 (en) * 2011-02-18 2014-08-19 Konami Digital Entertainment Co., Ltd. Game device, game control method, program, recording medium and game management device
US20130130791A1 (en) * 2011-02-18 2013-05-23 Konami Digital Entertainment Co., Ltd. Game device, game control method, program, recording medium and game management device
US20120249540A1 (en) * 2011-03-28 2012-10-04 Casio Computer Co., Ltd. Display system, display device and display assistance device
US8994797B2 (en) * 2011-03-28 2015-03-31 Casio Computer Co., Ltd. Display system, display device and display assistance device
US20120327206A1 (en) * 2011-06-24 2012-12-27 Kabushiki Kaisha Toshiba Information processing apparatus, computer implemented method for processing information and non-transitory medium storing a computer program for processing information
US20150231477A1 (en) * 2011-10-25 2015-08-20 Aquimo, Llc Method and system to analyze sports motions using motion sensors of a mobile device
US9895590B2 (en) * 2011-10-25 2018-02-20 Aquimo, Llc Method and system to analyze sports motions using motion sensors of a mobile device
US11249555B2 (en) 2012-12-13 2022-02-15 Eyesight Mobile Technologies, LTD. Systems and methods to detect a user behavior within a vehicle
US11137832B2 (en) 2012-12-13 2021-10-05 Eyesight Mobile Technologies, LTD. Systems and methods to predict a user action within a vehicle
US11726577B2 (en) 2012-12-13 2023-08-15 Eyesight Mobile Technologies, LTD. Systems and methods for triggering actions based on touch-free gesture detection
US20160179211A1 (en) * 2012-12-13 2016-06-23 Eyesight Mobile Technologies, LTD. Systems and methods for triggering actions based on touch-free gesture detection
US20190250714A1 (en) * 2012-12-13 2019-08-15 Eyesight Mobile Technologies, LTD. Systems and methods for triggering actions based on touch-free gesture detection
US10203764B2 (en) * 2012-12-13 2019-02-12 Eyesight Mobile Technologies, LTD. Systems and methods for triggering actions based on touch-free gesture detection
US9236039B2 (en) * 2013-03-04 2016-01-12 Empire Technology Development Llc Virtual instrument playing scheme
US9734812B2 (en) 2013-03-04 2017-08-15 Empire Technology Development Llc Virtual instrument playing scheme
US20150143976A1 (en) * 2013-03-04 2015-05-28 Empire Technology Development Llc Virtual instrument playing scheme
US20140363799A1 (en) * 2013-06-06 2014-12-11 Richard Ivan Brown Mobile Application For Martial Arts Training
US9566508B2 (en) * 2014-07-11 2017-02-14 Zeroplus Technology Co., Ltd. Interactive gaming apparatus using an image projected onto a flexible mat
US20160008710A1 (en) * 2014-07-11 2016-01-14 Zeroplus Technology Co., Ltd. Interactive gaming apparatus
US10382672B2 (en) 2015-07-14 2019-08-13 Samsung Electronics Co., Ltd. Image capturing apparatus and method
US20170354866A1 (en) * 2016-06-10 2017-12-14 Nintendo Co., Ltd. Non-transitory storage medium having game program stored therein, information processing apparatus, information processing system, game processing method
US10653947B2 (en) * 2016-06-10 2020-05-19 Nintendo Co., Ltd. Non-transitory storage medium having game program stored therein, information processing apparatus, information processing system, game processing method
CN106730815A (en) * 2016-12-09 2017-05-31 福建星网视易信息系统有限公司 The body-sensing interactive approach and system of a kind of easy realization

Similar Documents

Publication Publication Date Title
US20070021207A1 (en) Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method
US20070021199A1 (en) Interactive games with prediction method
US10821347B2 (en) Virtual reality sports training systems and methods
Miles et al. A review of virtual environments for training in ball sports
US6951515B2 (en) Game apparatus for mixed reality space, image processing method thereof, and program storage medium
US5913727A (en) Interactive movement and contact simulation game
US11826628B2 (en) Virtual reality sports training systems and methods
US20110256914A1 (en) Interactive games with prediction and plan with assisted learning method
CN102331840B (en) User selection and navigation based on looped motions
US9345957B2 (en) Enhancing a sport using an augmented reality display
US9898675B2 (en) User movement tracking feedback to improve tracking
TW201936241A (en) Enhanced gaming systems and methods
JP2000033184A (en) Whole body action input type game and event device
JP2000350860A (en) Composite reality feeling device and method for generating composite real space picture
CN113709411A (en) Sports auxiliary training system of MR intelligent glasses based on eye movement tracking technology
CN111672089B (en) Electronic scoring system for multi-person confrontation type project and implementation method
KR20170092929A (en) Apparatus for base-ball practice, sensing device and sensing method used to the same and control method for the same
Dabnichki Computers in sport
US11331551B2 (en) Augmented extended realm system
Katz et al. Virtual reality
US11951376B2 (en) Mixed reality simulation and training system
JP7248353B1 (en) Hitting analysis system and hitting analysis method
RU2786594C1 (en) Virtual reality simulator for working the skill of the hockey player in shooting the puck and determining the level of skill
JP7344096B2 (en) Haptic metadata generation device, video-tactile interlocking system, and program
US20220288457A1 (en) Alternate reality system for a ball sport

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION