US20110230266A1 - Game device, control method for a game device, and non-transitory information storage medium - Google Patents
Game device, control method for a game device, and non-transitory information storage medium
- Publication number
- US20110230266A1 (application US 13/043,800)
- Authority
- US
- United States
- Prior art keywords
- player
- game
- determination
- space
- subject space
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/428—Processing input control signals of video game devices involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
- A63F13/44—Processing input control signals of video game devices involving timing of operations, e.g. performing an action within a time slot
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress using indicators, e.g. showing the condition of a game character on screen
- A63F13/5375—Controlling the output signals based on the game progress using indicators for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content automatically by game devices or servers from real world data, e.g. measurement in live racing competition
- A63F13/655—Generating or modifying game content automatically by importing photos, e.g. of the player
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1018—Calibration; Key and button assignment
- A63F2300/1087—Features of games characterized by input arrangements comprising photodetecting means, e.g. a camera
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6607—Methods for processing data for rendering three dimensional images for animating game characters, e.g. skeleton kinematics
Description
- the present invention relates to a game device, a control method for a game device, and a non-transitory information storage medium.
- JP 2005-287830 A describes the following technology. That is, an image obtained by photographing the player and a reference game image stored in advance are synthesized, and the synthesized image is displayed on a monitor, to thereby enable the player to understand a movement that the player should make in the game.
- the present invention has been made in view of the above-mentioned problems, and therefore has an object to provide a game device, a control method for a game device, and a non-transitory information storage medium, which are capable of dealing with displacement in position of a player during gameplay.
- a game device includes: position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; game processing execution means for executing game processing based on a result of the determination made by the determination means; and determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
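The containment determination recited above can be pictured as a simple axis-aligned box test in the three-dimensional space. This is a hedged sketch: the box bounds, point format, and function name are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch of the "determination subject space" containment check,
# modeled as an axis-aligned box in Xw-Yw-Zw world coordinates (assumed).

def contains(box_min, box_max, p):
    """Return True if point p = (x, y, z) lies inside the axis-aligned box."""
    return all(lo <= c <= hi for lo, hi, c in zip(box_min, box_max, p))

# Illustrative 1 m x 2 m x 1 m determination subject space in front of the sensor.
box_min, box_max = (-0.5, 0.0, 1.5), (0.5, 2.0, 2.5)
print(contains(box_min, box_max, (0.2, 1.0, 2.0)))  # player inside the space
print(contains(box_min, box_max, (0.9, 1.0, 2.0)))  # player stepped out sideways
```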
- a control method for a game device includes: a position acquiring step of acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; a determination step of determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; a game processing execution step of executing game processing based on a result of the determination made in the determination step; and a determination subject space changing step of changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
- a program causes a computer to function as a game device including: position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; game processing execution means for executing game processing based on a result of the determination made by the determination means; and determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
- a non-transitory computer-readable information storage medium is a non-transitory computer-readable information storage medium having the above-mentioned program recorded thereon.
- the determination subject space changing means includes: means for determining whether or not a state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for a reference period; and means for changing the position of the determination subject space in a case where the state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for the reference period.
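The reference-period behavior described above (change the space only after the out-of-space state has persisted) might be sketched as a per-frame counter; the class name, the frame-based period, and the recenter-on-player policy are hypothetical details for illustration.

```python
# Sketch: recenter the determination subject space only after the player has
# been outside it for a sustained reference period (counted in frames).

class SubjectSpaceTracker:
    def __init__(self, center, reference_frames=60):
        self.center = center                    # (x, y, z) center of the space
        self.reference_frames = reference_frames
        self.outside_count = 0                  # consecutive out-of-space frames

    def update(self, player_pos, is_inside):
        """Call once per frame; returns True if the space was just recentered."""
        if is_inside:
            self.outside_count = 0
            return False
        self.outside_count += 1
        if self.outside_count >= self.reference_frames:
            self.center = player_pos            # move the space to the player
            self.outside_count = 0
            return True
        return False

tracker = SubjectSpaceTracker(center=(0.0, 0.0, 2.0), reference_frames=60)
for _ in range(59):
    tracker.update((0.9, 0.0, 2.0), is_inside=False)   # not yet long enough
print(tracker.update((0.9, 0.0, 2.0), is_inside=False))  # 60th frame: recenter
```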
- the game device further includes display control means for causing display means to display a game screen containing a game character and a focused area having its lightness set higher than the lightness of other areas, in which the display control means includes means for controlling a positional relation between a display position of the game character and a display position of the focused area based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.
- the game device further includes display control means for causing display means to display a game screen containing a first game character and a second game character, in which the display control means includes means for controlling a positional relation between a display position of the first game character and a display position of the second game character based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.
- FIG. 1 is a diagram illustrating a positional relation among a position detecting device, a game device, and a player;
- FIG. 2 is a diagram illustrating an example of a photographed image generated by a CCD camera;
- FIG. 3 is a diagram for describing a method of measuring a depth of the player, which is performed by an infrared sensor;
- FIG. 4 is a diagram illustrating an example of a depth image acquired by the infrared sensor;
- FIG. 5 is a diagram illustrating an example of three-dimensional position information generated by the position detecting device;
- FIG. 6 is a diagram illustrating a position of the player, which is identified by the three-dimensional position information;
- FIG. 7 is a diagram illustrating a space to be photographed by the position detecting device;
- FIG. 8 is a diagram illustrating an example of a game screen displayed by the game device;
- FIG. 9 is a diagram illustrating, as an example, the game screen displayed by the game device in a case where the player has stepped out of a determination subject space;
- FIG. 10 is a diagram illustrating the position detecting device and the player viewed from an Xw-Zw plane;
- FIG. 11 is an example of the game screen displayed in the case where the player has stepped out of the determination subject space;
- FIG. 12 is a diagram illustrating the position detecting device and the player viewed from an Xw-Yw plane;
- FIG. 13 is an example of the game screen displayed in the case where the player has stepped out of the determination subject space;
- FIG. 14 is a diagram illustrating a hardware configuration of the position detecting device;
- FIG. 15 is a diagram illustrating a hardware configuration of the game device;
- FIG. 16 is a functional block diagram illustrating a group of functions to be implemented on the game device;
- FIG. 17 is a diagram illustrating an example of reference action information;
- FIG. 18 is a diagram illustrating an example of action determination criterion information;
- FIG. 19 is a flow chart illustrating an example of processing to be executed on the game device;
- FIG. 20 is a diagram illustrating the determination subject space after change;
- FIG. 21 is a diagram illustrating a case where a display position of an image contained in the game screen has been changed;
- FIG. 22 is a diagram illustrating another example of the game screen;
- FIG. 23 is a diagram illustrating an example of the game screen;
- FIG. 24 is a diagram illustrating an example of the game screen;
- FIG. 25 is a diagram illustrating a case where the display position of an image contained in the game screen has been changed.
- a game device is implemented by, for example, a home-use game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer.
- FIG. 1 is a diagram illustrating a positional relation among a position detecting device 1 , a game device 20 , and a player 100 .
- the player 100 is positioned, for example, in front of the position detecting device 1 .
- the position detecting device 1 and the game device 20 are connected to each other so as to be able to communicate data therebetween. Further, the player 100 plays a game in, for example, a living room where items of furniture F are placed.
- the position detecting device 1 generates information relating to a position of the player 100 based on an image acquired by photographing the player 100 and information relating to a distance between the position detecting device 1 and the player 100 .
- the position detecting device 1 detects sets of three-dimensional coordinates corresponding to a plurality of parts (for example, head, shoulder, etc.) constituting the body of the player 100 .
- the game device 20 acquires the information relating to the position of the player 100 from the position detecting device 1 .
- the game device 20 acquires three-dimensional coordinates that indicate a standing position of the player 100 in the three-dimensional space from the position detecting device 1 .
- the game device 20 controls the game based on changes in the three-dimensional coordinates.
- a change in the three-dimensional coordinates associated with the player 100 corresponds to an action of the player 100 .
- for example, if the player 100 moves the right arm, the sets of three-dimensional coordinates corresponding to the right elbow and the right hand of the player 100 mainly change.
- the position detecting device 1 generates the information relating to the position of the player 100 (three-dimensional position information).
- the position detecting device 1 includes, for example, a CCD camera 2 , an infrared sensor 3 , and a microphone 4 including a plurality of microphones.
- the three-dimensional position information of the player 100 is generated based on information acquired from the CCD camera 2 and the infrared sensor 3 .
- the CCD camera 2 is a publicly-known camera comprising a CCD image sensor.
- the CCD camera 2 photographs the player 100 .
- the CCD camera 2 generates a still image (for example, RGB digital image) by photographing the player 100 at predetermined time intervals (for example, every 1/60th of a second).
- the still image generated by the CCD camera 2 is referred to as a photographed image.
- the photographed image contains an object located within a field of view of the CCD camera 2 .
- FIG. 2 is a diagram illustrating an example of the photographed image generated by the CCD camera 2 .
- the photographed image contains, for example, the player 100 .
- if other objects such as the furniture F are within the field of view, the photographed image contains those objects as well, which are omitted in FIG. 2 for simplicity of description.
- in the photographed image, an Xs-axis and a Ys-axis, which are orthogonal to each other, are set.
- the upper left corner of the photographed image is set as an origin point Os (0,0).
- the lower right corner of the photographed image is set as a coordinate Pmax (Xmax,Ymax).
- the position of each pixel in the photographed image is identified by a two-dimensional coordinate (Xs-Ys coordinate) that is assigned to each pixel.
- the infrared sensor 3 is formed of, for example, an infrared emitting device and an infrared receiving device (for example, infrared diodes).
- the infrared sensor 3 detects reflected light obtained by emitting infrared light.
- the infrared sensor 3 measures the depth of a subject (for example, player 100 ) based on a detection result of the reflected light.
- the depth of a subject is a distance between a measurement reference position (for example, position of the infrared receiving device of the infrared sensor 3 ) and the position of the subject.
- the measurement reference position is a position that serves as a reference in measuring the depth of the position of the player 100 .
- the measurement reference position may be a predetermined position associated with the position of the position detecting device 1 .
- the infrared sensor 3 measures the depth of the player 100 based, for example, on a time of flight (TOF), which is a time required for the infrared sensor 3 to receive reflected light after emitting infrared light.
- FIG. 3 is a diagram for describing a method of measuring the depth of the player 100 , which is performed by the infrared sensor 3 .
- the infrared sensor 3 emits pulsed infrared light at predetermined intervals.
- the infrared light emitted from the infrared sensor 3 spreads spherically with an emission position of the infrared sensor 3 at the center.
- the infrared light emitted from the infrared sensor 3 strikes surfaces of, for example, the body of the player 100 and other objects (for example, furniture F, walls, etc.) located in the living room.
- the infrared light that has struck those surfaces is reflected.
- the reflected infrared light is detected by the infrared receiving device of the infrared sensor 3 .
- the infrared sensor 3 detects reflected light having a phase shifted by 180° from that of the emitted infrared light.
- the TOF of the infrared light reflected by both hands of the player 100 is shorter than the TOF of the infrared light reflected by the torso of the player 100 .
- the value determined as follows corresponds to the distance between the measurement reference position and the player 100 (that is, depth). Specifically, the value is determined by multiplying a time required for the infrared sensor 3 to detect the reflected light after emitting the infrared light (that is, TOF) by the speed of the infrared light and then dividing the resultant value by two. In this manner, the infrared sensor 3 can measure the depth of the player 100 .
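The round-trip arithmetic described above (the emitted light travels to the subject and back, so distance is TOF multiplied by the speed of light and divided by two) can be sketched directly; the sample TOF value is illustrative.

```python
# Sketch of the TOF-to-depth calculation: depth = TOF * speed_of_light / 2.

C = 299_792_458.0  # speed of light in m/s (infrared light travels at c)

def depth_from_tof(tof_seconds):
    """Depth in meters from a round-trip time of flight in seconds."""
    return tof_seconds * C / 2.0

# A reflection detected roughly 13.3 ns after emission puts the subject
# about 2 m from the measurement reference position.
print(round(depth_from_tof(13.342e-9), 2))
```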
- the infrared sensor 3 can also detect an outline of a subject (player 100 ) by detecting depth differences acquired from the reflected infrared light.
- the fact that the infrared sensor 3 receives the reflected infrared light as described above means that an object is located at that place. If there is no other object located behind the object, the depth difference between the object and the surroundings of the object is large. Specifically, for example, the depth difference is large between a depth acquired by the infrared light reflected from the player 100 and a depth acquired by the infrared light reflected from the wall behind the player 100 , and hence it is possible to detect the outline of the object by joining portions having the depth differences larger than a predetermined value.
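The depth-difference outline detection described here can be sketched by thresholding neighbor-to-neighbor depth jumps. The threshold value and the toy depth map (a "player" at 2 m in front of a wall at 4 m) are assumptions for illustration only.

```python
import numpy as np

def outline_mask(depth, threshold):
    """Boolean mask of pixels whose depth differs from a horizontal or
    vertical neighbor by more than `threshold` (the outline candidates)."""
    mask = np.zeros(depth.shape, dtype=bool)
    mask[:, :-1] |= np.abs(np.diff(depth, axis=1)) > threshold  # horizontal jumps
    mask[:-1, :] |= np.abs(np.diff(depth, axis=0)) > threshold  # vertical jumps
    return mask

depth = np.full((4, 6), 4.0)   # background wall at 4 m
depth[:, 2:4] = 2.0            # "player" columns at 2 m
print(outline_mask(depth, 0.5).astype(int))  # 1s mark the depth discontinuities
```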
- the method of detecting the outline of an object is not limited to the above-mentioned example.
- the outline may be detected based on the brightness of each pixel of the photographed image acquired by the CCD camera 2 .
- the light that has returned to the infrared sensor 3 may be subjected to predetermined filtering processing. Specifically, noise may be reduced by employing such a configuration that only reflected light corresponding to the infrared light emitted by the infrared sensor 3 is detected by a light detection sensor.
- information relating to the depth of the player 100 (depth information), which is detected as described above, is expressed as, for example, a depth image.
- for example, the depth information is expressed as a gray-scale depth image (for example, 256-level gray-scale image data).
- FIG. 4 is a diagram illustrating an example of the depth image acquired by the infrared sensor 3 .
- an object located close to the infrared sensor 3 is expressed as bright (brightness is high), and an object located far from the infrared sensor 3 is expressed as dark (brightness is low).
- for example, the depth image is expressed as 256-level gray-scale image data.
- the depth of the player 100 corresponds to the brightness (pixel value) of the depth image.
- for example, every time the depth of the subject changes by 2 cm, the brightness of the depth image changes by one gradation; this means that the infrared sensor 3 is capable of detecting the depth of the subject in units of 2 cm.
- for example, pixels corresponding to both hands of the player 100 , which are closer to the infrared sensor 3 , are expressed as brighter (brightness is higher) than pixels corresponding to the torso.
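One way to picture this quantization (one gray level per 2 cm of depth, nearer objects brighter) is the following sketch; the near-plane depth of the brightest level is an assumed calibration value, not from the patent.

```python
# Sketch: map a depth in cm to an 8-bit gray level at 2 cm per gradation,
# with closer subjects rendered brighter, as in the depth image of FIG. 4.

LEVEL_CM = 2    # one gray level per 2 cm of depth
NEAR_CM = 80    # assumed depth rendered at the brightest level (255)

def depth_to_level(depth_cm):
    level = 255 - (depth_cm - NEAR_CM) // LEVEL_CM
    return max(0, min(255, int(level)))  # clamp to the 0..255 gray range

print(depth_to_level(80))    # nearest assumed depth: brightest level
print(depth_to_level(200))   # 1.2 m farther: 60 gradations darker
```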
- the infrared sensor 3 similarly to the CCD camera 2 , the infrared sensor 3 generates the depth image at predetermined time intervals (for example, every 1/60th of a second). Based on the photographed image acquired by the CCD camera 2 and the depth image acquired by the infrared sensor 3 , the three-dimensional position information is generated relating to the position of the player 100 .
- the composite image contains, for each pixel, color information (lightness of each of R, G, and B) and the depth information.
- the position of at least one of the photographed image and the depth image is corrected based on the positional difference between the CCD camera 2 and the infrared sensor 3 .
- for example, if the CCD camera 2 and the infrared sensor 3 are spaced apart from each other by 2 cm in the horizontal direction, the coordinates of each pixel of the depth image are shifted by the number of pixels that corresponds to 2 cm, to thereby correct the position.
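A minimal sketch of this correction step, assuming the physical offset has already been converted to a pixel count; the 3×4 toy image and zero-padding of exposed columns are illustrative choices.

```python
import numpy as np

def align_depth(depth, offset_px):
    """Shift the depth image `offset_px` pixels right (left if negative) so it
    lines up with the photographed image; exposed columns get 0 (unknown)."""
    aligned = np.zeros_like(depth)
    if offset_px >= 0:
        aligned[:, offset_px:] = depth[:, :depth.shape[1] - offset_px]
    else:
        aligned[:, :offset_px] = depth[:, -offset_px:]
    return aligned

depth = np.arange(12, dtype=float).reshape(3, 4)  # toy 3x4 depth image
print(align_depth(depth, 1))  # each row shifted one pixel to the right
```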
- the three-dimensional position information is generated based on the composite image.
- description is given by taking, as an example, a case where the three-dimensional position information represents the three-dimensional coordinate corresponding to each of the parts (for example, head, shoulder, etc.) of the body of the player 100 .
- the three-dimensional position information is generated in the following manner.
- pixels corresponding to the outline of the player 100 are identified. Pixels enclosed within the outline of the player 100 are the pixels corresponding to the body of the player 100 .
- the color information (lightnesses of R, G, and B) of the above-mentioned pixels enclosed within the outline is referred to.
- pixels corresponding to each part of the body of the player 100 are identified.
- a publicly-known method is applicable, such as a pattern matching method in which the object (that is, each part of the body of the player 100 ) is extracted from the image through a comparison with a comparison image (training image).
- pixels corresponding to the positions of the head, both elbows, etc. of the player 100 may be identified by calculating a velocity vector of each part of the body based on a change in color information of each pixel of the photographed image and then detecting a motion vector of each pixel based on an optical flow representing the movement of the object (for example, gradient method or filtering method).
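The pattern-matching idea mentioned above (extracting a body part by comparison with a training/comparison image) can be sketched as an exhaustive sum-of-squared-differences search. Real systems use far more robust matchers; the toy image and template here are assumptions for illustration.

```python
import numpy as np

def best_match(image, template):
    """Return the (Xs, Ys) position where the template best matches the image,
    using a brute-force sum-of-squared-differences comparison."""
    th, tw = template.shape
    best, best_pos = float("inf"), (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            ssd = np.sum((image[y:y + th, x:x + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (x, y)
    return best_pos

image = np.zeros((8, 8))
image[2:4, 5:7] = 1.0              # the "body part" to be located
template = np.ones((2, 2))         # toy comparison (training) image
print(best_match(image, template))
```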
- the three-dimensional coordinates of the head, both elbows, etc. of the player 100 are calculated.
- the three-dimensional coordinates are generated by carrying out predetermined matrix transformation processing on those pixel values.
- the matrix transformation processing is executed through, for example, a matrix operation similar to transformation processing performed in 3D graphics between two coordinate systems of a world coordinate system and a screen coordinate system.
- the RGB value indicating the color information of the pixel and the D value indicating the depth are substituted into a predetermined transformation matrix, to thereby calculate the three-dimensional coordinate of the pixel. That is, the three-dimensional coordinates of each part of the player 100 are calculated.
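A hedged sketch of such a screen-to-world transformation, using an assumed pinhole-camera model: the focal lengths and principal point below are made-up calibration values for a 640×480 image, not values from the patent.

```python
# Sketch: back-project a pixel (Xs, Ys) plus its depth to world coordinates,
# analogous to the screen-to-world transform used in 3D graphics.

FX = FY = 525.0          # focal lengths in pixels (assumed calibration)
CX, CY = 320.0, 240.0    # principal point for a 640x480 image (assumed)

def screen_to_world(xs, ys, depth):
    """Map pixel (xs, ys) with depth in meters to (Xw, Yw, Zw) in meters."""
    xw = (xs - CX) * depth / FX
    yw = (CY - ys) * depth / FY   # screen Ys grows downward, world Yw upward
    return (xw, yw, depth)

# A pixel at the image center and 2 m deep lies on the camera axis.
print(screen_to_world(320.0, 240.0, 2.0))
```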
- a publicly-known method may be applied, and the calculation method is not limited to the above-mentioned example.
- the coordinate transformation may be performed using a lookup table.
- FIG. 5 is a diagram illustrating an example of the three-dimensional position information generated by the position detecting device 1 .
- in the three-dimensional position information, for example, each part of the player 100 and its three-dimensional coordinates are stored in association with each other.
- FIG. 6 is a diagram illustrating the position of the player 100 , which is identified by the three-dimensional position information.
- a predetermined position corresponding to the position detecting device 1 (for example, the measurement reference position) is set as an origin point Ow.
- the origin point Ow represents the three-dimensional coordinate corresponding to the measurement reference position of the infrared sensor 3 .
- the position of the origin point Ow may be set anywhere in the three-dimensional space in which the player 100 exists.
- the three-dimensional coordinate corresponding to the origin point Os of the photographed image may be set as the origin point Ow.
- description is given by taking, as an example, a case where sets of three-dimensional coordinates corresponding to, for example, the head P 1 , neck P 2 , right shoulder P 3 , left shoulder P 4 , right elbow P 5 , left elbow P 6 , right hand P 7 , left hand P 8 , chest P 9 , waist P 10 , right knee P 11 , left knee P 12 , right heel P 13 , left heel P 14 , a right toe P 15 , and a left toe P 16 of the player 100 are acquired as the three-dimensional position information.
- the part of the body of the player 100 which is indicated by the three-dimensional position information, may be a part that is determined in advance from the player's skeletal frame.
- any part of the body may be used as long as the part is identifiable by the above-mentioned pattern matching method.
- the three-dimensional position information is generated at predetermined time intervals (for example, every 1/60th of a second).
- the generated three-dimensional position information is transmitted from the position detecting device 1 to the game device 20 at predetermined time intervals.
- the game device 20 receives the three-dimensional position information transmitted from the position detecting device 1 , and recognizes the position of the body of the player 100 based on the three-dimensional position information. Specifically, if the player 100 has performed an action of dancing or kicking a ball, the three-dimensional position information changes in response to this action, and hence the game device 20 recognizes the movement of the player based on the change in three-dimensional position information. The game device 20 executes the game while recognizing the movement of the body of the player based on the three-dimensional position information, details of which are described later.
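The recognition step above can be roughly sketched by diffing the three-dimensional position information of two consecutive frames to obtain a motion vector per body part. The dict layout and the two sample parts below are assumptions for illustration; the real data covers sixteen parts (head P 1 through left toe P 16).

```python
# Sketch of movement recognition from the change in three-dimensional
# position information between consecutive frames (generated every
# 1/60th of a second). Coordinates follow the figures' convention
# (Xw: depth, Yw: horizontal, Zw: vertical); values are illustrative.

FRAME_PREV = {"right_hand_P7": (2.0, 0.4, 1.2), "waist_P10": (2.0, 0.0, 0.9)}
FRAME_CURR = {"right_hand_P7": (2.0, 0.4, 1.5), "waist_P10": (2.0, 0.0, 0.9)}

def motion_vectors(prev, curr):
    """Return the per-part change in three-dimensional coordinates; a
    raised right hand shows up as a positive vertical (Zw) component."""
    return {part: tuple(c - p for p, c in zip(prev[part], curr[part]))
            for part in curr}
```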
- Hereinafter, a space in which the position detecting device 1 can detect the player 100 is referred to as the detectable space 60.
- FIG. 7 is a diagram illustrating a space to be photographed by the position detecting device 1 .
- the detectable space 60 (space enclosed with broken lines of FIG. 7 ) is, for example, a predetermined space within the field of view of the CCD camera 2 .
- the field of view of the CCD camera 2 is determined based, for example, on the line-of-sight and the angle of view of the CCD camera 2 .
- the detectable space 60 is such a space as to allow accurate capturing of the movement of the player 100 .
- In a case where the player 100 stands too close to the position detecting device 1, the position detecting device 1 is unable to photograph the entire body of the player 100. In this case, the position detecting device 1 is unable to acquire accurate three-dimensional position information. Therefore, a space relatively close to the position detecting device 1 is excluded from the detectable space 60 even if the space is within the field of view of the CCD camera 2.
- In a case where the player 100 stands too far from the position detecting device 1, the infrared light is attenuated, which results in the position detecting device 1 being unable to detect the reflected light. In this case, the position detecting device 1 is unable to acquire accurate depth information. Therefore, a space relatively far from the position detecting device 1 is excluded from the detectable space 60 even if the space is within the field of view of the CCD camera 2.
- Similarly, in a case where the player 100 stands near either end of the field of view in the horizontal direction, the position detecting device 1 is unable to photograph the entire body of the player 100. In this case, the right side or the left side of the body of the player 100 is omitted from the photographed image ( FIG. 2 ), and hence the position detecting device 1 is unable to acquire accurate three-dimensional position information. Therefore, a space close to both end portions in the horizontal direction is excluded from the detectable space 60 even if the space is within the field of view of the CCD camera 2.
- the detectable space 60 is, for example, a space obtained by excluding the respective spaces described above from the space within the field of view of the CCD camera 2 .
- the detectable space 60 is a space in which the position detecting device 1 can generate accurate three-dimensional position information when the player 100 is standing inside the space.
- the size (volume), shape, and position of the detectable space 60 may be determined in advance by, for example, a game creator, or may be changed according to a state of a room where the position detecting device 1 is installed.
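The exclusions above amount to a simple membership test: a point within the camera's field of view still falls outside the detectable space 60 when it is too close, too far, or too near a horizontal edge. The concrete limits below are illustrative assumptions, not values from the text.

```python
# Sketch of the detectable-space test implied above. Axes follow the
# figures (Xw: depth, Yw: horizontal, Zw: vertical); vertical limits are
# omitted for brevity. near, far, and half_width are assumed values.

def in_detectable_space(xw, yw, zw, near=1.0, far=4.0, half_width=1.5):
    """Return True when (xw, yw, zw) lies inside the detectable space 60."""
    if xw < near:             # too close: whole body cannot be photographed
        return False
    if xw > far:              # too far: the infrared light is attenuated
        return False
    if abs(yw) > half_width:  # too near a horizontal end of the view
        return False
    return True
```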
- a determination subject space 70 is set inside the detectable space 60 . As illustrated in FIG. 7 , for example, the determination subject space 70 is set at a predetermined position inside the detectable space 60 associated with the position detecting device 1 . Further, the determination subject space 70 contains a representative point 71 for specifying a position at which the determination subject space 70 is to be placed.
- the determination subject space 70 is used to define a space in which the player 100 needs to be located.
- the size (volume) and shape of the determination subject space 70 may be determined in advance by, for example, a game creator.
- the position of the determination subject space 70 is changed, for example, according to the position of the player 100 , details of which are described later.
- As long as the player 100 is inside the detectable space 60, the game device 20 can detect the action of the player 100. However, there is a case where the player 100 is unable to move freely in the detectable space 60.
- One example is a case where the player 100 plays the game in a living room or the like of their house as illustrated in FIG. 1 .
- Items of furniture F, such as a desk and a chair, as well as the walls and the like of the living room, are present in the room. Further, there is a case where another player 100 or a person who is watching the game is in the living room. Thus, in a case where the player 100 actually plays the game, various obstacles are often present inside the detectable space 60, and hence, in such a case, the player is unable to move freely in the detectable space 60.
- the determination subject space 70 is set inside the detectable space 60 , and the player is prompted to play the game in the determination subject space 70 .
- the determination subject space 70 serves to show the player 100 a safe space in which the player 100 has a low risk of hitting an obstacle.
- the position of the determination subject space 70 is determined based, for example, on the position of the player 100 at the time of game start (alternatively, immediately before or after the game start).
- the position of the determination subject space 70 is set so as to contain a place where the player 100 is standing at the time of the game start (alternatively, immediately before or after the game start).
- the player 100 remains inside the determination subject space 70 set as described above, there is a high possibility that there will be no such obstacle as a desk or a chair in their surroundings, and hence the player 100 can play the game more safely.
- the setting method for the position of the determination subject space 70 is not limited to the above-mentioned example.
- For example, a position on an extension of the line-of-sight of the CCD camera 2 may be set as an initial position of the determination subject space 70.
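A minimal sketch of the initial placement described above: the determination subject space 70 is positioned so as to contain the place where the player 100 is standing at game start. A box stands in for the truncated pyramid of FIG. 7, and the side lengths, the floor-level representative point, and the anchoring on the player's position are illustrative assumptions.

```python
# Sketch of the initial placement of the determination subject space 70.
# Axes follow the figures (Xw: depth, Yw: horizontal, Zw: vertical).
# Side lengths are assumed values, not taken from the text.

def place_determination_space(start_pos, depth_len=1.5, width_len=1.5,
                              height_len=2.2):
    """Center the space horizontally on the player's standing position at
    game start, with the representative point 71 projected onto the floor."""
    xw, yw, zw = start_pos
    return {
        "representative_point": (xw, yw, 0.0),
        "x_range": (xw - depth_len / 2, xw + depth_len / 2),
        "y_range": (yw - width_len / 2, yw + width_len / 2),
        "z_range": (0.0, height_len),
    }
```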
- In this embodiment, a case where one player 100 plays the game is described, but a plurality of players 100 may play the game.
- In this case, the three-dimensional position information of each player 100 is generated. Specifically, based on the number of outlines of the players 100, the position detecting device 1 can recognize the number of the players 100.
- Then, the same processing as described above is executed with respect to the pixels corresponding to each of the plurality of players 100, and hence it is possible to generate the three-dimensional position information of the plurality of players 100.
- It should be noted that the detection subject may be limited to an object having a predetermined height (for example, one meter) or more, so that an object other than the player 100 may be prevented from being detected.
- In this embodiment, the game is executed based on the three-dimensional position information of the player 100 in the determination subject space 70. Hereinafter, an example of the game is described.
- description is given by taking, as an example, a case where the game device 20 recognizes the standing position and the action of the player based on the three-dimensional position information to execute a dance game.
- the game device 20 executes a game configured such that the player 100 dances to movements of a game character on a game screen 50 .
- the player 100 is required to play the game without moving away from a predetermined standing position.
- By means of the game screen 50 displayed in this embodiment, in a case where the player 100 has stepped out of the determination subject space 70, it is possible to prompt the player 100 to return to the determination subject space 70.
- FIG. 8 is a diagram illustrating an example of the game screen 50 displayed by the game device 20 .
- the game screen 50 includes, for example, a game character 51 , a spotlight 52 , a spotlight area 53 being an area illuminated by the spotlight 52 , and a message 54 .
- the game character 51 stands within the spotlight area 53 (focused area) and dances.
- the lightness (brightness) of the spotlight area 53 is higher (brighter) than the lightness of the other area.
- the lightness of an area outside the spotlight area 53 is lower (darker) than the lightness of the spotlight area 53 .
- Therefore, if the game character 51 moves out of the spotlight area 53, the game character 51 becomes less visible.
- the game character 51 moves the respective parts of its body, thereby serving to show a dance action to be performed by the player 100 . According to movements of the body of the game character 51 , the player 100 dances in front of the position detecting device 1 .
- For example, if the game character 51 has stepped its right foot forward, the player 100 steps their right foot forward as well. Further, for example, if the game character 51 has performed an action of raising its left hand, the player 100 performs an action of raising their left hand as well. In a case where the player 100 has succeeded in moving their body according to the action of the game character 51, for example, the message 54 that reads “GOOD” is displayed on the game screen 50.
- On the other hand, in a case where the player 100 has failed to move their body according to the action of the game character 51, the message 54 to that effect is displayed on the game screen 50.
- FIG. 9 is a diagram illustrating, as an example, the game screen 50 displayed by the game device 20 in the case where the player 100 has stepped out of the determination subject space 70 .
- As illustrated in FIG. 9, in the case where the player 100 has stepped out of the determination subject space 70, the message 54 that reads “CAUTION” is displayed on the game screen 50. In this manner, the message 54 that issues a warning is displayed.
- the display position of the spotlight area 53 may be configured to correspond to the determination subject space 70 of the position detecting device 1 .
- Hereinafter, this example is described. Specifically, in the case where the player 100 is inside the determination subject space 70, the game character 51 is located at a position with high lightness. In other words, in this case, the game character 51 is located in the spotlight area 53.
- On the other hand, in the case where the player 100 is outside the determination subject space 70, the game character 51 is located at a position with low lightness. In other words, in this case, the game character 51 is located in the area outside the spotlight area 53.
- FIG. 10 is a diagram illustrating the position detecting device 1 and the player 100 viewed from an Xw-Zw plane (that is, from the side).
- FIG. 10 illustrates a case where the player 100 has moved backward while dancing. As illustrated in FIG. 10 , the player 100 has moved backward and therefore is out of the determination subject space 70 .
- In this case, a game screen 50 that prompts the player 100 to move forward to return to the inside of the determination subject space 70 is displayed.
- FIG. 11 is an example of the game screen 50 displayed in the case where the player 100 has stepped out of the determination subject space 70 .
- the game screen 50 of FIG. 11 is displayed in the case where the player 100 is standing at the position of FIG. 10 .
- the game character 51 is displayed in the area outside the spotlight area 53 (for example, at a predetermined position in the rear). As described above, the area outside the spotlight area 53 is set low in lightness.
- the player 100 finds the game character 51 less visible. In such a case, it is conceivable that the player 100 will move forward so as to cause the game character 51 to move to the spotlight area 53 , which is bright and thus makes the game character 51 more visible. Specifically, in a case where the player 100 has stepped backward out of the determination subject space 70 when viewed from the position detecting device 1 , by causing the position of the game character 51 to move out of the spotlight area 53 , it is possible to prompt the player 100 to move toward the determination subject space 70 .
- FIG. 12 is a diagram illustrating the position detecting device 1 and the player 100 viewed from an Xw-Yw plane (that is, from above).
- FIG. 12 illustrates a case where the player 100 has moved in the horizontal direction (for example, Yw-axis direction) while dancing. As illustrated in FIG. 12 , for example, the player 100 is displaced in the horizontal direction and is therefore out of the determination subject space 70 .
- FIG. 13 is an example of the game screen 50 displayed in the case where the player 100 has stepped out of the determination subject space 70 .
- the game screen 50 of FIG. 13 is displayed in the case where the player 100 is standing at the position of FIG. 12 .
- In this case as well, the player 100 finds the game character 51 less visible. It is conceivable that the player 100 will move leftward so as to cause the game character 51 to move to the spotlight area 53, which is bright and thus makes the game character 51 more visible. Specifically, even in a case where the player 100 has stepped out of the determination subject space 70 in the horizontal direction when viewed from the position detecting device 1, it is possible to prompt the player 100 to move toward the determination subject space 70.
- the game device 20 sets the determination subject space 70 for showing the standing position of the player 100 , and hence it is possible to show the standing position to the player 100 .
- FIG. 14 is a diagram illustrating a hardware configuration of the position detecting device 1 .
- the position detecting device 1 includes a microprocessor 10 , a storage unit 11 , a photographing unit 12 , a depth measuring unit 13 , an audio processing unit 14 , and a communication interface unit 15 .
- the respective components of the position detecting device 1 are connected to one another by a bus 16 so as to be able to exchange data thereamong.
- the microprocessor 10 controls the respective units of the position detecting device 1 according to an operating system and various kinds of programs which are stored in the storage unit 11 .
- the storage unit 11 stores programs and various kinds of parameters which are used for operating the operating system, the photographing unit 12 , and the depth measuring unit 13 . Further, the storage unit 11 stores a program for generating the three-dimensional position information based on the photographed image and the depth image.
- the photographing unit 12 includes the CCD camera 2 and the like.
- the photographing unit 12 generates, for example, the photographed image of the player 100 .
- the depth measuring unit 13 includes the infrared sensor 3 and the like.
- the depth measuring unit 13 generates the depth image based, for example, on the TOF acquired using the infrared sensor 3 .
- the microprocessor 10 generates the three-dimensional position information based on the photographed image generated by the photographing unit 12 and the depth image generated by the depth measuring unit 13 .
- the microprocessor 10 identifies the positions of pixels corresponding to the respective parts (for example, head P 1 to left toe P 16 ) of the player 100 based on the photographed image.
- the microprocessor 10 executes coordinate transformation processing and calculates the three-dimensional coordinate based on the RGBD values of the identified pixels.
- the coordinate transformation processing is performed based on the matrix operation as described above.
- the three-dimensional position information ( FIG. 5 ) is generated at the predetermined time intervals (for example, every 1/60th of a second).
- the audio processing unit 14 includes the microphone 4 and the like.
- the audio processing unit 14 can identify a position at which the player 100 has made a sound based on time lags among sounds detected using a plurality of (for example, three) microphones.
- a unidirectional microphone that detects sounds originating from a sound source located along the line-of-sight of the CCD camera 2 may be applied.
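The idea of locating a sound source from time lags can be sketched with two microphones: the lag between their detections of the same sound constrains the direction of the source. The microphone spacing and the speed of sound below are illustrative values; the actual audio processing unit 14 may use a different method.

```python
import math

# Sketch of direction estimation from an inter-microphone time lag, the
# idea behind identifying where the player 100 has made a sound. The
# spacing (0.2 m) and speed of sound (343 m/s) are assumed values.

def source_angle(time_lag_s, mic_spacing_m=0.2, speed_of_sound=343.0):
    """Return the source angle in radians from the broadside direction;
    a zero lag means the source is straight ahead."""
    sin_theta = time_lag_s * speed_of_sound / mic_spacing_m
    sin_theta = max(-1.0, min(1.0, sin_theta))  # clamp rounding overshoot
    return math.asin(sin_theta)
```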
- the communication interface unit 15 is an interface for transmitting various kinds of data, such as the three-dimensional position information, to the game device 20 .
- FIG. 15 is a diagram illustrating a hardware configuration of the game device 20 .
- the game device 20 includes a home-use game machine 21 , a display unit 40 , an audio output unit 41 , an optical disk 42 , and a memory card 43 .
- the display unit 40 and the audio output unit 41 are connected to the home-use game machine 21 .
- a home-use television set is used as the display unit 40 .
- a speaker integrated into the home-use television set is used as the audio output unit 41 .
- the optical disk 42 and the memory card 43 are information storage media, and are inserted into the home-use game machine 21 .
- the home-use game machine 21 is a publicly-known computer game system, and, as illustrated in FIG. 15 , includes a bus 22 , a microprocessor 23 , a main memory 24 , an image processing unit 25 , an audio processing unit 26 , an optical disk reproducing unit 27 , a memory card slot 28 , a communication interface (I/F) 29 , a controller interface (I/F) 30 , and a controller 31 . Components other than the controller 31 are accommodated in an enclosure of the home-use game machine 21 .
- the bus 22 is used for exchanging addresses and data among the units constituting the home-use game machine 21 .
- the microprocessor 23 , the main memory 24 , the image processing unit 25 , the audio processing unit 26 , the optical disk reproducing unit 27 , the memory card slot 28 , the communication interface 29 , and the controller interface 30 are connected to one another by the bus 22 so as to be able to communicate data thereamong.
- the microprocessor 23 executes various kinds of information processing based on an operating system stored in a ROM (not shown), or programs read from the optical disk 42 or the memory card 43 .
- the main memory 24 includes, for example, a RAM.
- the program and data read from the optical disk 42 or the memory card 43 are written into the main memory 24 as necessary.
- the main memory 24 is also used as a working memory for the microprocessor 23 .
- the main memory 24 stores the three-dimensional position information received from the position detecting device 1 at the predetermined time intervals.
- the microprocessor 23 controls the game based on the three-dimensional position information stored in the main memory 24 .
- the image processing unit 25 includes a VRAM, and renders, based on image data transmitted from the microprocessor 23 , the game screen 50 in the VRAM.
- the image processing unit 25 converts the game screen 50 into video signals, and outputs the video signals to the display unit 40 at a predetermined timing.
- the audio processing unit 26 includes a sound buffer.
- the audio processing unit 26 outputs, from the audio output unit 41 , various kinds of audio data (game music, game sound effects, messages, etc.) that have been read from the optical disk 42 into the sound buffer.
- the optical disk reproducing unit 27 reads a program and data recorded on the optical disk 42 .
- description is given by taking, as an example, a case where the optical disk 42 is used for supplying the program and the data to the home-use game machine 21 .
- However, another information storage medium (for example, the memory card 43 or the like) may be used instead, or the program and the data may be supplied to the home-use game machine 21 via a data communication network such as the Internet.
- the memory card slot 28 is an interface for the memory card 43 to be inserted into.
- the memory card 43 includes a nonvolatile memory (for example, EEPROM etc.).
- the memory card 43 stores various kinds of game data, such as saved data.
- the communication interface 29 is an interface for establishing communication connection to a communication network such as the Internet.
- the controller interface 30 is an interface for establishing wireless connection or wired connection to the controller 31 .
- As the controller interface 30, an interface compliant with, for example, the Bluetooth (registered trademark) interface standard may be used. It should be noted that the controller interface 30 may alternatively be an interface for establishing wired connection to the controller 31.
- FIG. 16 is a functional block diagram illustrating a group of functions to be implemented on the game device 20 .
- As illustrated in FIG. 16, on the game device 20, there are implemented a game data storage unit 80, a position acquiring unit 82, a determination unit 84, a game processing execution unit 86, a determination subject space changing unit 88, and a display control unit 90.
- Those functions are implemented by the microprocessor 23 operating according to programs read from the optical disk 42 .
- the game data storage unit 80 is mainly implemented by the main memory 24 and the memory card 43 .
- the game data storage unit 80 stores information necessary for executing the game.
- the game data storage unit 80 stores animation information indicating how the game character 51 moves its body.
- the game data storage unit 80 stores reference action information for identifying an action to be performed by the player 100 .
- FIG. 17 is a diagram illustrating an example of the reference action information.
- In the reference action information, for example, time information indicating a timing at which an action is to be performed and information for identifying the action to be performed by the player 100 are stored in association with each other.
- the time information indicates, for example, an elapsed time after the game is started.
- For example, the record for a time t 1 indicates that the player 100 should perform an action of putting their right foot forward.
- the game character 51 plays a role of showing an action to be performed by the player 100 , and thus, when the time t 1 arrives, the game character 51 performs an action that looks like putting its right foot forward.
- the animation information is created in such a manner as to correspond to the reference action information illustrated in FIG. 17 . Specifically, every time a time indicated by the time information stored in the reference action information arrives, the game character 51 performs a predetermined animation action based on the animation information.
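The timing lookup described above can be sketched as a small table keyed by time information. The concrete times, action names, and matching tolerance below are illustrative assumptions, not values from FIG. 17.

```python
# Sketch of the reference action information of FIG. 17: each record pairs
# time information (elapsed seconds since game start) with the action the
# player 100 should perform. Times and action names are assumed values.

REFERENCE_ACTIONS = [
    (5.0, "put right foot forward"),
    (8.0, "raise left hand"),
]

def due_action(elapsed, tolerance=0.05):
    """Return the action whose time information matches the elapsed time,
    or None when no action is due at this moment."""
    for t, action in REFERENCE_ACTIONS:
        if abs(elapsed - t) <= tolerance:
            return action
    return None
```

At each matching time, the game character 51 would also start the corresponding animation based on the animation information.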
- the game data storage unit 80 stores action determination criterion information, which serves as a condition for making a determination as to the action of the player based on the three-dimensional position information.
- FIG. 18 is a diagram illustrating an example of the action determination criterion information.
- In the action determination criterion information, for example, information for identifying the movement of the body of the player 100 and a determination criterion to be satisfied by the three-dimensional position information are stored in association with each other.
- the determination criterion includes, for example, a change amount, a change direction, a change speed, and the like of the three-dimensional coordinate of each part of the player 100 .
- the determination criterion is a condition to be satisfied by the motion vector (three-dimensional vector) of each part of the player 100 .
- For example, in a case where “putting the right foot forward” is the movement of the body stored in the action determination criterion information, conditions relating to the change amounts, the change directions, and the change speeds of the sets of the three-dimensional coordinates of the right heel P 13 and the right toe P 15 are associated with this movement of the body. In a case where the change amounts, the change directions, and the change speeds of the sets of the three-dimensional coordinates of the right heel P 13 and the right toe P 15 satisfy the conditions stored in the action determination criterion information, it is determined that the player 100 has put their right foot forward.
- the action of the player 100 is determined based on whether or not the three-dimensional coordinates indicated by the three-dimensional position information satisfy the conditions stored in the action determination criterion information.
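A hedged sketch of this determination: the frame-to-frame motion vector of each relevant part is tested against a stored criterion. With frames at fixed 1/60-second intervals, the displacement magnitude also stands in for the change speed. All thresholds, and taking "forward" as decreasing depth along the Xw-axis, are assumptions for illustration.

```python
# Sketch of the action determination criterion check. `direction` is a
# unit vector; min_amount and min_cos are assumed thresholds covering the
# change amount/speed and the change direction respectively.

def matches_criterion(motion, min_amount=0.2, direction=(-1.0, 0.0, 0.0),
                      min_cos=0.7):
    """True when the motion vector is long enough (change amount/speed)
    and roughly aligned with the required direction (change direction)."""
    amount = sum(c * c for c in motion) ** 0.5
    if amount < min_amount:
        return False
    cos = sum(m * d for m, d in zip(motion, direction)) / amount
    return cos >= min_cos

def right_foot_forward(heel_motion, toe_motion):
    """'Putting the right foot forward' requires both the right heel P13
    and the right toe P15 to satisfy the stored criterion."""
    return matches_criterion(heel_motion) and matches_criterion(toe_motion)
```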
- In this manner, the action determination criterion information stores information for making a determination as to the dancing of the player 100. It should be noted that the action determination criterion information may be stored in a ROM (not shown) or the like of the game device 20.
- the game data storage unit 80 stores, for example, determination subject space information for identifying the determination subject space 70 .
- For example, in a case where the shape of the determination subject space 70 is such a truncated pyramid as illustrated in FIG. 7, the length of each side of the determination subject space 70 and information indicating the representative point 71 are stored. That is, based on those items of information, the position of the determination subject space 70 is identified. Further, the length of each side of the determination subject space 70 may be a value determined in advance.
- When the game is started, the initial position of the representative point 71 is determined. As described above, the position of the determination subject space 70 is determined so as to contain the position of the player 100 when the game is started. Specifically, the representative point 71 is determined so as to correspond to the standing position of the player 100 at the time of starting the game.
- the representative point 71 may be a point located along the line-of-sight of the CCD camera 2 .
- the determination subject space information may be any information as long as the information allows the position and the size of the determination subject space 70 to be identified.
- the determination subject space information may be information indicating the upper left vertices and the lower right vertices of the top surface and the bottom surface of the determination subject space 70 and information indicating the representative point 71 .
- the game data storage unit 80 stores information for identifying the detectable space 60 .
- this information may be any information as long as the information allows the position and the size of the detectable space 60 to be identified.
- the position acquiring unit 82 is mainly implemented by the microprocessor 23 .
- The position acquiring unit 82 acquires the three-dimensional position information ( FIG. 5 ) from position information generating means (the microprocessor 10 ), which generates the three-dimensional position information relating to the position of the player 100 in the three-dimensional space based on the photographed image acquired from photographing means (the photographing unit 12 ) for photographing the player 100 and on the depth information relating to the distance between the measurement reference position of depth measuring means (the depth measuring unit 13 ) and the player 100.
- the position acquiring unit 82 acquires the three-dimensional position information generated by the microprocessor 10 (position information generating means) of the position detecting device 1 .
- the determination unit 84 is mainly implemented by the microprocessor 23 .
- the determination unit 84 determines whether or not the position of the player 100 in the three-dimensional space is contained in the determination subject space 70 . For example, in a case where any one of the sets of the three-dimensional coordinates contained in the three-dimensional position information is outside the determination subject space 70 , it is determined that the position of the player corresponding to the three-dimensional position information is not contained in the determination subject space 70 of the position detecting device 1 .
- It should be noted that the determination method performed by the determination unit 84 may be any method as long as the determination is made based on the three-dimensional position information and the determination subject space 70, and the determination method is not limited to the above-mentioned example. For example, it may be determined that the position of the player 100 is not contained in the determination subject space 70 only in a case where the sets of the three-dimensional coordinates corresponding to a predetermined number (for example, three) of the plurality of (for example, sixteen) parts of the player 100 indicated by the three-dimensional position information are outside the determination subject space 70.
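Both determination policies can be sketched together: the strict policy treats the player as outside when any single part leaves the determination subject space 70, while the lenient variant requires a threshold number of parts to be outside. A box-shaped dict stands in for the truncated pyramid described in the text; part names and coordinates are illustrative.

```python
# Sketch of the determination unit 84's containment check. `space` is a
# box-shaped stand-in with "x_range"/"y_range"/"z_range" keys (an assumed
# simplification of the truncated pyramid of FIG. 7).

def is_inside(point, space):
    """Axis-aligned containment test against the x/y/z ranges."""
    ranges = (space["x_range"], space["y_range"], space["z_range"])
    return all(lo <= c <= hi for c, (lo, hi) in zip(point, ranges))

def player_in_space(parts, space, threshold=1):
    """`parts` maps part names (head P1 ... left toe P16) to coordinates.
    threshold=1 is the strict policy (any part outside flags the player
    as out); threshold=3 gives the lenient variant mentioned above."""
    outside = sum(1 for p in parts.values() if not is_inside(p, space))
    return outside < threshold
```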
- the game processing execution unit 86 is mainly implemented by the microprocessor 23 .
- the game processing execution unit 86 executes game processing based on a result of a determination made by the determination unit 84 . Details of operation of the game processing execution unit 86 are described later (see S 105 , S 106 , and S 107 of FIG. 19 ).
- the determination subject space changing unit 88 is mainly implemented by the microprocessor 23 . In a case where it is determined that the position of the player 100 in the three-dimensional space is not contained in the determination subject space 70 , the determination subject space changing unit 88 changes the position of the determination subject space 70 based on the position of the player 100 in the three-dimensional space. Details of operation of the determination subject space changing unit 88 are described later (see S 108 and S 109 of FIG. 19 ).
- the display control unit 90 is mainly implemented by the microprocessor 23 .
- the display control unit 90 displays the game screen 50 on the display unit 40 .
- the display control unit 90 causes display means (display unit 40 ) to display the game screen 50 containing the game character 51 and the focused area (spotlight area 53 ) having its lightness set higher than that of the other area.
- the display control unit 90 includes means for controlling the positional relation between the display position of the game character 51 and the display position of the focused area based on the positional relation between the position of the player 100 in the three-dimensional space and the determination subject space 70 . Details of operation of the display control unit 90 are described later (see S 102 of FIG. 19 ).
- FIG. 19 is a flow chart illustrating an example of processing to be executed on the game device 20 .
- the processing of FIG. 19 is executed by the microprocessor 23 operating according to programs read from the optical disk 42 .
- the processing of FIG. 19 is executed at predetermined time intervals (for example, every 1/60th of a second).
- the microprocessor 23 acquires the three-dimensional position information of the player 100 (S 101 ).
- the microprocessor 23 changes the position of the game character 51 to be displayed on the game screen 50 (S 102 ).
- the display position of the game character 51 is changed based on the positional relation between the three-dimensional position information of the player 100 and the representative point 71 .
- a determination is made as to the positional relation between the three-dimensional coordinates of the waist P 10 contained in the three-dimensional position information of the player 100 and the representative point 71 .
- a direction D from the representative point 71 of the determination subject space 70 toward the three-dimensional coordinate of the waist P 10 of the player and a distance L therebetween are acquired ( FIG. 7 ).
- the display position of the game character 51 is changed so that the positional relation between the display position of the game character 51 and a guidance position 55 of the spotlight area 53 corresponds to the positional relation between the three-dimensional coordinate of the waist P 10 of the player and the representative point 71 of the determination subject space 70 .
- the display position of the game character 51 is changed from the guidance position 55 of the spotlight area 53 to a position 57 obtained by shifting the guidance position 55 of the spotlight area 53 by a distance Ls corresponding to the above-mentioned distance L in a direction Ds corresponding to the above-mentioned direction D.
- the direction Ds and the distance Ls are calculated, for example, by applying a predetermined mathematical expression to the direction D and the distance L, respectively.
- the predetermined mathematical expression may be, for example, a predetermined matrix (for example, projection matrix) for transforming a three-dimensional vector to a two-dimensional vector.
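As an illustration of this transformation, the three-dimensional offset between the waist P 10 and the representative point 71 can be mapped to a two-dimensional screen shift (direction Ds, distance Ls) by a fixed matrix. The matrix values, the scale factor, and the function name below are assumptions for illustration only, not taken from this embodiment.

```python
def project_offset(waist, representative_point, scale=120.0):
    """Hypothetical sketch: return the 2D screen shift (dx, dy) for the
    3D displacement from the representative point 71 to the waist P10."""
    # 3D displacement vector (combines direction D and distance L).
    dx3 = [w - r for w, r in zip(waist, representative_point)]
    # A fixed 2x3 matrix dropping the depth (Z) axis; Y is inverted
    # because screen coordinates usually grow downward.
    projection = [
        [1.0, 0.0, 0.0],   # world X -> screen X
        [0.0, -1.0, 0.0],  # world Y -> screen Y (inverted)
    ]
    return tuple(
        scale * sum(m * c for m, c in zip(row, dx3))
        for row in projection
    )

# Example: player's waist 0.5 m to the right of the representative point.
print(project_offset((0.5, 1.0, 2.0), (0.0, 1.0, 2.0)))  # (60.0, 0.0)
```

In a full implementation a perspective projection matrix of the virtual camera would be used instead of this orthographic-style matrix, as the embodiment suggests.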
- the guidance position 55 , which is a position for guiding the game character 51 , corresponds to the representative point 71 .
- the guidance position 55 is a position located a predetermined distance above a center point of the spotlight area 53 .
- the display position of the game character 51 is controlled as described above. By referring to the positional relation between the game character 51 and the spotlight area 53 displayed on the game screen 50 , the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70 .
- the player can adjust their own standing position. Further, in a case where the game character 51 has moved out of the spotlight area 53 , the game character 51 becomes less visible, and hence it is conceivable that the player will unconsciously adjust their own standing position so that the game character 51 is positioned within the spotlight area 53 . By controlling the display position of the game character 51 as described above, it also becomes possible to make the player unconsciously adjust their own standing position.
- the microprocessor 23 (display control unit 90 ) updates the posture of the game character 51 displayed on the game screen 50 based on animation data (S 103 ).
- the microprocessor 23 determines whether or not at least one position of the body of the player 100 indicated by the three-dimensional position information is outside the determination subject space 70 (S 104 ).
- the determination of S 104 is performed by, for example, comparing the three-dimensional position information and the determination subject space information. Specifically, for example, a determination is made based on whether or not the three-dimensional coordinates ( FIG. 5 ) contained in the three-dimensional position information are inside the determination subject space 70 ( FIG. 7 ).
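The determination of S 104 may be sketched as follows, assuming the determination subject space 70 is represented as an axis-aligned box given by minimum and maximum corners; the body-part names and all coordinate values are illustrative, not taken from FIG. 5.

```python
def any_part_outside(position_info, box_min, box_max):
    """S104 sketch: True if at least one body-part coordinate in the
    three-dimensional position information lies outside the box."""
    for coord in position_info.values():
        if not all(lo <= c <= hi
                   for c, lo, hi in zip(coord, box_min, box_max)):
            return True
    return False

# Waist and head inside, but the right hand sticks out beyond x = 1.0.
player = {"head": (0.0, 1.6, 2.0), "waist": (0.0, 1.0, 2.0),
          "right_hand": (1.3, 1.2, 2.0)}
print(any_part_outside(player, (-1.0, 0.0, 1.0), (1.0, 2.0, 3.0)))  # True
```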
- the microprocessor 23 determines whether or not the player 100 has moved their body according to the movement of the body of the game character 51 (S 105 ).
- S 105 it is determined whether or not the player 100 has performed an action similar to the action (movement of the body) performed by the game character 51 . This determination is executed based, for example, on the three-dimensional position information, the reference action information ( FIG. 17 ), and the determination criterion information ( FIG. 18 ).
- in the data storage example of the reference action information illustrated in FIG. 17 , for example, it is indicated that the game character 51 puts its right foot forward at the time t 1 .
- the reference time is, for example, a time within a predetermined period including times (for example, time t 1 ) stored in the reference action information.
- the “predetermined period” is, for example, a period from a start time that is a predetermined time before the reference time until an end time that is a predetermined time after the reference time.
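The timing check of S 105 against the "predetermined period" can be sketched as follows; the margin value and the function name are assumptions for illustration.

```python
def within_reference_period(action_time, reference_time, margin=0.5):
    """Sketch of the S105 timing check: True if the player's action time
    falls in [reference_time - margin, reference_time + margin]."""
    return (reference_time - margin) <= action_time <= (reference_time + margin)

print(within_reference_period(10.3, 10.0))  # 0.3 s after t1: within the period
print(within_reference_period(11.0, 10.0))  # 1.0 s after t1: outside the period
```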
- the microprocessor 23 displays the message 54 such as “GOOD” on the game screen 50 (S 106 ).
- the microprocessor 23 does not display such a message as in S 106 and ends the processing.
- the microprocessor 23 displays the message 54 such as “CAUTION” on the game screen (S 107 ).
- the microprocessor 23 determines whether or not a state in which at least one position of the body of the player is outside the determination subject space 70 has continued for a reference period (for example, three seconds) (S 108 ).
- in a case where it is determined that the state in which at least one position of the body of the player is outside the determination subject space 70 has continued for the reference period (S 108 ; Y), the microprocessor 23 changes the position of the determination subject space 70 (S 109 ).
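Because the processing of FIG. 19 runs every 1/60th of a second, the "continued for a reference period" check of S 108 can be tracked as a count of consecutive frames (for example, 180 frames for three seconds). The class name and counter design below are assumptions for illustration.

```python
class OutOfSpaceTimer:
    """Counts consecutive frames in which the player is outside the
    determination subject space (a sketch of the S108 check)."""

    def __init__(self, reference_frames=180):  # 3 s at 60 frames per second
        self.reference_frames = reference_frames
        self.count = 0

    def update(self, outside):
        """Call once per frame; returns True once the out-of-space state
        has persisted for the reference period."""
        self.count = self.count + 1 if outside else 0
        return self.count >= self.reference_frames
```

Returning to the determination subject space at any frame resets the counter, so only an uninterrupted stay outside the space triggers the change of S 109.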
- the three-dimensional coordinate of the waist P 10 in the three-dimensional position information is referred to. Subsequently, the position of the determination subject space 70 is changed so that the representative point 71 of the determination subject space 70 coincides with the three-dimensional coordinate corresponding to the waist P 10 of the player.
- FIG. 20 is a diagram illustrating the determination subject space 70 after the change. As illustrated in FIG. 20 , the position of the determination subject space 70 is changed so that the position of the waist P 10 of the player 100 coincides with the representative point 71 .
- the change method for the position of the determination subject space 70 performed in S 109 may be any method as long as the determination subject space 70 is changed, based on the position of the player 100 indicated by the three-dimensional position information, so as to contain the position of the player 100 ; the change method is not limited to the above-mentioned example.
- the position of the determination subject space 70 may be changed so that an average value of the sets of three-dimensional coordinates contained in the three-dimensional position information coincides with the representative point 71 .
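The change of S 109 can be sketched as translating a box-shaped determination subject space 70 so that its representative point 71 coincides with a target point, where the target is either the waist P 10 or, as the alternative above, the average of all body-part coordinates. The box representation and all names are assumptions for illustration.

```python
def recenter_space(target, half_extents):
    """S109 sketch: return new (min, max) corners of a box-shaped
    determination subject space centered on the target point."""
    box_min = tuple(t - h for t, h in zip(target, half_extents))
    box_max = tuple(t + h for t, h in zip(target, half_extents))
    return box_min, box_max

def average_position(position_info):
    """Alternative target: the average of all body-part coordinates
    contained in the three-dimensional position information."""
    coords = list(position_info.values())
    return tuple(sum(c[i] for c in coords) / len(coords) for i in range(3))
```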
- in a case where the player 100 has moved closer to an obstacle, it is conceivable that the player 100 will notice that fact within a predetermined time period and return to the original position.
- the position of the determination subject space 70 is changed so that the position of the body of the player is contained in the determination subject space 70 .
- the microprocessor 23 ends the processing. Specifically, in this case, the microprocessor 23 does not perform the processing of displaying the message 54 such as “GOOD” based on the movement of the body of the player 100 . In other words, by making no response to the dance action performed by the player 100 , it is also possible to make the player 100 understand that the player 100 is not in the determination subject space 70 .
- according to the game device 20 , when the game processing is executed, whether or not the message 54 such as "CAUTION" is to be displayed, or whether or not detection of the action of the player 100 is to be avoided, is determined based on whether or not the player 100 is in the determination subject space 70 . Further, the position of the determination subject space 70 is changed based on the position of the player 100 in the three-dimensional space, and hence it is possible to change the determination subject space 70 so that the player 100 can continue the gameplay while being prompted to stay in the determination subject space 70 .
- the positional relation between the display position of the game character 51 and the display position of the spotlight area 53 is controlled. For example, in the case where the position of the body of the player 100 is out of the determination subject space 70 , the game character 51 is located outside the spotlight area 53 (see FIG. 11 and FIG. 13 ).
- the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70 .
- the player 100 can know that their standing position has changed. Therefore, the player 100 can adjust their own standing position.
- in a case where the game character 51 has moved out of the spotlight area 53 , the game character 51 becomes less visible. Accordingly, it is conceivable that the player 100 will try to unconsciously adjust their own standing position so that the game character 51 is located within the spotlight area 53 . As described above, according to the game device 20 , it is also possible to make the player 100 unconsciously adjust their own standing position.
- the position of the determination subject space 70 is changed so that the position of the body of the player 100 is contained in the determination subject space 70 (see FIG. 20 ).
- the game device 20 is capable of preventing the player from feeling such confusion.
- the display position of the spotlight area 53 (and the spotlight 52 ) may be changed so that the positional relation between the display position of the game character 51 and the guidance position 55 of the spotlight area 53 corresponds to the positional relation between the position of the player 100 (for example, the three-dimensional coordinate of the waist P 10 ) and the representative point 71 of the determination subject space 70 .
- the spotlight area 53 may be moved instead of moving the game character 51 .
- the display positions of both the game character 51 and the spotlight area 53 may be changed so that the positional relation between the display position of the game character 51 and the guidance position 55 of the spotlight area 53 corresponds to the positional relation between the position of the player 100 (for example, the three-dimensional coordinate of the waist P 10 ) and the representative point 71 of the determination subject space 70 .
- the display position of the game character 51 may be moved to the right-hand side, the left-hand side, or the like of the game screen 50 .
- in this case, by changing the position of the virtual camera, for example, the display position of the game character 51 is changed as described above.
- FIG. 21 is a diagram illustrating a case where the display position of an image contained in the game screen 50 has been changed.
- the game screen 50 illustrated in FIG. 21 is displayed, for example, in a case where the player 100 has moved to the right-hand side of the determination subject space 70 with respect to the position detecting device 1 .
- the position of the virtual camera is changed to the left, and thus, for example, the game character 51 located in the vicinity of the center of the game screen 50 is moved to the right in relation to a center point of the game screen 50 .
- an area 50 a in the vicinity of a left end portion of the game screen 50 is displayed in black.
- the width and the position of the area 50 a are determined based, for example, on the distance L and the direction D between the representative point 71 and the waist P 10 of the player 100 . It seems to the player 100 that nothing is displayed in the area 50 a located in the vicinity of the left end portion of the game screen 50 . In this case, it is conceivable that the player 100 will move to their left, trying to move the display position of the game character 51 back to the original position. Thus, according to the game screen 50 , it is possible to guide the player 100 to the inside of the determination subject space 70 .
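The determination of the width and position of the area 50 a from the distance L and the direction D can be sketched as follows; the scale factor, the pixel values, and all names are assumptions for illustration.

```python
def black_area(screen_width, direction_x, distance, scale=200.0):
    """Sketch: return (x_start, width_px) of the black area 50a.

    direction_x > 0 means the player is displaced to the right of the
    determination subject space, so the virtual camera pans left and
    the black band appears at the left edge of the screen."""
    width = int(min(screen_width, abs(distance) * scale))
    if direction_x > 0:
        return (0, width)                 # band at the left edge
    return (screen_width - width, width)  # band at the right edge

print(black_area(1280, 1.0, 0.4))  # player 0.4 m to the right -> (0, 80)
```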
- by performing the display control of the game screen 50 as described above, it is possible to notify the player 100 that their standing position is displaced, without changing the relative positions of respective images (game character 51 and the like) contained in the game screen 50 .
- the description above is directed to the case where the position of the virtual camera is changed.
- the notification of the standing position of the player 100 may be performed by changing the angle of view or the line-of-sight of the virtual camera.
- the game screen 50 only needs to show the positional relation between the standing position of the player 100 and the representative point 71 of the determination subject space 70 , and thus the example of the game screen 50 is not limited to the example of this embodiment.
- FIG. 22 is a diagram illustrating another example of the game screen 50 .
- the game screen 50 illustrated in FIG. 22 contains a player character 51 a (first game character) and an instructor character 51 b (second game character).
- the player 100 moves their body according to the movement of the instructor character 51 b . Then, based on the movement of the player 100 , the player character 51 a performs an action. Similarly to the embodiment, in a case where the player 100 has succeeded in performing the action, the message 54 such as “GOOD” is displayed.
- the player character 51 a and the instructor character 51 b may move in the same manner, and the player 100 may move their body according to the movement of the player character 51 a and the instructor character 51 b.
- the positional relation between the player character 51 a and the instructor character 51 b is changed based on the positional relation between the position of the player 100 and the determination subject space 70 .
- the player character 51 a is displayed substantially in front of the instructor character 51 b as illustrated in FIG. 22 .
- the player character 51 a is displayed at a position displaced far from the instructor character 51 b as illustrated in FIG. 23 , for example. Further, the message 54 such as “CAUTION” is displayed.
- the player character 51 a is displayed at a position significantly displaced sideways from the instructor character 51 b as illustrated in FIG. 24 , for example.
- processing similar to the processing of S 102 of FIG. 19 is executed. Specifically, at least one of the display positions of the player character 51 a and the instructor character 51 b is changed so that the positional relation between the display position of the player character 51 a and the display position of the instructor character 51 b corresponds to the positional relation between the position of the player 100 and the representative point 71 of the determination subject space 70 .
- the three-dimensional coordinate of the waist P 10 of the player 100 is referred to.
- a determination is made as to the positional relation between the three-dimensional coordinate of the waist P 10 of the player 100 and the representative point 71 of the determination subject space 70 .
- a difference between the three-dimensional coordinate of the waist P 10 of the player 100 and the representative point 71 of the determination subject space 70 is acquired.
- the direction D from the representative point 71 of the determination subject space 70 toward the three-dimensional coordinate of the waist P 10 of the player 100 and the distance L therebetween are acquired.
- the display position of the player character 51 a is changed so that the positional relation between the display position of the player character 51 a and the display position of the instructor character 51 b corresponds to the positional relation between the three-dimensional coordinate of the waist P 10 of the player 100 and the representative point 71 of the determination subject space 70 .
- the display position of the player character 51 a is changed from a basic position 56 set in front of the instructor character 51 b to a position 57 obtained by shifting the basic position 56 by the distance Ls corresponding to the above-mentioned distance L in the direction Ds corresponding to the above-mentioned direction D.
- the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70 .
- the player 100 can know that their standing position has changed, and accordingly can adjust their own standing position. Therefore, the player 100 can play the game in a relatively safe place within the determination subject space 70 .
- in the case where the position of the player character 51 a is displaced from the front of the instructor character 51 b or displaced far from the instructor character 51 b , the player 100 conceivably feels difficulty in imitating the movement of the instructor character 51 b . Therefore, it is conceivable that the player 100 will unconsciously adjust their own standing position so that the position of the player character 51 a is set to the front of the instructor character 51 b.
- the display control as illustrated in FIG. 21 may be performed. Specifically, without changing the relative positions of the player character 51 a and the instructor character 51 b , the display positions of the player character 51 a and the instructor character 51 b may be moved to the right-hand side, the left-hand side, or the like of the game screen 50 . In this case, too, similarly to the case illustrated in FIG. 21 , by changing the position of the virtual camera, for example, the display positions of the player character 51 a and the instructor character 51 b are changed as described above.
- FIG. 25 is a diagram illustrating a case where the display position of an image contained in the game screen 50 has been changed.
- the game screen 50 illustrated in FIG. 25 is displayed, for example, in the case where the player 100 has moved to the right-hand side of the determination subject space 70 with respect to the position detecting device 1 .
- the position of the virtual camera is changed to the left, and thus, for example, the player character 51 a and the instructor character 51 b located in the vicinity of the center of the game screen 50 are moved to the right in relation to the center point of the game screen 50 .
- the area 50 a is displayed.
- the player 100 will move to their left so as to move the display positions of the player character 51 a and the instructor character 51 b back to the original positions.
- thus, according to the game screen 50 , it is possible to guide the player 100 to the inside of the determination subject space 70 .
- the three-dimensional position information indicating the position of the player 100 has been described by taking, as an example, the data storage example illustrated in FIG. 5 .
- the three-dimensional position information transmitted from the position detecting device 1 may be any information as long as the information allows the position (for example, standing position) of the player 100 to be identified, and thus the data storage example is not limited to the example of FIG. 5 .
- the three-dimensional position information may be such information that indicates a distance and a direction from a reference point of the player 100 (for example, a point corresponding to the head) to each part of the body.
- the position information generating means for generating the three-dimensional position information based on the photographed image and the depth information (depth image) is included in the position detecting device 1 .
- the position information generating means may be included in the game device 20 .
- the game device 20 may receive the photographed image and the depth image from the position detecting device 1 , to thereby generate the three-dimensional position information based on those images.
- the description above has been given by taking, as a method of analyzing the movement of the player 100 based on the three-dimensional position information, the example in which a comparison is made between the action determination criterion information illustrated in FIG. 18 and the change amount, the change direction, the change speed, etc. of the three-dimensional coordinate of each part of the player 100 .
- the analysis method for the movement of the player 100 may be any method as long as the method is performed based on the three-dimensional position information, and thus the analysis method is not limited to the above-mentioned example.
- the movement of the player 100 may be analyzed based on values acquired by substituting the three-dimensional coordinates contained in the three-dimensional position information into a predetermined mathematical expression.
- control may be performed that prevents the determination subject spaces 70 corresponding to the respective players 100 from overlapping each other.
- the three-dimensional position information contains sets of the three-dimensional coordinates for the two players.
- control may be performed so as to prevent such change.
- the determination subject space changing unit 88 may include means for inhibiting change in the case where changing the position of the determination subject space 70 corresponding to one player 100 causes the changed determination subject space 70 to overlap the determination subject space 70 corresponding to another player 100 .
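The inhibiting means above can be sketched as an overlap test performed before applying a proposed change: if moving one player's box-shaped determination subject space 70 would overlap the other player's, the current box is kept. The (min, max) corner representation and all names are assumptions for illustration.

```python
def boxes_overlap(a_min, a_max, b_min, b_max):
    """Axis-aligned overlap test for two determination subject spaces."""
    return all(al < bh and bl < ah
               for al, ah, bl, bh in zip(a_min, a_max, b_min, b_max))

def try_change(current, proposed, other):
    """Apply the proposed box only if it would not overlap the other
    player's box; otherwise inhibit the change and keep the current box."""
    if boxes_overlap(proposed[0], proposed[1], other[0], other[1]):
        return current  # inhibit the change
    return proposed
```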
- the reference points, which are referred to when the display control unit 90 controls the display position and which represent the positional relation between the position of the player 100 and the determination subject space 70 , are set to the three-dimensional coordinate of the waist P 10 and the representative point 71 , respectively.
- the display control unit 90 only needs to determine whether or not to control the display position based on the three-dimensional position information corresponding to the player 100 and information identifying the position of the determination subject space 70 , and thus information items to be compared are not limited to the above-mentioned example. For example, a comparison may be made between an average value of sets of the three-dimensional coordinates contained in the three-dimensional position information and one arbitrary point within the determination subject space 70 .
- the method of measuring the depth of the player 100 has been described by taking, as an example, the case of performing calculation based on the TOF of the infrared light.
- the measuring method is not limited to the example of this embodiment.
- a method of performing triangulation, a method of performing three-dimensional laser scanning, or the like may be applied.
- the description has been given by taking the example in which the depth information is acquired as the depth image, but the depth information is not limited thereto.
- the depth information may be any information as long as the information allows the depth of the player 100 to be identified, and hence the depth information may be a value indicating the TOF, for example.
- the determination subject space 70 has been described by taking, as an example, the shape illustrated in FIG. 7 , but the shape of the determination subject space 70 is not limited thereto.
- the determination subject space 70 may have any shape as long as the shape allows the position at which the player 100 should be standing to be identified, and hence the determination subject space 70 may have a spherical shape, for example.
- the game data storage unit 80 stores the representative point 71 of the determination subject space 70 (for example, center point of the sphere) and information for identifying the radius of the sphere.
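For a spherical determination subject space 70, the containment determination reduces to a distance comparison against the stored center point and radius. The function and variable names below are illustrative only.

```python
import math

def inside_sphere(point, center, radius):
    """Sketch: True if a body-part coordinate is within the spherical
    determination subject space (distance from center <= radius)."""
    return math.dist(point, center) <= radius

print(inside_sphere((0.0, 1.0, 2.4), (0.0, 1.0, 2.0), 1.0))  # True
```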
- the position of the determination subject space 70 is changed with the size thereof kept as it is.
- alternatively, the size of the determination subject space 70 may be changed. Specifically, in the case where the state in which the player 100 is out of the determination subject space 70 has continued for the predetermined time period, the determination subject space 70 may be enlarged in a direction in which the player 100 is out of the determination subject space 70 .
- the game device 20 makes a determination as to the movement of the player 100 based on the three-dimensional position information.
- the position detecting device 1 may make a determination as to the movement of the player 100 .
- the determination criterion information ( FIG. 18 ) is stored in the position detecting device 1 . Specifically, a determination is made as to the movement of the player 100 by the position detecting device 1 , and only information indicating a result of the determination is transmitted to the game device 20 .
- the display control processing performed by the display control unit 90 is not limited to this embodiment and the modified examples ( FIG. 11 , FIG. 13 , FIG. 21 , FIG. 23 , FIG. 24 , and FIG. 25 ). It is only necessary to perform such display control that notifies the player 100 that the player 100 is out of the determination subject space 70 .
- the entire determination subject space 70 may correspond to the entire display area of the game screen 50 .
- the game character 51 may be made invisible on the game screen 50 .
- the display control unit 90 may perform predetermined image processing on an image contained in the game screen 50 .
- noise processing may be performed on the game character 51 , which is a focus target of the player 100 , to thereby make the game character 51 less visible.
- the dance game has been described as an example of the game to be executed on the game device 20 .
- the game to be executed on the game device 20 may be any game as long as the movement of the player 100 is detected to execute the game processing, and thus the kind of the game to be executed is not limited thereto.
- the game may be a sport game such as a soccer game, a fighting game, or the like.
Abstract
A position acquiring unit acquires, from a position information generating unit, three-dimensional position information relating to a position of a player, the position information generating unit generating the three-dimensional position information based on a photographed image acquired from a photographing unit for photographing the player and depth information relating to a distance between a measurement reference position of a depth measuring unit and the player. A determination unit determines whether or not the position of the player is contained in a determination subject space. A game processing execution unit executes game processing based on a result of the determination made by the determination unit. A determination subject space changing unit changes, in a case where it is determined that the position of the player is not contained in the determination subject space, a position of the determination subject space based on the position of the player.
Description
- The present application claims priority from Japanese application JP2010-059465 filed on Mar. 16, 2010, the content of which is hereby incorporated by reference into this application.
- 1. Field of the Invention
- The present invention relates to a game device, a control method for a game device, and a non-transitory information storage medium.
- 2. Description of the Related Art
- There is known a game in which an image obtained by photographing a player with a camera is used. For example, JP 2005-287830 A describes the following technology. That is, an image obtained by photographing the player and a reference game image stored in advance are synthesized, and the synthesized image is displayed on a monitor, to thereby enable the player to understand a movement that the player should make in the game.
- In recent years, studies have been made on a game in which, in addition to the image obtained by photographing the player, distance information acquired by using an infrared sensor (for example, distance between the player and the infrared sensor) is used. For example, based on the image obtained by photographing the player and the distance information, a determination can be made as to a position and a movement of the player.
- In such a game, the player moves their (his/her) body to play the game, and therefore the standing position of the player is sometimes displaced. As a result, there is a risk of the player hitting an obstacle in their surroundings. To address this, it is conceivable to narrow the photographing range of a camera so that the player does not go out of a predetermined range. However, in this case, the player is more liable to go out of the photographing range, and hence there is a risk that some problem will occur to the gameplay of the player.
- The present invention has been made in view of the above-mentioned problems, and therefore has an object to provide a game device, a control method for a game device, and a non-transitory information storage medium, which are capable of dealing with displacement in position of a player during gameplay.
- In order to solve the above-mentioned problems, a game device according to the present invention includes: position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; game processing execution means for executing game processing based on a result of the determination made by the determination means; and determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
- Further, a control method for a game device according to the present invention includes: a position acquiring step of acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; a determination step of determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; a game processing execution step of executing game processing based on a result of the determination made in the determination step; and a determination subject space changing step of changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
- Further, a program according to the present invention causes a computer to function as a game device including: position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; game processing execution means for executing game processing based on a result of the determination made by the determination means; and determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
- Further, a non-transitory computer-readable information storage medium according to the present invention is a non-transitory computer-readable information storage medium having the above-mentioned program recorded thereon.
- According to the present invention, it is possible to deal with the displacement in position of the player during gameplay.
- Further, according to one aspect of the present invention, the determination subject space changing means includes: means for determining whether or not a state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for a reference period; and means for changing the position of the determination subject space in a case where the state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for the reference period.
- Further, according to one aspect of the present invention, the game device further includes display control means for causing display means to display a game screen containing a game character and a focused area having lightness thereof set higher than lightness of another area, in which the display control means includes means for controlling a positional relation between a display position of the game character and a display position of the focused area based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.
- Further, according to one aspect of the present invention, the game device further includes display control means for causing display means to display a game screen containing a first game character and a second game character, in which the display control means includes means for controlling a positional relation between a display position of the first game character and a display position of the second game character based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.
- In the accompanying drawings:
-
FIG. 1 is a diagram illustrating a positional relation among a position detecting device, a game device, and a player; -
FIG. 2 is a diagram illustrating an example of a photographed image generated by a CCD camera; -
FIG. 3 is a diagram for describing a method of measuring a depth of the player, which is performed by an infrared sensor; -
FIG. 4 is a diagram illustrating an example of a depth image acquired by the infrared sensor; -
FIG. 5 is a diagram illustrating an example of three-dimensional position information generated by the position detecting device; -
FIG. 6 is a diagram illustrating a position of the player, which is identified by the three-dimensional position information; -
FIG. 7 is a diagram illustrating a space to be photographed by the position detecting device; -
FIG. 8 is a diagram illustrating an example of a game screen displayed by the game device; -
FIG. 9 is a diagram illustrating, as an example, the game screen displayed by the game device in a case where the player has stepped out of a determination subject space; -
FIG. 10 is a diagram illustrating the position detecting device and the player viewed from an Xw-Zw plane; -
FIG. 11 is an example of the game screen displayed in the case where the player has stepped out of the determination subject space; -
FIG. 12 is a diagram illustrating the position detecting device and the player viewed from an Xw-Yw plane; -
FIG. 13 is an example of the game screen displayed in the case where the player has stepped out of the determination subject space; -
FIG. 14 is a diagram illustrating a hardware configuration of the position detecting device; -
FIG. 15 is a diagram illustrating a hardware configuration of the game device; -
FIG. 16 is a functional block diagram illustrating a group of functions to be implemented on the game device; -
FIG. 17 is a diagram illustrating an example of reference action information; -
FIG. 18 is a diagram illustrating an example of action determination criterion information; -
FIG. 19 is a flow chart illustrating an example of processing to be executed on the game device; -
FIG. 20 is a diagram illustrating the determination subject space after change; -
FIG. 21 is a diagram illustrating a case where a display position of an image contained in the game screen has been changed; -
FIG. 22 is a diagram illustrating another example of the game screen; -
FIG. 23 is a diagram illustrating an example of the game screen; -
FIG. 24 is a diagram illustrating an example of the game screen; and -
FIG. 25 is a diagram illustrating a case where the display position of an image contained in the game screen has been changed.
- Hereinafter, detailed description is given of an example of an embodiment of the present invention with reference to the drawings.
- A game device according to the embodiment of the present invention is implemented by, for example, a home-use game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer. In this specification, description is given of a case where the game device according to the embodiment of the present invention is implemented by a home-use game machine.
-
FIG. 1 is a diagram illustrating a positional relation among a position detecting device 1, a game device 20, and a player 100. As illustrated in FIG. 1, the player 100 is positioned, for example, in front of the position detecting device 1. The position detecting device 1 and the game device 20 are connected to each other so as to be able to communicate data therebetween. Further, the player 100 plays a game in, for example, a living room where items of furniture F are placed. - The
position detecting device 1 generates information relating to a position of the player 100 based on an image acquired by photographing the player 100 and information relating to a distance between the position detecting device 1 and the player 100. For example, the position detecting device 1 detects sets of three-dimensional coordinates corresponding to a plurality of parts (for example, head, shoulder, etc.) constituting the body of the player 100. - The
game device 20 acquires the information relating to the position of the player 100 from the position detecting device 1. For example, the game device 20 acquires three-dimensional coordinates that indicate a standing position of the player 100 in a three-dimensional space from the position detecting device 1. The game device 20 controls the game based on changes in the three-dimensional coordinates. - A change in the three-dimensional coordinate associated with the
player 100 corresponds to an action of the player 100. For example, in a case where the player 100 has performed an action of raising their right hand, sets of the three-dimensional coordinates corresponding to the right elbow and the right hand of the player 100 mainly change. - Next, description is given of processing in which the
position detecting device 1 generates the information relating to the position of the player 100 (three-dimensional position information). As illustrated in FIG. 1, the position detecting device 1 includes, for example, a CCD camera 2, an infrared sensor 3, and a microphone 4 including a plurality of microphones. In this embodiment, the three-dimensional position information of the player 100 is generated based on information acquired from the CCD camera 2 and the infrared sensor 3. - The
CCD camera 2 is a publicly-known camera comprising a CCD image sensor. The CCD camera 2 photographs the player 100. For example, the CCD camera 2 generates a still image (for example, RGB digital image) by photographing the player 100 at predetermined time intervals (for example, every 1/60th of a second). Hereinafter, the still image generated by the CCD camera 2 is referred to as a photographed image. The photographed image contains an object located within a field of view of the CCD camera 2. -
FIG. 2 is a diagram illustrating an example of the photographed image generated by theCCD camera 2. As illustrated inFIG. 2 , the photographed image contains, for example, theplayer 100. It should be noted that in a case where the items of furniture F, the floor and the wall of the living room, and the like are contained within the field of view of theCCD camera 2, the photographed image contains those objects, which are omitted inFIG. 2 for simplicity of description. - In the photographed image, there are set an Xs-axis and a Ys-axis, which are orthogonal to each other. For example, the upper left corner of the photographed image is set as an origin point Os (0,0). Further, for example, the lower right corner of the photographed image is set as a coordinate Pmax (Xmax,Ymax). The position of each pixel in the photographed image is identified by a two-dimensional coordinate (Xs-Ys coordinate) that is assigned to each pixel.
- The
infrared sensor 3 is formed of, for example, an infrared emitting device and an infrared receiving device (for example, infrared diodes). The infrared sensor 3 detects reflected light obtained by emitting infrared light. The infrared sensor 3 measures the depth of a subject (for example, player 100) based on a detection result of the reflected light.
player 100. The measurement reference position may be a predetermined position associated with the position of theposition detecting device 1. Theinfrared sensor 3 measures the depth of theplayer 100 based, for example, on a time of flight (TOF), which is a time required for theinfrared sensor 3 to receive reflected light after emitting infrared light. -
FIG. 3 is a diagram for describing a method of measuring the depth of the player 100, which is performed by the infrared sensor 3. As illustrated in FIG. 3, the infrared sensor 3 emits pulsed infrared light at predetermined intervals. The infrared light emitted from the infrared sensor 3 spreads spherically with an emission position of the infrared sensor 3 at the center. - The infrared light emitted from the
infrared sensor 3 strikes surfaces of, for example, the body of theplayer 100 and other objects (for example, furniture F, walls, etc.) located in the living room. The infrared light that has struck those surfaces is reflected. The reflected infrared light is detected by the infrared receiving device of theinfrared sensor 3. Specifically, theinfrared sensor 3 detects reflected light having a phase shifted by 180° from that of the emitted infrared light. - For example, as illustrated in
FIG. 3 , in a case where theplayer 100 is holding out both hands, those held-out hands are closer to theinfrared sensor 3 than the torso of theplayer 100. Specifically, the TOF of the infrared light reflected by both hands of theplayer 100 is shorter than the TOF of the infrared light reflected by the torso of theplayer 100. - The value determined as follows corresponds to the distance between the measurement reference position and the player 100 (that is, depth). Specifically, the value is determined by multiplying a time required for the
infrared sensor 3 to detect the reflected light after emitting the infrared light (that is, TOF) by the speed of the infrared light and then dividing the resultant value by two. In this manner, theinfrared sensor 3 can measure the depth of theplayer 100. - Further, the
infrared sensor 3 can also detect an outline of a subject (player 100) by detecting depth differences acquired from the reflected infrared light. - Specifically, the fact that the
infrared sensor 3 receives the reflected infrared light as described above means that an object is located at that place. If there is no other object located behind the object, the depth difference between the object and the surroundings of the object is large. Specifically, for example, the depth difference is large between a depth acquired by the infrared light reflected from the player 100 and a depth acquired by the infrared light reflected from the wall behind the player 100, and hence it is possible to detect the outline of the object by joining portions having the depth differences larger than a predetermined value. - It should be noted that the method of detecting the outline of an object is not limited to the above-mentioned example. Alternatively, for example, the outline may be detected based on the brightness of each pixel of the photographed image acquired by the
CCD camera 2. In this case, it is equally possible to detect the outline of the object by, for example, joining portions having large brightness differences among the pixels. - It should be noted that the light that has returned to the
infrared sensor 3 may be subjected to predetermined filtering processing. Specifically, noise may be reduced by employing such a configuration that only reflected light corresponding to the infrared light emitted by the infrared sensor 3 is detected by a light detection sensor.
-
FIG. 4 is a diagram illustrating an example of the depth image acquired by the infrared sensor 3. As illustrated in FIG. 4, for example, an object located close to the infrared sensor 3 is expressed as bright (brightness is high), and an object located far from the infrared sensor 3 is expressed as dark (brightness is low). For example, in a case where the depth image is expressed as the 256-level gray-scale image data, the depth of the player 100 corresponds to the brightness (pixel value) of the depth image. Specifically, for example, for every 2-cm change in depth of the player 100, the depth image is changed by one gradation level. This means that the infrared sensor 3 is capable of detecting the depth of the subject in units of 2 cm. - As illustrated in
FIG. 3, in the case where the player 100 is holding out both hands, those held-out hands are closer to the infrared sensor 3 than the torso of the player 100. In other words, the depth of both hands of the player 100 is smaller than that of the torso. Accordingly, as illustrated in FIG. 4, pixels corresponding to both hands of the player 100 are expressed as brighter (brightness is higher) than pixels corresponding to the torso. - In this embodiment, similarly to the
CCD camera 2, theinfrared sensor 3 generates the depth image at predetermined time intervals (for example, every 1/60th of a second). Based on the photographed image acquired by theCCD camera 2 and the depth image acquired by theinfrared sensor 3, the three-dimensional position information is generated relating to the position of theplayer 100. - For example, there is generated such a composite image (RGBD data) that is obtained by adding the depth information (D: depth) indicated by the depth image to the photographed image (RGB data) acquired by the
CCD camera 2. In other words, the composite image contains, for each pixel, color information (lightness of each of R, G, and B) and the depth information. - It should be noted that in generating the composite image, the position of at least one of the photographed image and the depth image is corrected based on a positional distance between the
CCD camera 2 and theinfrared sensor 3. For example, in a case where theCCD camera 2 and theinfrared sensor 3 are spaced apart from each other by 2 cm in the horizontal direction, the coordinates of each pixel of the depth image are shifted by the number of pixels that corresponds to 2 cm, to thereby correct the position. - The three-dimensional position information is generated based on the composite image. In this embodiment, description is given by taking, as an example, a case where the three-dimensional position information represents the three-dimensional coordinate corresponding to each of the parts (for example, head, shoulder, etc.) of the body of the
player 100. - Specifically, for example, the three-dimensional position information is generated in the following manner.
- First, as described above, based on the depth image, pixels corresponding to the outline of the
player 100 are identified. Pixels enclosed within the outline of theplayer 100 are the pixels corresponding to the body of theplayer 100. - Next, in the photographed image, the color information (lightnesses of R, G, and B) of the above-mentioned pixels enclosed within the outline is referred to. Based on the color information of the photographed image, pixels corresponding to each part of the body of the
player 100 are identified. For this identification method, for example, a publicly-known method is applicable, such as a pattern matching method in which the object (that is, each part of the body of the player 100) is extracted from the image through a comparison with a comparison image (training image). - Alternatively, for example, pixels corresponding to the positions of the head, both elbows, etc. of the
player 100 may be identified by calculating a velocity vector of each part of the body based on a change in color information of each pixel of the photographed image and then detecting a motion vector of each pixel based on an optical flow representing the movement of the object (for example, gradient method or filtering method). - Based on the pixel values (RGBD values) of the pixels identified as described above, the three-dimensional coordinates of the head, both elbows, etc. of the
player 100 are calculated. For example, the three-dimensional coordinates are generated by carrying out predetermined matrix transformation processing on those pixel values. The matrix transformation processing is executed through, for example, a matrix operation similar to transformation processing performed in 3D graphics between two coordinate systems of a world coordinate system and a screen coordinate system. Specifically, the RGB value indicating the color information of the pixel and the D value indicating the depth are substituted into a predetermined determinant, to thereby calculate the three-dimensional coordinate of the pixel. That is, the three-dimensional coordinates of each part of theplayer 100 are calculated. - It should be noted that for the method of calculating the three-dimensional coordinate that correspond to a pixel based on the pixel value (RGBD value), a publicly-known method may be applied, and the calculation method is not limited to the above-mentioned example. Alternatively, for example, the coordinate transformation may be performed using a lookup table.
-
FIG. 5 is a diagram illustrating an example of the three-dimensional position information generated by the position detecting device 1. As illustrated in FIG. 5, as the three-dimensional position information, for example, each part of the player 100 and the three-dimensional coordinates are stored in association with each other. -
FIG. 6 is a diagram illustrating the position of the player 100, which is identified by the three-dimensional position information. In this embodiment, for example, a predetermined position corresponding to the position detecting device 1 (for example, the measurement reference position) is set as an origin point Ow. For example, the origin point Ow represents the three-dimensional coordinate corresponding to the measurement reference position of the infrared sensor 3. It should be noted that the position of the origin point Ow may be set anywhere in the three-dimensional space in which the player 100 exists. For example, the three-dimensional coordinate corresponding to the origin point Os of the photographed image may be set as the origin point Ow. - As illustrated in
FIG. 6, in this embodiment, description is given by taking, as an example, a case where sets of three-dimensional coordinates corresponding to, for example, the head P1, neck P2, right shoulder P3, left shoulder P4, right elbow P5, left elbow P6, right hand P7, left hand P8, chest P9, waist P10, right knee P11, left knee P12, right heel P13, left heel P14, right toe P15, and left toe P16 of the player 100 are acquired as the three-dimensional position information. - It should be noted that the part of the body of the
player 100, which is indicated by the three-dimensional position information, may be a part that is determined in advance from the player's skeletal frame. For example, any part of the body may be used as long as the part is identifiable by the above-mentioned pattern matching method. - In this embodiment, as described above, based on the photographed image and the depth image which are generated at the predetermined time intervals, the three-dimensional position information is generated at predetermined time intervals (for example, every 1/60th of a second). The generated three-dimensional position information is transmitted from the
position detecting device 1 to the game device 20 at predetermined time intervals. - The
game device 20 receives the three-dimensional position information transmitted from the position detecting device 1, and recognizes the position of the body of the player 100 based on the three-dimensional position information. Specifically, if the player 100 has performed an action of dancing or kicking a ball, the three-dimensional position information changes in response to this action, and hence the game device 20 recognizes the movement of the player based on the change in three-dimensional position information. The game device 20 executes the game while recognizing the movement of the body of the player based on the three-dimensional position information, details of which are described later. - Next, description is given of a space in which the
position detecting device 1 can detect the player 100 (hereinafter, referred to as detectable space 60). -
FIG. 7 is a diagram illustrating a space to be photographed by the position detecting device 1. As illustrated in FIG. 7, the detectable space 60 (space enclosed with broken lines of FIG. 7) is, for example, a predetermined space within the field of view of the CCD camera 2. The field of view of the CCD camera 2 is determined based, for example, on the line-of-sight and the angle of view of the CCD camera 2. - Of the space photographed by the position detecting device 1 (that is, space within the field of view), the
detectable space 60 is such a space as to allow accurate capturing of the movement of the player 100. - For example, in a case where the
position detecting device 1 and the player 100 are located too close to each other (for example, 1 meter or shorter), the position detecting device 1 is unable to photograph the entire body of the player 100. In such a case, for example, if the head, the foot, or the like of the player 100 is not contained in the photographed image (FIG. 2), the position detecting device 1 is unable to acquire accurate three-dimensional position information. Therefore, a space relatively close to the position detecting device 1 is excluded from the detectable space 60 even if the space is within the field of view of the CCD camera 2. - Further, for example, in a case where the
position detecting device 1 and the player 100 are located too far from each other (for example, 5 meters or longer), the infrared light is attenuated, which results in the position detecting device 1 being unable to detect the reflected light. In such a case, the position detecting device 1 is unable to acquire accurate depth information. Therefore, a space relatively far from the position detecting device 1 is excluded from the detectable space 60 even if the space is within the field of view of the CCD camera 2. - Further, for example, in a case where the standing position of the
player 100 is displaced in the horizontal direction (for example, Yw-axis direction), the position detecting device 1 is unable to photograph the entire body of the player 100. In such a case, the right side or the left side of the body of the player 100 is omitted from the photographed image (FIG. 2), and hence the position detecting device 1 is unable to acquire accurate three-dimensional position information. Therefore, a space close to either end portion in the horizontal direction is excluded from the detectable space 60 even if the space is within the field of view of the CCD camera 2. - As illustrated in
FIG. 7, the detectable space 60 is, for example, a space obtained by excluding the respective spaces described above from the space within the field of view of the CCD camera 2. In other words, the detectable space 60 is a space in which the position detecting device 1 can generate accurate three-dimensional position information when the player 100 is standing inside the space. The size (volume), shape, and position of the detectable space 60 may be determined in advance by, for example, a game creator, or may be changed according to a state of a room where the position detecting device 1 is installed. - In this embodiment, a determination
subject space 70 is set inside the detectable space 60. As illustrated in FIG. 7, for example, the determination subject space 70 is set at a predetermined position inside the detectable space 60 associated with the position detecting device 1. Further, the determination subject space 70 contains a representative point 71 for specifying a position at which the determination subject space 70 is to be placed. - The determination
subject space 70 is used to define a space in which the player 100 needs to be located. The size (volume) and shape of the determination subject space 70 may be determined in advance by, for example, a game creator. On the other hand, the position of the determination subject space 70 is changed, for example, according to the position of the player 100, details of which are described later. - Here, description is given of the significance of the determination
subject space 70 illustrated in FIG. 7. As described above, in the case where the player 100 is in the detectable space 60, in principle, the game device 20 can detect the action of the player 100. - However, there is a case where the
player 100 is unable to move freely in the detectable space 60. One example is a case where the player 100 plays the game in a living room or the like of their house as illustrated in FIG. 1. - As illustrated in
FIG. 1, items of furniture F, such as a desk and a chair, are placed in the living room, which is also bounded by walls and the like. Further, there is a case where another player 100 or a person who is watching the game is in the living room. Thus, in a case where the player 100 actually plays the game, various obstacles are often present inside the detectable space 60, and hence, in such a case, the player is unable to move freely in the detectable space 60. - Further, in a case where the
player 100 gets absorbed in the gameplay, there is a risk that the player 100 will not notice the existence of obstacles in their surroundings. Specifically, there is a risk that the player 100 will move their body despite the existence of an obstacle and hit their body against the obstacle. Further, in a case where a plurality of players 100 play the game simultaneously, there is a risk that those players 100 will hit each other because each of the players moves their body to play the game. - In view of this, in this embodiment, the determination
subject space 70 is set inside the detectable space 60, and the player is prompted to play the game in the determination subject space 70. Specifically, as long as the player 100 plays the game in the determination subject space 70, it is possible to reduce the risk of the player 100 hitting another player 100 or an obstacle. Therefore, the determination subject space 70 serves to show the player 100 a safe space in which the player 100 has a low risk of hitting an obstacle. - Normally, the
player 100 clears away surrounding items of furniture F from their standing position to ensure safety in their surroundings, and then starts to play the game. Thus, the position of the determination subject space 70 is determined based, for example, on the position of the player 100 at the time of game start (alternatively, immediately before or after the game start). For example, the position of the determination subject space 70 is set so as to contain a place where the player 100 is standing at the time of the game start (alternatively, immediately before or after the game start). - If the
player 100 remains inside the determination subject space 70 set as described above, there is a high possibility that there will be no such obstacle as a desk or a chair in their surroundings, and hence the player 100 can play the game more safely. - It should be noted that the setting method for the position of the determination
subject space 70 is not limited to the above-mentioned example. For example, an extension of the line-of-sight of the CCD camera 2 may be set as an initial position of the determination subject space 70. In this case, it is possible to prompt the player 100 to stand at a position facing the position detecting device 1 (for example, a position in the vicinity of the front of a TV set) to play the game. - It should be noted that in the example described above, one
player 100 plays the game, but a plurality of players 100 may play the game. In the case where there are a plurality of players 100, through the same processing as described above, the three-dimensional position information of each player 100 is generated. Specifically, based on the number of outlines of the players 100, the position detecting device 1 can recognize the number of the players 100. The same processing as described above is executed with respect to pixels corresponding to each of the plurality of players 100, and hence it is possible to generate the three-dimensional position information of the plurality of players 100. - Further, when the
player 100 is identified from the photographed image acquired by the position detecting device 1, an object having a predetermined height (for example, one meter) or shorter may be excluded. Specifically, in a case such as where the player 100 is sitting on the floor and thus their sitting height is equal to or shorter than the predetermined height, there is a risk that the player 100 will not be detected accurately. Therefore, in this case, the player 100 may be prevented from being detected. - As described above, the game is executed based on the three-dimensional position information of the
player 100 in the determination subject space 70. Hereinafter, an example of the game is described. - In this embodiment, description is given by taking, as an example, a case where the
game device 20 recognizes the standing position and the action of the player based on the three-dimensional position information to execute a dance game. - For example, the
game device 20 executes a game configured such that the player 100 dances to movements of a game character on a game screen 50. In this game, for example, the player 100 is required to play the game without moving away from a predetermined standing position. In view of this, according to the game screen 50 displayed in this embodiment, in a case where the player 100 has stepped out of the determination subject space 70, it is possible to prompt the player 100 to return to the determination subject space 70. -
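The determination implied here, namely whether the player's three-dimensional position is contained in the determination subject space, can be sketched as a containment test against a box around the representative point. Modeling the space as an axis-aligned box, and all of the sizes, are assumptions made for illustration:

```python
# Sketch: test whether the player's position is contained in the
# determination subject space, modeled (as an assumption) as an
# axis-aligned box centered on a representative point. Sizes in meters.
from dataclasses import dataclass

@dataclass
class SubjectSpace:
    cx: float
    cy: float
    cz: float
    half_w: float = 0.75
    half_h: float = 1.00
    half_d: float = 0.75

    def contains(self, x: float, y: float, z: float) -> bool:
        # Inside only if the offset from the center is within every half-extent.
        return (abs(x - self.cx) <= self.half_w
                and abs(y - self.cy) <= self.half_h
                and abs(z - self.cz) <= self.half_d)

space = SubjectSpace(0.0, 0.0, 2.0)
print(space.contains(0.2, 0.0, 2.1))  # True: player inside the space
print(space.contains(0.0, 0.0, 3.5))  # False: player stepped backward, outside
```

The game processing can then branch on this boolean result, for example to decide whether the warning display described next is needed.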
FIG. 8 is a diagram illustrating an example of thegame screen 50 displayed by thegame device 20. As illustrated inFIG. 8 , thegame screen 50 includes, for example, agame character 51, aspotlight 52, aspotlight area 53 being an area illuminated by thespotlight 52, and amessage 54. In the game according to this embodiment, in principle, thegame character 51 stands within the spotlight area 53 (focused area) and dances. - The lightness (brightness) of the
spotlight area 53 is higher (brighter) than the lightness of the other area. On the other hand, the lightness of an area outside the spotlight area 53 is lower (darker) than the lightness of the spotlight area 53. Specifically, in a case where the game character 51 moves out of the spotlight area 53, the game character 51 becomes less visible. - The
game character 51 moves the respective parts of its body, thereby serving to show a dance action to be performed by the player 100. According to movements of the body of the game character 51, the player 100 dances in front of the position detecting device 1. - For example, if the
game character 51 has stepped its right foot forward, theplayer 100 steps their right foot forward as well. Further, for example, if thegame character 51 has performed an action of raising its left hand, theplayer 100 performs an action of raising their left hand as well. In a case where theplayer 100 has succeeded in moving their body according to the action of thegame character 51, for example, themessage 54 that reads “GOOD” is displayed on thegame screen 50. - Further, in a case where the
player 100 has stepped out of the determinationsubject space 70, themessage 54 to that effect is displayed on thegame screen 50. -
FIG. 9 is a diagram illustrating, as an example, thegame screen 50 displayed by thegame device 20 in the case where theplayer 100 has stepped out of the determinationsubject space 70. As illustrated inFIG. 9 , for example, themessage 54 that reads “CAUTION” is displayed on thegame screen 50. Specifically, because theplayer 100 is outside the relatively safe determinationsubject space 70, themessage 54 that issues a warning is displayed. - Further, in this case, the display position of the
spotlight area 53 may be configured to correspond to the determinationsubject space 70 of theposition detecting device 1. Hereinafter, this example is described. Specifically, in the case where theplayer 100 is inside the determinationsubject space 70, thegame character 51 is located at a position with high lightness. In other words, in this case, thegame character 51 is located in thespotlight area 53. - On the other hand, in the case where the
player 100 has stepped out of the determinationsubject space 70, thegame character 51 is located at a position with low lightness. In other words, in this case, thegame character 51 is located in the area outside thespotlight area 53. -
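As a rough illustration of the mapping just described, the player's containment in the determination subject space 70 can decide whether the game character 51 is drawn in the bright spotlight area or the darker outside area. The sketch below is an assumption-laden illustration, not the patent's implementation: it simplifies the embodiment's truncated-pyramid space to an axis-aligned box, and all coordinates and names are hypothetical.

```python
# Illustrative sketch (not the patent's implementation): the character is
# drawn in the bright spotlight area only while every tracked part of the
# player lies inside the determination subject space. The axis-aligned box
# is a simplification of the embodiment's truncated pyramid.

def inside_space(point, box_min, box_max):
    """True if a 3D point lies within an axis-aligned box."""
    return all(lo <= c <= hi for c, lo, hi in zip(point, box_min, box_max))

def character_area(player_parts, box_min, box_max):
    """'spotlight' (high lightness) if every tracked part is inside the
    space; otherwise 'outside' (low lightness), prompting the player back."""
    if all(inside_space(p, box_min, box_max) for p in player_parts):
        return "spotlight"
    return "outside"

box_min, box_max = (-1.0, 0.0, 1.0), (1.0, 2.0, 3.0)
print(character_area([(0.0, 1.0, 2.0)], box_min, box_max))  # spotlight
print(character_area([(0.0, 1.0, 4.0)], box_min, box_max))  # outside
```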
FIG. 10 is a diagram illustrating the position detecting device 1 and the player 100 viewed from an Xw-Zw plane (that is, from the side). FIG. 10 illustrates a case where the player 100 has moved backward while dancing. As illustrated in FIG. 10, the player 100 has moved backward and therefore is out of the determination subject space 70. - If the player 100 is unaware of this fact and continues the gameplay, there is a risk that the player 100 will hit the furniture located behind them, such as a sofa. Thus, a game screen 50 is displayed that prompts the player 100 to move forward to return to the inside of the determination subject space 70. - FIG. 11 is an example of the game screen 50 displayed in the case where the player 100 has stepped out of the determination subject space 70. The game screen 50 of FIG. 11 is displayed in the case where the player 100 is standing at the position of FIG. 10. As illustrated in FIG. 11, the game character 51 is displayed in the area outside the spotlight area 53 (for example, at a predetermined position in the rear). As described above, the area outside the spotlight area 53 is set low in lightness. - Specifically, because the game character 51 is displayed in a relatively dark area, the player 100 finds the game character 51 less visible. In such a case, it is conceivable that the player 100 will move forward so as to cause the game character 51 to move to the spotlight area 53, which is bright and thus makes the game character 51 more visible. Specifically, in a case where the player 100 has stepped backward out of the determination subject space 70 when viewed from the position detecting device 1, by causing the position of the game character 51 to move out of the spotlight area 53, it is possible to prompt the player 100 to move toward the determination subject space 70. -
FIG. 12 is a diagram illustrating the position detecting device 1 and the player 100 viewed from an Xw-Yw plane (that is, from above). FIG. 12 illustrates a case where the player 100 has moved in the horizontal direction (for example, the Yw-axis direction) while dancing. As illustrated in FIG. 12, for example, the player 100 is displaced in the horizontal direction and is therefore out of the determination subject space 70. - FIG. 13 is an example of the game screen 50 displayed in the case where the player 100 has stepped out of the determination subject space 70. The game screen 50 of FIG. 13 is displayed in the case where the player 100 is standing at the position of FIG. 12. - Specifically, because the game character 51 is displayed in a relatively dark area, the player 100 finds the game character 51 less visible. The player 100 moves leftward so as to cause the game character 51 to move to the spotlight area 53, which is bright and thus makes the game character 51 more visible. Specifically, in a case where the player 100 has stepped in the horizontal direction out of the determination subject space 70 when viewed from the position detecting device 1, it is possible to prompt the player 100 to move toward the determination subject space 70. - As described above, for example, in a dance game configured such that the player 100 dances to the movements of the game character 51, there is a case where the standing position of the player 100 changes gradually during the gameplay. Specifically, even though the player 100 has ensured safety in their surroundings at the time of starting the game, if the player 100 gets absorbed in the game, there is a risk that the player 100 will move closer to an obstacle. To address this, the game device 20 sets the determination subject space 70 for showing the standing position of the player 100, and hence it is possible to show the standing position to the player 100. - By the way, in the case where the player 100 has stepped out of the determination subject space 70, if the position of the game character 51 is moved out of the spotlight area 53 as illustrated in FIG. 11 or FIG. 13, the game character 51 becomes less visible. It is possible to guide the player 100 to a safe position, but there is a case where no obstacle is actually placed in the area outside the determination subject space 70. Specifically, there is a fear of the inconvenience that the player 100 has difficulty playing the game because the determination subject space 70 is fixed to its initial position. Hereinafter, description is given of detailed processing relating to technology that solves this inconvenience. - First, detailed description is given of configurations of the
position detecting device 1 and the game device 20. - FIG. 14 is a diagram illustrating a hardware configuration of the position detecting device 1. As illustrated in FIG. 14, the position detecting device 1 includes a microprocessor 10, a storage unit 11, a photographing unit 12, a depth measuring unit 13, an audio processing unit 14, and a communication interface unit 15. The respective components of the position detecting device 1 are connected to one another by a bus 16 so as to be able to exchange data thereamong. - The microprocessor 10 controls the respective units of the position detecting device 1 according to an operating system and various kinds of programs which are stored in the storage unit 11. - The storage unit 11 stores programs and various kinds of parameters which are used for operating the operating system, the photographing unit 12, and the depth measuring unit 13. Further, the storage unit 11 stores a program for generating the three-dimensional position information based on the photographed image and the depth image. - The photographing unit 12 includes the CCD camera 2 and the like. The photographing unit 12 generates, for example, the photographed image of the player 100. - The depth measuring unit 13 includes the infrared sensor 3 and the like. The depth measuring unit 13 generates the depth image based, for example, on the TOF acquired using the infrared sensor 3. - As described above, the
microprocessor 10 generates the three-dimensional position information based on the photographed image generated by the photographing unit 12 and the depth image generated by the depth measuring unit 13. The microprocessor 10 identifies the positions of pixels corresponding to the respective parts (for example, head P1 to left toe P16) of the player 100 based on the photographed image. - Next, the microprocessor 10 executes coordinate transformation processing and calculates the three-dimensional coordinate based on the RGBD values of the identified pixels. The coordinate transformation processing is performed based on the matrix operation as described above. Through a series of those processing steps, the three-dimensional position information (FIG. 5) is generated at the predetermined time intervals (for example, every 1/60th of a second). - The audio processing unit 14 includes the microphone 4 and the like. For example, the audio processing unit 14 can identify a position at which the player 100 has made a sound based on time lags among sounds detected using a plurality of (for example, three) microphones. Further, as the microphone 4 of the audio processing unit 14, a unidirectional microphone that detects sounds originating from a sound source located along the line-of-sight of the CCD camera 2 may be applied. - The communication interface unit 15 is an interface for transmitting various kinds of data, such as the three-dimensional position information, to the game device 20. -
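The coordinate transformation processing described above, in which a pixel position and its depth value yield a three-dimensional coordinate, can be pictured as a standard camera back-projection. The sketch below is only an illustration under assumed parameters: the focal lengths and principal point are hypothetical, and the actual matrices used by the position detecting device 1 are not given in the text.

```python
# Hypothetical sketch of the matrix-based coordinate transformation: a pixel
# (u, v) with measured depth d is back-projected to a camera-space point.
# This is equivalent to multiplying (u, v, 1) by the inverse of a 3x3
# intrinsic matrix and scaling by the depth. FX, FY, CX, CY are assumed
# values, not parameters from the patent.
FX, FY = 525.0, 525.0   # focal lengths in pixels (hypothetical)
CX, CY = 320.0, 240.0   # principal point (hypothetical)

def pixel_to_3d(u, v, depth):
    """Back-project a pixel at `depth` meters to an (x, y, z) coordinate."""
    x = (u - CX) * depth / FX
    y = (v - CY) * depth / FY
    return (x, y, depth)

# A pixel at the principal point maps straight down the optical axis.
print(pixel_to_3d(320, 240, 2.0))  # (0.0, 0.0, 2.0)
```

Running this over every identified part pixel at each frame would produce the per-part coordinate sets that make up the three-dimensional position information.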
FIG. 15 is a diagram illustrating a hardware configuration of the game device 20. As illustrated in FIG. 15, the game device 20 according to this embodiment includes a home-use game machine 21, a display unit 40, an audio output unit 41, an optical disk 42, and a memory card 43. The display unit 40 and the audio output unit 41 are connected to the home-use game machine 21. For example, a home-use television set is used as the display unit 40. Further, for example, a speaker integrated into the home-use television set is used as the audio output unit 41. - The optical disk 42 and the memory card 43 are information storage media, and are inserted into the home-use game machine 21. - The home-use game machine 21 is a publicly-known computer game system, and, as illustrated in FIG. 15, includes a bus 22, a microprocessor 23, a main memory 24, an image processing unit 25, an audio processing unit 26, an optical disk reproducing unit 27, a memory card slot 28, a communication interface (I/F) 29, a controller interface (I/F) 30, and a controller 31. Components other than the controller 31 are accommodated in an enclosure of the home-use game machine 21. - The bus 22 is used for exchanging addresses and data among the units constituting the home-use game machine 21. Specifically, the microprocessor 23, the main memory 24, the image processing unit 25, the audio processing unit 26, the optical disk reproducing unit 27, the memory card slot 28, the communication interface 29, and the controller interface 30 are connected to one another by the bus 22 so as to be able to communicate data thereamong. - The microprocessor 23 executes various kinds of information processing based on an operating system stored in a ROM (not shown), or programs read from the optical disk 42 or the memory card 43. - The main memory 24 includes, for example, a RAM. The program and data read from the optical disk 42 or the memory card 43 are written into the main memory 24 as necessary. The main memory 24 is also used as a working memory for the microprocessor 23. - Further, the
main memory 24 stores the three-dimensional position information received from the position detecting device 1 at the predetermined time intervals. The microprocessor 23 controls the game based on the three-dimensional position information stored in the main memory 24. - The image processing unit 25 includes a VRAM, and renders, based on image data transmitted from the microprocessor 23, the game screen 50 in the VRAM. The image processing unit 25 converts the game screen 50 into video signals, and outputs the video signals to the display unit 40 at a predetermined timing. - The audio processing unit 26 includes a sound buffer. The audio processing unit 26 outputs, from the audio output unit 41, various kinds of audio data (game music, game sound effects, messages, etc.) that have been read from the optical disk 42 into the sound buffer. - The optical disk reproducing unit 27 reads a program and data recorded on the optical disk 42. In this embodiment, description is given by taking, as an example, a case where the optical disk 42 is used for supplying the program and the data to the home-use game machine 21. Alternatively, for example, another information storage medium (for example, the memory card 43 or the like) may be used. Further, the program and the data may be supplied to the home-use game machine 21 via a data communication network such as the Internet. - The memory card slot 28 is an interface for the memory card 43 to be inserted into. The memory card 43 includes a nonvolatile memory (for example, an EEPROM). The memory card 43 stores various kinds of game data, such as saved data. - The communication interface 29 is an interface for establishing communication connection to a communication network such as the Internet. - The controller interface 30 is an interface for establishing wireless connection or wired connection to the controller 31. As the controller interface 30, an interface compliant with, for example, the Bluetooth (registered trademark) interface standard may be used. It should be noted that the controller interface 30 may alternatively be an interface for establishing wired connection to the controller 31. -
FIG. 16 is a functional block diagram illustrating a group of functions to be implemented on the game device 20. As illustrated in FIG. 16, on the game device 20, there are implemented a game data storage unit 80, a position acquiring unit 82, a determination unit 84, a game processing execution unit 86, a determination subject space changing unit 88, and a display control unit 90. Those functions are implemented by the microprocessor 23 operating according to programs read from the optical disk 42. - The game data storage unit 80 is mainly implemented by the main memory 24 and the memory card 43. The game data storage unit 80 stores information necessary for executing the game. For example, the game data storage unit 80 stores animation information indicating how the game character 51 moves its body. - Further, for example, the game data storage unit 80 stores reference action information for identifying an action to be performed by the player 100. -
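As a rough illustration, the reference action information can be modeled as a table of timings and action identifiers against which the elapsed game time is checked. The entries, labels, and tolerance below are entirely hypothetical; they only echo the kind of example the surrounding description gives.

```python
# Hypothetical sketch of the reference action information: pairs of a timing
# (elapsed seconds since the game started) and the action the player should
# perform at that timing. The concrete times and labels are made up.
REFERENCE_ACTIONS = [
    (10.0, "put right foot forward"),
    (12.5, "raise left hand"),
]

def action_due(elapsed, tolerance=0.5):
    """Return the action whose stored time is within `tolerance` seconds
    of the elapsed time, or None if no action is currently due."""
    for t, action in REFERENCE_ACTIONS:
        if abs(elapsed - t) <= tolerance:
            return action
    return None

print(action_due(10.2))   # put right foot forward
print(action_due(11.5))   # None
```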
FIG. 17 is a diagram illustrating an example of the reference action information. As illustrated in FIG. 17, as the reference action information, time information indicating a timing at which an action is to be performed and information for identifying an action to be performed by the player 100 are stored. The time information indicates, for example, an elapsed time after the game is started. In the data storage example illustrated in FIG. 17, for example, a time t1 indicates that the player 100 should perform an action of putting their right foot forward. - As described above, the game character 51 plays a role of showing an action to be performed by the player 100, and thus, when the time t1 arrives, the game character 51 performs an action that looks like putting its right foot forward. The animation information is created in such a manner as to correspond to the reference action information illustrated in FIG. 17. Specifically, every time a time indicated by the time information stored in the reference action information arrives, the game character 51 performs a predetermined animation action based on the animation information. - Further, for example, the game data storage unit 80 stores action determination criterion information, which serves as a condition for making a determination as to the action of the player based on the three-dimensional position information. -
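One way to picture such a determination criterion is as thresholds on a body part's motion vector between two frames. The sketch below is an illustration under assumptions, not the stored criteria themselves: the threshold values, the choice of the forward (+Z) axis, and the single-part check are all hypothetical.

```python
# Illustrative sketch: checking a part's motion vector (change amount,
# change direction, change speed) against hypothetical criterion thresholds.
import math

def motion_vector(prev, curr, dt):
    """Change amount, unit direction, and speed between two 3D positions."""
    delta = tuple(c - p for p, c in zip(prev, curr))
    amount = math.sqrt(sum(d * d for d in delta))
    direction = tuple(d / amount for d in delta) if amount else (0.0, 0.0, 0.0)
    return amount, direction, amount / dt

def satisfies_criterion(prev, curr, dt, min_amount, min_forward, min_speed):
    """True if the part moved far and fast enough in the forward (+Z)
    direction; stands in for one row of the criterion information."""
    amount, direction, speed = motion_vector(prev, curr, dt)
    return amount >= min_amount and direction[2] >= min_forward and speed >= min_speed

# e.g. a heel moving 0.3 m straight forward within a 0.5 s window
print(satisfies_criterion((0, 0, 0), (0, 0, 0.3), 0.5, 0.2, 0.9, 0.4))  # True
```

A real check along these lines would evaluate the conditions for every part the criterion names (for example, both the right heel P13 and the right toe P15) before declaring the action performed.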
FIG. 18 is a diagram illustrating an example of the action determination criterion information. As illustrated in FIG. 18, as the action determination criterion information, for example, information for identifying the movement of the body of the player 100 and a determination criterion to be satisfied by the three-dimensional position information are stored in association with each other. The determination criterion includes, for example, a change amount, a change direction, a change speed, and the like of the three-dimensional coordinate of each part of the player 100. Specifically, for example, the determination criterion is a condition to be satisfied by the motion vector (three-dimensional vector) of each part of the player 100. - In a case where “putting the right foot forward” is the movement of the body which is stored in the action determination criterion information, for example, conditions relating to the change amounts, the change directions, and the change speeds of the sets of the three-dimensional coordinates of the right heel P13 and the right toe P15 are associated with this movement of the body. In this case, if the change amounts, the change directions, and the change speeds of the sets of the three-dimensional coordinates of the right heel P13 and the right toe P15 satisfy the conditions stored in the action determination criterion information, it is determined that the player 100 has put their right foot forward. - The same applies to other actions of the player 100 (for example, punching with the right hand), and the action of the player 100 is determined based on whether or not the three-dimensional coordinates indicated by the three-dimensional position information satisfy the conditions stored in the action determination criterion information. Specifically, in this embodiment, the determination criterion information stores information for making a determination as to the dancing of the player 100. It should be noted that the determination criterion information may be stored in a ROM (not shown) or the like of the game device 20. - Further, the game
data storage unit 80 stores, for example, determination subject space information for identifying the determination subject space 70. For example, in a case where the shape of the determination subject space 70 is a truncated pyramid as illustrated in FIG. 7, the length of each side of the determination subject space 70 and information indicating the representative point 71 are stored. That is, based on those items of information, the position of the determination subject space is identified. Further, the length of each side of the determination subject space 70 may be a value determined in advance. - For example, when the player 100 starts the game, the initial position of the representative point 71 is determined. For example, the position of the determination subject space 70 is determined so as to contain the position of the player 100 when the game is started. Specifically, for example, the representative point 71 is determined so as to correspond to the standing position of the player 100 at the time of starting the game. Alternatively, for example, the representative point 71 may be a point located along the line-of-sight of the CCD camera 2. - It should be noted that information that may be used as the determination subject space information is not limited to the above-mentioned example. The determination subject space information may be any information as long as the information allows the position and the size of the determination subject space 70 to be identified. For example, in the case where the shape of the determination subject space 70 is a truncated pyramid, the determination subject space information may be information indicating the upper left vertices and the lower right vertices of the top surface and the bottom surface of the determination subject space 70 and information indicating the representative point 71. - Further, the game data storage unit 80 stores information for identifying the detectable space 60. Similarly to the determination subject space information, this information may be any information as long as the information allows the position and the size of the detectable space 60 to be identified. - The
position acquiring unit 82 is mainly implemented by the microprocessor 23. The position acquiring unit 82 acquires the three-dimensional position information (FIG. 5) from position information generating means (microprocessor 10) for generating the three-dimensional position information relating to the position of the player 100 in the three-dimensional space based on the photographed image acquired from the position detecting device (photographing unit 12) for photographing the player 100 and the depth information relating to a distance between the measurement reference position of the depth measuring means (depth measuring unit 13) and the player 100. - In this embodiment, the position acquiring unit 82 acquires the three-dimensional position information generated by the microprocessor 10 (position information generating means) of the position detecting device 1. - The determination unit 84 is mainly implemented by the microprocessor 23. The determination unit 84 determines whether or not the position of the player 100 in the three-dimensional space is contained in the determination subject space 70. For example, in a case where any one of the sets of the three-dimensional coordinates contained in the three-dimensional position information is outside the determination subject space 70, it is determined that the position of the player corresponding to the three-dimensional position information is not contained in the determination subject space 70 of the position detecting device 1. - It should be noted that the determination method performed by the determination unit 84 may be any method as long as the method is performed based on the three-dimensional position information and the determination subject space 70, and that the determination method of the determination unit 84 is not limited thereto. For example, in a case where sets of the three-dimensional coordinates corresponding to a plurality of (for example, three) portions of a plurality of (for example, sixteen) parts of the player 100 indicated by the three-dimensional position information are outside the determination subject space 70, it may be determined that the position of the player 100 is not contained in the determination subject space 70. - The game
processing execution unit 86 is mainly implemented by the microprocessor 23. The game processing execution unit 86 executes game processing based on a result of a determination made by the determination unit 84. Details of operation of the game processing execution unit 86 are described later (see S105, S106, and S107 of FIG. 19). - The determination subject space changing unit 88 is mainly implemented by the microprocessor 23. In a case where it is determined that the position of the player 100 in the three-dimensional space is not contained in the determination subject space 70, the determination subject space changing unit 88 changes the position of the determination subject space 70 based on the position of the player 100 in the three-dimensional space. Details of operation of the determination subject space changing unit 88 are described later (see S108 and S109 of FIG. 19). - The display control unit 90 is mainly implemented by the microprocessor 23. The display control unit 90 displays the game screen 50 on the display unit 40. In this embodiment, the display control unit 90 causes display means (display unit 40) to display the game screen 50 containing the game character 51 and the focused area (spotlight area 53) having its lightness set higher than that of the other area. - Further, the display control unit 90 includes means for controlling the positional relation between the display position of the game character 51 and the display position of the focused area based on the positional relation between the position of the player 100 in the three-dimensional space and the determination subject space 70. Details of operation of the display control unit 90 are described later (see S102 of FIG. 19). -
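The positional-relation control performed by the display control unit 90 can be pictured as mapping the 3D displacement between the representative point 71 and the player's waist P10 to a 2D offset applied to a guidance position on the screen. The sketch below is hypothetical: the patent leaves the projection unspecified (mentioning only a predetermined mathematical expression such as a projection matrix), so a simple drop-one-axis projection and an assumed pixel scale stand in for it.

```python
# Illustrative sketch (assumed projection, not the patent's expression):
# the player's 3D displacement from the representative point is turned into
# a 2D screen offset applied to the guidance position of the spotlight area.

PIXELS_PER_METER = 100.0  # hypothetical screen scale factor

def character_position(guidance_pos, rep_point, waist):
    """2D display position of the character for a given player displacement."""
    dx = waist[0] - rep_point[0]   # horizontal displacement (Xw) -> screen X
    dz = waist[2] - rep_point[2]   # depth displacement (Zw) -> screen Y
    gx, gy = guidance_pos
    return (gx + dx * PIXELS_PER_METER, gy + dz * PIXELS_PER_METER)

# Player centered on the representative point: character at the guidance
# position, i.e. inside the spotlight area.
print(character_position((400, 300), (0, 1, 2), (0, 1, 2)))    # (400.0, 300.0)
# Player displaced 0.5 m sideways: character shifted out toward the edge.
print(character_position((400, 300), (0, 1, 2), (0.5, 1, 2)))  # (450.0, 300.0)
```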
FIG. 19 is a flow chart illustrating an example of processing to be executed on the game device 20. The processing of FIG. 19 is executed by the microprocessor 23 operating according to programs read from the optical disk 42. For example, the processing of FIG. 19 is executed at predetermined time intervals (for example, every 1/60th of a second). - As illustrated in FIG. 19, first, the microprocessor 23 (position acquiring unit 82) acquires the three-dimensional position information of the player 100 (S101). - The microprocessor 23 (display control unit 90) changes the position of the game character 51 to be displayed on the game screen 50 (S102). In S102, for example, the display position of the game character 51 is changed based on the positional relation between the three-dimensional position information of the player 100 and the representative point 71. For example, a determination is made as to the positional relation between the three-dimensional coordinates of the waist P10 contained in the three-dimensional position information of the player 100 and the representative point 71. Specifically, a direction D from the representative point 71 of the determination subject space 70 toward the three-dimensional coordinate of the waist P10 of the player and a distance L therebetween are acquired (FIG. 7). - Next, the display position of the game character 51 is changed so that the positional relation between the display position of the game character 51 and a guidance position 55 of the spotlight area 53 corresponds to the positional relation between the three-dimensional coordinate of the waist P10 of the player and the representative point 71 of the determination subject space 70. - For example, as illustrated in FIG. 11 and FIG. 13, the display position of the game character 51 is changed from the guidance position 55 of the spotlight area 53 to a position 57 obtained by shifting the guidance position 55 of the spotlight area 53 by a distance Ls corresponding to the above-mentioned distance L in a direction Ds corresponding to the above-mentioned direction D. The direction Ds and the distance Ls are respectively calculated based, for example, on the direction D or the distance L and a predetermined mathematical expression. The predetermined mathematical expression may be, for example, a predetermined matrix (for example, a projection matrix) for transforming a three-dimensional vector into a two-dimensional vector. - Further, the guidance position 55, which is a position for guiding the game character 51, corresponds to the representative point 71. For example, the guidance position 55 is a position located a predetermined distance above a center point of the spotlight area 53. - Through the processing of S102, the display position of the game character 51 is controlled. Specifically, by referring to the positional relation between the game character 51 and the spotlight area 53 which are displayed on the game screen 50, the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70. - As a result, the player can adjust their own standing position. Further, in a case where the game character 51 has moved out of the spotlight area 53, the game character 51 becomes less visible, and hence it is conceivable that the player will unconsciously adjust their own standing position so that the game character 51 is positioned within the spotlight area 53. By controlling the display position of the game character 51 as described above, it also becomes possible to make the player unconsciously adjust their own standing position. - Referring back to
FIG. 19, the microprocessor 23 (display control unit 90) updates the posture of the game character 51 displayed on the game screen 50 based on animation data (S103). - The microprocessor 23 (determination unit 84) determines whether or not at least one position of the body of the player 100 indicated by the three-dimensional position information is outside the determination subject space 70 (S104). The determination of S104 is performed by, for example, comparing the three-dimensional position information and the determination subject space information. Specifically, for example, a determination is made based on whether or not the three-dimensional coordinates (FIG. 5) contained in the three-dimensional position information are inside the determination subject space 70 (FIG. 7). - In a case where the position of the player is not outside the determination subject space 70 (S104; N), that is, in a case where all the positions corresponding to the player 100 are inside the determination subject space 70, the microprocessor 23 (game processing execution unit 86) determines whether or not the player 100 has moved their body according to the movement of the body of the game character 51 (S105). - In S105, it is determined whether or not the player 100 has performed an action similar to the action (movement of the body) performed by the game character 51. This determination is executed based, for example, on the three-dimensional position information, the reference action information (FIG. 17), and the determination criterion information (FIG. 18). - In a case where the reference action information is the data storage example illustrated in FIG. 17, for example, it is indicated that the game character 51 puts its right foot forward at the time t1. In this case, at a time at which the game character 51 puts its right foot forward (hereinafter referred to as “reference time”), it is determined whether or not the player has put their right foot forward. The reference time is, for example, a time within a predetermined period including times (for example, the time t1) stored in the reference action information. - Here, the “predetermined period” is, for example, a period from a start time that is a predetermined time before the reference time until an end time that is a predetermined time after the reference time. In a case where the player 100 has put their right foot forward within the above-mentioned predetermined period, it is determined that the player has moved their foot according to the movement of the foot of the game character 51. In other words, it is determined that the player 100 has performed an action according to the movement of the game character 51. As described above, whether or not the player 100 has put their right foot forward is determined based on the determination criterion information (FIG. 18). - In a case where it is determined that the player 100 has performed an action according to the movement of the game character 51 (S105; Y), the microprocessor 23 (game processing execution unit 86) displays the message 54 such as “GOOD” on the game screen 50 (S106). - On the other hand, in a case where it is not determined that the player 100 has performed an action according to the movement of the game character 51 (S105; N), the microprocessor 23 does not display such a message as in S106 and ends the processing. - On the other hand, in a case where at least one position of the body of the player is outside the determination subject space 70 (S104; Y), the microprocessor 23 (game processing execution unit 86) displays the
message 54 such as “CAUTION” on the game screen (S107). - The microprocessor 23 (determination subject space changing unit 88) determines whether or not a state in which at least one position of the body of the player is outside the determination
subject space 70 has continued for a reference period (for example, three seconds) (S108). - In a case where the state in which at least one position of the body of the player is outside the determination
subject space 70 has continued for the reference period (S108; Y), the microprocessor 23 (determination subject space changing unit 88) changes the position of the determination subject space 70 (S109). - In S109, for example, the three-dimensional coordinate of the waist P10 in the three-dimensional position information is referred to. Subsequently, the position of the determination
subject space 70 is changed so that the representative point 71 of the determination subject space 70 coincides with the three-dimensional coordinate corresponding to the waist P10 of the player. -
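The repositioning performed in S109 can be sketched as follows. This is an illustrative sketch only: the class and method names are assumptions, and the axis-aligned cuboid is one possible representation of the determination subject space 70 with its representative point 71 at the center.

```python
# Hypothetical sketch of the S109 repositioning: the determination subject
# space is translated so that its representative point coincides with the
# player's waist coordinate P10. Names are illustrative, not from the patent.
from dataclasses import dataclass

@dataclass
class DeterminationSubjectSpace:
    representative_point: tuple[float, float, float]  # e.g. center of the cuboid
    half_extents: tuple[float, float, float]          # half-widths along x, y, z

    def move_to(self, waist: tuple[float, float, float]) -> None:
        # Translate the space so the representative point matches the waist P10.
        self.representative_point = waist

    def contains(self, p: tuple[float, float, float]) -> bool:
        # S104-style check: is the given coordinate inside the space?
        return all(abs(p[i] - self.representative_point[i]) <= self.half_extents[i]
                   for i in range(3))

space = DeterminationSubjectSpace((0.0, 1.0, 2.0), (0.5, 1.0, 0.5))
space.move_to((0.8, 1.0, 2.4))  # waist P10 read from the 3D position information
```

After `move_to`, the waist coordinate is by construction contained in the space, which is the property the embodiment relies on to let the player continue playing.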
FIG. 20 is a diagram illustrating the determination subject space 70 after the change. As illustrated in FIG. 20, the position of the determination subject space 70 is changed so that the position of the waist P10 of the player 100 coincides with the representative point 71. - It should be noted that the change method for the position of the determination
subject space 70 performed in S109 may be any method as long as the position of the player 100 is contained in the determination subject space 70 based on the position of the player 100 indicated by the three-dimensional position information, and that the change method is not limited to the above-mentioned example. In addition, for example, the position of the determination subject space 70 may be changed so that an average value of the sets of three-dimensional coordinates contained in the three-dimensional position information coincides with the representative point 71. - For example, in a case where the
player 100 has moved closer to an obstacle, it is conceivable that the player 100 will notice that fact within a predetermined time period and return to the original position. Specifically, in view of the above, in a case where the state in which the player 100 is out of the determination subject space 70 before the change has continued for the predetermined time period, it is conceivable that there is a high possibility that there is no obstacle around the current standing position of the player 100. Thus, in this case, in order to allow the player 100 to continue the gameplay, as illustrated in FIG. 19, the position of the determination subject space 70 is changed so that the position of the body of the player is contained in the determination subject space 70. - On the other hand, in a case where the state in which at least one position of the body of the player is outside the determination
subject space 70 has not continued for the reference period (S108; N), the microprocessor 23 ends the processing. Specifically, in this case, the microprocessor 23 does not perform the processing of displaying the message 54 such as "GOOD" based on the movement of the body of the player 100. In other words, by making no response to the dance action performed by the player 100, it is also possible to make the player 100 understand that the player 100 is not in the determination subject space 70. - In the
game device 20 described above, when the game processing is executed, whether the message 54 such as "CAUTION" is to be displayed, or whether detection of the action of the player 100 is to be skipped, is determined based on whether or not the player 100 is in the determination subject space 70. Further, the position of the determination subject space 70 is changed based on the position of the player 100 in the three-dimensional space, and hence it is possible to change the determination subject space 70 so that the player 100 can continue the gameplay while being prompted to stay in the determination subject space 70. - As described above, at the time of starting the game (alternatively, immediately before or after starting the game), it is possible to ensure safety of the
player 100 by confirming that their surroundings are safe. Accordingly, by guiding the player 100 to this safe position, it is possible to reduce the risk that the player 100 will hit an obstacle or another player 100. Therefore, even if a game is configured to require the player 100 to move their body, the player 100 can play the game safely. - Further, based on the positional relation between the position of the body of the
player 100 and the position of the determination subject space 70, the positional relation between the display position of the game character 51 and the display position of the spotlight area 53 is controlled. For example, in the case where the position of the body of the player 100 is out of the determination subject space 70, the game character 51 is located outside the spotlight area 53 (see FIG. 11 and FIG. 13). - According to the
game device 20, by referring to the positional relation between the game character 51 and the spotlight area 53 which are displayed on the game screen 50, the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70. As a result, in the case where the standing position of the player 100 has changed during the gameplay, the player 100 can know that their standing position has changed. Therefore, the player 100 can adjust their own standing position. - Further, in the case where the
game character 51 has moved out of the spotlight area 53, the game character 51 becomes less visible. Accordingly, it is conceivable that the player 100 will try to unconsciously adjust their own standing position so that the game character 51 is located within the spotlight area 53. As described above, according to the game device 20, it is also possible to make the player 100 unconsciously adjust their own standing position. - Further, in the
game device 20, in the case where the state in which at least one position of the body of the player 100 is outside the determination subject space 70 has continued for the reference period (for example, three seconds), the position of the determination subject space 70 is changed so that the position of the body of the player 100 is contained in the determination subject space 70 (see FIG. 20). - According to the above-mentioned processing of the
game device 20, in the state in which the position of at least one part of the body of the player 100 is outside the determination subject space 70 because the standing position of the player 100 has changed during the gameplay, it is possible to continue the game even if the player 100 does not adjust their standing position. As described above, in the case where the state in which the player 100 is outside the determination subject space 70 has continued for the reference period, there is a high possibility that no obstacle is around the standing position of the player 100, and hence it is possible to guarantee safety for the player 100 even if the position of the determination subject space 70 is changed. - Further, for example, if the position of the determination
subject space 70 is changed even in a case where any one position of the body of the player is outside the determination subject space 70 for a brief moment, there arises a fear that the player will become confused instead. In this respect, the game device 20 is capable of preventing the player from feeling such confusion. - It should be noted that the present invention is not limited to the embodiment described above.
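The reference-period condition of S108 — reposition only after the player has been continuously outside the space, so that a brief excursion causes no change — can be sketched as a small timer. The class and variable names here are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of the S108 check: the determination subject space is
# repositioned only after the player has been outside it continuously for a
# reference period (the embodiment's example value is three seconds).
from typing import Optional

REFERENCE_PERIOD = 3.0  # seconds

class OutsideTimer:
    def __init__(self, reference_period: float = REFERENCE_PERIOD) -> None:
        self.reference_period = reference_period
        self.outside_since: Optional[float] = None

    def update(self, now: float, inside: bool) -> bool:
        """Return True when the space should be repositioned (S108; Y)."""
        if inside:
            # A brief excursion resets the timer, so a momentary step outside
            # does not move the space and confuse the player.
            self.outside_since = None
            return False
        if self.outside_since is None:
            self.outside_since = now
        return now - self.outside_since >= self.reference_period

timer = OutsideTimer()
timer.update(0.0, inside=False)   # player steps out
timer.update(1.0, inside=True)    # back inside: timer resets
timer.update(2.0, inside=False)   # out again
repositioned = timer.update(5.0, inside=False)  # 3 s continuously outside
```

The reset on re-entry is what implements the "brief moment" exemption described in the paragraph above.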
- In S102 of
FIG. 19, the display position of the spotlight area 53 (and the spotlight 52) may be changed so that the positional relation between the display position of the game character 51 and the guidance position 55 of the spotlight area 53 corresponds to the positional relation between the position of the player 100 (for example, the three-dimensional coordinate of the waist P10) and the representative point 71 of the determination subject space 70. In other words, the spotlight area 53 may be moved instead of moving the game character 51. - Alternatively, the display positions of both the
game character 51 and the spotlight area 53 (and the spotlight 52) may be changed so that the positional relation between the display position of the game character 51 and the guidance position 55 of the spotlight area 53 corresponds to the positional relation between the position of the player 100 (for example, the three-dimensional coordinate of the waist P10) and the representative point 71 of the determination subject space 70. - Further, without changing the relative positions of the
game character 51, the spotlight area 53, and the like, the display position of the game character 51 may be moved to the right-hand side, the left-hand side, or the like of the game screen 50. Specifically, for example, in a case where the game screen 50 is a screen showing a situation of a virtual game space viewed from a virtual camera, by changing the position of the virtual camera, the display position of the game character 51 is changed as described above. -
FIG. 21 is a diagram illustrating a case where the display position of an image contained in the game screen 50 has been changed. The game screen 50 illustrated in FIG. 21 is displayed, for example, in a case where the player 100 has moved to the right-hand side of the determination subject space 70 with respect to the position detecting device 1. The position of the virtual camera is changed to the left, and thus, for example, the game character 51 located in the vicinity of the center of the game screen 50 is moved to the right in relation to a center point of the game screen 50. - Further, for example, like an
area 50a, the vicinity of a left end portion of the game screen 50 is displayed in black. The width and the position of the area 50a are determined based, for example, on the distance L and the direction D between the representative point 71 and the waist P10 of the player 100. It seems to the player 100 that nothing is displayed in the area 50a located in the vicinity of the left end portion of the game screen 50. In this case, it is conceivable that the player 100 will move to their left, trying to move the display position of the game character 51 back to the original position. Thus, according to the game screen 50, it is possible to guide the player 100 to the inside of the determination subject space 70. - By performing the display control of the
game screen 50 as described above, it is possible to notify the player 100 that their standing position is displaced, without changing the relative positions of the respective images (the game character 51 and the like) contained in the game screen 50. Here, the description above is directed to the case where the position of the virtual camera is changed. However, the notification of the standing position of the player 100 may be performed by changing the angle of view or the line of sight of the virtual camera. - Further, the
game screen 50 only needs to show the positional relation between the standing position of the player 100 and the representative point 71 of the determination subject space 70, and thus the example of the game screen 50 is not limited to the example of this embodiment. -
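The display control described above — making the on-screen positional relation correspond to the positional relation between the player's position and the representative point 71 — amounts to mapping a 3D displacement to a screen-space offset. The following sketch illustrates one way to do this; the scale factor, axis conventions, and function name are assumptions for illustration, not taken from the patent.

```python
# Minimal sketch: the horizontal displacement of the player's waist P10 from
# the representative point 71 is mapped to a screen-space offset for a
# displayed image (the game character 51, the spotlight area 53, or both).

def screen_offset(waist, representative_point, pixels_per_meter=200.0):
    """Map the x/z displacement in the 3D space to an x/y offset in pixels."""
    dx = waist[0] - representative_point[0]   # sideways displacement
    dz = waist[2] - representative_point[2]   # front/back displacement
    # The distance L and direction D are encoded here as the 2D vector
    # (dx, dz); the on-screen offset is simply a scaled copy of that vector.
    return (dx * pixels_per_meter, dz * pixels_per_meter)

# Player 0.25 m to the right of the representative point: the displayed
# image is shifted 50 pixels horizontally.
offset = screen_offset((0.25, 1.0, 2.0), (0.0, 1.0, 2.0))
```

Because the offset shrinks to zero as the player returns to the representative point, the display itself guides the player back toward the center of the determination subject space.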
FIG. 22 is a diagram illustrating another example of the game screen 50. On the game screen 50 illustrated in FIG. 22, a player character 51a (first game character) corresponding to the player 100 and an instructor character 51b (second game character) are displayed. - In this case, for example, the
player 100 moves their body according to the movement of the instructor character 51b. Then, based on the movement of the player 100, the player character 51a performs an action. Similarly to the embodiment, in a case where the player 100 has succeeded in performing the action, the message 54 such as "GOOD" is displayed. - Alternatively, the
player character 51a and the instructor character 51b may move in the same manner, and the player 100 may move their body according to the movement of the player character 51a and the instructor character 51b. - In the second modified example, the positional relation between the
player character 51a and the instructor character 51b is changed based on the positional relation between the position of the player 100 and the determination subject space 70. For example, in the case where the position of the player 100 is contained in the determination subject space 70, the player character 51a is displayed substantially in front of the instructor character 51b as illustrated in FIG. 22. - On the other hand, for example, in the case where the position of the
player 100 is out of the determination subject space 70 as illustrated in FIG. 10, the player character 51a is displayed at a position displaced far from the instructor character 51b as illustrated in FIG. 23, for example. Further, the message 54 such as "CAUTION" is displayed. - Further, for example, in the case where the position of the
player 100 is out of the determination subject space 70 as illustrated in FIG. 12, the player character 51a is displayed at a position significantly displaced sideways from the instructor character 51b as illustrated in FIG. 24, for example. - In the second modified example, for example, processing similar to the processing of S102 of
FIG. 19 is executed. Specifically, at least one of the display positions of the player character 51a and the instructor character 51b is changed so that the positional relation between the display position of the player character 51a and the display position of the instructor character 51b corresponds to the positional relation between the position of the player 100 and the representative point 71 of the determination subject space 70. - For example, first, the three-dimensional coordinate of the waist P10 of the
player 100 is referred to. Subsequently, a determination is made as to the positional relation between the three-dimensional coordinate of the waist P10 of the player 100 and the representative point 71 of the determination subject space 70. For example, a difference between the three-dimensional coordinate of the waist P10 of the player 100 and the representative point 71 of the determination subject space 70 is acquired. Specifically, the direction D from the representative point 71 of the determination subject space 70 toward the three-dimensional coordinate of the waist P10 of the player 100 and the distance L therebetween are acquired. - After that, the display position of the
player character 51a is changed so that the positional relation between the display position of the player character 51a and the display position of the instructor character 51b corresponds to the positional relation between the three-dimensional coordinate of the waist P10 of the player 100 and the representative point 71 of the determination subject space 70. For example, as illustrated in FIG. 23 and FIG. 24, the display position of the player character 51a is changed from a basic position 56 set in front of the instructor character 51b to a position 57 obtained by shifting the basic position 56 by the distance Ls corresponding to the above-mentioned distance L in the direction Ds corresponding to the above-mentioned direction D. - According to the second modified example, by referring to the positional relation between the
player character 51a and the instructor character 51b, the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70. As a result, in such a case where the standing position of the player 100 has changed during the gameplay, the player 100 can know that their standing position has changed, and accordingly can adjust their own standing position. Therefore, the player 100 can play the game in a relatively safe place within the determination subject space 70. - Here, in the case where the position of the
player character 51a is displaced from the front of the instructor character 51b or displaced far from the instructor character 51b, it is generally conceivable that the player 100 will try to set the position of the player character 51a to the front of the instructor character 51b. - Specifically, in the case where the position of the
player character 51a is displaced from the front of the instructor character 51b or displaced far from the instructor character 51b, the player 100 conceivably feels difficulty in imitating the movement of the instructor character 51b. Therefore, it is conceivable that the player 100 will unconsciously adjust their own standing position so that the position of the player character 51a is set to the front of the instructor character 51b. - As described above, by controlling the positional relation between the
player character 51a and the instructor character 51b, it is possible to make the player 100 unconsciously adjust their own standing position. - In the second modified example, too, the display control as illustrated in
FIG. 21 may be performed. Specifically, without changing the relative positions of the player character 51a and the instructor character 51b, the display positions of the player character 51a and the instructor character 51b may be moved to the right-hand side, the left-hand side, or the like of the game screen 50. In this case, too, similarly to the case illustrated in FIG. 21, by changing the position of the virtual camera, for example, the display positions of the player character 51a and the instructor character 51b are changed as described above. -
FIG. 25 is a diagram illustrating a case where the display position of an image contained in the game screen 50 has been changed. The game screen 50 illustrated in FIG. 25 is displayed, for example, in the case where the player 100 has moved to the right-hand side of the determination subject space 70 with respect to the position detecting device 1. The position of the virtual camera is changed to the left, and thus, for example, the player character 51a and the instructor character 51b located in the vicinity of the center of the game screen 50 are moved to the right in relation to the center point of the game screen 50. - Further, for example, similarly to
FIG. 21, the area 50a is displayed. In this case, it is conceivable that the player 100 will move to their left so as to move the display positions of the player character 51a and the instructor character 51b back to the original positions. Thus, according to the game screen 50, it is possible to guide the player 100 to the inside of the determination subject space 70. - It should be noted that the present invention is not limited to the embodiment and the modified examples which are described above, and that various modifications may be made as needed without departing from the gist of the present invention.
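The black guidance area 50a, whose width and position are determined from the distance L and the direction D, might be derived as in the following sketch. The screen size, pixel scale, and clamping fraction are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: the black band (area 50a) is drawn on the side
# opposite the player's displacement, with a width proportional to the
# distance L, so the player is nudged back toward the representative point.

def guidance_band(dx_meters: float, screen_width: int = 1280,
                  pixels_per_meter: float = 300.0, max_fraction: float = 0.25):
    """Return (side, width_px) for the black area, or (None, 0) when centered."""
    if dx_meters == 0.0:
        return (None, 0)
    # Clamp so the band never covers more than a fixed fraction of the screen.
    width = min(abs(dx_meters) * pixels_per_meter, screen_width * max_fraction)
    # Player moved right (+dx): the scene shifts right and the band appears
    # along the left edge, prompting the player to step back to their left.
    side = "left" if dx_meters > 0 else "right"
    return (side, int(width))

band = guidance_band(0.5)  # player 0.5 m to the right of the representative point
```

Making the band width proportional to the displacement gives the player continuous feedback: the band shrinks as they step back toward the determination subject space.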
- (1) For example, the three-dimensional position information indicating the position of the
player 100 has been described by taking, as an example, the data storage example illustrated in FIG. 5. However, the three-dimensional position information transmitted from the position detecting device 1 may be any information as long as the information allows the position (for example, standing position) of the player 100 to be identified, and thus the data storage example is not limited to the example of FIG. 5. Alternatively, for example, the three-dimensional position information may be such information that indicates a distance and a direction from a reference point of the player 100 (for example, a point corresponding to the head) to each part of the body. - (2) Further, for example, the description above has been given by taking the example in which the position information generating means for generating the three-dimensional position information based on the photographed image and the depth information (depth image) is included in the
position detecting device 1. However, the position information generating means may be included in the game device 20. Specifically, the game device 20 may receive the photographed image and the depth image from the position detecting device 1, to thereby generate the three-dimensional position information based on those images. - (3) Further, for example, the description above has been given by taking, as a method of analyzing the movement of the
player 100 based on the three-dimensional position information, the example in which a comparison is made between the action determination criterion information illustrated in FIG. 18 and the change amount, the change direction, the change speed, etc. of the three-dimensional coordinate of each part of the player 100. The analysis method for the movement of the player 100 may be any method as long as the method is performed based on the three-dimensional position information, and thus the analysis method is not limited to the above-mentioned example. Alternatively, for example, the movement of the player 100 may be analyzed based on values acquired by substituting the three-dimensional coordinates contained in the three-dimensional position information into a predetermined mathematical expression. - (4) Further, for example, in the case where a plurality of
players 100 play the game, such control may be performed that prevents the determination subject spaces 70 corresponding to the respective players 100 from overlapping each other. For example, in a case where two players 100 play the game, the three-dimensional position information contains sets of the three-dimensional coordinates for the two players. In a case where changing the determination subject space 70 so that the representative point 71 moves to the three-dimensional coordinate of the waist P10 of one player 100 causes the changed determination subject space 70 to overlap the determination subject space 70 of the other player 100, there is a risk that the two players will hit each other, and thus control may be performed so as to prevent such change. - Specifically, in the case where the game executed by the
game device 20 is played by a plurality of players 100, the determination subject space changing unit 88 may include means for inhibiting change in the case where changing the position of the determination subject space 70 corresponding to one player 100 causes the changed determination subject space 70 to overlap the determination subject space 70 corresponding to another player 100. - (5) Further, in the first modified example and the second modified example, the reference points, which are referred to when the
display control unit 90 controls the display position and which represent the positional relation between the position of the player 100 and the determination subject space 70, are set to the three-dimensional coordinate of the waist P10 and the representative point 71, respectively. The display control unit 90 only needs to determine whether or not to control the display position based on the three-dimensional position information corresponding to the player 100 and information identifying the position of the determination subject space 70, and thus the information items to be compared are not limited to the above-mentioned example. For example, a comparison may be made between an average value of sets of the three-dimensional coordinates contained in the three-dimensional position information and one arbitrary point within the determination subject space 70. - (6) Further, in this embodiment, the method of measuring the depth of the
player 100 has been described by taking, as an example, the case of performing calculation based on the TOF of the infrared light. However, the measuring method is not limited to the example of this embodiment. Alternatively, for example, a method of performing triangulation, a method of performing three-dimensional laser scanning, or the like may be applied. Further, the description has been given by taking the example in which the depth information is acquired as the depth image, but the depth information is not limited thereto. The depth information may be any information as long as the information allows the depth of the player 100 to be identified, and hence the depth information may be a value indicating the TOF, for example. - (7) Further, the determination
subject space 70 has been described by taking, as an example, the shape illustrated in FIG. 7, but the shape of the determination subject space 70 is not limited thereto. The determination subject space 70 may have any shape as long as the shape allows the position at which the player 100 should be standing to be identified, and hence the determination subject space 70 may have a spherical shape, for example. In this case, the game data storage unit 80 stores the representative point 71 of the determination subject space 70 (for example, the center point of the sphere) and information for identifying the radius of the sphere. - (8) Further, in this embodiment, only the position of the determination
subject space 70 is changed with the size thereof kept as it is. However, the position of the determination subject space 70 may also be changed by changing the size of the determination subject space 70. Specifically, in the case where the state in which the player 100 is out of the determination subject space 70 has continued for the predetermined time period, the position of the determination subject space 70 may be changed in such a manner that the determination subject space 70 is enlarged in a direction in which the player 100 is out of the determination subject space 70. - (9) Further, for example, in this embodiment, the
game device 20 makes a determination as to the movement of the player 100 based on the three-dimensional position information. However, the position detecting device 1 may make a determination as to the movement of the player 100. In this case, the determination criterion information (FIG. 18) is stored in the position detecting device 1. Specifically, a determination is made as to the movement of the player 100 by the position detecting device 1, and only information indicating a result of the determination is transmitted to the game device 20. - (10) Further, in the case where the
player 100 is out of the determination subject space 70, the display control processing performed by the display control unit 90 is not limited to this embodiment and the modified examples (FIG. 11, FIG. 13, FIG. 21, FIG. 23, FIG. 24, and FIG. 25). It is only necessary to perform such display control that notifies the player 100 that the player 100 is out of the determination subject space 70. For example, the entire determination subject space 70 may correspond to the entire display area of the game screen 50. Specifically, in the case where the player 100 is out of the determination subject space 70, the game character 51 may be made invisible on the game screen 50. - (11) Alternatively, for example, in the case where the
player 100 is not in the determination subject space 70, the display control unit 90 may perform predetermined image processing on an image contained in the game screen 50. Specifically, for example, noise processing may be performed on the game character 51, which is a focus target of the player 100, to thereby make the game character 51 less visible. - (12) Further, in this embodiment, the dance game has been described as an example of the game to be executed on the
game device 20. The game to be executed on the game device 20 may be any game as long as the movement of the player 100 is detected to execute the game processing, and thus the kind of the game to be executed is not limited thereto. Alternatively, for example, the game may be a sport game such as a soccer game, a fighting game, or the like. - While there have been described what are at present considered to be certain embodiments of the invention, it will be understood that various modifications may be made thereto, and it is intended that the appended claims cover all such modifications as fall within the true spirit and scope of the invention.
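The overlap-inhibition control of modification (4) can be sketched as follows, modeling each determination subject space 70 as an axis-aligned box. The helper names and the box representation are assumptions for illustration, not taken from the patent.

```python
# Sketch of modification (4): a proposed move of one player's determination
# subject space is rejected (inhibited) when the moved space would overlap
# the other player's space, since the two players might then hit each other.

def boxes_overlap(center_a, half_a, center_b, half_b) -> bool:
    # Two axis-aligned boxes overlap iff their extents overlap on every axis.
    return all(abs(center_a[i] - center_b[i]) < half_a[i] + half_b[i]
               for i in range(3))

def try_move_space(new_center, half_extents, other_center, other_half):
    """Return the new center if the change is allowed, else None (inhibited)."""
    if boxes_overlap(new_center, half_extents, other_center, other_half):
        return None  # keep the old space: the moved space would overlap
    return new_center

half = (0.5, 1.0, 0.5)
ok = try_move_space((3.0, 1.0, 2.0), half, (0.0, 1.0, 2.0), half)       # allowed
blocked = try_move_space((0.6, 1.0, 2.0), half, (0.0, 1.0, 2.0), half)  # inhibited
```

Returning `None` models the "means for inhibiting change" of the determination subject space changing unit 88 in the multi-player case.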
Claims (6)
1. A game device, comprising:
position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player;
determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space;
game processing execution means for executing game processing based on a result of the determination made by the determination means; and
determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
2. The game device according to claim 1 , wherein the determination subject space changing means comprises:
means for determining whether or not a state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for a reference period; and
means for changing the position of the determination subject space in a case where the state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for the reference period.
3. The game device according to claim 1 , further comprising display control means for causing display means to display a game screen containing a game character and a focused area having lightness thereof set higher than lightness of another area,
wherein the display control means comprises means for controlling a positional relation between a display position of the game character and a display position of the focused area based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.
4. The game device according to claim 1 , further comprising display control means for causing display means to display a game screen containing a first game character and a second game character,
wherein the display control means comprises means for controlling a positional relation between a display position of the first game character and a display position of the second game character based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.
5. A control method for a game device, comprising:
a position acquiring step of acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player;
a determination step of determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space;
a game processing execution step of executing game processing based on a result of the determination made in the determination step; and
a determination subject space changing step of changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
6. A non-transitory computer-readable information storage medium having a program recorded thereon, the program causing a computer to function as a game device comprising:
position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player;
determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space;
game processing execution means for executing game processing based on a result of the determination made by the determination means; and
determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
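The determination and changing means of claim 6 (and the corresponding steps of claim 5) amount to a per-frame loop: test whether the player's 3D position lies inside the determination subject space, branch the game processing on the result, and, when the player has left the space, move the space based on the player's position. A minimal sketch of that logic, assuming an axis-aligned box for the space and recentering on the player as one possible way to change its position (the claims do not fix either choice):

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Hypothetical determination subject space: an axis-aligned box."""
    center: tuple       # (x, y, z)
    half_extent: tuple  # (hx, hy, hz)

    def contains(self, p):
        # Player position is inside iff it is within the half extents
        # of the box along every axis.
        return all(abs(p[i] - self.center[i]) <= self.half_extent[i]
                   for i in range(3))

def update(box, player_pos):
    """One frame of the claimed loop: returns the (possibly moved)
    determination subject space and the containment result that the
    game processing would branch on."""
    if box.contains(player_pos):
        return box, True
    # Player left the space: change the space's position based on the
    # player's position (here, simply recenter it on the player).
    return Box(player_pos, box.half_extent), False
```

Starting from `Box((0, 0, 0), (1, 1, 1))`, a player at `(0.5, 0, 0)` is inside and the box is unchanged; a player at `(2, 0, 0)` is outside, so the box is moved to follow the player.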
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010059465A JP5039808B2 (en) | 2010-03-16 | 2010-03-16 | GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM |
JP2010-059465 | 2010-03-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110230266A1 true US20110230266A1 (en) | 2011-09-22 |
Family
ID=44647660
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/043,800 Abandoned US20110230266A1 (en) | 2010-03-16 | 2011-03-09 | Game device, control method for a game device, and non-transitory information storage medium |
Country Status (2)
Country | Link |
---|---|
US (1) | US20110230266A1 (en) |
JP (1) | JP5039808B2 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102008041411A1 (en) * | 2008-08-21 | 2010-06-10 | Hilti Aktiengesellschaft | Screw with sealing washer arrangement |
US9898675B2 (en) | 2009-05-01 | 2018-02-20 | Microsoft Technology Licensing, Llc | User movement tracking feedback to improve tracking |
JP5213913B2 (en) * | 2010-06-11 | 2013-06-19 | 株式会社バンダイナムコゲームス | Program and image generation system |
JP2012040209A (en) * | 2010-08-19 | 2012-03-01 | Konami Digital Entertainment Co Ltd | Game device, method of controlling game device, and program |
CN103635240B (en) | 2011-07-01 | 2015-12-16 | 英派尔科技开发有限公司 | Based on the safety approach of the game of posture |
US9390318B2 (en) | 2011-08-31 | 2016-07-12 | Empire Technology Development Llc | Position-setup for gesture-based game system |
US8657681B2 (en) | 2011-12-02 | 2014-02-25 | Empire Technology Development Llc | Safety scheme for gesture-based game system |
US8790179B2 (en) | 2012-02-24 | 2014-07-29 | Empire Technology Development Llc | Safety scheme for gesture-based game system |
JP6249817B2 (en) * | 2014-02-25 | 2017-12-20 | 株式会社ソニー・インタラクティブエンタテインメント | Image display system, image display method, image display program, and recording medium |
JP6167408B2 (en) * | 2015-07-30 | 2017-07-26 | 株式会社コナミデジタルエンタテインメント | GAME DEVICE AND PROGRAM |
JP6834197B2 (en) * | 2016-07-05 | 2021-02-24 | 株式会社リコー | Information processing equipment, display system, program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050221892A1 (en) * | 2004-03-31 | 2005-10-06 | Konami Corporation | Game device, computer control method, and computer-readable information storage medium |
US20060066723A1 (en) * | 2004-09-14 | 2006-03-30 | Canon Kabushiki Kaisha | Mobile tracking system, camera and photographing method |
US20090221374A1 (en) * | 2007-11-28 | 2009-09-03 | Ailive Inc. | Method and system for controlling movements of objects in a videogame |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3766981B2 (en) * | 1994-04-05 | 2006-04-19 | カシオ計算機株式会社 | Image control apparatus and image control method |
JP2000163178A (en) * | 1998-11-26 | 2000-06-16 | Hitachi Ltd | Interaction device with virtual character and storage medium storing program generating video of virtual character |
JP3637802B2 (en) * | 1999-03-23 | 2005-04-13 | ヤマハ株式会社 | Music control device |
- 2010-03-16 JP JP2010059465A patent/JP5039808B2/en active Active
- 2011-03-09 US US13/043,800 patent/US20110230266A1/en not_active Abandoned
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100051836A1 (en) * | 2008-08-27 | 2010-03-04 | Samsung Electronics Co., Ltd. | Apparatus and method of obtaining depth image |
US8217327B2 (en) * | 2008-08-27 | 2012-07-10 | Samsung Electronics Co., Ltd. | Apparatus and method of obtaining depth image |
US8520901B2 (en) | 2010-06-11 | 2013-08-27 | Namco Bandai Games Inc. | Image generation system, image generation method, and information storage medium |
US8854304B2 (en) | 2010-06-11 | 2014-10-07 | Namco Bandai Games Inc. | Image generation system, image generation method, and information storage medium |
US20140292642A1 (en) * | 2011-06-15 | 2014-10-02 | Ifakt Gmbh | Method and device for determining and reproducing virtual, location-based information for a region of space |
US20130059661A1 (en) * | 2011-09-02 | 2013-03-07 | Zeroplus Technology Co., Ltd. | Interactive video game console |
EP3052209A2 (en) * | 2013-09-30 | 2016-08-10 | Sony Interactive Entertainment Inc. | Camera based safety mechanisms for users of head mounted displays |
US10122985B2 (en) | 2013-09-30 | 2018-11-06 | Sony Interactive Entertainment Inc. | Camera based safety mechanisms for users of head mounted displays |
US10124257B2 (en) | 2013-09-30 | 2018-11-13 | Sony Interactive Entertainment Inc. | Camera based safety mechanisms for users of head mounted displays |
Also Published As
Publication number | Publication date |
---|---|
JP2011189066A (en) | 2011-09-29 |
JP5039808B2 (en) | 2012-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110230266A1 (en) | Game device, control method for a game device, and non-transitory information storage medium | |
US8740704B2 (en) | Game device, control method for a game device, and a non-transitory information storage medium | |
JP5320332B2 (en) | GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM | |
US8419545B2 (en) | Method and system for controlling movements of objects in a videogame | |
US10293252B2 (en) | Image processing device, system and method based on position detection | |
US8556716B2 (en) | Image generation system, image generation method, and information storage medium | |
JP5256269B2 (en) | Data generation apparatus, data generation apparatus control method, and program | |
EP2243525A2 (en) | Method and system for creating a shared game space for a networked game | |
US10755486B2 (en) | Occlusion using pre-generated 3D models for augmented reality | |
US11839811B2 (en) | Game processing program, game processing method, and game processing device | |
US20110305398A1 (en) | Image generation system, shape recognition method, and information storage medium | |
EP2394711A1 (en) | Image generation system, image generation method, and information storage medium | |
US20130194182A1 (en) | Game device, control method for a game device, and non-transitory information storage medium | |
JP5425940B2 (en) | GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM | |
US11073902B1 (en) | Using skeletal position to predict virtual boundary activation | |
US11195320B2 (en) | Feed-forward collision avoidance for artificial reality environments | |
JP5746644B2 (en) | GAME DEVICE AND PROGRAM | |
JP2012196286A (en) | Game device, control method for game device, and program | |
WO2022124135A1 (en) | Game program, game processing method, and game device | |
JP5715583B2 (en) | GAME DEVICE AND PROGRAM | |
JP2022090964A (en) | Game program, game processing method, and game device | |
JP2022090965A (en) | Game program, game processing method, and game device | |
CN114867536A (en) | Information processing apparatus, information processing method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: KONAMI DIGITAL ENTERTAINMENT CO., LTD., JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAMAGUCHI, TAKESHI;REEL/FRAME:025927/0391; Effective date: 20110221 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |