CN101479782B - Multi-input game control mixer - Google Patents

Multi-input game control mixer

Info

Publication number
CN101479782B
CN101479782B (application number CN200780016094A / CN200780016094XA)
Authority
CN
China
Prior art keywords
controller
data
information
game
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN200780016094XA
Other languages
Chinese (zh)
Other versions
CN101479782A (en)
Inventor
Gary M. Zalewski
Richard L. Marks
Xiao Dong Mao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment LLC
Original Assignee
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/381,729 external-priority patent/US7809145B2/en
Priority claimed from US11/418,988 external-priority patent/US8160269B2/en
Priority claimed from US11/418,989 external-priority patent/US8139793B2/en
Priority claimed from US11/381,727 external-priority patent/US7697700B2/en
Priority claimed from US11/381,721 external-priority patent/US8947347B2/en
Priority claimed from US11/381,725 external-priority patent/US7783061B2/en
Priority claimed from US11/429,047 external-priority patent/US8233642B2/en
Priority claimed from US11/381,728 external-priority patent/US7545926B2/en
Priority claimed from US11/429,133 external-priority patent/US7760248B2/en
Priority claimed from US11/381,724 external-priority patent/US8073157B2/en
Application filed by Sony Computer Entertainment America LLC filed Critical Sony Computer Entertainment America LLC
Priority claimed from PCT/US2007/067004 external-priority patent/WO2007130791A2/en
Publication of CN101479782A publication Critical patent/CN101479782A/en
Application granted granted Critical
Publication of CN101479782B publication Critical patent/CN101479782B/en
Legal status: Active

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L21/00 Processing of the speech or voice signal to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
    • G10L21/02 Speech enhancement, e.g. noise reduction or echo cancellation
    • G10L21/0208 Noise filtering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M9/00 Arrangements for interconnection not involving centralised switching
    • H04M9/08 Two-way loud-speaking telephone systems with means for conditioning the signal, e.g. for suppressing echoes for one or both directions of traffic
    • H04M9/082 Two-way loud-speaking telephone systems with means for conditioning the signal, e.g. for suppressing echoes for one or both directions of traffic, using echo cancellers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R3/00 Circuits for transducers, loudspeakers or microphones
    • H04R3/005 Circuits for transducers, loudspeakers or microphones for combining the signals of two or more microphones
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L21/00 Processing of the speech or voice signal to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
    • G10L21/02 Speech enhancement, e.g. noise reduction or echo cancellation
    • G10L21/0208 Noise filtering
    • G10L21/0216 Noise filtering characterised by the method used for estimating noise
    • G10L2021/02161 Number of inputs available containing the signal or the noise to be suppressed
    • G10L2021/02163 Only one microphone
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/03 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00, characterised by the type of extracted parameters
    • G10L25/18 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00, characterised by the type of extracted parameters, the extracted parameters being spectral information of each sub-band
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/20 Arrangements for obtaining desired frequency or directional characteristics
    • H04R1/32 Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
    • H04R1/40 Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
    • H04R1/406 Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers, microphones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2201/00 Details of transducers, loudspeakers or microphones covered by H04R1/00 but not provided for in any of its subgroups
    • H04R2201/40 Details of arrangements for obtaining desired directional characteristic by combining a number of identical transducers covered by H04R1/40 but not provided for in any of its subgroups
    • H04R2201/401 2D or 3D arrays of transducers

Abstract

The present invention discloses a system and a method for analyzing game-control input data, and a machine-readable medium carrying instructions for analyzing game-control input data.

Description

Multi-input game control mixer
CLAIM OF PRIORITY
This application claims priority to the following applications, all of which are incorporated herein by reference: U.S. Patent Application No. 11/381,729, to Xiao Dong Mao, entitled "ULTRA SMALL MICROPHONE ARRAY" (Attorney Docket SCEA05062US00), filed May 4, 2006; U.S. Patent Application No. 11/381,728, to Xiao Dong Mao, entitled "ECHO AND NOISE CANCELLATION" (Attorney Docket SCEA05064US00), filed May 4, 2006; U.S. Patent Application No. 11/381,725, to Xiao Dong Mao, entitled "METHODS AND APPARATUS FOR TARGETED SOUND DETECTION" (Attorney Docket SCEA05072US00), filed May 4, 2006; U.S. Patent Application No. 11/381,727, to Xiao Dong Mao, entitled "NOISE REMOVAL FOR ELECTRONIC DEVICE WITH FAR FIELD MICROPHONE ON CONSOLE" (Attorney Docket SCEA05073US00), filed May 4, 2006; U.S. Patent Application No. 11/381,724, to Xiao Dong Mao, entitled "METHODS AND APPARATUS FOR TARGETED SOUND DETECTION AND CHARACTERIZATION" (Attorney Docket SCEA05079US00), filed May 4, 2006; and U.S. Patent Application No. 11/381,721, to Xiao Dong Mao, entitled "SELECTIVE SOUND SOURCE LISTENING IN CONJUNCTION WITH COMPUTER INTERACTIVE PROCESSING" (Attorney Docket SCEA04005US00JUMBOUS), filed May 4, 2006.
This application also claims priority to: U.S. Patent Application No. 11/382,031, to Gary Zalewski et al., entitled "MULTI-INPUT GAME CONTROL MIXER" (Attorney Docket SCEA06MXR1), filed May 6, 2006; and U.S. Patent Application No. 11/382,032, to Gary Zalewski et al., entitled "SYSTEM FOR TRACKING USER MANIPULATIONS WITHIN AN ENVIRONMENT" (Attorney Docket SCEA06MXR2), filed May 6, 2006; both of which are incorporated herein by reference.
This application also claims the benefit of the following commonly assigned, co-pending applications, each of which is incorporated herein by reference in its entirety: U.S. Patent Application No. 11/418,988, to Xiao Dong Mao, entitled "METHODS AND APPARATUSES FOR ADJUSTING A LISTENING AREA FOR CAPTURING SOUNDS" (Attorney Docket SCEA-00300), filed May 4, 2006; U.S. Patent Application No. 11/418,989, to Xiao Dong Mao, entitled "METHODS AND APPARATUSES FOR CAPTURING AN AUDIO SIGNAL BASED ON VISUAL IMAGE" (Attorney Docket SCEA-00400), filed May 4, 2006; U.S. Patent Application No. 11/429,047, to Xiao Dong Mao, entitled "METHODS AND APPARATUSES FOR CAPTURING AN AUDIO SIGNAL BASED ON A LOCATION OF THE SIGNAL" (Attorney Docket SCEA-00500), filed May 4, 2006; U.S. Patent Application No. 11/429,133, to Richard Marks et al., entitled "SELECTIVE SOUND SOURCE LISTENING IN CONJUNCTION WITH COMPUTER INTERACTIVE PROCESSING" (Attorney Docket SCEA04005US01-SONYP045), filed May 4, 2006; and U.S. Patent Application No. 11/429,414, to Richard Marks et al., entitled "Computer Image and Audio Processing of Intensity and Input Devices for Interfacing With A Computer Program" (Attorney Docket SONYP052), filed May 4, 2006.
This application also claims the benefit of the following co-pending applications, all filed May 6, 2006, each of which is incorporated herein by reference in its entirety: U.S. Patent Application No. 11/382,033, entitled "SYSTEM, METHOD, AND APPARATUS FOR THREE-DIMENSIONAL INPUT CONTROL" (Attorney Docket SCEA06INRT1); U.S. Patent Application No. 11/382,035, entitled "INERTIALLY TRACKABLE HAND-HELD CONTROLLER" (Attorney Docket SCEA06INRT2); U.S. Patent Application No. 11/382,036, entitled "METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO VISUAL TRACKING" (Attorney Docket SONYP058A); U.S. Patent Application No. 11/382,041, entitled "METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO INERTIAL TRACKING" (Attorney Docket SONYP058B); U.S. Patent Application No. 11/382,038, entitled "METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO ACOUSTICAL TRACKING" (Attorney Docket SONYP058C); U.S. Patent Application No. 11/382,040, entitled "METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO MULTI-CHANNEL MIXED INPUT" (Attorney Docket SONYP058D); U.S. Patent Application No. 11/382,034, entitled "SCHEME FOR DETECTING AND TRACKING USER MANIPULATION OF A GAME CONTROLLER BODY" (Attorney Docket SCEA05082US00); U.S. Patent Application No. 11/382,037, entitled "SCHEME FOR TRANSLATING MOVEMENTS OF A HAND-HELD CONTROLLER INTO INPUTS FOR A SYSTEM" (Attorney Docket 86324); U.S. Patent Application No. 11/382,043, entitled "DETECTABLE AND TRACKABLE HAND-HELD CONTROLLER" (Attorney Docket 86325); U.S. Patent Application No. 11/382,039, entitled "METHOD FOR MAPPING MOVEMENTS OF A HAND-HELD CONTROLLER TO GAME COMMANDS" (Attorney Docket 86326); U.S. Design Patent Application No. 29/259,349, entitled "CONTROLLER WITH INFRARED PORT" (Attorney Docket SCEA06007US00); U.S. Design Patent Application No. 29/259,350, entitled "CONTROLLER WITH TRACKING SENSORS" (Attorney Docket SCEA06008US00); U.S. Provisional Patent Application No. 60/798,031, entitled "DYNAMIC TARGET INTERFACE" (Attorney Docket SCEA06009US00); and U.S. Design Patent Application No. 29/259,348, entitled "TRACKED CONTROLLER DEVICE" (Attorney Docket SCEA06010US00).
CROSS-REFERENCE TO RELATED APPLICATIONS
This application is related to: U.S. Patent Application No. 10/207,677, entitled "MAN-MACHINE INTERFACE USING A DEFORMABLE DEVICE", filed July 27, 2002; U.S. Patent Application No. 10/650,409, entitled "AUDIO INPUT SYSTEM", filed August 27, 2003; U.S. Patent Application No. 10/663,236, entitled "METHOD AND APPARATUS FOR ADJUSTING A VIEW OF A SCENE BEING DISPLAYED ACCORDING TO TRACKED HEAD MOTION", filed September 15, 2003; U.S. Patent Application No. 10/759,782, entitled "METHOD AND APPARATUS FOR LIGHT INPUT DEVICE", filed January 16, 2004; U.S. Patent Application No. 10/820,469, entitled "METHOD AND APPARATUS TO DETECT AND REMOVE AUDIO DISTURBANCES", filed April 7, 2004; U.S. Patent Application No. 11/301,673, entitled "METHOD FOR USING RELATIVE HEAD AND HAND POSITIONS TO ENABLE A POINTING INTERFACE VIA CAMERA TRACKING", filed December 12, 2005; and U.S. Provisional Patent Application No. 60/718,145, entitled "AUDIO, VIDEO, SIMULATION, AND USER INTERFACE PARADIGMS", filed September 15, 2005; the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD OF THE INVENTION
The present invention relates generally to man-machine interfaces, and more particularly to processing multi-channel input for tracking a user's manipulations of one or more controllers.
BACKGROUND OF THE INVENTION
Computer entertainment systems typically include a hand-held controller, game controller, or other controller. A user or player uses the controller to send commands or other instructions to the entertainment system to control the video game or other simulation being played. For example, the controller may be provided with a manipulator, such as a joystick, that is operated by the user. The manipulated variable of the joystick is converted from an analog value into a digital value, which is sent to the game machine console. The controller may also be provided with buttons that can be operated by the user.
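As a point of reference for this background, the following minimal sketch shows one way an analog stick deflection could be quantized to a digital value before being sent to the console; the function name, normalized input range, and 8-bit resolution are illustrative assumptions, not details taken from the patent.
    def quantize_stick(deflection: float, bits: int = 8) -> int:
        """Map an analog stick deflection in [-1.0, 1.0] to an unsigned digital value.

        Illustrative only: the patent states merely that the manipulated variable of
        the joystick is converted from an analog value to a digital value and sent
        to the console; the 8-bit range here is an assumption.
        """
        deflection = max(-1.0, min(1.0, deflection))       # clamp to the valid range
        levels = (1 << bits) - 1                           # e.g. 255 for 8 bits
        return round((deflection + 1.0) / 2.0 * levels)    # -1.0 -> 0, +1.0 -> 255

    print(quantize_stick(0.0))    # center -> approximately mid-scale (128)
    print(quantize_stick(-1.0))   # full deflection one way -> 0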
It is with respect to these and other background considerations that the present invention has evolved.
BRIEF DESCRIPTION OF THE DRAWINGS
The teachings of the present invention can readily be understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
Fig. 1 is a pictorial diagram of a video game system that operates in accordance with an embodiment of the present invention;
Fig. 2 is a perspective view of a controller made in accordance with an embodiment of the present invention;
Fig. 3 is a three-dimensional schematic diagram illustrating an accelerometer that may be used in a controller according to an embodiment of the present invention;
Fig. 4 is a block diagram of a system for mixing various control inputs according to an embodiment of the present invention;
Fig. 5A is a block diagram of a portion of the video game system of Fig. 1;
Fig. 5B is a flow diagram of a method for tracking a controller of a video game system according to an embodiment of the present invention;
Fig. 5C is a flow diagram illustrating a method of utilizing position and/or orientation information during game play in a video game system according to an embodiment of the present invention;
Fig. 6 is a block diagram of a video game system according to an embodiment of the present invention; and
Fig. 7 is a block diagram of a cell-processor implementation of a video game system according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Although the following detailed description contains many specific details for the purposes of illustration, anyone of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the invention. Accordingly, the exemplary embodiments of the invention described below are set forth without any loss of generality to, and without imposing limitations upon, the claimed invention.
The embodiments of the methods, apparatus, schemes and systems described herein provide for the detection, capture and tracking of the movements, motions and/or manipulations of the entire controller body itself by the user. The detected movements, motions and/or manipulations of the entire controller body by the user may be used as additional commands to control various aspects of the game or other simulation being played.
Detecting and tracking a user's manipulations of a game controller body may be accomplished in many ways. For example, an inertial sensor, such as an accelerometer or gyroscope, or an image capture unit, such as a digital camera, can be used with the computer entertainment system to detect motions of the hand-held controller body and transfer them into actions in the game. Examples of tracking a controller with an inertial sensor are described, e.g., in U.S. Patent Application No. 11/382,033, entitled "SYSTEM, METHOD, AND APPARATUS FOR THREE-DIMENSIONAL INPUT CONTROL" (Attorney Docket SCEA06INRT1), which is incorporated herein by reference. Examples of tracking a controller with an image capture unit are described, e.g., in U.S. Patent Application No. 11/382,034, entitled "SCHEME FOR DETECTING AND TRACKING USER MANIPULATION OF A GAME CONTROLLER BODY" (Attorney Docket SCEA05082US00), which is incorporated herein by reference. In addition, the controller and/or the user may also be tracked acoustically using a microphone array and appropriate signal processing. Examples of such acoustic tracking are described in U.S. Patent Application No. 11/381,721, which is incorporated herein by reference.
Acoustic sensing, inertial sensing and image capture can be used individually or in any combination to detect many different types of motions of the controller, such as up-and-down movements, twisting movements, side-to-side movements, jerking movements, wand-like motions, plunging motions, and the like. These motions may correspond to various commands such that the motions are transferred into actions in the game. Detecting and tracking the user's manipulations of a game controller body can be used to implement many different types of games, simulations, etc., that allow the user to, for example, engage in a sword or light-saber fight, trace the shape of an item with a wand, engage in many different types of sporting events, engage in on-screen fights or other encounters, and the like. A game program may be configured to track the motion of the controller and to recognize certain pre-recorded gestures from that tracked motion. Recognition of one or more of these gestures may trigger a change in the game state.
In embodiments of the present invention, the controller path information obtained from these different sources may be mixed prior to analysis for gesture recognition. The tracking data from the different sources (e.g., acoustic, inertial and image capture) may be mixed in a way that improves the likelihood that a gesture will be recognized.
Referring to Fig. 1, a system 100 operating in accordance with an embodiment of the present invention is illustrated. As shown, a computer entertainment console 102 may be coupled to a television or other video display 104 to display the images of the video game or other simulation thereon. The game or other simulation may be stored on a DVD, CD, flash memory, USB memory or other memory media 106 that is inserted into the console 102. A user or player 108 manipulates a game controller 110 to control the video game or other simulation. As seen in Fig. 2, the game controller 110 includes an inertial sensor 112 that produces signals in response to the position, motion, orientation or change in orientation of the game controller 110. In addition to the inertial sensor, the game controller 110 may include conventional control input devices, e.g., joysticks 111, buttons 113, R1, L1, and the like.
During operation, the user 108 physically moves the controller 110. For example, the controller 110 may be moved in any direction by the user 108, such as up, down, to one side, to the other side, twisted, rolled, shaken, jerked, plunged, and so on. These movements of the controller 110 itself may be detected and captured, e.g., by the camera 114 and/or by tracking through analysis of signals from the inertial sensor 112 in a manner described below.
Referring again to Fig. 1, the system 100 may optionally include a camera or other video image capture device 114, which may be positioned so that the controller 110 is within the camera's field of view 116. Analysis of images from the image capture device 114 may be used in conjunction with analysis of data from the inertial sensor 112. As shown in Fig. 2, the controller 110 may optionally be equipped with light sources such as light emitting diodes (LEDs) 202, 204, 206, 208 to facilitate tracking by video analysis. These may be mounted to the body of the controller 110. As used herein, the term "body" is meant to describe the part of the game controller 110 that one would hold by hand (or wear, if it were a wearable game controller).
Analysis of such video images for the purpose of tracking the controller 110 is described, e.g., in U.S. Patent Application No. 11/382,034, entitled "SCHEME FOR DETECTING AND TRACKING USER MANIPULATION OF A GAME CONTROLLER BODY" (Attorney Docket SCEA05082US00), which is incorporated herein by reference. The console 102 may include an acoustic transducer, such as a microphone array 118. The controller 110 may also include an acoustic signal generator 210 (e.g., a speaker) to provide a source of sound to facilitate acoustic tracking of the controller 110 with the microphone array 118 and appropriate acoustic signal processing, e.g., as described in U.S. Patent Application No. 11/381,724, which is incorporated herein by reference.
In general, signals from the inertial sensor 112 are used to generate position and orientation data for the controller 110. Such data may be used to calculate many physical aspects of the movement of the controller 110, such as its acceleration and velocity along any axis, its tilt, pitch, yaw, roll, as well as any telemetry points of the controller 110. As used herein, telemetry generally refers to remote measurement and reporting of information of interest to a system or to the system's designer or operator.
The ability to detect and track the movements of the controller 110 makes it possible to determine whether any predefined movements of the controller 110 are performed. That is, certain movement patterns or gestures of the controller 110 may be predefined and used as input commands for the game or other simulation. For example, a plunging downward gesture of the controller 110 may be defined as one command, a twisting gesture of the controller 110 may be defined as another command, a shaking gesture of the controller 110 may be defined as yet another command, and so on. In this way the manner in which the user 108 physically moves the controller 110 is used as another input for controlling the game, which provides a more stimulating and entertaining experience for the user.
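As an illustration of how predefined controller gestures could be bound to game commands, the following minimal sketch uses hypothetical gesture and command names; the specific bindings are assumptions for illustration and are not specified in the patent.
    # Hypothetical mapping of recognized controller gestures to game commands.
    GESTURE_COMMANDS = {
        "plunge_down": "ATTACK",
        "twist":       "BLOCK",
        "shake":       "RELOAD",
    }

    def command_for_gesture(gesture):
        """Return the game command bound to a recognized gesture, or None."""
        return GESTURE_COMMANDS.get(gesture)

    print(command_for_gesture("twist"))  # -> "BLOCK"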
By way of example, and without limitation, the inertial sensor 112 may be an accelerometer. Fig. 3 depicts an example of an accelerometer 300 in the form of a simple mass 302 elastically coupled to a frame 304 at four points, e.g., by springs 306, 308, 310, 312. A pitch axis and a roll axis (indicated by X and Y, respectively) lie in a plane that intersects the frame. A yaw axis Z is oriented perpendicular to the plane containing the pitch axis X and the roll axis Y. The frame 304 may be mounted to the controller 110 in any suitable fashion. As the frame 304 (and the joystick controller 110) accelerates and/or rotates, the mass 302 is displaced relative to the frame 304, and the springs 306, 308, 310, 312 elongate or compress in a way that depends on the amount and direction of translational and/or rotational acceleration and/or the angle of pitch and/or roll and/or yaw. The displacement of the mass 302 and/or the compression or elongation of the springs 306, 308, 310, 312 may be sensed, e.g., with appropriate sensors 314, 316, 318, 320, and converted in a known or determinable way into signals that depend on the amount of acceleration and/or pitch and/or roll.
There are a number of different ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge material, optical sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors, and the like. Embodiments of the invention may include any number and type, or combination of types, of sensors. By way of example, and without limitation, the sensors 314, 316, 318, 320 may be gap-closing electrodes placed above the mass 302. A capacitance between the mass and each electrode changes as the position of the mass changes relative to that electrode. Each electrode may be connected to a circuit that produces a signal related to the capacitance of the mass 302 relative to that electrode (and therefore to the proximity of the mass 302 to that electrode). In addition, the springs 306, 308, 310, 312 may include resistive strain gauge sensors that produce signals related to the compression or elongation of the springs.
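To make the mass-spring sensing concrete, here is a minimal sketch that converts a sensed displacement of the proof mass along one axis into an acceleration estimate via Hooke's law; the spring constant and mass values are assumptions chosen only for illustration, not parameters given in the patent.
    def acceleration_from_displacement(displacement_m, spring_constant=200.0, mass_kg=0.005):
        """Estimate acceleration along one axis of a mass-spring accelerometer.

        At equilibrium the restoring spring force balances the inertial force on
        the proof mass: k * x = m * a, so a = (k / m) * x.  The numeric values of
        k and m here are illustrative assumptions.
        """
        return (spring_constant / mass_kg) * displacement_m

    # A 0.1 mm displacement of the proof mass under the assumed k and m:
    print(acceleration_from_displacement(1e-4))  # -> 4.0 m/s^2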
In some embodiments, the frame 304 may be gimbal-mounted to the controller 110 so that the accelerometer 300 maintains a fixed orientation with respect to the pitch and/or roll and/or yaw axes. In such a manner, the controller axes X, Y, Z may be mapped directly to corresponding axes in real space without having to take into account a tilting of the controller axes with respect to the real-space coordinate axes.
As discussed above, data from inertial, image capture and acoustic sources may be analyzed to generate a path that tracks the position and/or orientation of the controller 110. As shown in the block diagram of Fig. 4, a system 400 according to an embodiment of the invention may include an inertial analyzer 402, an image analyzer 404 and an acoustic analyzer 406. Each of these analyzers receives signals from a sensed environment 401. The analyzers 402, 404, 406 may be implemented in hardware, in software (or firmware), or in some combination of two or more of these. Each of the analyzers produces tracking information related to the position and/or orientation of an object of interest. By way of example, the object of interest may be the controller 110 referred to above. The image analyzer 404 may operate in connection with the methods described in U.S. Patent Application No. 11/382,034 (Attorney Docket SCEA05082US00) and with respect to the methods described below. The inertial analyzer 402 may operate in connection with the methods described in U.S. Patent Application No. 11/382,033, entitled "SYSTEM, METHOD, AND APPARATUS FOR THREE-DIMENSIONAL INPUT CONTROL" (Attorney Docket SCEA06INRT1), and with respect to the methods described below. The acoustic analyzer 406 may operate in connection with the methods described in U.S. Patent Application No. 11/381,724 and with respect to the methods described below.
The analyzers 402, 404 and 406 may be regarded as being associated with different channels of input of position and/or orientation information. The mixer 408 may accept multiple input channels, and such channels may contain sample data characterizing the sensed environment 401, typically from the perspective of that channel. The position and/or orientation information generated by the inertial analyzer 402, the image analyzer 404 and the acoustic analyzer 406 may be coupled into the input of a mixer 408. The mixer 408 and the analyzers 402, 404, 406 may be queried by a game software program 410 and may be configured to interrupt the game software in response to events. Events may include gesture recognition events, gearing changes, configuration changes, setting noise levels, setting sampling rates, changing mapping chains, and the like, examples of which are discussed below. The mixer 408 may operate in connection with the methods described herein.
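The paragraph above describes the game software querying the mixer and being interrupted by it when events occur. The sketch below shows one plausible callback-registration scheme for such events; the class, method names and event strings are assumptions used only to illustrate this interaction, not an API defined in the patent.
    class Mixer:
        """Minimal sketch of a mixer that can interrupt game software on events."""

        def __init__(self):
            self._handlers = {}  # event name -> list of game-software callbacks

        def on(self, event, handler):
            """Register a callback for an event (e.g. 'gesture_recognized', 'gearing_change')."""
            self._handlers.setdefault(event, []).append(handler)

        def _raise(self, event, **data):
            """Invoke all callbacks registered for an event (the 'interrupt')."""
            for handler in self._handlers.get(event, []):
                handler(**data)

    mixer = Mixer()
    mixer.on("gesture_recognized", lambda name: print("game notified of gesture:", name))
    mixer._raise("gesture_recognized", name="twist")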
As noted above, signals from different input channels, e.g., the inertial sensor, video images and/or acoustic sensors, may be analyzed by the inertial analyzer 402, the image analyzer 404 and the acoustic analyzer 406, respectively, to determine the motion and/or orientation of the controller 110 during play of a video game according to an inventive method. Such a method may be implemented as a series of processor-executable program code instructions stored in a processor-readable medium and executed on a digital processor. For example, as depicted in Fig. 5A, the video game system 100 may include, on the console 102, the inertial analyzer 402, the image analyzer 404 and the acoustic analyzer 406 implemented either in hardware or in software. By way of example, the analyzers 402, 404, 406 may be implemented as software instructions running on a suitable processor unit 502. By way of example, the processor unit 502 may be a digital processor, e.g., a microprocessor of a type commonly used in video game consoles. A portion of the instructions may be stored in a memory 506. Alternatively, the inertial analyzer 402, the image analyzer 404 and the acoustic analyzer 406 may be implemented in hardware, e.g., as an application-specific integrated circuit (ASIC). Such analyzer hardware may be located on the controller 110 or on the console 102, or may be remotely located elsewhere. In hardware implementations, the analyzers 402, 404, 406 may be programmable in response to external signals, e.g., from the processor 502 or some other remotely located source, e.g., connected by a USB cable, wireless connection, or over a network.
The inertial analyzer 402 may include or implement instructions that analyze the signals generated by the inertial sensor 112 and utilize information regarding the position and/or orientation of the controller 110. Similarly, the image analyzer 404 may implement instructions that analyze images captured by the image capture unit 114. In addition, the acoustic analyzer may implement instructions that analyze sounds captured by the microphone array 118. As shown in the flow diagram 510 of Fig. 5B, these signals and/or images may be received by the analyzers 402, 404, 406, as indicated at block 512. The signals and/or images may be analyzed by the analyzers 402, 404, 406 to determine inertial tracking information 403, image tracking information 405 and acoustic tracking information 407 regarding the position and/or orientation of the controller 110, as indicated at block 514. The tracking information 403, 405, 407 may be related to one or more degrees of freedom. It is preferred that six degrees of freedom be tracked to characterize the manipulation of the controller 110 or other tracked object. Such degrees of freedom may relate to the controller's tilt, yaw and roll, and to position, velocity or acceleration along the x, y and z axes.
As indicated at block 516, the mixer 408 mixes the inertial information 403, image information 405 and acoustic information 407 to generate refined position and/or orientation information 409. By way of example, the mixer 408 may apply different weights to the inertial, image and acoustic tracking information 403, 405, 407 based on game or environmental conditions and then take a weighted average. In addition, the mixer 408 may include its own mixer analyzer 412 that analyzes the combined position/orientation information and generates its own resulting "mixer" information involving combinations of the information generated by the other analyzers.
In an embodiment of the present invention, the mixer 408 may assign distribution values to the tracking information 403, 405, 407 from the analyzers 402, 404, 406. As noted above, certain sets of input control data may be averaged. In the present embodiment, however, the input control data is assigned a value prior to averaging, whereby the input control data from some analyzers is of more importance than that from others.
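A minimal sketch of the distribution-value idea, under stated assumptions: each analyzer's estimate of the same tracking parameter is assigned a weight before averaging, so that some channels count for more than others. The particular numbers and parameter names are placeholders, not values from the patent.
    def weighted_mix(estimates, weights):
        """Combine per-channel estimates of one tracking parameter into a weighted mean.

        estimates: dict of channel name -> value (e.g. x-axis acceleration)
        weights:   dict of channel name -> distribution value assigned by the mixer
        """
        total = sum(weights.get(ch, 0.0) for ch in estimates)
        if total == 0.0:
            raise ValueError("all channels have zero weight")
        return sum(estimates[ch] * weights.get(ch, 0.0) for ch in estimates) / total

    # Hypothetical x-acceleration readings from the three analyzers:
    accel_x = {"inertial": 2.0, "image": 2.6, "acoustic": 1.0}
    weights = {"inertial": 0.9, "image": 0.1, "acoustic": 0.0}
    print(weighted_mix(accel_x, weights))  # -> 2.06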
The mixer 408 may take on a number of functionalities in the context of the present system, including observation, correction, stabilization, derivation, combination, routing, mixing, reporting, buffering, interrupting other processes, and analysis. These may be performed with respect to the tracking information 403, 405, 407 received from one or more of the analyzers 402, 404, 406. While each of the analyzers 402, 404, 406 may receive and/or derive certain tracking information, the mixer 408 may be implemented to optimize the use of the received tracking information 403, 405, 407 and generate refined tracking information 409.
The analyzers 402, 404, 406 and the mixer 408 are preferably configured to provide tracking information in a similar output format. Tracking information parameters from any of the analyzer elements 402, 404, 406 may be mapped to a single parameter in an analyzer. Alternatively, the mixer 408 may form tracking information for any of the analyzers 402, 404, 406 by processing one or more tracking information parameters from one or more of the analyzers 402, 404, 406. The mixer may combine two or more elements of tracking information of the same parameter type taken from the analyzers 402, 404, 406, and/or perform functions across multiple parameters of tracking information generated by those analyzers, to create a synthetic output set having benefits that arise from being generated from multiple input channels.
The refined tracking information 409 may be used during play of a video game with the system 100, as indicated at block 518. In certain embodiments, the position and/or orientation information may be used in relation to gestures made by the user 108 during game play. In some embodiments, the mixer 408 may operate in conjunction with a gesture recognizer 505 to associate at least one action in a game environment with one or more user actions from the user (e.g., manipulation of the controller in space).
As shown in the flow diagram 520 of Fig. 5C, a path of the controller 110 may be tracked using the position and/or orientation information, as indicated at block 522. By way of example, and without limitation, the path may include a set of points representing the position of the center of mass of the controller with respect to some coordinate system. Each position point may be represented by one or more coordinates, e.g., X, Y and Z coordinates in a Cartesian coordinate system. A time may be associated with each point on the path so that both the shape of the path and the progress of the controller along the path may be monitored. In addition, each point in the set may have associated with it data representing the orientation of the controller, e.g., one or more angles of rotation of the controller about its center of mass. Furthermore, each point on the path may have associated with it values of the velocity and acceleration of the center of mass of the controller, and of the rate of angular rotation and angular acceleration of the controller about its center of mass.
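One way to represent the tracked path just described is a time-stamped sequence of samples, each carrying position, orientation and their rates; the field names and types below are illustrative assumptions rather than a structure defined in the patent.
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class PathSample:
        """One point on the tracked path of the controller's center of mass."""
        t: float                                  # time associated with the point
        position: Tuple[float, float, float]      # X, Y, Z in some reference frame
        orientation: Tuple[float, float, float]   # pitch, roll, yaw about the center of mass
        velocity: Tuple[float, float, float]      # linear velocity of the center of mass
        acceleration: Tuple[float, float, float]  # linear acceleration
        angular_velocity: Tuple[float, float, float]

    @dataclass
    class ControllerPath:
        samples: List[PathSample] = field(default_factory=list)

        def add(self, sample: PathSample) -> None:
            self.samples.append(sample)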
As indicated at block 524, the tracked path may be compared to one or more stored paths corresponding to known and/or pre-recorded gestures 508 that are relevant to the context of the video game being played. The recognizer 505 may be configured to recognize a user, to process audio-authenticated gestures, and the like. For example, a user may be identified by the gesture recognizer 505 through a gesture, and a gesture may be specific to a user. Such a specific gesture may be recorded and included among the pre-recorded gestures 508 stored in the memory 506. The recordation process may optionally store audio generated during recordation of a gesture. The sensed environment is sampled into a multi-channel analyzer and processed. The processor may reference gesture models to determine, authenticate and/or identify a user or objects based on voice or acoustic patterns, and may do so with a high degree of accuracy and performance.
As shown in Fig. 5A, data 508 representing the gestures may be stored in the memory 506. Examples of gestures include, but are not limited to: throwing an object such as a ball; swinging an object such as a bat or golf club; pumping a hand pump; opening or closing a door or window; turning a steering wheel or other vehicle control; martial arts moves such as punches; sanding movements; wax-on/wax-off; painting the house; shaking; rattling; rolling; football pitches; turning-knob movements; 3D mouse movements; scrolling movements; movements with known profiles; any recordable movement; movements back and forth along any vector, i.e., pumping a tire but in any direction in space; movements along a path; movements having precise stop and start times; any time-based user manipulation that can be recorded, tracked and repeated within the noise floor; splines; and the like. Each of these gestures may be pre-recorded from path data and stored as a time-based model. Comparison of the path to the stored gestures may start with an assumption of a steady state; if the path deviates from the steady state, the path can be compared to the stored gestures by a process of elimination. If there is no match at block 526, the analyzer may continue tracking the path of the controller 110 at block 522. If there is a sufficient match between the path (or a portion thereof) and a stored gesture, the state of the game may be changed, as indicated at block 528. Changes of game state may include, but are not limited to, interrupts, sending control signals, changing variables, and the like.
Here is one example of how this may occur. Upon determining that the controller 110 has left a steady state, the path analyzer 402, 404, 406 or 412 tracks the movement of the controller 110. As long as the path of the controller 110 complies with a path defined in the stored gesture models 508, those gestures are possible "hits." If the path of the controller 110 deviates (within the noise tolerance setting) from any gesture model 508, that gesture model is removed from the hit list. Each gesture reference model includes a time base in which the gesture is recorded. The analyzer 402, 404, 406 or 412 compares the controller path data to the stored gestures 508 at the appropriate time index. The occurrence of a steady-state condition resets the clock. When the controller deviates from the steady state (i.e., when movements are tracked outside of the noise threshold), the hit list is populated with all potential gesture models. The clock is started and movements of the controller are compared against the hit list. Again, the comparison is a walk through time. If any gesture in the hit list reaches the end of its gesture, then it is a hit.
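The hit-list procedure described above can be sketched as follows: on leaving steady state every stored gesture model becomes a candidate, the clock starts, and at each time index any candidate whose recorded path deviates from the tracked path by more than the noise tolerance is eliminated; a candidate that survives to the end of its model is a hit. The one-value-per-time-index model format and the tolerance value are simplifying assumptions for illustration.
    def match_gestures(tracked_path, gesture_models, noise_tolerance=0.1):
        """Walk the tracked path through time against pre-recorded gesture models.

        tracked_path:   list of samples, one value per time index (simplified)
        gesture_models: dict of gesture name -> list of recorded samples (same time base)
        Returns the names of gestures whose full model is matched ("hits").
        """
        hit_list = set(gesture_models)               # all gestures are initially possible
        hits = []
        for t, value in enumerate(tracked_path):     # the comparison is a walk through time
            for name in list(hit_list):
                model = gesture_models[name]
                if t >= len(model):
                    continue                          # this model already reached its end
                if abs(model[t] - value) > noise_tolerance:
                    hit_list.discard(name)            # deviated: remove from the hit list
                elif t == len(model) - 1:
                    hits.append(name)                 # reached the end of the gesture: a hit
        return hits

    models = {"shake": [0.0, 0.5, -0.5, 0.5], "twist": [0.0, 0.2, 0.4, 0.6]}
    print(match_gestures([0.0, 0.52, -0.45, 0.48], models))  # -> ['shake']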
In certain embodiments, the mixer 408 and/or the individual analyzers 402, 404, 406, 412 may inform the game program when certain events occur. Examples of such events include the following:
INTERRUPT ZERO-ACCELERATION POINT REACHED (X AND/OR Y AND/OR Z AXIS). In certain game situations the analyzer may notify or interrupt a routine within the game program when the acceleration of the controller changes at an inflection point. For example, the user 108 may use the controller 110 to control a game avatar representing a quarterback in a football simulation game. The analyzer may track the controller (representing the football) via a path generated from the signals from the inertial sensor 112. A particular change in acceleration of the controller 110 may signal release of the football. At this point, the analyzer may trigger another routine within the program (e.g., a physics simulation package) to simulate the trajectory of the football based on the position, velocity and/or orientation of the controller at the point of release.
INTERRUPT NEW GESTURE RECOGNIZED.
In addition, the analyzer may be configured by one or more inputs. Examples of such inputs include, but are not limited to, the following (a brief configuration sketch follows these examples):
SET NOISE LEVEL (X, Y OR Z AXIS). The noise level may be a reference tolerance used when analyzing jitter of the user's hands in the game.
SET SAMPLING RATE. As used herein, the sampling rate may refer to how often the analyzer samples the signals from the inertial sensor. The sampling rate may be set to oversample or to average the signal.
SET GEARING. As used herein, gearing generally refers to the ratio of controller movement to the movement that occurs within the game. Examples of such "gearing" in the context of video game control may be found in U.S. Patent Application No. 11/382,040, filed May 7, 2006 (Attorney Docket SONYP058D), which is incorporated herein by reference.
SET MAPPING CHAIN. As used herein, a mapping chain refers to a map of gesture models. Gesture model maps may be made for a specific input channel (e.g., for path data generated from inertial sensor signals only) or for a hybrid channel formed in a mixer unit. Three input channels may be served by two or more different analyzers that are similar to the inertial analyzer 402. Specifically, these may include: the inertial analyzer 402 as described herein; a video analyzer as described in U.S. Patent Application No. 11/382,034, entitled "SCHEME FOR DETECTING AND TRACKING USER MANIPULATION OF A GAME CONTROLLER BODY" (Attorney Docket SCEA05082US00), which is incorporated herein by reference; and an acoustic analyzer as described in U.S. Patent Application No. 11/381,721, which is incorporated herein by reference. The analyzers may be configured with a mapping chain. Mapping chains may be swapped out by the game during game play, as may settings for the analyzer and for the mixer.
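As noted above, here is a brief sketch of how a game might hold the analyzer settings just listed (noise level, sampling rate, gearing, mapping chain); the class, method names and default values are assumptions used only for illustration, not an API from the patent.
    class AnalyzerConfig:
        """Illustrative container for the analyzer settings described above."""

        def __init__(self):
            self.noise_level = {"x": 0.0, "y": 0.0, "z": 0.0}  # jitter tolerance per axis
            self.sampling_rate_hz = 100.0   # how often inertial signals are sampled (assumed default)
            self.gearing = 1.0              # ratio of controller movement to in-game movement
            self.mapping_chain = []         # gesture-model maps, per channel or hybrid

        def set_noise_level(self, axis, level):
            self.noise_level[axis] = level

        def set_gearing(self, ratio):
            self.gearing = ratio

        def set_mapping_chain(self, chain):
            self.mapping_chain = list(chain)

    cfg = AnalyzerConfig()
    cfg.set_noise_level("x", 0.05)
    cfg.set_gearing(2.0)                    # in-game motion is twice the controller motion
    cfg.set_mapping_chain(["inertial_gestures", "hybrid_gestures"])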
Referring again to Fig. 5B, block 512, those of skill in the art will recognize that there are a number of ways to generate signals from the inertial sensor 112. A few examples have been described herein. Referring to block 514, there are numerous ways to analyze the sensor signals generated in block 512 to obtain tracking information relating to the position and/or orientation of the controller 110. By way of example, and without limitation, the tracking information may include, individually or in any combination, information regarding the following parameters:
CONTROLLER ORIENTATION. The orientation of the controller 110 may be expressed in terms of pitch, roll or yaw angle relative to some reference orientation (e.g., in radians). Rates of change of controller orientation (e.g., angular velocities or angular accelerations) may also be included in the position and/or orientation information. Where the inertial sensor 112 includes a gyroscopic sensor, for example, controller orientation information may be obtained directly in the form of one or more output values proportional to the pitch, roll or yaw angle.
CONTROLLER POSITION (e.g., Cartesian coordinates X, Y, Z of the controller 110 in some frame of reference)
CONTROLLER X-AXIS VELOCITY
CONTROLLER Y-AXIS VELOCITY
CONTROLLER Z-AXIS VELOCITY
CONTROLLER X-AXIS ACCELERATION
CONTROLLER Y-AXIS ACCELERATION
CONTROLLER Z-AXIS ACCELERATION
It should be noted that, with respect to position, velocity and acceleration, the position and/or orientation information may be expressed in terms of coordinate systems other than Cartesian. For example, cylindrical or spherical coordinates may be used for position, velocity and acceleration. Acceleration information with respect to the X, Y and Z axes may be obtained directly from an accelerometer-type sensor, e.g., as described herein. The X, Y and Z accelerations may be integrated with respect to time from some initial instant to determine changes in the X, Y and Z velocities. These velocities may be computed by adding the velocity changes to the known values of the X, Y and Z velocities at the initial instant. The X, Y and Z velocities may in turn be integrated with respect to time to determine the X, Y and Z displacements of the controller, and the X, Y and Z positions may be determined by adding these displacements to the known X, Y and Z positions at the initial instant. (A numerical sketch of this integration chain appears after the list of parameters below.)
STEADY STATE Y/N. This particular information indicates whether the controller is in a steady state, which may be defined as any position and which may be subject to change. In a preferred embodiment, the steady-state position may be one in which the controller is held in a more or less level orientation at a height roughly even with the user's waist.
"TIME SINCE LAST STEADY STATE" generally refers to data related to how much time has passed since a steady state (as referenced above) was last detected. That determination of time may, as previously noted, be calculated in real time, by processor cycles, or by sampling periods. The "Time Since Last Steady State" data may be important with regard to resetting the tracking of the controller with respect to an initial point to ensure the accuracy of character or object mapping in the game environment. This data may also be important with regard to determining the available actions/gestures that might subsequently be executed in the game environment (both exclusively and inclusively).
"LAST GESTURE RECOGNIZED" generally refers to the last gesture recognized by the gesture recognizer 505 (which may be implemented in hardware or software). The identification of the last gesture recognized may be important because a previous gesture may be related to the possible gestures that may subsequently be recognized, or to some other action that takes place in the game environment.
TIME LAST GESTURE RECOGNIZED
The above outputs may be sampled at any time by a game program or software.
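As promised above, the integration chain from accelerations to velocities to positions can be sketched numerically as follows, here with simple rectangular integration over a fixed time step; the step size and sample values are illustrative.
    def integrate_motion(accelerations, dt, v0=(0.0, 0.0, 0.0), p0=(0.0, 0.0, 0.0)):
        """Integrate per-axis accelerations to velocities and positions.

        accelerations: sequence of (ax, ay, az) samples taken every dt seconds
        v0, p0:        known velocity and position at the initial instant
        Returns (velocity, position) after the last sample (rectangular rule).
        """
        vx, vy, vz = v0
        px, py, pz = p0
        for ax, ay, az in accelerations:
            vx, vy, vz = vx + ax * dt, vy + ay * dt, vz + az * dt   # dv = a * dt
            px, py, pz = px + vx * dt, py + vy * dt, pz + vz * dt   # dx = v * dt
        return (vx, vy, vz), (px, py, pz)

    # Constant 1 m/s^2 acceleration along X for 1 second, sampled at 100 Hz:
    v, p = integrate_motion([(1.0, 0.0, 0.0)] * 100, dt=0.01)
    print(round(v[0], 3), round(p[0], 3))  # velocity ~1.0 m/s, displacement ~0.5 m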
In one embodiment of the present invention, the mixer 408 may assign distribution values to the tracking information 403, 405, 407 from the analyzers 402, 404, 406. As noted above, certain sets of input control data may be averaged. In the present embodiment, however, the input control data is assigned a value prior to averaging, whereby the input control data from some analyzers is of more importance than that from others.
For example, the mixer 408 may require tracking information related to acceleration and steady state. The mixer 408 would then receive the tracking information 403, 405, 407 as described above. The tracking information may include parameters relating to acceleration and steady state, e.g., as described above. Prior to averaging the data representing this information, the mixer 408 may assign distribution values to the tracking information data sets 403, 405, 407. For example, the x- and y-acceleration parameters obtained from the inertial analyzer 402 may be weighted at 90%. The x- and y-acceleration data obtained from the image analyzer 404, however, may be weighted at only 10%. The acoustic analyzer tracking information 407, as it pertains to the acceleration parameters, may be weighted at 0%; that is, the data is given no weight.
Similarly, the z-axis tracking information parameter obtained from the inertial analyzer 402 may be weighted at 10%, while the image analyzer z-axis tracking information may be weighted at 90%. The acoustic analyzer tracking information 407 may again be weighted at 0%, but the steady-state tracking information from the acoustic analyzer 406 may be weighted at 100%, with the tracking information from the other analyzers weighted at 0%.
After the appropriate distribution weights have been assigned, the input control data may be averaged in conjunction with those weights to arrive at a weighted-average input control data set, which is subsequently analyzed by the gesture recognizer 505 and associated with a particular action in the game environment. The associated values may be predefined by the mixer 408 or by a particular game title. The values may also be the result of the mixer 408 identifying particular qualities of the data coming from the various analyzers and adjusting them dynamically, as is further discussed below. The adjustment may also be the result of building a historical knowledge base of when particular data is of particular value in a particular environment and/or in response to the particularities of a given game title.
The mixer 408 may be configured to operate dynamically during game play. For example, as the mixer 408 receives various input control data, it may recognize that certain data is consistently outside an acceptable range or quality, or reflects corrupt data that may be indicative of a processing error at the related input device.
In addition, certain conditions in the real-world environment might change. For example, the natural light in the user's at-home game environment might increase from morning to afternoon, causing problems with the capture of image data. Further, the neighborhood or household might become noisier as the day goes on, causing problems with the capture of audio data. Likewise, if a user has been playing for several hours, their reflexes may become less sharp, causing problems with the interpretation of inertial data.
In these instances, or in any other instance where the quality of a particular form of input control data is in question, the mixer 408 may dynamically reassign the distribution weights for a particular set of data coming from a particular device, such that more or less importance is given to that particular input control data, as described above. Similarly, the game environment may change over the course of the game such that the needs of a particular game change, thereby requiring a reassignment of weights or a need for particular input control data.
Similarly, the mixer 408 may recognize, based on processing errors or feedback data generated by the gesture recognizer 505, that certain data being passed on to the gesture recognizer 505 is being processed incorrectly, slowly, or not at all. In response to this feedback, or in recognition of these processing difficulties (e.g., while the image analysis data is within an acceptable range, errors result when the gesture recognizer 505 attempts an association), the mixer 408 may adjust which input control data it seeks, from which analyzers, and when, if at all. The mixer 408 may also require that certain analyses and processing of the input control data be performed by the appropriate analyzer before the data is passed to the mixer 408 for re-processing (e.g., averaging of the data), thereby obtaining a further layer of assurance that the data passed to the gesture recognizer 505 will be processed effectively and appropriately.
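A minimal sketch of the dynamic behavior described above, under assumed thresholds and adjustment factors: when a channel's data falls outside an acceptable range, or the gesture recognizer reports repeated errors traced to that channel, the mixer lowers that channel's distribution weight and renormalizes.
    def reweight(weights, channel_quality, error_feedback, floor=0.0):
        """Reassign distribution weights in response to data quality and recognizer feedback.

        weights:         dict channel -> current distribution weight
        channel_quality: dict channel -> fraction of recent samples inside the acceptable range
        error_feedback:  set of channels the gesture recognizer has flagged as problematic
        """
        adjusted = {}
        for ch, w in weights.items():
            w *= channel_quality.get(ch, 1.0)   # scale down low-quality channels
            if ch in error_feedback:
                w *= 0.5                        # halve channels flagged by the recognizer (assumed factor)
            adjusted[ch] = max(w, floor)
        total = sum(adjusted.values()) or 1.0
        return {ch: w / total for ch, w in adjusted.items()}  # renormalize to sum to 1

    weights = {"inertial": 0.9, "image": 0.1, "acoustic": 0.0}
    quality = {"inertial": 1.0, "image": 0.4, "acoustic": 1.0}  # e.g. afternoon glare hurts image capture
    print(reweight(weights, quality, error_feedback={"image"}))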
In certain embodiments, the mixer 408 may recognize that certain data is corrupt, invalid, or outside a particular variable, and may call upon particular input control data or a variable related to that data so that it may replace the incorrect data or properly analyze and calculate certain data with respect to the necessary variable.
According to embodiments of the present invention, a video game system and method of the type described above may be implemented as depicted in Fig. 6. A video game system 600 may include a processor 601 and a memory 602 (e.g., RAM, DRAM, ROM, and the like). In addition, the video game system 600 may have multiple processors 601 if parallel processing is to be implemented. The memory 602 includes data and game program code 604, which may include portions configured as described above. Specifically, the memory 602 may include inertial signal data 606, which may include the stored controller path information described above. The memory 602 may also contain stored gesture data 608, e.g., data representing one or more gestures relevant to the game program 604. Coded instructions executed on the processor 601 may implement a multi-input mixer 605, which may be configured and may function as described above.
The system 600 may also include well-known support functions 610, such as input/output (I/O) elements 611, power supplies (P/S) 612, a clock (CLK) 613 and a cache 614. The apparatus 600 may optionally include a mass storage device 615, such as a disk drive, CD-ROM drive, tape drive, or the like, to store programs and/or data. The controller may also optionally include a display unit 616 and a user interface unit 618 to facilitate interaction between the controller 600 and a user. The display unit 616 may be in the form of a cathode ray tube (CRT) or flat panel screen that displays text, numerals, graphical symbols or images. The user interface 618 may include a keyboard, mouse, joystick, light pen or other device. In addition, the user interface 618 may include a microphone, video camera or other signal transducing device to provide for direct capture of a signal to be analyzed. The processor 601, memory 602 and other components of the system 600 may exchange signals (e.g., code instructions and data) with each other via a system bus 620, as shown in FIG. 6.
A microphone array 622 may be coupled to the system 600 through the I/O functions 611. The microphone array may include between about 2 and about 8 microphones, preferably about 4, with neighboring microphones separated by a distance of less than about 4 centimeters, preferably between about 1 and about 2 centimeters. Preferably, the microphones in the array 622 are omni-directional microphones. An optional image capture unit 623 (e.g., a digital camera) may be coupled to the apparatus 600 through the I/O functions 611. One or more pointing actuators 625 mechanically coupled to the camera may exchange signals with the processor 601 via the I/O functions 611.
As used herein, the term I/O generally refers to any program, operation or device that transfers data to or from the system 600 and to or from a peripheral device. Every data transfer may be regarded as an output from one device and an input into another. Peripheral devices include input-only devices, such as keyboards and mice; output-only devices, such as printers; and devices such as a writable CD-ROM that can act as both an input and an output device. The term "peripheral device" includes external devices, such as a mouse, keyboard, printer, monitor, microphone, game controller, camera, external storage drive or scanner, as well as internal devices, such as a CD-ROM drive, CD-R drive or internal modem, or other peripherals such as a flash memory reader/writer or hard drive.
In certain embodiments of the invention, the apparatus 600 may be a video game unit, which may include a controller 630 coupled to the processor via the I/O functions 611, either through wires (e.g., a USB cable) or wirelessly. The controller 630 may have analog joystick controls 631 and conventional buttons 633 that provide control signals commonly used during the playing of video games. Such video games may be implemented as processor readable data and/or instructions from the program 604, which may be stored in the memory 602 or another processor readable medium, such as one associated with the mass storage device 615.
The joystick controls 631 may generally be configured so that moving a control stick left or right signals movement along the X axis, and moving it forward (up) or back (down) signals movement along the Y axis. In joysticks that are configured for three-dimensional movement, twisting the stick to the left (counter-clockwise) or to the right (clockwise) may signal movement along the Z axis. These three axes, X, Y and Z, are often referred to as roll, pitch and yaw, respectively, particularly in relation to an aircraft.
In addition to conventional features, the controller 630 may include one or more inertial sensors 632, which may provide position and/or orientation information to the processor 601 via an inertial signal. Orientation information may include angular information such as the pitch, roll or yaw of the controller 630. By way of example, the inertial sensors 632 may include any number and/or combination of accelerometers, gyroscopes or tilt sensors. In a preferred embodiment, the inertial sensors 632 include a tilt sensor adapted to sense the orientation of the joystick controller with respect to pitch and roll axes, a first accelerometer adapted to sense acceleration along a yaw axis, and a second accelerometer adapted to sense angular acceleration with respect to the yaw axis. An accelerometer may be implemented, e.g., as a MEMS device including a mass mounted by one or more springs, with sensors for sensing displacement of the mass relative to one or more directions. Signals from the sensors that depend on the displacement of the mass may be used to determine an acceleration of the joystick controller 630. Such techniques may be implemented by instructions in the game program 604 stored in the memory 602 and executed by the processor 601.
By way of example, an accelerometer suitable as the inertial sensor 632 may be a simple mass elastically coupled to a frame at three or four points, e.g., by springs. Pitch and roll axes lie in a plane that intersects the frame, which is mounted to the joystick controller 630. As the frame (and the joystick controller 630) rotates about the pitch and roll axes, the mass displaces under the influence of gravity and the springs elongate or compress in a way that depends on the angle of pitch and/or roll. The displacement of the mass can be sensed and converted to a signal that depends on the amount of pitch and/or roll. Angular acceleration about the yaw axis or linear acceleration along the yaw axis may also produce characteristic patterns of compression and/or elongation of the springs, or of motion of the mass, that can be sensed and converted to signals that depend on the amount of angular or linear acceleration. Such an accelerometer device can therefore measure pitch and roll angles, angular acceleration about the yaw axis and linear acceleration along the yaw axis by tracking movement of the mass or the compression and expansion forces of the springs. There are a number of different ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge material, photonic sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors, and the like.
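As a minimal sketch only (not the patent's mechanism), the relationship between the sensed gravity components of a generic three-axis accelerometer and the pitch and roll angles can be expressed as follows; the function name and the axis convention (x forward, y right, z down) are assumptions:

    import math

    def pitch_roll_from_gravity(ax, ay, az):
        # Estimate pitch and roll (in radians) from a static accelerometer
        # reading, assuming the only acceleration acting on the mass is gravity.
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        roll = math.atan2(ay, az)
        return pitch, roll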
In addition, the joystick controller 630 may include one or more light sources 634, such as light emitting diodes (LEDs). The light sources 634 may be used to distinguish one controller from another, e.g., one or more LEDs may accomplish this by flashing or by holding an LED pattern code. By way of example, 5 LEDs may be provided on the joystick controller 630 in a linear or two-dimensional pattern. Although a linear arrangement of LEDs is preferred, the LEDs may alternatively be arranged in a rectangular or arcuate pattern to facilitate determination of an image plane of the LED array when analyzing an image of the LED pattern obtained with the image capture unit 623. Furthermore, the LED pattern codes may also be used to determine the positioning of the joystick controller 630 during game play. For instance, the LEDs can assist in identifying the pitch, yaw and roll of the controller. This detected pattern can help provide a better feel for the user in games such as aircraft flying games and the like. The image capture unit 623 may capture images containing the joystick controller 630 and the light sources 634. Analysis of such images can determine the location and/or orientation of the joystick controller. Such analysis may be implemented by program code instructions 604 stored in the memory 602 and executed by the processor 601. To facilitate capture of images of the light sources 634 by the image capture unit 623, the light sources 634 may be placed on two or more different sides of the joystick controller 630, e.g., on the front and on the back (as shown in phantom). Such placement allows the image capture unit 623 to obtain images of the light sources 634 for different orientations of the joystick controller 630, depending on how the joystick controller 630 is held by the user.
In addition, the light sources 634 may provide telemetry signals to the processor 601, e.g., in pulse code, amplitude modulation or frequency modulation format. Such telemetry signals may indicate which joystick buttons are being pressed and/or how hard such buttons are being pressed. Telemetry signals may be encoded into the optical signal, e.g., by pulse coding, pulse width modulation, frequency modulation or light intensity (amplitude) modulation. The processor 601 may decode the telemetry signal from the optical signal and execute a game command in response to the decoded telemetry signal. Telemetry signals may be decoded from analysis of images of the joystick controller 630 obtained by the image capture unit 623. Alternatively, the apparatus 600 may include a separate optical sensor dedicated to receiving telemetry signals from the light sources 634. The use of LEDs in conjunction with determining an intensity amount in interfacing with a computer program is described, e.g., in U.S. patent application Ser. No. 11/429,414, to Richard L. Marks et al., entitled "Computer Image and Audio Processing of Intensity and Input Devices for Interfacing With A Computer Program" (Attorney Docket No. SONYP052), filed May 4, 2006, which is incorporated herein by reference in its entirety. In addition, analysis of images containing the light sources 634 may be used both for telemetry and for determining the location and/or orientation of the joystick controller 630. Such techniques may be implemented by instructions of the program 604, which may be stored in the memory 602 and executed by the processor 601.
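Purely as an illustration of pulse-coded telemetry (the patent lists the modulation schemes above but does not specify a frame layout, so the threshold, bit period and majority vote here are assumptions), sampled LED brightness could be turned back into bits roughly as follows:

    def decode_pulse_telemetry(brightness_samples, threshold, samples_per_bit):
        # Convert a stream of LED brightness samples into bits by thresholding
        # each sample and majority-voting over every bit period.
        bits = []
        for i in range(0, len(brightness_samples), samples_per_bit):
            window = brightness_samples[i:i + samples_per_bit]
            ones = sum(1 for s in window if s > threshold)
            bits.append(1 if ones * 2 >= len(window) else 0)
        return bits

The decoded bit pattern could then be mapped to button identifiers and pressure levels by whatever framing the controller firmware uses.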
The processor 601 may use inertial signals from the inertial sensor 632 in conjunction with optical signals from the light sources 634 detected by the image capture unit 623 and/or sound source location and characterization information from acoustic signals detected by the microphone array 622 to deduce information on the location and/or orientation of the controller 630 and/or its user. For example, "acoustic radar" sound source location and characterization may be used in conjunction with the microphone array 622 to track a moving voice while motion of the joystick controller is independently tracked (through the inertial sensor 632 and/or the light sources 634). In acoustic radar, a pre-calibrated listening zone is selected at runtime and sounds originating from sources outside the pre-calibrated listening zone are filtered out. The pre-calibrated listening zones may include a listening zone that corresponds to a volume of focus or a field of view of the image capture unit 623. Examples of acoustic radar are described in detail in U.S. patent application Ser. No. 11/381,724, to Xiadong Mao, entitled "METHOD AND APPARATUS FOR TARGETED SOUND DETECTION AND CHARACTERIZATION", filed May 4, 2006, which is incorporated herein by reference. Any number of different combinations of different modes of providing control signals to the processor 601 may be used in conjunction with embodiments of the present invention. Such techniques may be implemented by program code instructions 604 stored in the memory 602 and executed by the processor 601, and may optionally include one or more instructions directing one or more processors to select a pre-calibrated listening zone at runtime and filter out sounds originating from sources outside the pre-calibrated listening zone. The pre-calibrated listening zones may include a listening zone that corresponds to a volume of focus or a field of view of the image capture unit 623.
The program 604 may optionally include one or more instructions that direct one or more processors to produce a discrete time domain input signal x_m(t) from the microphones M_0 ... M_M of the microphone array 622, determine a listening sector, and use the listening sector in a semi-blind source separation to select finite impulse response filter coefficients used to separate out different sound sources from the input signals x_m(t). The program 604 may also include instructions to apply one or more fractional delays to selected input signals x_m(t) other than an input signal x_0(t) from a reference microphone M_0. Each fractional delay may be selected to optimize a signal-to-noise ratio of a discrete time domain output signal y(t) from the microphone array. The fractional delays may be selected such that a signal from the reference microphone M_0 is first in time relative to signals from the other microphone(s) of the array. The program 604 may also include instructions to introduce a fractional time delay Δ into the output signal y(t) of the microphone array such that y(t+Δ) = x(t+Δ)*b_0 + x(t-1+Δ)*b_1 + x(t-2+Δ)*b_2 + ... + x(t-N+Δ)*b_N, where Δ is between 0 and ±1. Examples of such techniques are described in detail in U.S. patent application Ser. No. 11/381,729, to Xiadong Mao, entitled "ULTRA SMALL MICROPHONE ARRAY", filed May 4, 2006, the entire disclosure of which is incorporated herein by reference.
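The filter equation above can be illustrated with a short sketch that evaluates y(t) for a given fractional delay Δ, approximating the sub-sample offset by linear interpolation between neighbouring samples. This is only an illustration of the formula's structure, not the coefficient-selection method of the referenced application; the function name and the interpolation scheme are assumptions:

    def fractional_delay_filter(x, b, delta):
        # y[t] = sum_k b[k] * x(t - k + delta), with the fractional index
        # x(i + delta) approximated by linear interpolation (0 <= delta < 1).
        n = len(x)
        y = [0.0] * n
        for t in range(n):
            acc = 0.0
            for k, bk in enumerate(b):
                i = t - k
                if 0 <= i < n:
                    nxt = x[i + 1] if i + 1 < n else x[i]
                    acc += bk * ((1.0 - delta) * x[i] + delta * nxt)
            y[t] = acc
        return y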
The program 604 may include one or more instructions which, when executed, cause the system 600 to select a pre-calibrated listening sector that contains a source of sound. Such instructions may cause the apparatus to determine whether a source of sound lies within an initial sector or on a particular side of the initial sector. If the source of sound does not lie within the default sector, the instructions may, when executed, select a different sector on the particular side of the default sector. The different sector may be characterized by an attenuation of the input signals that is closest to an optimum value. These instructions may, when executed, calculate an attenuation of the input signals from the microphone array 622 and compare the attenuation to the optimum value. The instructions may, when executed, cause the apparatus 600 to determine an attenuation value for one or more sectors and select the sector for which the attenuation is closest to the optimum value. Examples of such a technique are described, e.g., in U.S. patent application Ser. No. 11/381,725, to Xiadong Mao, entitled "METHODS AND APPARATUS FOR TARGETED SOUND DETECTION", filed May 4, 2006, the disclosure of which is incorporated herein by reference.
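The sector-selection rule described above (choose the sector whose input-signal attenuation is closest to an optimum value) reduces to a simple minimisation. The following sketch assumes the attenuation values have already been computed per sector and is not drawn from the referenced application:

    def select_listening_sector(attenuation_by_sector, optimum):
        # attenuation_by_sector: dict mapping sector id -> measured attenuation.
        # Return the sector whose attenuation is closest to the optimum value.
        return min(attenuation_by_sector,
                   key=lambda sector: abs(attenuation_by_sector[sector] - optimum))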
Signals from the inertial sensor 632 may provide part of a tracking information input, and signals generated by the image capture unit 623 from tracking the one or more light sources 634 may provide another part of the tracking information input. By way of example, and not by way of limitation, such "mixed mode" signals may be used in a football-type video game in which a quarterback pitches the ball to the right after a head fake to the left. Specifically, a game player holding the controller 630 may turn his head to the left and make a sound while making a pitching motion, swinging the controller out to the right as if the controller were the football. The microphone array 622, in conjunction with "acoustic radar" program code, can track the user's voice. The image capture unit 623 can track the motion of the user's head, or track other commands that do not require sound or use of the controller. The sensor 632 may track the motion of the joystick controller (representing the football). The image capture unit 623 may also track the light sources 634 on the controller 630. The user may release the "ball" upon reaching a certain amount and/or direction of acceleration of the joystick controller 630, or by a key command triggered by pressing a button on the controller 630.
In certain embodiments of the present invention, an inertial signal, e.g., from an accelerometer or gyroscope, may be used to determine a location of the controller 630. Specifically, an acceleration signal from an accelerometer may be integrated once with respect to time to determine a change in velocity, and the velocity may be integrated with respect to time to determine a change in position. If the values of the initial position and velocity at some time are known, then the absolute position may be determined from these values and the changes in velocity and position. Although position determination using an inertial sensor may be made more quickly than using the image capture unit 623 and light sources 634, the inertial sensor 632 may be subject to a type of error known as "drift", in which errors accumulated over time can lead to a discrepancy D between the position of the joystick 631 calculated from the inertial signal (shown in phantom) and the actual position of the joystick controller 630. Embodiments of the present invention allow a number of ways to deal with such errors.
For example, the drift may be cancelled out manually by re-setting the initial position of the controller 630 to be equal to the current calculated position. The user may use one or more buttons on the controller 630 to trigger a command to re-set the initial position. Alternatively, image-based drift compensation may be implemented by re-setting the current position to a position determined from an image obtained from the image capture unit 623 as a reference. Such image-based drift compensation may be implemented manually, e.g., when the user triggers one or more buttons on the joystick controller 630, or automatically, e.g., at regular intervals of time or in response to game play. Such techniques may be implemented by program code instructions 604 stored in the memory 602 and executed by the processor 601.
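The double integration and the drift reset described in the two preceding paragraphs can be sketched as follows; the class name and the one-dimensional position are illustrative assumptions only, not the patent's implementation:

    class DeadReckoner:
        # Tracks position by integrating acceleration twice over time.
        # Accumulated error ("drift") is cleared by re-setting the state to a
        # reference value, e.g. one derived from an image of the controller.

        def __init__(self, position=0.0, velocity=0.0):
            self.position = position
            self.velocity = velocity

        def update(self, acceleration, dt):
            # First integration: acceleration -> change in velocity.
            self.velocity += acceleration * dt
            # Second integration: velocity -> change in position.
            self.position += self.velocity * dt
            return self.position

        def reset(self, reference_position, reference_velocity=0.0):
            # Manual or image-based drift compensation: snap back to a reference.
            self.position = reference_position
            self.velocity = reference_velocity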
In certain embodiments it may be desirable to compensate for spurious data in the inertial sensor signal. For example, the signal from the inertial sensor 632 may be oversampled, and a sliding average may be computed from the oversampled signal to remove spurious data. In some situations it may be desirable to oversample the signal, reject a high and/or low value from some subset of data points, and compute the sliding average from the remaining data points. Furthermore, other data sampling and manipulation techniques may be used to adjust the signal from the inertial sensor to remove or reduce the significance of spurious data. The choice of technique may depend on the nature of the signal, the computations to be performed with the signal, the nature of game play, or some combination of two or more of these. Such techniques may be implemented by instructions of the program 604, which may be stored in the memory 602 and executed by the processor 601.
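A minimal sketch of the oversampling-and-averaging approach, including the optional rejection of the highest and lowest samples in each window, follows; the window size and trimming policy are assumptions rather than values taken from the patent:

    def trimmed_sliding_average(samples, window=8):
        # Average each window of oversampled readings after dropping the single
        # highest and lowest values, which suppresses spurious spikes.
        out = []
        for i in range(0, len(samples) - window + 1):
            w = sorted(samples[i:i + window])
            trimmed = w[1:-1] if len(w) > 2 else w
            out.append(sum(trimmed) / len(trimmed))
        return out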
As mentioned above, the processor 601 may perform analysis of the inertial signal data 606 in response to the inertial signal data 606 and program code instructions of the program 604 stored in and retrieved from the memory 602 and executed by the processor module 601. Code portions of the program 604 may conform to any one of a number of different programming languages, such as Assembly, C++, JAVA or a number of other languages. The processor module 601 forms a general-purpose computer that becomes a specific-purpose computer when executing programs such as the program code 604. Although the program code 604 is described herein as being implemented in software and executed on a general-purpose computer, those skilled in the art will realize that the method of task management could alternatively be implemented using hardware, such as an application specific integrated circuit (ASIC) or other hardware circuitry. As such, it should be understood that embodiments of the invention can be implemented, in whole or in part, in software, hardware or some combination of both.
In one embodiment, among others, the program code 604 may include a set of processor readable instructions that implement a method having features in common with the method 510 of FIG. 5B and the method 520 of FIG. 5C, or some combination of two or more of these. The program code 604 may generally include one or more instructions directing one or more processors to analyze signals from the inertial sensor 632 to generate position and/or orientation information and to utilize that information during play of a video game.
The program code 604 may optionally include processor executable instructions including one or more instructions which, when executed, cause the image capture unit 623 to monitor a field of view in front of the image capture unit 623, identify one or more of the light sources 634 within the field of view, detect a change in the light emitted from the light source(s) 634, and, in response to detecting the change, trigger an input command to the processor 601. The use of LEDs in conjunction with an image capture device to trigger actions in a game controller is described, e.g., in U.S. patent application Ser. No. 10/759,782, to Richard L. Marks, filed Jan. 16, 2004 and entitled "METHOD AND APPARATUS FOR LIGHT INPUT DEVICE", which is incorporated herein by reference in its entirety.
The program code 604 may optionally include processor executable instructions including one or more instructions which, when executed, use signals from the inertial sensor and signals generated by the image capture unit from tracking the one or more light sources as inputs to a game system, e.g., as described above. The program code 604 may optionally include processor executable instructions including one or more instructions which, when executed, compensate for drift in the inertial sensor 632.
Although embodiments of the present invention are described in terms of examples related to a video game controller 630, embodiments of the invention, including the system 600, may be used with any user-manipulated body, molded object, knob, structure or the like having inertial sensing capability and the capability of transmitting the inertial sensor signal, whether wirelessly or otherwise.
By way of example, embodiments of the present invention may be implemented on parallel processing systems. Such parallel processing systems typically include two or more processor elements configured to execute parts of a program in parallel using separate processors. By way of example, and not by way of limitation, FIG. 7 illustrates a type of cell processor 700 according to an embodiment of the present invention. The cell processor 700 may be used as the processor 601 of FIG. 6 or the processor 502 of FIG. 5A. In the example depicted in FIG. 7, the cell processor 700 includes a main memory 702, a power processor element (PPE) 704, and a number of synergistic processor elements (SPEs) 706. In the example depicted in FIG. 7, the cell processor 700 includes a single PPE 704 and eight SPEs 706. In such a configuration, seven of the SPEs 706 may be used for parallel processing and one may be reserved as a back-up in case one of the other seven fails. A cell processor may alternatively include multiple groups of PPEs (PPE groups) and multiple groups of SPEs (SPE groups). In such a case, hardware resources can be shared between units within a group. However, the SPEs and PPEs must appear to software as independent elements. As such, embodiments of the present invention are not limited to the configuration shown in FIG. 7.
The main memory 702 typically includes both general-purpose and nonvolatile storage, as well as special-purpose hardware registers or arrays used for functions such as system configuration, data-transfer synchronization, memory-mapped I/O, and I/O subsystems. In embodiments of the present invention, a video game program 703 may be resident in the main memory 702. The memory 702 may also contain signal data 709. The video program 703 may include inertial, image and acoustic analyzers and a mixer configured as described above with respect to FIG. 4, FIG. 5A, FIG. 5B or FIG. 5C, or some combination of these. The program 703 may run on the PPE. The program 703 may be divided into multiple signal processing tasks so that it can execute on the SPEs and/or the PPE.
By way of example, the PPE 704 may be a 64-bit PowerPC Processor Unit (PPU) with associated L1 and L2 caches. The PPE 704 is a general-purpose processing unit that can access system management resources (such as the memory-protection tables, for example). Hardware resources may be mapped explicitly to a real address space as seen by the PPE. Therefore, the PPE can address any of these resources directly by using an appropriate effective address value. A primary function of the PPE 704 is the management and allocation of tasks for the SPEs 706 in the cell processor 700.
Although only a single PPE is shown in FIG. 7, in some cell processor implementations, such as the cell broadband engine architecture (CBEA), the cell processor 700 may have multiple PPEs organized into PPE groups, of which there may be more than one. These PPE groups may share access to the main memory 702. Furthermore, the cell processor 700 may include two or more SPE groups. The SPE groups may also share access to the main memory 702. Such configurations are within the scope of the present invention.
Each SPE 706 includes a synergistic processor unit (SPU) and its own local storage area LS. The local storage LS may include one or more separate areas of memory storage, each associated with a specific SPU. Each SPU may be configured to execute only instructions (including data load and data store operations) from within its own associated local storage domain. In such a configuration, data transfers between the local storage LS and elsewhere in the system 700 may be performed by issuing direct memory access (DMA) commands from the memory flow controller (MFC) to transfer data to or from the local storage domain of the individual SPE. The SPUs are less complex computational units than the PPE 704, in that they do not perform any system management functions. An SPU generally has single instruction, multiple data (SIMD) capability and typically processes data and initiates any required data transfers (subject to access properties set up by the PPE) in order to perform its allocated tasks. The purpose of the SPU is to enable applications that require a higher computational unit density and can effectively use the provided instruction set. A significant number of SPEs in a system, managed by the PPE 704, allows for cost-effective processing over a wide range of applications.
Each SPE 706 may include a dedicated memory flow controller (MFC) that includes an associated memory management unit capable of holding and processing memory-protection and access-permission information. The MFC provides the primary method for data transfer, protection and synchronization between the main storage of the cell processor and the local storage of an SPE. An MFC command describes the transfer to be performed. Commands for transferring data are sometimes referred to as MFC direct memory access (DMA) commands (or MFC DMA commands).
Each MFC may support multiple DMA transfers at the same time and can maintain and process multiple MFC commands. Each MFC DMA data transfer command request may involve both a local storage address (LSA) and an effective address (EA). The local storage address may directly address only the local storage area of its associated SPE. The effective address may have a more general application, e.g., it may be able to reference main storage, including all the SPE local storage areas, if they are aliased into the real address space.
To facilitate communication between the SPEs 706 and/or between the SPEs 706 and the PPE 704, the SPEs 706 and the PPE 704 may include signal notification registers that are tied to signaling events. The PPE 704 and SPEs 706 may be coupled in a star topology in which the PPE 704 acts as a router to transmit messages to the SPEs 706. Alternatively, each SPE 706 and the PPE 704 may have a one-way signal notification register referred to as a mailbox. The mailbox can be used by an SPE 706 for synchronization with the host operating system (OS).
The cell processor 700 may include an input/output (I/O) function 708 through which the cell processor 700 may interface with peripheral devices, such as a microphone array 712, an optional image capture unit 713 and a game controller 730. The game controller unit may include an inertial sensor 732 and light sources 734. In addition, an element interconnect bus 710 may connect the various components listed above. Each SPE and the PPE can access the bus 710 through a bus interface unit BIU. The cell processor 700 may also include two controllers typically found in a processor: a memory interface controller MIC, which controls the flow of data between the bus 710 and the main memory 702, and a bus interface controller BIC, which controls the flow of data between the I/O 708 and the bus 710. Although the requirements for the MIC, BIC, BIUs and bus 710 may vary widely for different implementations, those of skill in the art will be familiar with their functions and with circuits for implementing them.
The cell processor 700 may also include an internal interrupt controller IIC. The IIC component manages the priority of the interrupts presented to the PPE. The IIC allows interrupts from the other components of the cell processor 700 to be handled without using the main system interrupt controller. The IIC may be regarded as a second-level controller. The main system interrupt controller may handle interrupts originating external to the cell processor.
In embodiments of the present invention, certain computations, such as the fractional delays described above, may be performed in parallel using the PPE 704 and/or one or more of the SPEs 706. Each fractional delay calculation may be run as one or more separate tasks that different SPEs 706 may take up as they become available.
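The parallel execution described above can be illustrated generically (this does not use the Cell SDK itself; the pool size and the reuse of the hypothetical fractional_delay_filter sketch shown earlier are assumptions) by farming each fractional-delay computation out to a pool of workers:

    from concurrent.futures import ProcessPoolExecutor

    def compute_all_delays(signal, coeffs, deltas, workers=7):
        # Run one fractional-delay filtering task per candidate delay value,
        # mirroring the idea of handing independent tasks to available SPEs.
        with ProcessPoolExecutor(max_workers=workers) as pool:
            futures = [pool.submit(fractional_delay_filter, signal, coeffs, d)
                       for d in deltas]
            return [f.result() for f in futures]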
While the above is a complete description of the preferred embodiment of the present invention, it is possible to use various alternatives, modifications and equivalents. Therefore, the scope of the present invention should be determined not with reference to the above description but, instead, with reference to the appended claims, along with their full scope of equivalents. Any feature described herein, whether preferred or not, may be combined with any other feature described herein, whether preferred or not. In the claims that follow, the indefinite article "a" or "an" refers to a quantity of one or more of the item following the article, except where expressly stated otherwise. The appended claims are not to be interpreted as including means-plus-function limitations unless such a limitation is explicitly recited in a given claim using the phrase "means for".

Claims (15)

1. A system for analyzing game control input data, comprising:
an image analyzer configured to receive and analyze input data from an image capture device;
an inertial analyzer configured to receive and analyze input data from an inertial sensor;
an acoustic analyzer configured to receive and analyze input data from an acoustic receiver; and
a mixer configured to receive the one or more analyzed input data from the inertial analyzer, the image analyzer and the acoustic analyzer, and to generate tracking information based on the information received from one or more of the inertial analyzer, the image analyzer and the acoustic analyzer.
2. The system as claimed in claim 1, further comprising a gesture recognizer configured to associate at least one action in a game environment with one or more user actions based on the tracking information.
3. The system as claimed in claim 1, further comprising a game controller, wherein the game controller comprises at least one inertial sensor operably coupled to the inertial analyzer, and/or one or more light sources configured to provide one or more light signals, and/or one or more acoustic signal generators.
4. The system as claimed in claim 3, wherein the at least one inertial sensor comprises at least one of an accelerometer or a gyroscope.
5. The system as claimed in claim 1, further comprising an image capture device coupled to the image analyzer.
6. The system as claimed in claim 5, wherein the image capture device comprises a camera.
7. The system as claimed in claim 1, further comprising a microphone array operably coupled to the acoustic analyzer unit.
8. The system as claimed in claim 1, wherein the mixer is further configured to assign one or more distribution values to the one or more analyzed input data that the mixer receives from the inertial analyzer, the image analyzer and the acoustic analyzer.
9. The system as claimed in claim 1, wherein the tracking information comprises position and/or orientation information of a user-manipulable object.
10. The system as claimed in claim 9, wherein the position and/or orientation information comprises information indicating at least one of pitch, yaw and roll of the user-manipulable object.
11. The system as claimed in claim 9, wherein the position and/or orientation information comprises information indicating a position with respect to one or more coordinates.
12. The system as claimed in claim 9, wherein the position and/or orientation information comprises information indicating a velocity with respect to one or more coordinates.
13. The system as claimed in claim 9, wherein the position and/or orientation information comprises information indicating an acceleration with respect to one or more coordinates.
14. The system as claimed in claim 9, wherein the user-manipulable object is a controller for a video game system.
15. A method of analyzing game control input data, comprising:
receiving input control data from one or more input devices, the input control data being associated with one or more user actions;
assigning distribution values to the received input control data;
identifying, based on the distribution values assigned to the received input control data, at least one action in a game environment that is associated with the one or more user actions; and
performing the at least one action in the game environment.
CN200780016094XA 2006-05-04 2007-04-19 Multi-input game control mixer Active CN101479782B (en)

Applications Claiming Priority (38)

Application Number Priority Date Filing Date Title
US11/381,725 2006-05-04
US11/429,133 2006-05-04
US11/381,727 2006-05-04
US11/418,988 US8160269B2 (en) 2003-08-27 2006-05-04 Methods and apparatuses for adjusting a listening area for capturing sounds
US11/381,729 2006-05-04
US11/418,989 2006-05-04
US11/381,725 US7783061B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for the targeted sound detection
US11/429,414 2006-05-04
US11/429,047 US8233642B2 (en) 2003-08-27 2006-05-04 Methods and apparatuses for capturing an audio signal based on a location of the signal
US11/381,728 2006-05-04
US11/418,988 2006-05-04
US11/381,724 US8073157B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for targeted sound detection and characterization
US11/429,047 2006-05-04
US11/381,721 2006-05-04
US11/381,729 US7809145B2 (en) 2006-05-04 2006-05-04 Ultra small microphone array
US11/381,721 US8947347B2 (en) 2003-08-27 2006-05-04 Controlling actions in a video game unit
US11/381,728 US7545926B2 (en) 2006-05-04 2006-05-04 Echo and noise cancellation
US11/429,133 US7760248B2 (en) 2002-07-27 2006-05-04 Selective sound source listening in conjunction with computer interactive processing
US11/381,724 2006-05-04
US11/381,727 US7697700B2 (en) 2006-05-04 2006-05-04 Noise removal for electronic device with far field microphone on console
US11/418,989 US8139793B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for capturing audio signals based on a visual image
US29/259,350 2006-05-06
US60/798,031 2006-05-06
US11/382,037 2006-05-06
US11/382,031 2006-05-06
US11/382,036 2006-05-06
US29/259,349 2006-05-06
US11/382,035 2006-05-06
US29/259,348 2006-05-06
US11/382,034 2006-05-06
US11/382,038 2006-05-06
US11/382,033 2006-05-06
US11/382,032 2006-05-06
US11/382,043 2006-05-07
US11/382,040 2006-05-07
US11/382,039 2006-05-07
US11/382,041 2006-05-07
PCT/US2007/067004 WO2007130791A2 (en) 2006-05-04 2007-04-19 Multi-input game control mixer

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN2010106245095A Division CN102058976A (en) 2006-05-04 2007-04-19 System for tracking user operation in environment

Publications (2)

Publication Number Publication Date
CN101479782A CN101479782A (en) 2009-07-08
CN101479782B true CN101479782B (en) 2011-08-03

Family

ID=56290949

Family Applications (2)

Application Number Title Priority Date Filing Date
CN200780016094XA Active CN101479782B (en) 2006-05-04 2007-04-19 Multi-input game control mixer
CN2007800161035A Active CN101438340B (en) 2006-05-04 2007-04-19 System, method, and apparatus for three-dimensional input control

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN2007800161035A Active CN101438340B (en) 2006-05-04 2007-04-19 System, method, and apparatus for three-dimensional input control

Country Status (1)

Country Link
CN (2) CN101479782B (en)


Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8564534B2 (en) 2009-10-07 2013-10-22 Microsoft Corporation Human tracking system
US8963829B2 (en) 2009-10-07 2015-02-24 Microsoft Corporation Methods and systems for determining and tracking extremities of a target
EP3217653B1 (en) * 2009-12-24 2023-12-27 Nokia Technologies Oy An apparatus
US20120086630A1 (en) * 2010-10-12 2012-04-12 Sony Computer Entertainment Inc. Using a portable gaming device to record or modify a game or application in real-time running on a home gaming system
US8667519B2 (en) * 2010-11-12 2014-03-04 Microsoft Corporation Automatic passive and anonymous feedback system
US9436286B2 (en) * 2011-01-05 2016-09-06 Qualcomm Incorporated Method and apparatus for tracking orientation of a user
CN102671382A * 2011-03-08 2012-09-19 德信互动科技(北京)有限公司 Motion-sensing game device
TWI590099B (en) * 2012-09-27 2017-07-01 緯創資通股份有限公司 Interaction system and motion detection method
US9690392B2 (en) 2012-10-15 2017-06-27 Sony Corporation Operating device including a touch sensor
KR102249208B1 2012-10-15 2021-05-10 주식회사 소니 인터랙티브 엔터테인먼트 Control device
JP5977147B2 (en) 2012-11-05 2016-08-24 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus and input device
JP2014191688A (en) * 2013-03-28 2014-10-06 Sony Corp Information processor, information processing method and storage medium
CN103530799B * 2013-10-22 2017-07-11 惠州Tcl移动通信有限公司 Browsing method and 3D house-viewing system for viewing rooms on a handheld device based on a 3D image library
CN105812969A (en) * 2014-12-31 2016-07-27 展讯通信(上海)有限公司 Method, system and device for picking up sound signal
CN106621324A (en) * 2016-12-30 2017-05-10 当家移动绿色互联网技术集团有限公司 Interactive operation method of VR game
CN111437596A * 2019-01-17 2020-07-24 杨跃龙 Gamepad with an articulated three-dimensional joystick
CN111514578A * 2019-02-04 2020-08-11 杨跃龙 Gamepad with a jointed three-dimensional joystick
CN112107852A (en) * 2020-09-16 2020-12-22 西安万像电子科技有限公司 Cloud game control method and device and cloud game control system


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5554980A (en) * 1993-03-12 1996-09-10 Mitsubishi Denki Kabushiki Kaisha Remote control system
CN1279803A (en) * 1997-11-20 2001-01-10 任天堂株式会社 Sound generator and video game machine employing it
US6699123B2 (en) * 1999-10-14 2004-03-02 Sony Computer Entertainment Inc. Entertainment system, entertainment apparatus, recording medium, and program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102483652A (en) * 2009-08-05 2012-05-30 恩希软件株式会社 Device And Method For Controlling The Movement Of A Games Character
CN102483652B (en) * 2009-08-05 2016-03-23 恩希软件株式会社 Control the Apparatus for () and method therefor of game motion

Also Published As

Publication number Publication date
CN101479782A (en) 2009-07-08
CN101438340A (en) 2009-05-20
CN101438340B (en) 2011-08-10

Similar Documents

Publication Publication Date Title
CN101479782B (en) Multi-input game control mixer
US7918733B2 (en) Multi-input game control mixer
US7850526B2 (en) System for tracking user manipulations within an environment
US10086282B2 (en) Tracking device for use in obtaining information for controlling game program execution
US7854655B2 (en) Obtaining input for controlling execution of a game program
US9682320B2 (en) Inertially trackable hand-held controller
US7782297B2 (en) Method and apparatus for use in determining an activity level of a user in relation to a system
US9009747B2 (en) Gesture cataloging and recognition
US9381424B2 (en) Scheme for translating movements of a hand-held controller into inputs for a system
JP5204224B2 (en) Object detection using video input combined with tilt angle information
US20060287084A1 (en) System, method, and apparatus for three-dimensional input control
US20070015559A1 (en) Method and apparatus for use in determining lack of user activity in relation to a system
CN101484221A (en) Obtaining input for controlling execution of a game program
CN102989174A (en) Method for obtaining inputs used for controlling operation of game program
US20130084981A1 (en) Controller for providing inputs to control execution of a program when inputs are combined
WO2007130872A2 (en) Method and apparatus for use in determining lack of user activity, determining an activity level of a user, and/or adding a new player in relation to a system
EP2012891B1 (en) Method and apparatus for use in determining lack of user activity, determining an activity level of a user, and/or adding a new player in relation to a system
WO2007130791A2 (en) Multi-input game control mixer
CN102058976A (en) System for tracking user operation in environment
KR101020510B1 (en) Multi-input game control mixer
EP2351604A2 (en) Obtaining input for controlling execution of a game program
EP2013864A2 (en) System, method, and apparatus for three-dimensional input control

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant