US20110250962A1 - System and method for a 3d computer game with true vector of gravity - Google Patents

System and method for a 3d computer game with true vector of gravity

Info

Publication number
US20110250962A1
US20110250962A1 (application US13/084,488)
Authority
US
United States
Prior art keywords
tracking mechanism
physical object
orientation
relative
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/084,488
Inventor
Steven K. Feiner
Ohan Oda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Columbia University of New York
Original Assignee
Columbia University of New York
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Columbia University of New York filed Critical Columbia University of New York
Priority to US13/084,488 priority Critical patent/US20110250962A1/en
Assigned to THE TRUSTEES OF COLUMBIA UNIVERSITY IN THE CITY OF NEW YORK reassignment THE TRUSTEES OF COLUMBIA UNIVERSITY IN THE CITY OF NEW YORK ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FEINER, STEVEN K., ODA, OHAN
Publication of US20110250962A1 publication Critical patent/US20110250962A1/en
Abandoned legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/57Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1043Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being characterized by constructional details
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/204Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6045Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Definitions

  • the present application relates to a computer interaction system and a method of facilitating interaction between an interaction delivery device and a physical object in an environment.
  • a game accounting for gravity is a virtual marble labyrinth game.
  • a user tilts a gameboard having one or more virtual balls such that the virtual balls roll about the gameboard as if the virtual balls were real balls on the surface of the gameboard being influenced by a gravitational force and the user's actions.
  • the virtual balls can encounter a number of virtual objects that can affect the virtual balls' motion in the same manner as a physical object on the gameboard would affect the physical balls' motion.
  • the objective of the virtual marble labyrinth game is to move the virtual balls to a desired position on the gameboard.
  • an interaction delivery device can include a display device, a first tracking mechanism, and a second tracking mechanism.
  • the display device can be configured to display virtual objects.
  • the first tracking mechanism can be configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism.
  • the second tracking mechanism can be configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference.
  • one or more of position information received from the first tracking mechanism and orientation information received from the first tracking mechanism can be used to generate motion data for the physical object.
  • one or more of position information received from the second tracking mechanism and orientation information received from the second tracking mechanism can be used to generate adjusted motion data for the physical object.
  • the adjusted motion data for the physical object is used to generate virtual object information.
  • the virtual object information is received by the display.
  • an interaction processing device can include a processor, a memory, an input unit configured to receive information, and an output unit configured to output information.
  • the input unit can be configured to receive at least one of physical object position information (e.g., data, video, images, etc.), physical object orientation information (e.g., data, video, images, etc.), tracking mechanism position information (e.g., data, video, images, etc.), and tracking mechanism orientation information (e.g., data, video, images, etc.).
  • the processor can be configured to generate motion data for a physical object using at least one of the physical object position information and the physical object orientation information.
  • the processor can be configured to generate adjusted motion data for the physical object using at least one of tracking mechanism position information and tracking mechanism orientation information.
  • the adjusted motion data for the physical object can compensate for movement of a tracking mechanism relative to the physical object.
  • the processor can be configured to generate virtual object information using the adjusted motion data for the physical object.
  • the output unit can be configured to output the virtual object information.
  • the interaction delivery device can include a display device configured to display one or more of virtual objects, images, videos, and other information or media, a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference.
  • the computer can include a processor and a memory, and can be configured to process one or more of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism. Further, in certain embodiments, the computer can be configured to compensate for movement relative to the physical object of at least one of the first tracking mechanism, the second tracking mechanism, and the display device. In some embodiments, the computer can be further configured to output virtual object information to the display device.
  • the computer can be any processing device.
  • the first tracking mechanism can be an optical tracking mechanism which can include one or more cameras.
  • the interaction delivery device can be configured to be head-worn and to display one or more of video see-through augmented reality (e.g., video depicting a physical environment viewed through one or more attached cameras that is augmented with additional virtual graphics), virtual reality video (e.g., video depicting an entirely virtual environment), and optical see-through augmented reality (e.g., viewing the physical environment directly, rather than through video, with virtual graphics overlaid on the field of vision by using optical elements such as mirrors, lenses, etc.).
  • the second tracking mechanism can include one or more of a three-axis accelerometer and a magnetometer.
  • the second tracking mechanism can be configured to determine a true direction of a natural force, such as gravity or magnetism, acting on the second tracking mechanism.
  • the second tracking mechanism can have a position that is rigidly fixed relative to the first tracking mechanism, or a position that is otherwise known relative to the first tracking mechanism.
  • the second tracking mechanism can comprise a six-degree-of-freedom tracker configured to determine a three-dimensional position of the second tracking mechanism relative to the reference and a three-dimensional orientation of the second tracking mechanism relative to the reference.
  • the reference can be the earth or fixed to the earth. Alternatively, the reference can be fixed to one or more additional physical objects moving relative to the earth.
  • the second tracking mechanism can be configured to determine a true direction of a natural force vector.
  • the computer can be configured to simulate motion of virtual objects based on a true direction of a natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism.
  • the computer can be configured to simulate motion of virtual objects based on a direction of force that is different from the true direction of a natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism.
  • the computer can further include one or more of a game engine and a physics engine, wherein the computer can be configured to simulate virtual objects.
  • the computer can be configured to simulate virtual forces acting, in a true physical direction, on the virtual objects based on one or more of the position information received from the first tracking mechanism, the orientation information received from the first tracking mechanism, the position information received from the second tracking mechanism, the orientation information received from the second tracking mechanism, acceleration information about the second tracking mechanism received from the second tracking mechanism, velocity information about the second tracking mechanism received from the second tracking mechanism, and true physical gravity vector information.
  • the position information received from the first tracking mechanism can include a first position of the physical object relative to the first tracking mechanism at a first time and a second position of the physical object relative to the first tracking mechanism at a second time.
  • the orientation information received from the first tracking mechanism can include a first orientation of the physical object relative to the first tracking mechanism at the first time and a second orientation of the physical object relative to the first tracking mechanism at the second time.
  • the position and orientation information received from the first tracking mechanism can be data, video streams, images, or other forms of information.
  • the position information received from the second tracking mechanism can include a first position of the second tracking mechanism relative to the reference at the first time and a second position of the second tracking mechanism relative to the reference at the second time.
  • the orientation information received from the second tracking mechanism can include a first orientation of the second tracking mechanism relative to the reference at the first time and a second orientation of the second tracking mechanism relative to the reference at the second time.
  • the second tracking mechanism can have a position rigidly fixed, or otherwise known, relative to the first tracking mechanism.
  • a predetermined time can exist between the first time and the second time.
  • the computer can be configured to determine a physical object movement vector between the first position of the physical object and the second position of the physical object.
  • the computer can be configured to determine a second tracking mechanism movement vector between the first position of the second tracking mechanism and the second position of the second tracking mechanism.
  • the computer can be configured to calculate an adjusted physical object movement vector by subtracting the second tracking mechanism movement vector from the physical object movement vector if a magnitude of the second tracking mechanism movement vector exceeds a predetermined threshold distance (e.g., the predetermined threshold distance can be set to about 0.02 meters of movement over about 1 second).
  • the computer can be configured to simulate virtual forces acting on the virtual objects based on at least the adjusted physical object movement vector if the magnitude of the second tracking mechanism movement vector exceeds, or is equal to, the predetermined threshold distance.
  • the computer can be configured to simulate virtual forces acting on the virtual objects based on at least the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance.
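  • The following is a minimal sketch, in Python with NumPy, of how such a movement-vector compensation could be implemented; the function name adjust_object_motion and the 0.02 m default threshold are illustrative assumptions rather than part of the present application.

    import numpy as np

    def adjust_object_motion(object_delta, tracker_delta, threshold=0.02):
        """Return the movement vector handed to the simulation.

        object_delta:  displacement of the physical object over one interval,
                       as observed by the first tracking mechanism.
        tracker_delta: displacement of the second tracking mechanism relative
                       to the reference over the same interval.
        threshold:     predetermined threshold distance (e.g., 0.02 m).
        """
        if np.linalg.norm(tracker_delta) >= threshold:
            # The tracking mechanisms moved: subtract their motion so that only
            # the physical object's own motion drives the simulation.
            return object_delta - tracker_delta
        # Tracking mechanisms essentially stationary: use the raw vector.
        return object_delta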
  • the computer can be configured to determine a physical object rotation tensor between the first orientation of the physical object and the second orientation of the physical object.
  • the computer can be configured to determine a second tracking mechanism rotation tensor between the first orientation of the second tracking mechanism and the second orientation of the second tracking mechanism.
  • the computer can be configured to calculate an adjusted physical object rotation tensor by applying a transformation based on a resultant rotation of the second tracking mechanism, if the resultant rotation of the second tracking mechanism exceeds, or is equal to, a predetermined threshold rotation value (e.g., a predetermined threshold rotation value of 3 degrees over a predetermined time of 1 second).
  • the computer can be configured to simulate natural forces acting on the virtual objects based on at least the adjusted physical object rotation tensor if the resultant rotation of the second tracking mechanism exceeds, or is equal to, the predetermined threshold rotation value. Additionally, the computer can be configured to simulate natural forces acting on the virtual objects based on at least the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
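  • A corresponding sketch for the rotation case, again in Python with NumPy, assuming 3x3 rotation matrices and an illustrative 3-degree threshold; composing with the transpose of the tracker rotation is one possible choice of the transformation described above, not necessarily the one used in the present application.

    import numpy as np

    def rotation_angle_deg(R):
        # Resultant rotation angle (degrees) of a 3x3 rotation matrix.
        cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
        return np.degrees(np.arccos(cos_theta))

    def adjust_object_rotation(R_object, R_tracker, threshold_deg=3.0):
        # R_object:  rotation of the physical object between two samples,
        #            as observed by the first tracking mechanism.
        # R_tracker: rotation of the second tracking mechanism relative to
        #            the reference over the same interval.
        if rotation_angle_deg(R_tracker) >= threshold_deg:
            # Remove the tracker's own rotation from the observed rotation.
            return R_tracker.T @ R_object
        return R_object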
  • Another aspect of the presently disclosed subject matter provides a non-transitory computer readable medium having computer readable instructions stored thereon which, when executed by a computer having a processor, cause the processor to perform several functions.
  • the computer readable instructions can cause the processor to obtain one or more of first tracking mechanism information and second tracking mechanism information.
  • the computer readable instructions can cause the processor to determine one or more of a physical object movement vector using the first tracking mechanism information, a second tracking mechanism movement vector using the second tracking mechanism information, and direction of a true physical gravity vector relative to the second tracking mechanism.
  • the computer readable instructions can cause the processor to simulate motion data of a virtual object.
  • the processor can simulate motion data of the virtual object, based on a true or not true virtual gravity vector, using: an adjusted physical object movement vector if a magnitude of the second tracking mechanism movement vector is greater than or equal to a predetermined threshold distance; the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance; and the direction of a true physical gravity vector relative to the second tracking mechanism.
  • the computer readable instructions can cause the processor to output the motion data of the virtual object to a display.
  • the computer readable instructions stored on the non-transitory computer readable medium, when executed by a computer having a processor, can further cause the processor to determine one or more of a physical object rotation tensor using the first tracking mechanism information and a second tracking mechanism rotation tensor using the second tracking mechanism information.
  • the computer readable instructions can also cause the processor to simulate motion data of a virtual object using, additionally: an adjusted physical object rotation tensor if a resultant rotation of the second tracking mechanism is greater than or equal to a predetermined threshold rotation value; and the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
  • the first tracking mechanism information can include a first position of a physical object relative to a first tracking mechanism and a first orientation of the physical object at a first time, and a second position of the physical object relative to the first tracking mechanism and a second orientation of the physical object at a second time.
  • the second tracking mechanism information can include a first position of a second tracking mechanism relative to a reference and a first orientation of the second tracking mechanism at the first time, and a second position of the second tracking mechanism relative to the reference and a second orientation of the second tracking mechanism at the second time.
  • Another aspect of the presently disclosed subject matter provides a method of facilitating interaction between an interaction delivery device and a physical object in an environment, the method including generating one or more virtual objects in the environment; detecting a change in the physical object; determining whether the change in the physical object is based on a change in the state of the virtual objects and the physical object, or both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object; measuring a direction and effect (e.g., magnitude, etc.) of a natural force interacting with the environment; and updating the virtual objects based on a result of the determining and the measuring.
  • the detecting can further include: detecting a change in position of the physical object over a given time, and detecting a change in position of the interaction delivery device over the given time.
  • the determining can further include determining whether a magnitude of the change in position of the interaction delivery device over the given time exceeds, or is equal to, a predetermined threshold value, determining that the detected change in the physical object is based on a change in the state of the virtual objects and physical object if the change in position of the interaction delivery device over the given time is less than the predetermined threshold value, and determining that the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object if the change in position of the interaction delivery device over the given time exceeds or is equal to the predetermined threshold value.
  • the updating can further include updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object over the given time if the detected change in the physical object is based on a change in the state of the virtual objects and the physical object, and updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object and adjusted to remove effects caused by the force applied to the interaction delivery device over the given time if the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object.
  • the interaction delivery device can include: a tracking mechanism that can be configured to track at least one of a position of the physical object relative to the tracking mechanism and an orientation of the physical object relative to the tracking mechanism; a detecting mechanism configured to detect motion of the detecting mechanism, wherein a position of the detecting mechanism relative to a position of the tracking mechanism is predetermined; and a display configured to display the at least one virtual object.
  • the processing device can be configured to receive physical object position information from the tracking mechanism.
  • the processing device can be configured to receive detecting mechanism motion information from the detecting mechanism.
  • the processing device can be configured to perform at least one of: determining if a magnitude of acceleration of the detecting mechanism is greater than a predetermined acceleration threshold value and generating adjusted physical object position information if the magnitude of acceleration of the detecting mechanism is greater than the predetermined acceleration threshold value; and determining if a magnitude of velocity of the detecting mechanism is greater than a predetermined velocity threshold value and generating adjusted physical object position information if the magnitude of velocity of the detecting mechanism is greater than the predetermined velocity threshold value.
  • the processing device can be configured to generate motion information for the at least one virtual object based on the adjusted physical object position information if the processing device generates the adjusted physical object position information.
  • the processing device can be configured to generate the motion information for the at least one virtual object based on the physical object position information if the processing device does not generate the adjusted physical object position information.
  • the processing device can be configured to output the motion information for the at least one virtual object to the display.
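  • As a minimal sketch of this acceleration- and velocity-based test (Python with NumPy; the function name, the threshold values, and the use of simple subtraction as the adjustment are illustrative assumptions, not part of the present application):

    import numpy as np

    def select_position_input(object_pos, detector_motion, accel, velocity,
                              accel_threshold=0.5, vel_threshold=0.05):
        """Choose between raw and adjusted physical object position information."""
        if (np.linalg.norm(accel) > accel_threshold or
                np.linalg.norm(velocity) > vel_threshold):
            # The detecting mechanism is moving: generate adjusted position
            # information (here, by removing the detecting mechanism's motion).
            return object_pos - detector_motion
        # Otherwise use the physical object position information as reported.
        return object_pos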
  • the detecting mechanism can be configured to detect a correct direction and magnitude of a physical gravity vector. Additionally, the processing device can be further configured to generate the motion information for the at least one virtual object additionally based on the correct direction and magnitude of the physical gravity vector. Further, the tracking mechanism can be an optical tracking device. The physical object can be configured to not include attached or embedded electronic devices or components.
  • the at least one virtual object can be configured to move according to a virtual gravity vector, wherein the virtual gravity vector is substantially identical to the physical gravity vector. In another embodiment of the present aspect of the application, the at least one virtual object can be configured to move according to a virtual gravity vector, wherein the virtual gravity vector is different from the physical gravity vector.
  • an interaction delivery system comprising: a first tracking mechanism rigidly attached to a display device configured to track the position and orientation of a physical object relative to the display device; a second tracking mechanism rigidly attached to the display device, configured to track the absolute orientation of the display device relative to the earth; and a processing device configured to process tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism, wherein the processing device is configured to simulate motion information of at least one virtual object from the tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism, wherein the processing device is configured to output the simulated motion information of the at least one virtual object to the display, wherein the display is configured to display the at least one virtual object disposed relative to the physical object, and wherein the at least one virtual object is configured to behave as if acted on by a virtual gravity vector in a same direction as a physical gravity vector acting on the physical object.
  • FIG. 1 depicts a computer interaction system according to a non-limiting embodiment.
  • FIG. 4 depicts a computer interaction system including virtual graphics according to a non-limiting embodiment.
  • FIG. 5 is a schematic of the head-worn unit connected to a computer according to a non-limiting embodiment.
  • FIG. 6 is a schematic of a computer according to a non-limiting embodiment.
  • FIG. 7 is a flow chart depicting operations of a computer according to a non-limiting embodiment.
  • FIG. 8 is a flow chart depicting operations of the computer which compensate for movement and rotation of the optical tracker according to a non-limiting embodiment.
  • FIG. 9A depicts a front side of a head-worn unit according to a non-limiting embodiment.
  • FIG. 9B depicts a back side of the head-worn unit according to a non-limiting embodiment.
  • FIG. 10 is a schematic of the head-worn unit connected to a processing device according to a non-limiting embodiment.
  • FIG. 12A is a flow chart depicting operations of the processing device which compensate for movement and rotation of the optical tracker, based on acceleration information of the head motion detector, according to a non-limiting embodiment.
  • FIG. 12B is a flow chart depicting operations of the processing device which compensate for movement and rotation of the optical tracker, based on velocity information of the head motion detector, according to a non-limiting embodiment.
  • the disclosed subject matter can be exemplified with, but is not limited to, a virtual marble labyrinth game.
  • the virtual marble labyrinth game has been implemented using a number of technologies.
  • the virtual marble labyrinth game can consist of a hand-held display for displaying the virtual balls and other virtual objects, such as obstacles, bumps, ramps, prizes, walls, pools, holes, channels, etc.
  • the hand-held display is further integrated with a set of built-in accelerometers that can directly measure the direction of gravity relative to the display.
  • This implementation has several drawbacks: the game graphics are restricted to the hand-held display; the weight and fragility of the gameboard depend upon the technology it contains; and the position and orientation of the player's head relative to the gameboard are not known, so graphics must be generated for an assumed head position and orientation, and cannot respond to changes in actual head position and orientation during gameplay.
  • Another example of a virtual marble labyrinth game uses a static camera to optically track a passive physical gameboard. This makes it possible to determine a six-degree-of-freedom position and orientation of the gameboard relative to the camera.
  • the virtual objects are displayed on an external monitor, with the virtual objects overlaid on the actual view from the camera.
  • the camera is assumed to be in a specific fixed pose relative to the direction of gravity. This fixed pose can be either a hardwired pose or a pose that is calibrated prior to gameplay.
  • Additional examples of a virtual marble labyrinth game use a display with a known or assumed orientation.
  • One implementation allows the user to manipulate a virtual gameboard using an input device such as a mouse. This approach does not provide the natural feel of controlling a virtual gameboard in the same manner as one would feel while controlling a physical gameboard.
  • Another implementation allows the user to manipulate a game controller containing an orientation tracker, effectively changing the direction of gravity in the game, but without changing the visual orientation of the gameboard.
  • Yet another example of the virtual marble labyrinth game uses a gameboard and head-worn unit.
  • the gameboard and the head-worn unit are each individually tracked in six-degrees of freedom relative to a real world reference.
  • This implementation requires that additional infrastructure for tracking the gameboard be added to the environment or added directly to the gameboard. Adding such additional infrastructure to the environment limits where the game can be played. Adding such additional infrastructure to the gameboard makes it heavier or more fragile. In both cases, adding additional infrastructure can make the system more expensive, more complicated, and more awkward for the user.
  • Although the computer interaction system and the method of facilitating interaction between an interaction delivery device and physical objects in an environment presented herein are described in the context of developing and playing augmented reality computer games, use of the systems and methods of the presently disclosed subject matter is not limited thereto.
  • the presently disclosed systems and methods can be employed for other uses, such as, but not limited to, pedagogical computer environments for teaching physics, controls for user interfaces, pedagogical “virtual reality” computer environments for medical training, pilot training, etc.
  • the disclosed subject matter can be employed to develop and play entirely virtual games using a physical object as an interface to control virtual objects within an immersive virtual environment.
  • the computer interaction system includes an interaction delivery device and a computer.
  • the interaction delivery device can include a display device configured to display virtual objects, a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference.
  • the computer can include a processor and a memory, and can be configured to process at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism. Further, the computer can be configured to compensate for movement relative to the physical object of at least one of the first tracking mechanism and the second tracking mechanism and to output virtual object information to the display.
  • FIGS. 1-12 For purpose of explanation and illustration, and not limitation, an exemplary embodiment of the computer interaction system in accordance with the application is shown in FIGS. 1-12 .
  • FIG. 1 depicts a computer interaction system that includes a head-worn unit 10 (an interaction delivery device) and a computer 14 that can be connected to the head-worn unit 10 wirelessly or through a cable.
  • the computer interaction system can include a gameboard 16 (or other physical object) that a user can manipulate to move in an x-direction, a y-direction, and a z-direction with respect to head-worn unit 10 or to rotate to a new orientation.
  • the head-worn unit 10 can move in an a-direction, a b-direction, and a c-direction or rotate to a new orientation as the user's head moves.
  • a force of gravity can act upon the computer interaction environment in the direction of a true virtual gravity vector pointing downward relative to earth 25 , as indicated in FIG. 1 .
  • the true virtual gravity vector acts with the same magnitude and direction as the physical gravitational force acting on the user and gameboard (i.e., toward earth 25 ).
  • “true” indicates a natural and correct direction and magnitude, as would exist in real life.
  • the virtual gravity vector can be configured to be a not true virtual gravity vector, where, as used herein, “not true” indicates that the vector has at least one of a direction and magnitude that is different from the natural and correct direction and magnitude of the physical force, as would exist in real life.
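  • As an illustrative sketch only (Python with NumPy; the function name, parameters, and the altered magnitude are assumptions and not part of the present application), a simulation could be handed either a true virtual gravity vector matching the measured physical gravity or a deliberately altered, not true one:

    import numpy as np

    def make_virtual_gravity(physical_gravity_dir, true_gravity=True,
                             magnitude=9.81, override_dir=None):
        # physical_gravity_dir: unit vector of the measured physical gravity,
        #                       expressed in the simulation's coordinate frame.
        if true_gravity:
            # Virtual gravity matches physical gravity in direction and magnitude.
            return magnitude * np.asarray(physical_gravity_dir, dtype=float)
        # "Not true" gravity: different direction and/or magnitude.
        direction = override_dir if override_dir is not None else physical_gravity_dir
        return 0.5 * magnitude * np.asarray(direction, dtype=float)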
  • FIG. 2A depicts the front side of head-worn unit 10 .
  • This embodiment depicts a modified version of the Vuzix Wrap™ 920AR, but alternative devices can be used as head-worn unit 10.
  • the head-worn unit 10 can include one or more optical sensors 11 (first tracking mechanisms) for detecting optical markers on a gameboard (see, e.g., optical marker 23 on gameboard 16 in FIG. 3 ).
  • Optical sensors 11 detect the position and orientation of optical markers and transmit that information to the computer (not shown).
  • head-worn unit 10 can include a head tracker 12 (a second tracking mechanism) for detecting at least one of the orientation and position of head tracker 12 relative to a reference 24 .
  • Reference object 24 can be a position on earth 25 or an object moving relative to earth 25 such as a car, a boat, or a plane.
  • Head tracker 12 can be, but is not limited to being, rigidly fixed relative to at least one optical sensor 11 . Accordingly, one or more of the orientation of optical sensors 11 and the position of optical sensors 11 can be readily determined from the orientation and position of head tracker 12 if head tracker 12 is rigidly fixed relative to at least one optical sensor 11 .
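  • When head tracker 12 is rigidly fixed relative to optical sensor 11, the sensor's pose follows from the tracker's pose by composing it with the fixed offset. A minimal sketch, assuming 3x3 rotation matrices in NumPy and a known tracker-to-sensor offset (the offset values themselves are not part of the present application):

    import numpy as np

    def optical_sensor_pose(tracker_R, tracker_p, offset_R, offset_p):
        """Compose the head tracker pose with a fixed tracker-to-sensor offset.

        tracker_R, tracker_p: orientation and position of head tracker 12
                              relative to reference 24.
        offset_R, offset_p:   fixed orientation and position of optical
                              sensor 11 relative to head tracker 12.
        Returns the orientation and position of optical sensor 11 relative
        to reference 24.
        """
        sensor_R = tracker_R @ offset_R
        sensor_p = tracker_p + tracker_R @ offset_p
        return sensor_R, sensor_p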
  • FIG. 2B depicts the back side of head-worn unit 10 .
  • Head tracker 12 can be disposed on the back side of head-worn unit 10 .
  • display devices 13 can be disposed on the back side of head-worn unit 10 .
  • Display devices 13 can be configured to display augmented reality images 2 based on augmented reality information output by computer 14 .
  • Augmented reality images 2 can include physical objects, such as gameboard 16 , and virtual objects 15 , such as ball 15 a and obstacles 15 b as depicted in FIG. 4 .
  • Display devices 13 can also be configured to allow a user to directly view physical objects and to overlay virtual objects on the physical environment viewed by the user. Alternatively or additionally, display devices 13 can be configured to display entirely virtual environments.
  • FIG. 3 depicts board 16 , which can, for example, serve as a gameboard.
  • Board 16 can have a plurality of optical markers 23 .
  • the optical markers on board 16 are not limited to an array of optical markers 23 as depicted in FIG. 3 .
  • optical markers may include text, barcodes, patterns, colors, or other indicia that can be identified by an optical sensor.
  • the markers can be integrated within the gameboard to create a visually appealing appearance that can, for example, simulate an environment consistent with the theme of the video game.
  • board 16 can have designs that include a more general set of optical features (color, patterns, shapes, images, etc.) that can be recognized by optical tracking software using markerless optical tracking.
  • board 16 can include natural or artificial physical features (texture, protrusions, holes, edges, grooves, notches, perforations, dips, divots, cavities, slits, etc.) that can be recognized by optical tracking software using optical feature tracking.
  • the system can be configured to pre-store information regarding these optical and physical features prior to tracking, but it is not necessary that the system pre-store this information.
  • FIG. 4 further depicts the computer interaction system including virtual objects 15 , such as ball 15 a , obstacles 15 b , and ramp 15 c , or other virtual objects that can affect motion of at least one virtual object 15 .
  • Virtual objects 15 are not limited to balls, obstacles, and ramps, however. Virtual objects 15 can be any number of objects found in virtual environments, such as pools, pits, prizes, bumpers, accelerators, edges, walls, etc.
  • Board 16 , ball 15 a , and obstacles 15 b are shown as they appear in the displays of head-worn unit 10 (see FIG. 2B ).
  • ball 15 a is simulated to engage in motion consistent with a true virtual gravity vector, substantially identical to the true physical gravity vector.
  • the computer interaction environment may also include virtual objects 15 , such as obstacles 15 b , which alter the motion of ball 15 a .
  • obstacles 15 b can be configured to stop, decelerate, accelerate, bounce, deform, change direction, or engage in any number of other changes in motion or characteristic properties.
  • ball 15 a can be configured to move as if it were a physical ball encountering physical obstacles on board 16 , even though ball 15 a and obstacles 15 b are virtual objects. Additionally, a variety of nonphysical behaviors (shrinking, growing, color change, gravitational shifting, acceleration, deceleration, disappearing, etc.) can be simulated when the ball 15 a is in contact with or in close proximity to other virtual objects 15 .
  • FIG. 5 is a schematic diagram depicting the flow of information between head-worn unit 10 and computer 14 .
  • Optical sensors 11 can output images or video streams to computer 14 .
  • one optical sensor 11 can output video to computer 14 , which computer 14 can process to create at least one of position and orientation data for gameboard 16 , as will be described herein.
  • another optical sensor 11 can output video that can be combined with virtual objects 15 to generate an augmented reality environment.
  • One or more of optical sensors 11 can be configured to output video to computer 14 for both processing to generate at least one of position and orientation data and for combining with virtual objects 15 to generate an augmented reality environment.
  • Head tracker 12 can track the position and orientation of head-worn unit 10 .
  • Head tracker 12 can include one or more of but is not limited to, a three-axis accelerometer, a magnetometer, a gyroscope, and a full six-degrees-of-freedom tracker. Further, head tracker 12 can detect the relative direction of the true physical gravity vector acting on head-worn unit 10 . Head tracker 12 can output this position information and orientation information, which includes information regarding the relative direction of the true physical gravity vector, to computer 14 . Computer 14 can then process the information to create augmented reality information, as will be described herein. Alternatively, computer 14 can be configured to create purely virtual reality information. Computer 14 can then output the augmented reality or virtual reality information to display screens 13 .
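  • One common way to recover the direction of the true physical gravity vector from a three-axis accelerometer is to average its readings while the tracker is at rest (or over intervals long enough that head motion cancels out) and negate the result. The sketch below (Python with NumPy) illustrates that averaging approach only; it is not put forward as the method of the present application.

    import numpy as np

    def gravity_direction_from_accel(accel_samples):
        # accel_samples: iterable of 3-axis accelerometer readings (m/s^2).
        # At rest, the accelerometer measures the reaction to gravity, so the
        # gravity direction is opposite to the mean measurement.
        mean = np.mean(np.asarray(accel_samples, dtype=float), axis=0)
        return -mean / np.linalg.norm(mean)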
  • the augmented reality information can be, but is not limited to, an augmented reality video stream including a sequence of augmented reality images.
  • the virtual reality information can be, but is not limited to, a virtual reality video stream including a sequence of virtual reality images.
  • Display screens 13 can be configured to display one or more of virtual reality information and augmented reality information. Further, display screens 13 can be optically transparent or disposed at a periphery of a user's field of vision, allowing a user to view the physical environment directly, and can be configured to overlay virtual objects 15 over the directly viewed physical environment.
  • computer 14 can be separate from head-worn-unit 10
  • head-worn unit 10 can include computer 14 .
  • Computer 14 can be any processing device. Further, one or more of optical sensors 11 and head tracker 12 can be configured to output one or more of position information, orientation information, or information regarding the relative position of the true physical gravity vector, alone or in combination with other information, to computer 14 .
  • FIG. 6 is a schematic diagram of computer 14 .
  • Computer 14 can include, but is not limited to, a processor 17 , memory 18 , input unit 21 , and output unit 22 .
  • Memory 18 can include, but is not limited to, one or more of a game engine 19 and a physics engine 20 .
  • Game engine 19 can be software stored in memory 18 and can be used for generating the graphics associated with the virtual objects 15 and the virtual objects' interactions with physical objects in the computer interaction environment. Accordingly, although this embodiment describes that memory 18 can include game engine 19 , it is not necessary that memory 18 includes game engine 19 .
  • physics engine 20 can be software stored in memory 18 and can be used to simulate physical interactions in the computer interaction system.
  • physics engine 20 can be software stored in memory 18 , this is not necessary.
  • Physics engine 20 can be a separate processor in computer 14 or a combination of a separate processor in computer 14 and software.
  • processor 17 can include physics engine 20 .
  • Physics engine 20 can perform the calculations needed to model the physical interactions of virtual objects 15 with additional virtual objects and, if desired, with physical objects, according to the laws of physics.
  • the present embodiment can include the Newton Game Dynamics (http://www.newtondynamics.com) physics engine, but other physics engines can be used, such as Havok Physics (http://www.havok.com), NVIDIA PhysX (http://www.nvidia.com/object/physx_ne.html), etc.
  • Input unit 21 and output unit 22 can link computer 14 to head-worn unit 10 .
  • Input unit 21 and output unit 22 can be independent of each other or can be integrated together. Further, input unit 21 and output unit 22 can be wireless communication devices or wired communication devices.
  • FIG. 7 outlines the operation of receiving orientation and position information and determining relevant information.
  • Memory 18 can store tracking software that can instruct processor 17 to process the information (e.g., video, images, data, etc.) output by optical sensors 11 in S 1 -A and S 1 -C.
  • the tracking software can instruct processor 17 to process the location of optical markers 23 in the video stream or other data and to determine one or more of the position and orientation of board 16 relative to optical sensors 11 .
  • the software can instruct processor 17 to store the position and orientation of board 16 relative to optical sensors 11 at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds).
  • S 1 -A and S 1 -C can occur separately or simultaneously.
  • a similar process occurs in S 1 -B and S 1 -D with respect to tracking data output from head tracker 12 .
  • the tracking software can instruct processor 17 to process the location of head tracker 12 , if necessary, and to determine one or more of the position and orientation of head tracker 12 relative to reference 24 .
  • the software can instruct processor 17 to store the position and orientation of head tracker 12 relative to reference 24 in memory 18 at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds).
  • S 1 -B and S 1 -D can occur separately or simultaneously. If necessary, processor 17 can store information about the relative positions of head tracker 12 and optical trackers 11 in memory 18 .
  • the tracking software can instruct the processor to determine a board movement vector, as shown in S 2 -A.
  • the processor can also determine a board rotation tensor for the successive change in orientation of board 16 over each discrete time interval as shown in S 2 -C.
  • the tracking software can instruct processor 17 to process one or more of the orientation and position information output by head tracker 12 to determine the position and orientation of head tracker 12 relative to reference 24 .
  • Processor 17 can also determine one or more of a head tracker movement vector and a head tracker rotation tensor for each successive change in position and orientation of head tracker 12 , respectively, as shown in S 2 -B and S 2 -D. If head tracker 12 is not configured to output information regarding the relative direction of the true physical gravity vector to computer 14 , the tracking software can instruct processor 17 to determine the direction of the true physical gravity vector relative to head tracker 12 from the position and orientation of head tracker 12 relative to reference 24 as shown in S 2 -E.
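  • A minimal sketch of how the movement vectors and rotation tensors of S 2 -A through S 2 -D could be computed from two successive stored poses, assuming positions as NumPy vectors and orientations as 3x3 rotation matrices (the present application does not prescribe a particular representation):

    import numpy as np

    def pose_deltas(p1, R1, p2, R2):
        """Movement vector and rotation tensor between two successive samples.

        p1, R1: position and orientation at the first time.
        p2, R2: position and orientation one predetermined interval later
                (e.g., 0.03 seconds).
        """
        movement_vector = p2 - p1    # change in position over the interval
        # rotation carrying the first orientation into the second:
        rotation_tensor = R2 @ R1.T
        return movement_vector, rotation_tensor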
  • the tracking software can also instruct processor 17 to perform a compensation procedure to account for motion of optical sensors 11 , as shown in FIG. 8 .
  • processor 17 can compare the magnitude of head tracker movement vector with a predetermined threshold distance stored in memory 18 .
  • This predetermined threshold distance stored in memory 18 can be determined based on the particular implementation of the application (e.g., smaller thresholds for complex simulations such as simulated medical training, and larger thresholds for simple simulations such as basic physics simulations). If the magnitude of head tracker movement vector exceeds or is equal to the predetermined threshold distance, processor 17 can determine that head tracker 12 has changed position. If the magnitude of head tracker movement vector is less than the predetermined threshold distance, processor 17 can determine that head tracker 12 has not changed position.
  • processor 17 can determine a resultant rotation from head tracker rotation tensor and compare the resultant rotation of head tracker 12 with a predetermined threshold rotation value stored in memory 18 as shown in S 4 -A.
  • This predetermined threshold rotation value stored in memory 18 can be determined based on the particular implementation of the application (e.g., smaller thresholds for complex simulations such as simulated medical training, and larger thresholds for simple simulations such as basic physics simulations). If the resultant rotation of head tracker 12 exceeds or is equal to the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has changed both position and orientation. If the resultant rotation of head tracker 12 is less than the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has not changed orientation but has changed position.
  • processor 17 can determine a resultant rotation from head tracker rotation tensor and compare the resultant rotation of head tracker 12 with a predetermined threshold rotation value stored in memory 18 as shown in S 3 -C. If the resultant rotation of head tracker 12 exceeds or is equal to the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has changed orientation but not position. If the resultant rotation of head tracker 12 is less than the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has not changed position or orientation.
  • one or more of the comparisons of S 3 -A, S 3 -C, and S 4 -A can occur simultaneously or as part of the same process. It is not necessary to separately compare relative rotation information and movement information of head tracker 12 . Accordingly, relative rotation information and movement information of head tracker 12 can be compared with a combined threshold value.
  • processor 17 can apply adjustments to the corresponding board movement vector and the corresponding board rotation tensor to account for the change in position and orientation of head tracker 12 and optical sensors 11 , as described in S 5 -A.
  • the adjustments include subtracting the head tracker movement vector from the board movement vector for corresponding discrete time intervals and applying frame rotations to correct for changes in orientation in the proper sequence. This adjustment produces an adjusted board movement vector and an adjusted board rotation tensor which more accurately model motion of board 16 .
  • processor 17 can apply adjustments to the corresponding board movement vector to account for the change in position of head tracker 12 and optical sensors 11 , as described in S 5 -B.
  • the position changes of optical sensors 11 can be determined by applying the proper sequence of translations.
  • the adjustments include subtracting the head tracker movement vector from the board movement vector for corresponding discrete time intervals. This adjustment produces an adjusted board movement vector which more accurately models motion of board 16 and retains the previously determined board rotation tensor.
  • processor 17 can apply adjustments to the corresponding board rotation tensor to account for the change in orientation of head tracker 12 and optical sensors 11 , as described in S 5 -C.
  • the orientation changes of optical sensors 11 can be determined by applying the proper sequence of rotations.
  • the adjustments include applying coordinate transformations to the board rotation tensor which remove effects of the resultant rotation of the second tracking mechanism in the proper sequence. This adjustment produces an adjusted board rotation tensor which more accurately models motion of board 16 and retains the previously determined board movement vector.
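  • The application does not prescribe a single sequence of translations and frame rotations for the compensation of S 5 -A through S 5 -C; the following is a minimal sketch, under assumed frame conventions, of how the head tracker's own motion could be removed from the board measurements. The vector subtraction and the left-multiplication by the inverse head rotation are choices made for this sketch, not the claimed method.

```python
import numpy as np

def adjust_board_motion(board_movement, board_rotation,
                        head_movement, head_rotation):
    """Remove head tracker motion from the board measurements for one interval.

    board_movement, head_movement: 3-vectors observed over the same discrete
    time interval.  board_rotation, head_rotation: 3x3 relative rotations over
    that interval.  The composition order below is an assumed convention; a
    real system must match its own frame definitions.
    """
    adjusted_movement = board_movement - head_movement      # translation part
    adjusted_rotation = head_rotation.T @ board_rotation    # rotation part
    return adjusted_movement, adjusted_rotation
```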
  • If processor 17 determines that both the position and orientation of head tracker 12 have not changed, processor 17 is not instructed to perform an adjustment and proceeds directly to simulation.
  • processor 17 can use the information regarding the relative direction of the true physical gravity vector determined in S 2 -E, the adjusted board movement vector, and the adjusted board rotation tensor to simulate the motion of a virtual object 15 , such as ball 15 a , under a true virtual gravity vector based on the actual motion of board 16 , as described in S 6 -A.
  • processor 17 can use the information about the relative direction of the true physical gravity vector determined in S 2 -E, the adjusted board movement vector, and the unadjusted board rotation tensor to simulate the motion of a virtual object 15 , such as ball 15 a , under a true virtual gravity vector based on the actual motion of board 16 , as described in S 6 -B.
  • processor 17 can use the information about the relative direction of the true physical gravity vector determined in S 2 -E, the unadjusted board movement vector, and the adjusted board rotation tensor to simulate the motion of a virtual object 15 , such as ball 15 a , under a true virtual gravity vector based on the actual motion of board 16 , as described in S 6 -C.
  • processor 17 can use the information about the relative direction of the true physical gravity vector determined in S 2 -E, the unadjusted board movement vector, and the unadjusted board rotation tensor to simulate the motion of a virtual object 15 , such as ball 15 a , under a true virtual gravity vector based on the actual motion of board 16 , as described in S 6 -D.
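  • As a rough, non-limiting illustration of the simulation steps S 6 -A through S 6 -D, the sketch below advances a virtual ball on the board under a gravity vector expressed in the board's coordinate frame. The explicit Euler integration, the assumption that the board normal is the local z-axis, and the variable names are all choices of this sketch; a production system would more likely delegate this step to a physics engine.

```python
import numpy as np

def step_ball(ball_pos, ball_vel, board_rotation_world, gravity_world, dt):
    """Advance the virtual ball by one time step (explicit Euler, assumed).

    ball_pos, ball_vel: ball state expressed in board coordinates.
    board_rotation_world: 3x3 orientation of the board in the world frame.
    gravity_world: true physical gravity vector in the world frame,
                   e.g. np.array([0.0, 0.0, -9.81]).
    """
    # Express gravity in board coordinates so the ball rolls "downhill"
    # whichever way the user tilts the board.
    gravity_board = board_rotation_world.T @ gravity_world
    # Keep only the in-plane component (z assumed to be the board normal),
    # i.e. the ball is constrained to stay on the board surface.
    gravity_in_plane = gravity_board.copy()
    gravity_in_plane[2] = 0.0
    ball_vel = ball_vel + gravity_in_plane * dt
    ball_pos = ball_pos + ball_vel * dt
    return ball_pos, ball_vel
```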
  • displays 13 can display a combination of virtual objects 15 and physical objects, such as board 16 , in real-time. Alternatively, displays 13 can display only virtual objects 15 .
  • For purpose of explanation and illustration, and not limitation, another exemplary embodiment of the interaction system in accordance with the application is shown in FIGS. 9A-12. For brevity, only the aspects of this exemplary embodiment that differ from the previously described embodiment will be described.
  • FIG. 9A depicts the front side of a head-worn unit 110 .
  • the head-worn unit 110 can include a pair of optical sensors 111 (first tracking mechanisms) for detecting optical markers on a gameboard (see, e.g., optical marker 23 on gameboard 16 in FIG. 3 ).
  • Optical sensors 111 detect the position and orientation of optical markers and transmit that information to the processing device 114 (not shown).
  • head-worn unit 110 can include a head motion detector 126 (a detecting mechanism) for detecting at least one of an acceleration of the head motion detector 126 , a velocity of the head motion detector 126 , and an orientation of head motion detector 126 .
  • Head motion detector 126 can be rigidly fixed relative to optical sensors 111 , but is not limited to being so fixed as long as the position of head motion detector 126 relative to at least one optical sensor 111 is predetermined. Accordingly, the orientation, position, and motion of at least one optical sensor 111 can be readily determined from the motion of head motion detector 126 if head motion detector 126 is rigidly fixed relative to at least one optical sensor 111 , or if the orientation and position of head motion detector 126 relative to at least one optical sensor 111 is otherwise known. Alternatively or additionally, head motion detector 126 can be configured to detect a correct direction and magnitude of a physical gravity vector.
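  • Because the offset between head motion detector 126 and optical sensor 111 is predetermined, the sensor's pose can be recovered by composing the detector's pose with that fixed offset. The sketch below illustrates this with 4x4 homogeneous transforms; the matrix representation and multiplication order are assumed conventions of the sketch, not requirements of the application.

```python
import numpy as np

def sensor_pose_from_detector(detector_pose_world, sensor_in_detector):
    """Pose of optical sensor 111 expressed in the reference (world) frame.

    detector_pose_world: 4x4 homogeneous pose of head motion detector 126
                         relative to the reference frame.
    sensor_in_detector:  4x4 fixed, predetermined pose of the sensor
                         relative to the detector.
    """
    return detector_pose_world @ sensor_in_detector
```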
  • head motion detector 126 can alternatively include one or more distinctly separate motion detectors each configured to detect one or more of a correct direction and magnitude of a physical gravity vector, an acceleration of the head motion detector 126 , a velocity of the head motion detector 126 , and an orientation of head motion detector 126 .
  • Head motion detector 126 can include accelerometers, gyros, magnetometers, etc.
  • FIG. 9B depicts the back side of head-worn unit 110 .
  • Head motion detector 126 can be disposed on the back side of head-worn unit 110 .
  • display devices 113 can be disposed on the back side of head-worn unit 110 .
  • Display devices 113 can be configured to display augmented reality images or virtual reality images as described in other embodiments of the application.
  • FIG. 10 is a schematic diagram depicting the flow of information between head-worn unit 110 and processing device 114 .
  • Optical sensors 111 can output images or video streams to processing device 114 , as described in other embodiments of the application.
  • Processing device 114 performs functions similar to computer 14 (e.g., FIG. 5 ), as described in the other embodiments of the application.
  • Head motion detector 126 can track the motion of head-worn unit 110 .
  • Head motion detector 126 can include one or more of, but is not limited to, a three-axis accelerometer, a three-axis magnetometer, a three-axis gyroscope, and a full six-degrees-of-freedom tracker.
  • head motion detector 126 can detect the relative direction of the true physical gravity vector acting on head-worn unit 110 and head motion detector 126 . Head motion detector 126 can output this motion information, which includes information regarding the relative direction of the true physical gravity vector, to processing device 114 . Processing device 114 can then process the information to create augmented reality information, as will be described herein. Alternatively, processing device 114 can be configured to create purely virtual reality information. Processing device 114 can then output the augmented reality or virtual reality information to display screens 113 .
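  • One common technique, assumed here purely for illustration, for estimating the relative direction of the true physical gravity vector from a three-axis accelerometer is to low-pass filter its readings: when head-worn unit 110 is at rest or moving slowly, the filtered specific-force reading points opposite to gravity. The class name, smoothing factor, and sign convention below are assumptions of this sketch.

```python
import numpy as np

class GravityEstimator:
    """Low-pass-filtered gravity direction from accelerometer samples."""

    def __init__(self, alpha=0.05):
        self.alpha = alpha        # exponential smoothing factor (assumed)
        self.filtered = None

    def update(self, accel_sample):
        """Return a unit vector toward gravity in the detector's frame."""
        accel_sample = np.asarray(accel_sample, dtype=float)
        if self.filtered is None:
            self.filtered = accel_sample
        else:
            self.filtered = ((1.0 - self.alpha) * self.filtered
                             + self.alpha * accel_sample)
        # An accelerometer at rest measures specific force, i.e. -gravity.
        gravity = -self.filtered
        return gravity / np.linalg.norm(gravity)
```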
  • the augmented reality information can be, but is not limited to, an augmented reality video stream including a sequence of augmented reality images. Further, the virtual reality information can be, but is not limited to, a virtual reality video stream including a sequence of virtual reality images.
  • Although processing device 114 can be separate from head-worn unit 110 , head-worn unit 110 can include processing device 114 .
  • Processing device 114 can be any processing device.
  • one or more of optical sensors 111 and head motion detector 126 can be configured to output one or more of position information, orientation information, information regarding the relative direction of the true physical gravity vector, and head motion detector motion information, alone or in combination with other information, to processing device 114 .
  • FIG. 11 outlines the operation of receiving orientation, position, and motion information and determining relevant information.
  • Processing device 114 can process the information (e.g., video, images, data, etc.) output by optical sensors 111 in S 11 -A and S 11 -B. Specifically, when optical sensors 111 output a video stream or other data to processing device 114 , processing device 114 can process the location of optical markers in the video stream or other data and can determine one or more of the position and orientation of a gameboard relative to optical sensors 111 . Processing device 114 can store the position and orientation of the gameboard relative to optical sensors 111 at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds). S 11 -A and S 11 -B can occur separately or simultaneously.
  • a similar process occurs in S 11 -C and S 11 -D with respect to head motion detector motion information and physical gravity vector information output from head motion detector 126 .
  • Processing device 114 can process the motion information of head motion detector 126 , if necessary, to determine one or more of the acceleration and velocity of head motion detector 126 relative to a reference frame.
  • Processing device 114 can store the acceleration and velocity of head motion detector 126 relative to a reference frame at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds).
  • S 11 -C and S 11 -D can occur separately or simultaneously. If necessary, processing device 114 can store information about the relative positions of head motion detector 126 and optical sensors 111 .
  • processing device 114 can determine a board movement vector, as shown in S 12 -A. Processing device 114 can also determine a board rotation tensor for the successive change in orientation of the gameboard over each discrete time interval as shown in S 12 -B.
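  • For illustration only, given the gameboard poses stored at two successive discrete time intervals, the board movement vector of S 12 -A and the board rotation tensor of S 12 -B could be computed as shown below. Representing orientation as a 3x3 rotation matrix and the particular relative-rotation formula are assumptions of this sketch.

```python
import numpy as np

def board_motion_between_samples(pos_prev, rot_prev, pos_curr, rot_curr):
    """Board movement vector and rotation tensor over one discrete interval.

    pos_prev, pos_curr: 3-vector positions of the gameboard relative to the
    optical sensors at consecutive sample times.
    rot_prev, rot_curr: 3x3 orientations of the gameboard at those times.
    """
    movement_vector = pos_curr - pos_prev
    # Relative rotation taking the previous orientation to the current one
    # (left-multiplication convention assumed for this sketch).
    rotation_tensor = rot_curr @ rot_prev.T
    return movement_vector, rotation_tensor
```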
  • If the magnitude of acceleration of head motion detector 126 over a predetermined time is greater than or equal to a predetermined threshold acceleration value, processing device 114 can determine that head motion detector 126 has changed at least one of position and orientation. If the magnitude of acceleration of head motion detector 126 over the predetermined time is less than the predetermined threshold acceleration value, processing device 114 can determine that head motion detector 126 has not changed position and orientation, unless processing device 114 determines that head motion detector 126 has changed at least one of position and orientation as a result of a velocity comparison in S 23 , which will be explained herein.
  • the adjustments can include one or more heuristics for compensation, including, but not limited to: ignoring changes in one or more of board position information and board orientation information over the corresponding discrete time interval when simulating the physics of the virtual reality object, while simulating a physically accurate field of vision.
  • These adjustments produce at least one of an adjusted board movement vector and an adjusted board rotation tensor which more accurately model motion of the gameboard.
  • Processing device 114 performs a comparison of velocity values in S 23 if processing device 114 determines the velocity of head motion detector 126 relative to the reference frame in S 12 -C.
  • processing device 114 can compare the magnitude of velocity of head motion detector 126 over a predetermined time with a predetermined threshold velocity value.
  • This predetermined threshold velocity value can be determined based on the particular implementation of the application (e.g., smaller thresholds for complex simulations such as simulated medical training, and larger thresholds for simple simulations such as basic physics simulations). If the magnitude of velocity of head motion detector 126 over the predetermined time is greater than or equal to the predetermined threshold velocity value, processing device 114 can determine that head motion detector 126 has changed at least one of position and orientation.
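  • A compact, non-limiting sketch of the acceleration test of S 13 and the velocity test of S 23 follows; treating the detector as moved when either magnitude reaches its threshold is an assumption of the sketch, since the application also permits the two tests to be implemented independently.

```python
import numpy as np

def head_motion_detected(acceleration, velocity,
                         accel_threshold, velocity_threshold):
    """True if head motion detector 126 is deemed to have changed pose.

    acceleration, velocity: 3-vectors measured over the predetermined time
    (velocity may be None if only the acceleration test of S 13 is used).
    """
    accel_exceeds = np.linalg.norm(acceleration) >= accel_threshold
    vel_exceeds = (velocity is not None
                   and np.linalg.norm(velocity) >= velocity_threshold)
    return accel_exceeds or vel_exceeds
```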
  • processing device 114 can apply adjustments to the corresponding at least one of board movement vector and board rotation tensor to account for the change in at least one of position and orientation of head motion detector 126 and optical sensors 111 , as described in S 24 -A.
  • the orientation changes and position changes of optical sensors 111 can be determined by applying the proper adjustments.
  • the adjustments include one or more of subtracting the head tracker movement vector from the board movement vector for corresponding discrete time intervals and applying frame rotations to correct for changes in orientation in the proper sequence. This adjustment produces at least one of an adjusted board movement vector and an adjusted board rotation tensor which more accurately model motion of the gameboard.
  • processing device 114 can use the information about the relative direction of the true physical gravity vector received in S 11 -D and one or more of the adjusted board movement vector determined in S 14 -A or S 24 -A, and the adjusted board rotation tensor determined in S 14 -A or S 24 -A to simulate the motion of virtual objects under a true virtual gravity vector based on the actual motion of the gameboard, as described in S 15 -A (comparing acceleration) or S 25 -A (comparing velocity).
  • If an adjusted board movement vector is determined in both S 14 -A and S 24 -A, the processor can be programmed to select one of the adjusted board movement vector determined in S 14 -A and the adjusted board movement vector determined in S 24 -A; alternatively, the processor can be programmed to implement only one of S 15 -A and S 25 -A. If an adjusted board rotation tensor is determined in both S 14 -A and S 24 -A, the processor can be programmed to select one of the adjusted board rotation tensor determined in S 14 -A and the adjusted board rotation tensor determined in S 24 -A; alternatively, the processor can be programmed to implement only one of S 15 -A and S 25 -A.
  • If processing device 114 infers that both the position and orientation of head motion detector 126 have not changed as a result of acceleration (S 13 ) or velocity (S 23 ), processing device 114 does not perform an adjustment and proceeds directly to simulation.
  • processing device 114 can use the information about the relative direction of the true physical gravity vector received in S 11 -D, the unadjusted board movement vector determined in S 12 -A, and the unadjusted board rotation tensor determined in S 12 -B to simulate the motion of virtual objects under a true virtual gravity vector based on the actual motion of the gameboard, as described in S 15 -B (comparing acceleration) or S 25 -B (comparing velocity).
  • After processing device 114 performs one or more of S 15 -A, S 25 -A, S 15 -B and S 25 -B to simulate the motion of at least one virtual object, processing device 114 outputs this simulation as augmented reality information or virtual reality information to head-worn unit 110 .
  • displays 113 can display a combination of virtual objects and physical objects, such as the gameboard, in real-time. Alternatively, displays 113 can display only virtual objects.
  • a virtual gravitational force can act on ball 15 a as natural gravity would act on a physical object placed on board 16 .
  • the virtual force of gravity will appear to act in the true physical gravity vector direction with respect to board 16 .
  • interaction delivery devices other than head-worn unit 10 can be used in place of or in combination with head-worn unit 10 .
  • interaction delivery devices can include tracked physical displays that are held, worn on additional or alternative parts of the body, or mounted on objects in the environment.
  • one or more of the first tracking mechanism, the second tracking mechanism, and each display device, individually or in combination, can be disposed separately from the interaction delivery device.
  • the second tracking mechanism can include one or more of a three-axis accelerometer, a three-axis magnetometer, a three-axis gyroscope, and a full six-degrees-of-freedom tracker. Additionally, the second tracking mechanism can determine the relative direction of a true force vector acting in the force's natural direction (true physical vector direction) other than that of gravity, including, but not limited to, the true vectors of magnetic forces, electrostatic forces, friction, additional artificial forces, etc.
  • the simulated forces are not limited to gravitational forces applied in the true physical gravity vector direction.
  • the virtual gravity vector can be intentionally different from the true physical gravity vector (e.g., a not true virtual gravity vector).
  • the simulated gravitational force can have a different magnitude or act in a direction different from the true physical gravity vector.
  • the simulated force can include, but is not limited to, magnetic forces, electrostatic forces, and additional artificial forces in true or not true directions.
  • the virtual gravity vector can be an intentionally not true virtual gravity vector.

Abstract

A computer interaction system includes an augmented interaction device and a computer. The augmented interaction device includes a display device that displays augmented reality or virtual reality images, a first tracking mechanism that tracks a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism that tracks a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference. The computer includes a processor and a memory, and processes position and orientation information received from the first tracking mechanism, and position and orientation information received from the second tracking mechanism. The computer can be configured to compensate for movement relative to the physical object of the first and second tracking mechanisms, and to output augmented reality or virtual reality information to the display.

Description

    PRIORITY
  • This application claims the benefit of U.S. Provisional Application No. 61/322,521, filed Apr. 9, 2010, which is hereby incorporated by reference in its entirety.
  • FIELD
  • The present application relates to a computer interaction system and a method of facilitating interaction between an interaction delivery device and a physical object in an environment.
  • BACKGROUND
  • Computer games have been developed that account for gravity. An example of a game accounting for gravity is a virtual marble labyrinth game. In a virtual marble labyrinth game, a user tilts a gameboard having one or more virtual balls such that the virtual balls roll about the gameboard as if the virtual balls were real balls on the surface of the gameboard being influenced by a gravitational force and the user's actions. Additionally, the virtual balls can encounter a number of virtual objects that can affect the virtual balls' motion in the same manner as a physical object on the gameboard would affect the physical balls' motion. The objective of the virtual marble labyrinth game is to move the virtual balls to a desired position on the gameboard.
  • SUMMARY
  • One aspect of the presently disclosed subject matter provides an interaction delivery device that can include a display device, a first tracking mechanism, and a second tracking mechanism. The display device can be configured to display virtual objects. The first tracking mechanism can be configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism. The second tracking mechanism can be configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference. In certain embodiments, one or more of position information received from the first tracking mechanism and orientation information received from the first tracking mechanism can be used to generate motion data for the physical object. In certain embodiments, one or more of position information received from the second tracking mechanism and orientation information received from the second tracking mechanism can be used to generate adjusted motion data for the physical object. In certain embodiments, the adjusted motion data for the physical object is used to generate virtual object information. In certain embodiments, the virtual object information is received by the display.
  • Another aspect of the presently disclosed subject matter provides an interaction processing device that can include a processor, a memory, an input unit configured to receive information, and an output unit configured to output information. In certain embodiments, the input unit can be configured to receive at least one of physical object position information (e.g., data, video, images, etc.), physical object orientation information (e.g., data, video, images, etc.), tracking mechanism position information (e.g., data, video, images, etc.), and tracking mechanism orientation information (e.g., data, video, images, etc.). In certain embodiments, the processor can be configured to generate motion data for a physical object using at least one of the physical object position information and the physical object orientation information. In certain embodiments, the processor can be configured to generate adjusted motion data for the physical object using at least one of tracking mechanism position information and tracking mechanism orientation information. In certain embodiments, the adjusted motion data for the physical object can compensate for movement of a tracking mechanism relative to the physical object. In certain embodiments, the processor can be configured to generate virtual object information using the adjusted motion data for the physical object. In certain embodiments, the output unit can be configured to output the virtual object information.
  • Another aspect of the presently disclosed subject matter provides a computer interaction system that can include an interaction delivery device and a computer. The interaction delivery device can include a display device configured to display one or more of virtual objects, images, videos, and other information or media, a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference. The computer can include a processor and a memory, and can be configured to process one or more of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism. Further, in certain embodiments, the computer can be configured to compensate for movement relative to the physical object of at least one of the first tracking mechanism, the second tracking mechanism, and the display device. In some embodiments, the computer can be further configured to output virtual object information to the display device. The computer can be any processing device.
  • In one embodiment, the first tracking mechanism can be an optical tracking mechanism which can include one or more cameras. Further, the interaction delivery device can be configured to be head-worn and to display one or more of video see-through augmented reality (e.g., video depicting a physical environment viewed through one or more attached cameras that is augmented with additional virtual graphics), virtual reality video (e.g., video depicting an entirely virtual environment), and optical see-through augmented reality (e.g., viewing the physical environment directly, rather than through video, with virtual graphics overlaid on the field of vision by using optical elements such as mirrors, lenses, etc.). The second tracking mechanism can include one or more of a three-axis accelerometer and a magnetometer. Moreover, the second tracking mechanism can be configured to determine a true direction of a natural force, such as gravity or magnetism, acting on the second tracking mechanism. The second tracking mechanism can have a position that is rigidly fixed relative to the first tracking mechanism, or a position that is otherwise known relative to the first tracking mechanism.
  • In a further embodiment, the second tracking mechanism can be configured to comprise a six-degree-of-freedom tracker that can be configured to determine a three-dimensional position of the second tracking mechanism relative to the reference and a three-dimensional orientation of the second tracking mechanism relative to the reference. The reference can be the earth or fixed to the earth. Alternatively, the reference can be fixed to one or more additional physical objects moving relative to the earth.
  • In another embodiment, the second tracking mechanism can be configured to determine a true direction of a natural force vector. Further, the computer can be configured to simulate motion of virtual objects based on a true direction of a natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism. Alternatively or additionally, the computer can be configured to simulate motion of virtual objects based on a direction of force that is different from the true direction of a natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism.
  • In a further embodiment, the computer can further include one or more of a game engine and a physics engine, wherein the computer can be configured to simulate virtual objects. For example, the computer can be configured to simulate virtual forces acting, in a true physical direction, on the virtual objects based on one or more of the position information received from the first tracking mechanism, the orientation information received from the first tracking mechanism, the position information received from the second tracking mechanism, the orientation information received from the second tracking mechanism, acceleration information about the second tracking mechanism received from the second tracking mechanism, velocity information about the second tracking mechanism received from the second tracking mechanism, and true physical gravity vector information. The position information received from the first tracking mechanism can include a first position of the physical object relative to the first tracking mechanism at a first time and a second position of the physical object relative to the first tracking mechanism at a second time. The orientation information received from the first tracking mechanism can include a first orientation of the physical object relative to the first tracking mechanism at the first time and a second orientation of the physical object relative to the first tracking mechanism at the second time. The position and orientation information received from the first tracking mechanism can be data, video streams, images, or other forms of information. The position information received from the second tracking mechanism can include a first position of the second tracking mechanism relative to the reference at the first time and a second position of the second tracking mechanism relative to the reference at the second time. The orientation information received from the second tracking mechanism can include a first orientation of the second tracking mechanism relative to the reference at the first time and a second orientation of the second tracking mechanism relative to the reference at the second time.
  • Additionally, the second tracking mechanism can have a position rigidly fixed, or otherwise known, relative to the first tracking mechanism. A predetermined time can exist between the first time and the second time. Moreover, the computer can be configured to determine a physical object movement vector between the first position of the physical object and the second position of the physical object. The computer can be configured to determine a second tracking mechanism movement vector between the first position of the second tracking mechanism and the second position of the second tracking mechanism. Further, the computer can be configured to calculate an adjusted physical object movement vector by subtracting the second tracking mechanism movement vector from the physical object movement vector if a magnitude of the second tracking mechanism movement vector exceeds a predetermined threshold distance (e.g., the predetermined threshold distance can be set as moving about 0.02 meters in about 1 second). The computer can be configured to simulate virtual forces acting on the virtual objects based on at least the adjusted physical object movement vector if the magnitude of the second tracking mechanism movement vector exceeds, or is equal to, the predetermined threshold distance. Additionally, the computer can be configured to simulate virtual forces acting on the virtual objects based on at least the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance.
  • In yet another embodiment, the computer can be configured to determine a physical object rotation tensor between the first orientation of the physical object and the second orientation of the physical object. The computer can be configured to determine a second tracking mechanism rotation tensor between the first orientation of the second tracking mechanism and the second orientation of the second tracking mechanism. Additionally, the computer can be configured to calculate an adjusted physical object rotation tensor by applying a transformation based on a resultant rotation of the second tracking mechanism, if the resultant rotation of the second tracking mechanism exceeds, or is equal to, a predetermined threshold rotation value (e.g., a predetermined threshold rotation value of, for example, 3 degrees in a predetermined time of 1 second). The computer can be configured to simulate natural forces acting on the virtual objects based on at least the adjusted physical object rotation tensor if the resultant rotation of the second tracking mechanism exceeds, or is equal to, the predetermined threshold rotation value. Additionally, the computer can be configured to simulate natural forces acting on the virtual objects based on at least the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
  • Another aspect of the presently disclosed subject matter provides a non-transitory computer readable medium having computer readable instructions stored thereon, which, when executed by a computer having a processor to execute a plurality of processes, are configured to cause the processor to perform several functions. In certain embodiments, the computer readable instructions can cause the processor to obtain one or more of first tracking mechanism information and second tracking mechanism information. In certain embodiments, the computer readable instructions can cause the processor to determine one or more of a physical object movement vector using the first tracking mechanism information, a second tracking mechanism movement vector using the second tracking mechanism information, and direction of a true physical gravity vector relative to the second tracking mechanism. In certain embodiments, the computer readable instructions can cause the processor to simulate motion data of a virtual object. The processor can simulate motion data of the virtual object based on a true or not true virtual gravity vector using an adjusted physical object movement vector if a magnitude of the second tracking mechanism movement vector is greater than or equal to a predetermined threshold distance, the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance, and the direction of a true physical gravity vector relative to the second tracking mechanism. In certain embodiments, the computer readable instructions can cause the processor to output the motion data of the virtual object to a display.
  • In one embodiment, the non-transitory computer readable medium having computer readable instructions stored thereon, which, when executed by a computer having a processor to execute a plurality of processes, are configured further to cause the processor to determine one or more of a physical object rotation tensor using the first tracking mechanism information, and a second tracking mechanism rotation tensor using the second tracking mechanism information. The computer readable instructions can also cause the processor to simulate motion data of a virtual object using additionally: an adjusted physical object rotation tensor if a resultant rotation of the second tracking mechanism is greater than or equal to a predetermined threshold rotation value, and the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
  • The first tracking mechanism information can include a first position of a physical object relative to a first tracking mechanism and a first orientation of the physical object at a first time, and a second position of the physical object relative to the first tracking mechanism and a second orientation of the physical object at a second time. The second tracking mechanism information can include a first position of a second tracking mechanism relative to a reference and a first orientation of the second tracking mechanism at the first time, and a second position of the second tracking mechanism relative to the reference and a second orientation of the second tracking mechanism at the second time.
  • Another aspect of the presently disclosed subject matter provides a method of facilitating interaction between an interaction delivery device and a physical object in an environment, the method including generating one or more virtual objects in the environment; detecting a change in the physical object; determining whether the change in the physical object is based on a change in the state of the virtual objects and the physical object, or both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object; measuring a direction and effect (e.g., magnitude, etc.) of a natural force interacting with the environment; and updating the virtual objects based on a result of the determining and the measuring. In certain embodiments, the detecting can further include: detecting a change in position of the physical object over a given time, and detecting a change in position of the interaction delivery device over the given time. In certain embodiments, the determining can further include determining whether a magnitude of the change in position of the interaction delivery device over the given time exceeds, or is equal to, a predetermined threshold value, determining that the detected change in the physical object is based on a change in the state of the virtual objects and physical object if the change in position of the interaction delivery device over the given time is less than the predetermined threshold value, and determining that the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object if the change in position of the interaction delivery device over the given time exceeds or is equal to the predetermined threshold value. The updating can further include updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object over the given time if the detected change in the physical object is based on a change in the state of the virtual objects and the physical object, and updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object and adjusted to remove effects caused by the force applied to the interaction delivery device over the given time if the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object.
  • Another aspect of the presently disclosed subject matter provides an interaction system that can include a physical object, at least one virtual object, an interaction delivery device, and a processing device. The interaction delivery device can include: a tracking mechanism that can be configured to track at least one of a position of the physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism; a detecting mechanism configured to detect motion of the detecting mechanism, wherein a position of the detecting mechanism relative to a position of the tracking mechanism is predetermined; and a display configured to display the at least one virtual object. The processing device can be configured to receive physical object position information from the tracking mechanism. The processing device can be configured to receive detecting mechanism motion information from the detecting mechanism. The processing device can be configured to perform at least one of: determining if a magnitude of acceleration of the detecting mechanism is greater than a predetermined acceleration threshold value and generating adjusted physical object position information if the magnitude of acceleration of the detecting mechanism is greater than the predetermined acceleration threshold value; and determining if a magnitude of velocity of the detecting mechanism is greater than a predetermined velocity threshold value and generating adjusted physical object position information if the magnitude of velocity of the detecting mechanism is greater than the predetermined velocity threshold value. The processing device can be configured to generate motion information for the at least one virtual object based on the adjusted physical object position information if the processing device generates the adjusted physical object position information. The processing device can be configured to generate the motion information for the at least one virtual object based on the physical object position information if the processing device does not generate the adjusted physical object position information. The processing device can be configured to output the motion information for the at least one virtual object to the display.
  • The detecting mechanism can be configured to detect a correct direction and magnitude of a physical gravity vector. Additionally, the processing device can be further configured to generate the motion information for the at least one virtual object additionally based on the correct direction and magnitude of the physical gravity vector. Further, the tracking mechanism can be an optical tracking device. The physical object can be configured to not include attached or embedded electronic devices or components.
  • In one embodiment of the present aspect of the application, the at least one virtual object can be configured to move according to a virtual gravity vector, wherein the virtual gravity vector is substantially identical to the physical gravity vector. In another embodiment of the present aspect of the application, the at least one virtual object can be configured to move according to a virtual gravity vector, wherein the virtual gravity vector is different from the physical gravity vector.
  • Another aspect of the presently disclosed subject matter describes an interaction delivery system comprising: a first tracking mechanism rigidly attached to a display device configured to track the position and orientation of a physical object relative to the display device; a second tracking mechanism rigidly attached to the display device, configured to track the absolute orientation of the display device relative to the earth; and a processing device configured to process tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism, wherein the processing device is configured to simulate motion information of at least one virtual object from the tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism, wherein the processing device is configured to output the simulated motion information of the at least one virtual object to the display, wherein the display is configured to display the at least one virtual object disposed relative to the physical object, and wherein the at least one virtual object is configured to behave as if acted on by a virtual gravity vector in a same direction as a physical gravity vector acting on the physical object.
  • It is to be understood that both the foregoing general description and the following detailed description are exemplary and are intended to provide further explanation of the application claimed.
  • The accompanying drawings, which are incorporated in and constitute part of this specification, are included to illustrate and provide a further understanding of the apparatus and method of the application. Together with the written description, the drawings serve to explain the principles of the application.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 depicts a computer interaction system according to a non-limiting embodiment.
  • FIG. 2A depicts a front side of a head-worn unit, and FIG. 2B depicts a back side of the head-worn unit according to a non-limiting embodiment.
  • FIG. 3 depicts, in detail, a gameboard component of the computer interaction system depicted in FIG. 1 according to a non-limiting embodiment.
  • FIG. 4 depicts a computer interaction system including virtual graphics according to a non-limiting embodiment.
  • FIG. 5 is a schematic of the head-worn unit connected to a computer according to a non-limiting embodiment.
  • FIG. 6 is a schematic of a computer according to a non-limiting embodiment.
  • FIG. 7 is a flow chart depicting operations of a computer according to a non-limiting embodiment.
  • FIG. 8 is a flow chart depicting operations of the computer which compensate for movement and rotation of the optical tracker according to a non-limiting embodiment.
  • FIG. 9A depicts a front side of a head-worn unit, and FIG. 9B depicts a back side of the head-worn unit according to a non-limiting embodiment.
  • FIG. 10 is a schematic of the head-worn unit connected to a processing device according to a non-limiting embodiment.
  • FIG. 11 is a flow chart depicting operations of a processing device according to a non-limiting embodiment.
  • FIG. 12A is a flow chart depicting operations of the processing device which compensate for movement and rotation of the optical tracker, based on acceleration information of the head motion detector, according to a non-limiting embodiment. FIG. 12B is a flow chart depicting operations of the processing device which compensate for movement and rotation of the optical tracker, based on velocity information of the head motion detector, according to a non-limiting embodiment.
  • DETAILED DESCRIPTION
  • The disclosed subject matter can be exemplified with, but is not limited to, a virtual marble labyrinth game. The virtual marble labyrinth game has been implemented using a number of technologies. For example, the virtual marble labyrinth game can consist of a hand-held display for displaying the virtual balls and other virtual objects, such as obstacles, bumps, ramps, prizes, walls, pools, holes, channels, etc. The hand-held display is further integrated with a set of built-in accelerometers that can directly measure the direction of gravity relative to the display. An example of this technology in action is available at http://www.youtube.com/watch?v=lipHmNbi7ss. This implementation has several drawbacks: the game graphics are restricted to the hand-held display; the weight and fragility of the gameboard depend upon the technology it contains; and the position and orientation of the player's head relative to the gameboard are not known, so graphics must be generated for an assumed head position and orientation, and cannot respond to changes in actual head position and orientation during gameplay.
  • Another example of a virtual marble labyrinth game uses a static camera to optically track a passive physical gameboard. This makes it possible to determine a six-degree-of-freedom position and orientation of the gameboard relative to the camera. The virtual objects are displayed on an external monitor, with the virtual objects overlaid on the actual view from the camera. An example of this technology in action is available at http://www.youtube.com/watch?v=L7dC8HU2KJY. In this case, the camera is assumed to be in a specific fixed pose relative to the direction of gravity. This fixed pose can be either a hardwired pose or a pose that is calibrated prior to gameplay. Consequently, changing the position or orientation of the camera during gameplay will be indistinguishable from a complementary change in the position or orientation of the gameboard. This is typically undesirable in gameplay, as it rules out having the camera move, since that would cause an unwanted change in the effective direction of gravity, preventing mobile game play or the use of a head-worn camera. Further, if the camera is rigidly affixed to a moving platform, such as a car, ship, plane, or train, on which the game is being played, the motion of the platform relative to the earth will not be properly taken into account.
  • Additional examples of a virtual marble labyrinth game use a display with a known or assumed orientation. One implementation allows the user to manipulate a virtual gameboard using an input device such as a mouse. This approach does not provide the natural feel of controlling a virtual gameboard in the same manner as one would feel while controlling a physical gameboard. Another implementation allows the user to manipulate a game controller containing an orientation tracker, effectively changing the direction of gravity in the game, but without changing the visual orientation of the gameboard. An example of this technology is available at http://www.youtube.com/watch?v=BjEKoDW9S-4. In both of these approaches, the gameboard is not held by the player.
  • Yet another example of the virtual marble labyrinth game uses a gameboard and head-worn unit. The gameboard and the head-worn unit are each individually tracked in six degrees of freedom relative to a real world reference. This implementation requires that additional infrastructure for tracking the gameboard be added to the environment or added directly to the gameboard. Adding such additional infrastructure to the environment limits where the game can be played. Adding such additional infrastructure to the gameboard makes it heavier or more fragile. In both cases, adding additional infrastructure can make the system more expensive, more complicated, and awkward for the user.
  • Reference will now be made in detail to embodiments of the disclosed subject matter, examples of which are illustrated in the accompanying drawings and on the Internet at http://www.youtube.com/watch?v=6AKgH4On65A.
  • While, solely for purpose of convenience, the computer interaction system and the method of facilitating interaction between an interaction delivery device and physical objects in an environment presented herein are described in the context of developing and playing augmented reality computer games, use of the systems and methods of the presently disclosed subject matter is not limited thereto. For example, the presently disclosed systems and methods can be employed for other uses, such as, but not limited to, pedagogical computer environments for teaching physics, controls for user interfaces, pedagogical “virtual reality” computer environments for medical training, pilot training, etc. Additionally, the disclosed subject matter can be employed to develop and play entirely virtual games using a physical object as an interface to control virtual objects within an immersive virtual environment.
  • In accordance with the disclosed subject matter, a computer interaction system is provided. The computer interaction system includes an interaction delivery device and a computer. The interaction delivery device can include a display device configured to display virtual objects, a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference. The computer can include a processor and a memory, and can be configured to process at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism. Further, the computer can be configured to compensate for movement relative to the physical object of at least one of the first tracking mechanism and the second tracking mechanism and to output virtual object information to the display.
  • For purpose of explanation and illustration, and not limitation, an exemplary embodiment of the computer interaction system in accordance with the application is shown in FIGS. 1-12.
  • FIG. 1 depicts a computer interaction system that includes a head-worn unit 10 (an interaction delivery device) and a computer 14 that can be connected to the head-worn unit 10 wirelessly or through a cable. Further, the computer interaction system can include a gameboard 16 (or other physical object) that can be manipulated to move in an x-direction, a y-direction, and a z-direction with respect to head-worn unit 10 or to rotate to a new orientation by a user. Moreover, the head-worn unit 10 can move in an a-direction, a b-direction, and a c-direction or rotate to a new orientation as the user's head moves. A force of gravity can act upon the computer interaction environment in the direction of a true virtual gravity vector pointing downward relative to earth 25, as indicated in FIG. 1. In this embodiment, the true virtual gravity vector acts with the same magnitude and direction as the physical gravitational force acting on the user and gameboard (i.e., toward earth 25). Accordingly, as used herein, “true” indicates a natural and correct direction and magnitude, as would exist in real life. Alternatively, the virtual gravity vector can be configured to be a not true virtual gravity vector, where, as used herein, “not true” indicates that the vector has at least one of a direction and magnitude that is different from the natural and correct direction and magnitude of the physical force, as would exist in real life.
  • FIG. 2A depicts the front side of head-worn unit 10. This embodiment depicts a modified version of the Vuzix Wrap™ 920AR, but alternative devices can be used as head-worn unit 10. The head-worn unit 10 can include one or more optical sensors 11 (first tracking mechanisms) for detecting optical markers on a gameboard (see, e.g., optical marker 23 on gameboard 16 in FIG. 3). Optical sensors 11 detect the position and orientation of optical markers and transmit that information to the computer (not shown). Further, head-worn unit 10 can include a head tracker 12 (a second tracking mechanism) for detecting at least one of the orientation and position of head tracker 12 relative to a reference 24. Reference object 24 can be a position on earth 25 or an object moving relative to earth 25 such as a car, a boat, or a plane. Head tracker 12 can be, but is not limited to being, rigidly fixed relative to at least one optical sensor 11. Accordingly, one or more of the orientation of optical sensors 11 and the position of optical sensors 11 can be readily determined from the orientation and position of head tracker 12 if head tracker 12 is rigidly fixed relative to at least one optical sensor 11.
  • FIG. 2B depicts the back side of head-worn unit 10. Head tracker 12 can be disposed on the back side of head-worn unit 10. Further, display devices 13 can be disposed on the back side of head-worn unit 10. Display devices 13 can be configured to display augmented reality images 2 based on augmented reality information output by computer 14. Augmented reality images 2 can include physical objects, such as gameboard 16, and virtual objects 15, such as ball 15 a and obstacles 15 b as depicted in FIG. 4. Display devices 13 can also be configured to allow a user to directly view physical objects and to overlay virtual objects on the physical environment viewed by the user. Alternatively or additionally, display devices 13 can be configured to display entirely virtual environments.
  • FIG. 3 depicts board 16, which can, for example, serve as a gameboard. Board 16 can have a plurality of optical markers 23. Nevertheless, the optical markers on board 16 are not limited to an array of optical markers 23 as depicted in FIG. 3. For example, optical markers may include text, barcodes, patterns, colors, or other indicia that can be identified by an optical sensor. Furthermore, the markers can be integrated within the gameboard to create a visually appealing appearance that can, for example, simulate an environment consistent with the theme of the video game. Alternatively or additionally, board 16 can have designs that include a more general set of optical features (color, patterns, shapes, images, etc.) that can be recognized by optical tracking software using markerless optical tracking. Additionally or alternatively, board 16 can include natural or artificial physical features (texture, protrusions, holes, edges, grooves, notches, perforations, dips, divots, cavities, slits, etc.) that can be recognized by optical tracking software using optical feature tracking. The system can be configured to pre-store information regarding these optical and physical features prior to tracking, but it is not necessary that the system pre-store this information.
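  • Where the physical corner locations of an optical marker 23 on board 16 are known, the pose of the board relative to an optical sensor can be recovered with a standard perspective-n-point solve. The sketch below uses OpenCV's cv2.solvePnP; the marker size, camera intrinsics, and the upstream corner-detection step are placeholders and assumptions of this sketch rather than details of the described system.

```python
import numpy as np
import cv2

def board_pose_from_marker(image_corners, marker_size, camera_matrix, dist_coeffs):
    """Estimate marker (board) pose relative to the camera.

    image_corners: 4x2 array of detected marker corner pixels (corner
                   detection itself is outside this sketch).
    marker_size:   edge length of the square marker in meters (assumed known).
    camera_matrix, dist_coeffs: camera intrinsics from a prior calibration.
    """
    half = marker_size / 2.0
    # Marker corners in the board coordinate frame; order must match the
    # order of the detected image corners.
    object_points = np.array([[-half,  half, 0.0],
                              [ half,  half, 0.0],
                              [ half, -half, 0.0],
                              [-half, -half, 0.0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_points,
                                  np.asarray(image_corners, dtype=np.float32),
                                  camera_matrix, dist_coeffs)
    rotation, _ = cv2.Rodrigues(rvec)  # 3x3 rotation of the board w.r.t. camera
    return ok, rotation, tvec
```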
  • FIG. 4 further depicts the computer interaction system including virtual objects 15, such as ball 15 a, obstacles 15 b, and ramp 15 c, or other virtual objects that can affect motion of at least one virtual object 15. Virtual objects 15 are not limited to balls, obstacles, and ramps, however. Virtual objects 15 can be any number of objects found in virtual environments, such as pools, pits, prizes, bumpers, accelerators, edges, walls, etc. Board 16, ball 15 a, and obstacles 15 b are shown as they appear in the displays of head-worn unit 10 (see FIG. 2 b). In this particular embodiment, ball 15 a is simulated to engage in motion consistent with a true virtual gravity vector, substantially identical to the true physical gravity vector. For example, if board 16 is rotated such that corner 16 a is lowered, ball 15 a will appear to move toward corner 16 a. Ball 15 a will appear to move with a velocity and acceleration similar to that which would be observed if a physical ball having equivalent size, shape, and composition characteristics was placed upon board 16. However, the computer interaction environment may also include virtual objects 15, such as obstacles 15 b, which alter the motion of ball 15 a. For example, when ball 15 a approaches an obstacle 15 b, ball 15 a can be configured to stop, decelerate, accelerate, bounce, deform, change direction, or engage in any number of other changes in motion or characteristic properties. Thus, ball 15 a can be configured to move as if it were a physical ball encountering physical obstacles on board 16, even though ball 15 a and obstacles 15 b are virtual objects. Additionally, a variety of nonphysical behaviors (shrinking, growing, color change, gravitational shifting, acceleration, deceleration, disappearing, etc.) can be simulated when the ball 15 a is in contact with or in close proximity to other virtual objects 15.
  • FIG. 5 is a schematic diagram depicting the flow of information between head-worn unit 10 and computer 14. Optical sensors 11 can output images or video streams to computer 14. In the present embodiment, one optical sensor 11 can output video to computer 14, which computer 14 can process to create at least one of position and orientation data for gameboard 16, as will be described herein. Further, another optical sensor 11 can output video that can be combined with virtual objects 15 to generate an augmented reality environment. However, it is not necessary that optical sensors 11 each output video for separate processing. One or more of optical sensors 11 can be configured to output video to computer 14 for both processing to generate at least one of position and orientation data and for combining with virtual objects 15 to generate an augmented reality environment. Head tracker 12 can track the position and orientation of head-worn unit 10. Head tracker 12 can include one or more of, but is not limited to, a three-axis accelerometer, a magnetometer, a gyroscope, and a full six-degrees-of-freedom tracker. Further, head tracker 12 can detect the relative direction of the true physical gravity vector acting on head-worn unit 10. Head tracker 12 can output this position information and orientation information, which includes information regarding the relative direction of the true physical gravity vector, to computer 14. Computer 14 can then process the information to create augmented reality information, as will be described herein. Alternatively, computer 14 can be configured to create purely virtual reality information. Computer 14 can then output the augmented reality or virtual reality information to display screens 13. The augmented reality information can be, but is not limited to, an augmented reality video stream including a sequence of augmented reality images. Further, the virtual reality information can be, but is not limited to, a virtual reality video stream including a sequence of virtual reality images. Display screens 13 can be configured to display one or more of virtual reality information and augmented reality information. Further, display screens 13 can be optically transparent or disposed at a periphery of a user's field of vision, allowing a user to view the physical environment directly, and can be configured to overlay virtual objects 15 over the directly viewed physical environment. Although this embodiment describes that computer 14 can be separate from head-worn unit 10, head-worn unit 10 can include computer 14. Computer 14 can be any processing device. Further, one or more of optical sensors 11 and head tracker 12 can be configured to output one or more of position information, orientation information, or information regarding the relative position of the true physical gravity vector, alone or in combination with other information, to computer 14.
  • FIG. 6 is a schematic diagram of computer 14. Computer 14 can include, but is not limited to, a processor 17, memory 18, input unit 21, and output unit 22. Memory 18 can include, but is not limited to, one or more of a game engine 19 and a physics engine 20. Game engine 19 can be software stored in memory 18 and can be used for generating the graphics associated with the virtual objects 15 and the virtual objects' interactions with physical objects in the computer interaction environment. Accordingly, although this embodiment describes that memory 18 can include game engine 19, it is not necessary that memory 18 includes game engine 19. Further, physics engine 20 can be software stored in memory 18 and can be used to simulate physical interactions in the computer interaction system. Although this embodiment describes that physics engine 20 can be software stored in memory 18, this is not necessary. Physics engine 20 can be a separate processor in computer 14 or a combination of a separate processor in computer 14 and software. Alternatively, processor 17 can include physics engine 20. Physics engine 20 can perform the calculations needed to model the physical interactions of virtual objects 15 with additional virtual objects and, if desired, with physical objects, according to the laws of physics. The present embodiment can include the Newton Game Dynamics (http://www.newtondynamics.com) physics engine, but other physics engines can be used, such as Havok Physics (http://www.havok.com), NVIDIA PhysX (http://www.nvidia.com/object/physx_ne.html), etc. Input unit 21 and output unit 22 can link computer 14 to head-worn unit 10. Input unit 21 and output unit 22 can be independent of each other or can be integrated together. Further, input unit 21 and output unit 22 can be wireless communication devices or wired communication devices.
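  • For illustration only, the sketch below shows the kind of interface through which a gravity vector can be handed to a physics engine each frame; the SimplePhysicsEngine class is a hypothetical stand-in, not the API of Newton Game Dynamics, Havok Physics, or NVIDIA PhysX.

      import numpy as np

      class SimplePhysicsEngine:
          # Hypothetical placeholder for a physics engine; a real system would call the
          # chosen engine's own API instead of this toy integrator.
          def __init__(self):
              self.gravity = np.array([0.0, -9.81, 0.0])
              self.bodies = []  # each body: {"pos": vec3, "vel": vec3}

          def set_gravity(self, g):
              # The true physical gravity direction reported by the head tracker can be
              # re-expressed in the simulation's board-relative coordinates and set here.
              self.gravity = np.asarray(g, dtype=float)

          def step(self, dt):
              for body in self.bodies:
                  body["vel"] = body["vel"] + self.gravity * dt  # simple explicit integration
                  body["pos"] = body["pos"] + body["vel"] * dt

      engine = SimplePhysicsEngine()
      engine.bodies.append({"pos": np.zeros(3), "vel": np.zeros(3)})
      engine.set_gravity([0.0, -9.81, 0.0])
      engine.step(0.03)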
  • FIG. 7 outlines the operation of receiving orientation and position information and determining relevant information. Memory 18 can store tracking software that can instruct processor 17 to process the information (e.g., video, images, data, etc.) output by optical sensors 11 in S1-A and S1-C. Specifically, when optical sensors 11 output a video stream or other data to computer 14, the tracking software can instruct processor 17 to process the location of optical markers 23 in the video stream or other data and to determine one or more of the position and orientation of board 16 relative to optical sensors 11. The software can instruct processor 17 to store the position and orientation of board 16 relative to optical sensors 11 at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds). S1-A and S1-C can occur separately or simultaneously.
  • A similar process occurs in S1-B and S1-D with respect to tracking data output from head tracker 12. Specifically, when head tracker 12 outputs head tracking data to computer 14, the tracking software can instruct processor 17 to process the location of head tracker 12, if necessary, and to determine one or more of the position and orientation of head tracker 12 relative to reference 24. The software can instruct processor 17 to store the position and orientation of head tracker 12 relative to reference 24 in memory 18 at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds). S1-B and S1-D can occur separately or simultaneously. If necessary, processor 17 can store information about the relative positions of head tracker 12 and optical sensors 11 in memory 18.
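  • As a non-limiting illustration of S1-A through S1-D, the sketch below stores board and head-tracker poses at a fixed interval; the reader callables read_board_pose and read_head_pose are hypothetical placeholders for the tracking software's outputs.

      import time

      INTERVAL = 0.03  # predetermined length of each discrete time interval, in seconds
      pose_log = []    # (timestamp, board pose rel. to sensors, head-tracker pose rel. to reference)

      def sample_poses(read_board_pose, read_head_pose, duration=1.0):
          # Each reader is assumed to return a (position_vector, rotation_matrix) pair.
          start = time.time()
          while time.time() - start < duration:
              board_pose = read_board_pose()  # from optical sensors 11 (S1-A, S1-C)
              head_pose = read_head_pose()    # from head tracker 12 (S1-B, S1-D)
              pose_log.append((time.time(), board_pose, head_pose))
              time.sleep(INTERVAL)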
  • For each successive change in position of board 16 over a discrete time interval, the tracking software can instruct the processor to determine a board movement vector, as shown in S2-A. The processor can also determine a board rotation tensor for the successive change in orientation of board 16 over each discrete time interval as shown in S2-C.
  • Further, the tracking software can instruct processor 17 to process one or more of the orientation and position information output by head tracker 12 to determine the position and orientation of head tracker 12 relative to reference 24. Processor 17 can also determine one or more of a head tracker movement vector and a head tracker rotation tensor for each successive change in position of head tracker 12 and orientation of head tracker 12, respectively, as shown in S2-B and S2-D. If head tracker 12 is not configured to output information regarding the relative direction of the true physical gravity vector to computer 14, the tracking software can instruct processor 17 to determine the direction of the true physical gravity vector relative to head tracker 12 from the position and orientation of head tracker 12 relative to reference 24 as shown in S2-E.
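  • For illustration only, one way to realize S2-A through S2-E is sketched below, with 3x3 rotation matrices standing in for the rotation tensors; the frame conventions are assumptions rather than requirements of the application.

      import numpy as np

      def movement_vector(pos_prev, pos_curr):
          # Displacement over one discrete time interval (S2-A for the board, S2-B for the head tracker).
          return np.asarray(pos_curr) - np.asarray(pos_prev)

      def rotation_tensor(rot_prev, rot_curr):
          # Relative rotation taking the previous orientation into the current one
          # (S2-C for the board, S2-D for the head tracker): R_rel = R_curr * R_prev^T.
          return np.asarray(rot_curr) @ np.asarray(rot_prev).T

      def gravity_direction_relative_to_tracker(tracker_rot_world):
          # S2-E: express the world "down" direction in the tracker's own frame, assuming
          # tracker_rot_world maps tracker coordinates to world coordinates.
          g_world = np.array([0.0, -1.0, 0.0])
          return np.asarray(tracker_rot_world).T @ g_world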
  • The tracking software can also instruct processor 17 to perform a compensation procedure to account for motion of optical sensors 11, as shown in FIG. 8. In S3-A, processor 17 can compare the magnitude of head tracker movement vector with a predetermined threshold distance stored in memory 18. This predetermined threshold distance stored in memory 18 can be determined based on the particular implementation of the application (e.g., smaller thresholds for complex simulations such as simulated medical training, and larger thresholds for simple simulations such as basic physics simulations). If the magnitude of head tracker movement vector exceeds or is equal to the predetermined threshold distance, processor 17 can determine that head tracker 12 has changed position. If the magnitude of head tracker movement vector is less than the predetermined threshold distance, processor 17 can determine that head tracker 12 has not changed position.
  • If processor 17 determines that head tracker 12 has changed position, then processor 17 can determine a resultant rotation from head tracker rotation tensor and compare the resultant rotation of head tracker 12 with a predetermined threshold rotation value stored in memory 18 as shown in S4-A. This predetermined threshold rotation value stored in memory 18 can be determined based on the particular implementation of the application (e.g., smaller thresholds for complex simulations such as simulated medical training, and larger thresholds for simple simulations such as basic physics simulations). If the resultant rotation of head tracker 12 exceeds or is equal to the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has changed both position and orientation. If the resultant rotation of head tracker 12 is less than the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has not changed orientation but has changed position.
  • If processor 17 determines that head tracker 12 has not changed position, then processor 17 can determine a resultant rotation from head tracker rotation tensor and compare the resultant rotation of head tracker 12 with a predetermined threshold rotation value stored in memory 18 as shown in S3-C. If the resultant rotation of head tracker 12 exceeds or is equal to the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has changed orientation but not position. If the resultant rotation of head tracker 12 is less than the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has not changed position or orientation.
  • Alternatively, one or more of the comparisons of S3-A, S3-C, and S4-A can occur simultaneously or as part of the same process. It is not necessary to separately compare relative rotation information and movement information of head tracker 12. Accordingly, relative rotation information and movement information of head tracker 12 can be compared with a combined threshold value.
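  • A minimal sketch of the comparisons in S3-A, S4-A, and S3-C follows; the threshold values are hypothetical and, as noted above, would be chosen to suit the particular application.

      import numpy as np

      POSITION_THRESHOLD = 0.005  # meters; hypothetical value
      ROTATION_THRESHOLD = 0.01   # radians of resultant rotation; hypothetical value

      def resultant_rotation(rotation_tensor):
          # Rotation angle encoded by a 3x3 rotation matrix (its resultant rotation).
          cos_theta = np.clip((np.trace(rotation_tensor) - 1.0) / 2.0, -1.0, 1.0)
          return np.arccos(cos_theta)

      def classify_head_motion(head_move_vec, head_rot_tensor):
          # Returns (changed_position, changed_orientation) per S3-A, S4-A, and S3-C.
          changed_position = np.linalg.norm(head_move_vec) >= POSITION_THRESHOLD
          changed_orientation = resultant_rotation(head_rot_tensor) >= ROTATION_THRESHOLD
          return changed_position, changed_orientation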
  • If processor 17 determines that both the position and orientation of head tracker 12 have changed, processor 17 can apply adjustments to the corresponding board movement vector and the corresponding board rotation tensor to account for the change in position and orientation of head tracker 12 and optical sensors 11, as described in S5-A. Specifically, because optical sensors 11 are rigidly fixed to head tracker 12, the orientation changes and position changes of optical sensors 11 can be determined by applying the proper adjustments. Here, the adjustments include subtracting the head tracker movement vector from the board movement vector for corresponding discrete time intervals and applying frame rotations to correct for changes in orientation in the proper sequence. This adjustment produces an adjusted board movement vector and an adjusted board rotation tensor, which more accurately model motion of board 16.
  • If processor 17 determines that the position of head tracker 12 has changed but not the orientation, processor 17 can apply adjustments to the corresponding board movement vector to account for the change in position of head tracker 12 and optical sensors 11, as described in S5-B. The position changes of optical sensors 11 can be determined by applying the proper sequence of translations. Here, the adjustments include subtracting the head tracker movement vector from the board movement vector for corresponding discrete time intervals. This adjustment produces an adjusted board movement vector which more accurately models motion of board 16 and retains the previously determined board rotation tensor.
  • If processor 17 determines that the orientation of head tracker 12 has changed but not the position, processor 17 can apply adjustments to the corresponding board rotation tensor to account for the change in orientation of head tracker 12 and optical sensors 11, as described in S5-C. The orientation changes of optical sensors 11 can be determined by applying the proper sequence of rotations. Here, the adjustments include applying coordinate transformations to the board rotation tensor which remove effects of the resultant rotation of the second tracking mechanism in the proper sequence. This adjustment produces an adjusted board rotation tensor which more accurately models motion of board 16 and retains the previously determined board movement vector.
  • If processor 17 determines that both the position and orientation of head tracker 12 have not changed, processor 17 is not instructed to perform an adjustment and proceeds directly to simulation.
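  • The compensation of S5-A, S5-B, and S5-C can be pictured with the sketch below; the single matrix product used for the rotation correction is an assumption about frame conventions, whereas the description above only requires that the head tracker's motion be removed in the proper sequence.

      def adjust_board_motion(board_move_vec, board_rot_tensor,
                              head_move_vec, head_rot_tensor,
                              changed_position, changed_orientation):
          # Inputs are numpy arrays: displacement vectors and 3x3 rotation matrices.
          # S5-A / S5-B: subtract the head-tracker displacement from the board displacement.
          adjusted_move = board_move_vec - head_move_vec if changed_position else board_move_vec
          # S5-A / S5-C: remove the head tracker's rotation from the measured board rotation.
          adjusted_rot = head_rot_tensor.T @ board_rot_tensor if changed_orientation else board_rot_tensor
          return adjusted_move, adjusted_rot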
  • If processor 17 has determined an adjusted board movement vector and an adjusted board rotation tensor, processor 17 can use the information regarding the relative direction of the true physical gravity vector determined in S2-E, the adjusted board movement vector, and the adjusted board rotation tensor to simulate the motion of a virtual object 15, such as ball 15 a, under a true virtual gravity vector based on the actual motion of board 16, as described in S6-A.
  • If processor 17 has determined an adjusted board movement vector but has not adjusted the board rotation tensor, processor 17 can use the information about the relative direction of the true physical gravity vector determined in S2-E, the adjusted board movement vector, and the unadjusted board rotation tensor to simulate the motion of a virtual object 15, such as ball 15 a, under a true virtual gravity vector based on the actual motion of board 16, as described in S6-B.
  • If processor 17 has determined an adjusted board rotation tensor but has not adjusted the board movement vector, processor 17 can use the information about the relative direction of the true physical gravity vector determined in S2-E, the unadjusted board movement vector, and the adjusted board rotation tensor to simulate the motion of a virtual object 15, such as ball 15 a, under a true virtual gravity vector based on the actual motion of board 16, as described in S6-C.
  • If processor 17 has not adjusted the board movement vector and the board rotation tensor, processor 17 can use the information about the relative direction of the true physical gravity vector determined in S2-E, the unadjusted board movement vector, and the unadjusted board rotation tensor to simulate the motion of a virtual object 15, such as ball 15 a, under a true virtual gravity vector based on the actual motion of board 16, as described in S6-D.
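  • Taken together, S6-A through S6-D amount to selecting the adjusted quantities when they exist and the raw ones otherwise, as in the sketch below; physics_step is a hypothetical callable wrapping the physics engine update.

      def simulate_virtual_objects(physics_step, gravity_dir,
                                   board_move_vec, board_rot_tensor,
                                   adjusted_move_vec=None, adjusted_rot_tensor=None):
          # S6-A: both adjusted; S6-B: only the movement vector adjusted;
          # S6-C: only the rotation tensor adjusted; S6-D: neither adjusted.
          move = adjusted_move_vec if adjusted_move_vec is not None else board_move_vec
          rot = adjusted_rot_tensor if adjusted_rot_tensor is not None else board_rot_tensor
          return physics_step(gravity_dir, move, rot)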
  • Once processor 17 performs one or more of S6-A, S6-B, S6-C, and S6-D to simulate the motion of a virtual object 15, computer 14 outputs this simulation as augmented reality information or virtual reality information to head-worn unit 10. Accordingly, displays 13 can display a combination of virtual objects 15 and physical objects, such as board 16, in real-time. Alternatively, displays 13 can display only virtual objects 15.
  • For purpose of explanation and illustration, and not limitation, another exemplary embodiment of the interaction system in accordance with the application is shown in FIGS. 9A-12. For brevity, only the aspects of this exemplary embodiment that differ from the previously described embodiment will be described.
  • FIG. 9A depicts the front side of a head-worn unit 110. The head-worn unit 110 can include a pair of optical sensors 111 (first tracking mechanisms) for detecting optical markers on a gameboard (see, e.g., optical marker 23 on gameboard 16 in FIG. 3). Optical sensors 111 detect the position and orientation of optical markers and transmit that information to the processing device 114 (not shown). Further, head-worn unit 110 can include a head motion detector 126 (a detecting mechanism) for detecting at least one of an acceleration of the head motion detector 126, a velocity of the head motion detector 126, and an orientation of head motion detector 126. Head motion detector 126 can be rigidly fixed relative to optical sensors 111, but is not limited to being so fixed as long as the position of head motion detector 126 relative to at least one optical sensor 111 is predetermined. Accordingly, the orientation, position, and motion of at least one optical sensor 111 can be readily determined from the motion of head motion detector 126 if head motion detector 126 is rigidly fixed relative to at least one optical sensor 111, or if the orientation and position of head motion detector 126 relative to at least one optical sensor 111 is otherwise known. Alternatively or additionally, head motion detector 126 can be configured to detect a correct direction and magnitude of a physical gravity vector. Further, head motion detector 126 can alternatively include one or more distinctly separate motion detectors, each configured to detect one or more of a correct direction and magnitude of a physical gravity vector, an acceleration of the head motion detector 126, a velocity of the head motion detector 126, and an orientation of head motion detector 126. Head motion detector 126 can include accelerometers, gyroscopes, magnetometers, etc.
  • FIG. 9B depicts the back side of head-worn unit 110. Head motion detector 126 can be disposed on the back side of head-worn unit 110. Further, display devices 113 can be disposed on the back side of head-worn unit 110. Display devices 113 can be configured to display augmented reality images or virtual reality images as described in other embodiments of the application.
  • FIG. 10 is a schematic diagram depicting the flow of information between head-worn unit 110 and processing device 114. Optical sensors 111 can output images or video streams to processing device 114, as described in other embodiments of the application. Processing device 114 performs functions similar to computer 14 (e.g., FIG. 5), as described in the other embodiments of the application. Head motion detector 126 can track the motion of head-worn unit 110. Head motion detector 126 can include one or more of, but is not limited to, a three-axis accelerometer, a three-axis magnetometer, a three-axis gyroscope, and a full six-degrees-of-freedom tracker. Further, head motion detector 126 can detect the relative direction of the true physical gravity vector acting on head-worn unit 110 and head motion detector 126. Head motion detector 126 can output this motion information, which includes information regarding the relative direction of the true physical gravity vector, to processing device 114. Processing device 114 can then process the information to create augmented reality information, as will be described herein. Alternatively, processing device 114 can be configured to create purely virtual reality information. Processing device 114 can then output the augmented reality or virtual reality information to display screens 113. The augmented reality information can be, but is not limited to, an augmented reality video stream including a sequence of augmented reality images. Further, the virtual reality information can be, but is not limited to, a virtual reality video stream including a sequence of virtual reality images. Although this embodiment describes that processing device 114 can be separate from head-worn unit 110, head-worn unit 110 can include processing device 114. Processing device 114 can be any processing device. Further, one or more of optical sensors 111 and head motion detector 126 can be configured to output one or more of position information, orientation information, information regarding the relative position of the true physical gravity vector, and head motion detector motion information, alone or in combination with other information, to processing device 114.
  • FIG. 11 outlines the operation of receiving orientation, position, and motion information and determining relevant information. Processing device 114 can process the information (e.g., video, images, data, etc.) output by optical sensors 111 in S11-A and S11-B. Specifically, when optical sensors 111 output a video stream or other data to processing device 114, processing device 114 can process the location of optical markers in the video stream or other data and can determine one or more of the position and orientation of a gameboard relative to optical sensors 111. Processing device 114 can store the position and orientation of the gameboard relative to optical sensors 111 at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds). S11-A and S11-B can occur separately or simultaneously.
  • A similar process occurs in S11-C and S11-D with respect to head motion detector motion information and physical gravity vector information output from head motion detector 126. Specifically, when head motion detector 126 outputs head motion detector motion information and physical gravity vector information to processing device 114, processing device 114 can process the motion information of head motion detector 126, if necessary, and determine one or more of the acceleration and velocity of head motion detector 126 relative to a reference frame. Processing device 114 can store the acceleration and velocity of head motion detector 126 relative to a reference frame at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds). S11-C and S11-D can occur separately or simultaneously. If necessary, processing device 114 can store information about the relative positions of head motion detector 126 and optical sensors 111.
  • For each successive change in position of the gameboard over a discrete time interval, processing device 114 can determine a board movement vector, as shown in S12-A. Processing device 114 can also determine a board rotation tensor for the successive change in orientation of the gameboard over each discrete time interval as shown in S12-B.
  • Further, in S12-C, processing device 114 can process the motion information output by head motion detector 126 to determine one or more of the velocity and acceleration of head motion detector 126 relative to the reference frame.
  • Processing device 114 can perform a compensation procedure to account for motion of optical sensors 111, as shown in FIGS. 12A and 12B. Processing device 114 performs a comparison of acceleration values in S13 if processing device 114 determines the acceleration of head motion detector 126 relative to the reference frame in S12-C. In S13 of FIG. 12A, processing device 114 can compare the magnitude of acceleration of head motion detector 126 over a predetermined time with a predetermined threshold acceleration value. This predetermined threshold acceleration value can be determined based on the particular implementation of the application (e.g., smaller thresholds for complex simulations such as simulated medical training, and larger thresholds for simple simulations such as basic physics simulations). If the magnitude of acceleration of head motion detector 126 over the predetermined time is greater than or equal to the predetermined threshold acceleration value, processing device 114 can determine that head motion detector 126 has changed at least one of position and orientation. If the magnitude of acceleration of head motion detector 126 over the predetermined time is less than the predetermined threshold acceleration value, processing device 114 can determine that head motion detector 126 has not changed position and orientation, unless processing device 114 determines that head motion detector 126 has changed at least one of position and orientation as a result of a velocity comparison in S23, which will be explained herein.
  • If processing device 114 determines that head motion detector 126 has changed at least one of position and orientation as a result of acceleration, processing device 114 can apply adjustments to the corresponding at least one of board movement vector and board rotation tensor to account for the change in at least one of position and orientation of head motion detector 126 and optical sensors 111, as described in S14-A. Specifically, because optical sensors 111 are rigidly fixed to head motion detector 126, or have an otherwise known relative position, the orientation changes and position changes of optical sensors 111 can be determined by applying the proper adjustments. Here, the adjustments can include one or more heuristics for compensation, including, but not limited to, ignoring changes in one or more of board position information and board orientation information over the corresponding discrete time interval when simulating the physics of the virtual reality object, while simulating a physically accurate field of vision. These adjustments produce at least one of an adjusted board movement vector and an adjusted board rotation tensor, which more accurately model motion of the gameboard.
  • Processing device 114 performs a comparison of velocity values in S23 if processing device 114 determines the velocity of head motion detector 126 relative to the reference frame in S12-C. In S23 of FIG. 12B, processing device 114 can compare the magnitude of velocity of head motion detector 126 over a predetermined time with a predetermined threshold velocity value. This predetermined threshold velocity value can be determined based on the particular implementation of the application (e.g., smaller thresholds for complex simulations such as simulated medical training, and larger thresholds for simple simulations such as basic physics simulations). If the magnitude of velocity of head motion detector 126 over the predetermined time is greater than or equal to the predetermined threshold velocity value, processing device 114 can determine that head motion detector 126 has changed at least one of position and orientation. If the magnitude of velocity of head motion detector 126 over the predetermined time is less than the predetermined threshold velocity value, processing device 114 can determine that head motion detector 126 has not changed position and orientation, unless processing device 114 determines that head motion detector 126 has changed at least one of position and orientation as a result of an acceleration comparison in S13, as explained above.
  • If processing device 114 infers that head motion detector 126 has changed at least one of position and orientation as a result of velocity, processing device 114 can apply adjustments to the corresponding at least one of board movement vector and board rotation tensor to account for the change in at least one of position and orientation of head motion detector 126 and optical sensors 111, as described in S24-A. Specifically, because optical sensors 111 are rigidly fixed to head motion detector 126, or have an otherwise known relative position, the orientation changes and position changes of optical sensors 111 can be determined by applying the proper adjustments. Here, the adjustments include one or more of subtracting the head tracker movement vector from the board movement vector for corresponding discrete time intervals and applying frame rotations to correct for changes in orientation in the proper sequence. This adjustment produces at least one of an adjusted board movement vector and an adjusted board rotation tensor, which more accurately model motion of the gameboard.
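  • The acceleration comparison of S13 and the velocity comparison of S23 can be sketched as follows; the threshold values are hypothetical and, as above, would be tuned to the particular application.

      import numpy as np

      ACCELERATION_THRESHOLD = 0.5  # m/s^2 over the predetermined time; hypothetical value
      VELOCITY_THRESHOLD = 0.05     # m/s over the predetermined time; hypothetical value

      def detector_has_moved(acceleration=None, velocity=None):
          # Returns True if either comparison (S13 or S23) indicates that the head motion
          # detector has changed at least one of position and orientation.
          moved_by_acceleration = (acceleration is not None and
                                   np.linalg.norm(acceleration) >= ACCELERATION_THRESHOLD)
          moved_by_velocity = (velocity is not None and
                               np.linalg.norm(velocity) >= VELOCITY_THRESHOLD)
          return moved_by_acceleration or moved_by_velocity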
  • If processing device 114 has adjusted one or more of the board movement vector and the board rotation tensor, processing device 114 can use the information about the relative direction of the true physical gravity vector received in S11-D and one or more of the adjusted board movement vector determined in S14-A or S24-A and the adjusted board rotation tensor determined in S14-A or S24-A to simulate the motion of a virtual object, under a true virtual gravity vector based on the actual motion of the gameboard, as described in S15-A (comparing acceleration) or S25-A (comparing velocity). If an adjusted board movement vector is determined in both S14-A and S24-A, the processor can be programmed to select one of the adjusted board movement vector determined in S14-A and the adjusted board movement vector determined in S24-A; that is, the processor can be programmed to implement only one of S15-A and S25-A. If an adjusted board rotation tensor is determined in both S14-A and S24-A, the processor can be programmed to select one of the adjusted board rotation tensor determined in S14-A and the adjusted board rotation tensor determined in S24-A; that is, the processor can be programmed to implement only one of S15-A and S25-A.
  • If processing device 114 infers that both the position and orientation of head motion detector 126 have not changed as a result of acceleration (S13) or velocity (S23), processing device 114 does not perform an adjustment and proceeds directly to simulation.
  • If processing device 114 has not adjusted the board movement vector and the board rotation tensor, processing device 114 can use the information about the relative direction of the true physical gravity vector received in S11-D, the unadjusted board movement vector determined in S12-A, and the unadjusted board rotation tensor determined in S12-B to simulate the motion of a virtual object, under a true virtual gravity vector based on the actual motion of the gameboard, as described in S15-B (comparing acceleration) or S25-B (comparing velocity).
  • Once processing device 114 performs one or more of S15-A, S25-A, S15-B and S25-B to simulate the motion of at least one virtual object, processing device 114 outputs this simulation as augmented reality information or virtual reality information to head-worn unit 110. Accordingly, displays 113 can display a combination of virtual objects and physical objects, such as the gameboard, in real-time. Alternatively, displays 113 can display only virtual objects.
  • As described above, a virtual gravitational force can act on ball 15 a as natural gravity would act on a physical object placed on board 16. Thus, even when the head-worn unit 10 moves independently of board 16 or changes orientation with respect to board 16, the virtual force of gravity will appear to act in the true physical gravity vector direction with respect to board 16.
  • The computer interaction environment is not limited to the embodiments described above. Interaction delivery devices other than head-worn unit 10 can be used in place of or in combination with head-worn unit 10. For example, interaction delivery devices can include tracked physical displays that are held, worn on additional or alternative parts of the body, or mounted on objects in the environment. Alternatively, one or more of the first tracking mechanism, the second tracking mechanism, and each display device, individually or in combination, can be disposed separately from the interaction delivery device.
  • Further, the first tracking mechanism can be a device other than optical sensors 11 and can include fewer than two optical sensors or other tracking mechanisms. For example, the first tracking mechanism can include depth cameras or acoustic tracking devices, such as sonar devices, etc. Additionally, second tracking mechanisms other than head tracker 12 can be used in place of or in combination with head tracker 12. For example, the second tracking mechanism can be fixed to earth or to an object moving relative to earth rather than being fixed to or at a predetermined position relative to the first tracking mechanism. Here, the second tracking mechanism can track the first tracking mechanism as the reference. The second tracking mechanism can include one or more of a three-axis accelerometer, a three-axis magnetometer, a three-axis gyroscope, and a full six-degrees-of-freedom tracker. Additionally, the second tracking mechanism can determine the relative direction of a true force vector acting in the force's natural direction (true physical vector direction) other than that of gravity, including, but not limited to, the true vectors of magnetic forces, electrostatic forces, friction, additional artificial forces, etc.
  • Further, the tracking software can be configured to cause the processor to determine only one of a movement vector and a rotation tensor for each of the board and the second tracking mechanism. Alternatively, the tracking software can be configured to cause the processor to determine both a movement vector and a rotation tensor for each of the board and the second tracking mechanism as described above. Additionally, the tracking software can be configured to cause the processor to apply an adjustment to only movement vectors, only rotation tensors, or some combination of rotation tensors and movement vectors.
  • Further, the simulated forces are not limited to gravitational forces applied in the true physical gravity vector direction. The virtual gravity vector can be intentionally different from the true physical gravity vector (e.g., a not true virtual gravity vector). For example, the simulated gravitational force can have a different magnitude or act in a direction different from the true physical gravity vector. Additionally, the simulated force can include, but is not limited to, magnetic forces, electrostatic forces, and additional artificial forces in true or not true directions. Thus, the virtual gravity vector can be an intentionally not true virtual gravity vector.
  • In addition to the specific embodiments and features disclosed herein, this application also incorporates by reference the entire disclosure of each and every patent publication identified herein. This application therefore includes any possible combination of the various features disclosed, incorporated by reference or claimed herein. As such, the particular features presented in the dependent claims and disclosed above can be combined with each other in other manners within the scope of the application such that the application should be recognized as also specifically directed to other embodiments having any other possible combinations. Thus, the foregoing description of specific embodiments of the application has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the application to those embodiments disclosed.

Claims (27)

1. An interaction delivery device comprising:
a display device configured to display virtual objects;
a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism; and
a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference,
wherein one or more of position information received from the first tracking mechanism and orientation information received from the first tracking mechanism is used to generate motion data for the physical object,
wherein one or more of position information received from the second tracking mechanism and orientation information received from the second tracking mechanism is used to generate adjusted motion data for the physical object,
wherein the adjusted motion data for the physical object compensates for movement relative to the physical object of at least one of the first tracking mechanism and the second tracking mechanism, and
wherein the adjusted motion data for the physical object is used to generate virtual object information, and
wherein the virtual object information is received by the display.
2. An interaction processing device comprising:
a processor;
a memory;
an input unit configured to receive information; and
an output unit configured to output information,
wherein the input unit is configured to receive at least one of physical object position information, physical object orientation information, tracking mechanism position information, and tracking mechanism orientation information,
wherein the processor is configured to generate motion data for a physical object using at least one of the physical object position information and the physical object orientation information,
wherein the processor is configured to generate adjusted motion data for the physical object using at least one of tracking mechanism position information and tracking mechanism orientation information,
wherein the adjusted motion data for the physical object compensates for movement of a tracking mechanism relative to the physical object,
wherein the processor is configured to generate virtual object information using the adjusted motion data for the physical object, and
wherein the output unit is configured to output the virtual object information.
3. A computer interaction system, comprising:
an interaction delivery device comprising:
a display device configured to display virtual objects,
a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and
a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference; and
a computer comprising:
a processor, and
a memory,
wherein the computer is configured to process at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism,
wherein the computer is further configured to compensate for movement relative to the physical object of at least one of the first tracking mechanism and the second tracking mechanism, and
wherein the computer is further configured to output virtual object information to the display device.
4. The computer interaction system of claim 3, wherein the first tracking mechanism is an optical tracking mechanism.
5. The computer interaction system of claim 4, wherein the interaction delivery device is configured to be head-worn and to display see-through video.
6. The computer interaction system of claim 3, wherein the second tracking mechanism comprises a three-axis accelerometer.
7. The computer interaction system of claim 3, wherein the second tracking mechanism comprises a six-degree-of-freedom tracker configured to determine a three-dimensional position of the second tracking mechanism relative to the reference and a three-dimensional orientation of the second tracking mechanism relative to the reference.
8. The computer interaction system of claim 7, wherein the reference is earth or is fixed to earth.
9. The computer interaction system of claim 7, wherein the reference is fixed to an object moving relative to earth.
10. The computer interaction system of claim 3, wherein the second tracking mechanism is configured to determine a true direction of a natural force vector.
11. The computer interaction system of claim 10, wherein the computer is configured to simulate motion of virtual objects based on a true direction of a natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism.
12. The computer interaction system of claim 10, wherein the computer is configured to simulate motion of virtual objects based on a direction of force that is different from the true direction of the natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism.
13. The computer interaction system of claim 3,
wherein the computer further comprises a physics engine, and
wherein the computer is configured to simulate virtual objects.
14. The computer interaction system of claim 13,
wherein the computer is configured to simulate natural forces acting, in the true direction, on the virtual objects based on at least one of the position information received from the first tracking mechanism, the orientation information received from the first tracking mechanism, the position information received from the second tracking mechanism, and the orientation information received from the second tracking mechanism,
wherein the position information received from the first tracking mechanism comprises a first position of the physical object relative to the first tracking mechanism at a first time and a second position of the physical object relative to the first tracking mechanism at a second time,
wherein the orientation information received from the first tracking mechanism comprises a first orientation of the physical object relative to the first tracking mechanism at the first time and a second orientation of the physical object relative to the first tracking mechanism at the second time,
wherein the position information received from the second tracking mechanism comprises a first position of the second tracking mechanism relative to the reference at the first time and a second position of the second tracking mechanism relative to the reference at the second time, and
wherein the orientation information received from the second tracking mechanism comprises a first orientation of the second tracking mechanism relative to the reference at the first time and a second orientation of the second tracking mechanism relative to the reference at the second time.
15. The computer interaction system of claim 14,
wherein the second tracking mechanism has a position rigidly fixed relative to the first tracking mechanism,
wherein a predetermined time exists between the first time and the second time,
wherein the computer is configured to determine a physical object movement vector between the first position of the physical object and the second position of the physical object,
wherein the computer is configured to determine a second tracking mechanism movement vector between the first position of the second tracking mechanism and the second position of the second tracking mechanism,
wherein the computer is configured to calculate an adjusted physical object movement vector if a magnitude of the second tracking mechanism movement vector is greater than or equal to a predetermined threshold distance, and to simulate natural forces acting on the virtual objects based on at least the adjusted physical object movement vector if the magnitude of the second tracking mechanism movement vector is greater than or equal to the predetermined threshold distance, and
wherein the computer is configured to simulate natural forces acting on the virtual objects based on at least the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance.
16. The computer interaction system of claim 14,
wherein the computer is configured to determine a physical object rotation tensor between the first orientation of the physical object and the second orientation of the physical object,
wherein the computer is configured to determine a second tracking mechanism rotation tensor between the first orientation of the second tracking mechanism and the second orientation of the second tracking mechanism,
wherein the computer is configured to calculate an adjusted physical object rotation tensor if a resultant rotation of the second tracking mechanism is greater than or equal to a predetermined threshold rotation value,
wherein the computer is configured to simulate natural forces acting on the virtual objects based on at least the adjusted physical object rotation tensor if the resultant rotation of the second tracking mechanism is greater than or equal to the predetermined threshold rotation value, and
wherein the computer is configured to simulate natural forces acting on the virtual objects based on at least the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
17. A non-transitory computer readable medium having computer readable instructions stored thereon, which, when executed by a computer having a processor to execute a plurality of processes, are configured to cause the processor to:
obtain first tracking mechanism information;
obtain second tracking mechanism information;
determine a physical object movement vector using the first tracking mechanism information;
determine a second tracking mechanism movement vector using the second tracking mechanism information;
determine a direction of a true physical gravity vector relative to the second tracking mechanism;
simulate motion data of a virtual object using:
an adjusted physical object movement vector if a magnitude of the second tracking mechanism movement vector is greater than or equal to a predetermined threshold distance,
the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance, and
the direction of a true physical gravity vector relative to the second tracking mechanism; and
output the motion data of the virtual object to a display.
18. The non-transitory computer readable medium having computer readable instructions stored thereon of claim 17, which, when executed by a computer having a processor to execute a plurality of processes, are configured further to cause the processor to:
determine a physical object rotation tensor using the first tracking mechanism information;
determine a second tracking mechanism rotation tensor using the second tracking mechanism information; and
simulate motion data of a virtual object using additionally:
an adjusted physical object rotation tensor if a resultant rotation of the second tracking mechanism is greater than or equal to a predetermined threshold rotation value, and
the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
19. The non-transitory computer readable medium having computer readable instructions stored thereon according to claim 18, wherein:
the first tracking mechanism information comprises:
a first position of a physical object relative to a first tracking mechanism and a first orientation of the physical object at a first time, and
a second position of the physical object relative to the first tracking mechanism and a second orientation of the physical object at a second time; and
the second tracking mechanism information comprises:
a first position of a second tracking mechanism relative to a reference and a first orientation of the second tracking mechanism at the first time, and
a second position of the second tracking mechanism relative to the reference and a second orientation of the second tracking mechanism at the second time.
20. A method of facilitating interaction between an interaction delivery device and a physical object in an environment, the method comprising:
generating one or more virtual objects in the environment;
detecting a change in the physical object;
determining whether the change in the physical object is based on a change in the state of the virtual objects and the physical object, or both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object;
measuring a direction and effect of a natural force interacting with the environment; and
updating the virtual objects based on a result of the determining and the measuring.
21. The method of facilitating interaction between an interaction delivery device and a physical object in an environment of claim 20,
wherein the detecting further comprises:
detecting a change in position of the physical object over a given time, and
detecting a change in position of the interaction delivery device over the given time;
wherein the determining further comprises:
determining whether a magnitude of the change in position of the interaction delivery device over the given time is greater than or equal to a threshold value,
determining that the detected change in the physical object is based on a change in the state of the virtual objects and the physical object if the change in position of the interaction delivery device over the given time is less than the threshold value, and
determining that the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object if the change in position of the interaction delivery device over the given time is greater than or equal to the threshold value; and
wherein the updating further comprises:
updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object over the given time if the detected change in the physical object is based on a change in the state of the virtual objects and the physical object, and
updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object and adjusted to remove effects caused by the force applied to the interaction delivery device over the given time if the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object.
22. An interaction system comprising:
a physical object;
at least one virtual object;
an interaction delivery device comprising:
a tracking mechanism configured to track at least one of a position of the physical object relative to the tracking mechanism and an orientation of the physical object relative to the tracking mechanism,
a detecting mechanism configured to detect motion of the detecting mechanism, wherein a position of the detecting mechanism relative to a position of the tracking mechanism is predetermined, and
a display configured to display the at least one virtual object; and
a processing device,
wherein the processing device is configured to receive physical object position information from the tracking mechanism,
wherein the processing device is configured to receive detecting mechanism motion information from the detecting mechanism,
wherein the processing device is configured to perform at least one of:
determining if a magnitude of acceleration of the detecting mechanism is greater than a predetermined acceleration threshold value and generating adjusted physical object position information if the magnitude of acceleration of the detecting mechanism is greater than the predetermined acceleration threshold value, and
determining if a magnitude of velocity of the detecting mechanism is greater than a predetermined velocity threshold value and generating adjusted physical object position information if the magnitude of velocity of the detecting mechanism is greater than the predetermined velocity threshold value,
wherein the processing device is configured to generate motion information for the at least one virtual object based on the adjusted physical object position information if the processing device generates the adjusted physical object position information, and
wherein the processing device is configured to generate the motion information for the at least one virtual object based on the physical object position information if the processing device does not generate the adjusted physical object position information, and
wherein the processing device is configured to output the motion information for the at least one virtual object to the display.
23. The interaction system of claim 22,
wherein the detecting mechanism is further configured to detect a correct direction and magnitude of a physical gravity vector, and
wherein the processing device is further configured to generate the motion information for the at least one virtual object additionally based on the correct direction and magnitude of the physical gravity vector.
24. The interaction system of claim 23,
wherein the at least one virtual object is configured to move according to a virtual gravity vector, and
wherein the virtual gravity vector is substantially identical to the physical gravity vector.
25. The interaction system of claim 23,
wherein the at least one virtual object is configured to move according to a virtual gravity vector, and
wherein the virtual gravity vector is different from the physical gravity vector.
26. The interaction system of claim 22,
wherein the tracking mechanism is an optical tracking device and the physical object does not include attached or embedded electronic devices or components.
27. An interaction delivery system comprising:
a first tracking mechanism rigidly attached to a display device configured to track the position and orientation of a physical object relative to the display device;
a second tracking mechanism rigidly attached to the display device, configured to track the absolute orientation of the display device relative to the earth; and
a processing device configured to process tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism,
wherein the processing device is configured to simulate motion information of at least one virtual object from the tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism,
wherein the processing device is configured to output the simulated motion information of the at least one virtual object to the display,
wherein the display is configured to display the at least one virtual object disposed relative to the physical object, and
wherein the at least one virtual object is configured to behave as if acted on by a virtual gravity vector in a same direction as a physical gravity vector acting on the physical object.
US13/084,488 2010-04-09 2011-04-11 System and method for a 3d computer game with true vector of gravity Abandoned US20110250962A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/084,488 US20110250962A1 (en) 2010-04-09 2011-04-11 System and method for a 3d computer game with true vector of gravity

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US32252110P 2010-04-09 2010-04-09
US13/084,488 US20110250962A1 (en) 2010-04-09 2011-04-11 System and method for a 3d computer game with true vector of gravity

Publications (1)

Publication Number Publication Date
US20110250962A1 true US20110250962A1 (en) 2011-10-13

Family

ID=44761324

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/084,488 Abandoned US20110250962A1 (en) 2010-04-09 2011-04-11 System and method for a 3d computer game with true vector of gravity

Country Status (1)

Country Link
US (1) US20110250962A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040104934A1 (en) * 2001-06-19 2004-06-03 Fager Jan G. Device and a method for creating an environment for a creature
US20040110565A1 (en) * 2002-12-04 2004-06-10 Louis Levesque Mobile electronic video game
US20080204361A1 (en) * 2007-02-28 2008-08-28 Science Applications International Corporation System and Method for Video Image Registration and/or Providing Supplemental Data in a Heads Up Display
US20090271715A1 (en) * 2008-01-29 2009-10-29 Tumuluri Ramakrishna J Collaborative augmented virtuality system
US20110066682A1 (en) * 2009-09-14 2011-03-17 Applied Research Associates, Inc. Multi-Modal, Geo-Tempo Communications Systems

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Machine translation of Koch et al., WO 2009/138069, published on 19 November 2009, translated on 21 February 2013. *
Yasuyoshi Yokokohji, Yoshihiko Sugawara, and Tsuneo Yoshikawa, Accurate Image Overlay on Video See-Through HMDs Using Vision and Accelerometers, 2000, IEEE, VR '00 Proceedings of the IEEE Virtual Reality 2000 Conference. *

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110227945A1 (en) * 2010-03-17 2011-09-22 Sony Corporation Information processing device, information processing method, and program
US8854393B2 (en) * 2010-03-17 2014-10-07 Sony Corporation Information processing device, information processing method, and program
US10495790B2 (en) 2010-10-21 2019-12-03 Lockheed Martin Corporation Head-mounted display apparatus employing one or more Fresnel lenses
US10359545B2 (en) 2010-10-21 2019-07-23 Lockheed Martin Corporation Fresnel lens with reduced draft facet visibility
US8625200B2 (en) 2010-10-21 2014-01-07 Lockheed Martin Corporation Head-mounted display apparatus employing one or more reflective optical surfaces
US9632315B2 (en) 2010-10-21 2017-04-25 Lockheed Martin Corporation Head-mounted display apparatus employing one or more fresnel lenses
US8781794B2 (en) 2010-10-21 2014-07-15 Lockheed Martin Corporation Methods and systems for creating free space reflective optical surfaces
US9720228B2 (en) 2010-12-16 2017-08-01 Lockheed Martin Corporation Collimating display with pixel lenses
US20120276994A1 (en) * 2011-04-28 2012-11-01 Microsoft Corporation Control of separate computer game elements
US9259643B2 (en) * 2011-04-28 2016-02-16 Microsoft Technology Licensing, Llc Control of separate computer game elements
US20120277001A1 (en) * 2011-04-28 2012-11-01 Microsoft Corporation Manual and Camera-based Game Control
US9153073B2 (en) 2012-05-23 2015-10-06 Qualcomm Incorporated Spatially registered augmented video
US9429912B2 (en) * 2012-08-17 2016-08-30 Microsoft Technology Licensing, Llc Mixed reality holographic object development
US20140049559A1 (en) * 2012-08-17 2014-02-20 Rod G. Fleck Mixed reality holographic object development
US20150241717A1 (en) * 2012-10-05 2015-08-27 Essilor International (Compagnie Générale d'Optique) Method For Improving Visual Comfort To a Wearer And Associated Active System Of Vision
US10042184B2 (en) * 2012-10-05 2018-08-07 Essilor International Method for improving visual comfort to a wearer and associated active system of vision
US10438401B2 (en) * 2012-12-04 2019-10-08 Nintendo Co., Ltd. Caching in map systems for displaying panoramic images
US20180075644A1 (en) * 2012-12-04 2018-03-15 Nintendo Co., Ltd. Caching in map systems for displaying panoramic images
US11341726B2 (en) 2013-07-16 2022-05-24 Seiko Epson Corporation Information processing apparatus, information processing method, and information processing system
US10664216B2 (en) * 2013-07-16 2020-05-26 Seiko Epson Corporation Information processing apparatus, information processing method, and information processing system
US20180364965A1 (en) * 2013-07-16 2018-12-20 Seiko Epson Corporation Information processing apparatus, information processing method, and information processing system
US20150241708A1 (en) * 2013-08-23 2015-08-27 Panasonic Intellectual Property Corporation Of America Head-mounted display
US9804398B2 (en) * 2013-08-23 2017-10-31 Panasonic Intellectual Property Corporation Of America Head-mounted perfume dispenser apparatus
US20150065866A1 (en) * 2013-09-03 2015-03-05 Siemens Aktiengesellschaft Method for repositioning a mobile imaging system, image capturing unit and optical marker
US9763599B2 (en) * 2013-09-03 2017-09-19 Siemens Aktiengesellschaft Method for repositioning a mobile imaging system, image capturing unit and optical marker
US9423620B2 (en) 2014-04-24 2016-08-23 Lg Electronics Inc. Head mounted display and method for controlling the same
US10684476B2 (en) 2014-10-17 2020-06-16 Lockheed Martin Corporation Head-wearable ultra-wide field of view display device
US9939650B2 (en) 2015-03-02 2018-04-10 Lockheed Martin Corporation Wearable display system
US20190373042A1 (en) * 2015-04-22 2019-12-05 Samsung Electronics Co., Ltd. Method and apparatus for transmitting and receiving image data for virtual-reality streaming service
US11050810B2 (en) * 2015-04-22 2021-06-29 Samsung Electronics Co., Ltd. Method and apparatus for transmitting and receiving image data for virtual-reality streaming service
US10754156B2 (en) 2015-10-20 2020-08-25 Lockheed Martin Corporation Multiple-eye, single-display, ultrawide-field-of-view optical see-through augmented reality system
US11351472B2 (en) 2016-01-19 2022-06-07 Disney Enterprises, Inc. Systems and methods for using a gyroscope to change the resistance of moving a virtual weapon
US11663783B2 (en) 2016-02-10 2023-05-30 Disney Enterprises, Inc. Systems and methods for using augmented reality with the internet of things
US10587834B2 (en) 2016-03-07 2020-03-10 Disney Enterprises, Inc. Systems and methods for tracking objects for augmented reality
US9995936B1 (en) 2016-04-29 2018-06-12 Lockheed Martin Corporation Augmented reality systems having a virtual image overlaying an infrared portion of a live scene
US10365712B2 (en) 2016-05-17 2019-07-30 Google Llc Object tracking in a head mounted reference frame in an augmented and/or virtual reality environment
US11243402B2 (en) 2017-05-30 2022-02-08 Apple Inc. Video compression methods and apparatus
US10914957B1 (en) 2017-05-30 2021-02-09 Apple Inc. Video compression methods and apparatus
US11914152B2 (en) 2017-05-30 2024-02-27 Apple Inc. Video compression methods and apparatus
US11543929B2 (en) * 2017-12-22 2023-01-03 Snap Inc. Augmented reality user interface control
CN111492330A (en) * 2017-12-22 2020-08-04 斯纳普公司 Augmented reality user interface control
US10996811B2 (en) * 2017-12-22 2021-05-04 Snap Inc. Augmented reality user interface control
US10481680B2 (en) 2018-02-02 2019-11-19 Disney Enterprises, Inc. Systems and methods to provide a shared augmented reality experience
US10546431B2 (en) * 2018-03-29 2020-01-28 Disney Enterprises, Inc. Systems and methods to augment an appearance of physical object for an augmented reality experience
US20190304191A1 (en) * 2018-03-29 2019-10-03 Disney Enterprises, Inc. Systems and methods to augment an appearance of physical object for an augmented reality experience
US10974132B2 (en) 2018-10-02 2021-04-13 Disney Enterprises, Inc. Systems and methods to provide a shared interactive experience across multiple presentation devices based on detection of one or more extraterrestrial bodies
US11389730B2 (en) 2019-03-27 2022-07-19 Disney Enterprises, Inc. Systems and methods for game profile development based on virtual and/or real activities
US11014008B2 (en) 2019-03-27 2021-05-25 Disney Enterprises, Inc. Systems and methods for game profile development based on virtual and/or real activities
CN109976527A (en) * 2019-03-28 2019-07-05 重庆工程职业技术学院 Interactive VR display systems
US10916061B2 (en) 2019-04-24 2021-02-09 Disney Enterprises, Inc. Systems and methods to synchronize real-world motion of physical objects with presentation of virtual content
US11567323B2 (en) * 2019-12-01 2023-01-31 Vision Products, Llc Partial electronic see-through head-mounted display
US20220155853A1 (en) * 2020-11-19 2022-05-19 Beijing Boe Optoelectronics Technology Co., Ltd. Augmented reality information prompting system, display control method, equipment and medium
US11703945B2 (en) * 2020-11-19 2023-07-18 Beijing Boe Optoelectronics Technology Co., Ltd. Augmented reality information prompting system, display control method, equipment and medium

Similar Documents

Publication Publication Date Title
US20110250962A1 (en) System and method for a 3d computer game with true vector of gravity
US10137374B2 (en) Method for an augmented reality character to maintain and exhibit awareness of an observer
EP3592443B1 (en) Augmented ride system and method
Anthes et al. State of the art of virtual reality technology
Murray Building virtual reality with unity and steamvr
US20080211771A1 (en) Approach for Merging Scaled Input of Movable Objects to Control Presentation of Aspects of a Shared Virtual Environment
US20160124502A1 (en) Sensory feedback systems and methods for guiding users in virtual reality environments
CN109643014A (en) Head-mounted display tracking
US8669938B2 (en) Approach for offset motion-based control of a computer
Hachimura et al. A prototype dance training support system with motion capture and mixed reality technologies
CN102542867A (en) Driving simulator control with virtual skeleton
US10884505B1 (en) Systems and methods for transitioning to higher order degree-of-freedom tracking
Song et al. An immersive VR system for sports education
Martin Virtual reality
Datta et al. An exploratory analysis of head mounted displays for vr applications
Benton et al. Oculus Rift in Action
Gu et al. Analysis of the Treadmill Utilization for the Development of a Virtual Reality Walking Interface
TW201944365A (en) A method to enhance first-person-view experience
Chung Metaverse XR Components
Wu et al. Capturing reality for a billiards simulation
Loviscach Playing with all senses: Human–Computer interface devices for games
KR102212508B1 (en) Virtual reality control system
Tingvall Interior Design and Navigation in Virtual Reality
Whitton et al. Locomotion interfaces
Medved Augmented Reality Billiards Assistant

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE TRUSTEES OF COLUMBIA UNIVERSITY IN THE CITY OF NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FEINER, STEVEN K.;ODA, OHAN;REEL/FRAME:026506/0830

Effective date: 20110622

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION