US20110306413A1 - Entertainment device and entertainment methods - Google Patents

Entertainment device and entertainment methods

Info

Publication number
US20110306413A1
US20110306413A1 US13/150,357 US201113150357A
Authority
US
United States
Prior art keywords
pointing
display
source
dimensional image
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/150,357
Inventor
Ian Bickerstaff
Ian Michael Hocking
Nigel Kershaw
Simon Benson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
D YOUNG AND CO LLP
Original Assignee
D YOUNG AND CO LLP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by D YOUNG AND CO LLP filed Critical D YOUNG AND CO LLP
Assigned to SONY COMPUTER ENTERTAINMENT EUROPE LIMITED reassignment SONY COMPUTER ENTERTAINMENT EUROPE LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Benson, Simon, BICKERSTAFF, IAN, HOCKING, IAN MICHAEL, Kershaw, Nigel
Publication of US20110306413A1 publication Critical patent/US20110306413A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/53Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/56Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/57Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F13/655Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0325Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308Details of the user interface
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/64Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6607Methods for processing data by generating or executing the game program for rendering three dimensional images for animating game characters, e.g. skeleton kinematics
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6653Methods for processing data by generating or executing the game program for rendering three dimensional images for altering the visibility of an object, e.g. preventing the occlusion of an object, partially hiding an object
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/69Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/69Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • A63F2300/695Imported photos, e.g. of the player

Definitions

  • the present invention relates to an entertainment device and entertainment methods.
  • motion controllers such as the controller for the Nintendo® Wii® (sometimes known as the “Wiimote”) have become popular for controlling entertainment devices.
  • Such controllers typically use motion data from internal motion sensors such as gyroscopes and accelerometers combined with optical data generated in dependence on a light source mounted on a display to detect the position and orientation of the motion controller. Accordingly, a user can use the motion controller to point at objects displayed on the display to select one of the objects. Furthermore, the user can move the motion controller to control motion of a game character.
  • motion control is limited to objects displayed on the display.
  • so-called three-dimensional (3D) TVs which can display images to be viewed in 3D are becoming more popular.
  • Such TVs allow a user to perceive a three-dimensional image, for example by the user wearing suitable viewing glasses to view the TV.
  • videogames which can output images which can be displayed on 3D TVs so that the user perceives the video game content as three-dimensional images.
  • an entertainment device comprising: means for displaying an image on a display to a user; means for detecting a three-dimensional position of an object in front of an image plane of the display; means for detecting a source position of a pointing source with respect to the display, the pointing source having an associated pointing direction indicative of a direction in which the pointing source is pointing; means for detecting a pointing direction of a user control device, the pointing direction of the user control device being associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source; and means for detecting whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
  • an entertainment method comprising: displaying an image on a display to a user; detecting a three-dimensional position of an object in front of an image plane of the display; detecting a source position of a pointing source with respect to the display, the pointing source having an associated pointing direction indicative of a direction in which the pointing source is pointing; detecting a pointing direction of a user control device, the pointing direction of the user control device being associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source; and detecting whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
  • embodiments of the invention can advantageously detect whether a user is using the user control device to cause the pointing source to point at an object in front of the image plane of the display. For example, a user could use the user control device to point at a real object in front of the display to select that object for image processing. As another example, during a game executed by the entertainment device, the user could use the user control device to point at a computer generated object which is caused to appear in front of the image plane of the display. Embodiments of the present invention therefore can provide a more immersive and diverse experience for a user.
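  • by way of illustration only, a minimal Python sketch of such a point-on-ray test is given below; the function and parameter names, the units and the tolerance value are assumptions for illustration and are not part of the described embodiment.

        import numpy as np

        def lies_on_pointing_line(object_pos, source_pos, pointing_dir, tolerance=0.05):
            # Return True if object_pos (x, y, z, metres) lies, within 'tolerance', on the
            # half-line that starts at source_pos and runs along pointing_dir.
            d = np.asarray(pointing_dir, dtype=float)
            d = d / np.linalg.norm(d)                              # unit pointing direction
            v = np.asarray(object_pos, dtype=float) - np.asarray(source_pos, dtype=float)
            t = float(np.dot(v, d))                                # distance along the line
            if t < 0.0:
                return False                                       # object is behind the pointing source
            closest = np.asarray(source_pos, dtype=float) + t * d
            return float(np.linalg.norm(np.asarray(object_pos, dtype=float) - closest)) <= tolerance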
  • the pointing source corresponds to a position of a game character within a game executed by the entertainment device. In other embodiments, the pointing source corresponds to the position of the user control device.
  • an entertainment device comprising: generating means for generating a three-dimensional image to be displayed on a display; means for detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; and means for detecting a source position and pointing direction of a user control device for controlling the entertainment device, the source position being indicative of a position of the user control device with respect to the display, and the pointing direction being indicative of a direction in which the user control device is pointing with respect to the display; in which the generating means is operable to generate the three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the source position and the pointing direction of the user control device.
  • an entertainment method comprising: generating, using an entertainment device, a three-dimensional image to be displayed on a display; detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; detecting a source position and pointing direction of a user control device for controlling the entertainment device, the source position being indicative of a position of the user control device with respect to the display, and the pointing direction being indicative of a direction in which the user control device is pointing with respect to the display; and generating the three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the source position and the pointing direction of the user control device.
  • embodiments of the present invention advantageously provide a more interactive user experience.
  • the user control device could act as a flame thrower within a game with the flames appearing to come from the motion controller.
  • a more powerful and immersive 3D experience can therefore be provided to the user.
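  • one way the flame thrower example could be realised is sketched below: particles are spawned along the detected pointing direction, starting at the detected source position, so that the rendered 3D image appears to emanate from the motion controller. The names, particle count and spacing are illustrative assumptions only, not a description of the actual game logic.

        import numpy as np

        def flame_particle_positions(source_pos, pointing_dir, num_particles=16, spacing=0.05):
            # Spawn particle positions (metres, in display coordinates) along the detected
            # pointing direction, starting at the detected source position, so that the
            # rendered flames appear to emanate from the motion controller.
            d = np.asarray(pointing_dir, dtype=float)
            d = d / np.linalg.norm(d)
            origin = np.asarray(source_pos, dtype=float)
            return [origin + (i + 1) * spacing * d for i in range(num_particles)]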
  • an entertainment device comprising: generating means for generating a three-dimensional image to be displayed on a display; means for detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; means for detecting the apparent position of the three-dimensional image with respect to the display; and controlling means for controlling the appearance of the three-dimensional image in dependence upon the relative position of the three-dimensional image with respect to the image display region.
  • an entertainment method comprising: generating a three-dimensional image to be displayed on a display; detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; detecting the apparent position of the three-dimensional image with respect to the display; and controlling the appearance of the three-dimensional image in dependence upon the relative position of the three-dimensional image with respect to the image display region.
  • Embodiments of the present invention advantageously address a problem which may occur when displaying 3D images, for example when rendering 3D images for 3D video games which use pseudo-realistic physics engines.
  • video games may use realistic physics models to predict the motion of objects within a game such as debris from an explosion.
  • the motion of objects within physics based video games or physics based virtual reality simulations typically allows 3D objects to travel unrestricted through the virtual environment.
  • embodiments of the invention control the appearance of the 3D image in dependence upon the relative position of the 3D image with respect to the 3D image display region.
  • the path or trajectory of the object could be controlled so that the object remains within the 3D image display region.
  • the appearance of the 3D image could be caused to fade when a position of the 3D image is within a threshold distance of an edge of the 3D image display region. Accordingly, a likelihood that the user experiences nausea and/or headaches can be reduced. Furthermore, a more diverse and immersive 3D experience can be provided, because the user is more likely to feel comfortable experiencing the 3D experience for longer periods of time.
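  • a minimal sketch of this fading behaviour, assuming the 3D image display region is approximated by an axis-aligned box, is given below; the names and the threshold value are illustrative assumptions.

        import numpy as np

        def fade_factor(position, region_min, region_max, threshold=0.1):
            # Opacity in the range [0, 1]: fully opaque well inside the display region,
            # fading linearly to transparent as the object comes within 'threshold' metres
            # of any edge of the region, and zero outside it.
            position = np.asarray(position, dtype=float)
            margins = np.minimum(position - np.asarray(region_min, dtype=float),
                                 np.asarray(region_max, dtype=float) - position)
            nearest_edge = float(np.min(margins))
            return float(np.clip(nearest_edge / threshold, 0.0, 1.0))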
  • FIG. 1 is a schematic diagram of an entertainment device
  • FIG. 2 is a schematic diagram of a cell processor
  • FIG. 3 is a schematic diagram of a video graphics processor
  • FIGS. 4A and 4B are schematic diagrams of a stereoscopic camera and captured stereoscopic images
  • FIG. 5A is a schematic diagram of a stereoscopic camera
  • FIGS. 5B and 5C are schematic diagrams of a viewed stereoscopic image
  • FIG. 6 is a schematic diagram of a motion controller in accordance with embodiments of the present invention.
  • FIG. 7 is a schematic diagram of a user using the motion controller to control the entertainment device in accordance with embodiments of the present invention.
  • FIG. 8 is a schematic diagram of a user using the motion controller to control the entertainment device in accordance with embodiments of the present invention.
  • FIG. 9 is a schematic diagram of a user using the motion controller to control the entertainment device in accordance with embodiments of the present invention.
  • FIG. 10 is a schematic diagram showing a plan view of the user seen from above viewing a display at two different horizontal positions in accordance with embodiments of the present invention.
  • FIG. 11 is a schematic diagram of the user using the motion controller to point at a computer generated object as seen from one side of the display in accordance with embodiments of the present invention.
  • FIG. 12 is a schematic diagram of a three-dimensional (3D) image display region shown with respect to the display seen from above in accordance with embodiments of the present invention.
  • FIG. 13 is a schematic diagram of the 3D image display region viewed from the side in accordance with embodiments of the present invention.
  • FIG. 14 is a schematic diagram of control of the appearance of a 3D image in accordance with embodiments of the present invention.
  • FIG. 15 is a schematic diagram of a 3D icon field in accordance with embodiments of the present invention.
  • FIG. 16 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • FIG. 17 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • FIG. 18 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • FIG. 1 schematically illustrates the overall system architecture of a Sony® Playstation 3® entertainment device.
  • a system unit 10 is provided, with various peripheral devices connectable to the system unit.
  • the system unit 10 comprises: a Cell processor 100 ; a Rambus® dynamic random access memory (XDRAM) unit 500 ; a Reality Synthesiser graphics unit 200 with a dedicated video random access memory (VRAM) unit 250 ; and an I/O bridge 700 .
  • the system unit 10 also comprises a Blu Ray® Disk BD-ROM® optical disk reader 430 for reading from a disk 440 and a removable slot-in hard disk drive (HDD) 400 , accessible through the I/O bridge 700 .
  • the system unit also comprises a memory card reader 450 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 700 .
  • the I/O bridge 700 also connects to four Universal Serial Bus (USB) 2.0 ports 710 ; a gigabit Ethernet port 720 ; an IEEE 802.11b/g wireless network (Wi-Fi) port 730 ; and a Bluetooth® wireless link port 740 capable of supporting up to seven Bluetooth connections.
  • the I/O bridge 700 handles all wireless, USB and Ethernet data, including data from one or more game controllers 751 .
  • the I/O bridge 700 receives data from the game controller 751 via a Bluetooth link and directs it to the Cell processor 100 , which updates the current state of the game accordingly.
  • the wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 751 , such as: a remote control 752 ; a keyboard 753 ; a mouse 754 ; a portable entertainment device 755 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 756 ; a microphone headset 757 ; and a motion controller 758 .
  • the motion controller 758 will be described in more detail later below.
  • the video camera is a stereoscopic video camera 1010 .
  • peripheral devices may therefore in principle be connected to the system unit 10 wirelessly; for example the portable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 757 may communicate via a Bluetooth link.
  • Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
  • a legacy memory card reader 410 may be connected to the system unit via a USB port 710 , enabling the reading of memory cards 420 of the kind used by the Playstation® or Playstation 2® devices.
  • the game controller 751 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link
  • the game controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 751 .
  • the game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands.
  • other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller.
  • additional game or control information may be provided on the screen of the device.
  • Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
  • the remote control 752 is also operable to communicate wirelessly with the system unit 10 via a Bluetooth link
  • the remote control 752 comprises controls suitable for the operation of the Blu Ray Disk BD-ROM reader 430 and for the navigation of disk content.
  • the Blu Ray Disk BD-ROM reader 430 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs.
  • the reader 430 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs.
  • the reader 430 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
  • the system unit 10 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesiser graphics unit 200 , through audio and video connectors to a display and sound output device 300 such as a monitor or television set having a display 305 and one or more loudspeakers 310 .
  • the audio connectors 210 may include conventional analogue and digital outputs whilst the video connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition. Audio processing (generation, decoding and so on) is performed by the Cell processor 100 .
  • the Playstation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks.
  • the video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 10 .
  • the camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 10 , for example to signify adverse lighting conditions.
  • Embodiments of the video camera 756 may variously connect to the system unit 10 via a USB, Bluetooth or Wi-Fi communication port.
  • Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data.
  • the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.
  • in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 10 , an appropriate piece of software such as a device driver should be provided.
  • Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment.
  • the Cell processor 100 has an architecture comprising four basic components: external input and output structures comprising a memory controller 160 and a dual bus interface controller 170 A,B; a main processor referred to as the Power Processing Element 150 ; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110 A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 180 .
  • the total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPs of the Playstation 2 device's Emotion Engine.
  • the Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache.
  • the PPE 150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz.
  • the primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110 A-H, which handle most of the computational workload. In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110 A-H and monitoring their progress. Consequently each Synergistic Processing Element 110 A-H runs a kernel whose role is to fetch a job, execute it and to synchronise with the PPE 150 .
  • Each Synergistic Processing Element (SPE) 110 A-H comprises a respective Synergistic Processing Unit (SPU) 120 A-H, and a respective Memory Flow Controller (MFC) 140 A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 142 A-H, a respective Memory Management Unit (MMU) 144 A-H and a bus interface (not shown).
  • Each SPU 120 A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130 A-H, expandable in principle to 4 GB.
  • Each SPE gives a theoretical 25.6 GFLOPS of single precision performance.
  • An SPU can operate on 4 single precision floating point numbers, 4 32-bit integers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation.
  • the SPU 120 A-H does not directly access the system memory XDRAM 500 ; the 64-bit addresses formed by the SPU 120 A-H are passed to the MFC 140 A-H which instructs its DMA controller 142 A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160 .
  • the Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the Cell processor 100 which connects the above processor elements, namely the PPE 150 , the memory controller 160 , the dual bus interface 170 A,B and the 8 SPEs 110 A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110 A-H comprises a DMAC 142 A-H for scheduling longer read or write sequences.
  • the EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction.
  • the theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
  • the memory controller 160 comprises an XDRAM interface 162 , developed by Rambus Incorporated.
  • the memory controller interfaces with the Rambus XDRAM 500 with a theoretical peak bandwidth of 25.6 GB/s.
  • the dual bus interface 170 A,B comprises a Rambus FlexIO® system interface 172 A,B.
  • the interface is organised into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170 A and the Reality Simulator graphics unit 200 via controller 170 B.
  • Data sent by the Cell processor 100 to the Reality Simulator graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
  • the Reality Simulator graphics (RSX) unit 200 is a video accelerator based upon the NVidia® G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100 .
  • the RSX unit 200 comprises a host interface 202 operable to communicate with the bus interface controller 170 B of the Cell processor 100 ; a vertex pipeline 204 (VP) comprising eight vertex shaders 205 ; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207 ; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209 ; a memory interface 210 ; and a video converter 212 for generating a video output.
  • the RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250 , clocked at 600 MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s.
  • the VRAM 250 maintains a frame buffer 214 and a texture buffer 216 .
  • the texture buffer 216 provides textures to the pixel shaders 207 , whilst the frame buffer 214 stores results of the processing pipelines.
  • the RSX can also access the main memory 500 via the EIB 180 , for example to load textures into the VRAM 250 .
  • the vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered.
  • the pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel.
  • Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture).
  • the render pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image.
  • the render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency.
  • the render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.
  • Both the vertex shaders 205 and pixel shaders 207 are based on the shader model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second.
  • the total floating point performance of the RSX 200 is 1.8 TFLOPS.
  • the RSX 200 operates in close collaboration with the Cell processor 100 ; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene.
  • the PPU 155 of the Cell processor may schedule one or more SPEs 110 A-H to compute the trajectories of respective batches of particles.
  • the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500 via the element interconnect bus 180 , the memory controller 160 and a bus interface controller 170 B.
  • the or each SPE 110 A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250 ; the DMA controller 142 A-H of the or each SPE 110 A-H addresses the video RAM 250 via the bus interface controller 170 B.
  • the assigned SPEs become part of the video processing pipeline for the duration of the task.
  • the PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled.
  • the disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process.
  • the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor.
  • the PPU 155 can assign tasks to SPEs in several ways. For example, SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE. Alternatively or in addition, two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above.
  • Software instructions implemented by the Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400 , and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these.
  • the software supplied at manufacture comprises system firmware and the Playstation 3 device's operating system (OS).
  • the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video.
  • the interface takes the form of a so-called cross media-bar (XMB), with categories of function arranged horizontally.
  • the user navigates by moving through the function icons (representing the functions) horizontally using the game controller 751 , remote control 752 or other suitable control device so as to highlight a desired function icon, at which point options pertaining to that function appear as a vertically scrollable list of option icons centred on that function icon, which may be navigated in analogous fashion.
  • the Playstation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400 ).
  • the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demonstration games (demos) and other media may be downloaded, and a friends management capability, providing on-line communication with other Playstation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available.
  • the on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the Playstation 3 device itself. It will be appreciated that the term “on-line” does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.
  • a stereoscopic camera 1010 generates a pair of images whose viewpoints are separated by a known distance equal to average eye separation.
  • both lenses of the stereoscopic camera are looking at a sequence of objects P, Q, R, S and T, and two additional objects N and O (assumed for the purposes of explanation to be positioned above the other objects).
  • the resulting pair of images comprising left-eye image 1012 and right-eye image 1014
  • the different image viewpoints result in a different image of the objects from each lens.
  • an overlay image 1020 of the stereoscopic image pair illustrates that the displacement between the objects within the image pair 1012 and 1014 is inversely proportional to the distance of the object from the stereoscopic camera.
  • the stereoscopic image pair is displayed via a display mechanism (such as alternate frame sequencing and glasses with switchably occluded lenses, or lenticular lensing on an autostereoscopic display screen) that delivers a respective one of the pair of images ( 1012 , 1014 ) to a respective eye of the viewer, and the object displacement between the images delivered to each eye causes an illusion of depth in the viewed content.
  • This relative displacement between corresponding image elements of the left- and right-image is also referred to in stereoscopy as parallax (as distinct from the visual effect of objects at different distances panning at different speeds, also sometimes known as the parallax effect).
  • so-called ‘positive parallax’ causes an object to appear to be within or behind the plane of the screen, and in this case the displacement is such that a left eye image element is to the left of a right eye image element.
  • ‘negative parallax’ causes an object to appear to be in front of the plane of the screen, and in this case the displacement is such that a left eye image element is to the right of a right eye image element, as is the case in FIGS. 4A and 4B .
  • ‘zero parallax’ occurs at the plane of the screen, where the user focuses their eyes and hence there is no displacement between left and right image elements.
  • the display mechanism and the position of the viewer in combination determine whether the apparent distance of the objects is faithfully reproduced.
  • the size of the display acts as a scaling factor on the apparent displacement (parallax) of the objects; as a result a large screen (such as in a cinema) requires a greater distance from the user (i.e. in the cinema auditorium) to produce the appropriate parallax. Meanwhile a smaller screen such as that of a domestic television requires a smaller distance.
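  • the geometric relationships described above can be sketched as follows, under simple pinhole assumptions; the default eye separation, the sign convention (right-eye element position minus left-eye element position) and the function names are assumptions for illustration only.

        def screen_parallax(apparent_depth, viewer_distance, eye_separation=0.065):
            # Horizontal on-screen displacement (metres, right-eye element minus left-eye
            # element) needed for an element to appear 'apparent_depth' metres from a viewer
            # sitting 'viewer_distance' metres from the screen.  Positive result: positive
            # parallax (behind the screen plane); negative: in front of the screen plane;
            # zero when apparent_depth equals viewer_distance (the screen plane itself).
            return eye_separation * (apparent_depth - viewer_distance) / apparent_depth

        def scaled_parallax(authored_parallax, authored_screen_width, actual_screen_width):
            # A larger screen scales the physical displacement proportionally, which is why
            # a cinema screen calls for a correspondingly greater viewing distance.
            return authored_parallax * (actual_screen_width / authored_screen_width)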
  • In FIG. 5A , reference is made only to objects P and T for clarity, but it will be appreciated that the following holds true for all stereoscopic image elements.
  • the respective distances from the objects P and T to the stereoscopic camera 1010 are z_P and z_T . As described previously, these respective distances result in different displacements for the objects between the images captured by the two lenses of the stereoscopic camera, as seen again in the overlay 1020 of the two captured images in FIG. 5B . In this case, both objects show negative parallax.
  • the viewer's brain interprets the position of the objects as being located at the point of intersection of the respective lines of sight of each eye and the object as depicted in each image. In FIG. 5B , these are at distances z_P′ and z_T′ from the user.
  • the apparent depth is basically correct as long as the distance of the user is correct for the current screen size, even if the user is not central to the image; as can be seen in FIG. 5C , where again z_P′ ≈ z_P , z_T′ ≈ z_T and (z_T′ − z_P′) ≈ (z_T − z_P).
  • the effective magnification of the captured scene is 1:1.
  • different scenes may zoom in or out, and different screen sizes also magnify the reproduced image.
  • the apparent depth is correct if the apparent scale (magnification) along the depth or ‘Z’ axis is the same as the apparent scale in the ‘X’ and ‘Y’ axis of the image plane.
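  • conversely, the depth at which the viewer's lines of sight intersect can be recovered from the on-screen displacement; a sketch under the same illustrative assumptions as above:

        def perceived_depth(parallax, viewer_distance, eye_separation=0.065):
            # Depth (metres from the viewer) at which the left- and right-eye lines of sight
            # intersect, given the on-screen displacement 'parallax' (same sign convention as
            # above).  Zero parallax gives the screen distance; parallax approaching the eye
            # separation pushes the apparent depth towards infinity.
            return eye_separation * viewer_distance / (eye_separation - parallax)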
  • the distance of the viewer is detected using, for example, a video camera such as the EyeToy coupled with a remote distance measuring system, such as an infra-red emitter and detector.
  • a stereoscopic video camera can be used to determine the distance of the user based on the same displacement measurements noted for the stereoscopic images as described above.
  • Another alternative is to use a conventional webcam or EyeToy camera and to use known face recognition techniques to identify faces or heads of viewers, and from these to generate a measure of viewer distance from the display screen. In any of these cases, the relative position of the camera with respect to the 3D image display is also known, so that the distance from the viewer to the 3D image can be computed.
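  • the viewer-distance measurements mentioned above can be sketched as follows; the pinhole-camera formulae are standard, but the parameter names and the assumed average face width are illustrative assumptions rather than values used by the described system.

        def viewer_distance_from_stereo(disparity_px, baseline_m, focal_length_px):
            # Triangulation with a stereoscopic camera: Z = f * B / d, with the disparity d
            # in pixels, the baseline B in metres and the focal length f in pixels.
            return focal_length_px * baseline_m / disparity_px

        def viewer_distance_from_face(face_width_px, focal_length_px, assumed_face_width_m=0.16):
            # Rough monocular estimate from a detected face bounding box, assuming an average
            # real face width: Z = f * W / w.
            return focal_length_px * assumed_face_width_m / face_width_px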
  • the Bluetooth 740 , Wi-Fi 730 and ethernet 720 ports provide interfaces for peripherals such as the motion controller 758 .
  • the operation of the motion controller 758 in cooperation with the system unit 10 will now be described with reference to FIGS. 6 and 7 .
  • FIG. 6 is a schematic diagram of a motion controller in accordance with embodiments of the present invention.
  • the motion controller 758 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link
  • the motion controller 758 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the motion controller 758 .
  • the motion controller 758 comprises motion sensors such as accelerometers and gyroscopes which are sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each of three orthogonal axes. Consequently gestures and movements by the user of the motion controller 758 may be translated as inputs to a game or other application executed by the system unit 10 .
  • the motion controller 758 could be sensitive to any other suitable number of degrees of freedom as appropriate.
  • the motion controller 758 is operable to transmit motion data generated by the motion sensors to the system unit via the Bluetooth link or any other suitable communications link
  • the motion controller 758 comprises a light source 2005 and four input buttons 2010 .
  • the system unit 10 is operable to carry out image processing on images captured by the camera 756 using known techniques so as to detect a position of the light source 2005 within images captured by the camera 756 . Therefore, the system unit 10 can track the position of the light source 2005 within the captured images and hence track the position of the motion controller 758 with respect to the camera 756 . Accordingly, the system unit 10 is operable to generate position tracking data from the captured images which relates to the position of the light source 2005 with respect to the camera 756 or system unit 10 .
  • the system unit 10 is operable to combine data from the motion sensors of the motion controller 758 with the position tracking data. This allows a more accurate estimation of the position (e.g. x, y, z, coordinates) and attitude (e.g. pitch, roll, yaw) of the motion controller 758 because any error in either of the position tracking data or the motion data can be compensated for by data from the other data source.
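  • one way such a combination could be sketched is as a simple complementary filter that blends inertial dead reckoning with the optical position; the class name, blend weight and update scheme are illustrative assumptions, not the method actually used by the system unit 10:

```python
import numpy as np

class SimpleFusion:
    """Blend of optical tracking and inertial dead reckoning.

    The motion-sensor data gives a smooth, high-rate prediction, and the
    lower-rate (but drift-free) camera position pulls the estimate back so
    error does not accumulate. The 0.98/0.02 split is an illustrative tuning
    choice.
    """

    def __init__(self, blend=0.98):
        self.blend = blend
        self.position = np.zeros(3)   # x, y, z in metres
        self.velocity = np.zeros(3)   # metres per second

    def update(self, accel_m_s2, dt, camera_position=None):
        # Dead-reckon from the accelerometer reading.
        self.velocity += np.asarray(accel_m_s2, dtype=float) * dt
        predicted = self.position + self.velocity * dt
        if camera_position is None:
            self.position = predicted
        else:
            # Correct the prediction towards the optically tracked position.
            self.position = (self.blend * predicted
                             + (1.0 - self.blend) * np.asarray(camera_position, dtype=float))
        return self.position
```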
  • the motion controller 758 acts as a user control device.
  • the light source 2005 comprises one or more light emitting diodes (LEDs) so as to provide a full spectrum of colours in the visible range.
  • any other suitable light source such as organic light emitting diodes (OLED), incandescent bulbs, laser light sources, and the like may be used.
  • the light source 2005 comprises a spherical translucent housing which surrounds the LEDs, although it will be appreciated that any other suitable shape could be used.
  • the light source need not comprise the housing and the light source could comprise one or more light sources mounted so that light from the light sources can be emitted to the surroundings of the motion controller 758 .
  • the input buttons 2010 act in a similar manner to control buttons on the game controller 751 as described above.
  • although the embodiment of FIG. 6 has four input buttons, it will be appreciated that the motion controller 758 could comprise any number of input buttons or indeed not have any input buttons.
  • the motion controller 758 could comprise one or more analogue or digital joysticks for controlling the system unit 10 .
  • the motion controller is operable to control a colour and/or intensity of light emitted by the light source 2005 .
  • the colour and/or intensity of the light source 2005 can be controlled by the system unit 10 by sending appropriate commands via the wireless link.
  • the control of the colour and/or intensity of the light source 2005 can be useful when two or more motion controllers are to be used to control the system unit 10 .
  • each motion controller could be caused to have a different colour, thereby allowing the motion controllers to be distinguished from each other in the images captured by the camera 756 .
  • the colour of the light source can be controlled so as to contrast with an environmental background colour. This improves the reliability of the position tracking data because the light source is more easily distinguished from the background.
  • the colour and/or intensity of the light source 2005 can be controlled in response to game state or operational state.
  • a red colour could indicate that the motion controller is off or not paired with the system unit 10 .
  • the light source 2005 could be caused to turn red if a game player is shot in a game, or caused to turn blue or flash different colours if a game player casts a spell in a game.
  • FIG. 7 shows an example of a user 2015 using the motion controller to control the system unit 10 in accordance with embodiments of the present invention.
  • the user 2015 is using the motion controller 758 to point at the display 300 .
  • the system unit 10 is operable to analyse images captured by the camera 756 and correlate the motion data from the motion sensors to detect the relative position of the light source 2005 with respect to the camera 756 (as denoted by a vector A in FIG. 7 ).
  • the cell processor 100 is therefore operable to detect the pointing direction (as indicated by a vector B) of the motion controller 758 .
  • the user 2015 can therefore use the motion controller 758 to control a game or other functionality of the system unit 10 by appropriate manipulation of the motion controller 758 .
  • the user 2015 could point the motion controller 758 at a menu item displayed as part of a menu on the display 300 and activate one of the input buttons 2010 so as to select that menu item.
  • the user 2015 could use the motion controller 758 to point at a game object displayed on the display which they wish their game character to pick up and use within the game.
  • the user 2015 is limited to using the motion controller to point at objects displayed on the display 300.
  • the user 2015 can use the motion controller 758 to point at objects that are in front of an image plane of the display 300 . This will now be described in more detail with reference to FIGS. 8 and 9 .
  • FIG. 8 shows a schematic diagram of a user using the motion controller 758 to control the system unit 10 in accordance with embodiments of the present invention.
  • FIG. 8 shows the user 2015 using the motion controller 758 to point at an object 2020 .
  • the position of the motion controller 758 with respect to the camera as indicated by the vector A is detected by the cell processor by analysis of the images captured by the camera 756 .
  • the cell processor 100 is operable to carry out known image analysis techniques on the captured images so as to detect the location of the light source within the captured images.
  • the direction in which the motion controller is pointing (referred to herein as a pointing direction) is denoted by the vector B in FIG. 8 .
  • the cell processor 100 is operable to combine the motion data (such as pitch, roll, and yaw data and x, y, z position data) from the motion sensors of the motion controller 758 with the position data of the light source as generated from the captured images so as to detect the position and pointing direction of the motion controller 758 .
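  • for illustration, and assuming a frame with x to the right, y up and z running from the display towards the user, the pointing direction B can be represented as a unit vector derived from the pitch and yaw reported by the motion sensors; the convention and function name are assumptions rather than the device's actual coordinate system:

```python
import numpy as np

def pointing_direction(pitch_rad, yaw_rad):
    """Unit vector for the controller's pointing direction (vector B).

    Zero pitch and yaw are taken to mean the controller points straight at the
    display (along -z); other conventions simply permute the components.
    """
    cp = np.cos(pitch_rad)
    return np.array([cp * np.sin(yaw_rad),    # x: left/right
                     np.sin(pitch_rad),       # y: up/down
                     -cp * np.cos(yaw_rad)])  # z: towards the display
```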
  • the object 2020 can be a real physical object.
  • the object 2020 could be a tennis ball as shown schematically in FIG. 8 .
  • the system unit 10 is operable to cause a list of possible objects for detection to be displayed on the display 300 .
  • the list of possible objects for detection could comprise a ball, a face, a box and the like.
  • each object is associated with a template for pattern matching by the cell processor 100 .
  • the cell processor 100 is operable to carry out known image recognition techniques and image processing techniques such as pattern matching and face detection so as to detect the position of the object 2020 with respect to the camera 756 .
  • the cell processor 100 is operable to carry out image processing techniques on the images captured by the camera so as to detect a three-dimensional position of the object 2020 with respect to the camera 756 as denoted by the vector C in FIG. 8 .
  • the detection of the horizontal and vertical position of the object can be carried out using known image processing and pattern matching techniques to locate the object in the captured images.
  • the distance from the camera 756 to the object 2020 also needs to be detected.
  • the system unit 10 is operable to detect the distance from the camera 756 to the object in a similar manner to that described above for detecting the distance of the viewer.
  • a so-called “z-cam” from 3DV Systems (http://www.3dvsystems.com/) or a stereoscopic pair of cameras could be used to detect the distance to the object 2020.
  • any suitable technique for detecting the distance between the object 2020 and the camera 756 could be used.
  • the motion controller 758 acts as a pointing source with the pointing direction being in the direction of the vector B.
  • the cell processor 100 is operable to detect whether the three-dimensional position of the object 2020 lies on a line passing through the source position in the direction of the pointing direction of the pointing source. In other words, the cell processor 100 is operable to detect whether the object 2020 corresponds to the pointing direction of the motion controller 758. Therefore, the system unit 10 can detect whether the user 2015 is pointing at the object 2020.
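  • a minimal sketch of such an alignment test, treating the pointing source as a ray and accepting the object if it lies within a small tolerance of that ray; the tolerance value and function name are illustrative assumptions:

```python
import numpy as np

def is_pointing_at(source_position, pointing_direction, object_position, tolerance_m=0.05):
    """True if the object lies (approximately) on the line from the pointing
    source along its pointing direction.

    All positions are 3-vectors in the same frame (for example relative to the
    camera 756). The 5 cm tolerance is an illustrative allowance for noise.
    """
    d = np.asarray(pointing_direction, dtype=float)
    d = d / np.linalg.norm(d)
    to_object = np.asarray(object_position, dtype=float) - np.asarray(source_position, dtype=float)
    along = float(np.dot(to_object, d))
    if along < 0:
        return False                       # the object is behind the pointing source
    perpendicular = to_object - along * d  # shortest offset from the pointing line
    return float(np.linalg.norm(perpendicular)) <= tolerance_m
```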
  • the system unit 10 is operable to enact functions of the system unit in response to detection that the user 2015 is pointing at the object 2020 .
  • the user 2015 could point at an object to select that object for further image processing or to act as an object within a game.
  • the pointing functionality of pointing at an object in front of the image plane of the display 300 could be used in any suitable way to interact with the system unit 10 .
  • although in the embodiments described above the motion controller 758 acts as the pointing source, in other embodiments, the motion controller 758 need not be the pointing source. This will now be described in more detail with reference to FIG. 9.
  • FIG. 9 shows a schematic diagram of the user 2015 using the motion controller 758 to control the system unit 10 in accordance with embodiments of the present invention.
  • the system unit 10 is operable to cause the display 300 to display a game character 2025, which acts as the pointing source.
  • the pointing source corresponds to a position of the game character within a game executed by the system unit 10 .
  • the pointing source corresponds to the position of the motion controller 758 .
  • the pointing source has an associated pointing direction indicative of a direction in which the pointing source is pointing.
  • the cell processor 100 is operable to determine the position of the game character 2025 with respect to the camera 756 . As illustrated in FIG. 9 , the position of the game character 2025 with respect to the camera is illustrated by the vector A′.
  • the position of the camera 756 with respect to the display 300 can be calibrated by a user via a suitable calibration user interface executed by the system unit 10 and displayed on the display 300. Alternatively the user 2015 could be instructed to place the camera 756 at a predetermined position with respect to the display 300 as instructed by a suitable message displayed on the display 300 by the system unit 10.
  • a user may also input information regarding make and model of the display, screen size, display resolution and the like. Alternatively, this information may be acquired automatically by the system unit 10 via a suitable communication link between the display 300 and the system unit 10 .
  • any suitable method for determining the position of the camera with respect to the display 300 could be used.
  • the pointing source corresponds to the position of the game character 2025 .
  • the game character is the source of the pointing direction as indicated by the vector B′.
  • the pointing direction of the motion controller 758 (as indicated by a vector B′′ with the motion controller 758 as the source of the vector B′′) is associated with the pointing direction B′ of the game character 2025 .
  • the cell processor is operable to associate the pointing direction B′ of the game character 2025 with the pointing direction B′′ of the motion controller 758 so that they are parallel with each other.
  • any other appropriate association between the pointing direction of the game character 2025 and the pointing direction of the motion controller 758 could be used.
  • the pointing direction of the motion controller 758 (user control device) is associated with the pointing direction of the pointing source so that a change in the pointing direction of the motion controller 758 (user control device) causes a change in the pointing direction of the pointing source.
  • movement of the motion controller 758 such that the pointing direction of the motion controller 758 changes causes a change in the pointing direction of the pointing source.
  • the system unit 10 is also operable to detect the position of the object 2020 with respect to the camera 756 . Therefore, the three-dimensional position of the object 2020 with respect to the camera 756 can be detected in a similar manner to that described above with reference to FIG. 8 .
  • the position of the object 2020 is denoted by the vector C′ in FIG. 9 .
  • the pointing direction of the pointing source is associated with the pointing direction of the motion controller 758 .
  • the pointing direction B′ of the game character 2025 is associated with the pointing direction B′′ of the motion controller 758 so that B′ is parallel to B′′.
  • the user 2015 could manipulate the motion controller 758 so that the game character 2025 points at the object 2020 .
  • the cell processor 100 is operable to detect whether the three-dimensional position of the object (as defined by C′) lies on a line passing through the source position in the direction of the pointing direction of the pointing source (game character 2025 ).
  • the cell processor 100 can detect whether the game character 2025 is pointing at the object 2020. Additionally, in some embodiments, the cell processor 100 is operable to generate the game character 2025 so that the pointing direction B′ is associated with the pointing direction B′′ of the motion controller 758, such that the pointing direction B′ moves with changes in the pointing direction B′′. Therefore, the user 2015 can control the pointing direction B′ of the game character 2025 by appropriate manipulation and control of the motion controller 758.
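  • the FIG. 9 arrangement can be sketched as below, with the game character 2025 at position A′ as the pointing source and its pointing direction B′ kept parallel to the controller direction B′′; all coordinates are illustrative only, and the helper simply repeats the point-on-line test sketched earlier so that the example stands alone:

```python
import numpy as np

def points_along(source, direction, target, tolerance_m=0.1):
    # Same point-on-line test as in the earlier sketch.
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    v = np.asarray(target, dtype=float) - np.asarray(source, dtype=float)
    t = float(np.dot(v, d))
    return t >= 0 and float(np.linalg.norm(v - t * d)) <= tolerance_m

character_position = np.array([0.3, 0.1, -0.2])    # A': illustrative coordinates only
controller_direction = np.array([0.1, 0.0, -1.0])  # B'': derived from the motion sensor data
object_position = np.array([0.4, 0.1, -1.2])       # C': detected position of the object 2020

# The character's pointing direction B' is kept parallel to B''.
character_direction = controller_direction
if points_along(character_position, character_direction, object_position):
    print("game character 2025 is pointing at object 2020")
```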
  • the pointing source corresponds to a position of a game character within a game executed by the entertainment device. However, it will be appreciated that the pointing source could correspond to a position of any suitable computer generated object generated by the system unit 10 (entertainment device).
  • the game character 2025 is a two-dimensional image displayed on the display 300 by the system unit 10 .
  • using a two-dimensional image can mean that it is difficult to convey the pointing direction B′ of the game character to the user 2015 .
  • the cell processor 100 is operable to generate the game character 2025 so that when displayed on the display 300 the game character 2025 appears as a three-dimensional image. In embodiments, this is achieved using the 3D viewing techniques described above with reference to FIGS. 4A, 4B, 5A, 5B, and 5C, although it will be appreciated that any other suitable technique could be used.
  • the three-dimensional position of the game character 2025 may be generated by the cell processor 100 so that the three-dimensional position appears at an image plane which is different from the image plane of the display.
  • the cell processor 100 is operable to determine the position A′ of the game character by calculating the apparent position of the game character with respect to the display 300 . This technique will be described in more detail later below.
  • the object 2020 was treated as a real object.
  • the object 2020 is a computer generated object.
  • the cell processor 100 could generate the object 2020 as a stereo pair of images such that the object appears in front of the display 300 .
  • the position of the object 2020 should then be determined so that a user can use the motion controller 758 to point to the object 2020. A way in which this may be achieved in accordance with embodiments of the present invention will now be described with reference to FIGS. 10 and 11.
  • FIG. 10 shows a schematic plan view of the user seen from above viewing the display at two different horizontal positions.
  • the cell processor 100 is operable to cause the display 300 to display a stereo pair of images corresponding to the object 2020 so that, when viewed in a suitable manner (for example by using polarised glasses) the user 2015 will perceive the object 2020 in front of the display 300 .
  • the user 2015 is illustrated being positioned at a first position (position 1 ) and a second position (position 2 ), which is to the right of position 1 .
  • the user should perceive the object 2020 at a first object position 2035 .
  • the user should perceive the object 2020 at a second object position 2040. Therefore, it will be understood from FIGS. 5A, 5B, 5C and 10 that the three-dimensional position at which the computer generated object will appear to the user will depend on the position of the user with respect to the display 300.
  • the system is operable to detect the position of the user's face with respect to the display.
  • the cell processor 100 is operable to carry out known face recognition techniques on the images captured by the camera 756 so as to detect the position of the user's face with respect to the camera 756. Provided the position of the camera with respect to the display 300 is known, the cell processor can calculate the position of the user's face with respect to the display. Furthermore, as the position at which the object will appear to the user depends on the distance from the user to the display as described above, in embodiments, the system unit 10 is operable to detect the distance from the display 300 to the user 2015 as described above.
  • the object is assumed to lie on a line (as indicated by the dotted line 2045 ) between a midpoint 2050 between the stereo pair of images 2030 and a midpoint 2055 between the user's eyes.
  • the distance between the user's eyes can be estimated by the cell processor 100 , either by analysis of the captured images or by using the average interpupillary distance (distance between the eyes) of adult humans.
  • any other suitable technique for estimating the interpupillary distance could be used.
  • the cell processor 100 is operable to calculate lines between the stereo pair of images 2030 and the user's eyes as indicated by the dashed lines 2060 and 2065 .
  • the distance between the stereo pair of images 2030 can be calculated from known parameters of the display such as display resolution, screen size and the like
  • the apparent position of the computer generated object in the horizontal and vertical directions can be estimated from the intersection of the lines 2045 , 2060 , and 2065 .
  • the lines 2045, 2060 and 2065 intersect at the first object position 2035. Therefore the computer generated object can reasonably be assumed to be perceived by the user at the first object position 2035.
  • the object can be assumed to appear to the user at the second object position 2040 .
  • the apparent position of the object in the horizontal and vertical directions can be calculated by the cell processor for any stereo pair of images in a similar manner to that described above for the first object position 2035 .
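  • a minimal sketch of this intersection calculation in the x-z plane of FIG. 10, assuming the display lies at z = 0 with z increasing towards the viewer; the eye and image coordinates in the example are illustrative only:

```python
import numpy as np

def perceived_xz(eye_left, eye_right, image_left, image_right):
    """Intersection, in the x-z plane of FIG. 10, of the sight line from each
    eye through its on-screen image point.

    Points are (x, z) pairs with the display at z = 0 and z increasing towards
    the viewer; image_left / image_right are the screen positions of the
    stereo pair seen by the left and right eye respectively.
    """
    eye_left = np.asarray(eye_left, dtype=float)
    eye_right = np.asarray(eye_right, dtype=float)
    d1 = np.asarray(image_left, dtype=float) - eye_left
    d2 = np.asarray(image_right, dtype=float) - eye_right
    # Solve eye_left + s*d1 == eye_right + t*d2 for s (2x2 linear system).
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("sight lines are parallel; no fused position")
    rhs = eye_right - eye_left
    s = (rhs[0] * d2[1] - rhs[1] * d2[0]) / denom
    return eye_left + s * d1

# Eyes 65 mm apart and 2 m from the display, screen images 20 mm apart
# (crossed): the object is perceived on the centre line roughly 0.47 m in
# front of the display, consistent with the parallax relationship above.
print(perceived_xz(eye_left=(-0.0325, 2.0), eye_right=(0.0325, 2.0),
                   image_left=(0.01, 0.0), image_right=(-0.01, 0.0)))
```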
  • the line 2045 is not calculated so as to save processing resources.
  • the use of three intersecting lines to calculate the apparent position of the object can improve the accuracy of the calculation.
  • the directions horizontal (x) and vertical (y) are taken to mean directions in the image plane of the display.
  • the depth (z) is taken to mean the distance from the display 300 .
  • any other suitable coordinate system could be used to define the apparent position of the object, the user, the motion controller, and the like.
  • FIG. 10 illustrates how the horizontal (x) position of the object and the depth (z) of the object from the display may be determined.
  • the vertical position (y) also needs to be determined.
  • FIG. 11 shows a schematic diagram of the user 2015 using the motion controller 758 to point at a computer generated object as seen from one side of the display 300 .
  • the system unit 10 is operable to detect the distance of the user 2015 from the display 300 .
  • the cell processor 100 is operable to detect the position of the user's eyes 2090 with respect to the camera 756 using known eye detection techniques (in some cases in combination with known face recognition techniques).
  • the cell processor 100 is operable to detect by analysis of the captured images and depth data generated by the distance detector, a vector D (as illustrated by dashed line 2070 in FIG. 11 ) which indicates a position of the user's eyes 2090 with respect to the camera 756 .
  • the position of the camera 756 with respect to the display 300 can be input by a suitable calibration process or preset in any other suitable way. Therefore, the cell processor 100 is operable to detect the distance between the user's eyes 2090 and the display 300 using appropriate trigonometry calculations.
  • the system unit 10 is operable to cause computer generated objects (such as the stereo pair of images 2030 and a stereo pair of images 2075 ) to be displayed on the display 300 .
  • FIG. 11 shows a side view of the display, so it will be appreciated that the stereo pair 2030 and the stereo pair 2075 will appear as one image when viewed from the side because they share the same horizontal axis, although each stereo pair in fact comprises two images.
  • the stereo pair of images 2075 should cause the user 2015 to perceive a virtual object at a position 2080 in front of the display 300 .
  • the vertical position of a computer generated object is assumed to lie on a line between the user's eyes 2090 and the position of the stereo pair of images corresponding to that object.
  • the position 2080 lies on a line (indicated by the dashed line 2095 ) between the user's eyes 2090 and the stereo pair of images 2075 .
  • the position 2035 lies on a line (indicated by dashed line 3000 ) between the user's eyes 2090 and the stereo pair of images 2030 .
  • the horizontal position (x) and depth (z) can be determined using the technique described above with reference to FIG. 10 .
  • the system unit 10 is operable to calculate the vertical position (y) of a computer generated object from the relative position of the user's eyes 2090 with respect to the display 300 and the position of the stereo pairs of images as displayed on the display 300 using appropriate trigonometric calculations.
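  • a minimal sketch of this vertical calculation, assuming the object lies on the sight line from the eyes to the on-screen stereo pair and using similar triangles; the heights and distances in the example are illustrative only:

```python
def perceived_y(eye_height_m, eye_distance_m, image_height_m, object_depth_m):
    """Vertical (y) position of a perceived object in the FIG. 11 arrangement.

    The object is assumed to lie on the line from the user's eyes 2090 to the
    on-screen stereo pair; object_depth_m is its distance in front of the
    display, as found from the horizontal/depth calculation of FIG. 10.
    """
    # Similar triangles along the eye-to-screen sight line.
    return image_height_m + (eye_height_m - image_height_m) * (object_depth_m / eye_distance_m)

# Eyes 1.2 m high and 2 m from the display, stereo pair drawn 0.8 m up the
# screen, object fused 0.47 m in front: it appears at roughly 0.89 m.
print(perceived_y(1.2, 2.0, 0.8, 0.47))
```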
  • the system unit 10 is operable to detect whether the three-dimensional position of an object lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
  • the motion controller 758 is the pointing source, and the pointing direction of the motion controller 758 (as indicated by a dotted line 2085 in FIG. 11 ) corresponds to the three-dimensional position of the object 2035 . Therefore, the system unit 10 can detect that the user 2015 is pointing at the position 2035 rather than the position 2080 .
  • the cell processor 100 can detect which object is being pointed at by the user.
  • embodiments of the present invention provide an enhanced technique for allowing a user to interact with an entertainment device.
  • the user could use the motion controller 758 to point at the object to select the object in a game, or to pick up the object in a game.
  • the user could select a real object for further image processing or as a real object for augmented reality applications.
  • the embodiments described herein could enable any suitable functionality in which a user can point at an object (computer generated or a real physical object) to be achieved.
  • the cell processor 100 is operable to generate a three-dimensional image such as a virtual image feature so that the three-dimensional image appears to be associated with the source position and pointing direction of the motion control device 758 . This will now be described in more detail with reference to FIGS. 12 and 13 .
  • FIG. 12 is a schematic diagram of a three-dimensional (3D) image display region shown with respect to the display 300 seen from above.
  • the 3D image display region 3010 is a region in front of the display 300 in which a three-dimensional image (for example a stereo pair of images) can be perceived as three-dimensional by the user 2015 .
  • the 3D image display region 3010 occurs where the viewpoint of both of the user's eyes 2090 overlap.
  • the origin of the 3D image display region can also be understood by referring to FIGS. 5B and 5C .
  • the system unit 10 is operable to detect the 3D image display region by detecting the position of the user's eyes 2090 with respect to the display.
  • the cell processor 100 is operable to analyse the images captured by the camera 756 so as to detect the horizontal (x) and vertical (y) positions of the user's eyes with respect to the display 300 .
  • the system unit 10 is operable to detect the distance (z) of the user from the display 300 .
  • in the x-z plane (i.e. as viewed from above as in FIG. 12), the 3D image display region corresponds to a triangular region bounded by: a line 3015 from the user's right eye 3020 to a left-hand edge 3025 of a display area of the display 300; a line 3030 from the user's left eye 3035 to a right-hand edge 3040 of the display area of the display 300; and a line 3045 corresponding to an image plane of the display 300.
  • the 3D image display region corresponds to a triangle from the user's eyes 2090 to the display 300 .
  • FIG. 13 shows a schematic diagram of the 3D image display region viewed from the side.
  • the 3D image display region 3010 is bounded by: a line 3050 from the user's eyes 2090 to a lower edge 3055 of the display area of the display 300 ; a line 3060 from the user's eyes 2090 to an upper edge 3065 of the display area of the display 300 ; and a line 3070 corresponding to the image plane of the display 300 .
  • the cell processor 100 can therefore determine the 3D image display region based on the position of the user's eyes 2090 with respect to the display 300.
  • the 3D image display area 3010 can be thought of as an irregular four-sided pyramid with a base corresponding to the display area of the display 300 and the apex located between the display and the user's eyes 2090 .
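  • a minimal sketch of a test for whether a candidate 3D position falls inside such a region, built directly from the FIG. 12 and FIG. 13 boundary lines; the coordinate convention (display plane at z = 0, z towards the viewer) and the example dimensions are illustrative assumptions:

```python
def in_3d_display_region(point, eye_left, eye_right, display_half_width,
                         display_bottom, display_top):
    """Rough test of whether a point can be shown as a fused 3D image.

    point is (x, y, z) with the display plane at z = 0 and z towards the
    viewer; eye_left and eye_right are (x, y, z) eye positions. In plan view
    the region is bounded by the line from the right eye to the left display
    edge and from the left eye to the right display edge; in side view by the
    lines from the eyes to the top and bottom display edges.
    """
    x, y, z = point
    if z < 0:
        return False   # behind the image plane, so not in front of the display

    exl, eyl, ezl = eye_left
    exr, eyr, ezr = eye_right

    # Plan view (x-z): each boundary runs from a display edge at z = 0 towards
    # the opposite eye.
    left_bound = -display_half_width + (exr + display_half_width) * z / ezr
    right_bound = display_half_width + (exl - display_half_width) * z / ezl
    if not (left_bound <= x <= right_bound):
        return False

    # Side view (y-z): both boundaries run from the eyes to the display edges.
    eye_y, eye_z = (eyl + eyr) / 2.0, (ezl + ezr) / 2.0
    lower_bound = display_bottom + (eye_y - display_bottom) * z / eye_z
    upper_bound = display_top + (eye_y - display_top) * z / eye_z
    return lower_bound <= y <= upper_bound

# User roughly centred, 2 m from a 1 m wide display whose picture area runs
# from 0.5 m to 1.1 m above the floor.
eyes = ((-0.0325, 0.8, 2.0), (0.0325, 0.8, 2.0))
print(in_3d_display_region((0.0, 0.8, 0.5), *eyes,
                           display_half_width=0.5, display_bottom=0.5, display_top=1.1))  # True
print(in_3d_display_region((0.9, 0.8, 0.5), *eyes,
                           display_half_width=0.5, display_bottom=0.5, display_top=1.1))  # False
```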
  • any other suitable 3D image display region could be used.
  • the system unit 10 is operable to generate the three-dimensional image so that it appears to lie along the pointing direction B of the motion controller 758. This is illustrated with respect to FIG. 13.
  • the system unit 10 is operable to generate a 3D image 4000 .
  • the 3D image corresponds to flames which are caused to appear to come from the motion controller 758 along the pointing direction B.
  • the user 2015 can therefore manipulate the motion controller 758 to control where the 3D image appears to point. Therefore, the motion controller 758 could act as a flame thrower within a game with the flames appearing to come from the motion controller.
  • any other suitable functionality could be realised such as tracer bullets, a laser pointer and the like.
  • embodiments of the present invention can create a more powerful and immersive 3D experience for the user 2015 .
  • the cell processor 100 is operable to generate the 3D image so that it is positioned within the 3D image display region 3010 . If the motion controller 758 is detected to be positioned outside the 3D image display region 3010 , the cell processor 100 is operable to generate the 3D image 4000 so that the 3D image 4000 only appears within the 3D image display region 3010 rather than outside. This helps prevent the user 2015 from experiencing headaches and/or nausea.
  • the use of a 3D image display region can also be important when rendering 3D images for 3D games which use pseudo-realistic physics engines.
  • video games may use realistic physics models to predict the motion of objects within a game such as debris from an explosion.
  • the motion of objects within physics based video games or physics based virtual reality simulations typically allows 3D objects to travel unrestricted through the virtual environment. However, this may mean that objects may quickly move out of the 3D image display region, thus degrading the 3D effect and possibly causing the user to experience headaches and/or nausea. Therefore, to address this problem, in some embodiments, the system unit 10 is operable to control the appearance of the 3D image in dependence upon the relative position of the 3D image with respect to the 3D image display region. This will now be described in more detail with reference to FIGS. 14 and 15 .
  • FIG. 14 shows a schematic view of control of the appearance of a 3D image in accordance with embodiments of the present invention.
  • FIG. 14 shows the user's eyes 2090 and the display 300 .
  • the system unit 10 is operable to generate a 3D image.
  • the system unit is operable to generate the 3D image so that, over a sequence of image frames, the 3D image appears to move towards the user 2015 .
  • a 3D object may be caused by a so-called physics engine to move along a trajectory 4010 as indicated in FIG. 14 .
  • this may cause the object to leave the 3D image display region 3010 over relatively few image frames. Therefore, in some embodiments, the cell processor 100 is operable to control the apparent motion of the 3D image so that the 3D image appears to remain within the 3D image display region 3010.
  • this is illustrated schematically in FIG. 14.
  • at times t 1 , t 2 , t 3 and t 4 (which refer to successive image frames of an image sequence), the 3D image appears at positions 4000 a , 4000 b , 4000 c and 4000 d respectively.
  • in this example the frame period is 1/25 seconds (i.e. a frame rate of 25 frames per second), although it will be appreciated that any suitable frame rate could be used.
  • the cell processor 100 is operable to steer the object so that it follows a trajectory 4015 and the object remains within the 3D image display region 3010 .
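  • one crude way to sketch such steering is to blend a physics-proposed position back towards the display plane until it satisfies the region test; the predicate, blend scheme and numbers below are illustrative assumptions rather than the behaviour of any particular physics engine:

```python
def steer_into_region(proposed, region_contains, screen_centre=(0.0, 0.8, 0.0), steps=20):
    """Pull a physics-proposed 3D position back inside the display region.

    region_contains is any predicate such as the in_3d_display_region() sketch
    above; the proposed point is blended towards the centre of the display
    plane until the predicate accepts it (a deliberately crude version of the
    FIG. 14 trajectory 4015 idea).
    """
    if region_contains(proposed):
        return proposed
    for i in range(1, steps + 1):
        a = i / steps
        candidate = tuple(p + a * (c - p) for p, c in zip(proposed, screen_centre))
        if region_contains(candidate):
            return candidate
    return screen_centre   # fall back to the display plane itself

# Illustrative use with a box-shaped stand-in for the real region test.
inside_box = lambda p: abs(p[0]) <= 0.3 and 0.5 <= p[1] <= 1.1 and 0.0 <= p[2] <= 1.0
print(steer_into_region((0.6, 0.8, 0.5), inside_box))
```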
  • embodiments of the present invention can reduce the likelihood that the user will experience any nausea and/or headaches.
  • the cell processor 100 is operable to control the appearance of the 3D image so that the 3D image appears to fade when a position of the 3D image is within a threshold distance of an edge of the 3D image display region 3010 . This is illustrated in FIG. 14 .
  • the physics engine of a game executed by the system unit 10 may cause the object to follow a trajectory 4020 .
  • the cell processor 100 may cause the 3D image to appear at a position 4000 e at a time t 5 , at a position 4000 f at a time t 6 , at a position 4000 g at a time t 7 , and at a position 4000 h at a time t 8 .
  • the cell processor 100 is operable to detect when the object is within a threshold distance 4050 of an edge (such as edge 4055 ) of the 3D image display region 3010 .
  • the cell processor causes the 3D image to appear to fade at the positions 4000 f , 4000 g , and 4000 h over the corresponding image frame times t 6 , t 7 , and t 8 .
  • the 3D image could be caused to fade over any suitable number of image frames and any suitable threshold distance could be used. It should be appreciated that the term “fade” can be taken to mean become more transparent, become less bright or any other suitable image processing operation that causes the object to become less apparent over a sequence of image frames.
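  • a minimal sketch of such a fade, computed as an opacity multiplier that falls off linearly inside the threshold distance; the threshold value and the example frame distances are illustrative assumptions:

```python
def fade_factor(distance_to_edge_m, threshold_m=0.1):
    """Opacity multiplier for a 3D object approaching the region boundary.

    Fully opaque while further than the threshold from the nearest edge of the
    3D image display region, fading linearly to fully transparent at the edge.
    The 10 cm threshold is an illustrative choice.
    """
    if distance_to_edge_m >= threshold_m:
        return 1.0
    return max(0.0, distance_to_edge_m / threshold_m)

# Over frames t5 to t8 the object closes on the region edge and fades out.
for d in (0.15, 0.08, 0.04, 0.0):
    print(fade_factor(d))
```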
  • virtual objects which have a predefined relationship to the virtual environment (such as trees at the side of a road) and other choreographed objects are caused to fade.
  • free-moving objects such as virtual objects whose position is generated by a physics based engine (e.g. explosion debris) can be caused to be steered. Accordingly, a likelihood that the user experiences nausea and/or headaches is reduced.
  • the use of a 3D image display region can also help improve the display of icons for controlling the system unit 10. This will now be described with reference to FIG. 15.
  • FIG. 15 shows a schematic diagram of a 3D icon field in accordance with embodiments of the present invention.
  • FIG. 15 shows a plurality of icons 5005 , 5010 , 5015 , 5020 , 5025 , 5030 , and 5035 .
  • the cell processor 100 together with the other functional units of the system unit 10 is operable to generate one or more icons such as those illustrated in FIG. 15 .
  • each icon is associated with a control function of the system unit 10 .
  • the plurality of icons 5005 to 5035 comprise one or more subgroups of icons.
  • a first subgroup comprises the icons 5005 , 5010 , 5015 and 5020 .
  • a second subgroup comprises the icons 5025 and 5030 .
  • a third subgroup comprises the icon 5035 .
  • it will be appreciated that any number of icons could be used and each subgroup could comprise one or more icons as appropriate.
  • the cell processor 100 is operable to generate the subgroups so that each respective subgroup appears located at a respective distance from the display. As illustrated in FIG. 15 , the first subgroup is caused to appear at a distance f from the display 300 . The second subgroup is caused to appear at a distance g from the display 300 . The third subgroup is caused to appear at a distance h from the display 300 .
  • the cell processor is operable to control the number of icons in each subgroup in dependence upon the relative position of each subgroup with respect to the 3D image display region 3010 .
  • the cell processor 100 is operable to dynamically adjust the number of icons in each subgroup in dependence on the relative position of the subgroup with respect to the 3D image display position.
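  • a minimal sketch of such a subgroup arrangement, assuming icons are ranked by how frequently they are used and that each subgroup has a fixed capacity and apparent distance; the identifiers, capacities and depths are illustrative assumptions:

```python
def arrange_icon_subgroups(icons_by_frequency, subgroups_spec):
    """Distribute icons into depth-ordered subgroups (the FIG. 15 icon field).

    icons_by_frequency lists icon identifiers most-used first; subgroups_spec
    is a list of (apparent distance from the display, icon capacity) pairs,
    with the subgroup nearest the user first so frequently used icons end up
    closest to the user.
    """
    subgroups, start = [], 0
    for depth_m, capacity in subgroups_spec:
        subgroups.append({"depth_m": depth_m,
                          "icons": icons_by_frequency[start:start + capacity]})
        start += capacity
    return subgroups

# Seven icons split 4 / 2 / 1, loosely following the subgroups of FIG. 15.
print(arrange_icon_subgroups(
    ["photos", "music", "game", "settings", "network", "friends", "store"],
    [(0.6, 4), (0.4, 2), (0.2, 1)]))
```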
  • more frequently accessed icons can be displayed closer to the user whilst less frequently used icons can be displayed closer to the display 300 .
  • any other suitable method for ordering the icons could be used.
  • the icons can be generated by the cell processor 100 so that they appear semi-transparent so that the user can see icons which may appear to be behind other icons.
  • any other suitable method of displaying the icons could be used.
  • FIG. 16 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • the cell processor 100 causes an image (such as a control screen, game image and the like) to be displayed to the user 2015 .
  • the system unit 10 detects a three-dimensional position of an object in front of an image plane of the display.
  • the object could be a real object or a computer generated object as described above with reference to FIGS. 8 to 11 .
  • the system unit 10 detects a source position of a pointing source with respect to the display using techniques as described above with reference to FIGS. 6 to 11 .
  • the pointing source has an associated pointing direction indicative of a direction in which the pointing source is pointing.
  • the system unit detects a pointing direction of a user control device (such as the motion controller 758 ).
  • the pointing direction of the user control device is associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source.
  • the system unit 10 detects whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source. In other words, the system unit 10 can detect whether the user is using the motion controller 758 to point at a real physical object and/or a computer generated object in front of the display.
  • FIG. 17 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • the system unit 10 generates a 3D image such as the 3D image 4000 described with reference to FIG. 13 .
  • the system unit 10 detects, with respect to the display 300 , the 3D image display region 3010 in which the 3D image can be perceived as three-dimensional by the user 2015 using the techniques described above.
  • the system unit detects a source position and pointing direction of a user control device (such as motion controller 758 ) for controlling the entertainment device (such as system unit 10 ).
  • the source position is indicative of a position of the user control device (such as motion controller 758 ) with respect to the display 300
  • the pointing direction is indicative of a direction in which the user control device is pointing with respect to the display 300 .
  • the source position and pointing direction of the user control device are detected using the techniques described above. However, it will be appreciated that any other suitable method for detecting the source position and pointing direction of the user control device could be used.
  • the system unit 10 generates the 3D image within the 3D image display region so that the 3D image appears to be associated with the source position and pointing direction of the user control device.
  • the system unit 10 generates the 3D image so that it appears to be associated with the source position and pointing direction of the user control device in a similar manner to that described above with reference to FIG. 13.
  • any other suitable technique could be used.
  • a method for controlling the appearance of a 3D image in dependence upon its relative position with respect to a 3D image display region will now be described with reference to FIG. 18 .
  • FIG. 18 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • the system unit 10 generates a 3D image such as a 3D image described above with reference to FIGS. 14 and 15 for displaying on a display (such as the display 300).
  • the system unit 10 detects the 3D image display region in a similar manner to that described above with reference to FIGS. 12 and 13 .
  • the 3D image display region is an image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display.
  • the system unit 10 detects the apparent position of the 3D image with respect to the display.
  • the position of the 3D image is detected in a similar manner to the techniques described above for determining the position of the computer generated object (described with reference to FIGS. 10 and 11 ).
  • any other suitable method for detecting the apparent position of the 3D image could be used.
  • the system unit 10 controls the appearance of the 3D image in dependence upon the relative position of the three-dimensional image with respect to the image display region.
  • the appearance of the 3D image can be controlled in a similar manner to that described above with respect to FIGS. 14 and 15 , although it will be appreciated that any other suitable technique could be used.
  • although the embodiments above have been described with reference to the motion controller 758, any number of motion controllers could be used to implement the functionality of the embodiments described above.
  • a first user could use a first motion controller to point at a first object and a second user could use a second motion controller to point at a second object.
  • any real physical object could be pointed to.
  • the system unit 10 is operable to implement an augmented reality application for an augmented reality environment using known techniques.
  • the real physical object to which a user can point using the motion controller 758 could be an augmented reality marker, although any other suitable real physical object could be used.
  • the embodiments described above may be implemented in the form of a computer program product comprising processor implementable instructions stored on a data carrier (a storage medium) such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable to use in adapting the existing equivalent device.

Abstract

An entertainment device comprises a display, an object position detector to detect a three-dimensional position of an object in front of an image plane of the display, and a source position detector to detect a source position of a pointing source with respect to the display. A pointing direction indicates a direction in which the pointing source is pointing. The device also comprises a direction detector to detect a pointing direction of a user control device. The pointing direction of the user control device is associated with the pointing direction of the pointing source, so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source. The device also comprises an alignment detector to detect whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an entertainment device and entertainment methods.
  • 2. Description of the Prior Art
  • Recently, motion controllers such as the controller for the Nintendo® Wii® (sometimes known as the “Wiimote”) have become popular for controlling entertainment devices. Such controllers typically use motion data from internal motion sensors such as gyroscopes and accelerometers combined with optical data generated in dependence on a light source mounted on a display to detect the position and orientation of the motion controller. Accordingly, a user can use the motion controller to point at objects displayed on the display to select one of the objects. Furthermore, the user can move the motion controller to control motion of a game character. However, such motion control is limited to objects displayed on the display.
  • Additionally, so-called three-dimensional (3D) TVs which can display images to be viewed in 3D are becoming more popular. Such TVs allow a user to perceive a three-dimensional image, for example by the user wearing suitable viewing glasses to view the TV. As such, there is increasing interest in videogames which can output images which can be displayed on 3D TVs so that the user perceives the video game content as three-dimensional images.
  • It is an object of the present invention to provide improved techniques for three-dimensional display.
  • SUMMARY OF THE INVENTION
  • In a first aspect, there is provided an entertainment device comprising: means for displaying an image on a display to a user; means for detecting a three-dimensional position of an object in front of an image plane of the display; means for detecting a source position of a pointing source with respect to the display, the pointing source having an associated pointing direction indicative of a direction in which the pointing source is pointing; means for detecting a pointing direction of a user control device, the pointing direction of the user control device being associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source; and means for detecting whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
  • In a second aspect, there is provided an entertainment method comprising: displaying an image on a display to a user; detecting a three-dimensional position of an object in front of an image plane of the display; detecting a source position of a pointing source with respect to the display, the pointing source having an associated pointing direction indicative of a to direction in which the pointing source is pointing; detecting a pointing direction of a user control device, the pointing direction of the user control device being associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source; and detecting whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
  • By detecting whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source, embodiments of the invention can advantageously detect whether a user is using the user control device to cause the pointing source to point at an object in front of the image plane of the display. For example, a user could use the user control device to point at a real object in front of the display to select that object for image processing. As another example, during a game executed by the entertainment device, the user could use the user control device to point at a computer generated object which is caused to appear in front of the image plane of the display. Embodiments of the present invention therefore can provide a more immersive and diverse experience for a user. In some embodiments, the pointing source corresponds to a position of a game character within a game executed by the entertainment device. In other embodiments, the pointing source corresponds to the position of the user control device.
  • In a third aspect, there is provided an entertainment device comprising: generating means for generating a three-dimensional image to be displayed on a display; means for detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; and means for detecting a source position and pointing direction of a user control device for controlling the entertainment device, the source position being indicative of a position of the user control device with respect to the display, and the pointing direction being indicative of a direction in which the user control device is pointing with respect to the display; in which the generating means is operable to generate the three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the source position and the pointing direction of the user control device.
  • In a fourth aspect, there is provided an entertainment method comprising: generating, using an entertainment device, a three-dimensional image to be displayed on a display; detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; detecting a source position and pointing direction of a user control device for to controlling the entertainment device, the source position being indicative of a position of the user control device with respect to the display, and the pointing direction being indicative of a direction in which the user control device is pointing with respect to the display; and generating the three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the source position and the pointing direction of the user control device.
  • By generating a three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the source position and the pointing direction of the user control device, embodiments of the present invention advantageously provide a more interactive user experience. For example, the user control device could act as a flame thrower within a game with the flames appearing to come from the motion controller. A more powerful and immersive 3D experience can therefore be provided to the user.
  • In a fifth aspect, there is provided an entertainment device comprising: generating means for generating a three-dimensional image to be displayed on a display; means for detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; means for detecting the apparent position of the three-dimensional image with respect to the display; and controlling means for controlling the appearance of the three-dimensional image in dependence upon the relative position of the three-dimensional image with respect to the image display region.
  • In a sixth aspect, there is provided an entertainment method comprising: generating a three-dimensional image to be displayed on a display; detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; detecting the apparent position of the three-dimensional image with respect to the display; and controlling the appearance of the three-dimensional image in dependence upon the relative position of the three-dimensional image with respect to the image display region.
  • Embodiments of the present invention advantageously address a problem which may occur when displaying 3D images, for example when rendering 3D images for 3D video games which use pseudo-realistic physics engines. Typically, such video games may use realistic physics models to predict the motion of objects within a game such as debris from an explosion. The motion of objects within physics based video games or physics based virtual reality simulations typically allows 3D objects to travel unrestricted through the virtual environment.
  • However, this may mean that objects may quickly move out of the 3D image display region. This can degrade the 3D effect and possibly cause the user to experience headaches and/or nausea because it is more likely that the user may only be able to view one image of a stereo pair if the object is outside the three-dimensional image display region. Therefore, embodiments of the invention control the appearance of the 3D image in dependence upon the relative position of the 3D image with respect to the 3D image display region.
  • For example, the path or trajectory of the object (such as explosion debris in a game) could be controlled so that the object remains within the 3D image display region. As another example, the appearance of the 3D image could be caused to fade when a position of the 3D image is within a threshold distance of an edge of the 3D image display region. Accordingly, a likelihood that the user experiences nausea and/or headaches can be reduced. Furthermore, a more diverse and immersive 3D experience can be provided, because the user is more likely to feel comfortable experiencing the 3D experience for longer periods of time.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other objects, features and advantages of the invention will be apparent from the following detailed description of illustrative embodiments which is to be read in connection with the accompanying drawings, in which:
  • FIG. 1 is a schematic diagram of an entertainment device;
  • FIG. 2 is a schematic diagram of a cell processor;
  • FIG. 3 is a schematic diagram of a video graphics processor;
  • FIGS. 4A and 4B are schematic diagrams of a stereoscopic camera and captured stereoscopic images;
  • FIG. 5A is a schematic diagram of a stereoscopic camera;
  • FIGS. 5B and 5C are schematic diagrams of a viewed stereoscopic image;
  • FIG. 6 is a schematic diagram of a motion controller in accordance with embodiments of the present invention;
  • FIG. 7 is a schematic diagram of a user using the motion controller to control the entertainment device in accordance with embodiments of the present invention;
  • FIG. 8 is a schematic diagram of a user using the motion controller to control the entertainment device in accordance with embodiments of the present invention;
  • FIG. 9 is a schematic diagram of a user using the motion controller to control the entertainment device in accordance with embodiments of the present invention;
  • FIG. 10 is a schematic diagram showing a plan view of the user seen from above viewing a display at two different horizontal positions in accordance with embodiments of the present invention;
  • FIG. 11 is a schematic diagram of the user using the motion controller to point at a computer generated object as seen from one side of the display in accordance with embodiments of the present invention;
  • FIG. 12 is a schematic diagram of a three-dimensional (3D) image display region shown with respect to the display seen from above in accordance with embodiments of the present invention;
  • FIG. 13 is a schematic diagram of the 3D image display region viewed from the side in accordance with embodiments of the present invention;
  • FIG. 14 is a schematic diagram of control of the appearance of a 3D image in accordance with embodiments of the present invention;
  • FIG. 15 is a schematic diagram of a 3D icon field in accordance with embodiments of the present invention;
  • FIG. 16 is a flow chart of an entertainment method in accordance with embodiments of the present invention;
  • FIG. 17 is a flow chart of an entertainment method in accordance with embodiments of the present invention; and
  • FIG. 18 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • An entertainment device and entertainment methods are disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practise the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity where appropriate.
  • FIG. 1 schematically illustrates the overall system architecture of a Sony® Playstation 3® entertainment device. A system unit 10 is provided, with various peripheral devices connectable to the system unit.
  • The system unit 10 comprises: a Cell processor 100; a Rambus® dynamic random access memory (XDRAM) unit 500; a Reality Synthesiser graphics unit 200 with a dedicated video random access memory (VRAM) unit 250; and an I/O bridge 700.
  • The system unit 10 also comprises a Blu Ray® Disk BD-ROM® optical disk reader 430 for reading from a disk 440 and a removable slot-in hard disk drive (HDD) 400, accessible through the I/O bridge 700. Optionally the system unit also comprises a memory card reader 450 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 700.
  • The I/O bridge 700 also connects to four Universal Serial Bus (USB) 2.0 ports 710; a gigabit Ethernet port 720; an IEEE 802.11b/g wireless network (Wi-Fi) port 730; and a Bluetooth® wireless link port 740 capable of supporting up to seven Bluetooth connections.
  • In operation the I/O bridge 700 handles all wireless, USB and Ethernet data, including data from one or more game controllers 751. For example when a user is playing a game, the I/O bridge 700 receives data from the game controller 751 via a Bluetooth link and directs it to the Cell processor 100, which updates the current state of the game accordingly.
  • The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 751, such as: a remote control 752; a keyboard 753; a mouse 754; a portable entertainment device 755 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 756; a microphone headset 757; and a motion controller 758. The motion controller 758 will be described in more detail later below.
  • In embodiments of the present invention, the video camera is a stereoscopic video camera 1010. Such peripheral devices may therefore in principle be connected to the system unit 10 wirelessly; for example the portable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 757 may communicate via a Bluetooth link.
  • The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
  • In addition, a legacy memory card reader 410 may be connected to the system unit via a USB port 710, enabling the reading of memory cards 420 of the kind used by the Playstation® or Playstation 2® devices.
  • In the present embodiment, the game controller 751 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link. However, the game controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 751. In addition to one or more analogue joysticks and to conventional control buttons, the game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller. In the case of the Playstation Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
  • The remote control 752 is also operable to communicate wirelessly with the system unit 10 via a Bluetooth link. The remote control 752 comprises controls suitable for the operation of the Blu Ray Disk BD-ROM reader 430 and for the navigation of disk content.
  • The Blu Ray Disk BD-ROM reader 430 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 430 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. The reader 430 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
  • The system unit 10 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesiser graphics unit 200, through audio and video connectors to a display and sound output device 300 such as a monitor or television set having a display 305 and one or more loudspeakers 310. The audio connectors 210 may include conventional analogue and digital outputs whilst the video connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition. Audio processing (generation, decoding and so on) is performed by the Cell processor 100. The Playstation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks.
  • In the present embodiment, the video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 10. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 10, for example to signify adverse lighting conditions. Embodiments of the video camera 756 may variously connect to the system unit 10 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.
  • In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 10, an appropriate piece of software such as a device driver should be provided. Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment described.
  • Referring now to FIG. 2, the Cell processor 100 has an architecture comprising four basic components: external input and output structures comprising a memory controller 160 and a dual bus interface controller 170A,B; a main processor referred to as the Power Processing Element 150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 180. The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPs of the Playstation 2 device's Emotion Engine.
  • The Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE 150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz. The primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110A-H, which handle most of the computational workload. In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110A-H and monitoring their progress. Consequently each Synergistic Processing Element 110A-H runs a kernel whose role is to fetch a job, execute it and to synchronise with the PPE 150.
  • Each Synergistic Processing Element (SPE) 110A-H comprises a respective Synergistic Processing Unit (SPU) 120A-H, and a respective Memory Flow Controller (MFC) 140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 142A-H, a respective Memory Management Unit (MMU) 144A-H and a bus interface (not shown).
  • Each SPU 120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single precision performance. An SPU can operate on 4 single precision floating point numbers, 4 32-bit integers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 120A-H does not directly access the system memory XDRAM 500; the 64-bit addresses formed by the SPU 120A-H are passed to the MFC 140A-H which instructs its DMA controller 142A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160.
  • The Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the Cell processor 100 which connects the above processor elements, namely the PPE 150, the memory controller 160, the dual bus interface 170A,B and the 8 SPEs 110A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110A-H comprises a DMAC 142A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction. The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96B per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
  • The memory controller 160 comprises an XDRAM interface 162, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 500 with a theoretical peak bandwidth of 25.6 GB/s.
  • The dual bus interface 170A,B comprises a Rambus FlexIO® system interface 172A,B. The interface is organised into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170A and the Reality Simulator graphics unit 200 via controller 170B.
  • Data sent by the Cell processor 100 to the Reality Simulator graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
  • Referring now to FIG. 3, the Reality Simulator graphics (RSX) unit 200 is a video accelerator based upon the NVidia® G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100. The RSX unit 200 comprises a host interface 202 operable to communicate with the bus interface controller 170B of the Cell processor 100; a vertex pipeline 204 (VP) comprising eight vertex shaders 205; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209; a memory interface 210; and a video converter 212 for generating a video output. The RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250, clocked at 600 MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s. In operation, the VRAM 250 maintains a frame buffer 214 and a texture buffer 216. The texture buffer 216 provides textures to the pixel shaders 207, whilst the frame buffer 214 stores results of the processing pipelines. The RSX can also access the main memory 500 via the EIB 180, for example to load textures into the VRAM 250.
  • The vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered.
  • The pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel. Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture).
  • The render pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image. Optionally, if the intervening pixel process will not affect depth values (for example in the absence of transparency or displacement mapping) then the render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency. In addition, the render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.
  • Both the vertex shaders 205 and pixel shaders 207 are based on the shader model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second. The total floating point performance of the RSX 200 is 1.8 TFLOPS.
  • Typically, the RSX 200 operates in close collaboration with the Cell processor 100; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene. In this case, the PPU 155 of the Cell processor may schedule one or more SPEs 110A-H to compute the trajectories of respective batches of particles. Meanwhile, the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500 via the element interconnect bus 180, the memory controller 160 and a bus interface controller 170B. The or each SPE 110A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250; the DMA controller 142A-H of the or each SPE 110A-H addresses the video RAM 250 via the bus interface controller 170B. Thus in effect the assigned SPEs become part of the video processing pipeline for the duration of the task.
  • In general, the PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled. The disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process. Alternatively if all eight SPEs are functional, then the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor.
  • The PPU 155 can assign tasks to SPEs in several ways. For example, SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE. Alternatively or in addition, two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above.
  • Software instructions implemented by the Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400, and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these.
  • The software supplied at manufacture comprises system firmware and the Playstation 3 device's operating system (OS). In operation, the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video. The interface takes the form of a so-called cross media-bar (XMB), with categories of function arranged horizontally. The user navigates by moving through the function icons (representing the functions) horizontally using the game controller 751, remote control 752 or other suitable control device so as to highlight a desired function icon, at which point options pertaining to that function appear as a vertically scrollable list of option icons centred on that function icon, which may be navigated in analogous fashion. However, if a game, audio or movie disk 440 is inserted into the BD-ROM optical disk reader 430, the Playstation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400).
  • In addition, the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demonstration games (demos) and other media may be downloaded, and a friends management capability, providing on-line communication with other Playstation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available. The on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the Playstation 3 device itself. It will be appreciated that the term “on-line” does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.
  • Referring now to FIGS. 4A and 4B, in conventional stereoscopic image generation a stereoscopic camera 1010 generates a pair of images whose viewpoints are separated by a known distance equal to average eye separation. In FIG. 4A, both lenses of the stereoscopic camera are looking at a sequence of objects P, Q, R, S and T, and two additional objects N and O (assumed for the purposes of explanation to be positioned above the other objects). As can be seen in the resulting pair of images comprising left-eye image 1012 and right-eye image 1014, the different image viewpoints result in a different image of the objects from each lens. In FIG. 4B, an overlay image 1020 of the stereoscopic image pair illustrates that the displacement between the objects within the image pair 1012 and 1014 is inversely proportional to the distance of the object from the stereoscopic camera.
  • Subsequently, the stereoscopic image pair is displayed via a display mechanism (such as alternate frame sequencing and glasses with switchably occluded lenses, or lenticular lensing on an autostereoscopic display screen) that delivers a respective one of the pair of images (1012, 1014) to a respective eye of the viewer, and the object displacement between the images delivered to each eye causes an illusion of depth in the viewed content.
  • This relative displacement between corresponding image elements of the left- and right-image is also referred to in stereoscopy as parallax (as distinct from the visual effect of objects at different distances panning at different speeds, also sometimes known as the parallax effect). In the context of stereoscopy, so-called ‘positive parallax’ causes an object to appear to be within or behind the plane of the screen, and in this case the displacement is such that a left eye image element is to the left of a right eye image element. Meanwhile, ‘negative parallax’ causes an object to appear to be in front of the plane of the screen, and in this case the displacement is such that a left eye image element is to the right of a right eye image element, as is the case in FIGS. 4A and 4B. Finally, ‘zero parallax’ occurs at the plane of the screen, where the user focuses their eyes and hence there is no displacement between left and right image elements.
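  • As an illustration of this sign convention (not part of the original description), the apparent distance of an image element can be estimated by intersecting the two lines of sight, giving z = d·e/(e−p) for a viewer at distance d from the screen with eye separation e and signed screen parallax p. The following Python sketch assumes these simple geometric conditions and uses illustrative values for d and e:

      def apparent_depth(parallax_m, eye_separation_m=0.065, viewing_distance_m=2.0):
          # Signed screen parallax p: x of the right-eye element minus x of the
          # left-eye element, in metres. Positive -> behind the screen plane,
          # negative -> in front of it, zero -> in the screen plane.
          if parallax_m >= eye_separation_m:
              raise ValueError("parallax at or beyond eye separation has no finite depth")
          # Intersection of the two lines of sight: z = d * e / (e - p),
          # measured from the viewer.
          return viewing_distance_m * eye_separation_m / (eye_separation_m - parallax_m)

      print(apparent_depth(0.0))    # 2.0 m: zero parallax, element in the screen plane
      print(apparent_depth(-0.03))  # ~1.37 m: negative parallax, in front of the screen
      print(apparent_depth(0.03))   # ~3.71 m: positive parallax, behind the screen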
  • Referring now to FIGS. 5A and 5B, the display mechanism and the position of the viewer in combination determine whether the apparent distance of the objects is faithfully reproduced. Firstly, the size of the display acts as a scaling factor on the apparent displacement (parallax) of the objects; as a result a large screen (such as in a cinema) requires a greater distance from the user (i.e. in the cinema auditorium) to produce the appropriate parallax. Meanwhile a smaller screen such as that of a domestic television requires a smaller distance.
  • In FIG. 5A, reference is made only to objects P and T for clarity, but it will be appreciated that the following holds true for all stereoscopic image elements. In FIG. 5A, the respective distances from the objects P and T to the stereoscopic camera 1010 are δP and δT. As described previously, these respective distances result in different displacements for the objects between the images captured by the two lenses of the stereoscopic camera, as seen again in the overlay 1020 of the two captured images in FIG. 5B. In this case, both objects show negative parallax.
  • As seen in FIG. 5B, there is a small displacement (negative parallax) for distant object T and a large displacement (negative parallax) for nearby object P.
  • With a suitable 3D display arranged to project the image from the left lens to the viewer's left eye, and the image from the right lens to the viewer's right eye, the viewer's brain interprets the position of the objects as being located at the point of intersection of the respective lines of sight of each eye and the object as depicted in each image. In FIG. 5B, these are at distances δP′ and δT′ from the user.
  • Where the factors of the size of display, the distance of the user (and individual eye separation) are correct, as in FIG. 5B, then δP′≈δP and δT′≈δT. Moreover, the relative distance between these objects (δT′−δP′) is substantially the same as in the original scene (δT−δP). Consequently in this case the sense of depth experienced by the viewer feels correct and natural, with no distortion in the separation of depth between objects or image elements in or out of the plane of the image.
  • Notably the apparent depth is basically correct as long as the distance of the user is correct for the current screen size, even if the user is not central to the image; as can be seen in FIG. 5C, where again δP′≈δP, δT′≈δT and (δT′−δP′)≈(δT−δP).
  • It will be appreciated that in this case for the sake of explanation the effective magnification of the captured scene is 1:1. Of course typically different scenes may zoom in or out, and different screen sizes also magnify the reproduced image. Thus more generally the apparent depth is correct if the apparent scale (magnification) along the depth or ‘Z’ axis is the same as the apparent scale in the ‘X’ and ‘Y’ axes of the image plane.
  • In an embodiment, the distance of the viewer is detected using, for example, a video camera such as the EyeToy coupled with a remote distance measuring system, such as an infra-red emitter and detector. Such combined devices are currently available, such as for example the so-called ‘z-cam’ from 3DV Systems (http://www.3dvsystems.com/). Alternatively a stereoscopic video camera can be used to determine the distance of the user based on the same displacement measurements noted for the stereoscopic images as described above. Another alternative is to use a conventional webcam or EyeToy camera and to use known face recognition techniques to identify faces or heads of viewers, and from these to generate a measure of viewer distance from the display screen. In any of these cases, the relative position of the camera with respect to the 3D image display is also known, so that the distance from the viewer to the 3D image can be computed.
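  • As a rough illustration of the face-recognition approach (a minimal sketch under assumed calibration values, not the specific method used by the device), the viewer's distance can be approximated from the pixel width of a detected face using a pinhole camera model. The focal length and average face width below are illustrative assumptions:

      import cv2

      # Illustrative calibration assumptions: camera focal length in pixels and
      # an average adult face width in metres.
      FOCAL_LENGTH_PX = 700.0
      AVERAGE_FACE_WIDTH_M = 0.16

      face_cascade = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

      def estimate_viewer_distance(frame_bgr):
          # Returns an approximate camera-to-viewer distance in metres, or None
          # if no face is detected in the captured frame.
          gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
          faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
          if len(faces) == 0:
              return None
          # Take the largest detected face, assumed to be the nearest viewer.
          x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
          # Pinhole model: distance ~ focal_length * real_width / pixel_width.
          return FOCAL_LENGTH_PX * AVERAGE_FACE_WIDTH_M / w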
  • As mentioned above, the Bluetooth 740, Wi-Fi 730 and Ethernet 720 ports provide interfaces for peripherals such as the motion controller 758. The operation of the motion controller 758 in cooperation with the system unit 10 will now be described with reference to FIGS. 6 and 7.
  • FIG. 6 is a schematic diagram of a motion controller in accordance with embodiments of the present invention.
  • The motion controller 758 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link. However, the motion controller 758 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the motion controller 758. The motion controller 758 comprises motion sensors such as accelerometers and gyroscopes which are sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each of three orthogonal axes. Consequently gestures and movements by the user of the motion controller 758 may be translated as inputs to a game or other application executed by the system unit 10. However, it will be appreciated that the motion controller 758 could be sensitive to any other suitable number of degrees of freedom as appropriate. The motion controller 758 is operable to transmit motion data generated by the motion sensors to the system unit via the Bluetooth link or any other suitable communications link.
  • The motion controller 758 comprises a light source 2005 and four input buttons 2010. The system unit 10 is operable to carry out image processing on images captured by the camera 756 using known techniques so as to detect a position of the light source 2005 within images captured by the camera 756. Therefore, the system unit 10 can track the position of the light source 2005 within the captured images and hence track the position of the motion controller 758 with respect to the camera 756. Accordingly, the system unit 10 is operable to generate position tracking data from the captured images which relates to the position of the light source 2005 with respect to the camera 756 or system unit 10.
  • The system unit 10 is operable to combine data from the motion sensors of the motion controller 758 with the position tracking data. This allows a more accurate estimation of the position (e.g. x, y, z coordinates) and attitude (e.g. pitch, roll, yaw) of the motion controller 758 because any error in either of the position tracking data or the motion data can be compensated for by data from the other data source. In other words, the motion controller 758 acts as a user control device.
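  • One simple way to combine the two data sources (a minimal sketch, not the device's actual fusion algorithm, which might for example use a Kalman filter) is a weighted blend of the camera-derived position of the light source and the position estimated from the motion sensors. The 0.8 weighting is an illustrative assumption:

      def fuse_position(camera_xyz, inertial_xyz, camera_weight=0.8):
          # Weighted blend of the camera-tracked light-source position and the
          # position estimated from the controller's accelerometers/gyroscopes.
          return tuple(camera_weight * c + (1.0 - camera_weight) * i
                       for c, i in zip(camera_xyz, inertial_xyz))

      # Example: optical tracking reports (0.10, 0.20, 1.50) m, inertial
      # integration reports (0.12, 0.19, 1.48) m -> blended estimate.
      print(fuse_position((0.10, 0.20, 1.50), (0.12, 0.19, 1.48)))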
  • In embodiments, the light source 2005 comprises one or more light emitting diodes (LED) so as to provide a full spectrum of colours in the visible range. However, any other suitable light source such as organic light emitting diodes (OLED), incandescent bulbs, laser light sources, and the like may be used. In the embodiment shown in FIG. 6, the light source 2005 comprises a spherical translucent housing which surrounds the LEDs, although it will be appreciated that any other suitable shape could be used. Furthermore, the light source need not comprise the housing and the light source could comprise one or more light sources mounted so that light from the light sources can be emitted to the surroundings of the motion controller 758.
  • The input buttons 2010 act in a similar manner to control buttons on the game controller 751 as described above. Although the embodiment of FIG. 6 has four input buttons, it will be appreciated that the motion controller 758 could comprise any number of input buttons or indeed not have any input buttons. Furthermore, the motion controller 758 could comprise one or more analogue or digital joysticks for controlling the system unit 10.
  • In embodiments, the motion controller is operable to control a colour and/or intensity of light emitted by the light source 2005. Alternatively, the colour and/or intensity of the light source 2005 can be controlled by the system unit 10 by sending appropriate commands via the wireless link.
  • The control of the colour and/or intensity of the light source 2005 can be useful when two or more motion controllers are to be used to control the system unit 10. For example, each motion controller could be caused to have a different colour, thereby allowing the motion controllers to be distinguished from each other in the images captured by the camera 756. Furthermore, the colour of the light source can be controlled so as to contrast with an environmental background colour. This improves the reliability of the position tracking data because the light source is more easily distinguished from the background.
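  • A possible rule for choosing a contrasting light-source colour (an illustrative sketch; the embodiments do not prescribe a particular rule) is to rotate the average background hue of the captured camera image by 180 degrees while keeping the LED colour saturated and bright:

      import colorsys

      def contrasting_led_colour(avg_background_rgb):
          # avg_background_rgb: mean background colour from the captured camera
          # image as (r, g, b) in 0..255. Rotating the hue by 180 degrees is one
          # simple notion of "contrast".
          r, g, b = (channel / 255.0 for channel in avg_background_rgb)
          h, s, v = colorsys.rgb_to_hsv(r, g, b)
          h = (h + 0.5) % 1.0      # complementary hue
          s = max(s, 0.8)          # keep the LED colour strongly saturated
          v = 1.0                  # and bright, to aid tracking
          r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
          return tuple(int(round(c * 255)) for c in (r2, g2, b2))

      print(contrasting_led_colour((30, 120, 40)))  # greenish room -> magenta-ish LED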
  • In some embodiments, the colour and/or intensity of the light source 2005 can be controlled in response to game state or operational state. For example, a red colour could indicate that the motion controller is off or not paired with the system unit 10. In another example, the light source 2005 could be caused to turn red if a game player is shot in a game, or caused to turn blue or flash different colours if a game player casts a spell in a game.
  • FIG. 7 shows an example of a user 2015 using the motion controller to control the system unit 10 in accordance with embodiments of the present invention. As shown in FIG. 7, the user 2015 is using the motion controller 758 to point at the display 300. The system unit 10 is operable to analyse images captured by the camera 756 and correlate the motion data from the motion sensors to detect the relative position of the light source 2005 with respect to the camera 756 (as denoted by a vector A in FIG. 7). The cell processor 100 is therefore operable to detect the pointing direction (as indicated by a vector B) of the motion controller 758. The user 2015 can therefore use the motion controller 758 to control a game or other functionality of the system unit 10 by appropriate manipulation of the motion controller 758.
  • For example, the user 2015 could point the motion controller 758 at a menu item displayed as part of a menu on the display 300 and activate one of the input buttons 2010 so as to select that menu item. As another example, the user 2015 could use the motion controller 758 to point at a game object displayed on the display which they wish their game character to pick up and use within the game.
  • However, whilst this can provide some extra functionality, the user 2015 is limited in using the motion controller to point at objects displayed on the display 300. To provide additional functionality, in embodiments, the user 2015 can use the motion controller 758 to point at objects that are in front of an image plane of the display 300. This will now be described in more detail with reference to FIGS. 8 and 9.
  • FIG. 8 shows a schematic diagram of a user using the motion controller 758 to control the system unit 10 in accordance with embodiments of the present invention. In particular, FIG. 8 shows the user 2015 using the motion controller 758 to point at an object 2020. The position of the motion controller 758 with respect to the camera as indicated by the vector A is detected by the cell processor by analysis of the images captured by the camera 756. As mentioned above, the cell processor 100 is operable to carry out known image analysis techniques on the captured images so as to detect the location of the light source within the captured images.
  • The direction in which the motion controller is pointing (referred to herein as a pointing direction) is denoted by the vector B in FIG. 8. The cell processor 100 is operable to combine the motion data (such as pitch, roll, and yaw data and x, y, z position data) from the motion sensors of the motion controller 758 with the position data of the light source as generated from the captured images so as to detect the position and pointing direction of the motion controller 758.
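  • For illustration only, the pointing direction B can be represented as a unit vector derived from the pitch and yaw reported by the motion sensors, with the ray origin taken from the tracked position of the light source. The axis conventions below are assumptions:

      import math

      def pointing_vector(pitch_rad, yaw_rad):
          # Unit vector for the pointing direction B. Assumed conventions: yaw is
          # rotation about the vertical axis, pitch about the horizontal axis, and
          # the controller points along +z towards the display at pitch = yaw = 0.
          x = math.cos(pitch_rad) * math.sin(yaw_rad)
          y = math.sin(pitch_rad)
          z = math.cos(pitch_rad) * math.cos(yaw_rad)
          return (x, y, z)

      def pointing_ray(light_source_xyz, pitch_rad, yaw_rad):
          # Ray origin from optical tracking of the light source 2005, direction
          # from the controller's motion-sensor attitude.
          return light_source_xyz, pointing_vector(pitch_rad, yaw_rad)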
  • In embodiments, the object 2020 can be a real physical object. For example, the object 2020 could be a tennis ball as shown schematically in FIG. 8. As there are potentially a large number of real physical objects which a user could point at, in embodiments, the system unit 10 is operable to cause a list of possible objects for detection to be displayed on the display 300. For example, the list of possible objects for detection could comprise a ball, a face, a box and the like. In embodiments, each object is associated with a template for pattern matching by the cell processor 100. The cell processor 100 is operable to carry out known image recognition techniques and image processing techniques such as pattern matching and face detection so as to detect the position of the object 2020 with respect to the camera 756.
  • Whilst geometrically simple objects such as a ball or a box are likely to be less computationally expensive to detect, it will be appreciated that any suitable object could be detected subject to the necessary processing resources being available.
  • As mentioned above, in embodiments, the cell processor 100 is operable to carry out image processing techniques on the images captured by the camera so as to detect a three-dimensional position of the object 2020 with respect to the camera 756 as denoted by the vector C in FIG. 8. The detection of the horizontal and vertical position of the object can be carried out using known image processing and pattern matching techniques to locate the object in the captured images. However, to detect the three-dimensional position of the object 2020, the distance from the camera 756 to the object 2020 also needs to be detected.
  • In embodiments, the system unit 10 is operable to detect the distance from the camera 756 to the object in a similar manner to that described above for detecting the distance of the viewer. For example, a so-called “z-cam” (from 3DV Systems (http://www.3dvsystems.com/)) could be used, or a stereoscopic pair of cameras could be used to detect the distance to the object 2020. However, it will be appreciated that any suitable technique for detecting the distance between the object 2020 and the camera 756 could be used.
  • In the embodiment shown in FIG. 8, the motion controller 758 acts as a pointing source with the pointing direction being in the direction of the vector B. The cell processor 100 is operable to detect whether the three-dimensional position of the object 2020 lies on a line passing through the source position in the direction of the pointing direction of the pointing source. In other words, the cell processor 100 is operable to detect whether the object 2020 corresponds to the pointing direction of the motion controller 758. Therefore, the system unit 10 can detect whether the user 2015 is pointing at the object 2020.
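  • A minimal sketch of this test (the 5 cm tolerance is an illustrative assumption; the description only requires that the object lie on the pointing line) checks whether the object position is within a small perpendicular distance of the line through the source position along the pointing direction:

      import math

      def is_pointing_at(source_xyz, direction_xyz, object_xyz, tolerance_m=0.05):
          # True if object_xyz lies (approximately) on the line through source_xyz
          # along direction_xyz, in front of the source.
          to_object = [o - s for o, s in zip(object_xyz, source_xyz)]
          norm = math.sqrt(sum(d * d for d in direction_xyz))
          unit = [d / norm for d in direction_xyz]
          # Length of the source-to-object vector projected onto the pointing direction.
          along = sum(a * b for a, b in zip(to_object, unit))
          if along <= 0.0:
              return False   # the object is behind the pointing source
          # Perpendicular distance from the object to the pointing line.
          closest = [s + along * u for s, u in zip(source_xyz, unit)]
          offset = math.sqrt(sum((o - c) ** 2 for o, c in zip(object_xyz, closest)))
          return offset <= tolerance_m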
  • The system unit 10 is operable to enact functions of the system unit in response to detection that the user 2015 is pointing at the object 2020. For example, the user 2015 could point at an object to select that object for further image processing or to act as an object within a game. However, it will be appreciated that the pointing functionality of pointing at an object in front of the image plane of the display 300 could be used in any suitable way to interact with the system unit 10.
  • Although in the embodiment shown in FIG. 8 the motion controller 758 acts as the pointing source, in other embodiments the motion controller 758 need not be the pointing source. This will now be described in more detail with reference to FIG. 9.
  • FIG. 9 shows a schematic diagram of the user 2015 using the motion controller 758 to control the system unit 10 in accordance with embodiments of the present invention. However, in the embodiment illustrated with respect to FIG. 9, the system unit 10 is operable to cause the display 300 to display a game character 2025, which acts as the pointing source. In other words, in some embodiments, the pointing source corresponds to a position of the game character within a game executed by the system unit 10. However, in other embodiments, the pointing source corresponds to the position of the motion controller 758. In embodiments, the pointing source has an associated pointing direction indicative of a direction in which the pointing source is pointing.
  • The cell processor 100 is operable to determine the position of the game character 2025 with respect to the camera 756. As illustrated in FIG. 9, the position of the game character 2025 with respect to the camera is illustrated by the vector A′. To achieve this functionality, the position of the camera 756 with respect to the display 300 can be calibrated by a user via a suitable calibration user interface executed by the system unit 10 and displayed on the display 300. Alternatively the user 2015 could be instructed to place the camera 756 at a predetermined position with respect to the display 300 as instructed by a suitable message displayed on the display 300 by the system unit 10. As part of the calibration or setup process, a user may also input information regarding make and model of the display, screen size, display resolution and the like. Alternatively, this information may be acquired automatically by the system unit 10 via a suitable communication link between the display 300 and the system unit 10. However, it will be appreciated that any suitable method for determining the position of the camera with respect to the display 300 could be used.
  • As shown in FIG. 9, in some embodiments, the pointing source corresponds to the position of the game character 2025. In these embodiments, the game character is the source of the pointing direction as indicated by the vector B′. The pointing direction of the motion controller 758 (as indicated by a vector B″ with the motion controller 758 as the source of the vector B″) is associated with the pointing direction B′ of the game character 2025. In embodiments, the cell processor is operable to associate the pointing direction B′ of the game character 2025 with the pointing direction B″ of the motion controller 758 so that they are parallel with each other. However, any other appropriate association between the pointing direction of the game character 2025 and the pointing direction of the motion controller 758 could be used.
  • More generally, in embodiments, the pointing direction of the motion controller 758 (user control device) is associated with the pointing direction of the pointing source so that a change in the pointing direction of the motion controller 758 (user control device) causes a change in the pointing direction of the pointing source. In embodiments, this applies when the motion controller acts as the pointing source and when a computer generated object such as a game character acts as the pointing source. In other words, in embodiments, movement of the motion controller 758 such that the pointing direction of the motion controller 758 changes causes a change in the pointing direction of the pointing source.
  • As mentioned above, the system unit 10 is also operable to detect the position of the object 2020 with respect to the camera 756. Therefore, the three-dimensional position of the object 2020 with respect to the camera 756 can be detected in a similar manner to that described above with reference to FIG. 8. The position of the object 2020 is denoted by the vector C′ in FIG. 9.
  • As mentioned above, the pointing direction of the pointing source is associated with the pointing direction of the motion controller 758. In the embodiment illustrated in FIG. 9, the pointing direction B′ of the game character 2025 is associated with the pointing direction B″ of the motion controller 758 so that B′ is parallel to B″. For example, the user 2015 could manipulate the motion controller 758 so that the game character 2025 points at the object 2020. The cell processor 100 is operable to detect whether the three-dimensional position of the object (as defined by C′) lies on a line passing through the source position in the direction of the pointing direction of the pointing source (game character 2025).
  • Therefore, in embodiments, the cell processor 100 can detect whether the game character 2025 is pointing at the object 2020. Additionally, in some embodiments, the cell processor 100 is operable to generate the game character 2025 so that the pointing direction B′ is associated with the pointing direction B″ of the motion controller 758 so that the pointing direction B′ moves with changes in the pointing direction B″. Therefore, the user 2015 can control the pointing direction B′ of the game character 2025 by appropriate manipulation and control of the motion controller 758. In other words, in some embodiments, the pointing source corresponds to a position of a game character within a game executed by the entertainment device. However, it will be appreciated that the pointing source could correspond to a position of any suitable computer generated object generated by the system unit 10 (entertainment device).
  • In some embodiments, the game character 2025 is a two-dimensional image displayed on the display 300 by the system unit 10. However, using a two-dimensional image can mean that it is difficult to convey the pointing direction B′ of the game character to the user 2015. For example, it may be difficult to convey to the user 2015 that the game character 2025 is pointing at the object 2020 if the game character is displayed as a two-dimensional image. Therefore, in some embodiments, the cell processor 100 is operable to generate the game character 2025 so that when displayed on the display 300 the game character 2025 appears as a three-dimensional image. In embodiments, this is achieved using the 3D viewing described above with reference to FIGS. 4A, 4B, 5A, 5B, and 5C, although it will be appreciated that any other suitable technique could be used.
  • Furthermore, where the game character 2025 is displayed as a 3D representation, in some embodiments, the three-dimensional position of the game character 2025 may be generated by the cell processor 100 so that the three-dimensional position appears at an image plane which is different from the image plane of the display. In this case, the cell processor 100 is operable to determine the position A′ of the game character by calculating the apparent position of the game character with respect to the display 300. This technique will be described in more detail later below.
  • In the embodiments described above with reference to FIGS. 8 and 9, the object 2020 was treated as a real object. However, in some embodiments, the object 2020 is a computer generated object. For example the cell processor 100 could generate the object 2020 as a stereo pair of images such that the object appears in front of the display 300. However, the position of the object 2020 should then be determined so that a user can use the motion controller 758 to point at the object 2020. A way in which this may be achieved in accordance with embodiments of the present invention will now be described with reference to FIGS. 10 and 11.
  • FIG. 10 shows a schematic plan view of the user seen from above viewing the display at two different horizontal positions. The cell processor 100 is operable to cause the display 300 to display a stereo pair of images corresponding to the object 2020 so that, when viewed in a suitable manner (for example by using polarised glasses) the user 2015 will perceive the object 2020 in front of the display 300. In FIG. 10, the user 2015 is illustrated being positioned at a first position (position 1) and a second position (position 2), which is to the right of position 1. When the user 2015 is at position 1, then the user should perceive the object 2020 at a first object position 2035. However, when the user is at position 2, the user should perceive the object 2020 at a second object position 2040. Therefore, it will be understood from FIGS. 5A, 5B, 5C and 10 that the three-dimensional position at which the computer generated object will appear to the user will depend on the position of the user with respect to the display 300.
  • Therefore, in embodiments, the system is operable to detect the position of the user's face with respect to the display. In embodiments, the cell processor 100 is operable to carry out known face recognition techniques on the images captured by the camera 756 so as to detect the position of the user's face with respect to the camera 756. Provided the position of the camera with respect to the display 300 is known, the cell processor can calculate the position of the user's face with respect to the display. Furthermore, as the position at which the object will appear to the user depends on the distance from the user to the display as described above, in embodiments, the system unit 10 is operable to detect the distance from the display 300 to the user 2015 as described above.
  • To determine the position of the object with respect to the display, the object is assumed to lie on a line (as indicated by the dotted line 2045) between a midpoint 2050 between the stereo pair of images 2030 and a midpoint 2055 between the user's eyes. Additionally, the distance between the user's eyes can be estimated by the cell processor 100, either by analysis of the captured images or by using the average interpupillary distance (distance between the eyes) of adult humans. However, any other suitable technique for estimating the interpupillary distance could be used.
  • The cell processor 100 is operable to calculate lines between the stereo pair of images 2030 and the user's eyes as indicated by the dashed lines 2060 and 2065. As the distance between the stereo pair of images 2030 can be calculated from known parameters of the display such as display resolution, screen size and the like, the apparent position of the computer generated object in the horizontal and vertical directions can be estimated from the intersection of the lines 2045, 2060, and 2065. For example the lines 2045, 2060 and 2065 intersect at the first object position 2035. Therefore the computer generated object can reasonably be assumed to be perceived by the user at the first object position 2035. Similarly, when the user is at position 2, the object can be assumed to appear to the user at the second object position 2040. More generally, the apparent position of the object in the horizontal and vertical directions can be calculated by the cell processor for any stereo pair of images in a similar manner to that described above for the first object position 2035.
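  • As a worked sketch of this calculation (with the display plane at z = 0 and the viewer's eyes at z equal to the viewing distance; these coordinate conventions and the function name apparent_position_xz are assumptions for illustration), the apparent horizontal position and depth can be found by intersecting the lines of sight from each eye through its respective image element:

      def apparent_position_xz(left_eye_image_x, right_eye_image_x,
                               left_eye_x, right_eye_x, viewing_distance):
          # Display plane at z = 0, viewer's eyes at z = viewing_distance.
          # left_eye_image_x / right_eye_image_x: on-screen x of the image element
          # delivered to the left and right eye; left_eye_x / right_eye_x: x of the
          # viewer's eyes, all in the same (metre) coordinate system.
          p = right_eye_image_x - left_eye_image_x   # signed screen parallax
          e = right_eye_x - left_eye_x               # viewer eye separation
          if abs(p - e) < 1e-9:
              raise ValueError("lines of sight are parallel; no finite intersection")
          # Intersect the line from each eye through its image element.
          z = viewing_distance * p / (p - e)         # apparent depth in front of the display
          x = left_eye_image_x + (left_eye_x - left_eye_image_x) * z / viewing_distance
          return x, z

      # Example: a symmetric stereo pair with negative parallax, viewer 2 m away.
      print(apparent_position_xz(0.05, -0.05, -0.0325, 0.0325, 2.0))  # (~0.0, ~1.21)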
  • In some embodiments, the line 2045 is not calculated so as to save processing resources. However, the use of three intersecting lines to calculate the apparent position of the object can improve the accuracy of the calculation.
  • Here, the directions horizontal (x) and vertical (y) are taken to mean directions in the image plane of the display. The depth (z) is taken to mean the distance from the display 300. However, it will be appreciated that any other suitable coordinate system could be used to define the apparent position of the object, the user, the motion controller, and the like.
  • FIG. 10 illustrates how the horizontal position (x) of the object and the depth of the object from the display (z) may be determined. However, the vertical position (y) also needs to be determined. A technique by which this may be achieved will now be described with reference to FIG. 11.
  • FIG. 11 shows a schematic diagram of the user 2015 using the motion controller 758 to point at a computer generated object as seen from one side of the display 300. As mentioned above, the system unit 10 is operable to detect the distance of the user 2015 from the display 300. Additionally, the cell processor 100 is operable to detect the position of the user's eyes 2090 with respect to the camera 756 using known eye detection techniques (in some cases in combination with known face recognition techniques). In embodiments, the cell processor 100 is operable to detect by analysis of the captured images and depth data generated by the distance detector, a vector D (as illustrated by dashed line 2070 in FIG. 11) which indicates a position of the user's eyes 2090 with respect to the camera 756. As mentioned above, the position of the camera 756 with respect to the display 300 can be input by a suitable calibration process or preset in any other suitable way. Therefore, the cell processor 100 is operable to detect the distance between the user's eyes 2090 and the display 300 using appropriate trigonometry calculations.
  • As shown in FIG. 11, the system unit 10 is operable to cause computer generated objects (such as the stereo pair of images 2030 and a stereo pair of images 2075) to be displayed on the display 300. As FIG. 11 shows a side view of the display, it will be appreciated that the stereo pair 2030 and the stereo pair 2075 will appear as one image when viewed from the side because they share the same horizontal axis, although each stereo pair in fact comprises two images. When viewed in an appropriate manner by the user 2015, the stereo pair of images 2075 should cause the user 2015 to perceive a virtual object at a position 2080 in front of the display 300.
  • The vertical position of a computer generated object is assumed to lie on a line between the user's eyes 2090 and the position of the stereo pair of images corresponding to that object. For example, the position 2080 lies on a line (indicated by the dashed line 2095) between the user's eyes 2090 and the stereo pair of images 2075. Additionally, the position 2035 lies on a line (indicated by dashed line 3000) between the user's eyes 2090 and the stereo pair of images 2030. As mentioned above, the horizontal position (x) and depth (z) can be determined using the technique described above with reference to FIG. 10. In embodiments, the system unit 10 is operable to calculate the vertical position (y) of a computer generated object from the relative position of the user's eyes 2090 with respect to the display 300 and the position of the stereo pairs of images as displayed on the display 300 using appropriate trigonometric calculations.
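  • A corresponding sketch for the vertical position (again with illustrative coordinate conventions) uses similar triangles along the line from the user's eyes to the stereo pair on the display, given the apparent depth already obtained from the horizontal calculation:

      def apparent_vertical_position(screen_image_y, eye_y, viewing_distance, object_z):
          # Similar triangles along the line from the user's eyes (at
          # z = viewing_distance) to the stereo pair on the display (at z = 0).
          # object_z is the apparent depth in front of the display obtained from
          # the horizontal (x-z) calculation.
          return screen_image_y + (eye_y - screen_image_y) * object_z / viewing_distance

      # Example: stereo pair 0.3 m up the screen, eyes 0.4 m up and 2 m away,
      # object appearing 1.2 m in front of the display.
      print(apparent_vertical_position(0.3, 0.4, 2.0, 1.2))  # 0.36 m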
  • As mentioned above, the system unit 10 is operable to detect whether the three-dimensional position of an object lies on a line passing through the source position in the direction of the pointing direction of the pointing source. In the embodiment illustrated in FIG. 11, the motion controller 758 is the pointing source, and the pointing direction of the motion controller 758 (as indicated by a dotted line 2085 in FIG. 11) corresponds to the three-dimensional position of the object 2035. Therefore, the system unit 10 can detect that the user 2015 is pointing at the position 2035 rather than the position 2080.
  • In other words, by combining the distance data from the camera 756 with the positions of the motion controller 758 and the computer generated objects (such as the position 2035 and the position 2080), the cell processor 100 can detect which object is being pointed at by the user.
  • Accordingly, embodiments of the present invention provide an enhanced technique for allowing a user to interact with an entertainment device. For example, in the case of a computer generated object, the user could use the motion controller 758 to point at the object to select the object in a game, or to pick up the object in a game. In the case of a real object, the user could select a real object for further image processing or as a real object for augmented reality applications. However, it will be appreciated that the embodiments described herein could enable any suitable functionality in which a user can point at an object (computer generated or a real physical object) to be achieved.
  • In some embodiments, the cell processor 100 is operable to generate a three-dimensional image such as a virtual image feature so that the three-dimensional image appears to be associated with the source position and pointing direction of the motion control device 758. This will now be described in more detail with reference to FIGS. 12 and 13.
  • FIG. 12 is a schematic diagram of a three-dimensional (3D) image display region shown with respect to the display 300 seen from above. The 3D image display region 3010 is a region in front of the display 300 in which a three-dimensional image (for example a stereo pair of images) can be perceived as three-dimensional by the user 2015. The 3D image display region 3010 occurs where the viewpoint of both of the user's eyes 2090 overlap. The origin of the 3D image display region can also be understood by referring to FIGS. 5B and 5C.
  • Outside of the 3D image display region 3010, only one of the left or the right image of a stereo pair of images is likely to be seen by one eye. Therefore, outside the 3D image display region, the 3D effect will not be perceived correctly because only one image of the stereo pair is likely to be viewable by the user's eyes 2090. This can cause the user 2015 to experience headaches and/or nausea.
  • In embodiments, the system unit 10 is operable to detect the 3D image display region by detecting the position of the user's eyes 2090 with respect to the display. In embodiments, the cell processor 100 is operable to analyse the images captured by the camera 756 so as to detect the horizontal (x) and vertical (y) positions of the user's eyes with respect to the display 300. Additionally, as mentioned above, the system unit 10 is operable to detect the distance (z) of the user from the display 300. Typically, in the x-z plane (i.e. a plane normal to the image plane of the display when viewed from above), the 3D image display region corresponds to a triangular region bounded by: a line 3015 from the user's right eye 3020 to a left-hand edge 3025 of a display area of the display 300; a line 3030 from the user's left eye 3035 to a right-hand edge 3040 of the display area of the display 300; and a line 3045 corresponding to an image plane of the display 300.
  • In the y-z plane (i.e. a plane normal to the image plane of the display when viewed from the side), the 3D image display region corresponds to a triangle from the user's eyes 2090 to the display 300. This is illustrated in more detail with respect to FIG. 13. FIG. 13 shows a schematic diagram of the 3D image display region viewed from the side. The 3D image display region 3010 is bounded by: a line 3050 from the user's eyes 2090 to a lower edge 3055 of the display area of the display 300; a line 3060 from the user's eyes 2090 to an upper edge 3065 of the display area of the display 300; and a line 3070 corresponding to the image plane of the display 300. The cell processor 100 can then therefore determine the 3D image display region based on the position of the user's eyes 2090 with respect to the display 300.
  • In other words, when considered in three dimensions, the 3D image display region 3010 can be thought of as an irregular four-sided pyramid with a base corresponding to the display area of the display 300 and the apex located between the display and the user's eyes 2090. However, it will be appreciated that any other suitable 3D image display region could be used.
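  • The following sketch tests whether an apparent 3D position lies inside such a pyramid-shaped region. The coordinate conventions, the parameterisation by display edges and eye positions, and the function name in_3d_display_region are assumptions for illustration:

      def in_3d_display_region(point, display_edges, eyes, viewing_distance):
          # point: (x, y, z) of the apparent object, z measured from the display
          #        towards the viewer (display plane at z = 0).
          # display_edges: (left, right, bottom, top) of the display area, metres.
          # eyes: (left_eye_x, right_eye_x, eye_y) in the same coordinates.
          x, y, z = point
          left, right, bottom, top = display_edges
          left_eye_x, right_eye_x, eye_y = eyes
          d = viewing_distance
          if not 0.0 <= z < d:
              return False
          t = z / d
          # Horizontal bounds: right eye -> left edge (3015), left eye -> right edge (3030).
          x_min = left + (right_eye_x - left) * t
          x_max = right + (left_eye_x - right) * t
          # Vertical bounds: eyes -> lower edge (3050), eyes -> upper edge (3060).
          y_min = bottom + (eye_y - bottom) * t
          y_max = top + (eye_y - top) * t
          return x_min <= x <= x_max and y_min <= y <= y_max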
  • In some embodiments, the system unit 10 is operable to generate the three-dimensional image so that it appears to lie along the pointing direction B of the motion controller 758. This is illustrated with respect to FIG. 13.
  • As shown in FIG. 13, the system unit 10 is operable to generate a 3D image 4000. In the example shown in FIG. 13, the 3D image corresponds to flames which are caused to appear to come from the motion controller 758 along the pointing direction B. The user 2015 can therefore manipulate the motion controller 758 to control where the 3D image appears to point. Therefore, the motion controller 758 could act as a flame thrower within a game with the flames appearing to come from the motion controller. However, any other suitable functionality could be realised such as tracer bullets, a laser pointer and the like. By generating the 3D image 4000 so that it appears to emanate from the motion controller 758, for example to provide an illusion that the user 2015 is pointing or firing a weapon into the display 300, embodiments of the present invention can create a more powerful and immersive 3D experience for the user 2015.
  • In embodiments, the cell processor 100 is operable to generate the 3D image so that it is positioned within the 3D image display region 3010. If the motion controller 758 is detected to be positioned outside the 3D image display region 3010, the cell processor 100 is operable to generate the 3D image 4000 so that the 3D image 4000 only appears within the 3D image display region 3010 rather than outside. This helps prevent the user 2015 from experiencing headaches and/or nausea.
  • The use of a 3D image display region can also be important when rendering 3D images for 3D games which use pseudo-realistic physics engines. Typically, such video games may use realistic physics models to predict the motion of objects within a game such as debris from an explosion. The motion of objects within physics based video games or physics based virtual reality simulations typically allows 3D objects to travel unrestricted through the virtual environment. However, this may mean that objects may quickly move out of the 3D image display region, thus degrading the 3D effect and possibly causing the user to experience headaches and/or nausea. Therefore, to address this problem, in some embodiments, the system unit 10 is operable to control the appearance of the 3D image in dependence upon the relative position of the 3D image with respect to the 3D image display region. This will now be described in more detail with reference to FIGS. 14 and 15.
  • FIG. 14 shows a schematic view of control of the appearance of a 3D image in accordance with embodiments of the present invention. In particular, FIG. 14 shows the user's eyes 2090 and the display 300. As mentioned above, the system unit 10 is operable to generate a 3D image. In some embodiments, the system unit is operable to generate the 3D image so that, over a sequence of image frames, the 3D image appears to move towards the user 2015.
  • For example, a 3D object may be caused by a so-called physics engine to move along a trajectory 4010 as indicated in FIG. 14. However, this may cause the object to leave the 3D image display region 3010 over relatively few image frames. Therefore, in some embodiments, the cell processor 100 is operable to control the apparent motion of the 3D image so that the 3D image appears to remain within the 3D image display region 3010.
  • This is illustrated schematically in FIG. 14. At a time t1, the 3D image appears at a position 4000 a. At a time t2, the 3D image appears at a position 4000 b. At a time t3, the 3D image appears at a position 4000 c. At a time t4, the 3D image appears at a position 4000 d. Here t1, t2, t3, and t4 refer to image frames of an image sequence. Typically, each image frame lasts 1/25 of a second (a frame rate of 25 frames per second), although it will be appreciated that any suitable frame rate could be used.
  • Therefore, over t1 to t4, the 3D image will appear to move towards the user 2015 but remain within the 3D image display region 3010. In other words, in embodiments, the cell processor 100 is operable to steer the object so that it follows a trajectory 4015 and the object remains within the 3D image display region 3010.
  • In contrast, referring to FIG. 14, if the object followed the trajectory 4010, it would leave the 3D image display region 3010 after the time t2. Therefore, by controlling the path or trajectory of the object so that the object remains within the 3D image display region, embodiments of the present invention can reduce the likelihood that the user will experience any nausea and/or headaches.
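  • One possible, purely illustrative way of realising such steering, again building on the Python sketch above, is to advance the object along its physics velocity and, if the tentative position would leave the region, pull it back towards an interior anchor point by bisection; because the region is convex, the returned position is always inside. The anchor point and step count below are illustrative choices.

    def steer_into_region(position, velocity, dt, eye, half_w, half_h, steps=16):
        # Advance the object by one physics step, steering it back inside
        # the 3D image display region if the raw physics step would leave it.
        tentative = Vec3(position.x + velocity.x * dt,
                         position.y + velocity.y * dt,
                         position.z + velocity.z * dt)
        if inside_display_region(tentative, eye, half_w, half_h):
            return tentative
        # Interior anchor: halfway along the axis from the screen centre to the eyes.
        anchor = Vec3(eye.x * 0.5, eye.y * 0.5, eye.z * 0.5)
        lo, hi = 0.0, 1.0        # 0 -> anchor (inside), 1 -> tentative (outside)
        for _ in range(steps):
            mid = 0.5 * (lo + hi)
            candidate = Vec3(anchor.x + mid * (tentative.x - anchor.x),
                             anchor.y + mid * (tentative.y - anchor.y),
                             anchor.z + mid * (tentative.z - anchor.z))
            if inside_display_region(candidate, eye, half_w, half_h):
                lo = mid
            else:
                hi = mid
        return Vec3(anchor.x + lo * (tentative.x - anchor.x),
                    anchor.y + lo * (tentative.y - anchor.y),
                    anchor.z + lo * (tentative.z - anchor.z))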
  • In some embodiments, the cell processor 100 is operable to control the appearance of the 3D image so that the 3D image appears to fade when a position of the 3D image is within a threshold distance of an edge of the 3D image display region 3010. This is illustrated in FIG. 14.
  • As shown in FIG. 14, the physics engine of a game executed by the system unit 10 may cause the object to follow a trajectory 4020. For example, the cell processor 100 may cause the 3D image to appear at a position 4000 e at a time t5, at a position 4000 f at a time t6, at a position 4000 g at a time t7, and at a position 4000 h at a time t8. In embodiments, the cell processor 100 is operable to detect when the object is within a threshold distance 4050 of an edge (such as edge 4055) of the 3D image display region 3010.
  • If the object is within the threshold distance 4050 of the edge 4055 of the 3D image display region 3010, then the cell processor causes the 3D image to appear to fade at the positions 4000 f, 4000 g, and 4000 h over the corresponding image frame times t6, t7, and t8.
  • However, it will be appreciated that the 3D image could be caused to fade over any suitable number of image frames and that any suitable threshold distance could be used. It should be appreciated that the term “fade” can be taken to mean become more transparent, become less bright or any other suitable image processing operation that causes the object to become less apparent over a sequence of image frames.
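  • A simplified sketch of such a fade, reusing the helpers above, is given below; it approximates the distance to the side faces of the region by measuring the margin on the display plane and scaling it back to the object's depth, which is an illustrative simplification rather than a true distance-to-boundary computation.

    def fade_alpha(p, eye, half_w, half_h, threshold):
        # Opacity factor in [0, 1]: 1.0 well inside the 3D image display
        # region, falling to 0.0 as the object comes within `threshold`
        # of the region's side faces (or leaves the region entirely).
        if not inside_display_region(p, eye, half_w, half_h):
            return 0.0
        s = (eye.z - p.z) / eye.z
        bx = eye.x + (p.x - eye.x) / s
        by = eye.y + (p.y - eye.y) / s
        margin_on_screen = min(half_w - abs(bx), half_h - abs(by))
        margin = margin_on_screen * s       # approximate margin at the object's depth
        return max(0.0, min(1.0, margin / threshold))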
  • In some embodiments, virtual objects which have a predefined relationship to the virtual environment (such as trees at the side of a road) or choreographed objects are caused to fade. However, in some embodiments, free-moving objects such as virtual objects whose position is generated by a physics based engine (e.g. explosion debris) can be steered as described above. Accordingly, the likelihood that the user experiences nausea and/or headaches is reduced.
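  • Purely by way of example, that policy could be expressed as a small per-object update of the following kind, reusing the steer_into_region and fade_alpha sketches above; the SceneObject record and its physics_driven flag are illustrative assumptions rather than elements of any embodiment.

    @dataclass
    class SceneObject:
        position: Vec3
        velocity: Vec3
        physics_driven: bool    # True for free-moving, physics-generated objects
        alpha: float = 1.0

    def update_object(obj, dt, eye, half_w, half_h, threshold):
        # Free-moving physics objects are steered back into the region;
        # choreographed or environment-locked objects are faded instead.
        if obj.physics_driven:
            obj.position = steer_into_region(obj.position, obj.velocity, dt,
                                             eye, half_w, half_h)
        else:
            obj.position = Vec3(obj.position.x + obj.velocity.x * dt,
                                obj.position.y + obj.velocity.y * dt,
                                obj.position.z + obj.velocity.z * dt)
            obj.alpha = fade_alpha(obj.position, eye, half_w, half_h, threshold)
        return obj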
  • Additionally, the use of a 3D image display region can help improve the display of icons for controlling the system unit 10. This will now be described with reference to FIG. 15.
  • FIG. 15 shows a schematic diagram of a 3D icon field in accordance with embodiments of the present invention. In particular, FIG. 15 shows a plurality of icons 5005, 5010, 5015, 5020, 5025, 5030, and 5035. The cell processor 100 together with the other functional units of the system unit 10 is operable to generate one or more icons such as those illustrated in FIG. 15. In embodiments, each icon is associated with a control function of the system unit 10. The plurality of icons 5005 to 5035 comprise one or more subgroups of icons. As shown in FIG. 15, a first subgroup comprises the icons 5005, 5010, 5015 and 5020. A second subgroup comprises the icons 5025 and 5030. A third subgroup comprises the icon 5035. However, it will be appreciated that any number of icons could be used and that each subgroup could comprise one or more icons as appropriate.
  • In embodiments, the cell processor 100 is operable to generate the subgroups so that each respective subgroup appears located at a respective distance from the display. As illustrated in FIG. 15, the first subgroup is caused to appear at a distance f from the display 300. The second subgroup is caused to appear at a distance g from the display 300. The third subgroup is caused to appear at a distance h from the display 300.
  • To try to ensure that the icons are positioned within the 3D image display region 3010 so as to preserve the 3D effect and reduce nausea and/or headaches on the part of the user, in embodiments, the cell processor is operable to control the number of icons in each subgroup in dependence upon the relative position of each subgroup with respect to the 3D image display region 3010. For example, at the distance h from the display, there is only space for one icon to be displayed, whereas, at the distance f, four icons can be displayed without having to display some or all of one or more icons outside the 3D image display region 3010. In other words, the cell processor 100 is operable to dynamically adjust the number of icons in each subgroup in dependence on the relative position of the subgroup with respect to the 3D image display region 3010.
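  • As a worked illustration (with made-up numbers rather than values taken from any embodiment): because the region narrows towards the user, the number of equally sized icons that fit across it falls with apparent distance from the display. The sketch below, reusing the coordinate conventions above, computes that number; with the eyes 1.0 m from a screen of half-width 0.5 m and icons 0.18 m wide, subgroups at 0.2 m, 0.5 m and 0.8 m in front of the screen would hold four, two and one icons respectively, mirroring the pattern of FIG. 15.

    def icons_that_fit(distance_from_display, eye, half_w, icon_width, gap=0.0):
        # Number of icons of width `icon_width` (separated by `gap`) that fit
        # in a row across the 3D image display region at the given apparent
        # distance in front of the display.
        if distance_from_display >= eye.z:
            return 0
        s = (eye.z - distance_from_display) / eye.z
        row_width = 2.0 * half_w * s        # width of the region at that depth
        return max(0, int((row_width + gap) // (icon_width + gap)))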
  • In some embodiments, more frequently accessed icons can be displayed closer to the user whilst less frequently used icons can be displayed closer to the display 300. However, any other suitable method for ordering the icons could be used. Additionally, in some embodiments, the icons can be generated by the cell processor 100 so that they appear semi-transparent so that the user can see icons which may appear to be behind other icons. However, any other suitable method of displaying the icons could be used.
  • A method of detecting where the motion controller is pointing will now be described with reference to FIG. 16.
  • FIG. 16 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • At a step s10, the cell processor 100 causes an image (such as a control screen, game image and the like) to be displayed to the user 2015.
  • At a step s12, the system unit 10 detects a three-dimensional position of an object in front of an image plane of the display. For example, the object could be a real object or a computer generated object as described above with reference to FIGS. 8 to 11.
  • Then, at a step s14, the system unit 10 detects a source position of a pointing source with respect to the display using techniques as described above with reference to FIGS. 6 to 11. As mentioned above, the pointing source has an associated pointing direction indicative of a direction in which the pointing source is pointing.
  • At a step s16, the system unit detects a pointing direction of a user control device (such as the motion controller 758). As mentioned above, the pointing direction of the user control device (e.g. motion controller 758) is associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source.
  • Then, at a step s18, the system unit 10 detects whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source. In other words, the system unit 10 can detect whether the user is using the motion controller 758 to point at a real physical object and/or a computer generated object in front of the display.
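  • A minimal sketch of the alignment test of step s18, again using the math import and Vec3 helper introduced above, might look as follows; the tolerance parameter (how close to the line the object must be to count as being pointed at) is an illustrative assumption.

    def points_at(source, direction, target, tolerance):
        # True if `target` lies (within `tolerance`) on the line through the
        # pointing source's position along its pointing direction, and in
        # front of the source rather than behind it.
        length = math.sqrt(direction.x ** 2 + direction.y ** 2 + direction.z ** 2)
        d = Vec3(direction.x / length, direction.y / length, direction.z / length)
        v = Vec3(target.x - source.x, target.y - source.y, target.z - source.z)
        along = v.x * d.x + v.y * d.y + v.z * d.z    # projection onto the ray
        if along < 0.0:
            return False                             # target is behind the source
        # Perpendicular distance from the target to the ray.
        px = v.x - along * d.x
        py = v.y - along * d.y
        pz = v.z - along * d.z
        return math.sqrt(px * px + py * py + pz * pz) <= tolerance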
  • A method of associating a three-dimensional image with the motion controller will now be described with reference to FIG. 17.
  • FIG. 17 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • At a step s20, the system unit 10 generates a 3D image such as the 3D image 4000 described with reference to FIG. 13.
  • At a step s22, the system unit 10 detects, with respect to the display 300, the 3D image display region 3010 in which the 3D image can be perceived as three-dimensional by the user 2015 using the techniques described above.
  • Then, at a step s24, the system unit detects a source position and pointing direction of a user control device (such as motion controller 758) for controlling the entertainment device (such as system unit 10). The source position is indicative of a position of the user control device (such as motion controller 758) with respect to the display 300, and the pointing direction is indicative of a direction in which the user control device is pointing with respect to the display 300. In embodiments, the source position and pointing direction of the user control device are detected using the techniques described above. However, it will be appreciated that any other suitable method for detecting the source position and pointing direction of the user control device could be used.
  • At a step s26, the system unit 10 generates the 3D image within the 3D image display region so that the 3D image appears to be associated with the source position and pointing direction of the user control device. In embodiments, the system unit 10 generates the 3D image so that it appears to be associated with the source position and pointing direction of the user control device in a similar manner to that described above with reference to FIG. 13. However, it will be appreciated that any other suitable technique could be used.
  • A method for controlling the appearance of a 3D image in dependence upon its relative position with respect to a 3D image display region will now be described with reference to FIG. 18.
  • FIG. 18 is a flow chart of an entertainment method in accordance with embodiments of the present invention.
  • At a step s30, the system unit 10 generates a 3D image, such as one of the 3D images described above with reference to FIGS. 14 and 15, for displaying on a display (such as the display 300).
  • Then, at a step s32, the system unit 10 detects the 3D image display region in a similar manner to that described above with reference to FIGS. 12 and 13. As mentioned above, the 3D image display region is an image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display.
  • At a step s34, the system unit 10 detects the apparent position of the 3D image with respect to the display. In embodiments, the position of the 3D image is detected in a similar manner to the techniques described above for determining the position of the computer generated object (described with reference to FIGS. 10 and 11). However, it will be appreciated that any other suitable method for detecting the apparent position of the 3D image could be used.
  • At a step s36, the system unit 10 controls the appearance of the 3D image in dependence upon the relative position of the three-dimensional image with respect to the image display region. In embodiments, the appearance of the 3D image can be controlled in a similar manner to that described above with respect to FIGS. 14 and 15, although it will be appreciated that any other suitable technique could be used.
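  • By way of example only, steps s30 to s36 could be combined into a per-frame pass of the following kind, reusing the update_object sketch above; this is a simplified illustration rather than a complete rendering pipeline.

    def render_frame(objects, dt, eye, half_w, half_h, threshold):
        # One frame of appearance control: each generated 3D object has its
        # apparent position examined against the 3D image display region and
        # is either steered or faded; fully faded objects are dropped.
        updated = [update_object(obj, dt, eye, half_w, half_h, threshold)
                   for obj in objects]
        return [obj for obj in updated if obj.alpha > 0.0]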
  • It will be appreciated that any or all of the above techniques could be combined as appropriate. For example, a user could use the motion controller to point at, and select, an icon displayed on the display in a similar manner to that described with reference to FIG. 15. However, any appropriate functionality achieved by appropriate combination of the above embodiments could be used.
  • Additionally, although the use of one motion controller 758 has been described above, any number of motion controllers could be used to implement the functionality of the embodiments described above. For example, a first user could use a first motion controller to point at a first object and a second user could use a second motion controller to point at a second object.
  • Additionally, any real physical object could be pointed to. In some embodiments, the system unit 10 is operable to implement an augmented reality application for an augmented reality environment using known techniques. In these embodiments, the real physical object to which a user can point using the motion controller 758 could be an augmented reality marker, although any other suitable real physical object could be used.
  • The various methods set out above may be implemented by adaptation of an existing entertainment device, for example by using a computer program product comprising processor implementable instructions stored on a data carrier (a storage medium) such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable for use in adapting the existing equivalent device.
  • In conclusion, although a variety of embodiments have been described herein, these are provided by way of example only, and many variations and modifications on such embodiments will be apparent to the skilled person and fall within the scope of the present invention, which is defined by the appended claims and their equivalents.
  • Although illustrative embodiments of the invention have been described in detail herein with reference to the accompanying drawings, it is to be understood that the invention is not limited to those precise embodiments, and that various changes and modifications can be effected therein by one skilled in the art without departing from the scope and spirit of the invention as defined by the appended claims.

Claims (23)

1. An entertainment device comprising:
a display to display an image to a user;
an object position detector to detect a three-dimensional position of an object in front of an image plane of the display;
a source position detector to detect a source position of a pointing source with respect to the display, the pointing source having an associated pointing direction indicative of a direction in which the pointing source is pointing;
a direction detector to detect a pointing direction of a user control device, the pointing direction of the user control device being associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source; and
an alignment detector to detect whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
2. A device according to claim 1, in which:
the object is a computer generated object;
the image comprises the computer generated object; and
the entertainment device is operable to generate the computer generated object so that, when displayed on the display, the computer generated object appears to be positioned at the three-dimensional position.
3. A device according to claim 2, comprising:
a user position detector to detect a position of the user with respect to the display; and
a computer generated object position detector to detect an apparent position of the computer generated object with respect to the display.
4. A device according to claim 1, in which:
the object is a real object.
5. A device according to claim 4, in which the object position detector comprises:
a camera operable to capture one or more images of the real object; and
a processor operable to carry out image processing on the captured images so as to detect the three-dimensional position of the real object.
6. A device according to claim 5, in which the camera comprises a distance detector to detect a relative distance between the camera and the real object.
7. A device according to claim 1, in which the pointing source corresponds to a position of a game character within a game executed by the entertainment device.
8. A device according to claim 1, in which the pointing source corresponds to the position of the user control device.
9. A device according to claim 8, comprising:
a generator to generate a three-dimensional image to be displayed on a display;
an image region detector to detect, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display;
in which the generator is operable to generate the three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the position and the pointing direction of the user control device.
10. An entertainment system comprising:
an object position detector to detect a three-dimensional position of an object in front of an image plane of a display;
a source position detector to detect a source position of a pointing source with respect to the display, the pointing source having an associated pointing direction indicative of a direction in which the pointing source is pointing;
a direction detector to detect a pointing direction of a user control device, the pointing direction of the user control device being associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source; and
an alignment detector to detect whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
11. An entertainment method comprising:
displaying an image on a display to a user;
detecting a three-dimensional position of an object in front of an image plane of the display;
detecting a source position of a pointing source with respect to the display, the pointing source having an associated pointing direction indicative of a direction in which the pointing source is pointing;
detecting a pointing direction of a user control device, the pointing direction of the user control device being associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source; and
detecting whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
12. An entertainment device comprising:
a generator to generate a three-dimensional image to be displayed on a display;
an image region detector to detect, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display; and
a control device detector to detect a source position and pointing direction of a user-operated control device for controlling the entertainment device, the source position being indicative of a position of the user-operated control device with respect to the display, and the pointing direction being indicative of a direction in which the user-operated control device is pointing with respect to the display;
in which the generator is operable to generate the three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the source position and the pointing direction of the user-operated control device.
13. A device according to claim 12, in which the three-dimensional image is associated with the source position and the pointing direction of the user-operated control device so that the three-dimensional image appears to lie along the pointing direction.
14. An entertainment method comprising:
generating, using an entertainment device, a three-dimensional image to be displayed on a display;
detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display;
detecting a source position and pointing direction of a user control device for controlling the entertainment device, the source position being indicative of a position of the user control device with respect to the display, and the pointing direction being indicative of a direction in which the user control device is pointing with respect to the display; and
generating the three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the source position and the pointing direction of the user control device.
15. An entertainment device comprising:
a generator to generate a three-dimensional image to be displayed on a display;
an image region detector to detect, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display;
an apparent position detector to detect the apparent position of the three-dimensional image with respect to the display; and
a controller to control an appearance of the three-dimensional image in dependence upon the relative position of the three-dimensional image with respect to the image display region.
16. A device according to claim 15, in which the generator is operable to generate the three-dimensional image so that, over a sequence of image frames, the three-dimensional image appears to move towards the viewer.
17. A device according to claim 16, in which the controller is operable to control apparent motion of the three-dimensional image so that the three-dimensional image appears to remain within the three-dimensional image display region.
18. A device according to claim 16, in which the controller is operable to control the appearance of the three-dimensional image so that the three-dimensional image appears to fade when a position of the three-dimensional image is within a threshold distance of an edge of the three-dimensional image display region.
19. A device according to claim 15, in which:
the generator is operable to generate one or more icons each associated with a function of the device, the one or more icons comprising one or more subgroups of icons generated by the generator so that the respective subgroups appear located at respective distances from the display; and
the controller is operable to cause the generator to control the number of icons in each subgroup in dependence upon the relative position of each subgroup with respect to the three-dimensional image display region.
20. An entertainment method comprising:
generating a three-dimensional image to be displayed on a display;
detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display;
detecting an apparent position of the three-dimensional image with respect to the display; and
controlling an appearance of the three-dimensional image in dependence upon the relative position of the three-dimensional image with respect to the image display region.
21. A tangible, non-transitory computer program product comprising a storage medium on which is stored computer readable program code, the program code, when executed by a processor, causing the processor to perform an entertainment method,
the method comprising:
displaying an image on a display to a user;
detecting a three-dimensional position of an object in front of an image plane of the display;
detecting a source position of a pointing source with respect to the display, the pointing source having an associated pointing direction indicative of a direction in which the pointing source is pointing;
detecting a pointing direction of a user control device, the pointing direction of the user control device being associated with the pointing direction of the pointing source so that a change in the pointing direction of the user control device causes a change in the pointing direction of the pointing source; and
detecting whether the three-dimensional position lies on a line passing through the source position in the direction of the pointing direction of the pointing source.
22. A tangible, non-transitory computer program product comprising a storage medium on which is stored computer readable program code, the program code, when executed by a processor, causing the processor to perform an entertainment method, the method comprising:
generating, using an entertainment device, a three-dimensional image to be displayed on a display;
detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display;
detecting a source position and pointing direction of a user control device for controlling the entertainment device, the source position being indicative of a position of the user control device with respect to the display, and the pointing direction being indicative of a direction in which the user control device is pointing with respect to the display; and
generating the three-dimensional image within the image display region so that the three-dimensional image appears to be associated with the source position and the pointing direction of the user control device.
23. A tangible, non-transitory computer program product comprising a storage medium on which is stored computer readable program code, the program code, when executed by a processor, causing the processor to perform an entertainment method, the method comprising:
generating a three-dimensional image to be displayed on a display;
detecting, with respect to the display, a three-dimensional image display region in which the three-dimensional image can be perceived as three-dimensional by a viewer viewing the display;
detecting an apparent position of the three-dimensional image with respect to the display; and
controlling an appearance of the three-dimensional image in dependence upon the relative position of the three-dimensional image with respect to the image display region.
US13/150,357 2010-06-03 2011-06-01 Entertainment device and entertainment methods Abandoned US20110306413A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1009249.2 2010-06-03
GB1009249.2A GB2481366B (en) 2010-06-03 2010-06-03 Entertainment device and entertainment methods

Publications (1)

Publication Number Publication Date
US20110306413A1 true US20110306413A1 (en) 2011-12-15

Family

ID=42471060

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/150,357 Abandoned US20110306413A1 (en) 2010-06-03 2011-06-01 Entertainment device and entertainment methods

Country Status (2)

Country Link
US (1) US20110306413A1 (en)
GB (1) GB2481366B (en)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6731988B1 (en) * 1992-01-21 2004-05-04 Sri International System and method for remote endoscopic surgery
US5394202A (en) * 1993-01-14 1995-02-28 Sun Microsystems, Inc. Method and apparatus for generating high resolution 3D images in a head tracked stereo display system
GB2297876A (en) * 1995-02-09 1996-08-14 Sharp Kk Observer tracking autostereoscopic display
US6894716B1 (en) * 1999-10-01 2005-05-17 Xerox Corporation Method and apparatus for identifying a position of a predetermined object in free space using a video image
US6985290B2 (en) * 1999-12-08 2006-01-10 Neurok Llc Visualization of three dimensional images and multi aspect imaging
SG115546A1 (en) * 2003-06-23 2005-10-28 Affineon Technologies Pte Ltd Computer input device tracking six degrees of freedom
US9405372B2 (en) * 2006-07-14 2016-08-02 Ailive, Inc. Self-contained inertial navigation system for interactive control using movable controllers
US8698735B2 (en) * 2006-09-15 2014-04-15 Lucasfilm Entertainment Company Ltd. Constrained virtual camera control
IT1396752B1 (en) * 2009-01-30 2012-12-14 Galileo Avionica S P A Ora Selex Galileo Spa VISUALIZATION OF A THREE-DIMENSIONAL VIRTUAL SPACE GENERATED BY AN ELECTRONIC SIMULATION SYSTEM
US8284234B2 (en) * 2009-03-20 2012-10-09 Absolute Imaging LLC Endoscopic imaging using reflection holographic optical element for autostereoscopic 3-D viewing
KR101629479B1 (en) * 2009-11-04 2016-06-10 삼성전자주식회사 High density multi-view display system and method based on the active sub-pixel rendering

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090002483A1 (en) * 2004-03-02 2009-01-01 Kabushiki Kaisha Toshiba Apparatus for and method of generating image, and computer program product
US20060236251A1 (en) * 2005-04-19 2006-10-19 Takashi Kataoka Apparatus with thumbnail display
US20090077504A1 (en) * 2007-09-14 2009-03-19 Matthew Bell Processing of Gesture-Based User Interactions
US20090237763A1 (en) * 2008-03-18 2009-09-24 Kramer Kwindla H User Interaction with Holographic Images

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120293637A1 (en) * 2011-05-20 2012-11-22 Echostar Technologies L.L.C. Bufferless 3D On Screen Display
US20130251241A1 (en) * 2012-03-20 2013-09-26 Dolby Laboratories Licensing Corporation Applying Perceptually Correct 3D Film Noise
US9031356B2 (en) * 2012-03-20 2015-05-12 Dolby Laboratories Licensing Corporation Applying perceptually correct 3D film noise
CN109003398A (en) * 2012-06-14 2018-12-14 百利游戏技术有限公司 System and method for augmented reality game
US10120078B2 (en) 2012-12-19 2018-11-06 Basf Se Detector having a transversal optical sensor and a longitudinal optical sensor
US10845459B2 (en) 2013-06-13 2020-11-24 Basf Se Detector for optically detecting at least one object
US10823818B2 (en) 2013-06-13 2020-11-03 Basf Se Detector for optically detecting at least one object
US9741954B2 (en) 2013-06-13 2017-08-22 Basf Se Optical detector and method for manufacturing the same
US9829564B2 (en) 2013-06-13 2017-11-28 Basf Se Detector for optically detecting at least one longitudinal coordinate of one object by determining a number of illuminated pixels
US10353049B2 (en) 2013-06-13 2019-07-16 Basf Se Detector for optically detecting an orientation of at least one object
US9989623B2 (en) 2013-06-13 2018-06-05 Basf Se Detector for determining a longitudinal coordinate of an object via an intensity distribution of illuminated pixels
US9557856B2 (en) 2013-08-19 2017-01-31 Basf Se Optical detector
US9665182B2 (en) 2013-08-19 2017-05-30 Basf Se Detector for determining a position of at least one object
US9958535B2 (en) 2013-08-19 2018-05-01 Basf Se Detector for determining a position of at least one object
US10012532B2 (en) 2013-08-19 2018-07-03 Basf Se Optical detector
US20150189256A1 (en) * 2013-12-16 2015-07-02 Christian Stroetmann Autostereoscopic multi-layer display and control approaches
CN105980812A (en) * 2013-12-18 2016-09-28 巴斯夫欧洲公司 Target device for use in optical detection of an object
WO2015091607A1 (en) * 2013-12-18 2015-06-25 Basf Se Target device for use in optical detection of an object
US11041718B2 (en) 2014-07-08 2021-06-22 Basf Se Detector for determining a position of at least one object
US10094927B2 (en) 2014-09-29 2018-10-09 Basf Se Detector for optically determining a position of at least one object
US11125880B2 (en) 2014-12-09 2021-09-21 Basf Se Optical detector
US10775505B2 (en) 2015-01-30 2020-09-15 Trinamix Gmbh Detector for an optical detection of at least one object
US20180033195A1 (en) * 2015-02-27 2018-02-01 Sony Corporation Information processing apparatus, information processing method, and program
CN107408003A (en) * 2015-02-27 2017-11-28 索尼公司 Message processing device, information processing method and program
US10672187B2 (en) * 2015-02-27 2020-06-02 Sony Corporation Information processing apparatus and information processing method for displaying virtual objects in a virtual space corresponding to real objects
US10955936B2 (en) 2015-07-17 2021-03-23 Trinamix Gmbh Detector for optically detecting at least one object
US10412283B2 (en) 2015-09-14 2019-09-10 Trinamix Gmbh Dual aperture 3D camera and method using differing aperture areas
US11211513B2 (en) 2016-07-29 2021-12-28 Trinamix Gmbh Optical sensor and detector for an optical detection
US9727945B1 (en) * 2016-08-30 2017-08-08 Alex Simon Blaivas Construction and evolution of invariants to rotational and translational transformations for electronic visual image recognition
US10062144B2 (en) 2016-08-30 2018-08-28 Alex Simon Blaivas Construction and evolution of invariants to rotational and translational transformations for electronic visual image recognition
US9858638B1 (en) 2016-08-30 2018-01-02 Alex Simon Blaivas Construction and evolution of invariants to rotational and translational transformations for electronic visual image recognition
US10890491B2 (en) 2016-10-25 2021-01-12 Trinamix Gmbh Optical detector for an optical detection
US11428787B2 (en) 2016-10-25 2022-08-30 Trinamix Gmbh Detector for an optical detection of at least one object
US11635486B2 (en) 2016-11-17 2023-04-25 Trinamix Gmbh Detector for optically detecting at least one object
US11698435B2 (en) 2016-11-17 2023-07-11 Trinamix Gmbh Detector for optically detecting at least one object
US11860292B2 (en) 2016-11-17 2024-01-02 Trinamix Gmbh Detector and methods for authenticating at least one object
US10948567B2 (en) 2016-11-17 2021-03-16 Trinamix Gmbh Detector for optically detecting at least one object
US11415661B2 (en) 2016-11-17 2022-08-16 Trinamix Gmbh Detector for optically detecting at least one object
US11060922B2 (en) 2017-04-20 2021-07-13 Trinamix Gmbh Optical detector
US11067692B2 (en) 2017-06-26 2021-07-20 Trinamix Gmbh Detector for determining a position of at least one object
US10866658B2 (en) 2018-12-20 2020-12-15 Industrial Technology Research Institute Indicator device, mixed reality device and operation method thereof
US11442549B1 (en) * 2019-02-07 2022-09-13 Apple Inc. Placement of 3D effects based on 2D paintings
WO2021235232A1 (en) * 2020-05-18 2021-11-25 ソニーグループ株式会社 Information-processing device, information-processing method, and program
US11954244B2 (en) 2020-05-18 2024-04-09 Sony Group Corporation Information processing device and information processing method
WO2022221147A1 (en) * 2021-04-15 2022-10-20 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US20220343537A1 (en) * 2021-04-15 2022-10-27 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US11954886B2 (en) * 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects

Also Published As

Publication number Publication date
GB2481366B (en) 2014-05-28
GB201009249D0 (en) 2010-07-21
GB2481366A (en) 2011-12-28

Similar Documents

Publication Publication Date Title
US20110306413A1 (en) Entertainment device and entertainment methods
EP2662838B1 (en) Apparatus and method for augmented reality
US8750599B2 (en) Stereoscopic image processing method and apparatus
CN107852573B (en) Mixed reality social interactions
JP6526184B2 (en) Real-time lens aberration correction by eye tracking
US11010958B2 (en) Method and system for generating an image of a subject in a scene
US8970624B2 (en) Entertainment device, system, and method
JP2018523326A (en) Full spherical capture method
WO2017131970A1 (en) Methods and systems for navigation within virtual reality space using head mounted display
US20120086729A1 (en) Entertainment device, system, and method
JP2017532847A (en) 3D recording and playback
KR20160079794A (en) Mixed reality spotlight
JP6310898B2 (en) Image processing apparatus, information processing apparatus, and image processing method
KR102059732B1 (en) Digital video rendering
US20130040737A1 (en) Input device, system and method
US10803652B2 (en) Image generating apparatus, image generating method, and program for displaying fixation point objects in a virtual space
US20190295324A1 (en) Optimized content sharing interaction using a mixed reality environment
JP2012234411A (en) Image generation device, image generation system, image generation program and image generation method
US10796485B2 (en) Rendering objects in virtual views
JP6563592B2 (en) Display control apparatus, display control method, and program
GB2473263A (en) Augmented reality virtual image degraded based on quality of camera image
WO2010139984A1 (en) Device and method of display
TWI817335B (en) Stereoscopic image playback apparatus and method of generating stereoscopic images thereof
WO2010116171A1 (en) Transmission of video images modified based on stereoscopic video image acquisition

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY COMPUTER ENTERTAINMENT EUROPE LIMITED, UNITED

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BICKERSTAFF, IAN;HOCKING, IAN MICHAEL;KERSHAW, NIGEL;AND OTHERS;SIGNING DATES FROM 20110708 TO 20110728;REEL/FRAME:026819/0021

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION