WO2006100644A2 - Orientation and position adaptation for immersive experiences - Google Patents

Orientation and position adaptation for immersive experiences

Info

Publication number
WO2006100644A2
WO2006100644A2 (PCT/IB2006/050870)
Authority
WO
WIPO (PCT)
Prior art keywords
portable electronic
electronic device
processor
immersive
orientation
Application number
PCT/IB2006/050870
Other languages
French (fr)
Other versions
WO2006100644A3 (en)
Inventor
Hubertus M. R. Cortenraad
Anthonie H. Bergman
Original Assignee
Koninklijke Philips Electronics, N.V.
U.S. Philips Corporation
Application filed by Koninklijke Philips Electronics, N.V. and U.S. Philips Corporation
Publication of WO2006100644A2
Publication of WO2006100644A3

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/44 Receiver circuitry for the reception of television signals according to analogue transmission standards
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4131 Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615 Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508 Management of client data or end-user data
    • H04N21/4524 Management of client data or end-user data involving the geographical location of the client
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S3/00 Systems employing more than two channels, e.g. quadraphonic
    • H04S3/002 Non-adaptive circuits, e.g. manually adjustable or static, for enhancing the sound image or the spatial distribution
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/204 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/302 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device specially adapted for receiving control signals not targeted to a display device or game input means, e.g. vibrating driver's seat, scent dispenser
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30 Control circuits for electronic adaptation of the sound field
    • H04S7/302 Electronic adaptation of stereophonic sound system to listener position or orientation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30 Control circuits for electronic adaptation of the sound field
    • H04S7/302 Electronic adaptation of stereophonic sound system to listener position or orientation
    • H04S7/303 Tracking of listener position or orientation

Definitions

  • the invention generally relates to portable devices and the creation of immersive experiences.
  • an immersive experience for a device that has a fixed position in a room is facilitated by the setting.
  • the positions of the TV and the associated speakers, lights, etc. that provide the experience are fixed with respect to the viewers.
  • the TV is generally in front of the seats from which the viewers watch, and the speakers, lights, etc., are positioned around them.
  • the viewers face the TV, so the relative orientation between the viewers and the speakers, lights, etc. is also fixed.
  • a speaker located to the right of the viewer watching the TV remains to the right and can be programmed to output immersive audio effects to a viewer's right.
  • an "immersive effect" is generally an invocation of one or more senses of the user separate from the output (e.g., the delivery of content) of the portable electronic device presented locally to a user.
  • An immersive effect is generally correlated to the local output of the portable electronic device and is generally provided by an intelligent re-presentation, supplementation and/or enhancement of the local output of the portable electronic device that is scaled and presented to the user in the real-world frame of reference.
  • immersive effects are created by lights, speakers and other presentation devices.
  • An "immersive experience" presented to the user is the totality of one or more immersive effects.
  • Immersive effects generally serve to give the user the sensation of being immersed or physically involved and integrated in the local content output of the device.
  • an immersive experience will generally create sensory input separate from the gaming device itself that makes a player feel as if he/she is in the game, instead of just playing a game on a display.
  • the immersive effects and experience will similarly provide a heightened feeling of involvement and connection.
  • the invention comprises a method of providing an immersive experience for a portable electronic device. The method comprises controlling the output of one or more presentation devices in the vicinity of the portable electronic device as a function of an output of the portable electronic device and the orientation of the portable electronic device.
  • the output of the one or more presentation devices is controlled to provide an immersive experience for the output of the portable electronic device.
  • a system for providing an immersive experience for a portable electronic device which includes a processor that interfaces with the portable electronic device and one or more presentation devices in the vicinity of the portable electronic device.
  • the processor receives position signals that provide a determination of an orientation of the portable electronic device.
  • the processor also receives signals from the portable electronic device that relate to an immersive experience.
  • the processor controls the output of one or more presentation devices in the vicinity of the portable electronic device using the output of the portable electronic device and the orientation of the portable electronic device to provide an immersive experience for the user of the portable electronic device.
  • FIG. 1 is a representative depiction of a system in accordance with an embodiment of the invention.
  • FIG. 1a is a particular representative depiction of a system in accordance with an embodiment of the invention.
  • FIG. 2 is another particular representative depiction of a system in accordance with an embodiment of the invention.
  • FIG. 3 is a flow chart of a method in accordance with an embodiment of the invention.
  • Fig. 1 is a representative illustration of a system 10 in accordance with an embodiment of the invention.
  • the system of Fig. 1 comprises a portable electronic device (PED) 20, computer 30 and one or more presentation devices (PD) 40a-40N, where 40N reflects the number of presentation devices.
  • Computer 30 may be, for example, a server.
  • PED 20 is a portable device that may be carried by the user to different positions and utilized for its intended purpose.
  • the presentation devices may be speakers, lights, and so forth, that generally remain associated with a room or other physical space.
  • while presentation devices may be moved to other positions within the space, they typically remain in the same position for extended periods of time.
  • presentation devices generally remain in fixed locations and orientations while PED 20 may have different orientations and/or move to different locations as a user carrying PED 20 moves in the space.
  • PED 20 generally can be any portable electronic device that provides content locally to a user. Examples include portable gaming devices (e.g., Game Boy), cellular phones, portable DVD players, MP3 and other personal audio players, laptop PCs and PDAs. Many modern PEDs 20 include an internal (and/or associated) processor 21 and memory 22.
  • Presentation devices generally include any controllable device that presents a sensation to a user. Examples include audio speakers and video displays. Other visual devices that are readily recognized as presentation devices include, for example, video projectors and lights. Many devices other than speakers can produce audio effects, for example, alarms, a clock ticking, and even pyrotechnic devices. Presentation devices also include many devices that invoke a tactile sensation, for example, fans, misting and fog generators, seat vibrators (rumblers), ultrasonic generators, and the like. Tactile devices may also include temperature control devices, for example, air conditioning and heaters. Presentation devices also include devices that generate an olfactory sensation, for example, a misting or smoke device that is controllable to present odors. Similarly, presentation devices also include devices that invoke a taste sensation. For example, misting or smoke devices may also be used to invoke tastes. Other specific presentation devices are readily identifiable.
  • Also shown in Fig. 1 is computer 30 interposed between PED 20 and presentation devices 40a-40N. Communication interface 25 links PED 20 and computer 30. Computer 30 is also linked to each presentation device 40a-40N via a respective communication link 35a-35N as shown in Fig. 1. Presentation devices 40a-40N also include appropriate control electronics that allow their output to be controlled and adjusted by computer 30 over the respective communication line 35a-35N. Also shown in Fig. 1 is positioning system 50, which also has a communication interface 55 with computer 30. Depending on the embodiment, interfaces 25, 35, 55 may be wireless or wired, as described in more detail below.
  • PED 20 transmits data regarding the immersive experience (or the required immersive effects to provide the immersive experience) over communication interface 25 to computer 30.
  • Communication interface 25 may also be used to identify PED 20 to computer 30, for example, when PED 20 is moved by a user into the space served by computer 30, or when the user turns PED 20 on.
  • the orientation of PED 20 is communicated from positioning system 50 to computer 30 over communication interface 55.
  • Positioning system 50 may also be used to identify PED 20 to computer 30 in lieu of (or in addition to) PED 20 identifying itself to computer 30 over interface 25 as noted above.
  • Positioning system 50 may be separate, or it may be incorporated in whole or in part in computer 30 and/or PED 20.
  • Computer 30 (generally using one or more internal processors 31 and attendant memory 32) is configured for the processing hereinafter described, for example, via programming provided by appropriate software stored in memory, firmware, or other programming sources.
  • Computer 30 may first process the immersive effects data received and/or the data received from positioning system 50.
  • Computer 30 utilizes the effects data received and the orientation data received to drive one or more of the presentation devices 40a-40N to create immersive effects for the user appropriate for the orientation of the PED 20 in the space.
  • Fig. 1a depicts a particular system supported by the components shown in Fig. 1 and described above.
  • the various components are located in space, shown to be a room 70.
  • the perspective of the room 70 is looking down from above.
  • Presentation devices 40a-40h reside along the perimeter of the room 70.
  • presentation devices 40a, 40c, 40e, 40g are speakers S1-S4, respectively, positioned in the four corners of the room 70 and oriented toward the center of room 70.
  • Presentation devices 40b, 40d, 40h are lights L1-L3, respectively, each located approximately halfway along three of the walls of the room 70.
  • Presentation device 40f is a television TV located along the fourth wall of the room 70 facing toward the center of room 70.
  • presentation devices 40a-40h are associated with the room 70 and once placed generally stay in the same location and/or orientation for extended periods of time (e.g., a room in a home is typically rearranged and left for months or even years having the same arrangement). In any event, presentation devices will in the overwhelming majority of cases remain fixed for the time interval a PED 20 is being used in the room 70.
  • computer 30 associated with room 70 is shown in Fig. 1a as being adjacent TV 40f.
  • Computer 30 may be located elsewhere, and need not be physically located in room 70.
  • User 60 is shown at a position in room 70 facing in a certain direction and engaging PED 20.
  • the orientation and position of PED 20 will be focused on in the ensuing description. Because the user 60 is typically closely engaged in using the PED 20, it is generally acceptable to provide the immersive effects based on the orientation of PED 20 and, where applicable, its position. However, as described further below, the effects can be readily adjusted to the user 60 when desired or necessary (for example, to the user's head).
  • the interfaces 35a-35h between computer 30 and the respective presentation devices 40a-40h are omitted for clarity in Fig. 1a, but will typically be wired or wireless connections.
  • determination of the current orientation of PED 20 alone is sufficient in this embodiment to provide the spatial relationships used in selecting presentation device(s) to output an immersive effect(s).
  • orientation of the PED 20 provides a general indication of the relative position between the PED and the presentation devices that is referenced to the orientation of PED 20, and may be used for purposes of the immersive effects.
  • PED 20 includes an internal compass 20a that indicates to PED 20's internal processor in which direction the PED 20 is facing.
  • the orientation of the PED 20 is taken as the direction the user 60 will be facing when engaging PED 20. For example, based on north being the direction as shown in Fig. 1a, PED 20 is shown facing south in room 70. (Thus, in this example, PED 20 incorporates the substance of positioning system 50 as internal compass 20a, which determines PED 20 orientation.)
  • PED 20 transmits a signal to computer 30 indicating the PED 20 orientation (in this case, facing south).
  • Communication interface 25 between PED 20 and computer 30 in Fig. 1a is a wireless link (e.g., Bluetooth, infrared, RF, or the like) and is thus not shown in Fig. 1a.
  • Computer 30 in turn determines the general relative positions of presentation devices 40a-40h referenced to the orientation of PED 20. For example, computer 30 has pre-stored in memory the types and general positions of the various presentation devices 40a-40h in room 70 based on north being as shown in Fig. 1a.
  • speaker 40a is located in the northwest corner
  • light 40b is along the north wall
  • speaker 40c is in the northeast corner
  • light 40d is along the east wall
  • speaker 40e is in the southeast corner
  • TV 40f is along the south wall
  • speaker 40g is in the southwest corner
  • light 40h is along the west wall.
  • speaker 40a is also determined to be to the right of the user 60 (i.e., to the northwest of a user 60 facing south) and speaker 40c is also determined to be to the left of user 60.
  • the general positions of the other presentation devices 40d-40h relative to the PED 20 and user 60 having a particular orientation are likewise readily determined by computer 30.
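  • By way of illustration only, the relative-position step just described might look like the following minimal Python sketch; the device table, the function name relative_sector, and the 90-degree sector boundaries are assumptions for this sketch, not part of the disclosure.

```python
# Hypothetical sketch of computer 30's relative-position step in Fig. 1a.
# Compass bearings (degrees clockwise from north) of each presentation
# device as seen from the centre of room 70, pre-stored in memory 32.
DEVICES = {
    "speaker_40a": 315,  # northwest corner
    "light_40b":     0,  # north wall
    "speaker_40c":  45,  # northeast corner
    "light_40d":    90,  # east wall
    "speaker_40e": 135,  # southeast corner
    "tv_40f":      180,  # south wall
    "speaker_40g": 225,  # southwest corner
    "light_40h":   270,  # west wall
}

def relative_sector(device_bearing: float, ped_heading: float) -> str:
    """Coarsely classify a device as front/right/behind/left of a PED
    whose compass 20a reports ped_heading (degrees clockwise from north)."""
    angle = (device_bearing - ped_heading) % 360
    if angle < 45 or angle >= 315:
        return "front"
    if angle < 135:
        return "right"
    if angle < 225:
        return "behind"
    return "left"

# PED 20 facing south (heading 180): light 40b lands behind the user,
# TV 40f in front, light 40d to the left, light 40h to the right.
for name, bearing in DEVICES.items():
    print(name, relative_sector(bearing, 180.0))
```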
  • PED 20 also supplies the data or signal giving the immersive effect required that corresponds to the current local output at PED 20. This is sent to computer 30 over communication interface 25.
  • PED 20 in this embodiment provides the substance of the immersive effects needed for the immersive experience.
  • Computer 30 may provide certain electronic processing of the signal (e.g., conversion, amplification).
  • Computer 30 utilizes the general relative positions of the presentation devices 40a-40h with respect to PED 20 for its current orientation, as well as the types of presentation devices 40a-40h, to output the appropriate immersive effects.
  • PED 20 in Fig. 1a may be a portable gaming device on whose display the avatar is entering a dark cave with a waterfall behind him.
  • the corresponding immersive effects may thus be to 1) provide lighting from the rear, 2) dim lighting to the sides and front, and 3) provide a low bass rumble from the rear. (Nos. 1 and 2 would both provide an immersive effect of entering the dark cave on the PED display, and no. 3 would provide an immersive effect of the waterfall behind the avatar.)
  • PED 20 transmits this data and/or signal to computer 30 via interface 25.
  • computer 30 raises the light output of light 40b behind user 60 and lowers the other lights 40d, 40h via interfaces 35b, 35d and 35h, respectively.
  • TV 40f is turned off and speakers 40g, 40e are also lowered, if necessary.
  • computer 30 engages speakers 40a, 40c behind user 60 to provide the low rumble of the waterfall.
  • the corresponding immersive effects change to bright light and a rumble in front of the PED 20.
  • computer 30 dims light 40b and lowers speakers 40a, 40c, and engages speakers 40g and 40e known to be in front of user 60 to provide the rumble of the waterfall.
  • Computer 30 in this example knows that side lights 40d and 40h are already dimmed.
  • TV 40f may be driven by computer 30 to output a bright blue color, for an immersive effect of exiting the cave into bright light.
  • PED 20 sends an updated orientation signal ("east") to computer 30.
  • Computer 30 re-determines the general relative positions between the presentation devices 40a-40h and the PED 20 referenced to the new east orientation of the PED 20 in the manner described above.
  • speakers 40c and 40e and light 40d are determined to be in front of user 60.
  • computer 30 raises light 40d and engages speakers 40c, 40e to provide the waterfall effect.
  • Computer 30 also dims the TV 40f output (now to the right side of user 60) and lowers speaker 40g output (now behind and to the right of user 60).
  • presentation devices are all configured or oriented in Fig. 1a to project output from the perimeter into the room 70.
  • speakers 40a, 40c, 40e, 40g face toward the center of the room 70, thus projecting their output most broadly throughout the room 70.
  • lights 40b, 40d and 40h may project in all directions, also projecting their output broadly throughout the room 70.
  • TV 40f is visible and projects throughout the room 70.
  • presentation devices in Fig. 1a generally direct their output at a PED 20 located in the room 70, so orientation of the presentation device is not used as a factor in selection.
  • PED 20 provides the substantive content of the immersive effect to computer 30 over interface 25.
  • computer 30 may include software that generates the substance of the immersive effect.
  • the local output that PED 20 outputs to user 60 may also be transmitted in whole or part to computer 30 over interface 25, and computer 30 may generate the substance of the needed immersive effects and control presentation devices accordingly.
  • data regarding the content (audio, video output) of the game being output by PED 20 to the user 60 may also be transmitted to the computer 30, which analyzes the content and generates appropriate immersive effects.
  • the local output of PED 20 to the user 60 in the above description is generated within PED 20 itself (through internal memory and processing).
  • computer 30 may provide the content and other substantive output of PED 20 over interface 25.
  • a laptop or a detachable computer monitor such as the Philips DesXcape may serve as the PED in such a configuration. Such a monitor may be used, for example, to present a photo slide show from a user's collection with attendant immersive effects created by the user.
  • the interface 25 between computer 30 and PED 20 may also provide the user with a menu to select games, other content, or the like, that may be run on PED 20 through interface 25.
  • orientation of PED 20 is supplied by the internal compass within PED 20.
  • Many other systems and techniques may be utilized as positioning system 50 to determine the orientation of PED 20.
  • Many available optical recognition techniques and/or systems may be used.
  • the system may be configured to recognize the contour of those PEDs that are supported by the system, and to determine the orientation of the recognized PED.
  • Orientation of the PED may be determined by determining the orientation of the device itself (using images from multiple cameras or an overhead camera, if necessary), and/or by determining the direction a user is facing through face recognition.
  • the system may also provide initial recognition of a PED 20, for example, through image recognition as noted above, signaling protocols transmitted along communication interface 25, or another interface and/or technique. If interface 25 and RF signaling are used, any one of a multitude of available signaling protocols may be utilized to announce the presence of PED 20.
  • the locations of presentation devices 40a-40h may be manually input to computer 30 or otherwise determined and, if necessary, transmitted to computer 30. (Certain techniques and components used in locating the presentation devices are described with respect to the embodiment of Fig. 2 below.)
  • Recognition and communication setup with PED 20 may occur, for example, when a user 60 enters a room, or when a user 60 in the room turns on the PED 20. Immersive effects may occur immediately thereafter, or they may be engaged when a user 60 provides an input at the PED 20.
  • orientation of PED 20 is used to determine a non-quantitative, general indication of the relative positions of presentation devices 40a-40h. These relative positions of the presentation devices 40a-40h are referenced to the current orientation of PED 20. This technique is particularly suitable where the presentation devices lie along the perimeter of the space (e.g., the walls of the room) and tend to "surround" a user who tends to be located generally in the center of the room.
  • the presentation devices are also configured or oriented in Fig. 1a such that their output projects at least in part toward the center of the room 70 from their locations on the perimeter. In many cases, as in Fig. 1a, such an arrangement allows determination of the current orientation of PED 20 alone to be sufficient to provide the spatial relationships used in selecting presentation device(s). (Other factors may still be involved in the selection: in the above illustrative embodiment, presentation devices were also chosen based on type of device. For example, speakers were chosen to output audio effects and lights were chosen to output lighting effects.) It is preferable that presentation devices also be relatively uniformly distributed according to their type (e.g., lamps, speakers, display), as in Fig. 1a.
  • a coordinate position of the PED 20 is determined and utilized in addition to its orientation in providing the immersive effects. This serves to improve the immersive experience in many situations, for example, if the presentation devices are not distributed around the perimeter of the room in a relatively uniform manner and/or the user is located in the room such that the relative positions of the various types of presentation devices are relatively asymmetric.
  • the underlying system for this embodiment is also supported by components generally shown in Fig. 1 and described above.
  • positioning system 50 provides an orientation and coordinate position of the PED 20 in the space, which is provided to computer 30.
  • Computer 30 also receives, stores, or determines coordinate positions of the presentation devices in the room.
  • Computer 30 processes the coordinate data to quantitatively determine the relative positions of the presentation devices with respect to the current location of PED 20, and references the relative positions to the current orientation of PED 20. Computer 30 also receives or internally generates immersive effects data. Computer 30 uses the relative positions of the presentation devices as referenced to the current orientation of PED 20 as a factor in selecting one or more presentation devices to output appropriate immersive effects. The outputs of the selected presentation devices are then adjusted to provide the immersive effects proper for the current orientation and location of PED 20.
  • Fig. 2 is a representative depiction of a particular system that corresponds to such an embodiment.
  • user 60 is positioned to the left of center in a space (also shown to be a room 70) and facing down the page while engaging PED 20.
  • Various presentation devices 40i-40n are shown in non-uniform locations about the room 70, with some located away from the perimeter of the room 70.
  • light 40i is located along a wall behind and to the left of user 60, whereas light 40m is closer to user 60.
  • Speaker 40j is to the left of and behind the user's current position
  • speaker 40k is to the forward left of user 60.
  • TV 40l lies along the wall that user 60 is currently facing, also to the user's left.
  • speaker 40n is shown as being a few feet directly to the user's right.
  • Computer 30 and positioning system components 50 are represented as shown in Fig. 2. As noted above, computer 30 may be located in any location inside or outside the room 70, it may be integrated with other components (including one or more of the presentation devices), or be implemented using other configurations. In a manner analogous to that described for the embodiment of Fig. 1a, PED 20 supplies the substance of the immersive effects to computer 30 over interface 25.
  • Computer 30 interfaces with presentation devices 40i-40n to output the immersive effects over interfaces 35i-35n.
  • Computer 30 also pre-stores the coordinate locations of presentation devices 40i-40n.
  • Positioning system components 50 work in conjunction with computer 30 to provide the current orientation and coordinate location of PED 20. Although positioning system components 50 are represented adjacent computer 30, this is for simplicity of the depiction. Depending on the particular positioning system 50 used, its components may be decentralized and located throughout the room 70. Also, as noted above, some or all of positioning system components may reside in computer 30 (in particular, the processing components). Positioning system may be comprised of any one of many systems and techniques that may be configured to provide orientation and coordinates of PED 20. For example, many available image recognition techniques and/or systems and associated coordinate processing are well suited for detecting the presence of PED 20 in the room 70 and, once detected, for determining its orientation and coordinate location.
  • positioning system 50 will typically include one or more video cameras positioned in room 70 that provide images that are processed using attendant processing software.
  • orientation of the PED 20 may be determined using image processing techniques; for example, images captured by one or more cameras facing downward from the ceiling may be readily processed to determine the orientation of PED 20.
  • Face recognition techniques applied to detect frontal images of user 60 may also be used to determine the angle at which user 60 is rotated with respect to the axes of the original origin O.
  • Such a positioning system 50 is used to determine the orientation and x-y coordinates of PED 20 and transmit them to computer 30 (if the processing of system 50 is separately located).
  • Computer 30 uses the position of PED 20 and the known positions of presentation devices 40i-40n to determine the relative coordinate position of each presentation device 40i-40n with respect to PED 20. Determination of the relative positions of the presentation devices with respect to PED 20 will preferably include translating the origin to the current location of the PED 20. In addition, the relative positions are referenced to the current orientation of PED 20 preferably by also aligning the translated origin with the current detected orientation direction of the PED 20.
  • the origin is translated as shown to the position of PED 20, and oriented so that the positive y axis lies in the direction PED 20 is oriented.
  • Computer 30 also does a simple translation of the known positions of presentation devices 40i-40n in the original coordinate system O to determine the relative positions in the translated coordinate system O'. Because O' is also aligned with the current orientation of PED 20, the translated coordinates of presentation devices are referenced to the current orientation of PED 20.
  • light 40i may for example be determined to be at x-y location (-5, -8), indicating 5 feet to the left and 8 feet behind PED 20.
  • the other devices have the following coordinate locations: speaker 40j at (-6, -4); speaker 40k at (-8, 5); TV 40l at (-4, 8); speaker 40n at (2, 0) and light 40m at (-0.5, -0.5).
  • these coordinates provide the relative positions of each presentation device in a frame of reference for the current orientation of PED 20. (Namely, positive x is to the right of PED 20 as currently oriented, negative x to the left, positive y in front and negative y behind.)
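  • As a hedged sketch of the O to O' transform just described, the translation and alignment reduce to a few lines; the heading convention (clockwise from the room's +y axis) and the sample values are assumptions for illustration, not figures from the disclosure.

```python
import math

def to_ped_frame(device_xy, ped_xy, ped_heading_deg):
    """Translate a device position in room frame O to the PED's location,
    then rotate so the PED's facing direction becomes the positive y axis
    of O'. Heading is taken clockwise from the room's +y axis."""
    dx = device_xy[0] - ped_xy[0]
    dy = device_xy[1] - ped_xy[1]
    theta = math.radians(ped_heading_deg)
    x_rel = dx * math.cos(theta) - dy * math.sin(theta)   # right of PED
    y_rel = dx * math.sin(theta) + dy * math.cos(theta)   # ahead of PED
    return (x_rel, y_rel)

# A device two units east of a PED that faces east maps to directly ahead:
print(to_ped_frame((3.0, 0.0), (1.0, 0.0), 90.0))  # ~(0.0, 2.0)
```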
  • computer 30 also receives the substance of the current immersive effect from PED 20.
  • Computer 30 processes the immersive effect data received using the relative positions (as referenced to the current orientation of PED 20) and the types of presentation devices available to determine appropriate output by one or more of the presentation devices that best creates the immersive effect for the user 60 at the current position of PED 20.
  • the substance of the immersive effect received by computer 30 from PED 20 has various criteria (e.g., create diffuse backlighting).
  • Computer 30 includes a series of rules, which may be incorporated into decision trees, or the like, that provides for selection of presentation devices and adjustment of the output of the selected devices to reflect the required immersive effect for the user.
  • the factors used by computer 30 for selection of particular presentation devices for a required immersive effect will include the position and type of the presentation device.
  • In Fig. 2, if the immersive effect received calls for backlighting, computer 30 may first look for available lights behind PED 20 as currently positioned and oriented. Using the example coordinates given above, computer 30 knows from the y coordinates of presentation devices in O' that light 40i is located 8 feet behind user 60 and light 40m is located 0.5 feet behind user 60. If the immersive effect calls for a relatively evenly distributed backlight and/or a more diffuse backlight (such as an emerging sunrise), computer 30 may select light 40i since, although it is five feet to the left of user 60, it is farther behind user 60 and will provide a more even and diffuse backlight than light 40m.
  • if the immersive effect calls for a relatively intense and immediate backlight (e.g., providing an effect for a sudden emergence of a spotlight in a jailbreak game), the processing of computer 30 will likely select light 40m for output of the effect since it is closer behind user 60.
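  • The backlight-selection rule above can be sketched as follows; this is a non-authoritative illustration, and the device records, style labels and selection thresholds are assumptions.

```python
# Illustrative sketch of computer 30's backlight-selection rule.
LIGHTS = {  # positions in the PED frame O' (x right, y ahead), in feet
    "light_40i": (-5.0, -8.0),
    "light_40m": (-0.5, -0.5),
}

def pick_backlight(effect_style: str) -> str:
    """Choose a light behind the PED (y < 0) matching the effect style."""
    behind = {name: xy for name, xy in LIGHTS.items() if xy[1] < 0}
    if effect_style == "diffuse":
        # Farther behind the user -> a more even, diffuse wash.
        return min(behind, key=lambda n: behind[n][1])
    # "intense": closest light behind the user for an immediate effect.
    return max(behind, key=lambda n: behind[n][1])

print(pick_backlight("diffuse"))  # light_40i (8 ft behind)
print(pick_backlight("intense"))  # light_40m (0.5 ft behind)
```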
  • a similar example is illustrative of the processing that may typically take place for audio related to an immersive effect.
  • speakers 40j, 40k, and 40n are located asymmetrically in the room 70, and user 60 is likewise positioned at a relatively unbalanced point with respect to them.
  • computer 30 attempts to select and adjust the speaker output to accurately reflect a current immersive audio effect from PED 20. For example, if the user 60 is engaged in a game on PED 20 where a train is approaching from the right of his avatar, computer 30 may receive an immersive effect signal requiring an increasing audio output from the right of the PED 20.
  • Computer 30 looks for speakers to the right of PED 20 and determines (based again on the coordinates in O' described above) that speaker 40n is suitable, since it is 2 feet directly to the user's right.
  • the audio output of speaker 40n is adjusted by computer 30 based on the proximity of the train in the game (which is reflected in the immersive effect data received) and the 2-foot distance between speaker 40n and user 60.
  • computer 30 looks for speakers to the left of the current position of user 60 and finds speakers 40j, 40k at locations (-6, -4) and (-8, 5).
  • the y coordinates indicate to computer 30 that neither is directly to the left.
  • computer 30 determines from the y coordinates that speaker 40k is in front of user 60 and speaker 40j is behind.
  • computer 30 selects both speakers and uses the relative positions in O' to balance the audio output of each to provide the sound of a train emerging from the immediate left of user 60.
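  • A minimal sketch of that balancing step is given below, assuming a simple inverse-distance weighting toward the y = 0 line; a real system would apply a proper panning law, so the weighting scheme here is purely an assumption.

```python
def balance_pair(spk_front, spk_back, level):
    """spk_front and spk_back are (x, y) positions in the PED frame O';
    returns per-speaker gains so the phantom source sits at the user's
    immediate left, between the two speakers. Assumes neither speaker is
    exactly at y = 0 (that speaker alone would then be selected)."""
    # Weight each speaker inversely to its distance from the y = 0 line.
    wf = 1.0 / abs(spk_front[1])
    wb = 1.0 / abs(spk_back[1])
    total = wf + wb
    return level * wf / total, level * wb / total

# Speakers 40k at (-8, 5) and 40j at (-6, -4), as in the example above:
gain_40k, gain_40j = balance_pair((-8.0, 5.0), (-6.0, -4.0), level=1.0)
print(f"speaker 40k gain {gain_40k:.2f}, speaker 40j gain {gain_40j:.2f}")
```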
  • Many immersive audio effects may be adequately output by computer 30 via speakers 40j, 40k and 40n positioned as in Fig. 2.
  • a general surrounding background noise effect (representing, for example, the wind in a hurricane on the PED 20) may also be represented to the user 60 by the computer 30 based on appropriate balancing of the audio output of each speaker given the relative positions.
  • certain immersive effects required may not be adequately output by the available presentation devices for the current orientation of PED 20.
  • the rules followed by the processing of computer 30 will therefore also determine when to forego an immersive effect. For example, if the immersive effect signal calls for a flash of light to the user's right in Fig. 2, computer 30 will simply not find a light anywhere to the user's right when oriented as shown.
  • While computer 30 may accept a light that is forward or rearward to the right of user 60 for the effect, it will generally not select lights such as 40m and 40i in Fig. 2 that are to the left of user 60, since lighting these lights will generally not correlate to the local output of the PED (e.g., a Game Boy display). Thus, computer 30 may elect not to output anything for this particular immersive effect. Similarly, if the immersive effect signal calls for a train approaching from directly in front of user 60, computer 30 may determine that speaker 40k is too far to the user's left to adequately present this effect.
  • computer 30 may determine it is adequately centered in front of user 60 to provide the effect.
  • computer 30 may determine that an immersive effect skewed to the left of user 60 from speaker 40k may be acceptable.
  • the embodiment of Fig. 2 may support many of the variations described above for the embodiment of Fig. 1a.
  • computer 30 may alternatively generate the substance of the immersive effect based on the current output of PED 20 to user 60.
  • PED 20 is a game device
  • PED 20 may supply some or all of the game content to computer 30 via interface 25.
  • Computer 30 may process the game content and generate the substance of an immersive effect that corresponds to the current game content.
  • Computer 30 selects and controls presentation devices to output the appropriate immersive effect, as described above.
  • computer 30 may provide the content and other substantive output to PED 20 over interface 25.
  • PED 20 can emit a signature RF, ultrasound, or other signal that is captured by a number of corresponding detectors situated around the room 70. Processing the received signals using standard triangulation techniques based on time-of-flight, phase-change, or the like, will give the PED 20 position. Line of sight detectors can be used to determine orientation.
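  • One possible reduction of that triangulation to code is sketched below; the three-detector layout, the ultrasonic speed assumption, and the linearized least-squares reduction are illustrative choices, not prescribed by the disclosure.

```python
# Sketch of time-of-flight positioning with three detectors at known
# positions and a signature ultrasonic signal (speed of sound ~343 m/s).
SPEED = 343.0  # m/s

def locate(detectors, flight_times):
    """Solve for (x, y) from ranges to three detectors by subtracting the
    first circle equation from the others (standard multilateration)."""
    (x0, y0), *rest = detectors
    r = [t * SPEED for t in flight_times]
    a, b = [], []
    for (xi, yi), ri in zip(rest, r[1:]):
        # Each remaining detector yields one linear equation in x and y:
        # 2(xi-x0)x + 2(yi-y0)y = r0^2 - ri^2 + xi^2 - x0^2 + yi^2 - y0^2
        a.append((2 * (xi - x0), 2 * (yi - y0)))
        b.append(r[0] ** 2 - ri ** 2 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2)
    (a11, a12), (a21, a22) = a
    det = a11 * a22 - a12 * a21          # 2x2 solve by Cramer's rule
    return ((b[0] * a22 - a12 * b[1]) / det,
            (a11 * b[1] - b[0] * a21) / det)

# Example: detectors at three corners, PED actually at (3, 4):
dets = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
times = [5.0 / SPEED, 65.0 ** 0.5 / SPEED, 45.0 ** 0.5 / SPEED]
print(locate(dets, times))  # ~(3.0, 4.0)
```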
  • PED 20 may incorporate an internal navigation system comprised of accelerometers configured to determine orientation and position in two or three dimensions based on acceleration (including rotation) attendant to movement of the PED 20. (In that case, PED 20 of Fig. 2 has an internal positioning system 20a (not shown).)
  • two offset accelerometers in each measured dimension may be used to detect both translation and rotation.
  • gyroscopes comprising an inertial navigation system, or one or more compasses, may be used to detect orientation.
  • Such a positioning system may also periodically correct its coordinates when at a known position, for example, when placed in a recharging cradle having known fixed coordinates. It is generally desirable that the position of PED 20 be determined to an accuracy of 30-50 cm.
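  • A bare-bones sketch of such an internal navigation system with cradle correction follows; the sampling interface, axis conventions and drift handling are assumptions made only for illustration.

```python
# Hypothetical dead-reckoning tracker: integrate acceleration twice per
# axis and re-zero whenever the PED sits in its recharging cradle.
class InertialTracker:
    def __init__(self, cradle_xy=(0.0, 0.0)):
        self.pos = list(cradle_xy)
        self.vel = [0.0, 0.0]

    def step(self, accel_xy, dt):
        """Integrate one accelerometer sample; the sample is assumed to be
        already rotated into room coordinates by a gyro or compass."""
        for i in range(2):
            self.vel[i] += accel_xy[i] * dt
            self.pos[i] += self.vel[i] * dt

    def reset_in_cradle(self, cradle_xy):
        # Periodic correction at a known fixed position bounds the drift,
        # which helps keep the 30-50 cm accuracy target mentioned above.
        self.pos = list(cradle_xy)
        self.vel = [0.0, 0.0]
```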
  • the image recognition positioning system relied on determination of positions in two dimensions, namely, the plane of the floor of the room 70.
  • the positioning system 50 may determine the position of PED 20 in three dimensions.
  • the image recognition positioning system described above may determine the position of PED 20 in three dimensions.
  • Computer 30 may also store the positions of the presentation devices in three dimensions.
  • computer 30 may determine the relative coordinate positions of the presentation devices 40 with respect to the PED 20 in three dimensions in a manner analogous to that described above for two dimensions. For Fig. 2, this would effectively include the z coordinate, namely, the height above the floor in coordinate system O, or above PED 20 in O'.
  • the translated coordinate system O' is still aligned with the orientation of PED 20, although the orientation is generally limited to the horizontal plane (that is, tilt angle of PED 20 is ignored).
  • Having a third dimension available to computer 30 in creating immersive effects may be helpful in many situations. For example, if an immersive effect requires high backlighting, a third dimension will give computer 30 the ability to choose a ceiling lamp over a wall-mounted lamp located behind user 60.
  • determination of the location and orientation of PED 20 in the above description is used as a surrogate location for providing immersive effects to the user 60, since it is likely that user 60 is closely engaged with PED 20.
  • the head of a user may be higher than an adjacent lamp, while the PED 20 he/she is holding is lower, and an immersive effect may call for overhead lighting.
  • computer 30 may engage the lamp to provide the effect. However, from the perspective of the user, that would provide a lighting effect from below.
  • the positioning systems and/or related processing described above may be adjusted to the user.
  • the system may be adapted to determine the location and orientation of a user's face (or other body part) adjacent a recognized PED 20.
  • an average position at which a user typically holds the particular type of PED 20 may be used to adjust for the user's position.
  • the orientation of the user may be determined from the orientation of the PED 20.
  • the positions of the various presentation devices may be pre-stored in computer 30 for the various embodiments.
  • Such pre-storage can involve manually determining the positions of the presentation devices with respect to an origin and inputting the coordinates to computer 30 via a user interface (keyboard, GUI, voice input, or the like). If and when presentation devices are rearranged, the new coordinates are manually input.
  • the system may include electronic components that determine the positions of the presentation devices.
  • a straightforward way of automatically determining location includes accelerometers associated with each presentation device, which generate data related to change of position based on acceleration, as described above for PED 20.
  • the interface 35 between presentation devices 40 and computer 30 may support two-way communication; thus, presentation devices 40 may update computer 30 with data reflecting their current locations, and computer 30 may use that data to determine the updated locations (if not determined locally at the presentation device).
  • the updated locations are subsequently used by computer 30 in determining the relative position with respect to current location of PED 20 (referenced to current PED orientation) for the immersive effect processing.
  • image processing techniques may also be readily applied to detect the contours of various speakers, lights and other presentation devices, and determine positions from the images.
  • a presentation device may emit a signature RF, ultrasound, or other signal that is captured by a number of corresponding detectors situated around the room. Processing the received signals using standard triangulation techniques based on time-of-flight, phase-change, or the like, will give the position of the presentation device.
  • Some or all processing aspects used for automatic determination of the locations of presentation devices may be incorporated into computer 30 or can reside in a separate controller or processor and memory (including positioning system 50 if separate). (For embodiments such as Fig. Ia that use only a relative position, such automatic determination is generally simpler.)
  • the direction of output of presentation devices themselves with respect to PED 20 is not included in the selection of a presentation device for an immersive effect. This generally assumes that the presentation devices project their output toward PED 20's (or user 60's) position. Such a presumption is acceptable in many cases, for example, where a presentation device projects in all directions, or is located along the perimeter of the room and projects into the room. However, orientation of a presentation device's output with respect to the location of the PED 20 may be an important factor to use in the selection of a presentation device for an immersive effect. For example, referring back to Fig. 2, light 40m may be a desk lamp that projects away from user 60 (e.g., in the negative y direction in O').
  • orientation or direction of the output of presentation devices may be included with position in computer 30 and used in selecting an appropriate presentation device for an immersive effect. Orientation of the presentation devices may be input manually to computer 30 or automatically detected and input to computer 30 using the illustrative techniques as discussed above.
  • computer 30 and PED 20 are separate components.
  • computer 30 and presentation devices 40 are programmed to provide immersive effects for a PED 20 in the space. Any space served by such a computer 30 that recognizes a particular PED 20 may immediately begin providing immersive effects. Also, as described above, computer 30 provides much of the processing attendant to the immersive experience, so PED 20 is not burdened with these tasks.
  • the above-described features of computer 30 may be substantially incorporated into PED 20, and computer 30 may be eliminated. In that case, the different interfaces 35 of Fig. 1 extend between PED 20 and the respective presentation device 40. Some or all of the interfaces 35 will often be wireless.
  • the interfaces may be wired at the presentation devices 40 and all run into a local wireless transmitter (e.g. Bluetooth) that communicates wirelessly with PED 20.
  • PED 20 is provided with the orientation data (and position data, where applicable) that computer 30 would otherwise have received.
  • presentation devices may store and provide their coordinates (e.g., in room 70 reference system O) and device type to PED 20.
  • PED 20 may be provided with input of its own coordinates and orientation in O (e.g., at a marked location and orientation in the room 70), which is subsequently used to keep track of current location and orientation via an internal navigation system (e.g., accelerometers).
  • a separate image recognition system can provide PED 20 with its position and orientation in O, and the positions and types of presentation devices.
  • FIG. 3 is a flow chart for a basic method in accordance with an embodiment of the invention.
  • immersive effect data is generated by a PED, and in block 110 it is received and processed by a computer. Such processing may include generating the substance of the immersive effect.
  • the computer selects at least one presentation device for output of at least one immersive effect, based at least in part on the orientation of the PED.
  • the computer controls the selected presentation device to provide the immersive effect.
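  • Pulling the Fig. 3 blocks together, a control loop of this kind might be organized as below; all object and method names (receive_effect, current_heading, drive) are hypothetical placeholders, and relative_sector is the helper sketched earlier.

```python
def immersive_loop(ped, computer, devices):
    """Hypothetical outer loop: receive effect data (block 110), select
    presentation devices using PED orientation, then drive the selection."""
    while ped.active():
        effect = computer.receive_effect(ped)     # generated by the PED
        heading = computer.current_heading(ped)   # from positioning system 50
        chosen = [d for d in devices
                  if d.kind == effect.kind
                  and relative_sector(d.bearing, heading) == effect.sector]
        if not chosen:
            continue  # forego effects the room cannot render, as noted above
        for d in chosen:
            computer.drive(d, effect.level)       # adjust the selected outputs
```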
  • While video game PEDs are the focus of the above embodiments, any PED for which a user may be provided with a corresponding immersive effect may be used with the techniques of the current invention. This includes, for example, cellular telephones, personal audio players and the like.
  • electronic books and other literary works that qualify as PEDs and which may be utilized in the context of this invention are described in the commonly owned co-filed U.S. provisional patent application entitled "Immersive Reading Experience Using Eye Tracking" by Hubertus M. R. Cortenraad.
  • processing performed in support of the invention may, for example, be carried out by multiple processors in multiple locations.
  • Interfaces shown for the various embodiments between various components can have multiple paths and types.
  • While the specific presentation devices shown for the embodiments of Figs. 1a and 2 are speakers, lights and a television, any other type of presentation device may serve the room or space.
  • the locations and/or relative locations of the pertinent types of presentation devices for an immersive effect may be determined after an immersive effect is received or generated that requires that type of presentation device.
  • the processing can be simplified.
  • In the embodiment of Fig. 1a, the computer can select a presentation device based solely on the relative positions of presentation devices determined using current PED orientation.
  • computer may select a presentation device based solely on the relative positions between PED and presentation devices, referenced to orientation of PED.

Abstract

A system (10) for providing an immersive effect for a portable electronic device (20). A processor (21, 31) interfaces with a portable electronic device (20) and one or more presentation devices (40) in the vicinity of the portable electronic device (20). The processor (21, 31) receives data that provides a determination of an orientation of the portable electronic device (20). The processor (21, 31) also receives data that relates to the local output of the portable electronic device (20). The processor (21, 31) controls the output of one or more presentation devices (40) in the vicinity of the portable electronic device (20) based on the orientation and output of the portable electronic device (20) to provide an immersive experience for the user (60) of the portable electronic device (20).

Description

ORIENTATION AND POSITION ADAPTATION FOR IMMERSIVE EXPERIENCES
The entire contents of the following two commonly owned U.S. provisional patent applications are hereby incorporated by reference herein: (1) U.S. provisional patent application ser. no. 60/665,025, entitled "Orientation and Position Adaptation for Immersive Experiences" by Hubertus M. R. Cortenraad and Anthonie H. Bergman, filed March 24, 2005; and (2) U.S. provisional patent application ser. no. 60/665,023, entitled "Immersive Reading Experience By Means Of Eye-Tracking" by Hubertus M. R. Cortenraad, filed March 24, 2005.
The invention generally relates to portable devices and the creation of immersive experiences.
Systems for delivering immersive experiences for traditional televisions have recently been developed. For example, the Philips Ambilight system has been developed for traditional home television sets that are at a fixed location in the home. Built-in lighting around the periphery of the television set is coordinated with the content on the television to give the viewer a more immersive experience. For example, if a soccer game is being viewed (where much of the television display is green grass), the peripheral lighting projects a green hue on the walls or other surfaces immediately surrounding the television, thus giving the viewer the feeling that the display is actually larger and more continuous. Many portable electronic devices exist. An example of a portable gaming device that provides audio and video output is the popular Game Boy by Nintendo. However, the Game Boy audio and video output is typically confined to the display and speakers integral to the device. Many portable electronic devices (such as the Game Boy, laptop PCs and cellular telephones) also provide a jack for headphones and/or extension speakers as an alternative to the built-in speaker.
Implementation of an immersive experience for a device that has a fixed position in a room (such as a TV) is facilitated by the setting. The positions of the TV and the associated speakers, lights, etc. that provide the experience are fixed with respect to the viewers. For example, the TV is generally in front of the seats from which the viewers watch and the speakers lights, etc., are positioned around them. In addition, the viewers face the TV, so the relative orientation between the viewers and the speakers, lights, etc. are also fixed. Thus, for example, a speaker located to the right of the viewer watching the TV remains to the right and can be programmed to output immersive audio effects to a viewer's right.
However, if one were to consider providing an immersive experience for a portable electronic device, the maneuverability of the device creates difficulties. Perhaps the greatest difficulty that would be confronted is that, due to its portability, the relative orientation can readily change between the user of the portable electronic device and the presentation devices that would provide the experience. For example, a user of a portable gaming device sitting in a room with a room speaker A to his left and room speaker B to his right may shift his chair or otherwise move so that speaker A is to his right and speaker B is to his left. Thus, permanent programming of speaker A and B to provide effects for one relative orientation of the user would provide incorrect experiences in many cases.
Among other things, the present invention provides immersive experiences for a portable electronic device. As used herein, an "immersive effect" is generally an invocation of one or more senses of the user separate from the output (e.g., the delivery of content) of the portable electronic device presented locally to a user. An immersive effect is generally correlated to the local output of the portable electronic device and is generally provided by an intelligent re-presentation, supplementation and/or enhancement of the local output of the portable electronic device that is scaled and presented to the user in the real- world frame of reference. As described further below, immersive effects are created by lights, speakers and other presentation devices. An "immersive experience" presented to the user is the totality of one or more immersive effects. Immersive effects (and the resulting immersive experience) generally serves to give the user the sensation of being immersed or physically involved and integrated in the local content output of the device. Thus, in the example of a game, an immersive experience will generally create sensory input separate from the gaming device itself that makes a player feel as if he/she is in the game, instead of just playing a game on a display. For other types of devices, the immersive effects and experience will similarly provide a heightened feeling of involvement and connection. In one of many possible embodiments, the invention comprises a method of providing an immersive experience for a portable electronic device. The method comprises controlling the output of one or more presentation devices in the vicinity of the portable electronic device as a function of an output of the portable electronic device and the orientation of the portable electronic device. In the method, the output of the one or more presentation devices is controlled to provide an immersive experience for the output of the portable electronic device. Another of many possible embodiments of the invention comprises a system for providing an immersive experience for a portable electronic device, which includes a processor that interfaces with the portable electronic device and one or more presentation devices in the vicinity of the portable electronic device. The processor receives position signals that provide a determination of an orientation of the portable electronic device. The processor also receives signals from the portable electronic device that relate to an immersive experience. The processor controls the output of one or more presentation devices in the vicinity of the portable electronic device using the output of the portable electronic device and the orientation of the portable electronic device to provide an immersive experience for the user of the portable electronic device.
FIG. 1 is a representative depiction of a system in accordance with an embodiment of the invention;
FIG. 1a is a particular representative depiction of a system in accordance with an embodiment of the invention;
FIG. 2 is another particular representative depiction of a system in accordance with an embodiment of the invention; and
FIG. 3 is a flow chart of a method in accordance with an embodiment of the invention.
Fig. 1 is a representative illustration of a system 10 in accordance with an embodiment of the invention. The system of Fig. 1 comprises a portable electronic device (PED) 20, computer 30 and one or more presentation devices (PD) 40a-40N, where 40N reflects the number of presentation devices. Computer 30 may be, for example, a server. PED 20 is a portable device that may be carried by the user to different positions and utilized for its intended purpose. As will be described in more detail through the use of Fig. 1a, the presentation devices may be speakers, lights, and so forth, that generally remain associated with a room or other physical space. Thus, while some or all presentation devices may be moved to other positions within the space, they typically remain in the same position for extended periods of time. In particular, presentation devices generally remain in fixed locations and orientations while PED 20 may have different orientations and/or move to different locations as a user carrying PED 20 moves in the space.
PED 20 generally can be any portable electronic device that provides content locally to a user. Examples include portable gaming devices (e.g., Game Boy), cellular phones, portable DVD players, MP3 and other personal audio players, laptop PCs and PDAs. Many modern PEDs 20 include an internal (and/or associated) processor 21 and memory 22.
Presentation devices generally include any controllable device that presents a sensation to a user. Examples include audio speakers and video displays. Other visual devices that are readily recognized as presentation devices include, for example, video projectors and lights. Many devices other than speakers can produce audio effects, for example, alarms, a clock ticking, and even pyrotechnic devices. Presentation devices also include many devices that invoke a tactile sensation, for example, fans, misting and fog generators, seat vibrators (rumblers), ultrasonic generators, and the like. Tactile devices may also include temperature control devices, for example, air conditioning and heaters. Presentation devices also include devices that generate an olfactory sensation, for example, a misting or smoke device that is controllable to present odors. Similarly, presentation devices also include devices that invoke a taste sensation. For example, misting or smoke devices may also be used to invoke tastes. Other specific presentation devices are readily identifiable.
Also shown in Fig. 1 is computer 30 interposed between PED 20 and presentation devices 40a-40N. Communication interface 25 links PED 20 and computer 30. Computer 30 is also linked to each presentation device 40a-40N via a respective communication link 35a-35N as shown in Fig. 1. Presentation devices 40a-40N also include appropriate control electronics that allow their output to be controlled and adjusted by computer 30 over the respective communication line 35a-35N. Also shown in Fig. 1 is positioning system 50, which also has a communication interface 55 with computer 30. Depending on the embodiment, interfaces 25, 35, 55 may be wireless or wired, as described in more detail below.
In general, PED 20 transmits data regarding the immersive experience (or the required immersive effects to provide the immersive experience) over communication interface 25 to computer 30. Communication interface 25 may also be used to identify PED 20 to computer 30, for example, when PED 20 is moved by a user into the space served by computer 30, or when the user turns on PED 20. In addition, the orientation of PED 20 is communicated from positioning system 50 to computer 30 over communication interface 55. Positioning system 50 may also be used to identify PED 20 to computer 30 in lieu of (or in addition to) PED 20 identifying itself to computer 30 over interface 25 as noted above. Positioning system 50 may be separate, or it may be incorporated in whole or in part in computer 30 and/or PED 20. Computer 30 (generally using one or more internal processors 31 and attendant memory 32) is configured for the processing hereinafter described, for example, via programming provided by appropriate software stored in memory, firmware, or other programming sources. Computer 30 may first process the immersive effects data received and/or the data received from positioning system 50. Computer 30 utilizes the effects data received and the orientation data received to drive one or more of the presentation devices 40a-40N to create immersive effects for the user appropriate for the orientation of the PED 20 in the space.
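Expressed in Python, the overall flow just described might look like the following minimal sketch. All names and the direction/intensity encoding are illustrative assumptions, not details taken from the embodiments below.

```python
# Illustrative sketch only: a minimal model of the data flow just described.
# Every name (PresentationDevice, ImmersiveEffect, apply_effect) is assumed.
from dataclasses import dataclass
from typing import List

@dataclass
class PresentationDevice:
    name: str        # e.g. "speaker S1"
    kind: str        # "speaker", "light", "tv", ...
    direction: str   # current direction relative to the PED: "front", "behind", ...
    level: float = 0.0

@dataclass
class ImmersiveEffect:
    device_kind: str   # type of device needed, e.g. "light"
    direction: str     # where the effect should come from, relative to the PED
    intensity: float   # 0.0 (off) to 1.0 (full output)

def apply_effect(effect: ImmersiveEffect,
                 devices: List[PresentationDevice]) -> List[PresentationDevice]:
    """Select devices by type and relative direction, then set their output."""
    selected = [d for d in devices
                if d.kind == effect.device_kind and d.direction == effect.direction]
    for d in selected:
        d.level = effect.intensity   # stands in for driving the real device
    return selected

# Example: a low bass rumble from behind the user engages only the rear speaker.
room = [PresentationDevice("S1", "speaker", "behind"),
        PresentationDevice("L1", "light", "behind"),
        PresentationDevice("S3", "speaker", "front")]
print([d.name for d in apply_effect(ImmersiveEffect("speaker", "behind", 0.4), room)])
```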
Fig. 1a is a particular system that is supported by components shown in Fig. 1 and described above. In Fig. 1a, the various components are located in a space, shown to be a room 70. The perspective of the room 70 is looking down from above. Thus, for example, the top of the head of user 60 is seen. Presentation devices 40a-40h reside along the perimeter of the room 70. In this case presentation devices 40a, 40c, 40e, 40g are speakers S1-S4, respectively, positioned in the four corners of the room 70 and oriented toward the center of room 70. Presentation devices 40b, 40d, 40h are lights L1-L3, respectively, each located approximately halfway along three of the walls of the room 70. Presentation device 40f is a television TV located along the fourth wall of the room 70 facing toward the center of room 70. As noted above, presentation devices 40a-40h are associated with the room 70 and once placed generally stay in the same location and/or orientation for extended periods of time (e.g., a room in a home is typically rearranged and left for months or even years having the same arrangement). In any event, presentation devices will in the overwhelming majority of cases remain fixed for the time interval a PED 20 is being used in the room 70.
In addition, computer 30 associated with room 70 is shown in Fig. 1a as being adjacent TV 40f. Computer 30 may be located elsewhere, and need not be physically located in room 70. User 60 is shown at a position in room 70 facing in a certain direction and engaging PED 20. For ease of presentation, the orientation and position of PED 20 will be focused on in the ensuing description. Because the user 60 is typically closely engaged in using the PED 20, it is generally acceptable to provide the immersive effects based on the orientation of PED 20 and, where applicable, its position. However, as described further below, the effects can be readily adjusted to the user 60 when desired or necessary (for example, to the user's head). Also, the interfaces 35a-35h between computer 30 and the respective presentation devices 40a-40h are omitted for clarity in Fig. 1a, but will typically be wired or wireless connections. Considering the arrangement of presentation devices 40a-40h in Fig. 1a, determination of the current orientation of PED 20 alone is sufficient in this embodiment to provide the spatial relationships used in selecting presentation device(s) to output an immersive effect(s). As described below, orientation of the PED 20 provides a general indication of the relative position between the PED and the presentation devices that is referenced to the orientation of PED 20, and may be used for purposes of the immersive effects. PED 20 includes an internal compass 20a that indicates to PED 20's internal processor in which direction the PED 20 is facing. Typically, the orientation of the PED 20 is taken as the direction the user 60 will be facing when engaging PED 20. For example, based on north being the direction as shown in Fig. 1a, PED 20 is shown facing south in room 70. (Thus, in this example, PED 20 incorporates the substance of positioning system 50 as internal compass 20a, which determines PED 20's orientation.)
PED 20 transmits a signal to computer 30 indicating the PED 20 orientation (in this case, facing south). Communication interface 25 between PED 20 and computer 30 in Fig. 1a is a wireless link (e.g., Bluetooth, infrared, RF, or the like) and is thus not shown in Fig. 1a. Computer 30 in turn determines the general relative positions of presentation devices 40a-40h referenced to the orientation of PED 20. For example, computer 30 has pre-stored in memory the types and general positions of the various presentation devices 40a-40h in room 70 based on north being as shown in Fig. 1a. For example, it is known to computer 30 that speaker 40a is located in the northwest corner, light 40b is along the north wall, speaker 40c is in the northeast corner, light 40d is along the east wall, speaker 40e is in the southeast corner, TV 40f is along the south wall, speaker 40g is in the southwest corner and light 40h is along the west wall. Using these pre-stored general positions of the presentation devices and the received orientation of the PED 20, computer 30 is able to determine basic relative positions between the PED 20 and the presentation devices 40a-40h that are referenced to the current orientation of PED 20. For example, for the PED 20 facing south, computer 30 determines that presentation devices 40a, 40b and 40c are behind PED 20, since they lie along the north wall. Likewise, speaker 40a is also determined to be to the right of the user 60 (i.e., to the northwest of a user 60 facing south) and speaker 40c is also determined to be to the left of user 60. The general positions of the other presentation devices 40d-40h relative to the PED 20 and user 60 having a particular orientation are likewise readily determined by computer 30. PED 20 also supplies the data or signal giving the immersive effect required that corresponds to the current local output at PED 20. This is sent to computer 30 over communication interface 25. Thus, PED 20 in this embodiment provides the substance of the immersive effects needed for the immersive experience. Computer 30 may provide certain electronic processing of the signal (e.g., conversion, amplification). Computer 30 utilizes the general relative positions of the presentation devices 40a-40h with respect to PED 20 for its current orientation, as well as the types of presentation devices 40a-40h, to output the appropriate immersive effects. For example, PED 20 in Fig. 1a may be a portable gaming device where on the PED display the avatar is entering a dark cave and there is also a waterfall behind him. The corresponding immersive effects may thus be to 1) provide lighting from the rear, 2) dim lighting to the sides and front, and 3) provide a low bass rumble from the rear. (Nos. 1 and 2 would both provide an immersive effect of entering the dark cave on the PED display, and no. 3 would provide an immersive effect for the position of the waterfall in the game.) PED 20 transmits this data and/or signal to computer 30 via interface 25. Using the current general relative positions of the presentation devices 40a-40h with respect to the PED 20 based on current PED orientation as described above, computer 30 raises the light output of light 40b behind user 60 and lowers the other lights 40d, 40h via interfaces 35b, 35d and 35h, respectively. (TV 40f is turned off and speakers 40g, 40e are also lowered, if necessary.) Likewise, computer 30 engages speakers 40a, 40c behind user 60 to provide the low rumble of the waterfall.
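The orientation-only determination used in this example can be pictured as a small lookup table. The following sketch mirrors the Fig. 1a layout; the sector names, direction labels and table structure are assumptions chosen for illustration.

```python
# Illustrative sketch of the orientation-only lookup; the wall assignments
# mirror Fig. 1a, but the labels are assumed conventions.
WALL_OF = {
    "speaker 40a": "NW", "light 40b": "N",   "speaker 40c": "NE",
    "light 40d":   "E",  "speaker 40e": "SE", "tv 40f":      "S",
    "speaker 40g": "SW", "light 40h":   "W",
}

# For a PED facing south, map each compass sector of the room onto a
# direction relative to the user; tables for other facings are analogous.
RELATIVE_WHEN_FACING_S = {
    "N": "behind", "NE": "behind-left", "NW": "behind-right",
    "E": "left",   "W": "right",
    "SE": "front-left", "SW": "front-right", "S": "front",
}

def devices_behind_when_facing_south():
    """All devices generally behind a south-facing PED (the north wall here)."""
    return [d for d, wall in WALL_OF.items()
            if RELATIVE_WHEN_FACING_S[wall].startswith("behind")]

print(devices_behind_when_facing_south())   # -> 40a, 40b and 40c
```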
Should user 60 cause his avatar in the game to turn around and start to walk out of the cave, then the corresponding immersive effects change to bright light and a rumble in front of the PED 20. For the user 60 oriented as in Fig. 1a, computer 30 dims light 40b and lowers speakers 40a, 40c, and engages speakers 40g and 40e known to be in front of user 60 to provide the rumble of the waterfall. (Computer 30 in this example knows that side lights 40d and 40h are already dimmed.) Likewise, TV 40f may be driven by computer 30 to output a bright blue color, for an immersive effect of exiting the cave into bright light. Now, should the user 60 turn to his left (facing east in the room 70), PED 20 sends an updated orientation signal ("east") to computer 30. Computer 30 re-determines the general relative positions between the presentation devices 40a-40h and the PED 20 referenced to the new east orientation of the PED 20 in the manner described above. In this case, speakers 40c and 40e and light 40d are determined to be in front of user 60. Thus, to provide the immersive effects of exiting the cave, computer 30 raises light 40d and engages speakers 40c, 40e to provide the waterfall effect. Computer 30 also dims the TV 40f output (now to the right side of user 60) and lowers speaker 40g output (now behind and to the right of user 60).
It is noted that presentation devices are all configured or oriented in Fig. 1a to project output from the perimeter into the room 70. For example, speakers 40a, 40c, 40e, 40g face toward the center of the room 70, thus projecting their output most broadly throughout the room 70. Similarly, lights 40b, 40d and 40h may project in all directions, also projecting their output broadly throughout the room 70. Likewise, TV 40f is visible and projects throughout the room 70. Thus, presentation devices in Fig. 1a generally direct their output at a PED 20 located in the room 70, so orientation of the presentation device is not used as a factor in selection.
For the embodiment of Fig. 1a, PED 20 provides the substantive content of the immersive effect to computer 30 over interface 25. Alternatively, computer 30 may include software that generates the substance of the immersive effect. For example, the local output that PED 20 presents to user 60 may also be transmitted in whole or in part to computer 30 over interface 25, and computer 30 may generate the substance of the needed immersive effects and control presentation devices accordingly. For example, for a PED 20 displaying a game, data regarding the content (audio, video output) of the game being output by PED 20 to the user 60 may also be transmitted to the computer 30, which analyzes the content and generates appropriate immersive effects. In addition, the local output of PED 20 to the user 60 in the above description is generated within PED 20 itself (through internal memory and processing). Alternatively, computer 30 (or a different computer) may provide the content and other substantive output of PED 20 over interface 25. (A laptop, or a computer with a detachable monitor like the Philips DesXcape, may provide the PED according to such a configuration. Such a monitor may be used, for example, to present a photo slide show from a user's collection with attendant immersive effects created by the user.) The interface 25 between computer 30 and PED 20 may also provide the user with a menu to select games, other content, or the like, that may be run on PED 20 through interface 25.
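As one hypothetical illustration of how computer 30 might generate effect substance from received content, an analysis in the spirit of ambient-lighting systems could average the border color of transmitted video frames; the frame format and the averaging method below are assumptions, as the text does not prescribe any particular analysis.

```python
# Illustrative sketch of deriving an effect from transmitted video content.
# The frame format (rows of (r, g, b) tuples) is an assumed representation.
def dominant_border_color(frame):
    """Average the RGB values along the outer edge of a video frame."""
    border = frame[0] + frame[-1] + [row[0] for row in frame] + [row[-1] for row in frame]
    n = len(border)
    return tuple(sum(px[i] for px in border) // n for i in range(3))

# The resulting color could then be used to drive lights in the room,
# extending the on-screen scene beyond the PED's display.
frame = [[(10, 80, 20)] * 4 for _ in range(3)]   # a small all-green test frame
print(dominant_border_color(frame))              # -> (10, 80, 20)
```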
Also in the above embodiment, orientation of PED 20 is supplied by an internal compass within PED 20. Many other systems and techniques may be utilized as positioning system 50 to determine the orientation of PED 20. For example, many available optical recognition techniques and/or systems may be used. The system may be configured to recognize the contour of those PEDs that are supported by the system, and to determine the orientation of the recognized PED. Orientation of the PED may be determined by determining the orientation of the device itself (using images from multiple cameras or an overhead camera, if necessary), and/or by determining the direction a user is facing through face recognition. (Once recognized, changes in the position and orientation of the PED and user may also be followed using available image tracking techniques.) Other alternatives may be used to detect the orientation of PED 20, such as the accelerometers, gyroscopes, external detectors and the like described with respect to the embodiment of Fig. 2 below. In addition, the system may also provide initial recognition of a PED 20, for example, through image recognition as noted above, signaling protocols transmitted along communication interface 25, or another interface and/or technique. If interface 25 uses RF signaling, any one of a multitude of available signaling protocols may be utilized to announce the presence of PED 20. In addition, the locations of presentation devices 40a-40h may be manually input to computer 30 or otherwise determined and, if necessary, transmitted to computer 30. (Certain techniques and components used in locating the presentation devices are described with respect to the embodiment of Fig. 2 below.)
Recognition and communication setup with PED 20 may occur, for example, when a user 60 enters a room, or when a user 60 in the room turns on the PED 20. Immersive effects may occur immediately thereafter, or they may be engaged when a user 60 provides an input at the PED 20. As noted, in the embodiment of Fig. 1a, orientation of PED 20 is used to determine a non-quantitative, general indication of the relative positions of presentation devices 40a-40h. These relative positions of the presentation devices 40a-40h are referenced to the current orientation of PED 20. This technique is particularly suitable where the presentation devices lie along the perimeter of the space (e.g., the walls of the room), and tend to "surround" a user who tends to be located generally in the center of the room. The presentation devices are also configured or oriented in Fig. 1a such that their output projects at least in part toward the center of the room 70 from their locations on the perimeter. In many cases, as in Fig. 1a, such an arrangement allows determination of the current orientation of PED 20 alone to be sufficient to provide the spatial relationships used in selecting presentation device(s). (Other factors may still be involved in the selection: in the above illustrative embodiment, presentation devices were also chosen based on type of device. For example, speakers were chosen to output audio effects and lights were chosen to output lighting effects.) It is preferable that presentation devices also be relatively uniformly distributed according to their type (e.g., lamps, speakers, displays), as in Fig. 1a.
In another embodiment of the invention, among other things, a coordinate position of the PED 20 is determined and utilized in addition to its orientation in providing the immersive effects. This serves to improve the immersive experience in many situations, for example, if the presentation devices are not distributed around the perimeter of the room in a relatively uniform manner and/or the user is located in the room such that the relative positions of the various types of presentation devices are relatively asymmetric. The underlying system for this embodiment is also supported by components generally shown in Fig. 1 and described above. In this case, positioning system 50 provides an orientation and coordinate position of the PED 20 in the space, which is provided to computer 30. Computer 30 also receives, stores, or determines coordinate positions of the presentation devices in the room. Computer 30 processes the coordinate data to quantitatively determine the relative positions of the presentation devices with respect to the current location of PED 20, and references the relative positions to the current orientation of PED 20. Computer 30 also receives or internally generates immersive effects data. Computer 30 uses the relative positions of the presentation devices as referenced to the current orientation of PED 20 as a factor in selecting one or more presentation devices to output appropriate immersive effects. The outputs of the selected presentation devices are then adjusted to provide the immersive effects proper for the current orientation and location of PED 20.
Fig. 2 is a representative depiction of a particular system that corresponds to such an embodiment. In Fig. 2, from the view from above as shown, user 60 is positioned to the left of center in a space (also shown to be a room 70) and facing down the page while engaging PED 20. Various presentation devices 40i-40n are shown in non-uniform locations about the room 70, with some located away from the perimeter of the room 70.
For example, light 40i is located along a wall behind and to the left of user 60, whereas light 40m is closer to user 60. Speaker 40j is to the left of and behind the user's current position, and speaker 40k is to the forward left of user 60. TV 40l lies along the wall that user 60 is currently facing, also to the user's left. Finally, speaker 40n is shown as being a few feet directly to the user's right.
Computer 30 and positioning system components 50 are represented as shown in Fig. 2. As noted above, computer 30 may be located in any location inside or outside the room 70, it may be integrated with other components (including one or more of the presentation devices), or be implemented using other configurations. In a manner analogous to that described for the embodiment of Fig. 1a, PED 20 supplies the content of the local PED 20 output to user 60 and supplies data and/or signals containing the substance of the corresponding immersive effect to computer 30 over a wireless interface 25. Computer 30 interfaces with presentation devices 40i-40n to output the immersive effects over interfaces 35i-35n. These interfaces are omitted in Fig. 2 for clarity, but will typically be a wired or conventional wireless connection.
Computer 30 also pre-stores the coordinate locations of presentation devices 40i-40n, which can be in two or three dimensions. For many immersive effects the positions of the presentation devices in two dimensions (corresponding to their location in the plane of the floor) will be adequate. The coordinates may be with respect to an origin such as origin O shown in the corner of the room 70 of Fig. 2.
Positioning system components 50 work in conjunction with computer 30 to provide the current orientation and coordinate location of PED 20. Although positioning system components 50 are represented adjacent computer 30, this is for simplicity of the depiction. Depending on the particular positioning system 50 used, its components may be decentralized and located throughout the room 70. Also, as noted above, some or all of the positioning system components may reside in computer 30 (in particular, the processing components). Positioning system 50 may be comprised of any one of many systems and techniques that may be configured to provide the orientation and coordinates of PED 20. For example, many available image recognition techniques and/or systems and associated coordinate processing are well suited for detecting the presence of PED 20 in the room 70 and, once detected, for determining its orientation and coordinate location. If image recognition is used, then positioning system 50 will typically include one or more video cameras positioned in room 70 that provide images that are processed using attendant processing software. As in the case of the embodiment of Fig. 1a, orientation of the PED 20 may be determined using image processing techniques; for example, images captured by one or more cameras facing downward from the ceiling may be readily processed to determine the orientation of the PED. Face recognition techniques applied to detect frontal images of user 60 (from images taken by wall-mounted cameras covering a room, for example) may also be used to determine the angle at which user 60 is rotated with respect to the axes of the original origin O. For coordinate position, a particular system that is readily adaptable to use available image recognition techniques to recognize the contours of PEDs, and that describes processing which may be used with the images to determine the recognized PED's coordinate position in two or three dimensions, is described in PCT Published International Application having International Publication No. WO 02/41664 A2, entitled "Automatically Adjusting Audio System" by M. Trajkovic et al., having International Publication Date 23 May 2002, the entire contents of which are hereby incorporated by reference herein. Rather than initially detecting the PED by recognizing its contour in the image, the cameras may detect in the image a beacon (e.g., IR) output by the PED. Once detected, the corresponding orientation and coordinate position of PED 20 can then be determined from the images.
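A rough sketch of such overhead-camera processing follows. It assumes the camera's image axes are aligned with the room's axes and that two points on the PED (front and back) can be detected; these assumptions, and the calibration constant, are illustrative only.

```python
# Illustrative sketch of turning overhead-camera detections into a pose.
import math

METERS_PER_PIXEL = 0.01   # from a one-time camera calibration (assumed)

def ped_pose(front_px, back_px):
    """Return ((x, y) room position, heading in degrees) for the PED.

    Inputs are pixel coordinates of two detected points on the device,
    seen from a ceiling camera whose image axes match the room's axes.
    """
    cx = (front_px[0] + back_px[0]) / 2 * METERS_PER_PIXEL
    cy = (front_px[1] + back_px[1]) / 2 * METERS_PER_PIXEL
    dx, dy = front_px[0] - back_px[0], front_px[1] - back_px[1]
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    return (cx, cy), heading

pose, heading = ped_pose((420, 310), (400, 310))
print(pose, heading)   # -> (4.1, 3.1) meters, facing 0 degrees (along +x)
```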
Such a positioning system 50 is used to determine the orientation and x-y coordinates of PED 20 and transmit them to computer 30 (if the processing of system 50 is separately located). Computer 30 uses the position of PED 20 and the known positions of presentation devices 40i-40n to determine the relative coordinate position of each presentation device 40i-40n with respect to PED 20. Determination of the relative positions of the presentation devices with respect to PED 20 will preferably include translating the origin to the current location of the PED 20. In addition, the relative positions are referenced to the current orientation of PED 20 preferably by also aligning the translated origin with the current detected orientation direction of the PED 20.
Thus, for example, in the depiction of Fig. 2, the origin is translated as shown to the position of PED 20, and oriented so that the positive y axis lies in the direction PED 20 is oriented. (This is depicted in Fig. 2 as translated origin O'.) Computer 30 also performs a simple transformation of the known positions of presentation devices 40i-40n in the original coordinate system O to determine the relative positions in the translated coordinate system O'. Because O' is also aligned with the current orientation of PED 20, the translated coordinates of presentation devices are referenced to the current orientation of PED 20. Thus, in the O' coordinate system, light 40i may for example be determined to be at x-y location (-5, -8), indicating 5 feet to the left and 8 feet behind PED 20. For the layout shown, it may also be determined that the other devices have the following coordinate locations: speaker 40j at (-6, -4); speaker 40k at (-8, 5); TV 40l at (-4, 8); speaker 40n at (2, 0) and light 40m at (-0.5, -0.5). As noted, these coordinates provide the relative positions of each presentation device in a frame of reference for the current orientation of PED 20. (Namely, positive x is to the right of PED 20 as currently oriented, negative x to the left, positive y in front and negative y behind.)
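The O-to-O' computation amounts to a translation followed by a rotation. A minimal sketch, assuming the PED's heading is measured clockwise from the room's positive y axis:

```python
# Illustrative sketch of the O -> O' computation: translate to the PED's
# position, then rotate so positive y points where the PED faces.
import math

def to_ped_frame(device_xy, ped_xy, ped_heading_deg):
    """Express a device's room coordinates in the PED-centered frame O'.

    In the result, positive x is to the PED's right and positive y is in
    front of it, matching the convention described in the text.
    """
    dx = device_xy[0] - ped_xy[0]         # translate the origin to the PED
    dy = device_xy[1] - ped_xy[1]
    a = math.radians(ped_heading_deg)     # rotate into the PED's heading
    rx = dx * math.cos(a) - dy * math.sin(a)
    ry = dx * math.sin(a) + dy * math.cos(a)
    return (round(rx, 2), round(ry, 2))

# A light 5 ft left of and 8 ft behind a PED at the room origin facing +y:
print(to_ped_frame((-5, -8), (0, 0), 0))    # -> (-5.0, -8.0), as for light 40i
```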
As noted, computer 30 also receives the substance of the current immersive effect from PED 20. Computer 30 processes the immersive effect data received using the relative positions (as referenced to the current orientation of PED 20) and types of presentation devices available to determine appropriate output by one or more of the presentation devices that best creates the immersive effect for the user 60 at the current position of PED 20. The substance of the immersive effect received by computer 30 from PED 20 has various criteria (e.g., create diffuse backlighting). Computer 30 includes a series of rules, which may be incorporated into decision trees, or the like, that provides for selection of presentation devices and adjustment of the output of the selected devices to reflect the required immersive effect for the user. The factors used by computer 30 for selection of particular presentation devices for a required immersive effect will include the position and type of the presentation device. Typically, such processing rules generally follow a certain logical decision making, but there is also typically a degree of design choice involved. For example, referring again to Fig. 2, if the immersive effect received calls for backlighting, computer 30 may first look for available lights behind PED 20 as currently positioned and oriented. Using the example coordinates given above, computer 30 knows from the y coordinates of presentation devices in O' that light 40i is located 8 feet behind user 60 and light 40m is located 0.5 feet behind user 60. If the immersive effect calls for a relatively evenly distributed backlight and/or a more diffuse backlight (such as an emerging sunrise), computer 30 may select light 40i since, although it is five feet to the left of user 60, it is further behind user 60 and will provide a more even and diffuse backlight than light 40m. On the other hand, if the immersive effect calls for a relatively intense and immediate backlight (e.g., providing an effect for a sudden emergence of a spotlight in a jailbreak game), the processing of computer 30 will likely select light 40m for output of the effect since it is closer behind user 60.
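Such rules might be coded along the following lines. The preference for the farthest light (diffuse) versus the nearest (intense) mirrors the example just given; everything else is an illustrative design choice:

```python
# Illustrative sketch of a backlight-selection rule over O' coordinates.
def pick_backlight(lights, diffuse):
    """Choose a backlight from (name, (x, y)) pairs in the O' frame.

    Negative y means behind the PED. A diffuse effect prefers the light
    farthest behind; an intense, immediate effect prefers the closest.
    """
    behind = [(name, xy) for name, xy in lights if xy[1] < 0]
    if not behind:
        return None                    # no suitable light: forego the effect
    y_of = lambda item: item[1][1]
    chosen = min(behind, key=y_of) if diffuse else max(behind, key=y_of)
    return chosen[0]

lights = [("light 40i", (-5, -8)), ("light 40m", (-0.5, -0.5))]
print(pick_backlight(lights, diffuse=True))    # -> light 40i (emerging sunrise)
print(pick_backlight(lights, diffuse=False))   # -> light 40m (sudden spotlight)
```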
A similar example is illustrative of the processing that may typically take place for audio related to an immersive effect. As seen in Fig. 2, speakers 40j, 40k, and 40n are located asymmetrically in the room 70, and user 60 is likewise positioned at a relatively unbalanced point with respect to them. Thus, computer 30 attempts to select and adjust the speaker output to accurately reflect a current immersive audio effect from PED 20. For example, if the user 60 is engaged in a game on PED 20 where a train is approaching from the right of his avatar, computer 30 may receive an immersive effect signal requiring an increasing audio output from the right of the PED 20. Computer 30 looks for speakers to the right of PED 20 and determines (based again on the coordinates in O' described above) that speaker 40n is suitable, since it is 2 feet directly to the user's right. The audio output of speaker 40n is adjusted by computer 30 based on the proximity of the train in the game (which is reflected in the immersive effect data received) and the 2-foot distance between speaker 40n and user 60.
If, on the other hand, the train is approaching from the left in the game, then computer 30 looks for speakers to the left of the current position of user 60 and finds speakers 40j, 40k at locations (-6, -4) and (-8, 5). The y coordinates indicate to computer 30 that neither is directly to the left. However, computer 30 determines from the y coordinates that one speaker, 40k, is in front of user 60 and one, 40j, is behind. Thus, computer 30 selects both speakers and uses the relative positions in O' to balance the audio output of each to provide the sound of a train emerging from the immediate left of user 60. Many immersive audio effects may be adequately output by computer 30 via speakers 40j, 40k and 40n positioned as in Fig. 2. A general surrounding background noise effect (representing, for example, the wind in a hurricane on the PED 20) may also be presented to the user 60 by the computer 30 based on appropriate balancing of the audio output of each speaker given the relative positions. On the other hand, certain immersive effects required may not be adequately output by the available presentation devices for the current orientation of PED 20. The rules followed by the processing of computer 30 will therefore also determine when to forego an immersive effect. For example, if the immersive effect signal calls for a flash of light to the user's right in Fig. 2, computer 30 will simply not find a light anywhere to the user's right when oriented as shown. While computer 30 may accept a light that is forward or rearward to the right of user 60 for the effect, it will generally not select lights such as 40m and 40i in Fig. 2 that are to the left of user 60, since lighting these lights will generally not correlate to the local output of the PED (e.g., a Game Boy display). Thus, computer 30 may elect not to output anything for this particular immersive effect. Similarly, if the immersive effect signal calls for a train approaching from directly in front of user 60, computer 30 may determine that speaker 40k is too far to the user's left to adequately present this effect. (It is noted that if a speaker is available in TV 40l, computer 30 may determine it is adequately centered in front of user 60 to provide the effect.) On the other hand, if the immersive effect signal calls for "rolling thunder" in front of the user 60 (i.e., a more diffuse signal), computer 30 may determine that an immersive effect skewed to the left of user 60 from speaker 40k may be acceptable.
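One way to balance the two left speakers is equal-power panning weighted by each speaker's distance from the PED's left axis, as in the sketch below; the panning law is an assumed choice rather than one prescribed here:

```python
# Illustrative sketch of balancing the two left speakers in the O' frame.
import math

def pan_pair(front_xy, rear_xy):
    """Return (front_gain, rear_gain) for a front-left and rear-left speaker.

    More weight goes to whichever speaker lies closer to y = 0, so the
    combined sound appears to come from directly beside the PED.
    """
    wf = abs(rear_xy[1]) / (abs(front_xy[1]) + abs(rear_xy[1]))
    return math.sqrt(wf), math.sqrt(1.0 - wf)   # equal-power gain pair

gain_40k, gain_40j = pan_pair(front_xy=(-8, 5), rear_xy=(-6, -4))
print(round(gain_40k, 2), round(gain_40j, 2))   # -> 0.67 (front), 0.75 (rear)
```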
The embodiment of Fig. 2 may support many of the variations described above for the embodiment of Fig. 1a. Thus, instead of receiving the immersive effect signal from PED 20, computer 30 may alternatively generate the substance of the immersive effect based on the current output of PED 20 to user 60. For example, if PED 20 is a game device, PED 20 may supply some or all of the game content to computer 30 via interface 25. Computer 30 may process the game content and generate the substance of an immersive effect that corresponds to the current game content. Computer 30 then selects and controls presentation devices to output the appropriate immersive effect, as described above. In addition, instead of the output of PED 20 to the user 60 being generated within PED 20 itself, computer 30 may alternatively provide the content and other substantive output to PED 20 over interface 25.
There are many alternatives to the image recognition system used as positioning system 50 in the embodiment of Fig. 2 described above. For example, PED 20 can emit a signature RF, ultrasound, or other signal that is captured by a number of corresponding detectors situated around the room 70. Processing the received signals using standard triangulation techniques based on time-of-flight, phase-change, or the like, will give the PED 20 position. Line-of-sight detectors can be used to determine orientation. Alternatively, PED 20 may incorporate an internal navigation system comprised of accelerometers configured to determine orientation and position in two or three dimensions based on acceleration (including rotation) attendant to movement of the PED 20. (In that case, the embodiment of Fig. 2 has an internal positioning system 20a (not shown).) Generally, two offset accelerometers in each measured dimension may be used to detect both translation and rotation. Alternatively, gyroscopes (comprising an inertial navigation system), or one or more compasses, may be used to detect orientation. Such a positioning system may also periodically correct its coordinates when at a known position, for example, when placed in a recharging cradle having known fixed coordinates. It is generally desirable that the position of PED 20 be determined to an accuracy of 30-50 cm. In the embodiment of Fig. 2, the image recognition positioning system relied on determination of positions in two dimensions, namely, the plane of the floor of the room 70. As noted, however, the positioning system 50 may determine the position of PED 20 in three dimensions. For example, the image recognition positioning system described above may determine the position of PED 20 in three dimensions. Computer 30 may also store the positions of the presentation devices in three dimensions. Thus, when the position of PED 20 is determined by positioning system 50, computer 30 may determine the relative coordinate positions of the presentation devices 40 with respect to the PED 20 in three dimensions in a manner analogous to that described above for two dimensions. For Fig. 2, this would effectively include the z coordinate, namely, the height above the floor in coordinate system O, or above PED 20 in O'. The translated coordinate system O' is still aligned with the orientation of PED 20, although the orientation is generally limited to the horizontal plane (that is, tilt angle of PED 20 is ignored). Having a third dimension available to computer 30 in creating immersive effects may be helpful in many situations. For example, when an immersive effect requires high backlighting, a third dimension will give computer 30 the ability to choose a ceiling lamp over a wall-mounted lamp located behind user 60.
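For the time-of-flight variant, position can be recovered from ranges to detectors at known positions. A minimal least-squares sketch, assuming three detectors and ideal range measurements; the linearization shown is one common way to do the triangulation:

```python
# Illustrative sketch of recovering position from time-of-flight ranges
# to detectors at known, assumed positions.
import numpy as np

def trilaterate(anchors, distances):
    """Solve for (x, y) from distances to three or more known anchors."""
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        # Subtracting the first range equation removes the quadratic terms.
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    solution, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return solution

anchors = [(0.0, 0.0), (8.0, 0.0), (0.0, 10.0)]   # assumed detector positions
print(trilaterate(anchors, [5.0, 5.0, 8.0623]))   # -> approximately [4.0, 3.0]
```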
As noted, determination of the location and orientation of PED 20 in the above description is used as a surrogate location for providing immersive effects to the user 60, since it is likely that user 60 is closely engaged with PED 20. For some applications, it may be desirable to use the actual location of the user, or even portions of the user's body (e.g., the user's head). For example, the head of a user may be higher than an adjacent lamp, while the PED 20 he/she is holding is lower, and an immersive effect may call for overhead lighting. If the position of the PED 20 is used by computer 30 to determine the relative position of the lamp, computer 30 may engage the lamp to provide the effect. However, from the perspective of the user, that would provide a lighting effect from below. The positioning systems and/or related processing described above may be adjusted to the user. For image recognition positioning systems, for example, the system may be adapted to determine the location and orientation of a user's face (or other body part) adjacent a recognized PED 20. For systems that cannot be readily adapted to a more precise determination of the user, an average position at which a user typically holds the particular type of PED 20 may be used to adjust for the user's position. (Generally, the orientation of the user may be determined from the orientation of the PED 20.) As noted, the positions of the various presentation devices may be pre-stored in computer 30 for the various embodiments. Such pre-storage can involve manually determining the positions of the presentation devices with respect to an origin and inputting the coordinates to computer 30 via a user interface (keyboard, GUI, voice input, or the like). If and when presentation devices are rearranged, the new coordinates are manually input. Alternatively, the system may include electronic components that determine the positions of the presentation devices. A straightforward way of automatically determining location includes accelerometers associated with each presentation device, which generate data related to change of position based on acceleration, as described above for PED 20. The interface 35 between presentation devices 40 and computer 30 may support two-way communication; thus, presentation devices 40 may update computer 30 with data reflecting their current locations, and computer 30 may use that data to determine the updated location (if not determined locally at the presentation device). The updated locations are subsequently used by computer 30 in determining the relative position with respect to the current location of PED 20 (referenced to current PED orientation) for the immersive effect processing. Alternatively, image processing techniques may also be readily applied to detect the contours of various speakers, lights and other presentation devices, and determine positions from the images. Also, a presentation device may emit a signature RF, ultrasound, or other signal that is captured by a number of corresponding detectors situated around the room. Processing the received signals using standard triangulation techniques based on time-of-flight, phase-change, or the like, will give the position of the presentation device. Some or all processing aspects used for automatic determination of the locations of presentation devices may be incorporated into computer 30 or can reside in a separate controller or processor and memory (including positioning system 50 if separate). (For embodiments such as Fig. 1a that use only a relative position, such automatic determination is generally simpler.)
In the description of Fig. 2 above, the direction of output of the presentation devices themselves with respect to PED 20 is not included in the selection of a presentation device for an immersive effect. This generally assumes that the presentation devices project their output toward the position of PED 20 (or user 60). Such a presumption is acceptable in many cases, for example, where a presentation device projects in all directions, or is located along the perimeter of the room and projects into the room. However, orientation of a presentation device's output with respect to the location of the PED 20 may be an important factor to use in the selection of a presentation device for an immersive effect. For example, referring back to Fig. 2, light 40m may be a desk lamp that projects away from user 60 (e.g., in the negative y direction in O'). Although its relative position to user 60 would make it suitable for an intense lighting effect, its direction of output would not. However, for a soft backlighting effect, both its position and orientation would be suitable. Thus, orientation or direction of the output of presentation devices may be included with position in computer 30 and used in selecting an appropriate presentation device for an immersive effect. Orientation of the presentation devices may be input manually to computer 30 or automatically detected and input to computer 30 using the illustrative techniques as discussed above.
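A simple way to encode this output-direction factor is a dot-product test between a device's facing vector and the vector toward the PED, as sketched below with an assumed 90-degree acceptance cone:

```python
# Illustrative sketch of an output-direction test: a device is a candidate
# only if its facing vector points within 90 degrees of the PED.
def projects_toward_ped(device_xy, device_facing, ped_xy=(0.0, 0.0)):
    """True if the device's output is aimed at least partly at the PED."""
    to_ped = (ped_xy[0] - device_xy[0], ped_xy[1] - device_xy[1])
    dot = device_facing[0] * to_ped[0] + device_facing[1] * to_ped[1]
    return dot > 0.0   # positive dot product: within 90 degrees of the PED

# Desk lamp 40m just behind the PED in O', aimed away from it (toward -y):
print(projects_toward_ped((-0.5, -0.5), (0.0, -1.0)))   # -> False: skip it
```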
In the above embodiments, computer 30 and PED 20 are separate components. Among the advantages of this configuration, computer 30 and presentation devices 40 are programmed to provide immersive effects for a PED 20 in the space. Any space served by such a computer 30 that recognizes a particular PED 20 may immediately begin providing immersive effects. Also, as described above, computer 30 provides much of the processing attendant to the immersive experience, so PED 20 is not burdened with these tasks. Alternatively, the above-described features of computer 30 may be substantially incorporated into PED 20, and computer 30 may be eliminated. In that case, the different interfaces 35 of Fig. 1 extend between PED 20 and the respective presentation device 40. Some or all of the interfaces 35 will often be wireless. (For example, the interfaces may be wired at the presentation devices 40 and all run into a local wireless transmitter (e.g., Bluetooth) that communicates wirelessly with PED 20.) PED 20 is provided with the orientation data (and position data, where applicable) that computer 30 otherwise received. For example, presentation devices may store and provide their coordinates (e.g., in room 70 reference system O) and device type to PED 20. PED 20 may be provided with input of its own coordinates and orientation in O (e.g., at a marked location and orientation in the room 70), which is subsequently used to keep track of current location and orientation via an internal navigation system (e.g., accelerometers). Alternatively, a separate image recognition system can provide PED 20 with its position and orientation in O, and the positions and types of presentation devices. Continuing the example, PED 20 translates the coordinate system to O' (centered and oriented on itself) and performs processing analogous to that of computer 30 in the Fig. 2 embodiment, resulting in selection of presentation devices. Presentation devices selected to output the current immersive effects receive control signals directly from PED 20. Processing by PED 20 is performed by processor 21 in conjunction with memory 22. Such a configuration will typically require less overall communication between components.
Fig. 3 is a flow chart for a basic method in accordance with an embodiment of the invention. In block 100, immersive effect data is generated by a PED and in block 110 it is received and processed by a computer. Such processing may include generating the substance of the immersive effect. In block 120, the computer selects at least one presentation device for output of at least one immersive effect based at least in part on the orientation of the PED. In block 130, the computer controls the selected presentation device to provide the immersive effect.
Although video game PEDs are the focus of the above embodiments, any PED where a user may be provided with a corresponding immersive effect may be used with the techniques of the current invention. This includes, for example, cellular telephones, personal audio players and the like. As another example, electronic books and other literary works that qualify as PEDs and which may be utilized in the context of this invention are described in commonly owned co-filed U.S. Provisional Patent Application entitled "Immersive Reading Experience Using Eye Tracking" by Hubertus M. R. Cortenraad, Attorney Docket US050111, filed concurrently herewith, assigned U.S. Provisional Patent Application Ser. No. , the entire contents of which are hereby incorporated by reference herein.
For example, people tend to read at different positions and orientations in a room. Application of the above-described techniques provides lighting, sound and other effects as emanating from the proper positions for the reader. For example, when reading about a character in an e-book that is at a beach watching a sunset, the sunset lighting effects and sounds of the sea (waves, ripples, or the like) are generated in front of the reader's position.
While the invention has been described with reference to several embodiments, it will be understood by those skilled in the art that the invention is not limited to the specific forms shown and described. Thus, various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. For example, there are many alternative positioning systems that may be utilized in the various embodiments. In addition to those noted above, commercially available GPS systems may be employed in some settings. There are also many alternative configurations and locations for the various components that support the invention. For example, where a computer is referred to, it is utilized broadly and may include, for example, servers, PCs, microcomputers and microcontrollers. It may also generally include a processor, microprocessor or CPU with its associated memory. (Any other computing device that is possibly excluded from the broad scope of "computer" may nonetheless be substituted if it may be configured to carry out the required processing.) In addition, processing performed in support of the invention (such as processing performed by computer 30) may, for example, be carried out by multiple processors in multiple locations. Interfaces shown for the various embodiments between various components can have multiple paths and types. Although the specific presentation devices shown for the embodiments of Figs. 1a and 2 are speakers, lights and a television, any other type of presentation device may serve the room or space. Also, the locations and/or relative locations of the pertinent types of presentation devices for an immersive effect may be determined after an immersive effect is received or generated that requires that type of presentation device. In addition, where the presentation devices are all the same type, the processing can be simplified. For example, those immersive effects that are not supported by the type of presentation device may be ignored and the computer does not have to consider device type in the selection. Under these circumstances, for the embodiment of Fig. 1a for example, the computer can select a presentation device based solely on the relative positions of presentation devices determined using current PED orientation. Also under these circumstances, for the embodiment of Fig. 2 for example, the computer may select a presentation device based solely on the relative positions between PED and presentation devices, referenced to the orientation of the PED. Thus, the particular techniques described above are by way of example only and not to limit the scope of the invention.

Claims

What Is Claimed Is:
1) A method of providing an immersive experience for a user (60) of a portable electronic device (20) using one or more presentation devices (40) associated with a space (70) in which the portable electronic device (20) is located, the method comprising: a) considering one or more immersive effects that provide the immersive experience (110); b) selecting at least one of the presentation devices (40) for outputting at least one of the immersive effects based on at least the orientation of the portable electronic device (20) in the space (70) (120); and c) controlling the selected at least one presentation device (40) to output at least one of the immersive effects (130).
2) The method as in Claim 1, further including determining the orientation of the portable electronic device (20).
3) The method as in Claim 1, wherein selecting at least one of the presentation devices (40) for outputting at least one of the immersive effects is further based on the position of the portable electronic device in the space (70).
4) The method as in Claim 3, further including determining the position and orientation of the portable electronic device (20).
5) The method as in Claim 1, wherein data for the one or more immersive effects considered is generated by the portable electronic device (20).
6) The method of Claim 1, wherein the immersive experience is related to a current local output at the portable electronic device (20).
7) The method of Claim 1, wherein selecting at least one presentation device (40) for outputting the immersive effect is further based on at least one of the type of the presentation device (40), the coordinate location of the presentation device (40) and the orientation of the presentation device (40).
8) The method of Claim 1, wherein at least one of the one or more presentation devices (40) associated with the space (70) is one of a speaker (40a, 40c, 40e, 40g, 40j, 40k, 40n), video display (40f, 40l), video projector and light (40b, 40d, 40h, 40i, 40m).
9) The method of Claim 1, wherein at least one of the one or more presentation devices (40) associated with the space is one of a fan, mist generator, fog generator, ultrasonic generator, vibrator, heater and air conditioner.
10) The method of Claim 1, wherein the immersive effect output provides at least one of an audio sensation, visual sensation, tactile sensation, olfactory sensation and taste sensation.
11) The method as in Claim 1, wherein the controlling the selected at least one presentation device (40) to output at least one of the immersive effects includes adjusting the output to provide the immersive effect at the location of the portable electronic device (20).
12) The method as in Claim 1, wherein the orientation of the portable electronic device (20) is based on the orientation of the user (60).
13) The method as in Claim 1, wherein selecting at least one of the presentation devices (40) for outputting at least one of the immersive effects is further based on the position of the user (60) in the space (70).
14) A system (10) that provides an immersive experience for a user (60) of a portable electronic device (20), the system (10) comprising at least one processor (21, 31) operatively coupled to at least one memory (22, 32), the at least one processor (21, 31) configured to accept as input data related to at least an orientation of the portable electronic device (20) in a space (70), the at least one processor (21, 31) further configured to accept as input data related to at least one immersive effect based on the current local output of the portable electronic device (20), the at least one processor (21, 31) further configured to select at least one presentation device (40) to output the at least one immersive effect based on at least the orientation of the portable device (20).
15) The system (10) as in Claim 14, wherein the at least one processor (21, 31) is further configured to select at least one of the presentation devices (40) to output the at least one immersive effect based on the position of the portable electronic device (20) in the space (70).
16) The system (10) as in Claim 14, wherein the at least one processor (21, 31) comprises a processor (21) in the portable electronic device (20) that generates the data related to the orientation of the portable electronic device (20) in the space (70) in association with a positioning system (20a, 50) in the portable electronic device (20).
17) The system (10) as in Claim 16, wherein the processor (21) in the portable electronic device (20) further generates data related to the position of the portable electronic device (20) in the space (70) in association with the positioning system (20a, 50) in the portable electronic device (20).
18) The system (10) as in Claim 16, wherein the positioning system (20a, 50) in the portable electronic device (20) comprises at least one of an accelerometer, a compass and a gyroscope.
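As a hypothetical illustration of how the sensors listed in Claim 18 could yield an orientation estimate (the claim itself only names the sensors): one common approach is a complementary filter that integrates the gyroscope for short-term accuracy and pulls the estimate toward the compass to cancel long-term drift. The function below is an invented sketch of that general idea, not the claimed positioning system.

```python
import math

def fuse_heading(prev_heading, gyro_rate, compass_heading, dt, alpha=0.98):
    """One complementary-filter step for a heading estimate, in radians.

    prev_heading    -- previous fused heading
    gyro_rate       -- angular rate from the gyroscope (rad/s)
    compass_heading -- absolute heading from the compass
    dt              -- time step in seconds
    alpha           -- trust in the gyro; (1 - alpha) is pull toward compass
    """
    predicted = prev_heading + gyro_rate * dt  # short-term gyro integration
    # Shortest-path error between the compass reading and the prediction,
    # so the correction never wraps the long way around the circle.
    error = (compass_heading - predicted + math.pi) % (2 * math.pi) - math.pi
    return (predicted + (1.0 - alpha) * error) % (2 * math.pi)

# Example: turning at 0.5 rad/s, sampled at 100 Hz, with a noisy compass.
h = 0.0
for _ in range(100):
    h = fuse_heading(h, 0.5, 0.52, 0.01)
print(round(h, 3))
```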
19) The system (10) as in Claim 14, wherein the system (10) further comprises a positioning system (50) associated with the space (70) and separate from the portable electronic device (20), the positioning system (50) configured to generate the data related to at least the orientation of the portable electronic device (20) in the space (70).
20) The system (10) as in Claim 19, wherein the positioning system (50) comprises an image recognition system.
21) The system (10) as in Claim 14, wherein the at least one processor (21, 31) comprises a processor (21) in the portable electronic device (20) that generates the data related to at least one immersive effect based on the current local output of the portable electronic device (20).
22) The system (10) as in Claim 14, wherein the system (10) comprises a first processor (21) in the portable electronic device (20) that operatively communicates with a second processor (31) associated with the space (70), the first processor (21) configured to generate the data related to the at least one immersive effect based on the current local output of the portable electronic device (20), and transmit the data to the second processor (31).
23) The system (10) as in Claim 22, wherein the second processor (31) is configured to process as input the data related to at least the orientation of the portable electronic device (20) in the space (70), the second processor (31) further configured to process as input the data related to the at least one immersive effect based on the current local output of the portable electronic device (20) received from the first processor (21), the second processor (31) selecting the at least one presentation device (40) to output the at least one immersive effect based on the immersive effect data received and at least the orientation data.
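Claims 22 and 23 split the work between a first processor in the portable device and a second processor associated with the space. A hypothetical message exchange is sketched below; the JSON schema and every name in it are invented for illustration, and the transport between the two processors is left unspecified. The resulting room bearing would then drive device selection, for instance as in the sketch after Claim 14.

```python
import json
import math

# First processor (in the portable device): describe an immersive effect
# derived from the current local output and serialize it for the space.
def make_effect_message(effect_kind, bearing, intensity):
    return json.dumps({
        "effect": effect_kind,   # e.g. "explosion"
        "bearing": bearing,      # direction relative to the device heading
        "intensity": intensity,  # 0.0 .. 1.0
    })

# Second processor (associated with the space): combine the received effect
# data with the independently tracked device heading to obtain the direction
# of the effect in room coordinates.
def handle_effect_message(message, device_heading):
    data = json.loads(message)
    room_bearing = (device_heading + data["bearing"]) % (2 * math.pi)
    return data["effect"], room_bearing, data["intensity"]

msg = make_effect_message("explosion", 3.14, 0.8)
print(handle_effect_message(msg, device_heading=1.57))
```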
24) The system (10) as in Claim 14, wherein the at least one processor (21, 31) is further configured to select the at least one presentation device (40) to output the at least one immersive effect based on at least one of the type, position and orientation of the presentation device (40).
25) The system (10) of Claim 14, further including one or more presentation devices (40) operatively coupled to the at least one processor (21, 31), wherein at least one of the one or more presentation devices (40) is one of a speaker (40a, 40c, 40e, 40g, 40j, 40k, 40n), video display (40f, 40l), video projector and light (40b, 40d, 40h, 40i, 40m).
26) The system (10) of Claim 14, further including one or more presentation devices (40) operatively coupled to the at least one processor (21, 31), wherein at least one of the one or more presentation devices (40) is one of a fan, mist generator, fog generator, ultrasonic generator, vibrator, heater and air conditioner.
27) The system (10) of Claim 14, further including one or more presentation devices (40) operatively coupled to the at least one processor (21, 31), at least one of the one or more presentation devices (40) providing one of an audio sensation, visual sensation, tactile sensation, olfactory sensation and taste sensation.
28) The system (10) of Claim 14, wherein the portable electronic device (20) is a gaming device that provides an avatar for the user (60).
29) The system (10) of Claim 28, wherein the at least one immersive effect is a visual output by a presentation device (40) to the user (60) that corresponds to the visual perception of the avatar in the game.
30) The system (10) as in Claim 28, wherein the at least one immersive effect is an audio output by a presentation device (40) to the user (60) that corresponds to the audio perception of the avatar in the game.
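As a hypothetical usage note for Claims 28 to 30: if the game reports that the avatar perceives a sound to its left, the effect bearing relative to the device is +pi/2, which could feed directly into a selection routine such as the invented select_device sketch after Claim 14. The mapping below is equally hypothetical, including the assumption that the user holds the device aligned with the avatar's forward axis.

```python
import math

# Invented mapping from the avatar's perception direction (as reported by a
# hypothetical game engine) to an effect bearing relative to the device.
AVATAR_DIRECTIONS = {
    "front": 0.0,
    "left": math.pi / 2,
    "behind": math.pi,
    "right": -math.pi / 2,
}

def avatar_bearing(direction: str) -> float:
    """Bearing of a perceived effect relative to the portable device,
    assuming the device faces along the avatar's forward axis."""
    return AVATAR_DIRECTIONS[direction]

# e.g. feed this into a selection routine like the sketch after Claim 14:
print(avatar_bearing("left"))
```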
31) The system (10) as in Claim 14, wherein the orientation of the portable electronic device (20) is based on the orientation of the user (60).
32) The system (10) as in Claim 14, wherein the at least one processor (21, 31) is configured to select at least one of the presentation devices (40) to output the at least one immersive effect further based on the position of the user (60) in the space (70).
33) The system (10) as in Claim 14, wherein the at least one processor (21, 31) of the system (10) is a processor (21) in the portable electronic device (20), the processor (21) in the portable electronic device (20) generating the data related to the orientation of the portable electronic device (20) in the space (70), the processor (21) subsequently using the data generated related to orientation of the portable electronic device (20) and data related to at least one immersive effect based on the current local output of the portable electronic device (20) to select at least one presentation device (40) to output the at least one immersive effect.
34) The system (10) as in Claim 14, wherein the at least one processor (21, 31) of the system (10) is a processor (21) in the portable electronic device (20), the processor (21) in the portable electronic device (20) generating the data related to at least one immersive effect based on the current local output of the portable electronic device (20), the processor (21) subsequently using the data generated related to the at least one immersive effect and data related to at least an orientation of the portable electronic device (20) to select at least one presentation device (40) to output the at least one immersive effect.
PCT/IB2006/050870 2005-03-24 2006-03-21 Orientation and position adaptation for immersive experiences WO2006100644A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US66502505P 2005-03-24 2005-03-24
US60/665,025 2005-03-24
US71094905P 2005-08-24 2005-08-24
US60/710,949 2005-08-24

Publications (2)

Publication Number Publication Date
WO2006100644A2 WO2006100644A2 (en) 2006-09-28
WO2006100644A3 WO2006100644A3 (en) 2007-02-15

Family

ID=36698893

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2006/050870 WO2006100644A2 (en) 2005-03-24 2006-03-21 Orientation and position adaptation for immersive experiences

Country Status (1)

Country Link
WO (1) WO2006100644A2 (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4101156A1 (en) * 1991-01-14 1992-07-16 Audiocinema Electronic Und Med Position location and tracking of objects - using pulsed outputs from infrared transmitters detected by image processing system
GB2339127A (en) * 1998-02-03 2000-01-12 Sony Corp Headphone apparatus
US6630915B1 (en) * 1999-01-26 2003-10-07 Lsa. Inc. Wireless transmission system for transmitting data to a simulation system user
WO2001095669A2 (en) * 2000-06-08 2001-12-13 Koninklijke Philips Electronics N.V. A remote control apparatus and a receiver and an audio system
WO2002041664A2 (en) * 2000-11-16 2002-05-23 Koninklijke Philips Electronics N.V. Automatically adjusting audio system
US20040109022A1 (en) * 2002-12-04 2004-06-10 Bennett Daniel H System and method for three-dimensional imaging
WO2004112432A1 (en) * 2003-06-16 2004-12-23 Koninklijke Philips Electronics N.V. Device and method for locating a room area
JP2005046270A (en) * 2003-07-31 2005-02-24 Konami Computer Entertainment Yokyo Inc Game device, control method of computer and program

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009124773A1 (en) * 2008-04-09 2009-10-15 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Sound reproduction system and method for performing a sound reproduction using a visual face tracking
WO2010140088A1 (en) * 2009-06-03 2010-12-09 Koninklijke Philips Electronics N.V. Estimation of loudspeaker positions
RU2543937C2 (en) * 2009-06-03 2015-03-10 Конинклейке Филипс Электроникс Н.В. Loudspeaker position estimation
US9332371B2 (en) 2009-06-03 2016-05-03 Koninklijke Philips N.V. Estimation of loudspeaker positions
US8629866B2 (en) 2009-06-18 2014-01-14 International Business Machines Corporation Computer method and apparatus providing interactive control and remote identity through in-world proxy
EP2418874A1 (en) * 2010-08-11 2012-02-15 Sony Computer Entertainment Europe Ltd. Apparatus and method of audio reproduction
US8764565B2 (en) 2010-08-11 2014-07-01 Sony Computer Entertainment Europe Limited Apparatus and method of audio reproduction
US20130151955A1 (en) * 2011-12-09 2013-06-13 Mechell Williams Physical effects for electronic books
GB2548091A (en) * 2016-03-04 2017-09-13 Ambx Uk Ltd Content delivery

Also Published As

Publication number Publication date
WO2006100644A3 (en) 2007-02-15

Similar Documents

Publication Publication Date Title
US11683471B2 (en) Information processing device and information processing method
US10842003B2 (en) Ambience control system
JP6266736B1 (en) Method for communicating via virtual space, program for causing computer to execute the method, and information processing apparatus for executing the program
EP3136826B1 (en) Information processing device, information processing method and program
US7632185B2 (en) Portable projection gaming system
JP6012009B2 (en) Lighting system
US10412815B2 (en) Lighting system and multi-mode lighting device thereof and controlling method
US8647198B2 (en) Image display system, illumination system, information processing device, and storage medium having control program stored therein
KR20220062513A (en) Deployment of Virtual Content in Environments with Multiple Physical Participants
WO2006100644A2 (en) Orientation and position adaptation for immersive experiences
JPWO2008004438A1 (en) Projector system and video projection method
US10459599B2 (en) Method for moving in virtual space and information processing apparatus for executing the method
EP3550404B1 (en) Information processing device, information processing method, and computer program
US11850509B2 (en) Interactive theater system with real-time feedback and dynamic special effects
CN105652571B (en) Projection arrangement and its optical projection system
CN110677608A (en) Double-screen smart television and control method thereof
JP2014056030A (en) Image projection system, operation method for image projection system, image projection device, and remote control device for image projection system
US11689794B2 (en) Information processing device, information processing method, and non-transitory computer readable medium
JP6529571B1 (en) Program, method executed by computer to provide virtual space, and information processing apparatus for executing program
KR20200117444A (en) Virtual Reality Device and Control Method thereof
CN117440184B (en) Live broadcast equipment and control method thereof
JP5965434B2 (en) Image display system, lighting system, information processing apparatus, and control program
WO2022091589A1 (en) Information processing device, information processing method, and program
CN110989833B (en) Control method, AR device and computer readable storage medium
EP3572133A1 (en) System for enhancing perception of video games

Legal Events

Code Title Description
NENP Non-entry into the national phase (ref country code: DE)
NENP Non-entry into the national phase (ref country code: RU)
WWW Wipo information: withdrawn in national office (country of ref document: RU)
122 Ep: pct application non-entry in european phase (ref document number: 06727699; country of ref document: EP; kind code of ref document: A2)
WWW Wipo information: withdrawn in national office (ref document number: 6727699; country of ref document: EP)