US20120256945A1 - System for altering virtual views - Google Patents

System for altering virtual views

Info

Publication number
US20120256945A1
Authority
US
United States
Prior art keywords
vehicle
display device
scene
computing unit
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/378,904
Inventor
Ben Kidron
Amir Notea
Yuval Sapir
Eyal Eshed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DIGIGAGE Ltd
Original Assignee
DIGIGAGE Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DIGIGAGE Ltd
Publication of US20120256945A1
Assigned to DIGIGAGE LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ESHED, EYAL; SAPIR, YUVAL; NOTEA, AMIR; KIDRON, BEN

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63G MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
    • A63G31/00 Amusement arrangements
    • A63G31/16 Amusement arrangements creating illusions of travel
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B3/00 Applications of devices for indicating or signalling operating conditions of elevators
    • B66B3/002 Indicators
    • B66B3/008 Displaying information not related to the elevator, e.g. weather, publicity, internet or TV
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00 Advertising or display means not otherwise provided for
    • G09F19/22 Advertising or display means on roads, walls or similar surfaces, e.g. illuminated
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/006 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols involving public key infrastructure [PKI] trust models
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/08 Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0816 Key establishment, i.e. cryptographic processes or cryptographic protocols whereby a shared secret becomes available to two or more parties, for subsequent use
    • H04L9/0838 Key agreement, i.e. key establishment technique in which a shared key is derived by parties as a function of information contributed by, or associated with, each of these
    • H04L9/0841 Key agreement, i.e. key establishment technique in which a shared key is derived by parties as a function of information contributed by, or associated with, each of these involving Diffie-Hellman or related key agreement protocols
    • H04L9/0844 Key agreement, i.e. key establishment technique in which a shared key is derived by parties as a function of information contributed by, or associated with, each of these involving Diffie-Hellman or related key agreement protocols with user authentication or key authentication, e.g. ElGamal, MTI, MQV-Menezes-Qu-Vanstone protocol or Diffie-Hellman protocols using implicitly-certified keys
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D11/00 Passenger or crew accommodation; Flight-deck installations not otherwise provided for
    • B64D2011/0061 Windows displaying outside view, artificially generated

Abstract

This invention relates to virtual reality, images and video, particularly to altering views of a virtual reality scene in response to movement of the environment, such as a vehicle.

Description

    FIELD OF THE INVENTION
  • The present invention relates generally to virtual reality, images and video, and particularly to altering views of a virtual reality scene in response to movement of the environment, such as a vehicle. More specifically, the invention relates to altering the content shown on a display device in a moving vehicle in response to the movement of the vehicle itself.
  • BACKGROUND OF THE INVENTION
  • A view of a virtual reality scene can be changed in response to, for example, a movement of the head of a viewer wearing virtual reality goggles, or a movement of the hand of a wearer of a virtual reality glove.
  • In the case of gloves, for example, U.S. Pat. No. 4,414,537 describes a data glove designed to replace a computer keyboard. This glove uses flex sensors and electrical contacts on the fingertips to determine static positions representing the characters of the alphabet. U.S. Pat. No. 5,184,319 discloses a data glove using strain gauges attached to the fingers of the glove to sense the bend of fingers, and to transmit this information to a computer. U.S. Pat. No. 4,613,139 proposes use of a glove with contacts on the fingertips to be used as an input device for a video game.
  • U.S. Pat. No. 5,991,085 describes one implementation of a head mounted display system that tracks the head of the user and changes the virtual display according to the viewer's head movement.
  • However, none of these known devices alters the view of a virtual reality scene as a result of a change in the environment; they alter the view in response to input or action given by the user, just as a keyboard or a joystick would.
  • U.S. Pat. No. 5,955,710 and others by the same assignee, Captivate Network Inc., disclose a system which distributes “real-time” information along with digital advertising to elevator display units mounted in elevators. The system includes an elevator display unit having a display monitor for displaying general and commercial information to passengers within the elevator, and a local server which receives scheduling information from a remote production server over a data communication path and, in accordance with the scheduling information, retrieves and formats the general and commercial information for display at the elevator display unit. However, this system supplies information to the riders of the elevator regardless of, rather than in response to, the spatial movement of the elevator.
  • SUMMARY OF THE INVENTION
  • The present invention relates to a system for altering virtual views comprising one or more display devices located on a moving vehicle, the display devices being linked to a computing device and to one or more sensors that detect the direction of spatial movement of the vehicle; the system changes the content presented on the display devices according to the detected movement, or lack of movement, of the vehicle.
  • The moving vehicle may be selected from among a bus, a taxi, a train (whether above or below ground), a plane, an elevator or any other moving vehicle which can be fitted with a display device or means, such as an electronic screen or projector.
  • The display device may be located on a wall of the moving vehicle, on the ceiling, on the floor, in place of a window in the vehicle or in any other viewable location inside the vehicle, or affixed outside the vehicle in such a manner as to allow viewing from inside the vehicle.
  • The view can be changed using a pre-defined set of instructions, behavioral patterns, or in any other manner that implies an action-reaction scenario.
  • The display device is linked to a computing unit, such as a PC, or other computing device, such as an embedded system, as well as to one or more local or remote sensors.
  • The sensors may comprise an accelerometer, a gyro, a positioning sensor, an altimeter, a camera, a microphone, an RFID sensor or a motion sensor capable of sensing a change in the environment.
  • In a preferred embodiment, the sensor is linked to one or more memory units, such as a hard drive, flash memory or other storage unit, local or remote, that is capable of storing images, scenes or any other type of content to be transmitted to the display system in real time or offline.
  • The system as a whole reacts to the movement of the vehicle, to the movement of objects (such as people) inside the vehicle, or to both, by changing the content rendered on the display device in correlation with the actual physical sensation of the people in the vehicle or with the vehicle's local or remote surroundings.
  • The display itself is agnostic to the location or the current state of the screen, such that, for example, a screen can rotate constantly while the content shown on it remains aligned to the viewer and reacts to the movement vector of the vehicle in physical space rather than to the movement of the display inside the vehicle.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with features and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanied drawings in which:
  • FIG. 1 is a diagram of a display in a moving vehicle being an elevator in accordance with an embodiment of the invention;
  • FIG. 2 is a flow diagram of a method in accordance with an embodiment of the invention; and
  • FIG. 3 shows an embodiment of the invention where the display device(s) are situated in a moving vehicle being a train car.
  • DETAILED DESCRIPTION OF THE INVENTION
  • In the following description, various embodiments of the invention will be described. For purposes of explanation, specific examples are set forth in order to provide a thorough understanding of at least one embodiment of the invention. However, it will also be apparent to one skilled in the art that the invention may be practiced in other embodiments that are not limited to the examples described herein. Furthermore, well-known features may be omitted or simplified in order not to obscure the embodiments of the invention described herein.
  • Reference is made to FIG. 1, which shows a diagram of a display in a vehicle, here an elevator, in accordance with one embodiment of the invention. The vehicle, i.e. the elevator 100, is outfitted with one or more display devices, e.g. electronic screens 102, one of which is shown in the embodiment of FIG. 1 located on one wall of the elevator 100. The one or more screens 102 are linked to a computing unit 104, such as a PC, as well as to one or more sensors 106, such as, for example, an accelerometer, a positioning sensor, an altimeter, a camera, a motion sensor, a microphone or other sensors. Computing unit 104 may also be linked to one or more memory units 108, such as a hard drive, flash memory or other mass data storage unit, or to a network device 110 that is linked to a remote storage and/or application server that can transmit or change existing images or scenes that are transmitted for display on the screens 102.
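By way of illustration only, the wiring of FIG. 1 can be sketched in code. The following Python sketch is not part of the patent; the class names (Sensor, DisplayScreen, ComputingUnit), their methods and the scene_store attribute are hypothetical stand-ins for the sensors 106, screens 102, memory unit 108 and computing unit 104 described above.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Sensor:
    """Stand-in for a sensor 106 (accelerometer, altimeter, camera, ...)."""
    kind: str
    read: Callable[[], float]  # hardware access abstracted behind a callable

@dataclass
class DisplayScreen:
    """Stand-in for an electronic screen 102 mounted on a wall of the vehicle."""
    name: str

    def show(self, frame: Dict[str, float]) -> None:
        # A real screen would render an image; here we only report the frame.
        print(f"[{self.name}] rendering frame: {frame}")

@dataclass
class ComputingUnit:
    """Stand-in for computing unit 104 with its memory unit 108 (scene_store)."""
    sensors: List[Sensor]
    screens: List[DisplayScreen]
    scene_store: Dict[str, object] = field(default_factory=dict)

    def tick(self) -> None:
        # Poll every sensor, decide on a frame, and push it to every screen.
        readings = {s.kind: s.read() for s in self.sensors}
        frame = self.select_frame(readings)
        for screen in self.screens:
            screen.show(frame)

    def select_frame(self, readings: Dict[str, float]) -> Dict[str, float]:
        # Placeholder policy: echo the readings; a real unit would pick the
        # portion of a stored scene that matches the sensed movement.
        return readings

if __name__ == "__main__":
    unit = ComputingUnit(
        sensors=[Sensor("altimeter", lambda: 12.0)],
        screens=[DisplayScreen("wall screen 102")],
    )
    unit.tick()
```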
  • In operation, computing unit 104 issues a signal to screen 102 to project a view of a virtual scene or any other type of content, such as a scene other than the one that would be viewed by looking out a window of elevator 100. For example, if the elevator is ascending and descending inside a shopping mall, the scene shown on screen 102 may be a scene of a street in Paris, a scene that would be visible while diving through a coral reef, or a scene that would be visible while moving through a canyon on Mars. Any movement of elevator 100 is sensed by sensor 106, such as an altimeter and/or an accelerometer, which detects that the elevator is ascending or descending and signals such movement to computing unit 104. Computing unit 104 signals screen 102 inside the elevator to change the displayed scene to show a view of, for example, Paris, as would be seen from a glass elevator in Paris, where the height of the view of Paris shown on the display matches the height of the elevator. Alternatively, the view shown on screen 102 may be how a coral reef would be seen when going up and down in an elevator at the bottom of the ocean. Different behaviors can be applied to different objects, so that, for example, a fish 112 could swim along the virtual window to the ocean while the elevator goes up and down. Similarly, if one or more screens 102 are placed in a vehicle such as a bus or a taxi cab, the processor, in reaction to the sensor input, would advance the displayed scene forwards or backwards so that, for example, the virtual view of the street in Paris would advance forward as if the bus or the taxi cab were moving down that street. Thus, the display could show a pedestrian walking on a sidewalk of a Parisian street. While the vehicle is stopped, the displayed pedestrian would appear to move ahead of the vehicle; once the vehicle moves forward, for example down a street in New York, the view of the Parisian pedestrian would be overtaken, so that the vehicle would appear to be moving ahead of the pedestrian. As the vehicle makes, e.g., a left turn, the scene shown on the screen may likewise change to show the scene that would appear had a vehicle in Paris made a left turn.
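The correspondence between the sensed state of the elevator or road vehicle and the portion of a stored scene that is displayed can be sketched as follows. This is a minimal illustration under assumed conventions (a tall pre-rendered image strip for the elevator, a horizontal strip for a bus or taxi); the function names, units and the pixels-per-metre factor are hypothetical and do not come from the patent.

```python
def scene_row_for_altitude(altitude_m: float,
                           shaft_height_m: float,
                           scene_height_px: int,
                           viewport_height_px: int) -> int:
    """Map the elevator's sensed altitude to the top pixel row of a tall scene
    strip (a street in Paris, a coral reef, a Martian canyon) so that the
    virtual view rises and falls with the car."""
    # Normalise the car's position within its travel range to [0, 1].
    frac = max(0.0, min(1.0, altitude_m / shaft_height_m))
    # Row 0 is the top of the strip, so invert: a higher car shows a higher view.
    max_top = scene_height_px - viewport_height_px
    return int(round((1.0 - frac) * max_top))


def advance_scene(offset_px: float, speed_mps: float, dt_s: float,
                  pixels_per_metre: float) -> float:
    """Advance (or rewind) a horizontal scene strip for a bus or taxi in
    proportion to its sensed speed, so the virtual street scrolls past."""
    return offset_px + speed_mps * dt_s * pixels_per_metre


if __name__ == "__main__":
    # Elevator car at 12 m in a 40 m shaft, 8000 px tall scene, 1080 px viewport.
    print(scene_row_for_altitude(12.0, 40.0, 8000, 1080))   # -> 4844
    # Bus moving at 5 m/s over one 1/30 s frame, 50 px of scene per metre.
    print(advance_scene(0.0, 5.0, 1 / 30, 50.0))            # -> ~8.33
```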
  • In FIG. 1, one screen 102 is shown. In another embodiment, several such screens 102 could be coordinated to present a panoramic view of a scene from some or all sides of the elevator 100.
  • In some embodiments, riders of the vehicle see the scenes that they would see through the windows of the vehicle had the vehicle been in a different place, and the movement of the scenes is dictated by the movement of the vehicle, as exemplified by the description of the bus or taxi cab above.
  • In another embodiment, a displayed scene may respond to movements of a viewer as if such viewer were present in the scene. For example, a camera or other image capture or motion detection device may sense a movement of a viewer towards a screen 102 that is showing a scene of fish 112 swimming. In response to the movement of the viewer, as detected by the motion detector, the fish may be shown as frightened and swimming away from the viewer.
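A viewer-reactive object such as the fish 112 might be driven by a rule as simple as the one sketched below. The coordinates, startle radius and speeds are illustrative assumptions, and the viewer position is presumed to come from the camera or motion detector mentioned above; none of these names or values are taken from the patent.

```python
import math

def fish_velocity(fish_pos, viewer_pos, startle_radius_m: float = 1.5):
    """Return a 2-D velocity for the on-screen fish 112: calm drifting while
    the viewer is far away, a fast retreat directly away from the viewer when
    a camera or motion sensor reports the viewer inside the startle radius."""
    dx = fish_pos[0] - viewer_pos[0]
    dy = fish_pos[1] - viewer_pos[1]
    dist = math.hypot(dx, dy)
    if dist > startle_radius_m:
        return (0.05, 0.0)                  # idle cruising along the virtual window
    norm = dist or 1.0                      # avoid division by zero if coincident
    flee_speed = 0.8                        # "frightened" retreat speed
    return (flee_speed * dx / norm, flee_speed * dy / norm)

if __name__ == "__main__":
    print(fish_velocity((0.5, 1.0), (0.0, 0.0)))   # viewer close -> fish flees
    print(fish_velocity((5.0, 1.0), (0.0, 0.0)))   # viewer far   -> fish idles
```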
  • In one embodiment, memory 108 may store a scene of, for example, a giant bottle of a beverage, and the view shown on the display is the view that would be seen by a viewer ascending the side of the bottle in an elevator. The possibilities for the views on the screens, and for the changes in those views in response to a change in the environment, are of course numerous.
  • FIG. 2 shows a flow diagram in accordance with an embodiment of the invention. In block 200, a sensor issues a signal to a computing unit in response to the movement of a vehicle or a change in the surrounding environment. In block 202, a computing unit, after checking with a local or remote logic or data center 204, signals a display 206 to move a scene or some pre-defined parts of the scene, or to change the contents shown on the display, so as to mimic or react to a movement of the vehicle or a change in the environment, as if the vehicle were moving through the scene being displayed or changing it and affecting its content.
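Purely as a sketch of the flow of FIG. 2, the blocks can be read as an event loop: a sensor reading (block 200) is handed to the computing unit, which consults local or remote logic or a data center (204) and then instructs the display (206) how the scene should move or change (block 202). The callable names and the loop structure below are assumptions made for illustration, not the patent's implementation.

```python
import time
from typing import Callable, Dict, Optional

def run_display_loop(read_sensor: Callable[[], Dict[str, float]],
                     query_logic: Callable[[Dict[str, float]], Dict[str, float]],
                     update_display: Callable[[Dict[str, float]], None],
                     period_s: float = 1 / 30,
                     max_iterations: Optional[int] = None) -> None:
    """Block 200 -> read_sensor, block 204 -> query_logic (local or remote
    logic/data center), blocks 202/206 -> update_display."""
    i = 0
    while max_iterations is None or i < max_iterations:
        reading = read_sensor()          # movement of the vehicle / environment change
        command = query_logic(reading)   # decide how the scene should move or change
        update_display(command)          # move the scene or swap its content
        time.sleep(period_s)
        i += 1

if __name__ == "__main__":
    # Stub wiring: scroll the scene up or down with the sign of vertical acceleration.
    run_display_loop(
        read_sensor=lambda: {"accel_z": 0.2},
        query_logic=lambda r: {"scroll_px": 10.0 if r["accel_z"] > 0 else -10.0},
        update_display=lambda c: print("display command:", c),
        max_iterations=3,
    )
```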
  • FIG. 3 shows a diagram of the interior of a passenger train car 300 that has a screen facing the direction of movement 301 and a further screen facing the side view 302. The screens are connected to a box that contains a computing unit, a storage and/or networking device and a sensor 304. The system changes the content presented on the screens according to the movement of the train. In one embodiment, the side screen positioned over the train car window 302 can show virtual content 307 that correlates to the actually observed view 306 in the window and adds virtual content 308 on top of it, reacting to the train's movement.
  • Likewise, content presented on screen 301 can show a virtual scene that simulates an actual window facing the direction of movement. In accordance with an embodiment of the invention, multiple screens can be deployed in several locations inside or outside of the car. It is also understood that further embodiments of the system can be deployed in horizontally moving vehicles such as underground trains, taxi cabs, buses and other vehicles, in addition to the specific embodiments described above with reference to FIGS. 1-3 of the drawings.
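One way to keep the added virtual content 307/308 visually locked to the real scenery 306 seen through the side window 302 is to scroll it at the same apparent speed as the outside world, derived from the train's sensed speed. The sketch below is illustrative only; the pixels-per-metre factor, the wrap-around behaviour and the function name are assumptions, not taken from the patent.

```python
def overlay_offset_px(prev_offset_px: float, train_speed_mps: float,
                      dt_s: float, pixels_per_metre: float,
                      wrap_width_px: int) -> float:
    """Advance the horizontal offset of virtual overlay content so it appears
    fixed to the scenery passing the side window: the overlay scrolls opposite
    to the direction of travel, at the scenery's apparent speed."""
    offset = prev_offset_px - train_speed_mps * dt_s * pixels_per_metre
    return offset % wrap_width_px  # wrap so the overlay repeats along the route

if __name__ == "__main__":
    # Train at 20 m/s, one 1/30 s frame, 40 px of scene per metre, 8000 px strip.
    print(overlay_offset_px(0.0, 20.0, 1 / 30, 40.0, 8000))
```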

Claims (13)

1-13. (canceled)
14. A system for altering a virtual reality scene displayed on one or more display devices on a moving vehicle comprising:
a computing unit,
one or more sensors and
one or more display devices,
wherein the computing unit, in response to input received from one or more sensors located on the vehicle, signals to the one or more display devices to project a virtual reality scene, the scene being different from the scene that would be viewed by looking out a window of the vehicle and not representing real-time information.
15. A system according to claim 14, comprising two or more display devices.
16. A system according to claim 14, wherein the display device is located inside the vehicle.
17. A system according to claim 14, wherein the display device is located outside the vehicle.
18. A system according to claim 14, wherein the vehicle is selected from among a bus, a taxi, a train, a plane and an elevator.
19. A system according to claim 14, wherein the display device is located on a wall of the moving vehicle, on the ceiling, on the floor, in place of a window in the vehicle or any other viewable location inside the vehicle or affixed outside the vehicle in such a manner so as to allow viewing.
20. A system according to claim 14, wherein the display device is an electronic screen.
21. A system according to claim 14, wherein the display device is a projector that projects the content inside or outside of the vehicle.
22. A system according to claim 14, wherein the computing unit is an embedded system.
23. A system according to claim 22, wherein the computing unit is a PC.
24. A system according to claim 14, wherein the one or more sensors comprise an accelerometer, a gyro, a positioning sensor, an altimeter, a camera, a microphone, an RFID sensor or a motion sensor capable of sensing a change in the environment.
25. A system according to claim 22, wherein the one or more sensors are linked to one or more memory units, selected from a hard drive, flash memory or other storage unit locally or remotely that is capable of storing virtual reality scenes to be transmitted to the display device in real time or offline.
US13/378,904 2008-06-17 2009-06-16 System for altering virtual views Abandoned US20120256945A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US7313108P 2008-06-17 2008-06-17
PCT/IL2009/000594 WO2010004547A1 (en) 2008-06-17 2009-06-16 System for altering virtual views

Publications (1)

Publication Number Publication Date
US20120256945A1 (en) 2012-10-11

Family

ID=40943271

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/378,904 Abandoned US20120256945A1 (en) 2008-06-17 2009-06-16 System for altering virtual views
US14/347,663 Active 2034-09-06 US10284366B2 (en) 2008-06-17 2012-09-27 Mobile communication system implementing integration of multiple logins of mobile device applications

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/347,663 Active 2034-09-06 US10284366B2 (en) 2008-06-17 2012-09-27 Mobile communication system implementing integration of multiple logins of mobile device applications

Country Status (7)

Country Link
US (2) US20120256945A1 (en)
EP (1) EP2443055A1 (en)
JP (1) JP2012530317A (en)
CN (1) CN102548886A (en)
IL (2) IL215424B (en)
RU (1) RU2012101303A (en)
WO (1) WO2010004547A1 (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120212613A1 (en) * 2011-02-22 2012-08-23 Sekai Electronics, Inc. Vehicle virtual window system, components and method
US20140160285A1 (en) * 2012-12-06 2014-06-12 Airbus Operations (Sas) Aircraft fuselage including a window equipped with a system for displaying images of the outside environment
US9423620B2 (en) 2014-04-24 2016-08-23 Lg Electronics Inc. Head mounted display and method for controlling the same
DE102015208993A1 (en) * 2015-05-15 2016-11-17 Lufthansa Technik Ag Partial segment of an aircraft and electric chimney
WO2017018844A1 (en) * 2015-07-30 2017-02-02 삼성전자 주식회사 Autonomous vehicle and operation method of same
KR20170015213A (en) * 2015-07-30 2017-02-08 삼성전자주식회사 Autonomous Vehicle and Operation Method thereof
US20170113899A1 (en) * 2015-05-29 2017-04-27 Legends Attractions, Llc Transformative elevator display system
US9669302B2 (en) 2014-07-04 2017-06-06 Lg Electronics Inc. Digital image processing apparatus and controlling method thereof
US20170252642A1 (en) * 2014-11-21 2017-09-07 Vr Moving Floor, Llc Moving Floor for Interactions with Virtual Reality Systems and Uses Thereof
US9988008B2 (en) 2015-10-26 2018-06-05 Active Knowledge Ltd. Moveable internal shock-absorbing energy dissipation padding in an autonomous vehicle
US20180186598A1 (en) * 2015-07-03 2018-07-05 Laurent Coldre Elevator car wall imaging system and method
US10059347B2 (en) 2015-10-26 2018-08-28 Active Knowledge Ltd. Warning a vehicle occupant before an intense movement
US10585471B2 (en) 2017-10-03 2020-03-10 Disney Enterprises, Inc. Systems and methods to provide an interactive space based on predicted events
US10589625B1 (en) 2015-12-11 2020-03-17 Disney Enterprises, Inc. Systems and methods for augmenting an appearance of an actual vehicle component with a virtual vehicle component
US10710608B2 (en) 2015-10-26 2020-07-14 Active Knowledge Ltd. Provide specific warnings to vehicle occupants before intense movements
US10717406B2 (en) 2015-10-26 2020-07-21 Active Knowledge Ltd. Autonomous vehicle having an external shock-absorbing energy dissipation padding
US10785621B1 (en) 2019-07-30 2020-09-22 Disney Enterprises, Inc. Systems and methods to provide an interactive space based on vehicle-to-vehicle communications
US10841632B2 (en) 2018-08-08 2020-11-17 Disney Enterprises, Inc. Sequential multiplayer storytelling in connected vehicles
US10854002B2 (en) * 2017-09-08 2020-12-01 Verizon Patent And Licensing Inc. Interactive vehicle window system including augmented reality overlays
US10970560B2 (en) 2018-01-12 2021-04-06 Disney Enterprises, Inc. Systems and methods to trigger presentation of in-vehicle content
US10969748B1 (en) * 2015-12-28 2021-04-06 Disney Enterprises, Inc. Systems and methods for using a vehicle as a motion base for a simulated experience
US11076276B1 (en) 2020-03-13 2021-07-27 Disney Enterprises, Inc. Systems and methods to provide wireless communication between computing platforms and articles
US11150102B2 (en) 2018-07-19 2021-10-19 Alpha Code Inc. Virtual-space-image providing device and program for providing virtual space image
US11332061B2 (en) 2015-10-26 2022-05-17 Atnomity Ltd. Unmanned carrier for carrying urban manned vehicles
US20220198942A1 (en) * 2016-12-01 2022-06-23 SZ DJI Technology Co., Ltd. Methods and associated systems for managing 3d flight paths
US11524242B2 (en) 2016-01-20 2022-12-13 Disney Enterprises, Inc. Systems and methods for providing customized instances of a game within a virtual space
JP2022188084A (en) * 2017-03-06 2022-12-20 ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー Systems and methods for layered virtual features in amusement park environment

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013016921A1 (en) 2013-10-11 2015-04-16 Oliver Bunsen Image display system and method for motion-synchronous image display in a means of transport
EP2913228B1 (en) * 2014-02-28 2018-08-29 Volvo Car Corporation Vehicle with sensor controlled vision
US10057240B2 (en) * 2014-08-25 2018-08-21 Sap Se Single sign-on to web applications from mobile devices
US10949507B2 (en) 2014-10-17 2021-03-16 Vulpecula, Llc. Methods, systems, and computer program products for web browsing
CN104517493A (en) * 2014-12-29 2015-04-15 东莞市新雷神仿真控制有限公司 High-speed rail driving simulation system
EP3053801B2 (en) 2015-02-06 2023-11-01 ALSTOM Holdings Public transport vehicle with panoramic view
GB2542434A (en) * 2015-09-21 2017-03-22 Figment Productions Ltd A System for Providing a Virtual Reality Experience
CN106608580A (en) * 2015-10-27 2017-05-03 昆山通博电梯有限公司 Virtual landscape system used in elevator car
CN105344116A (en) * 2015-12-10 2016-02-24 沈阳体验科技股份有限公司 Display experience device
CN105413187B (en) * 2015-12-29 2017-08-29 华强方特(芜湖)文化科技有限公司 A kind of scene change controlling organization in sightseeing corridor
US10366290B2 (en) * 2016-05-11 2019-07-30 Baidu Usa Llc System and method for providing augmented virtual reality content in autonomous vehicles
US10944747B2 (en) * 2016-05-25 2021-03-09 Canon Information And Imaging Solutions, Inc. Devices, systems, and methods for zero-trust single sign-on
US10186065B2 (en) 2016-10-01 2019-01-22 Intel Corporation Technologies for motion-compensated virtual reality
US10909228B2 (en) * 2017-07-19 2021-02-02 Box, Inc. Server-side authentication policy determination for mobile applications
DE102017217027A1 (en) 2017-09-26 2019-03-28 Audi Ag A method of operating a head-mounted electronic display device and display system for displaying a virtual content
CN108845775A (en) * 2018-05-30 2018-11-20 王玉龙 A kind of virtual landscape window
KR20200017293A (en) * 2018-08-08 2020-02-18 삼성전자주식회사 Electronic apparatus for processing user utterance and controlling method thereof
FR3086898B1 (en) * 2018-10-05 2020-12-04 Psa Automobiles Sa VEHICLE ON BOARD A LUMINOUS PROJECTION SYSTEM IN THE VEHICLE'S COCKPIT

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2829035A (en) * 1954-11-05 1958-04-01 Lea Mfg Company Buffing compositions
US5004225A (en) * 1989-11-01 1991-04-02 Yuri Krukovsky Simulated observation flight apparatus
US5485897A (en) * 1992-11-24 1996-01-23 Sanyo Electric Co., Ltd. Elevator display system using composite images to display car position
US6166744A (en) * 1997-11-26 2000-12-26 Pathfinder Systems, Inc. System for combining virtual images with real-world scenes
FR2829035A1 (en) * 2001-09-05 2003-03-07 France Telecom Transport unit/environment simulation mechanism having memory storing/restoring simulated environment movements/person perceptible sensations with selectable instantaneous size restored digital words.
US20040102676A1 (en) * 2002-11-26 2004-05-27 Brendley Keith W. Motion-coupled visual environment for prevention or reduction of motion sickness and simulator/virtual environment sickness
JP2004217976A (en) * 2003-01-14 2004-08-05 Chugai Ro Co Ltd Continuous gas carburization furnace
US7088310B2 (en) * 2003-04-30 2006-08-08 The Boeing Company Method and system for presenting an image of an external view in a moving vehicle
JP2008136839A (en) * 2006-11-08 2008-06-19 Lion Office Products Corp Binder for storing key
US20090027399A1 (en) * 2005-02-03 2009-01-29 Pioneer Corporation Image editing apparatus, image editing method, image editing program, and computer-readable recording medium
US20090085873A1 (en) * 2006-02-01 2009-04-02 Innovative Specialists, Llc Sensory enhancement systems and methods in personal electronic devices
US7774075B2 (en) * 2002-11-06 2010-08-10 Lin Julius J Y Audio-visual three-dimensional input/output
US7777718B2 (en) * 2006-12-06 2010-08-17 The Boeing Company Flight portal
US20100323657A1 (en) * 2007-07-24 2010-12-23 Russell Brett Barnard communication devices
US7949295B2 (en) * 2004-08-18 2011-05-24 Sri International Automated trainee monitoring and performance evaluation system

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2733002B2 (en) * 1992-11-24 1998-03-30 三洋電機株式会社 Elevator system
US5684950A (en) * 1996-09-23 1997-11-04 Lockheed Martin Corporation Method and system for authenticating users to multiple computer servers via a single sign-on
RU2120664C1 (en) * 1997-05-06 1998-10-20 Нурахмед Нурисламович Латыпов System for generation of virtual reality for user
JP2000350195A (en) * 1999-06-04 2000-12-15 Mitsubishi Heavy Ind Ltd Virtual window forming system for aircraft
US7221935B2 (en) 2002-02-28 2007-05-22 Telefonaktiebolaget Lm Ericsson (Publ) System, method and apparatus for federated single sign-on services
JP2004125866A (en) * 2002-09-30 2004-04-22 Mega Chips Corp Image display method
US7831693B2 (en) * 2003-08-18 2010-11-09 Oracle America, Inc. Structured methodology and design patterns for web services
US20050055555A1 (en) * 2003-09-05 2005-03-10 Rao Srinivasan N. Single sign-on authentication system
US7487537B2 (en) 2003-10-14 2009-02-03 International Business Machines Corporation Method and apparatus for pervasive authentication domains
WO2005062989A2 (en) 2003-12-23 2005-07-14 Wachovia Corporation Authentication system for networked computer applications
US7607008B2 (en) 2004-04-01 2009-10-20 Microsoft Corporation Authentication broker service
ES2311821T3 (en) 2004-05-12 2009-02-16 Telefonaktiebolaget Lm Ericsson (Publ) AUTHENTICATION SYSTEM
WO2006045402A1 (en) 2004-10-26 2006-05-04 Telecom Italia S.P.A. Method and system for transparently authenticating a mobile user to access web services
WO2006077654A1 (en) * 2005-01-20 2006-07-27 Mitsubishi Denki Kabushiki Kaisha Elevator
US20090125992A1 (en) 2007-11-09 2009-05-14 Bo Larsson System and method for establishing security credentials using sms
CN101234224A (en) * 2008-01-29 2008-08-06 河海大学 Method for using virtual reality technique to help user executing training rehabilitation
US9736153B2 (en) 2008-06-27 2017-08-15 Microsoft Technology Licensing, Llc Techniques to perform federated authentication
US8438382B2 (en) 2008-08-06 2013-05-07 Symantec Corporation Credential management system and method
WO2010037201A1 (en) 2008-09-30 2010-04-08 Wicksoft Corporation System and method for secure management of mobile user access to enterprise network resources
TW201042973A (en) * 2008-11-28 2010-12-01 Ibm Token-based client to server authentication of a secondary communication channel by way of primary authenticated communication channels
US8510810B2 (en) 2008-12-23 2013-08-13 Bladelogic, Inc. Secure credential store
JP5837597B2 (en) * 2010-08-30 2015-12-24 ヴイエムウェア インコーポレイテッドVMware,Inc. Integrated workspace for thin, remote, and SaaS applications
US9323915B2 (en) * 2010-12-08 2016-04-26 Verizon Patent And Licensing Inc. Extended security for wireless device handset authentication

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2829035A (en) * 1954-11-05 1958-04-01 Lea Mfg Company Buffing compositions
US5004225A (en) * 1989-11-01 1991-04-02 Yuri Krukovsky Simulated observation flight apparatus
US5485897A (en) * 1992-11-24 1996-01-23 Sanyo Electric Co., Ltd. Elevator display system using composite images to display car position
US6166744A (en) * 1997-11-26 2000-12-26 Pathfinder Systems, Inc. System for combining virtual images with real-world scenes
FR2829035A1 (en) * 2001-09-05 2003-03-07 France Telecom Transport unit/environment simulation mechanism having memory storing/restoring simulated environment movements/person perceptible sensations with selectable instantaneous size restored digital words.
US7774075B2 (en) * 2002-11-06 2010-08-10 Lin Julius J Y Audio-visual three-dimensional input/output
US7128705B2 (en) * 2002-11-26 2006-10-31 Artis Llc Motion-coupled visual environment for prevention or reduction of motion sickness and simulator/virtual environment sickness
US20040102676A1 (en) * 2002-11-26 2004-05-27 Brendley Keith W. Motion-coupled visual environment for prevention or reduction of motion sickness and simulator/virtual environment sickness
JP2004217976A (en) * 2003-01-14 2004-08-05 Chugai Ro Co Ltd Continuous gas carburization furnace
US7088310B2 (en) * 2003-04-30 2006-08-08 The Boeing Company Method and system for presenting an image of an external view in a moving vehicle
US7949295B2 (en) * 2004-08-18 2011-05-24 Sri International Automated trainee monitoring and performance evaluation system
US20090027399A1 (en) * 2005-02-03 2009-01-29 Pioneer Corporation Image editing apparatus, image editing method, image editing program, and computer-readable recording medium
US20090085873A1 (en) * 2006-02-01 2009-04-02 Innovative Specialists, Llc Sensory enhancement systems and methods in personal electronic devices
JP2008136839A (en) * 2006-11-08 2008-06-19 Lion Office Products Corp Binder for storing key
US7777718B2 (en) * 2006-12-06 2010-08-17 The Boeing Company Flight portal
US20100323657A1 (en) * 2007-07-24 2010-12-23 Russell Brett Barnard communication devices

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120212613A1 (en) * 2011-02-22 2012-08-23 Sekai Electronics, Inc. Vehicle virtual window system, components and method
US20140160285A1 (en) * 2012-12-06 2014-06-12 Airbus Operations (Sas) Aircraft fuselage including a window equipped with a system for displaying images of the outside environment
US9456184B2 (en) * 2012-12-06 2016-09-27 Airbus S.A.S. Aircraft fuselage including a window equipped with a system for displaying images of the outside environment
US9423620B2 (en) 2014-04-24 2016-08-23 Lg Electronics Inc. Head mounted display and method for controlling the same
US9669302B2 (en) 2014-07-04 2017-06-06 Lg Electronics Inc. Digital image processing apparatus and controlling method thereof
US10603577B2 (en) * 2014-11-21 2020-03-31 Vr Moving Floor, Llc Moving floor for interactions with virtual reality systems and uses thereof
US20170252642A1 (en) * 2014-11-21 2017-09-07 Vr Moving Floor, Llc Moving Floor for Interactions with Virtual Reality Systems and Uses Thereof
DE102015208993A1 (en) * 2015-05-15 2016-11-17 Lufthansa Technik Ag Partial segment of an aircraft and electric chimney
US20170113899A1 (en) * 2015-05-29 2017-04-27 Legends Attractions, Llc Transformative elevator display system
US10221039B2 (en) * 2015-05-29 2019-03-05 Legends Attractions, Llc Transformative elevator display system
US20190152743A1 (en) * 2015-05-29 2019-05-23 Legends Attractions, Llc Transformative elevator display system
US10822198B2 (en) * 2015-05-29 2020-11-03 Legends Attractions, Llc Transformative elevator display system
US20180186598A1 (en) * 2015-07-03 2018-07-05 Laurent Coldre Elevator car wall imaging system and method
US10968074B2 (en) * 2015-07-03 2021-04-06 Otis Elevator Company Elevator car wall imaging system and method
KR20170015213A (en) * 2015-07-30 2017-02-08 삼성전자주식회사 Autonomous Vehicle and Operation Method thereof
KR102637101B1 (en) * 2015-07-30 2024-02-19 삼성전자주식회사 Autonomous Vehicle and Operation Method thereof
WO2017018844A1 (en) * 2015-07-30 2017-02-02 삼성전자 주식회사 Autonomous vehicle and operation method of same
US10620435B2 (en) 2015-10-26 2020-04-14 Active Knowledge Ltd. Utilizing vehicle window shading to improve quality of augmented reality video
US10710608B2 (en) 2015-10-26 2020-07-14 Active Knowledge Ltd. Provide specific warnings to vehicle occupants before intense movements
US10717402B2 (en) 2015-10-26 2020-07-21 Active Knowledge Ltd. Shock-absorbing energy dissipation padding placed at eye level in an autonomous vehicle
US10718943B2 (en) 2015-10-26 2020-07-21 Active Knowledge Ltd. Large mirror inside an autonomous vehicle
US10717406B2 (en) 2015-10-26 2020-07-21 Active Knowledge Ltd. Autonomous vehicle having an external shock-absorbing energy dissipation padding
US9988008B2 (en) 2015-10-26 2018-06-05 Active Knowledge Ltd. Moveable internal shock-absorbing energy dissipation padding in an autonomous vehicle
US11332061B2 (en) 2015-10-26 2022-05-17 Atnomity Ltd. Unmanned carrier for carrying urban manned vehicles
US10059347B2 (en) 2015-10-26 2018-08-28 Active Knowledge Ltd. Warning a vehicle occupant before an intense movement
US10589625B1 (en) 2015-12-11 2020-03-17 Disney Enterprises, Inc. Systems and methods for augmenting an appearance of an actual vehicle component with a virtual vehicle component
US10969748B1 (en) * 2015-12-28 2021-04-06 Disney Enterprises, Inc. Systems and methods for using a vehicle as a motion base for a simulated experience
US11524242B2 (en) 2016-01-20 2022-12-13 Disney Enterprises, Inc. Systems and methods for providing customized instances of a game within a virtual space
US20220198942A1 (en) * 2016-12-01 2022-06-23 SZ DJI Technology Co., Ltd. Methods and associated systems for managing 3d flight paths
US11961407B2 (en) * 2016-12-01 2024-04-16 SZ DJI Technology Co., Ltd. Methods and associated systems for managing 3D flight paths
JP2022188084A (en) * 2017-03-06 2022-12-20 ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー Systems and methods for layered virtual features in amusement park environment
US10854002B2 (en) * 2017-09-08 2020-12-01 Verizon Patent And Licensing Inc. Interactive vehicle window system including augmented reality overlays
US10585471B2 (en) 2017-10-03 2020-03-10 Disney Enterprises, Inc. Systems and methods to provide an interactive space based on predicted events
US10970560B2 (en) 2018-01-12 2021-04-06 Disney Enterprises, Inc. Systems and methods to trigger presentation of in-vehicle content
US11150102B2 (en) 2018-07-19 2021-10-19 Alpha Code Inc. Virtual-space-image providing device and program for providing virtual space image
US10841632B2 (en) 2018-08-08 2020-11-17 Disney Enterprises, Inc. Sequential multiplayer storytelling in connected vehicles
US10785621B1 (en) 2019-07-30 2020-09-22 Disney Enterprises, Inc. Systems and methods to provide an interactive space based on vehicle-to-vehicle communications
US11076276B1 (en) 2020-03-13 2021-07-27 Disney Enterprises, Inc. Systems and methods to provide wireless communication between computing platforms and articles

Also Published As

Publication number Publication date
EP2443055A1 (en) 2012-04-25
IL216990A0 (en) 2012-02-29
WO2010004547A1 (en) 2010-01-14
IL215424A (en) 2020-10-29
WO2010004547A4 (en) 2010-03-04
US20140237248A1 (en) 2014-08-21
JP2012530317A (en) 2012-11-29
CN102548886A (en) 2012-07-04
US10284366B2 (en) 2019-05-07
RU2012101303A (en) 2013-07-27
IL215424B (en) 2021-01-29

Similar Documents

Publication Publication Date Title
US20120256945A1 (en) System for altering virtual views
CN113474825B (en) Method and apparatus for providing immersive augmented reality experience on a mobile platform
KR102209873B1 (en) Perception based predictive tracking for head mounted displays
US9459692B1 (en) Virtual reality headset with relative motion head tracker
CA2835120C (en) Massive simultaneous remote digital presence world
US7292240B2 (en) Virtual reality presentation device and information processing method
CA2781869C (en) A method of operating a synthetic vision system in an aircraft
EP2584403A2 (en) Multi-user interaction with handheld projectors
CN106462232A (en) Determining coordinate frames in a dynamic environment
CN103977539A (en) Cervical vertebra rehabilitation and health care training aiding system
TWI453462B (en) Telescopic observation for virtual reality system and method thereof using intelligent electronic device
KR101813018B1 (en) Appartus for providing 3d contents linked to vehicle and method thereof
KR101507014B1 (en) Vehicle simulation system and method to control thereof
TWM545478U (en) Helmet with augmented reality and virtual reality
CN110622110A (en) Method and apparatus for providing immersive reality content
US20220092860A1 (en) Extended reality for moving platforms
EP3869302A1 (en) Vehicle, apparatus and method to reduce the occurence of motion sickness
EP2825933B1 (en) Electronic device for displaying content of an obscured area of a view
KR20120030441A (en) System for altering virtual views
TW201823803A (en) A helmet with augmented reality and virtual reality
US20230215287A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience
Okazaki et al. A system for supporting performers in stuffed suits
KR102532448B1 (en) Method and device for motion sickness reduction in virtual reality of metaverse environment in moving space using virtual object
US20240053609A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience
CN113436495A (en) Many people coordinate equipment training system based on VR

Legal Events

Date Code Title Description
AS Assignment

Owner name: DIGIGAGE LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIDRON, BEN;NOTEA, AMIR;SAPIR, YUVAL;AND OTHERS;SIGNING DATES FROM 20120318 TO 20120410;REEL/FRAME:032242/0948

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION