WO2011045437A1 - An interactive 3d display, a method for obtaining a perceived 3d object in the display and use of the interactive 3d display - Google Patents

An interactive 3d display, a method for obtaining a perceived 3d object in the display and use of the interactive 3d display Download PDF

Info

Publication number
WO2011045437A1
WO2011045437A1 (PCT/EP2010/065619, EP2010065619W)
Authority
WO
WIPO (PCT)
Prior art keywords
stage
display
interactive
perceived
rotation
Prior art date
Application number
PCT/EP2010/065619
Other languages
French (fr)
Inventor
Peter Allan Simonsen
Original Assignee
Realfiction Aps
Priority date
Filing date
Publication date
Application filed by Realfiction Aps filed Critical Realfiction Aps
Publication of WO2011045437A1 publication Critical patent/WO2011045437A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47F - SPECIAL FURNITURE, FITTINGS, OR ACCESSORIES FOR SHOPS, STOREHOUSES, BARS, RESTAURANTS OR THE LIKE; PAYING COUNTERS
    • A47F11/00 - Arrangements in shop windows, shop floors or show cases
    • A47F11/06 - Means for bringing about special optical effects
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/54 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels, the 3D volume being generated by moving a 2D surface, e.g. by vibrating or rotating the 2D surface
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/56 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels, by projecting aerial or floating images
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 - Processing image signals
    • H04N13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H04N13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation, the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00, with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00 - Details of stereoscopic systems
    • H04N2213/006 - Pseudo-stereoscopic systems, i.e. systems wherein a stereoscopic effect is obtained without sending different images to the viewer's eyes

Definitions

  • An interactive 3D display, a method for obtaining a perceived 3D object in the display, and use of the interactive 3D display.
  • Interactive 3D display comprising at least one first image source projecting at least one image onto at least one semi-transparent mirror, thereby creating a perceived 3D object in a space defined by the at least one semi-transparent mirror and a base by use of the Pepper's ghost technique.
  • The invention further concerns a method of obtaining and interacting with the perceived 3D object and a specific use of the interactive 3D display.
  • Pepper's ghost technique has been described many times; a recent and easy-to-understand account is given by Julien Clinton Sprott in Physics Demonstrations, page 255. The technique is simple to use, and when used by a person with insight into geometry and optics it can create impressive three-dimensional illusions of objects seeming to appear in free air like a ghost. Recently this technique has been used to create perceived 3D objects in small-scale displays, as described in EP 1846798. This display allows a viewer to observe a perceived 3D object from several angles, but the viewer remains passive with respect to the perceived 3D object.
  • Because the Pepper's ghost technique is based on optical illusions, it is critical to maintain the 3D feeling of the perceived 3D object and to avoid that it suddenly becomes flat to the observer or otherwise loses its credibility.
  • Thus, the 3D realization is a critical point no matter what type of 3D display is used, as long as it uses the Pepper's ghost technique, but it becomes increasingly important to take into account if the 3D display allows the viewer to move the perceived 3D object within the interactive 3D display.
  • Such a new interactive 3D display must enable the viewer/user to obtain information about e.g. a new sports car on virtual display in a 3D display at a car dealer, or about the advanced structure of proteins in a museum or in a laboratory, in a manner where the viewer can concentrate on obtaining the information and not on performing the interaction with the perceived 3D object displayed. I.e. it must provide simpler alternatives to known controls such as joy pads and the computer mouse for moving 3D objects on a screen or in a 3D display. There is also a need for an interactive 3D display which is easy to use by someone "just passing by" in a busy environment.
  • an interactive 3D display with an improved 3D realization is provided.
  • an interactive 3D display which provides a combination of an actual object of choice and a perceived 3D object .
  • an interactive 3D display with an intuitive interaction between viewer and perceived 3D object.
  • an interactive 3D display which allows for a precise coordination of an actual object and a perceived 3D object when the viewer interacts with the 3D display.
  • the term "animate” is used about the change in an image over time e.g. a visualization of the rotation of a 3D object and/or the visualization of the movement of parts of an object as for example a car with doors opening and closing etc.
  • the animation can be real-time calculated based on input to the device executing the animation or performed by accessing a database with pre-rendered images.
  • the term "compute" is used in the general sense of using a computer. It can be for calculations, execution of specially designed source code, execution of commercial software, and database queries. It can be dependent on input from one or more devices or rely only on pre-registered data.
  • the term "interact” means to rotate, tilt, translate or otherwise move the perceived 3D object. It also covers e.g. changing colors on the perceived 3D object and scrolling through text related to the displayed perceived 3D object.
  • the base comprises a moveable stage and that the interactive 3D display further comprises means for manipulating the stage and means for correlating the motion of the stage and the perceived 3D object.
  • This offers a unique range of possibilities for presenting the perceived 3D objects, making it possible for the viewer to interact with the perceived 3D object e.g. to tilt or rotate the perceived object to be able to see it from a preferred angle.
  • the perceived 3D object will automatically follow the motion of the stage, whereby the optical illusion is enhanced, compared to an interactive 3D display without the movable stage, and becomes very effective.
  • stage helps define the room and the depth in the perceived 3D object.
  • the movement of the stage together with the perceived 3D object will thus enhance the 3D feeling of the perceived object by clearly defining a space with depth in which the perceived 3D object can exist and it is therefore possible using the 3D display according to the invention to create images of a definition and a scale never before possible resulting in a much more realistic look than anything previously achieved using a more conventional Pepper's ghost set up.
  • Such an enhanced 3D illusion broadens the possibilities of use, both because the 3D display can be used under a wider spectrum of conditions and because it is possible to display more challenging perceived 3D objects, which in a 3D display of the types known in the art risk not convincingly seeming three-dimensional and instead appearing flat.
  • the interactive 3D display according to the present invention can be used in many contexts, from an easy to access shop display to advanced learning tools.
  • the motion of the stage can be rotation, tilting, and/or movement along different axes depending on what perceived 3D object is to be displayed in the interactive 3D display.
  • If a crystal structure of a complex salt is to be displayed as the perceived 3D object, it is preferred to use a tiltable and rotatable stage in order to be able to see the object from as many angles as possible while still following the motion of the stage. If simpler objects are to be displayed, a stage with a simple rotational motion can be chosen.
  • the rotation axis can easily be chosen by the person skilled in the art based on the information given in relation to the present invention.
  • the image source can be a 50-200 Hz LCD screen, as this provides good image quality at a reasonable price.
  • An advantageous alternative is an OLED screen, due to its improved contrast, but in principle any screen is applicable.
  • Frame rates can advantageously be 20 frames per second and up, e.g. 30 frames per second, in order to achieve a smooth movement of the perceived 3D object.
  • the screen can be a 3D auto-stereoscopic screen, as this provides a way of displaying stereoscopic images without the use of special headgear or glasses on the part of the viewer.
  • the reflecting side of the semi-transparent mirror must face the image source. It also enhances the illusion if the reflection is higher than the transmission through the semi-transparent mirror, as the reflected image in this case is clear and the perceived 3D object thus appears more vivid than if the reflection is sparse.
  • Exemplary data for a semi-transparent mirror, chosen to ensure a luminous perceived 3D object, can be: Reflection (R%): R_avg = 60 ± 5%.
  • The term "semi-transparent mirror" within the present document is meant to encompass any kind of reflective and semi-transparent screen capable of creating a Pepper's ghost illusion. Thus the term also comprises reflective and semi-transparent membranes and foils made of e.g. a polymeric composite.
  • If the reflection layer of the "semi-transparent mirror" is the outer layer facing the image source, it is possible, using the 3D display according to the invention, to create images of a definition and a scale never before possible, resulting in a much more realistic look than anything previously achieved using a conventional Pepper's ghost setup.
  • the semi-transparent mirror can comprise a thin protective layer covering the reflection layer, as this will make the mirror more resistant.
  • Said protective layer is preferably less than 400 nm, more preferably less than 200 nm, and even more preferably less than
  • the stage is rotative around a stage rotation axis perpendicular to the stage. If the rotation around the stage rotation axis is the only possible movement of the stage, the interactive 3D display according to the invention has a number of advantages. Obviously it limits the movement of the stage significantly, compared to a stage having the possibility of multi-axial movement, which ensures that the correlation between the stage and the perceived 3D object can be performed very accurately even on smaller computers, which are cheap and can be implemented in quite light display structures, thus facilitating simple and inexpensive displays. At the same time the limited movement of the stage, and thus of the perceived object, around a rotation axis perpendicular to the stage can be an advantage for the viewer.
  • this simple rotation of the stage enables the viewer to intuitively perform the movement of the stage, by use of the means for manipulating the stage, and at the same time obtain information from the interactive 3D display.
  • This way the limited movement creates a possibility for offering the interactivity to the viewer, i.e. offering the viewer the possibility to rotate the perceived 3D object, but keeping it at a level which is simple and easy to perform without prior training or instructions.
  • Such a preferred embodiment is therefore convenient to use e.g. in busy locations as super markets or museums where the viewer can not be expected to have a quiet moment to concentrate or to have the time to learn how to navigate and decode more advanced movements and information.
  • the means for manipulating the stage may be at least one wheel or wheel section rotative around an axis A2 parallel to the stage rotation axis.
  • This setup makes the interaction intuitive, as the rotational movement of the wheel corresponds to the rotational movement of the stage, and it requires a minimum of instruction of the viewer, thus offering an instantly clear and understandable interaction.
  • the stage is remote controlled in which case the stage may be equipped with a servo motor rotating the stage upon input from the means for manipulating the stage. If the stage can tilt as well as or instead of rotate around the stage rotation axis, means for providing this movement can be provided.
  • Such means for manipulating the stage can be integrated with the wheel, which e.g. can be tilted upwards and/or downwards to control tilting of the stage or they can be one or more separate button/wheel etc depending on the motion.
  • the means for obtaining said correlation may at least comprise:
  • a rotation sensor for detecting the rotation of the stage; and a transmitter for transmitting a signal representing the rotation of the stage from the rotation sensor to a computer.
  • the rotation sensor can in a preferred embodiment be a simple mechanical device located under the center of the stage. From here the data concerning the rotation of the stage, i.e. direction and/or speed can be transmitted to the computer via cables or via wireless techniques known in the art. This solution is very simple, but also very effective as it is cheap, easy to implement and does not require detailed calibration every time the display is set up for a new use. Furthermore the equipment can be quite sturdy and thus fulfils the requirements for use with a 3D display, which is to be handled by many and untrained people in public space.
  • the rotation sensor can alternatively be located at the edge of the stage. It can e.g. be of the type, which optically detects a pattern on the edge of the stage and from this pattern derive speed and direction of the stage or it can be mechanically linked to the stage and measure the stage rotation by means of a simple gearing or similar. Data transfer can preferably occur as described above.
  • the rotation sensor for detecting the rotation of the stage can preferably measure a rotation position of the stage, such as an angle between the base of the stage and a predetermined reference point.
  • the measured rotation position can be used to determine the orientation of the stage.
  • the stage can include touch sensor areas located where a user typically grasps the stage to change the stage's orientation. In this case, detection of touch in these areas could be used in addition to orientation information from one or more other sensors, such as accelerometers, position sensors, etc., to aid in the detection of a change in rotation of the stage.
  • the stage and the rotation sensor are arranged as a jog or shuttle wheel, i.e. a device which allows the user to shuttle or jog through a frame stack of an audio and/or video medium.
  • "Jog" refers to going at a slow speed, and
  • "shuttle" refers to a fast speed.
  • the rotation sensor is an angular encoder, which may be optical, mechanical or magnetic.
  • the jog wheel or other rotation device used has no stops and can be spun all the way around, preferably without limit.
  • the jog wheel depends on tracking the actual motion of the rotation: the faster the jog wheel spins forward or back, the faster it fast-forwards or rewinds in the frame stack of the desired media.
  • jog wheels are known in the art and can e.g. be found on handheld PDAs and as the scroll wheel on computer mice, but the jog wheel according to the present invention works with software which preferably uses angle data, in contrast to the known jog wheels.
  • the means for correlating the motion of the stage and the perceived 3D object at least comprise:
  • a key object is in the following document considered to be any object with a characteristic shape and/or pattern recognizable by the software via the camera. It can be flat and may be recognizable by the camera/computer e.g. by its 2D outline or pattern.
  • the user selects a key object and places it on the stage.
  • the camera records the key object, and the software recognizes the key object and displays the corresponding image on the at least one image source.
  • the camera records the movement of the key object, and the software processes this data from the camera and animates the image displayed on the image source so that the perceived 3D object on or over the stage moves in a motion correlated with the stage.
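  • As a hedged illustration only (the patent does not specify a vision algorithm or library), the recognition step above could be realized with feature matching; the following Python sketch assumes OpenCV is available, and the template file names are hypothetical:

```python
# Sketch of camera-based key-object recognition using ORB feature matching.
# Assumes OpenCV (cv2); "model_car.png" and "running_shoe.png" are
# hypothetical template images of known key objects.
import cv2

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Pre-compute descriptors for each known key object template.
templates = {}
for name in ("model_car", "running_shoe"):
    img = cv2.imread(f"{name}.png", cv2.IMREAD_GRAYSCALE)
    _, desc = orb.detectAndCompute(img, None)
    templates[name] = desc

def recognize_key_object(frame_gray, min_matches=25):
    """Return the name of the best-matching key object, or None."""
    _, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None:
        return None
    best_name, best_count = None, 0
    for name, tmpl_desc in templates.items():
        matches = matcher.match(tmpl_desc, desc)
        good = [m for m in matches if m.distance < 40]
        if len(good) > best_count:
            best_name, best_count = name, len(good)
    return best_name if best_count >= min_matches else None
```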
  • the key object can also for example be a model car, a plastic animal or anything else that preferably represents the perceived 3D object it triggers to be displayed in the 3D display.
  • actual object is used for a physical object placed on the stage, which is to be coupled with the perceived 3D object.
  • a physical object placed on the stage which is to be coupled with the perceived 3D object.
  • E.g. an actual physical running shoe can be supplied with a perceived fur surface by over-layering of a perceived 3D object.
  • a key object is used for a physical object placed on the stage and which is to be recognized by the camera/computer.
  • a key object can be a small model running shoe, or a picture of a shoe, placed on the stage. This small shoe (or picture) triggers a real size perceived 3D running shoe to be displayed in the interactive 3D display above the stage and key object.
  • An actual object can be placed on the stage.
  • the introduction of an actual object on the stage offers the possibility of mixing the perceived 3D object with an actual object and can be used in a multitude of ways.
  • a running shoe can be placed on the stage and the perceived 3D object can be used to change perceived pattern, color etc. of the shoe allowing the viewer to e.g. see his/hers own customized shoes before ordering them at a store.
  • An actual object is, in most cases, not a key object, as the creation and animation of the perceived 3D object is in most cases not performed based on camera recordings of the actual object.
  • the actual object can be used as a key object as described in the following.
  • If the actual object on the stage is to be correlated with the perceived 3D object, it is necessary either, in a first case, to place a pre-selected actual object on a specified position on the stage or, in a second case, for the interactive 3D display to be mounted with a camera registering the stage and actual objects on the stage, i.e. the actual object is also a key object.
  • the first case is quite simple.
  • the computer in the 3D display contains data and software for showing one or more specified perceived 3D objects to be combined with one or more specified actual objects, e.g. ten different types of over-layers matching a specific shoe. That specific shoe (the actual object) is then placed on the stage in a position predefined to ensure a perfect match between shoe and perceived 3D object (in this case the over-layer to the shoe), and now the stage with the shoe and the correlated perceived 3D image can be moved together and displayed from different views by use of e.g. the wheel. In other words, in this simple version the actual object is placed on the stage so that it matches/overlaps the perceived 3D object.
  • If the second, more complex case with the camera is chosen, the requirements on the computer are higher, as the computations needed are more demanding.
  • the camera can record the object.
  • the object e.g. by using a database on the computer
  • the computer creates the image to be displayed by the image source and also performs the animations of said image needed for the perceived 3D object to follow the actual object placed on the stage when the stage is moved by the remote means.
  • the perceived 3D object and the animation hereof is calculated to follow the actual object on the stage.
  • the viewer or a person administering the 3D object, e.g. a shopkeeper, can interact with the perceived 3D object via other means than the wheel.
  • Such means can be placed directly at the computer, on an external touch screen, or e.g. be integrated with the wheel for rotating the stage, such that when the wheel is pressed the color of the over-layer of the shoe, car, soda can etc. can be changed on the stage.
  • It is also possible to upload new data to the computer if e.g. a customer has designed the paintwork for a new car and wants to be able to see it mapped onto a model of the desired car.
  • the 3D display can also e.g.
  • the wheel can be used to rotate the perceived 3D object and the wheel can be pressed to "make time pass” i.e. the wheel can be pressed to start and stop an animation of the growth.
  • the software running on the computer is advantageously able to execute images at 55 to 200 Hz in order to be able to follow the motion of the stage.
  • the invention also relates to an interactive 3D display system comprising at least one interactive 3D display according to the invention and at least one additional presentation means for presenting audio and/or visual information.
  • the perceived object may be accompanied by additional information such as text, image sequences and/or music.
  • the system can comprise means for executing one or more preprogrammed actions on the at least one interactive 3D display and the at least one additional presentation means when the stage during manipulation reaches one or more predefined positions, e.g. angular positions defined in relation to a reference point when the stage is rotative around a stage rotation axis (A1) perpendicular to the stage plane.
  • Similarly, manipulation of one stage of the 3D display according to the invention can control the manipulation of one or more stages of different 3D displays displaying either identical or different perceived objects.
  • the means for executing one or more preprogrammed actions is preferably software running on the computer. If the software running on the computer comprises a video editing tool and/or a video player, the user (i.e. a shop owner or an exhibition designer in a museum) can create his own media presentation by linking positions of the moveable stage to the execution of different actions/media elements to be displayed on the image source. This way the user can prepare a completely custom-made media presentation wherein the perceived object may be accompanied by additional information such as text or image sequences.
  • Types of software for imaging and animation include 3D game engines such as Ogre, Quest3D, Flash, 3Dmax, Maya3D etc., all of which are well known to the person skilled in the art; the 3D game engines which can be used in the present invention will therefore not be described in further detail.
  • the perceived object may be formed by using the above mentioned video editing tool/player and having photos stored in a frame stack on the computer, enabling a viewer to scroll back and forward in a series of photos by turning the wheel or wheel section and thereby the stage back and forth.
  • the photos may be high-resolution photos and the frame stack may contain for example 1200 pictures of an object to be presented as the perceived object according to the present invention. Each picture is assigned to an angular position of the stage.
  • the viewer can experience a unique photo-realistic visual presentation of e.g. a 360° view of a perceived object, or navigate through the pictures in the frame stack to experience an ultra-slow-motion 360° movie of a golf swing or other complex motion, and stop to see each unique high-resolution picture if desired. If the wheel/stage is turned at a higher speed the experienced movie is no longer ultra slow motion.
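  • As an illustrative sketch (not taken from the patent text), the mapping from stage angle to photo could look as follows in Python, assuming the 1200-photo frame stack mentioned above:

```python
def frame_for_angle(angle_deg, n_frames=1200):
    """Map a stage angle to a photo index in the frame stack.

    With 1200 photos covering one 360-degree revolution, each photo
    spans 0.3 degrees; turning the wheel/stage scrolls through them.
    """
    step = 360.0 / n_frames          # degrees per photo
    return int((angle_deg % 360.0) / step) % n_frames

# Turning the stage to 90 degrees shows photo 300 of the 360-degree view.
assert frame_for_angle(90.0) == 300
```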
  • the photos in the frame stack can be combined with other visual or audio presentations such as real time 3D visualizations (e.g. using 3D studio Max or Maya) triggered at one or more selected angular positions of the stage.
  • the user can advantageously also display e.g. text information at one predetermined angle, a video at the same and/or a second angle, audio information at a third angle, etc.
  • the desired actions can be chosen depending on the kind of information desired.
  • Since the wheel/stage preferably has no stops, it can be spun all the way around, preferably without limit.
  • Each full revolution of the wheel/stage, i.e. a 360° rotation, defines a rotation, and the software may increase or decrease a rotation count depending on which rotation the wheel/stage is on. If e.g. the wheel/stage has been rotated 450 degrees, the rotation count of the software is 2, as the wheel/stage is on its second rotation.
  • Each rotation is subdivided into a number of subsections, which defines the step size/resolution chosen for the specific turnable wheel/stage.
  • the number of subsections can be arbitrarily chosen depending on the perceived object and information to be presented. As an example, if the position of the turnable stage is given in whole degrees each subsection corresponds to 1°, if the position of the turnable stage is given in 1/10 degrees each subsection is 0.1°, thus depending on the number of e.g. photos to be presented in the frame stack the user can choose the desired size of each subsection.
  • the number of subsections can be the same for each rotation, i.e. for each full revolution, but the number of subsections can also be independently chosen for each rotation.
  • each subsection can preferably be set to 1°, so that the first subsection corresponds to the position 0° of the turnable stage and the last subsection in the presentation corresponds to 3x360°, i.e. the position where the turnable stage has been revolved three times, or a total of 1080°.
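  • The rotation count and subsection bookkeeping described above can be expressed compactly; the following Python lines are an illustrative sketch, not lifted from the patent:

```python
def rotation_count(total_deg):
    """1-based count of which revolution the wheel/stage is on;
    450 degrees puts it on its second rotation, as in the text."""
    return int(total_deg // 360.0) + 1

def subsection_index(total_deg, step_deg=1.0):
    """Index of the current subsection for the chosen resolution."""
    return int(total_deg // step_deg)

assert rotation_count(450.0) == 2
assert subsection_index(450.0) == 450  # with the default 1-degree subsections
```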
  • the user can e.g. choose to execute the number of photos in the frame stack only during the first and third revolution of the turnable stage, and can during the third revolution ensure that the photos are partly accompanied by first and second text and audio output.
  • a 3D animation is accompanied by text.
  • the photos displayed during the first revolution will preferably be displayed on the image source according to the invention using a first screen configuration.
  • first and second text will be displayed on the image source in a second screen configuration.
  • animations and text will be displayed using a third screen configuration.
  • the user can add different screen configurations via the software, add or remove output, change the unit of the sub division in one or more of the rotations as well as it is possible to add or remove rotations, e.g. add a rotation corresponding to fourth, fifth etc. revolution of the turnable stage .
  • If the stage rotation axis and the center of rotation of the perceived 3D object coincide at least in one point, the inventors have shown that this ensures that the perceived 3D object and the stage move together in a way which fully utilizes the stage effect of defining the room in which the perceived 3D object is seen.
  • the stage rotation axis is best placed in a plane as here described.
  • the correlation of the stage and the perceived 3D object can be "intelligent".
  • the interactive 3D display can be used to not only change the look on the shoe but also to give information about the shoe to the viewer.
  • a pop-up textbox with information about cushioning materials can appear and disappear again when the stage is turned again and the heel is no longer facing the viewer.
  • means for tracking the eye position of an observer and communicating data about said eye position to software at least partly generating the at least one first image in the 3D display.
  • Such tracking means enable the perceived 3D object to be compensated depending on the viewer's eye/head position and angle relative to the stage.
  • the perceived 3D objects formed this way follow the actual object correctly when the stage, and thereby the perceived and actual object, is turned, no matter how tall the viewer is and at what angle he/she holds his/her head relative to the stage.
  • If the interactive 3D display is collapsible, it becomes easy to transport and store. Furthermore, the semi-transparent mirror and other essential parts can be protected while in a collapsed state.
  • the invention further relates to a method of forming a perceived 3D object comprising the steps of
  • the Interactive 3D display can for example be used for real-time changing of a perceived over-layer on an actual physical object. This is a smart and effective way of e.g. illustrating the effect of different choices of outer accessories on a car, or the changes over time of the surface of the earth or the earth's atmosphere.
  • the invention also provides for computer software arranged for, in operation with the interactive 3D display, to carry out or control every function of the apparatus and/or methods
  • a program for a computer comprising the steps of: receiving from a rotation sensor a signal indicative of the angle of rotation of the stage; and on the basis of said signal animating the at least one image projected by the image source.
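  • A minimal skeleton of such a program is sketched below in Python; the sensor-reading and image-showing callables are placeholders for whatever hardware and image-source APIs a concrete display provides:

```python
import time

def run_display(read_angle, show_frame, frame_stack):
    """Read the stage angle from the rotation sensor and animate the
    projected image accordingly (a sketch of the claimed program)."""
    n = len(frame_stack)
    last_index = None
    while True:
        angle = read_angle()                      # signal from the rotation sensor
        index = int((angle % 360.0) * n / 360.0)  # map the angle to a frame
        if index != last_index:
            show_frame(frame_stack[index])        # animate the projected image
            last_index = index
        time.sleep(1 / 120)                       # poll faster than the screen rate
```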
  • a user can create media presentations consisting of a variety of media elements such as video, sound, pictures in a frame stack, text, 3D animations etc. Fundamental to the media presentations created is that the execution of each media event in the final media presentation is executed at specific angular input preferably from a rotation sensor.
  • the user can define the number of revolutions (registered by the rotation sensor) over which a presentation is to be executed, how fine the resolution is to be, e.g. how many frames of a frame stack are to be shown per degree, and define in what angular intervals a given media element is active, e.g.
  • a video is active and from 30 to 90 degrees a frame stack is active. It is for example possible to upload a frame stack containing e.g. 4000 frames and then set the number of rotations to a specific number, e.g. 4, in which case 1000 frames will be shown during each of four revolutions registered by the rotation sensor, giving a resolution of 0.36 degrees.
  • the Player receives input from a rotation sensor and on the basis hereof executes the media presentations created by use of the Editor.
  • the Player receives input from the rotation sensor about the current angular position of the rotation device i.e. turntable, stage, jog wheel, etc being used to control the Player.
  • the Player executes the media elements set by the Editor to be executed at a given angle. If e.g. a video is set to be executed in the angular interval from 10 degrees to 420 degrees and a frame stack from 30 to 90 degrees, both video and frame stack will be active when the reading from the rotation sensor is 80 degrees, 82.75 degrees, ..., 89.99 degrees, but at 90.01 degrees only the video is active. If the rotation device is rotated in the opposite direction back to 89.5 degrees, both video and frame stack are active again.
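  • The interval test the Player performs can be illustrated with a few lines of Python; the schedule values below are taken from the example just given:

```python
def active_elements(angle_deg, schedule):
    """Media elements whose configured angular interval contains the
    current cumulative reading from the rotation sensor."""
    return [name for name, (start, end) in schedule.items()
            if start <= angle_deg <= end]

# Intervals from the example; the video spans more than one revolution.
schedule = {"video": (10.0, 420.0), "frame_stack": (30.0, 90.0)}

assert active_elements(80.0, schedule) == ["video", "frame_stack"]
assert active_elements(90.01, schedule) == ["video"]
```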
  • a media presentation may be set up by the Editor so that e.g. during a video media event a video will continue playing in the forward direction even if the rotation device is moved back and forth as long as the input from the rotation sensor to the Player is in the angular interval in which the video media event is set to be active.
  • a film played from frames in a frame stack being executed simultaneously with the video may, on the other hand, be played forwards and backwards depending on the direction of the movement of the rotation device. This is because the Editor allows the user to configure each media element independently to achieve the exact media presentation ideal for a specific purpose.
  • the Player software contains mechanisms which eliminate different errors. E.g. in the case above, where the rotation sensor first delivers input of 82.75 degrees, ..., 90.01 degrees and then 89.5 degrees to the Player, the Player is "intelligent" and can evaluate that the rotation device has now changed direction of rotation, as it is unlikely that the rotation device has been turned almost a full 360 degrees since the last reading. This type of "intelligence" makes sure that the presentation executed by the Player does not "skip", as it might if the software registered the new reading as if a new revolution had been started.
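  • One simple way to realize this "intelligence" (an illustration, not the patent's own code) is to always fold a new 0-360° sensor reading into the cumulative angle via the smallest possible jump:

```python
def update_total_angle(total_deg, new_reading_deg):
    """Fold a new 0-360 degree reading into a cumulative angle, choosing
    the interpretation with the smallest jump: going from 90.01 to 89.5
    degrees is read as a small reversal, not as an almost-full new
    revolution, so the presentation does not "skip"."""
    current = total_deg % 360.0
    delta = (new_reading_deg - current + 180.0) % 360.0 - 180.0
    return total_deg + delta
```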
  • the Player software also contains mechanisms to eliminate unintended vibrations from the user's hand or from the surroundings, e.g. created by passing people.
  • an angle signal from a rotation device such as a stage may be used to synchronize several units, such that e.g. a perceived object on different stages may be controlled by a single action.
  • a rotation device such as a stage
  • two computers with screens and a computer with a sound source may be activated simultaneously when the stage reaches a specific input angle.
  • Fig. 1 shows, in a perspective view, an interactive 3D display with one semi-transparent mirror and one image source
  • fig. 2 is a schematic side view of an interactive 3D display, illustrating the rotation axes of the interactive 3D display, and the geometry of image source, semi transparent mirror and preferred position of the stage rotation axis
  • fig. 3 shows a sectional view taken along line III - III of the interactive 3D display in fig. 1, and a viewer interacting with the interactive 3D display
  • fig. 4 shows the same with the addition of an actual object on the stage and where the perceived 3D object is replaced in order to match the actual object
  • fig. 5 corresponds to fig.
  • fig. 6a shows the interactive 3D display without the perceived 3D image but where the 3D display further comprises stage camera and a camera for tracking the head movement of the viewer and a key object placed on the stage
  • fig. 6b and 6c show examples of key objects to be used with the interactive 3D display in fig. 6a
  • fig. 7 is a perspective view of a three-sided 3D display
  • fig. 8 is a flow diagram explaining the method relating to the features shown in figs. 1 - 4 and 7
  • fig. 9 is a flow diagram relating to Fig. 5 and 6,
  • fig. 10 is yet another alternative flow diagram relating to an interactive 3D display according to the present invention
  • fig. 11 is a first exemplary interface for software to be used with the interactive display according to the present invention
  • fig. 12 is a second exemplary interface for software to be used with the interactive display according to the present invention
  • fig. 13 is a flow diagram showing the options during setup of the interface of the software
  • fig. 14 shows a flow diagram representing an exemplary chain of steps leading to maximized processor power available to the interactive display according to the invention
  • fig. 15 shows a flow diagram representing how the interactive display reacts to input in the form of angle data from a rotation sensor
  • fig. 16 shows a flow diagram indicating the function of the rotation sensor in fig. 15, and
  • fig. 17 shows a system according to the invention, comprising several additional presentation means for presenting audio and/or visual information.
  • the interactive 3D display according to the present invention is in the following described by way of examples. These examples are not to be regarded as limiting to the invention. E.g. the dimensions of the interactive 3D display can be significantly different from the examples shown here, and the number of sides, displayed objects etc. can likewise differ.
  • Fig. 1 shows an interactive 3D display 1 according to the present invention comprising an image source 2 above a semi-transparent mirror 3, and a base 4 opposing the image source 2.
  • the image source 2 is rectangular with a first long side 7 opposing a second long side 8; the first 7 and second 8 long sides are connected by a first short side 9 and an opposing second short side 10.
  • the image source 2 has a top surface 11 opposite an image surface 12 (not visible) .
  • the image surface 12 faces the semi transparent mirror 3, and the image source 2 and the semi transparent mirror 3 are connected along the second long side 8 of the image source 2 and a first mirror long side 13.
  • the semi-transparent mirror 3 is positioned at a 45° angle relative to the image source 2 so that the image (not seen) from the image source 2 is reflected by the semi-transparent mirror 3. If other angles are used, the geometry of the perceived 3D object changes.
  • the semi transparent mirror 3 has a second mirror long side 14 opposing the first mirror long side 13.
  • the semi transparent mirror 3 is connected to the base 4 along the second mirror long side 14 and along a first base long side 15.
  • the base also has a second base long side 16 opposing the first base long side 15.
  • a first 17 and second 18 base short side connects the first 15 and second 16 base long side.
  • the base has a resting face 19 opposing a stage face 20 containing the stage 5. In the present embodiment the stage is circular and placed centered in the stage face 20 of the base 4. This way the 3D display 1 forms a Z shape when seen in a side view.
  • the wheel 6 is positioned in the first base long side 15, which in the present embodiment is the front of the base 4.
  • Fig. 2 illustrates the rotation axes of the stage 5 and the wheel 6.
  • the stage rotation axis A1 extends perpendicularly from the center Cs of the stage 5.
  • a wheel rotation axis A2 extends perpendicularly from the center Cw of the wheel 6.
  • the axes A1 and A2 are parallel.
  • When the wheel 6 is turned, the stage 5 will also be turned in a motion corresponding to the motion of the wheel 6, i.e. speed, angle of rotation etc. of the wheel 6 correspond to speed, angle of rotation etc. of the stage 5.
  • the shortest distance D1 is equal to the distance D2 to a plane P1 perpendicular to the stage and containing the stage rotation axis A1.
  • the plane P1 is perpendicular to the paper plane of fig. 2.
  • the image source 2 is offset with a distance h with respect to the semi transparent mirror 3.
  • An offset h can be present in some displays where the specific display design requires it. Placing the stage correctly with respect to the semi-transparent mirror 3 and the offset h ensures that the perceived 3D object maintains its credibility.
  • Fig. 3 shows a viewer 21 observing a perceived object 22 and interacting with it via the wheel 6 and thus the stage 5.
  • the image source 2 projects an image 23 on to the semi transparent mirror 3 which creates the illusion of a perceived 3D object 22 on or over the stage 5.
  • the stage 5 turns in the same motion as the wheel 6, since the wheel 6 and stage 5 are correlated.
  • Information about the rotation of the stage 5 is transmitted to a computer 24 in the image source 2.
  • Said computer 24 computes how the image 23 must change in order for the perceived object 22 to move together with the stage 5 and displays the changing image 23 in a continuous stream making the perceived 3D object turn smoothly with the stage 5.
  • the rotation of the stage 5 can for example be detected by means of a rotation detector 25 positioned under the center Cs of the stage 5.
  • Fig. 4 shows a viewer 21 observing a perceived object 22' over-layered on an actual object 26 and interacting with them via the wheel 6 and thus the stage 5.
  • Fig. 5 is a side view of the 3D display 1.
  • the interactive 3D display further comprises a camera 27 positioned on the first long side 7 of the image source 2.
  • This camera 27 can be used for tracking the viewer's eye motions and/or the position and angle of the viewer's head.
  • This information is sent to the computer 24, where it is processed and implemented in the calculation of the image 23.
  • This step helps ensure that the perceived 3D object 22 very accurately matches the actual object 26 no matter from what angle the viewer is looking at the perceived object 22 and actual object 26.
  • This feature is a special advantage where an actual object 26 is placed on the stage 5.
  • the interactive 3D display according to the present invention can be equipped with a stage light 28 preferably above the center of the scene.
  • This light 28 helps to enhance the illusion further, as it helps define the stage 5.
  • Possible light sources include different types of LED lights, due to their long lifetime, which limits the required maintenance. In principle any light source which can be focused on the stage area is useable.
  • the rotation of the stage is detected by a rotation sensor 25 located at the side of the stage 5.
  • Fig. 6a shows a 3D display according to the present invention where the basic elements are all known from the previous figures, but in the present case the 3D display is also equipped with a center camera 29 monitoring the stage 5.
  • the camera can be used instead of, or as a supplement to, other means for detecting the motion of the stage 5.
  • If a key object 30 is placed on the stage 5, the camera 29 and image recognition software in the computer 24 can be used to keep track of the direction and rotation speed of the stage 5.
  • Fig. 6b and 6c show two different types of possible key objects 30b, 30c to be placed on the stage 5.
  • the key objects 30b, 30c can be three-dimensional or they can be flat as illustrated here.
  • FIG. 7 illustrates a 3D display 1 with more semi transparent mirrors 3a, 3b, 3c.
  • the image face 12 of the image source 2 is divided into three subdivisions 12a, 12b, 12c (not shown), each projecting an image 23a, 23b, 23c onto the semi-transparent mirror 3a, 3b, 3c underneath said subdivisions 12a, 12b, 12c.
  • This way the display can be accessed from three different sides. This allows the same object to be viewed by more people at the same time and e.g. also the same actual object to be seen with three different over-layers, in which case each subdivision 12a, 12b, 12c projects an image 23a, 23b, 23c different from the others.
  • the same shoe can be seen having e.g. a floral print, dots and fur surface depending on what side of the 3D display 1 it is viewed from.
  • the different embodiments are exemplary and where appropriate the many features can be combined.
  • the 3D display with a center camera 29 shown in Fig. 6 can also include stage light 28 and/or the front camera 27 can be left out.
  • the embodiments in Figs. 1-4 and 6 can also be equipped with stage light, stage camera 29 and/or front camera 27.
  • the majority of the embodiments are shown with a rotation detector 25 under the center of the stage 5, but the chosen means for detecting the motion of the stage could as well be a mechanical or optic sensor 25 placed as illustrated in Fig. 5.
  • all of the features of the described embodiments can be combined as long as they are not contradictory.
  • Fig. 8 shows a flow chart of the basic functionality of the interactive 3D display 1 described in Fig. 1 to Fig. 7.
  • a viewer can observe the perceived 3D object 22 on or over the stage 5.
  • the viewer rotates the wheel 6 on the front of the interactive 3D display 1.
  • This makes the stage 5 rotate in a movement correlated with the movement of the wheel 6.
  • the rotation of the stage 5 is detected by a sensor 25 and the information hereof is transferred to a computer 24.
  • the computer 24 registers the rotation data and computes a new image/animation to be displayed on the image source 2 and reflected on the semi-transparent mirror 3 to create the illusion of a perceived 3D object 22 moving together with the stage 5 in a motion induced by the user's rotation of the wheel 6.
  • Fig. 9 shows a flow chart of the basic functionality of the interactive 3D display 1 described in Fig. 4 to Fig. 6.
  • the basic flow in the flow chart of Fig. 9 is similar to that of Fig. 8.
  • Fig. 9 deviates from Fig. 8 in that the interactive 3D display 1 further comprises a front camera 27 for tracking the viewer's eye/head movement.
  • the front camera data is transferred to the computer 24, where it is processed and used to ensure that the image/animation created by the computer 24 and displayed on the image source 2 is correct with respect to the viewer's angle of observation. This is especially useful if an actual object 26 is placed on the stage 5.
  • Fig. 10 shows a flow chart in which the viewer turns the wheel 6, causing the stage 5 to turn and at the same time transmitting data on the rotation of the wheel 6 (and thus also of the stage 5) to the computer 24.
  • the computer 24 computes the input from the wheel 6 and on the basis hereof displays a picture/animation on the image source 2 and thus creates a perceived 3D object on or over the stage 5, which moves in a motion correlated with the stage 5 and wheel 6.
  • the software according to the present invention consists basically of an Editor and a Player.
  • the Editor is arranged for creating and executing media presentations
  • the Player is arranged for showing media presentations on at least an image source as in the interactive 3D display or e.g. a screen with or without a sound source.
  • FIG 11 shows a simplified exemplary interface for the video/presentation editing tool, i.e. the Editor, which may be used together with the present invention.
  • the user has chosen to have six different possible actions: photos (in a frame stack), text 1, text 2, video, 3D animations and audio, to be executed on the at least one image source and audio means.
  • the actions are here represented in the interface by six vertical slots 30, 31a, 31b, 32, 33, 34. Each slot is divided into a first 35, second 36 and third section 37, and each section 35, 36, 37 is subdivided into a number of subsections 38.
  • the slots 30, 31a, 31b, 32, 33, 34 represent the angular position of a turnable stage in an interactive 3D display according to the present invention (not shown), so that each section 35, 36, 37 corresponds to 360° (i.e. a full revolution of the stage).
  • Each subsection 38 corresponds to the step size/resolution chosen for the turnable stage, so that if the position of the turnable stage is given in whole degrees each subsection 38 corresponds to 1°, and if the position of the turnable stage is given in 1/10 degrees each subsection 38 is 0.1°.
  • the three sections 35, 36, 37 correspond to a first, second and third revolution of the turnable stage.
  • the first subsection 38a corresponds to the position 0° of the turnable stage and, to the far right, the last subsection 38b corresponds to 3x360°, i.e. the position where the turnable stage has been revolved three times, or a total of 1080°.
  • the user has chosen to execute a number of photos 39 stored in a frame stack during the first and third revolution of the turnable stage (indicated by the hatching of the first section 35 and third section 37 of slot 30 corresponding to photos) .
  • the photos will be partly accompanied by first 40 and second 41 text (indicated by partial hatching of the third section of the third 32 and fourth slot 33) and audio output (indicated by partial hatching of the third section 37 of the fifth slot 34) .
  • a 3D animation is accompanied by text.
  • the photos 39 displayed during the first revolution will be displayed on the image source using a first screen configuration 40.
  • video, first 41 and second text 42 will be displayed on the image source in a second screen configuration 43.
  • video and text will be displayed using a third screen configuration (not shown) .
  • the user can add different screen configurations via the software, add or remove output, and change the unit of the subdivision 38 in one or more of the sections 35, 36, 37; it is also possible to add or remove sections, e.g. add a section corresponding to a fourth, fifth etc. revolution of the turnable stage.
  • the presentation may be saved in an executable file stored on computing means, e.g. a PC, Mac, Unix machine, or even a flash storage or similar.
  • the perceived 3D object in this example could e.g. be a person performing a golf swing.
  • the viewer will see, as the perceived 3D object, a 360° view of a golfer performing an imperfect swing with a golf club.
  • the 360° view being created by the display on the image source of a series of photos e.g. one photo for each 0.5° turn of the stage.
  • a 3D animation displayed on the image source creates, as the perceived 3D object, a 3D model of the corrections needed to optimize the swing, along with explaining first and second text next to the perceived 3D object.
  • a 360° film (created from photos in a frame stack as above) of the golfer now performing the perfect swing, along with first and second text, will be displayed on the image source to create a perceived 3D object, i.e. a perceived golfer making a perfect swing with text "floating" next to him in relevant positions, e.g. with information on foot position and shoulder movement, while audio of a cheering crowd may be heard.
  • Fig. 12 shows a circular viewmaster 44, i.e. an example of the user interface of the software according to the present invention.
  • the key content of the viewmaster is known from fig. 11 and the circular viewmaster is an alternative to the linear viewmaster of fig. 11.
  • the circular viewmaster 44 in fig. 12 contains three circular slots 45, 46, 47 representing the angular position of the rotation device, e.g. jog wheel or rotatable stage, at a given time. If the software is set up for one rotation, each slot 45, 46, 47 represents 360 degrees, and if e.g. the software is set up for a presentation being executed over five revolutions, each slot 45, 46, 47 represents 1800 degrees.
  • the media elements i.e. the image/video/sound etc.
  • the information field 49 contains information about the current angular position 50, e.g. in this case 2505.06 degrees, which means that the angular encoder has rotated 360 degrees almost seven times.
  • When the angular position passes 7x360°, i.e. 2520 degrees, the rotation count 51 will change to eight, indicating that the angular encoder is now on the eighth revolution.
  • the one or more subsections, e.g. frames in a frame stack, currently displayed are indicated by a number 51.
  • the software according to the invention comprises a number of different options during setup of a presentation, in order to meet the unique requirements for designing a presentation utilizing the user interface shown in fig. 11 or 12 together with a number of menus. Said options are included to meet the requirements of different companies, business entities, or organizations.
  • the software of the invention provides the user with a number of different and individually selectable options during setup of a presentation, while simultaneously providing feedback information which is simple and easy to understand.
  • In step 52 the user first connects the rotation sensor (in the following specifically an angular encoder) to the system.
  • Said angular encoder preferably reads, as described above, the position of a jog wheel or a similar device, and yields an angle in degrees as output, e.g. via a USB port.
  • In step 53 the desired photos stored in a frame stack are loaded to the computer. This can e.g. be achieved by loading image files with transparency, e.g. PNGs with an alpha channel, from persistent storage into RAM.
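  • A possible realization of this loading step, assuming the Pillow imaging library (the directory layout is hypothetical):

```python
from pathlib import Path
from PIL import Image

def load_frame_stack(directory):
    """Load the frame-stack PNGs (with alpha channel) from persistent
    storage into RAM, ordered by filename."""
    frames = []
    for path in sorted(Path(directory).glob("*.png")):
        img = Image.open(path).convert("RGBA")  # keep the transparency
        img.load()                              # force decode into RAM
        frames.append(img)
    return frames
```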
  • The rotation count is the number of revolutions of the input device, e.g. Z-screen or jog wheel, over which a presentation is stretched.
  • "Degrees" is here the angle used to determine which media to composite and present.
  • the frame stack is configured by setting the x, y, z position and scale of the frame stack images.
  • The frame stack is composited with other media elements in 3D space, i.e. all media elements are rendered in 3D to produce an output in which frame stack images and video are rendered in front of or behind one another.
  • a video can be placed in front of a frame stack image, behind a frame stack image, or can transition from front to back as the Degree input changes.
  • the desired video(s) are selected in step 56 from the persistent storage (with or without an audio track).
  • the specific Degree range(s) the video is to be played within are then set. Additionally, video scale, 3-space coordinates, opacity and volume are set.
  • Key Degree ranges can be set that change any of these parameters when Degrees falls into the range. Additionally a "rotation lock" can be toggled. Rotation lock causes the video's 3-space placement to be controlled by the current Degree, independently of Degree ranges and Key Degrees, e.g. by moving about in a circle in the X, Z plane, matching the encoder's rotation.
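  • The circular placement implied by rotation lock amounts to simple trigonometry; a sketch in Python (the radius parameter is illustrative):

```python
import math

def rotation_locked_position(degrees, radius, y=0.0):
    """With rotation lock toggled on, the video's 3-space placement
    follows the current Degree directly, moving in a circle in the
    X, Z plane that matches the encoder's rotation."""
    theta = math.radians(degrees)
    return (radius * math.cos(theta), y, radius * math.sin(theta))
```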
  • the desired audio track(s), if any, are selected in step 57 from the persistent storage, and the Degree ranges the audio is to be played within are set.
  • the input and/or output can then optionally be configured in step 58, e.g. by mirroring the output to the screen if the output is to the image source in the Z-screen, in which case a mirrored image is needed due to the reflection in the mirror, advantageously effected by toggling.
  • Setting damping for the encoder can also configure the input/output.
  • Damping can include filtering and prediction of angular encoder output.
  • In step 59 it is configured what to do when no change in Degree is registered over a period of time. This includes setting the angular encoder idle timeout and, for example, a video to be played.
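  • One conceivable form of the damping of step 58 and the idle detection of step 59, sketched in Python; the smoothing factor, vibration deadband and timeout values are illustrative, not from the patent:

```python
import time

class DampedEncoder:
    """Exponential smoothing of angular-encoder readings plus idle
    detection (readings are assumed to be cumulative degrees)."""

    def __init__(self, alpha=0.3, deadband=0.05, idle_timeout=30.0):
        self.alpha = alpha                 # smoothing factor
        self.deadband = deadband           # ignore hand/floor vibration below this
        self.idle_timeout = idle_timeout   # seconds before the idle action triggers
        self.smoothed = None
        self.last_change = time.monotonic()

    def update(self, raw_deg):
        if self.smoothed is None:
            self.smoothed = raw_deg
        elif abs(raw_deg - self.smoothed) > self.deadband:
            self.smoothed += self.alpha * (raw_deg - self.smoothed)
            self.last_change = time.monotonic()
        return self.smoothed

    def is_idle(self):
        """True when no change in Degree has been registered for the
        configured period, e.g. to trigger an idle video."""
        return time.monotonic() - self.last_change > self.idle_timeout
```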
  • the camera 3D-space position is specified and set in step 60, including direction scale and field of view (FOV) . This affects how the elements in the scene are presented (rendered) .
  • the user can in step 61 optionally use the angular encoder (e.g. by turning the jog wheel) to test the current presentation setup.
  • different images from the frame stack are shown.
  • when the Degrees value falls within a Degree or Key Degree range, video and sound will play, placed in, or moving through, the configured location. If one or more of the selected options does not correspond to the desired output, the user can easily alter, amend or change settings in order to achieve the desired presentation.
  • the user can save said presentation in step 62 to persistent storage, so that the file can be accessed at a later time for editing by the Editor or executed by the Player to present the media presentation to an end user.
  • the Player is, as described above, arranged for showing media presentations created by the Editor on at least an image source, as in the interactive 3D display, or e.g. a screen, with or without a sound source.
  • the Player runs on a device, e.g. a computer which receives input from an angular encoder.
  • the Player may require significant processor time and thus it can be an advantage if the Player is able to suppress other activity on the device, in order to be able to run the file giving a smooth and "tight" experience to the user.
  • Fig. 14 shows a flow diagram representing an exemplary chain of steps/events leading to maximized processor power being available to the Player.
  • the steps are listed in a preferred order, but the sequence of said steps can be arbitrarily amended or modified according to the user's demands and requests, as well as additional preferred steps can be included.
  • the user starts the Player software in step 63.
  • step 64 the screen saver is disabled to ensure that a screen saver does not interrupt a presentation even if there is no new input for a prolonged time.
  • step 65 can for example include disabling of screen and PC power management.
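On a Windows-based Player device, steps 64 and 65 could for example be approximated as sketched below; the use of SetThreadExecutionState is one possible mechanism and the function names are assumptions:

    import ctypes

    # Windows execution-state flags (documented Win32 constants)
    ES_CONTINUOUS = 0x80000000
    ES_SYSTEM_REQUIRED = 0x00000001
    ES_DISPLAY_REQUIRED = 0x00000002

    def suppress_screen_saver_and_power_management():
        # Tell Windows the display and system are in active use, so neither
        # the screen saver nor power management interrupts the presentation.
        ctypes.windll.kernel32.SetThreadExecutionState(
            ES_CONTINUOUS | ES_SYSTEM_REQUIRED | ES_DISPLAY_REQUIRED)

    def restore_power_management():
        # Corresponds to steps 68 and 69: hand the settings back to the system.
        ctypes.windll.kernel32.SetThreadExecutionState(ES_CONTINUOUS)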
  • step 66 some continually running programs notify users of actions they should take, to e.g. maintain the system they are working on. Some of these programs may steal focus from the Player, thereby interfering with the presentation and user experience. Known programs may be disabled or silenced here.
  • step 67 the Player is running and the media presentation is executed on the image source and e.g. sound source until the user shuts down the Player.
  • Steps 68, 69, and 70 restore all or some of the settings and programs which have been changed or disabled in steps 64, 65, and 66, depending on preset values or user choice.
  • the Player is terminated in step 71.
  • Fig. 15 shows a flow diagram representing an exemplary chain of events describing how the Player reacts to input in form of angle data from a rotation sensor such as an angular encoder, and how all media events, i.e. execution of video, frames in a frame stack, and sound, are coupled to predefined input angles.
  • in step 72 the Player receives input from an angular encoder.
  • Input can be analog, digital, in degrees, in radians, normalized as [0:1[, or unprocessed sensor values, via any bus such as USB, I2C, FireWire, Bluetooth, WiFi, TCP/IP, UDP etc.
  • in step 73 interaction with a user is a feedback loop.
  • a computer processes the data from here and presents an image to the user, who then reacts to the image.
  • This loop contains a delay, resulting from the encoder, computer, screen and user. If this delay is too large the user experiences lag, which disrupts the experience. Prediction of the angle the user is to be presented with is factored in here to minimize perceived lag.
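Such a prediction could, purely as a sketch, be a linear extrapolation of the encoder angle over an assumed pipeline latency; the class name and the default latency value below are illustrative assumptions:

    import time

    class AnglePredictor:
        def __init__(self, latency_s=0.05):  # assumed end-to-end delay
            self.latency_s = latency_s
            self.prev_angle = None
            self.prev_time = None

        def predict(self, angle_deg):
            # Extrapolate the angle forward by the pipeline latency using
            # the velocity estimated from the two most recent measurements.
            now = time.monotonic()
            if self.prev_angle is None:
                self.prev_angle, self.prev_time = angle_deg, now
                return angle_deg
            dt = max(now - self.prev_time, 1e-6)
            # shortest signed difference, so the 0/360 wrap-around does not
            # produce a huge spurious velocity
            delta = (angle_deg - self.prev_angle + 180.0) % 360.0 - 180.0
            self.prev_angle, self.prev_time = angle_deg, now
            return angle_deg + (delta / dt) * self.latency_s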
  • Noise from the angular encoder, jitter in a user's muscles etc. can advantageously be filtered out to give the user a smooth experience when using the rotation device attached to the angular encoder.
  • Said step is shown in step 74.
  • the amount of filtering, known as dampening in this context, can be controlled from the Player. Filtering methods can be averaging, Kalman filtering, Alpha-Beta, FIR, IIR, Notch, etc.
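A minimal damping sketch, here a simple exponential smoothing rather than any particular one of the above filters, could look as follows; the smoothing factor and the function name are assumptions:

    def damp_angle(prev_deg, new_deg, alpha=0.3):
        # Exponential smoothing of the encoder angle; alpha in (0, 1]
        # trades smoothness against responsiveness. The shortest signed
        # difference handles the 0/360 wrap-around of the encoder.
        diff = (new_deg - prev_deg + 180.0) % 360.0 - 180.0
        return (prev_deg + alpha * diff) % 360.0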
  • the processed output of the angular encoder value can be assigned to a variable, e.g. "newMeasuredDegree".
  • the steps 76, 77, 78, and 79 keep track of how many complete revolutions the angular encoder has performed since program start in a variable named "rotations".
  • the angular encoder may report 0.1 on one measurement and 359.9 on the next. This implies that only a 0.2 degree change is present between the two measurements, not 359.8 degrees (assuming proper use and operation), and that the cumulative rotation count should be decreased by 1.
  • in step 80 the variable "newDegree" is the value used to check against the activation ranges for execution of the media elements. "newDegree" is the number of all performed rotations multiplied by 360 plus the current rotation angle "newMeasuredDegree". The "newDegree" value cannot exceed the "rotationCount", given in the Player, multiplied by 360.
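Steps 76 to 80 could, as a hedged sketch under assumed variable and function names, be expressed like this:

    def update_rotation_state(new_measured_degree, last_measured_degree,
                              rotations, rotation_count):
        # A jump of more than 180 degrees between two measurements is read
        # as a wrap-around of the encoder, not as an actual large movement.
        if new_measured_degree - last_measured_degree > 180.0:
            rotations -= 1  # wrapped backwards past 0 (e.g. 0.1 -> 359.9)
        elif last_measured_degree - new_measured_degree > 180.0:
            rotations += 1  # wrapped forwards past 360 (e.g. 359.9 -> 0.1)
        new_degree = rotations * 360.0 + new_measured_degree
        # clamp to the presentation length configured in the Player
        new_degree = min(new_degree, rotation_count * 360.0)
        return new_degree, rotations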
  • in steps 81, 82, 83, and 84 the "newDegree" is then used to activate or deactivate media elements as described in relation to fig. 11 in the present application.
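The activation test itself is a simple range check; a sketch with an assumed representation of the media elements follows, using the video and frame stack intervals described elsewhere in this application as the example:

    def active_media(new_degree, media_elements):
        # Select the media elements whose configured Degree range contains
        # the current cumulative angle (cf. steps 81-84).
        return [m for m in media_elements
                if m["start_degree"] <= new_degree <= m["end_degree"]]

    # Example: a video active from 10 to 420 degrees and a frame stack
    # active from 30 to 90 degrees are both returned at 80 degrees.
    elements = [{"name": "video", "start_degree": 10.0, "end_degree": 420.0},
                {"name": "frames", "start_degree": 30.0, "end_degree": 90.0}]
    assert len(active_media(80.0, elements)) == 2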
  • in step 85 it is shown that the media element continues playing until a new value is received from the angular encoder, or the user shuts down the application.
  • the rotation sensor/angular encoder is described in more detail in the flow diagram in fig. 16.
  • Detection of the rotation device angle can be done in many ways, e.g. by counting notches in the rotation device optically or mechanically, by measuring change in a magnetic field by a magnet connected to the turnable stage, or by any other means whereby an accurate rotation from a predefined orientation can be determined.
  • an angular encoder is used as an example and is schematically indicated with reference number 85.
  • in step 86 it is tested if the angle of rotation of the rotation device has changed. This test is subject to filtering of the input signal, noise, resolution and other factors that can affect a measured angle.
  • Step 88 shows output of the angle of the rotation device measured by the angular encoder.
  • Output of the value obtained from the angular encoder can be analog, digital, processed, unprocessed, converted to an angle, radian or unconverted raw sensor values.
  • the output signal can be passed on via USB, I2C, Firewire, optical, radio etc.
  • an angle signal 89 from an angular encoder 90 in a turntable or rotatable stage may be used to synchronize several units 91, 92, 93, 94, comprising the Player according to the present invention.
  • an interactive 3D display 91, two computers with screens 92, 93 and a computer with a sound source 94 may be activated simultaneously if they all contain a Player running a file wherein an event, e.g. onset of display of images in a frame stack and sound, is triggered at a specific input angle e.g. 32 degrees.
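Such synchronization could for instance be realized by broadcasting the angle signal over a network; UDP is mentioned above as one possible transport, but the port and packing format below are assumptions, not part of the disclosure:

    import socket
    import struct

    BROADCAST_ADDR = ("255.255.255.255", 50000)  # assumed port

    def broadcast_angle(angle_deg):
        # One unit reads the angular encoder and broadcasts the angle.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(struct.pack("<f", angle_deg), BROADCAST_ADDR)

    def receive_angle(sock):
        # Each Player listens on a bound socket and triggers its own media
        # events when the received angle enters a configured range,
        # e.g. 32 degrees.
        data, _ = sock.recvfrom(4)
        return struct.unpack("<f", data)[0]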
  • the example of a software description is only exemplary and is added to illustrate possibilities of the software according to the present invention.
  • the limitations of the description must not be construed as limiting the software in general; e.g. a video editing/player tool using open source elements can well be imagined within the present invention, and a video editor/player developed for Mac, Linux or other non-PC platforms is likewise within the scope of the present invention.
  • Said limitations are related to each project e.g. due to financial considerations or copyright and not due to the scope of the present invention.
  • the same is true for the use of programming language.
  • the video editor/video player may be programmed in MS .NET as well as in any other high- or low-level language, as long as the resulting video editor/video player complies with requirements on e.g. speed, file size etc.
  • the Editor/Player software described above may be used with the exemplary setups of fig. 1 to 11 as well as other setups according to the present invention.

Abstract

Interactive 3D display (1) comprising at least one first image source (2) projecting at least one image onto at least one semi-transparent mirror (3) thereby creating a perceived 3D object (22) in a space defined by the at least one semi-transparent mirror (3) and a base (4) by use of Pepper's Ghost technique. The base (4) comprises a moveable stage (5) and the interactive 3D display further comprises means (6) for manipulating the stage and means for correlating (25, 24) the motion of the stage and the perceived 3D object.

Description

An interactive 3D display, a method for obtaining a perceived 3D object in the display and use of the interactive 3D display.
Interactive 3D display comprising at least one first image source projecting at least one image onto at least one semi-transparent mirror thereby creating a perceived 3D object in a space defined by the at least one semi-transparent mirror and a base by use of Pepper's Ghost technique. The invention further concerns a method of obtaining and interacting with the perceived 3D object and a specific use of the interactive 3D display.
3D illusions have been widely used for decades, for example to create special effects in theaters and exhibitions. One of the best-known techniques is Pepper's Ghost technique. The Pepper's Ghost technique has been described many times; a recent and easy-to-understand description is given by Julien Clinton Sprott in Physics Demonstrations, page 255. Said technique is simple to use, and when used by a person with insight into geometry and optics, the technique can create impressive three-dimensional illusions of objects seeming to appear in free air like a ghost. Recently this technique has been used to create perceived 3D objects in small-scale displays as described in EP 1846798. This display allows a viewer to observe a perceived 3D object from several angles, but the viewer remains passive with respect to the perceived 3D object. This means e.g. that in order to see the perceived 3D object from another angle the viewer must move around the 3D display. An interactive display using Pepper's Ghost technique is disclosed in US 5,405,152. In this application a computer game is played using a number of positioning devices for interacting with a computer game visualized partly by use of Pepper's Ghost technique on a display. In the game the players must steer a space vessel to avoid collisions with other players and objects representing planets and meteor showers. To do so, the player stands on the positioning device, which reacts to the player's shifting weight. Thus the positioning devices and the method of interaction between the positioning devices and the games are quite complex and only suited for this very specific game. Due to the complexity of the interaction, and the method of interaction, it takes time getting familiar with the controls, and the interaction requires the user's concentration to perform. All in all this leaves the interactive display with the matching positioning devices unsuited for use with other games than the one described in US 5,405,152. It is also not suited for allowing a user to interact with other types of information, for example to gain information about details of a displayed perceived 3D object such as a "virtual" shoe on display in a shoe shop.
As the Pepper's Ghost technique is based on optical illusions, it is critical to maintain the 3D feeling of the perceived 3D object and to avoid that it suddenly becomes flat to the observer or otherwise loses its credibility. This, the 3D realization, is a critical point no matter what type of 3D display it is, as long as it is using the Pepper's Ghost technique, but it becomes increasingly more important to take into account if the 3D display allows the viewer to move the perceived 3D object within the interactive 3D display.
Commercials and shop displays are becoming increasingly more advanced and there is a continuous competition to achieve the attention of customers. Also museums are experiencing rising demands to offer alternative exhibitions and new ways for the visitors to obtain advanced information intuitively and visually. One of the consequences hereof is that it in some situations can be preferable if the viewer can interact with the perceived 3D object, for example be able to rotate or tilt the perceived 3D object to facilitate a more detailed inspection of said 3D object. Thus there is a need for a simple yet effective 3D display with convincing 3D realization/illusion, which can be used without prior training even by persons without any computer skills and in which the method of interaction does not take up too large a portion of the viewer's/user's concentration. Such a new interactive 3D display must enable the viewer/user to obtain information about e.g. a new sports car on virtual display in a 3D display at a car dealer or about the advanced structure of proteins in a museum or in a laboratory, in a manner where the viewer can concentrate on obtaining the information and not on performing the interaction with the perceived 3D object displayed. I.e. it must provide simpler alternatives to the known controls such as joy pads and computer mice for moving 3D objects on a screen or in a 3D display. There is also a need for an interactive 3D display which is easy to use by someone "just passing by" in a busy environment.
In a first aspect according to the present invention is provided an interactive 3D display with an improved 3D realization.
In a second aspect according to the present invention is provided an interactive 3D display, which provides a combination of an actual object of choice and a perceived 3D object.
In a third aspect according to the present invention is provided an interactive 3D display with an intuitive interaction between viewer and perceived 3D object.
In a fourth aspect according to the present invention is provided an interactive 3D display, which allows for a precise coordination of an actual object and a perceived 3D object when the viewer interacts with the 3D display. In the present application the term "animate" is used to describe the change in an image over time, e.g. a visualization of the rotation of a 3D object and/or the visualization of the movement of parts of an object, as for example a car with doors opening and closing etc. The animation can be calculated in real time based on input to the device executing the animation or performed by accessing a database with pre-rendered images.
In the present document the term "compute" is used in the general sense of using a computer. It can be for calculations, execution of specially designed source code, execution of commercial software and database queries. It can be dependent on input from one or more devices or only rely on pre-registered data.
In the present document the term "interact" means to rotate, tilt, translate or otherwise move the perceived 3D object. It also covers e.g. changing colors on the perceived 3D object and scrolling through text related to the displayed perceived 3D object.
The novel and unique way the above and further aspects are obtained is that the base comprises a moveable stage and that the interactive 3D display further comprises means for manipulating the stage and means for correlating the motion of the stage and the perceived 3D object. This offers a unique range of possibilities for presenting the perceived 3D objects, making it possible for the viewer to interact with the perceived 3D object, e.g. to tilt or rotate the perceived object to be able to see it from a preferred angle. When the viewer moves the stage and the stage is correlated with the perceived 3D object, the perceived 3D object will automatically follow the motion of the stage, whereby the optical illusion is enhanced, compared to an interactive 3D display without the movable stage, and becomes very effective. One reason for this enhancement of the 3D illusion is that the stage helps define the room and the depth in the perceived 3D object. The movement of the stage together with the perceived 3D object will thus enhance the 3D feeling of the perceived object by clearly defining a space with depth in which the perceived 3D object can exist, and it is therefore possible, using the 3D display according to the invention, to create images of a definition and a scale never before possible, resulting in a much more realistic look than anything previously achieved using a more conventional Pepper's Ghost setup. Such an enhanced 3D illusion broadens the possibilities of use, both as the 3D display can be used under a wider spectrum of conditions and due to the fact that it is possible to display more challenging perceived 3D objects, which in a 3D display of the types known in the art would be at risk of not convincingly seeming to be three-dimensional but instead appearing flat. As will be described later, the interactive 3D display according to the present invention can be used in many contexts, from an easy to access shop display to advanced learning tools. The motion of the stage can be rotation, tilting, and/or movement along different axes depending on what perceived 3D object is to be displayed in the interactive 3D display. If for instance a crystal structure of a complex salt is to be displayed as the perceived 3D object, it is preferred to use a tiltable and rotatable stage in order to be able to see the object from as many angles as possible while still following the motion of the stage. If more simple objects are to be displayed, a stage with a simple rotational motion can be chosen. The rotation axis can easily be chosen by the person skilled in the art based on the information given in relation to the present invention.
In one embodiment the image source can be a 50-200 Hz LCD screen, as this can provide a screen of good quality at a reasonable price. Another advantageous option is to use an OLED screen due to the improved contrast, but in principle any screen is applicable. Frame rates can advantageously be from 20 frames pr. second and up, e.g. 30 frames pr. second, in order to achieve a smooth movement of the perceived 3D object. Alternatively the screen can be a 3D auto-stereoscopic screen, as this provides a way of displaying stereoscopic images without the use of special headgear or glasses on the part of the viewer.
In order for the perceived object to appear as clear and sharp as possible the reflecting side of the semi transparent mirror must face the image source. It also enhances the illusion that the reflection is higher than the transmission through the semi transparent mirror, as the reflected image in this case is clear and the perceived 3D object thus appears more vivid than if the reflection is sparse. Exemplary data for a semi transparent mirror, chosen to ensure a luminous perceived 3D object, can be
Spectrum range 420-680 nm
Transmission (T%) TAvg 15 +/- 5%
Reflection (R%) RAvg 60 +/- 5%
The term "semi-transparent mirror" within the present document is meant to encompass any kind of reflective and semi-transparent screen capable of creating a Pepper's Ghost illusion. Thus the term also comprises reflective and semi-transparent membranes and foils made of e.g. a polymeric composite.
It is preferred that the reflection layer of the "semi-transparent mirror" is the outer layer facing the image source, as it is then possible using the 3D display according to the invention to create images of a definition and a scale never before possible, resulting in a much more realistic look than anything previously achieved using a conventional Pepper's Ghost setup. Alternatively the semi-transparent mirror can comprise a thin protective layer covering the reflection layer, as this will provide the mirror with a larger resistance. Said protective layer is preferably less than 400 nm, more preferably less than 200 μm, and even more preferably less than
In a preferred embodiment the stage is rotative around a stage rotation axis perpendicular to the stage. If the rotation around the stage rotation axis is the only possible movement of the stage, the interactive 3D display according to the invention has a number of advantages. Obviously it limits the movement of the stage significantly, compared to a stage having the possibility of a multi axial movement, which ensures that the correlation between the stage and the perceived 3D object can be performed very accurately even on smaller computers, which are cheap and can be implemented in even quite light display structures, and thus facilitates simple and inexpensive displays. At the same time the limited movement of the stage, and thus of the perceived object, around a rotation axis perpendicular to the stage, can be an advantage for the viewer. As it requires a large degree of concentration to follow and obtain information from an object moved in several dimensions and/or along different axes, this simple rotation of the stage enables the viewer to intuitively perform the movement of the stage, by use of the means for manipulating the stage, and at the same time obtain information from the interactive 3D display. This way the limited movement creates a possibility for offering the interactivity to the viewer, i.e. offering the viewer the possibility to rotate the perceived 3D object, but keeping it at a level which is simple and easy to perform without prior training or instructions. Such a preferred embodiment is therefore convenient to use e.g. in busy locations such as supermarkets or museums where the viewer cannot be expected to have a quiet moment to concentrate or to have the time to learn how to navigate and decode more advanced movements and information. The means for manipulating the stage may be at least one wheel or wheel section rotative around an axis A2 parallel to the stage rotation axis. This setup makes the interaction intuitive, as the rotational movement of the wheel corresponds to the rotational movement of the stage, and requires a minimum of instructions to the viewer, thus offering an instantly clear and understandable interaction. It is also possible that the stage is remote controlled, in which case the stage may be equipped with a servo motor rotating the stage upon input from the means for manipulating the stage. If the stage can tilt as well as or instead of rotate around the stage rotation axis, means for providing this movement can be provided. Such means for manipulating the stage can be integrated with the wheel, which e.g. can be tilted upwards and/or downwards to control tilting of the stage, or they can be one or more separate buttons/wheels etc. depending on the motion.
When the movement of the stage and movement of the means for manipulating the stage are directly linked, e.g. by a gearing, preferably a 1:1 gearing, or a driving belt between the means for manipulating the stage and the stage, the interaction becomes even more intuitive, as a certain motion of the wheel preferably corresponds to the same motion of the stage and thereby of the perceived 3D object. In modern digital displays, means like a computer mouse or a joy pad are used for interaction with moving 3D objects on e.g. a computer screen. But for small children and people not used to working with e.g. a computer mouse, the hand-eye coordination required to use even an ordinary computer mouse for manipulating an object on e.g. a screen or in a 3D display can be very hard to get a grasp of. When a wheel, or other rotation device, is used in the 3D display according to the invention the interaction becomes very intuitive, as the user can simply turn the wheel manually and thereby the perceived 3D object in the desired direction. The movement of the wheel will result in a similar movement of the stage and the perceived 3D object. Thereby the perceived 3D object can easily be viewed from different angles without the user physically having to move. To obtain a simple but yet fast and effective way of performing the correlation between the motion of the stage and the perceived 3D object, the means for obtaining said correlation may at least comprise:
- a rotation sensor for detecting the rotation of the stage,
- a transmitter for transmitting a signal representing the rotation of the stage from the rotation sensor to a computer,
- software running on the computer receiving input in form of the signal from the rotation sensor and on the basis of said signal animating the at least one image projected by the image source.
Based on the signal from the rotation sensor the computer computes how the image displayed on the image surface of the image source must change in order for the perceived object to move together with the stage, and displays the changing image on the image surface in a continuous stream making the perceived 3D object turn smoothly with the stage. The rotation sensor can in a preferred embodiment be a simple mechanical device located under the center of the stage. From here the data concerning the rotation of the stage, i.e. direction and/or speed, can be transmitted to the computer via cables or via wireless techniques known in the art. This solution is very simple, but also very effective as it is cheap, easy to implement and does not require detailed calibration every time the display is set up for a new use. Furthermore the equipment can be quite sturdy and thus fulfils the requirements for use with a 3D display which is to be handled by many untrained people in public space.
The rotation sensor can alternatively be located at the edge of the stage. It can e.g. be of the type which optically detects a pattern on the edge of the stage and from this pattern derives speed and direction of the stage, or it can be mechanically linked to the stage and measure the stage rotation by means of a simple gearing or similar. Data transfer can preferably occur as described above.
The rotation sensor for detecting the rotation of the stage can preferably measure a rotation position of the stage, such as an angle between the base of the stage and a predetermined reference point. The measured rotation position can be used to determine the orientation of the stage. In another example, the stage can include touch sensor areas located where a user typically grasps the stage to change the stage's orientation. In this case, detection of touch in these areas could be used in addition to orientation information from one or more other sensors, such as accelerometers, position sensors, etc., to aid in the detection of a change in rotation of the stage. As one skilled in the art would understand in light of the present description, other types of sensors may be used, and different types of sensors may be used in combination, to sense the orientation of the stage. In one preferred embodiment of the invention the stage and the rotation sensor are arranged as a jog or shuttle wheel, i.e. a device which allows the user to shuttle or jog through a frame stack of an audio and/or video media. "Jog" refers to going at a slow speed, whereas "shuttle" refers to a fast speed. Preferably the rotation sensor is an angular encoder, which may be optical, mechanical or magnetic.
It is preferred that the used jog wheel or other rotation device has no stops and can be spun the entire way around, preferably unlimited. In said case the jog wheel depends on the tracking of the actual motion of the rotation: the faster the jog wheel spins forward or back, the faster it fast-forwards or rewinds in the frame stack of the desired media. Even though jog wheels are known in the art and can e.g. be found on handheld PDAs and as the scroll wheel on computer mice, the jog wheel according to the present invention functions with software which preferably uses angle data, in contrast to the known jog wheels.
It is also possible to obtain the correlation between stage and perceived object if the means for correlating the motion of the stage and the perceived 3D object at least comprise:
- a key object placed on the stage,
- a camera above the center of the stage monitoring the key object,
- software running on the computer receiving input in form of data from the camera identifying the key object and creating a corresponding image visualized on the at least one image source, and
- software running on the computer receiving input from the camera about movement of the key object and animating said image correlated with the movement of the stage.
A key object is in the following document considered to be any object with a characteristic shape and/or pattern recognizable by the software via the camera. It can be flat and may be recognizable by the camera/computer e.g. by its 2D outline or pattern. The user selects a key object and places it on the stage. The camera records the key object and the software recognizes the key object and displays the corresponding image on the at least one image source. When the stage is moved, e.g. by turning the wheel on the front of the 3D display, the camera records the movement of the key object and software computes this data from the camera and animates the image displayed on the image source so that the perceived 3D object on or over the stage moves in a motion correlated with the stage. The key object can for example also be a model car, a plastic animal or anything else that preferably represents the perceived 3D object it triggers to be displayed in the 3D display.
In the present application the term "actual object" is used for a physical object placed on the stage, which is to be coupled with the perceived 3D object. E.g. an actual physical running shoe, which can be supplied with a perceived fur surface by over layering of a perceived 3D object.
In the present application the term "a key object" is used for a physical object placed on the stage and which is to be recognized by the camera/computer. Following the example above, a key object can be a small model running shoe, or a picture of a shoe, placed on the stage. This small shoe (or picture) triggers a real size perceived 3D running shoe to be displayed in the interactive 3D display above the stage and key object. An actual object can be placed on the stage. The introduction of an actual object on the stage offers the possibility of mixing the perceived 3D object with an actual object and can be used in a multitude of ways. For example a running shoe can be placed on the stage and the perceived 3D object can be used to change perceived pattern, color etc. of the shoe allowing the viewer to e.g. see his/hers own customized shoes before ordering them at a store.
An actual object is, in most cases, not a key object as, in most cases the creation and animation of the perceived 3D object is not performed based on camera recordings of the actual object. However in some cases the actual object can be used as a key object as described in the following.
To be able to correlate the actual object on the stage and the perceived 3D object it is necessary either, in a first case, to place a pre-selected actual object on a specified position on the stage or, in a second case, for the interactive 3D display to be mounted with a camera registering the stage and actual objects on the stage, i.e. the actual object is also a key object.
The first case is quite simple. The computer in the 3D display contains data and software for showing one or more specified perceived 3D objects to be combined with one or more specified actual objects, e.g. ten different types of over-layers matching a specific shoe. That specific shoe (the actual object) is then placed on the stage in a position predefined to ensure a perfect match between shoe and perceived 3D object (in this case the over-layer to the shoe), and now the stage with the shoe and the correlated perceived 3D image can be moved together and displayed from different views by use of e.g. the wheel. In other words, in this simple version the actual object is placed on the stage so that it matches/overlaps the perceived 3D object. If the second more complex case with the camera is chosen, i.e. where the actual object also is a key object, the requirements on the computer are higher as the computations needed are more demanding. In this case, when an actual object is placed on the stage the camera can record the object. Based on the data from these recordings, software in the computer can recognize the object (e.g. by using a database on the computer) and also the position of the object on the stage. On the basis hereof the computer creates the image to be displayed by the image source and also performs the animations of said image needed for the perceived 3D object to follow the actual object placed on the stage when the stage is moved by the remote means. In this more complex version the perceived 3D object and the animation hereof are calculated to follow the actual object on the stage.
It is possible that the viewer or a person administrating the 3D object, e.g. a shopkeeper, can interact with the perceived 3D object via other means than the wheel. Such means can be placed directly at the computer, on an external touch screen or e.g. be integrated with the wheel for rotating the stage, such that when the wheel is pressed the color of the over-layer of the shoe, car, soda can etc. can e.g. be changed on the stage. It is also possible to upload new data to the computer if e.g. a customer has designed the paintwork for a new car and wants to be able to see it mapped onto a model of the desired car. The 3D display can also e.g. be used to illustrate the growth of an animal, in which case the wheel can be used to rotate the perceived 3D object and the wheel can be pressed to "make time pass", i.e. the wheel can be pressed to start and stop an animation of the growth. The software running on the computer is advantageously able to execute images at 55-200 Hz in order to be able to follow the motion of the stage.
The invention also relates to an interactive 3D display system comprising at least one interactive 3D display according to the invention and at least one additional presentation means for presenting audio and/or visual information. In this way the perceived object may be accompanied by additional information such as text, image sequences and/or music.
In this respect the system can comprise means for executing one or more preprogrammed actions on the at least one interactive 3D display and the at least one additional presentation means when the stage during manipulation reaches one or more predefined positions, e.g. angular positions defined in relation to a reference point when the stage is rotative around a stage rotation axis (A1) perpendicular to the stage plane.
This provides the user with the possibility of presenting multiple pieces of information at the same time, and/or providing additional information in relation to the perceived object. Similarly, manipulation of one stage of the 3D display according to the invention can control the manipulation of one or more stages of different 3D displays displaying either identical or different perceived objects.
The means for executing one or more preprogrammed actions is preferably software running on the computer. If the software running on the computer comprises a video editing tool and/or a video player, the user (i.e. a shop owner or an exhibition designer in a museum) can create his own media presentation by linking positions of the moveable stage to the execution of different actions/media elements to be displayed on the image source. This way the user can prepare a completely custom made media presentation wherein the perceived object may be accompanied by additional information such as text or image sequences. Types of software for imaging and animation can be of the type of 3D game engines, Ogre, Quest3D, Flash, 3ds Max, Maya etc., all of which are well known to the person skilled in the art, and the 3D game engines which can be used in the present invention will therefore not be described in further detail.
The perceived object may be formed by using the above mentioned video editing tool/player and having photos stored in a frame stack on the computer, enabling a viewer to scroll back and forward in a series of photos by turning the wheel or wheel section and thereby the stage back and forth. For example the photos may be high resolution photos and the frame stack contains for example 1200 pictures of an object to be presented as the perceived object according to the present invention. Each picture is assigned to an angular position of the stage. By turning the wheel/stage, the viewer can experience a unique photo realistic visual presentation of e.g. a 360° view of a perceived object or navigate through the pictures in the frame stack to experience an ultra slow motion 360° movie of a golf swing or other complex motion, and stop to see each unique high resolution picture if desired. If the wheel/stage is turned at a higher speed the experienced movie is no longer ultra slow motion.
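The assignment of pictures to angular positions can be sketched as a simple mapping from the cumulative stage angle to a frame index; the function name and the clamping behavior below are assumptions:

    def frame_for_angle(cumulative_deg, total_frames, rotation_count):
        # Map the cumulative stage angle onto an index in the frame stack,
        # e.g. 1200 photos spread evenly over rotation_count revolutions.
        presentation_deg = rotation_count * 360.0
        fraction = min(max(cumulative_deg / presentation_deg, 0.0), 1.0)
        return min(int(fraction * total_frames), total_frames - 1)

    # With 1200 pictures over one revolution, one picture corresponds to
    # 0.3 degrees of stage rotation.
    assert frame_for_angle(180.0, 1200, 1) == 600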
The photos in the frame stack can be combined with other visual or audio presentations such as real time 3D visualizations (e.g. using 3D Studio Max or Maya) triggered at one or more selected angular positions of the stage. In this respect the user can advantageously also display e.g. text information at one predetermined angle, a video at the same and/or a second angle, audio information at a third angle etc. The desired actions can be chosen depending on the kind of desired information.
As the wheel/stage preferably has no stops it can be spun the entire way around, preferably unlimited. Each full revolution of the wheel/stage, i.e. a 360° rotation, defines a rotation and the software may increase or decrease a rotation count depending on what number of rotation the wheel/stage is on. If e.g. the wheel/stage has been rotated 450 degrees the rotation count of the software is 2, as the wheel/stage is on its second rotation.
Each rotation is subdivided into a number of subsections, which defines the step size/resolution chosen for the specific turnable wheel/stage. The number of subsections can be arbitrarily chosen depending on the perceived object and information to be presented. As an example, if the position of the turnable stage is given in whole degrees each subsection corresponds to 1°, if the position of the turnable stage is given in 1/10 degrees each subsection is 0.1°, thus depending on the number of e.g. photos to be presented in the frame stack the user can choose the desired size of each subsection. The number of subsections can be the same for each rotation, i.e. for each full revolution, but the number of subsections can also be independently chosen for each rotation.
Thus, if e.g. a user has chosen to execute a number of photos stored in a frame stack over three rotations, i.e. three full revolutions, each subsection can preferably be set to 1°, so that the first subsection corresponds to the position 0° of the turnable stage and the last subsection in the presentation corresponds to 3x360°, i.e. the position where the turnable stage has been revolved three times or a total of 1080°. The user can e.g. choose to execute the number of photos in the frame stack only during the first and third revolution of the turnable stage, and can during the third revolution ensure that the photos are partly accompanied by first and second text and audio output. During the second revolution of the turnable stage a 3D animation is accompanied by text.
The photos displayed during the first revolution will preferably be displayed on the image source according to the invention using a first screen configuration. During the third revolution of the turnable stage, photos and first and second text will be displayed on the image source in a second screen configuration. During the second revolution animations and text will be displayed using a third screen configuration.
It is preferably possible to change screen configuration in the middle of a revolution, e.g. if the first 80° of a revolution of the turnable stage triggers only video output and over the following 280° a combination of e.g. texts, photos, and 3D animations is triggered at different positions of the turnable stage.
The user can add different screen configurations via the software, add or remove output, change the unit of the subdivision in one or more of the rotations, as well as add or remove rotations, e.g. add a rotation corresponding to a fourth, fifth etc. revolution of the turnable stage.
If the stage rotation axis and the center of rotation of the perceived 3D object coincide at least in one point, the inventors have shown that this ensures that the perceived 3D object and the stage move together in a way which fully utilizes the stage effect of defining the room in which the perceived 3D object is seen.
When, at any point of the semi transparent mirror, the shortest distance Dm between the image source and the semi transparent mirror equals the distance Di between the semi transparent mirror and a plane Pi perpendicular to the stage containing the stage rotation axis, the best possible illusion can be obtained. This is due to the fact that the perceived 3D object will seem to appear behind the semi transparent mirror at a distance from the semi transparent mirror equaling the distance between the semi transparent mirror and the image source. Thus in order to get the maximum effect out of the stage, and its enhancing effect on the conception of 3D space for the perceived object, the stage rotation axis is best placed in a plane as here described.
The correlation of the stage and the perceived 3D object can be "intelligent". In the example with the shoe on the stage, the interactive 3D display can be used not only to change the look of the shoe but also to give information about the shoe to the viewer. E.g. when the heel of the shoe is facing the viewer, a pop-up textbox with information about cushioning materials can appear and disappear again when the stage is turned and the heel is no longer facing the viewer. It is also possible to integrate into the 3D display means for tracking the eye position of an observer and communicating data about said eye position to software at least partly generating the at least one first image. Such tracking means enable the perceived 3D object to be compensated depending on the viewer's eye/head position and angle relative to the stage. The perceived 3D objects formed this way follow the actual object correctly when the stage, and thereby the perceived and actual object, is turned, no matter how tall the viewer is and at what angle he/she holds his/her head relative to the stage.
If the interactive 3D display is collapsible it becomes easy to transport and store. Furthermore the semi transparent mirror and other essential parts can be protected while in a collapsed state.
The invention further relates to a method of forming a perceived 3D object comprising the steps of
- obtaining an interactive 3D display as described above,
- by means of a computer displaying an image on an image source, said image being reflected in a semi transparent mirror and creating the illusion of a perceived object located above the stage of the interactive 3D display,
- moving the stage,
- detecting and recording the movement of the stage by means of a movement sensor,
- transmitting a signal from the movement sensor to the computer,
- logging and processing the signal from the movement sensor and animating the image on the basis hereof, so that the perceived object moves in a coordinated motion together with the stage.
This method gives a straightforward opportunity of effectively and intuitively offering a viewer the ability to interact with a perceived 3D object. The interactive 3D display can for example be used for real-time changing of a perceived over-layer of an actual physical object. This is a smart and effective way of e.g. illustrating the effect of different choices for the outer accessories on a car or the changes over time of the surface of the earth or the earth's atmosphere.
The invention also provides for computer software arranged, in operation with the interactive 3D display, to carry out or control every function of the apparatus and/or methods.
In particular, according to a further aspect of the present invention there is provided a program for a computer, the program comprising the steps of: receiving from a rotation sensor a signal indicative of the angle of rotation of the stage; and on the basis of said signal animating the at least one image projected by the image source.
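A self-contained sketch of such a program could look as follows; the polling rate, the encoder and renderer interfaces, and all names are assumptions made for illustration only:

    import time

    def run_player(read_degrees, render_at_angle, refresh_hz=60):
        # Poll the rotation sensor for the stage angle and animate the
        # projected image on the basis of that signal, as stated above.
        period = 1.0 / refresh_hz
        while True:
            angle = read_degrees()    # assumed sensor interface, 0-360
            render_at_angle(angle)    # assumed renderer callback
            time.sleep(period)        # roughly one update per frame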
One of the key elements of the Pepper's Ghost technique used in the interactive 3D display according to the invention is the actual content creation. This can, as described above, be done in many different conventional ways, one of which is filming real objects (people, products, etc.) in HD on green screen in order to create a video, or by creating computer-animated objects. Animated files such as phones, cars, etc. are then created as videos. The result is a very real looking and breathtaking 3D animation of a perceived object, adding the possibility of further creating the illusion that one object seems to "morph" into another. Thus according to the present invention is provided software consisting of an Editor and a Player, for respectively creating and executing media presentations, said media presentations being shown on at least an image source as in the interactive 3D display or e.g. a screen with or without a sound source. By use of the Editor a user can create media presentations consisting of a variety of media elements such as video, sound, pictures in a frame stack, text, 3D animations etc. Fundamental to the media presentations created is that each media event in the final media presentation is executed at a specific angular input, preferably from a rotation sensor. In the Editor the user can define the number of revolutions (registered by the rotation sensor) over which a presentation is to be executed, how fine the resolution is to be, e.g. how many frames of a frame stack are to be shown pr. degree, and define in what angular intervals a given media element is active, e.g. from 10 degrees to 420 degrees a video is active and from 30 to 90 degrees a frame stack is active. It is for example possible to upload a frame stack containing e.g. 4000 frames and then set the number of rotations to a specific number, e.g. 4, in which case 1000 frames will be shown during each of four revolutions registered by the rotation sensor, giving a resolution of 0.36 degrees.
The Player receives input from a rotation sensor and on the basis hereof executes the media presentations created by use of the Editor. The Player receives input from the rotation sensor about the current angular position of the rotation device, i.e. turntable, stage, jog wheel, etc. being used to control the Player. When the input from the rotation sensor is received, the Player executes the media elements set by the Editor to be executed at a given angle. If e.g. a video is set to be executed in the angular interval from 10 degrees to 420 degrees and a frame stack from 30 to 90 degrees, both video and frame stack will be active when the reading from the rotation sensor is 80 degrees, 82.75 degrees, ..., 89.99 degrees, but at 90.01 degrees only the video is active. If the rotation device is rotated in the opposite direction back to 89.5 degrees both video and frame stack are active again.
A media presentation may be set up by the Editor so that e.g. during a video media event a video will continue playing in the forward direction even if the rotation device is moved back and forth, as long as the input from the rotation sensor to the Player is in the angular interval in which the video media event is set to be active. A film played from frames in a frame stack being executed simultaneously with the video may on the other hand be played forwards and backwards depending on the direction of the movement of the rotation device. This is due to the fact that the Editor allows the user to configure each media element independently to achieve the exact media presentation ideal for a specific purpose.
The Player software contains mechanisms which eliminate different errors. E.g. in the case above where the rotation sensor first delivers input of 82.75 degrees, ..., 90.01 degrees and then 89.5 to the Player, the Player is "intelligent" and can evaluate that the rotation device has now changed direction of rotation, as it is unlikely that the rotation device has been turned almost a full 360 degrees since the last reading. This type of "intelligence" makes sure that the presentation executed by the Player does not "skip", as it might do if the software registered the new reading as if a new revolution had been started. The Player software also contains mechanisms to eliminate unintended vibrations from the user's hand or from the surroundings, e.g. created by passing people.
All of the above many advantages are based on the fact that input is given in angles and that the different media elements in the media presentation are controlled and defined based on angles, as opposed to traditional media systems all depending on clock sequences and clock readings. Giving input in angles has many advantages. For example, an angle signal from a rotation device such as a stage may be used to synchronize several units, such that e.g. a perceived object on different stages may be controlled by a single action. For example an interactive 3D display, two computers with screens and a computer with a sound source may be activated simultaneously when the stage reaches a specific input angle.
The invention will be described by way of example below with reference to the drawings illustrating various embodiments of the interactive 3D display.
Within the scope of the present invention the examples should not be seen as limiting, as many more can be foreseen within the scope of the present invention.
Fig. 1 shows, in a perspective view, an interactive 3D display with one semi-transparent mirror and one image source, fig. 2 is a schematic side view of an interactive 3D display, illustrating the rotation axes of the interactive 3D display, and the geometry of image source, semi transparent mirror and preferred position of the stage rotation axis, fig. 3 shows a sectional view taken along line III - III of the interactive 3D display in fig. 1, and a viewer interacting with the interactive 3D display, fig. 4 shows the same with the addition of an actual object on the stage and where the perceived 3D object is replaced in order to match the actual object, fig. 5 corresponds to fig. 4, but where the 3D display further comprises stage light and a camera for tracking the head movement of the viewer, fig. 6a shows the interactive 3D display without the perceived 3D image but where the 3D display further comprises a stage camera and a camera for tracking the head movement of the viewer and a key object placed on the stage, figs. 6b and 6c show examples of key objects to be used with the interactive 3D display in fig. 6a, fig. 7 is a perspective view of a three-sided 3D display, fig. 8 is a flow diagram explaining the method relating to the features shown in figs. 1 - 4 and 7, fig. 9 is a flow diagram relating to figs. 5 and 6, fig. 10 is yet an alternative flow diagram relating to an interactive 3D display according to the present invention, fig. 11 is a first exemplary interface for software to be used with the interactive display according to the present invention, fig. 12 is a second exemplary interface for software to be used with the interactive display according to the present invention, fig. 13 is a flow diagram showing the options during setup of the interface of the software, fig. 14 shows a flow diagram representing an exemplary chain of steps leading to maximized processor power being available to the interactive display according to the invention, fig. 15 shows a flow diagram representing how the interactive display reacts to input in form of angle data from a rotation sensor, fig. 16 shows a flow diagram indicating the function of the rotation sensor in fig. 15, and fig. 17 shows a system according to the invention, comprising several additional presentation means for presenting audio and/or visual information.
The interactive 3D display according to the present invention is in the following described by way of examples. These examples are not to be regarded as limiting to the invention. E.g. the dimensions of the interactive 3D display can be significantly different from the examples shown here, as well as the number of sides, displayed objects etc. can be different from those shown here.
Fig. 1 shows an interactive 3D display 1 according to the present invention comprising an image source 2 above a semi transparent mirror 3, and a base 4 opposing the image source 2. In the base 4 is a stage 5, and on the front of the base is a wheel 6. The image source 2 is rectangular with a first long side 7 opposing a second long side 8, the first 7 and second 8 long side are connected by a first short side 9 and an opposing second short side 10. Furthermore the image source 2 has a top surface 11 opposite an image surface 12 (not visible). The image surface 12 faces the semi transparent mirror 3, and the image source 2 and the semi transparent mirror 3 are connected along the second long side 8 of the image source 2 and a first mirror long side 13. The semi transparent mirror 3 is positioned at a 45° angle relative to the image source 2 so that the image (not seen) from the image source 2 is reflected on the semi transparent mirror 3. If other angles are used it changes the geometry of the perceived 3D object. The semi transparent mirror 3 has a second mirror long side 14 opposing the first mirror long side 13. The semi transparent mirror 3 is connected to the base 4 along the second mirror long side 14 and along a first base long side 15. The base also has a second base long side 16 opposing the first base long side 15. A first 17 and second 18 base short side connect the first 15 and second 16 base long side. The base has a resting face 19 opposing a stage face 20 containing the stage 5. In the present embodiment the stage is circular and placed centered in the stage face 20 of the base 4. This way the 3D display 1 forms a Z shape when seen in a side view. The wheel 6 is positioned in the first base long side 15, which in the present embodiment is the front of the base 4.
Fig. 2 illustrates the rotation axes of the stage 5 and the wheel 6. The stage rotation axis A1 extends perpendicular to the center Cs of the stage 5. A wheel rotation axis A2 extends perpendicular from the center Cw of the wheel 6. The axes A1 and A2 are parallel. When the wheel 6 is turned the stage 5 will also be turned in a motion corresponding to the motion of the wheel 6, i.e. speed, angle of rotation etc. of the wheel 6 correspond to speed, angle of rotation etc. of the stage 5. At any point Pm on the semi transparent mirror 3 the shortest distance Dm is equal to the distance Di to a plane Pi perpendicular to the stage and containing the stage rotation axis A1. In the present view the plane Pi is perpendicular to the paper plane of fig. 2. In the shown embodiment the image source 2 is offset with a distance h with respect to the semi transparent mirror 3. An offset h can be present in some displays where the specific display design requires it. Placing the stage correctly with respect to the semi transparent mirror 3 and the offset h ensures that the perceived 3D object maintains its credibility.
Fig. 3 shows a viewer 21 observing a perceived object 22 and interacting with it via the wheel 6 and thus the stage 5. The image source 2 projects an image 23 on to the semi transparent mirror 3 which creates the illusion of a perceived 3D object 22 on or over the stage 5. When the user 21 turns the wheel 6 the stage 5 turns in the same motion as the wheel 6 and stage 5 are correlated. Information about the rotation of the stage 5 is transmitted to a computer 24 in the image source 2. Said computer 24 computes how the image 23 must change in order for the perceived object 22 to move together with the stage 5 and displays the changing image 23 in a continuous stream making the perceived 3D object turn smoothly with the stage 5. The rotation of the stage 5 can for example be detected by means of a rotation detector 25 positioned under the center Cs of the stage 5.
Fig. 4 shows a viewer 21 observing a perceived object 22' over- layered on an actual object 26 and interacting with them via the wheel 6 and thus the stage 5.
Fig. 5 is a side view of the 3D display 1. In this embodiment the interactive 3D display further comprises a camera 27 positioned on the first long side 7 of the image source 2. This camera 27 can be used for tracking the viewer's eye motions and/or the position and angle of the viewer's head. This information is sent to the computer 24, where it is processed and implemented in the calculation of the image 23. This step helps ensure that the perceived 3D object 22 very accurately matches the actual object 26 no matter from what angle the viewer is looking at the perceived object 22 and actual object 26. This feature is a special advantage where an actual object 26 is placed on the stage 5. This figure also illustrates how the interactive 3D display according to the present invention can be equipped with a stage light 28, preferably above the center of the stage. This light 28 helps to enhance the illusion further, as it helps define the stage 5. Possible light sources are different types of LED lights due to their long lifetime, which limits the required maintenance. In principle any light source which can be focused on the stage area is useable. In the present embodiment the rotation of the stage is detected by a rotation sensor 25 located at the side of the stage 5.
Fig. 6a shows a 3D display according to the present invention where the basic elements are all known from the previous figures, but in the present case the 3D display is also equipped with a center camera 29 monitoring the stage 5. The camera can be used instead of, or as a supplement to, other means for detecting the motion of the stage 5. If a key object 30 is placed on the stage 5, the camera 29 and image recognition software in the computer 24 can be used to keep track of the direction and rotation speed of the stage 5. This principle is also illustrated in the flow chart in fig. 9. Fig. 6b and 6c show two different types of possible key objects 30b, 30c to be placed on the stage 5. The key objects 30b, 30c can be three-dimensional or they can be flat as illustrated here.

Fig. 7 illustrates a 3D display 1 with several semi transparent mirrors 3a, 3b, 3c. In this case the image surface 12 of the image source 2 is divided into three subdivisions 12a, 12b, 12c (not shown), each projecting an image 23a, 23b, 23c onto the semi transparent mirror 3a, 3b, 3c underneath said subdivisions 12a, 12b, 12c. This way the display can be accessed from three different sides. This allows the same object to be viewed by more people at the same time and e.g. also the same actual object to be seen with three different over-layers, in which case each subdivision 12a, 12b, 12c projects an image 23a, 23b, 23c different from the others. This means that the same shoe can be seen having e.g. a floral print, dots or a fur surface depending on what side of the 3D display 1 it is viewed from.

The different embodiments are exemplary and where appropriate the many features can be combined. E.g. the 3D display with a center camera 29 shown in Fig. 6 can also include a stage light 28, and/or the front camera 27 can be left out. The embodiments in Figs. 1-4 and 6 can also be equipped with a stage light, stage camera 29 and/or front camera 27. The majority of the embodiments are shown with a rotation detector 25 under the center of the stage 5, but the chosen means for detecting the motion of the stage could as well be a mechanical or optical sensor 25 placed as illustrated in Fig. 5. Thus all of the features of the described embodiments can be combined as long as they are not contradictory.
It is also possible to mount hinges in the second long side 8 of the image source and in the first base long side 15 in order for the 3D display to be collapsible. Obviously the 3D display takes up less space when collapsed, and collapsing can also be a way of protecting the semi transparent mirror 3, for example during transport and storage.
Fig. 8 shows a flow chart of the basic functionality of the interactive 3D display 1 described in Fig. 1 to Fig. 7. A viewer can observe the perceived 3D object 22 on or over the stage 5. To interact with the perceived 3D object 22 the viewer rotates the wheel 6 on the front of the interactive 3D display 1. This makes the stage 5 rotate in a movement correlated with the movement of the wheel 6. The rotation of the stage 5 is detected by a sensor 25 and the information hereof is transferred to a computer 24. The computer 24 registers the rotation data and computes a new image/animation to be displayed on the image source 2 and reflected on the semi transparent mirror 3 to create the illusion of a perceived 3D object 22 moving together with the stage 5 in a motion induced by the user's rotation of the wheel 6.
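In software terms this flow is a simple sense-compute-render loop. The sketch below is only illustrative; the function names (read_stage_angle, render_view, display) are hypothetical placeholders, as the source does not prescribe a concrete API:

```python
# Minimal sketch of the Fig. 8 loop: stage angle in, updated image out.
import time

def run_display(read_stage_angle, render_view, display, fps=60):
    """Continuously redraw the perceived 3D object so it follows the stage."""
    frame_time = 1.0 / fps
    while True:
        angle = read_stage_angle()   # rotation sensor 25 -> computer 24
        image = render_view(angle)   # compute image 23 for this stage angle
        display(image)               # image source 2 reflects onto mirror 3
        time.sleep(frame_time)       # stream frames continuously
```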
Fig. 9 shows a flow chart of the basic functionality of the interactive 3D display 1 described in Fig. 4 to Fig. 6. The basic flow in the flow chart of Fig. 9 is similar to that of Fig. 8. Fig. 9 deviates from Fig. 8 in that the interactive 3D display 1 further comprises a front camera 27 for tracking the viewer's eye/head movement. The front camera data is transferred to the computer 24, where it is processed and used to ensure that the image/animation created by the computer 24 and displayed on the image source 2 is correct with respect to the viewer's angle of observation. This is especially useful if an actual object 26 is placed on the stage 5.
Fig. 10 shows a flow chart in which the viewer turns the wheel 6, causing the stage 5 to turn and at the same time transmitting data on the rotation of the wheel 6 (and thus also of the stage 5) to the computer 24. The computer 24 processes the input from the wheel 6 and on the basis hereof displays a picture/animation on the image source 2, thus creating a perceived 3D object on or over the stage 5 which moves in a motion correlated with the stage 5 and wheel 6.
The software according to the present invention consists basically of an Editor and a Player. The Editor is arranged for creating and executing media presentations, and the Player is arranged for showing media presentations on at least an image source as in the interactive 3D display, or e.g. a screen, with or without a sound source.
By use of the Editor the user can create media presentations consisting of a variety of media elements such as video, sound, pictures in a frame stack, text, 3D animations etc. In figure 11 is illustrated a simplified exemplary interface for the video/presentation editing tool, i.e. the Editor which may be used together with the present invention. In the present example the user has chosen to have six different possible actions: photos (in a frame stack), text 1, text 2, video, 3D animations and audio, to be executed on the at least one image source and audio means. The actions are here represented in the interface by six vertical slots 30, 31a, 31b, 32, 33, 34. Each slot is divided into a first 35, second 36 and third section 37, and each section 35, 36, 37 are subdivided into a number of subsections 38. The slots 30, 31a, 31b, 32, 33, 34 represent the angular position of a turnable stage in an interactive 3D display according to the present invention (not shown) so that each section 35, 36, 37 corresponds to 360° (i.e. a full revolution of the stage) . Each subsection 38 corresponds to the step size/resolution chosen for the turnable stage, so that if the position of the turnable stage is given in whole degrees each subsection 38 corresponds to 1°, if the position of the turnable stage is given in 1/10 degrees each subsection 38 is 0.1°. The three sections 35, 36, 37 corresponds to a first, second and third revolution of the turnable stage. Seen from the left, using the orientation of the figure, the first subsection 38a corresponds to the position 0° of the turnable stage and to the far right the last sub section 38b corresponds to 3x360° i.e. the position where the turnable stage has been revolted three times or a total of 1080°.
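The mapping from a cumulative stage angle to a section and subsection of this interface is straightforward arithmetic. The sketch below uses hypothetical names and assumes three sections and a configurable step size, matching the example above:

```python
# Map a cumulative stage angle (0..sections*360 degrees) to the Fig. 11
# section (revolution) and subsection (step) it falls in.
def locate(angle_deg, step=1.0, sections=3):
    max_deg = sections * 360.0
    angle_deg = max(0.0, min(angle_deg, max_deg))      # clamp to valid span
    section = min(int(angle_deg // 360.0), sections - 1)
    subsection = int((angle_deg % 360.0) / step)
    return section, subsection

print(locate(725.3))       # (2, 5): third revolution, 5 degrees into it
print(locate(725.3, 0.1))  # (2, 53): same position at 0.1 degree resolution
```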
In the present example the user has chosen to execute a number of photos 39 stored in a frame stack during the first and third revolution of the turnable stage (indicated by the hatching of the first section 35 and third section 37 of slot 30 corresponding to photos). In the third revolution the photos will be partly accompanied by first 40 and second 41 text (indicated by partial hatching of the third section of the third 32 and fourth slot 33) and audio output (indicated by partial hatching of the third section 37 of the fifth slot 34). During the second revolution of the turnable stage a 3D animation is accompanied by text.
The photos 39 shown during the first revolution will be displayed on the image source using a first screen configuration 40. During the third revolution of the turnable stage photos 39, video, first 41 and second text 42 will be displayed on the image source in a second screen configuration 43. During the second revolution animations, video and text will be displayed using a third screen configuration (not shown).
It is possible to change screen configuration in the middle of a revolution, e.g. such that the first 80° of a revolution of the turnable stage triggers only video output, while over the following 280° a combination of e.g. texts, photos and 3D animations is triggered at different positions of the turnable stage.
The user can add different screen configurations via the software, add or remove output, and change the unit of the subdivisions 38 in one or more of the sections 35, 36, 37, and it is also possible to add or remove sections, e.g. add a section corresponding to a fourth, fifth etc. revolution of the turnable stage.
When the preparation of the presentation is finished, the presentation may be saved as an executable file stored on computing means, e.g. a PC, Mac, Unix machine or even a flash storage or similar.
Realized on the interactive 3D display according to the present invention, the perceived 3D object in this example could e.g. be a person performing a golf swing. During the first revolution of the stage the viewer will see, as the perceived 3D object, a 360° view of a golfer performing an imperfect swing with a golf club. The 360° view is created by displaying on the image source a series of photos, e.g. one photo for each 0.5° turn of the stage. During the second revolution of the stage a 3D animation displayed on the image source creates, as the perceived 3D object, a 3D model of the corrections needed to optimize the swing, along with explanatory first and second text next to the perceived 3D object. During the third revolution a 360° film (created of photos in a frame stack as above) of the golfer now performing the perfect swing, along with first and second text, will be displayed on the image source to create a perceived 3D object, i.e. a perceived golfer making a perfect swing with text "floating" next to him in relevant positions containing e.g. information on foot position and shoulder movement, while audio of a cheering crowd may be heard.
Fig. 12 shows a circular viewmaster 44, i.e. an example of the user interface of the software according to the present invention. The key content of the viewmaster is known from fig. 11, and the circular viewmaster is an alternative to the linear viewmaster of fig. 11. The circular viewmaster 44 in fig. 12 contains three circular slots 45, 46, 47 representing the angular position of the rotation device, e.g. jog wheel or rotatable stage, at a given time. If the software is set up for one rotation, each slot 45, 46, 47 represents 360 degrees, and if e.g. the software is set up for a presentation being executed over five revolutions, each slot 45, 46, 47 represents 1800 degrees. The media elements, i.e. the image/video/sound etc., displayed on the image source at a given time are the one or more subsections of each slot (not shown) passing the angle point line 48 at said given time. In the slots 45, 46, 47 a number of media elements, e.g. image/video/sound, is indicated by hatched areas of each slot, as described in fig. 11. The information field 49 contains information about the current angular position 50, e.g. in this case 2505.06 degrees, which means that the angular encoder has rotated 360 degrees almost seven times. When the angular encoder reaches 2880 degrees the rotation count 51 will change to eight, indicating that the angular encoder is now on the eighth revolution. The one or more subsections, e.g. frames in a frame stack, currently displayed are indicated by a number 51.
The software according to the invention comprises a number of different options during setup of a presentation in order to meet the unique requirements for designing a presentation utilizing the user interface shown in fig. 11 or 12 together with a number of menus. Said options are included to meet the requirements of different companies, business entities, or organizations. In this respect the software of the invention provides the user with a number of different and individually selectable options during setup of a presentation, while simultaneously providing feedback information which is simple and easy to understand.
The presentation setup and the options during setup of the system according to the invention are described in general terms via the steps shown in the flow chart in fig. 13. Even though the steps are listed as sequential, the sequence of said steps can be arbitrarily amended or modified according to the user's demands and requests.
In step 52 the user first connects the rotation sensor (in the following specifically an angular encoder) to the system. Said angular encoder preferably reads, as described above, the position of a jog wheel or a similar device, and yields an angle in degrees as an output, e.g. via a USB port. In step 53 the desired photos stored in a frame stack are loaded to the computer. This can e.g. be achieved by loading image files with transparency, e.g. PNGs with an alpha channel, from a persistent storage into RAM. In step 54 the "rotation count" is defined and set. In this respect MAX degrees = rotation count * 360. Thus rotation count is the number of revolutions of the input device, e.g. Z-screen or jog wheel, over which a presentation is stretched. "Degrees" is here the angle used to determine which media to composite and present.
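A minimal sketch of steps 53 and 54 under assumed names (the source only fixes the relation MAX degrees = rotation count * 360):

```python
# Load frame-stack PNGs (with alpha) from persistent storage into RAM and
# derive the presentation's total angular span from the rotation count.
from pathlib import Path

def load_frame_stack(folder):
    """Read PNG frames into memory, sorted by filename."""
    return [p.read_bytes() for p in sorted(Path(folder).glob("*.png"))]

rotation_count = 3                  # revolutions the presentation spans
max_degrees = rotation_count * 360  # MAX degrees as defined in step 54
```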
In step 55 the frame stack is configured by setting the x, y, z position and scale of the frame stack images. The frame stack is composited with other media elements in 3D space, i.e. all media elements are rendered in 3D to produce an output in which frame stack images and video are rendered in front of or behind one another. I.e. a video can be placed in front of a frame stack image, behind a frame stack image, or can transit from front to back as the Degree input changes. In order to configure the video(s) as seen by an end user, the desired video(s) are selected in step 56 from the persistent storage (with or without an audio track). The specific Degree range(s) the video is to be played within is then set. Additionally video scale, 3-space coordinate, opacity and volume are set. "Key Degrees" ranges can be set that change any of these parameters when Degrees falls into the range. Additionally a "rotation lock" can be toggled. Rotation lock causes the video's 3-space placement to be controlled by the current Degree, independently of Degree ranges and Key Degrees, e.g. by moving about in a circle in the X, Z plane, matching the encoder's rotation.
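One possible reading of the rotation lock, sketched with assumed names: the element's X and Z coordinates follow a circle whose phase is the current Degree, so the element visually tracks the encoder:

```python
# Place a rotation-locked media element on a circle in the X,Z plane.
import math

def rotation_lock_position(degree, radius=1.0, y=0.0):
    theta = math.radians(degree % 360.0)   # phase follows the encoder angle
    return (radius * math.cos(theta), y, radius * math.sin(theta))
```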
The desired audio track(s), if any, are selected in step 57 from the persistent storage, and the Degree ranges within which the audio is to be played are set.
The input and/or output can then optionally be configured in step 58, e.g. by mirroring the output to the screen if the output is to the image source in the Z-screen, in which case a mirrored image is needed due to the reflection made on the mirror, advantageously effected by toggling. Alternatively, setting damping of the encoder can configure the input/output. Damping can include filtering and prediction of the angular encoder output. In step 59 it is configured what to do when no change in Degree is registered over a period of time. This includes setting an angular encoder idle timeout and for example a video to be played. The camera 3D-space position is specified and set in step 60, including direction, scale and field of view (FOV). This affects how the elements in the scene are presented (rendered).
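The mirroring itself is a plain horizontal flip of each rendered frame before it reaches the image source; a minimal sketch using Pillow (our choice of library, not named in the source):

```python
# Flip a rendered frame left-right so its reflection on the semi
# transparent mirror reads correctly to the viewer.
from PIL import Image, ImageOps

def mirror_for_z_screen(frame: Image.Image) -> Image.Image:
    return ImageOps.mirror(frame)   # horizontal (left-right) flip
```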
In order to evaluate whether the actual presentation corresponds to the desired presentation, the user can in step 61 optionally use the angular encoder (e.g. by turning the jog wheel) to test the current presentation setup. By changing the output of the angular encoder over time, different images from the frame stack are shown. As Degrees falls within a Degree or Key Degree range, video and sound will play, placed in, or moving through, the configured location. If one or more of the selected options does not correspond to the desired output, the user can easily alter, amend or change settings in order to achieve the desired presentation.
When the actual presentation corresponds to the desired presentation, the user can in step 62 save said presentation to persistent storage, so that the file can be accessed at a later time for editing by the Editor or execution by the Player to present the media presentation to an end user.
The Player is, as described above, arranged for showing media presentations created by the Editor on at least an image source as in the interactive 3D display, or e.g. a screen, with or without a sound source. In this respect the Player runs on a device, e.g. a computer, which receives input from an angular encoder. Depending on the content of the file it may require significant processor time, and thus it can be an advantage if the Player is able to suppress other activity on the device in order to be able to run the file, giving a smooth and "tight" experience to the user.
Fig. 14 shows a flow diagram representing an exemplary chain of steps/events leading to maximized processor power being available to the Player. The steps are listed in a preferred order, but the sequence of said steps can be arbitrarily amended or modified according to the user's demands and requests, and additional preferred steps may be included.
The user starts the Player software in step 63.
In step 64 the screen saver is disabled to ensure that a screen saver does not interrupt a presentation even if there is no new input for a prolonged time.
Other activities at risk of terminating the presentation are also disabled in step 65. This step can for example include disabling of screen and PC power management.
Step 66 concerns continually running programs that notify users of actions they should take, e.g. to maintain the system they are working on. Some of these programs may steal focus from the Player, thereby interfering with the presentation and user experience. Known programs of this kind may be disabled or silenced here.
In step 67 the Player is running and the media presentation is executed on the image source and e.g. sound source until the user shuts down the Player. Steps 68, 69, and 70 restore all or some of the settings and programs which have been changed or disabled in steps 64, 65, and 66, depending on preset values or user choice. The Player is terminated in step 71.
As previously mentioned, the Player executes files created by the Editor. Fig. 15 shows a flow diagram representing an exemplary chain of events illustrating how the Player reacts to input in the form of angle data from a rotation sensor such as an angular encoder, where all media events, i.e. execution of video, frames in a frame stack and sound, are coupled to predefined input angles.
In step 72 the Player receives input from an angular encoder. Input can be analog, digital, in degrees or radians, normalized as [0:1[, or unprocessed sensor values, via any bus such as USB, I2C, FireWire, Bluetooth, WiFi, TCP/IP, UDP etc.
Step 73 shows that interaction with a user is a feedback loop, i.e. when a user actuates the angular encoder, a computer processes the data from it and presents an image to the user, who then reacts to the image. This loop contains a delay resulting from the encoder, computer, screen and user. If this delay is too large the user experiences lag, which disrupts the experience. Prediction of the angle the user is to be presented with is factored in here to minimize perceived lag.
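One simple form such prediction can take (our sketch; the source does not fix a method) is linear extrapolation of the angle by the measured angular velocity over the known pipeline delay:

```python
# Extrapolate the encoder angle across the render latency, assuming
# roughly constant angular velocity between two distinct samples.
# Wrap-around at 360 degrees is ignored here for brevity.
def predict_angle(prev_deg, prev_t, cur_deg, cur_t, pipeline_delay_s):
    velocity = (cur_deg - prev_deg) / (cur_t - prev_t)   # deg/s
    return cur_deg + velocity * pipeline_delay_s
```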
Noise from the angular encoder, jitter in a user's muscles etc. can advantageously be filtered out to give the user a smooth experience when using the rotation device attached to the angular encoder. This is shown in step 74. The amount of filtering, known as dampening in this context, can be controlled from the Player. Filtering methods can be averaging, Kalman filtering, Alpha-Beta, FIR, IIR, Notch, etc. In step 75 the processed output of the angular encoder value can be assigned to a variable, e.g. "newMeasuredDegree". The steps 76, 77, 78, and 79 keep track of how many complete revolutions the angular encoder has performed since program start in a variable named "rotations". E.g. the angular encoder may report 0.1 on one measurement and 359.9 on the next. This implies (assuming proper use and operation) that only a 0.2 degree change is present between the two measurements, not a 359.8 degree change, and that the cumulative rotation count should be decreased by 1.
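As an example of such dampening (averaging is one of the listed options; the exponential weighting is our choice), the filter can be as small as:

```python
# Exponential moving average as a dampening filter for encoder readings.
class Dampener:
    def __init__(self, alpha=0.3):
        self.alpha = alpha    # 0..1; lower = smoother output but more lag
        self.value = None

    def update(self, measurement):
        if self.value is None:
            self.value = measurement             # seed with first reading
        else:
            self.value += self.alpha * (measurement - self.value)
        return self.value
```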
In step 80 the variable "newDegree" is the value used to check against the activation ranges for execution of the media elements. "newDegree" is the number of performed rotations multiplied by 360, plus the current rotation angle "newMeasuredDegree". The "newDegree" value cannot exceed the "rotationCount", given in the Player, multiplied by 360.
In steps 81, 82, 83, and 84 the "newDegree" is then used to activate or deactivate media elements as described in connection with figure 11 above.
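Steps 75-84 can be condensed into a small tracker; the names ("newMeasuredDegree", "rotations", "newDegree", "rotationCount") follow the text, while the data layout of the media elements is our assumption:

```python
# Track cumulative rotations from a wrapping 0-360 encoder reading, form
# newDegree, clamp it to rotationCount * 360, and gate media elements.
class DegreeTracker:
    def __init__(self, rotation_count):
        self.rotations = 0
        self.last = None
        self.max_degree = rotation_count * 360.0

    def update(self, new_measured_degree):
        if self.last is not None:
            delta = new_measured_degree - self.last
            if delta > 180.0:      # e.g. 0.1 -> 359.9: wrapped backwards
                self.rotations -= 1
            elif delta < -180.0:   # e.g. 359.9 -> 0.1: wrapped forwards
                self.rotations += 1
        self.last = new_measured_degree
        new_degree = self.rotations * 360.0 + new_measured_degree
        return min(max(new_degree, 0.0), self.max_degree)

def active_elements(new_degree, elements):
    """Media elements whose (start, end) activation range contains newDegree."""
    return [e for e in elements if e["start"] <= new_degree <= e["end"]]
```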
Step 85 shows that the media element continues playing until a new value is received from the angular encoder, or the user shuts down the application.
The rotation sensor/angular encoder is described in more detail in the flow diagram in fig. 16.
Detection of the rotation device angle can be done in many ways, e.g. by counting notches in the rotation device optically or mechanically, by measuring the change in a magnetic field caused by a magnet connected to the turnable stage, or by any other means whereby an accurate rotation from a predefined orientation can be determined. Here an angular encoder is used as an example and is schematically indicated with reference number 85. In step 86 it is tested whether the angle of rotation of the rotation device has changed. This test is subject to filtering of the input signal, noise, resolution and other factors that can affect a measured angle.
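A change test of this kind will typically compare against the encoder's resolution so that noise below one step is ignored (the threshold value is our assumption):

```python
# Step 86 sketch: report a change only when it exceeds the sensor resolution.
def angle_changed(prev_deg, cur_deg, resolution=0.1):
    return abs(cur_deg - prev_deg) >= resolution
```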
Step 88 shows output of the angle of the rotation device measured by the angular encoder. The output of the value obtained from the angular encoder can be analog, digital, processed, unprocessed, converted to an angle in degrees or radians, or unconverted raw sensor values. The output signal can be passed on via USB, I2C, FireWire, optical link, radio etc.
The fact that files are created via the Editor based on angle data, and that the Player is controlled by angle input, has many advantages. For example, as illustrated in fig. 17, an angle signal 89 from an angular encoder 90 in a turntable or rotatable stage may be used to synchronize several units 91, 92, 93, 94 comprising the Player according to the present invention. For example an interactive 3D display 91, two computers with screens 92, 93 and a computer with a sound source 94 may be activated simultaneously if they all contain a Player running a file wherein an event, e.g. onset of display of images in a frame stack and sound, is triggered at a specific input angle, e.g. 32 degrees.
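Such synchronization could, for instance, be achieved by broadcasting the angle to every unit over one of the transports listed above (UDP is used here; the packet format and port are our assumptions):

```python
# Send the current encoder angle to each unit running a Player; every
# Player then fires its own events when the angle crosses its trigger,
# e.g. 32 degrees.
import socket
import struct

def broadcast_angle(angle_deg, peers, port=5005):
    payload = struct.pack("!d", angle_deg)   # 8-byte big-endian double
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for host in peers:
            sock.sendto(payload, (host, port))
```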
It is also possible that the Players on the units 91, 92, 93, 94 each run their own file, whereby different events will be triggered at different angles on the different units. The software described in fig. 11 to fig. 16 may be used with the exemplary embodiments of fig. 1 to 10.
The software description above is only exemplary and is added to illustrate possibilities of the software according to the present invention. Its limitations must not be construed as limiting the software in general; e.g. a video editing/player tool using open source elements can well be imagined within the present invention, and a video editor/player developed for Mac, Linux or other non-PC platforms is likewise within the scope of the present invention. Said limitations are related to each project, e.g. due to financial considerations or copyright, and not due to the scope of the present invention. The same is true for the choice of programming language. The video editor/video player may be programmed in MS.net as well as in any other high or low level language, as long as the resulting video editor/video player complies with requirements on e.g. speed, file size etc.
The Editor/Player software described above may be used with the exemplary setups of fig. 1 to 11 as well as other setups according to the present invention.

Claims
1. Interactive 3D display (1) comprising at least one first image source (2) projecting at least one image (23) onto at least one semi-transparent mirror (3), thereby creating a perceived 3D object (22,22') in a space defined by the at least one semi-transparent mirror (3) and a base (4) by use of the Pepper's Ghost technique, characterized in that the base (4) comprises a moveable stage (5) and that the interactive 3D display (1) further comprises means (6) for manipulating the stage and means (24,25) for correlating the motion of the stage (5) and the perceived 3D object (22,22').
2. Interactive 3D display (1) according to claim 1, characterized in that the stage (5) is rotative around a stage rotation axis (A1) perpendicular to the stage plane.
3. Interactive 3D display (1) according to claim 1 or 2, characterized in that the means for manipulating the stage (5) is at least one wheel (6) or wheel section rotative around an axis (A2) parallel to the stage rotation axis (A1).
4. Interactive 3D display (1) according to claim 1, 2 or 3, characterized in that the movement of the stage (5) and the movement of the means (6) for remotely manipulating the stage (5) are directly linked.
5. Interactive 3D display (1) according to any of the preceding claims 1 - 4, characterized in that the means for correlating the motion of the stage (5) and the perceived 3D object (22,22') at least comprises
- a rotation sensor (25) for detecting the rotation of the stage (5),
- a transmitter for transmitting a rotation signal from the rotation sensor (25) to a computer (24), and
- software running on the computer (24) receiving input in the form of the signal from the rotation sensor (25) and on the basis of said signal animating the at least one image projected by the image source (2).
6. Interactive 3D display (1) according to claim 5, characterized in that the rotation sensor (25) for detecting the rotation of the stage (5) is arranged for determining an angle between the stage and a predetermined reference point, and thereby determining one or more positions of the stage.
7. Interactive 3D display (1) according to claim 5 or 6, characterized in that the rotation signal from the rotation sensor (25) is a signal in the form of angle data.
8. Interactive 3D display (1) according to claim 5, 6 or 7, characterized in that the software is arranged for receiving input in the form of angle data.
9. Interactive 3D display (1) according to any of the preceding claims 1 - 8, characterized in that the means for correlating the motion of the stage (5) and the perceived 3D object (22,22') comprises
- a key object (30b, 30c) placed on the stage (5),
- a camera (29) above the center of the stage monitoring the key object (30b, 30c),
- software running on the computer (24) receiving input from the camera (29), said software identifying the key object (30b, 30c) and creating a corresponding image visualized on the at least one image source (2), and
- software running on the computer (24) receiving input from the camera (29) about movement of the key object (30b, 30c) and animating said image correlated with the movement of the stage (5).
10. Interactive 3D display (1) according to any of the preceding claims 1 - 9, characterized in that the software is at least a video player and/or video editing tool.
11. Interactive 3D display (1) according to any of the preceding claims 1 - 10, characterized in that the stage rotation axis (A1) and the center of rotation of the perceived 3D object (22,22') coincide in at least one point.
12. Interactive 3D display (1) according to any of the preceding claims 1 - 11, characterized in that at any point Pm of the semi-transparent mirror (3), the shortest distance Dmin between the image source (2) and the semi-transparent mirror (3) equals the distance D⊥ between the semi-transparent mirror (3) and a plane P⊥ perpendicular to the stage containing the stage rotation axis (A1).
13. Interactive 3D display (1) according to any of the preceding claims 1 - 12, characterized in that an actual object (26) is placed on the stage (5).
14. Interactive 3D display (1) according to any of the preceding claims 1 - 13, characterized in that the interactive 3D display (1) further comprises means (24,27) for tracking the eye position of an observer (21) and communicating data about said eye position to software at least partly generating the at least one first image.
15. Interactive 3D display (1) according to any of the preceding claims 1 - 14, characterized in that the interactive 3D display (1) is collapsible.
16. An interactive 3D display system comprising at least one interactive 3D display (1) as defined in any of the claims 1 - 15, characterized in that the system comprises at least one additional presentation means (92,93,94) for presenting audio and/or visual information.
17. The system according to claim 16, characterized in that the system further comprises means for executing one or more preprogrammed actions on the at least one interactive 3D display and the at least one additional presentation means when the stage, during manipulation, reaches one or more predefined positions.
18. The system according to claim 17, characterized in that the stage is rotative around a stage rotation axis (A1) perpendicular to the stage plane, and the one or more predefined positions are angular positions defined in relation to a reference point.
19. A method of forming and animating a perceived 3D object (22,22'), characterized in that the method comprises the steps of:
- providing an interactive 3D display (1) according to any of the claims 1 - 15,
- displaying an image on an image source (2) using a computer (24), said image being reflected in a semi transparent mirror (3) and creating the illusion of a perceived 3D object (22,22') located above the stage (5) of the interactive 3D display (1),
- moving the stage (5),
- detecting and recording the movement of the stage (5) by means of a movement sensor (25),
- transmitting a signal representing the movement of the stage (5) from the movement sensor (25) to the computer (24), and
- logging and processing the signal from the movement sensor (25) and animating the image on the basis hereof, so that the perceived 3D object (22,22') moves in a coordinated motion together with the stage (5).
20. A method of forming and animating a perceived 3D object (22,22') according to claim 19, characterized in that the motion of the stage (5) is a rotation.
21. A method of forming and animating a perceived 3D object (22,22') according to claim 19 or 20, characterized in that the motion sensor (25) is a rotation sensor.
22. A method of forming and animating a perceived 3D object (22,22') according to claim 20 or 21, characterized in that said method further comprises executing one or more preprogrammed actions on the interactive 3D display when the stage reaches one or more predefined angular positions during rotation.
23. Use of the interactive 3D display (1) according to any of the preceding claims 1 - 15 for real-time changing of a perceived over-layer (22,22') of an actual object (26).
24. Use of the interactive 3D display system according to any of the claims 16 - 18 for accompanying the perceived object presented on the interactive 3D display with additional visual or audio effects.
PCT/EP2010/065619 2009-10-16 2010-10-18 An interactive 3d display, a method for obtaining a perceived 3d object in the display and use of the interactive 3d display WO2011045437A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
DKPA200901128 2009-10-16
DKPA200901128 2009-10-16
US31442710P 2010-03-16 2010-03-16
US61/314,427 2010-03-16
US32508010P 2010-04-16 2010-04-16
US61/325,080 2010-04-16

Publications (1)

Publication Number Publication Date
WO2011045437A1 true WO2011045437A1 (en) 2011-04-21

Family

ID=43242313

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/065619 WO2011045437A1 (en) 2009-10-16 2010-10-18 An interactive 3d display, a method for obtaining a perceived 3d object in the display and use of the interactive 3d display

Country Status (1)

Country Link
WO (1) WO2011045437A1 (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5190286A (en) * 1990-09-27 1993-03-02 Namco, Ltd. Image synthesizing system and shooting game machine using the same
US5886818A (en) * 1992-12-03 1999-03-23 Dimensional Media Associates Multi-image compositing
US5405152A (en) 1993-06-08 1995-04-11 The Walt Disney Company Method and apparatus for an interactive video game with physical feedback
WO1996035975A1 (en) * 1995-05-12 1996-11-14 Peter Mcduffie White Device and method for superimposing images in a three-dimensional setting without using lenses
US5940167A (en) * 1997-03-06 1999-08-17 Gans; Richard Process and apparatus for displaying an animated image
GB2329037A (en) * 1997-09-05 1999-03-10 Peter Mcduffie White Image displaying arrangement with two way mirror and coacting shutters
US6283598B1 (en) * 1998-06-19 2001-09-04 Minolta Co., Ltd. Method of and apparatus for displaying an image
US6607275B1 (en) * 2002-03-20 2003-08-19 The Neiman Marcus Group, Inc. Merchandise display case and system
JP2007275684A (en) * 2004-12-08 2007-10-25 Daito Giken:Kk Game machine and image displaying unit
EP1846798A1 (en) 2005-01-26 2007-10-24 Vizoo Invest APS Display device for producing quasi-three-dimensional images
WO2008030080A1 (en) * 2006-09-08 2008-03-13 Medialusion B.V. Device for displaying virtual three-dimensional objects

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2508933A1 (en) * 2011-04-04 2012-10-10 Realfiction ApS Collapsible 3D display and a method of assembling said 3D display
EP2610672A1 (en) * 2011-12-28 2013-07-03 OS Outstanding Solutions GmbH Display device
WO2013098100A1 (en) * 2011-12-28 2013-07-04 Os Outstanding Solutions Gmbh Display device
GB2508404A (en) * 2012-11-30 2014-06-04 Barry Patrick Skinner Display Projecting Image onto Inclined Transparent Panel with Semi-Reflective Coating
WO2014125365A3 (en) * 2013-02-18 2014-12-24 Verma Yogesh An improved virtual imaging technology
WO2014125365A2 (en) * 2013-02-18 2014-08-21 Verma Yogesh An improved virtual imaging technology
WO2014195781A1 (en) 2013-06-07 2014-12-11 Two P Sas Modular holographic display case
FR3006777A1 (en) * 2013-06-07 2014-12-12 Two P MODULAR HOLOGRAPHIC DISPLAY
WO2015070881A1 (en) * 2013-11-14 2015-05-21 Realfiction Aps Display arranged for combining a physical object with one or more digital images
DK201370683A1 (en) * 2013-11-14 2015-05-26 Realfiction Aps Display arranged for combining a physical object with one or more digital images
DE202014002440U1 (en) 2014-03-19 2014-07-09 Rainer Eissing Device for presenting virtual images
TWI585464B (en) * 2014-03-20 2017-06-01 深圳創銳思科技有限公司 Expansion display device and expansion display system
US20150271477A1 (en) * 2014-03-20 2015-09-24 Shenzhen Lexyz Technology Co., Ltd. Expanded display apparatus and system
JP2015184671A (en) * 2014-03-20 2015-10-22 トロンクシズ テクノロジー カンパニー リミテッド amplification display device and amplification display system
CN105096794A (en) * 2014-05-22 2015-11-25 深圳创锐思科技有限公司 Display device, control method of display device and display system
FR3022040A1 (en) * 2014-06-10 2015-12-11 Digital Art Internat SYSTEM FOR 3D REPRESENTING A 2D IMAGE DISPLAYED ON A DISPLAY SCREEN
CN104007557A (en) * 2014-06-11 2014-08-27 深圳市丽新致维显示技术有限责任公司 Display equipment and system
CN104345464A (en) * 2014-06-17 2015-02-11 深圳创锐思科技有限公司 Auxiliary display device and display system
JP2018513701A (en) * 2015-01-22 2018-05-31 奇鋭科技股▲分▼有限公司 Holographic projection game and learning system
GB2543368A (en) * 2015-05-11 2017-04-19 Faraway Dominic Display apparatus
GB2543368B (en) * 2015-05-11 2018-06-27 Faraway Dominic Display apparatus
WO2017116426A1 (en) * 2015-12-30 2017-07-06 Hewlett-Packard Development Company, L.P. Detachable electronic device
WO2018097067A1 (en) * 2016-11-24 2018-05-31 コニカミノルタ株式会社 Aerial picture display device
WO2019182523A1 (en) * 2018-03-22 2019-09-26 Intometer J.S.A. A multi-functional 3d imaging device for ct and mr images
WO2020016488A1 (en) 2018-07-18 2020-01-23 Holomake System for motor-driven mechanical control of a holographic plane for manual precision guidance
FR3084173A1 (en) 2018-07-18 2020-01-24 Holomake MOTORIZED MECHANICAL SERVO SYSTEM OF A HOLOGRAPHIC PLAN FOR MANUAL PRECISION GUIDANCE
US11792311B2 (en) 2018-07-30 2023-10-17 IKIN, Inc. Portable terminal accessory device for holographic projection and user interface
WO2020139719A1 (en) * 2018-12-28 2020-07-02 Universal City Studios Llc Augmented reality system for an amusement ride
US10818090B2 (en) 2018-12-28 2020-10-27 Universal City Studios Llc Augmented reality system for an amusement ride
USD994011S1 (en) 2021-06-16 2023-08-01 IKIN, Inc. Holographic projection device
USD988277S1 (en) 2021-06-17 2023-06-06 IKIN, Inc. Portable holographic projection device
USD1009969S1 (en) 2021-06-17 2024-01-02 IKIN, Inc. Holographic device housing
WO2022271755A1 (en) * 2021-06-21 2022-12-29 IKIN, Inc. Adpative holographic projection system with user tracking
NL1044264B1 (en) 2022-02-17 2023-09-05 Mrunal Gawade Dr Portable tabletop 3d media projection holographic display device

Similar Documents

Publication Publication Date Title
WO2011045437A1 (en) An interactive 3d display, a method for obtaining a perceived 3d object in the display and use of the interactive 3d display
US11605203B2 (en) Creation and use of virtual places
US10657727B2 (en) Production and packaging of entertainment data for virtual reality
RU2754991C2 (en) System of device for viewing mixed reality and method for it
US20160292850A1 (en) Personal audio/visual system
US10908682B2 (en) Editing cuts in virtual reality
KR20180102171A (en) Pass-through camera user interface elements for virtual reality
US8717360B2 (en) Presenting a view within a three dimensional scene
CN104935905B (en) Automated 3D Photo Booth
CN109069932A (en) Watch VR environment associated with virtual reality (VR) user interaction
CN109069934A (en) Spectators' view tracking to the VR user in reality environment (VR)
CN109069933A (en) Spectators visual angle in VR environment
JP2018508805A (en) Method and system for user interaction in a virtual or augmented reality scene using a head mounted display
TW201633104A (en) Tracking system for head mounted display
CN103649874A (en) Interface using eye tracking contact lenses
JP2008500624A (en) System and method for operating in a virtual three-dimensional space and system for selecting operations via a visualization system
CN103761763B (en) For the method using precalculated illumination to build augmented reality environment
CN107810634A (en) Display for three-dimensional augmented reality
US20180307352A1 (en) Systems and methods for generating custom views of videos
EP3600578B1 (en) Zoom apparatus and associated methods
US20230317115A1 (en) Video framing based on device orientation
KR102335209B1 (en) Virtual Reality Display Mobile Device
CN106814532A (en) A kind of 270 degree of line holographic projections showcases
JP2011143102A (en) Game device
JP6894322B2 (en) Information provision system and information provision program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10770527

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC. EPO FORM 1205A DATED 12.07.2012

122 Ep: pct application non-entry in european phase

Ref document number: 10770527

Country of ref document: EP

Kind code of ref document: A1