US20150234569A1 - Vehicle user interface unit for a vehicle electronic device
Vehicle user interface unit for a vehicle electronic device
- Publication number
- US20150234569A1 (U.S. application Ser. No. 14/698,692)
- Authority
- US
- United States
- Prior art keywords
- user
- image
- display
- virtual
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- B60K35/10—
-
- B60K35/211—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
- G01S5/163—Determination of attitude
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- H04N13/0445—
-
- H04N13/0468—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04802—3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/349—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
Abstract
A vehicle user interface unit for a vehicle electronic device. The vehicle user interface unit includes a three-dimensional (“3D”) display unit having a display, and is configured to display an image for perception by a user as a virtual 3D image. The virtual 3D image is at least partially located in front of the display when the user observes the display. A display control unit is configured to control the generation of the image by the 3D display unit. The virtual 3D image includes a 3D object having at least two regions located in different spatial planes. Each region includes a plurality of interaction elements. An input unit is configured to detect the location of a user-controlled object and to interpret the detection of a predefined variation of the user-controlled object as a selection of one of the interaction elements in the virtual 3D image.
Description
- The present application is a continuation of U.S. patent application Ser. No. 13/076,243, entitled “VEHICLE USER INTERFACE UNIT FOR A VEHICLE ELECTRONIC DEVICE,” filed Mar. 30, 2011, which claims priority to European Patent Application Serial No. 10 003 477.6, entitled “VEHICLE USER INTERFACE UNIT FOR A VEHICLE ELECTRONIC DEVICE,” filed on Mar. 30, 2010, the entire contents of each of which are hereby incorporated by reference for all purposes.
- 1. Field of the Invention
- The invention relates to a vehicle user interface unit, and more particularly, to a vehicle user interface unit for a vehicle electronic device and a vehicle infotainment system, and to methods for operating the vehicle user interface unit.
- 2. Related Art
- Vehicles are typically equipped with a user interface to allow the user (driver or passenger) to control functions relating to the vehicle itself or to an electronic device provided in the vehicle, such as an infotainment system. The user may be provided with control over functions and information, or with a display of information, relating to driver assistance systems, a multimedia system such as a car radio, or mobile communication systems that communicate, for example, via GSM or UMTS. Information from outside the vehicle may also be made available to the driver or passenger. For example, information may be received from communication systems that permit information retrieval and transmission between the car and the outside world, including communication from car to car or from car to infrastructure.
- The user typically interacts with a head unit having a user interface with a display and control elements that allow the user to control the desired functions. The head unit typically has a face plate on the dashboard of the vehicle. Because space on the dashboard and the face plate is limited, the mechanical control elements and the display may have to share the available space with each other and with other components. This space limitation may restrict the amount of information that can be displayed to the user at one time, and only a few control elements may be available to operate and control a much larger number of functions offered to the user.
- User access to the larger number of functions via a few control elements is generally achieved using a menu tree structure with main menus and multiple submenus, through which a user browses to reach a particular function. Menu structures may be cumbersome for the user. Browsing through the menus and submenus may take a considerable amount of time before the user reaches a particular menu item. During this time, if the user is also driving, the effort to find the desired menu item may distract the user sufficiently to create a dangerous situation.
- One solution uses speech recognition for voice-activated commands to access the functions. However, such solutions have yielded little improvement, because the system issues numerous enquiries and voice-activated commands still require browsing through the menu structure.
- Some improvement may be achieved using a touch screen, which replaces many of the mechanical control elements with graphical control elements. By removing the mechanical control elements, space becomes available on the face plate for a larger display without needing a larger face plate. Nevertheless, the available physical space typically remains rather limited, permitting only a limited amount of information or menu items to be displayed and resulting in a generally confusing presentation, particularly when accessing complex menu structures. The graphical control elements are also relatively small and fail to provide any haptic feedback. User interfaces having a touch screen are therefore not considerably easier to operate inside a vehicle, particularly for the driver. Touch screens are also susceptible to becoming soiled, for example by fingerprints, which deteriorates the quality of the displayed image.
- There is a need for an improved presentation of menu structures and other information, and for facilitating the selection of menu items for controlling the functions of a vehicle electronic device or of the vehicle itself, or for adjusting parameters.
- A vehicle user interface unit for a vehicle electronic device. The vehicle user interface unit includes a three-dimensional (“3D”) display unit having a display, and is configured to display an image for perception by a user as a virtual 3D image. The virtual 3D image is at least partially located in front of the display when the user observes the display. A display control unit is configured to control the generation of the image by the 3D display unit. The virtual 3D image includes a 3D object having at least two regions located in different spatial planes. Each region includes a plurality of interaction elements. An input unit is configured to detect the location of a user-controlled object and to interpret the detection of a predefined variation of the user-controlled object as a selection of one of the interaction elements in the virtual 3D image.
- It is to be understood that the features mentioned above and those yet to be explained below can be used not only in the respective combinations indicated, but also in other combinations or in isolation, without departing from the scope of the present invention.
- Other devices, apparatus, systems, methods, features and advantages of the invention will be or will become apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims.
- The description below may be better understood by referring to the following figures. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. In the figures, like reference numerals designate corresponding parts throughout the different views.
- FIG. 1 is a schematic diagram of an example of a vehicle user interface unit.
- FIGS. 2A to 2C illustrate the generation of a virtual 3D image having a 3D object using a 3D display unit.
- FIG. 3 is a flow chart illustrating operation of an example of a method for operating a vehicle user interface unit.
- FIG. 1 is a schematic diagram of an example of a vehicle user interface unit 100. The vehicle user interface unit 100 may be part of a head unit, a vehicle infotainment system, or any other vehicle electronic device. The vehicle user interface unit 100 includes a display control unit 120 to control operation of the vehicle user interface unit 100, a 3D display unit 101, and an input unit 130. The 3D display unit 101 and the input unit 130 are both in electronic communication with the display control unit 120, and as such communicate information to the display control unit 120.
- In the example illustrated in FIG. 1, the 3D display unit 101 includes a display 102 connected to a graphics processor 103. The graphics processor 103 receives, from the display control unit 120, image data that includes data for a 3D image to be displayed, and generates a corresponding output signal for the display 102. The 3D display unit 101 in FIG. 1 may be an autostereoscopic display unit, which is a display capable of generating an image that the user 140 perceives as a virtual 3D image 110 without having to wear spectacles. The display 102 generates the virtual 3D image 110 by projecting each view required to generate depth perception into one eye of the observing user 140, as shown schematically by the dashed lines in FIG. 1.
- An autostereoscopic display may be realized using techniques known to those of ordinary skill in the art, which are therefore not described in greater detail here. One example of such a technique continuously monitors the position of the user's head. An image of the user's head may be captured using a
stereoscopic camera 131 or a non-stereoscopic camera 132, and the projection optics and view content may be adjusted accordingly using the captured image. The content of each view may be matched to the position of the eye, which may be identified using a face tracking technique. The display 102 may include a liquid crystal display with a continuously scanning spot source of light in the focal plane of a lens. The liquid crystal display may be used to control the intensity of light emitted by the display 102 as a function of ray direction. The lens and light source produce rays all travelling in one general direction at any one instant, and the direction of the rays may be synchronized with the display of the appropriate views of the 3D image on the liquid crystal display. In examples of this technique, the frame rate of the display 102 may be doubled to allow the eyes of the observing user 140 to integrate a 3D image over time. The faces of other users, for example passengers inside the vehicle, may also be tracked, and the frame rate of the display 102 increased accordingly to enable perception of the virtual 3D image by those users.
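- The head-tracking principle described above can be made concrete with a short sketch. The following Python fragment is a minimal illustration, not the patented implementation: the interpupillary distance, the display geometry, and the idea of a small number of discrete, steerable views are all assumptions made for the example.

```python
import math

EYE_SEPARATION_M = 0.065  # assumed average interpupillary distance

def eye_positions(head_center, head_yaw_rad):
    """Estimate left/right eye positions from a tracked head pose.

    head_center is (x, y, z) in metres, with the display surface
    assumed to lie in the z = 0 plane and z increasing toward the user.
    """
    # Offset each eye by half the interpupillary distance along the
    # head's lateral axis, approximated from the yaw angle.
    dx = 0.5 * EYE_SEPARATION_M * math.cos(head_yaw_rad)
    dz = 0.5 * EYE_SEPARATION_M * math.sin(head_yaw_rad)
    x, y, z = head_center
    return (x - dx, y, z - dz), (x + dx, y, z + dz)

def view_index_for(eye, num_views=8, fov_rad=math.radians(40)):
    """Pick which of the display's discrete views to steer toward an eye."""
    x, _, z = eye
    angle = math.atan2(x, z)             # horizontal angle from the display normal
    t = (angle + fov_rad / 2) / fov_rad  # normalize to [0, 1] across the field of view
    t = min(max(t, 0.0), 1.0)
    return round(t * (num_views - 1))

# Per video frame: route the stereo pair to the views nearest each eye.
left_eye, right_eye = eye_positions((0.12, 0.0, 0.65), head_yaw_rad=0.0)
left_view, right_view = view_index_for(left_eye), view_index_for(right_eye)
```

With a view routed to each eye in this way, the view content can be updated every frame as the head moves, which is the effect the scanning light source and doubled frame rate achieve in the technique described above.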
- Other techniques that may be employed using the 3D display unit 101 include multiple-view autostereoscopy, in which the display 102 projects views to every position where a viewer might be. Examples of implementations may include a lenslet array that covers a pixel for each view; the lenslets combine to make the pixels for each view visible exclusively in one direction. Diffraction gratings or an array of slits may be used instead of an array of lenslets, and using a diffraction grating makes it possible to extend the field of view. The lenslet array and diffraction grating techniques may be implemented using an underlying display with a resolution that is the product of the view resolution and the number of views, which may require a high-resolution display. In an example implementation, the high-resolution display may be replaced by a technique that generates several video projections lined up behind a lens; the lens then makes each view corresponding to a projection visible in a different direction.
- As described above with reference to FIG. 1, the 3D display unit 101 may generate the virtual 3D image 110 perceived by the user 140 using a variety of techniques. It is to be understood that other implementations may include examples in which the 3D display unit 101 operates in conjunction with shutter glasses worn by the user 140; different views are alternately projected to the eyes of the user 140, enabling the use of a conventional display 102 with doubled frame rate. In another technique, each view is displayed using light of a different polarization, allowing a user wearing corresponding polarizing spectacles to receive the intended view.
- The display control unit 120 may provide general information relating to a 3D object 111 to be included in the virtual 3D image 110 to the graphics processor 103. The graphics processor 103 may then calculate the different views to be displayed to the user 140 to generate the binocular perception of depth (stereopsis). When these different views are displayed to the user 140 by the display 102, the user 140 perceives the virtual 3D image 110. In the example illustrated in FIG. 1, the 3D display unit 101 is configured to form the virtual 3D image 110 in front of the display 102. For example, the virtual 3D image 110 may be positioned between the display 102 and the observing user 140. In other example implementations, the virtual 3D image 110 may be only partially located in front of the display 102.
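- The geometry behind stereopsis in front of the display can be illustrated with a small calculation. The sketch below is an assumption-laden simplification (a single point, a viewer centred on the screen, a known viewing distance); it shows the horizontal parallax the two views must contain so that a point appears to float a given distance in front of the display, which is the kind of quantity the graphics processor 103 would derive for every point of the 3D object 111.

```python
def screen_parallax(eye_sep_m, viewer_dist_m, pop_out_m):
    """Horizontal parallax (metres on the screen plane) placing a point
    pop_out_m in front of the display for a viewer at viewer_dist_m.

    The returned value is x_right_image - x_left_image. A negative
    ("crossed") parallax is what makes the point appear between the
    screen and the viewer.
    """
    if not 0.0 <= pop_out_m < viewer_dist_m:
        raise ValueError("the virtual point must lie between screen and viewer")
    return -eye_sep_m * pop_out_m / (viewer_dist_m - pop_out_m)

# Example: a point 20 cm in front of the screen, viewer 65 cm away,
# 6.5 cm eye separation -> about -2.9 cm of crossed parallax.
print(screen_parallax(0.065, 0.65, 0.20))
```

The formula follows from similar triangles between the eye baseline and the screen plane; as pop_out_m approaches the viewer distance the required parallax diverges, which is why virtual images are normally kept well short of the viewer's face.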
- It is noted that the display 102 may also be used to display two-dimensional (2D) images, and that the 3D display unit 101 may be the display unit of a vehicle electronic device, such as an infotainment system. Thus, menu structures, maps for navigation, multimedia information or media streams may be displayed on the display 102.
- The display control unit 120 may be implemented using a microprocessor, which may be the microprocessor used for control by the vehicle electronic device or by any other system in which the vehicle user interface unit 100 is implemented, and may as such perform other functions unrelated to the user interface. Other implementations of the display control unit 120 may use multiple microprocessors, a special-purpose microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC) or a field-programmable gate array. The microprocessor may operate according to programs stored in a storage device (not shown) having an interface to the microprocessor.
- In other example implementations, the graphics processor 103 may not be used, and the functions attributed to the graphics processor 103 may be performed by the display control unit 120. The 3D display unit 101 may also include software code portions running on a microprocessor operating in the display control unit 120. It is also possible that the graphics processor 103 and the microprocessor are provided within a single chip or component.
- The vehicle user interface unit 100 in FIG. 1 also includes the input unit 130, which includes the stereoscopic camera 131 connected to an evaluation unit 135. The stereoscopic camera 131 monitors the area in which the virtual 3D image 110 is to be formed and, in general, includes functions for monitoring the space in front of the display 102. The stereoscopic camera 131 includes two optical systems for acquiring two different views of the observed region. The example shown in FIG. 1 illustrates the two optical systems as being adjacent to one another; in other example implementations, the two optical systems may be arranged separately. For example, an optical system may be arranged on each side of the display 102.
- Each optical system of the stereoscopic camera 131 includes a charge coupled device ("CCD") array for acquiring an image of a view of the region to be monitored. The acquired image data is received by the evaluation unit 135, which constructs a 3D representation of the observed region from the two different views provided by the stereoscopic camera 131. The position of a user-controlled object, such as the hand 150 of the user 140, may then be determined in three dimensions. The evaluation unit 135 may also include functions for identifying the object within the monitored region and for tracking the detected object.
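- For a rectified stereo pair, the 3D reconstruction performed by the evaluation unit 135 reduces to classical triangulation. The following sketch assumes the standard pinhole camera model with known focal length, baseline and principal point (all parameter values here are hypothetical); the patent does not prescribe this method, but it is a common way to recover a fingertip position from two CCD images.

```python
def triangulate(xl, xr, y, focal_px=800.0, baseline_m=0.06,
                cx=320.0, cy=240.0):
    """Recover a 3D point from a rectified stereo image pair.

    xl, xr: pixel columns of the same fingertip in the left/right images;
    y: pixel row (identical in both images after rectification).
    Returns (x, y, z) in metres in the left camera's coordinate frame.
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("a visible point must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth from the camera pair
    x = (xl - cx) * z / focal_px            # lateral offset
    y3d = (y - cy) * z / focal_px           # vertical offset
    return (x, y3d, z)

# Example: a fingertip seen at column 400 (left) and 340 (right), row 260.
print(triangulate(400.0, 340.0, 260.0))     # roughly 0.8 m in front of the cameras
```

Identifying which pixels belong to the fingertip in each view (segmentation and tracking) is the harder problem in practice; the triangulation step itself is this simple once the correspondence is known.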
- The evaluation unit 135 may also detect and interpret a predefined variation of the user-controlled object in the images received from the stereoscopic camera 131. In the example illustrated in FIG. 1, the user-controlled object is the index finger of the user's hand 150. The position of the finger 150, a change in the position of the finger 150, and a variation in the shape of the index finger 150 may be determined by the evaluation unit 135 from the supplied images. The evaluation unit 135 interprets a predefined variation that it detects as a command. The predefined variations of the user-controlled object may include, for example, the movement of the user-controlled object to a particular location, such as when the tip of the index finger of the user's hand 150 is moved into the proximity of an element of the virtual 3D image 110, or when the index finger of the hand 150 performs a gesture. The gesture may be identified by detecting a change in the shape of the user-controlled object.
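- One way to structure such an interpreter, sketched below under assumed thresholds and data types (none of which come from the patent), is to compare each tracked fingertip sample against the known element positions and against the previous sample:

```python
import math
from dataclasses import dataclass

TOUCH_DISTANCE_M = 0.02   # assumed proximity threshold for a "virtual touch"
STROKE_MIN_M = 0.01       # assumed minimum movement to count as a gesture stroke

@dataclass
class InteractionElement:
    name: str
    position: tuple       # (x, y, z) centre in the virtual image, metres

def classify_variation(fingertip, prev_fingertip, elements):
    """Map an observed variation of the user-controlled object to a command.

    Returns ('select', element), ('move', displacement) or None.
    """
    for element in elements:
        if math.dist(fingertip, element.position) < TOUCH_DISTANCE_M:
            return ('select', element)        # fingertip near an element
    if math.dist(fingertip, prev_fingertip) > STROKE_MIN_M:
        displacement = tuple(a - b for a, b in zip(fingertip, prev_fingertip))
        return ('move', displacement)         # candidate gesture stroke
    return None
```

A shape-based gesture (for example, the finger bending) would be detected analogously by comparing a shape descriptor between frames rather than the position alone.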
- During operation of a vehicle electronic device using the vehicle
user interface unit 100, the3D object 111 projects thevirtual image 110 using thedisplay control unit 120 and the3D display unit 101. The3D object 111 includes afirst region 112 with a plurality ofinteraction elements 115 and asecond region 113 with a plurality ofinteraction elements 115. Thefirst region 112 andsecond region 113 are positioned in two different spatial planes of thevirtual 3D image 110. The positions of the different spatial planes in which the regions of the3D object 111 are located may vary with the angle at which the display is observed by the user, which makes them “virtual spatial planes.” In the example ofFIG. 1 , the3D object 111 that may be perceived by theuser 140 is a cube or cuboid in which theregions interaction elements 115 may be menu items of a main menu or a submenu, or information elements containing information for display to theuser 140. The position and shape of the3D object 111 and theinteraction elements 115 displayed on the faces of the object may be controlled by thedisplay control unit 120. Thedisplay control unit 120 may generate a virtual image of different types of 3D objects, such as other types of polyhedrons; for example, an octagonal prism or other similar shapes. The3D object 111 is formed with several faces, each face displaying a different menu or submenu, or a certain class of information and control elements. The amount of information that can be simultaneously displayed may be multiplied or substantially increased using the three-dimensional representation. - The
- The display control unit 120 may also project the object 111 in the virtual 3D image 110 with partially transparent faces. The partially transparent faces make the faces oriented toward the backside of the object 111 visible to the user 140. The faces on the backside of the object 111 may include particular pieces of information or control elements that would otherwise be accessible only in different menus, requiring the user to leave one menu and open another to locate the particular information or control element. The at least partially transparent faces provide the user with quick access to that information or control element. The vehicle user interface unit 100 may include a control element, such as a button or an interaction element, for activating or deactivating the transparency, or for setting a transparency value. For example, the user may set a transparency value within a range of 0% to about 50%, or about 10% to about 20%, where 0% corresponds to an opaque region or face (regions covered by the opaque region are not visible) and 100% corresponds to a completely transparent (or invisible) region or face.
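- Rendering such a partially transparent face is ordinary alpha blending. The sketch below simply clamps the user-set transparency to the 0-50% range given in the example above and blends the front-face colour with whatever lies behind it; the function name and the per-channel colour representation are illustrative assumptions, not part of the patent.

```python
def composite(front_rgb, back_rgb, transparency_percent):
    """Blend a partially transparent front face over the face behind it.

    transparency_percent: 0 shows only the front face; values are clamped
    to the 0-50% adjustable range described above.
    """
    t = min(max(transparency_percent, 0.0), 50.0) / 100.0
    return tuple((1.0 - t) * f + t * b for f, b in zip(front_rgb, back_rgb))

# Example: a 20% transparent front face lets a fifth of the back face show.
print(composite((0.9, 0.9, 0.9), (0.1, 0.3, 0.8), 20.0))
```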
- FIGS. 2A to 2C illustrate the generation of a virtual 3D image 200 having a 3D object 202, which is a cube 202 in FIG. 2A, using the 3D display unit 101 (shown in FIG. 1), and the process of obtaining access to functions by rotating the 3D object 202. FIG. 2A shows the cube 202 having a first region 212 on one face of the cube 202 and a second region 214 on another face of the cube 202. The cube 202 is oriented so that, from the viewpoint of the user observing the display 102, the face containing the first region 212 faces the user. The second region 214 is visible as a side face of the cube 202 due to the partial transparency of the face having the first region 212. An interaction element located on the second region 214 may be accessed by rotating the cube 202 so that the face having the second region 214 faces the user. After the rotation, the face having the second region 214 is displayed as facing the user from the viewpoint of the user, as illustrated in FIG. 2C.
- As described above with reference to FIG. 1, the display control unit 120 generates the virtual 3D image 200 shown in FIG. 2A. The display control unit 120 includes data such as the position information of the virtual image 200, and therefore includes data indicating the location in space of the 3D object 202. The position and space information is provided to the evaluation unit 135 (in FIG. 1) to enable the detection of a user input. The evaluation unit 135 receives the position information of the 3D object 202 as observed by the user 140 (in FIG. 1) and the position of the user's hand 150 (in FIG. 1). The evaluation unit 135 may then use the information to determine when the user-controlled object approaches or virtually touches an element of the virtual image 200. The input unit 130 (in FIG. 1) recognizes touches or virtual touches to elements on the 3D object 202 and certain gestures as user commands. In an example implementation, a virtual touch to one of the interaction elements 115 (shown in FIG. 1) on the first region 212 of the 3D object 202 is recognized as a command to select and execute the function associated with the virtually touched interaction element. The virtual touch is detected when the tip of the index finger of the user's hand 150 comes to within a predetermined distance from the respective interaction element in the virtual image 200. Other example implementations may pre-select the interaction element by a first virtual touch and execute the associated function by a second virtual touch. Example implementations may also execute the function after a virtual touch that lasts a predefined minimum duration. When a pre-selection is implemented, the corresponding interaction element may be highlighted in the virtual image 200 to provide optical feedback to the user.
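- The pre-selection and dwell-duration behaviours described in this paragraph suggest a small state machine. The following sketch is one possible reading, with assumed thresholds and an assumed element type carrying a `position` attribute; it pre-selects (and reports a highlight) on first virtual touch and fires execution after a sustained touch.

```python
import math
import time

TOUCH_DISTANCE_M = 0.02   # assumed selection threshold
DWELL_S = 0.5             # assumed minimum touch duration for execution

class TouchSelector:
    """Pre-select an element on first virtual touch; execute after a dwell."""

    def __init__(self):
        self.preselected = None
        self.touch_started = None

    def update(self, fingertip, elements, now=None):
        now = time.monotonic() if now is None else now
        touched = next((e for e in elements
                        if math.dist(fingertip, e.position) < TOUCH_DISTANCE_M),
                       None)
        if touched is None:
            self.preselected = self.touch_started = None
            return None
        if touched is not self.preselected:
            self.preselected, self.touch_started = touched, now
            return ('highlight', touched)     # optical feedback to the user
        if now - self.touch_started >= DWELL_S:
            self.touch_started = now          # re-arm instead of re-firing at once
            return ('execute', touched)
        return None
```

The two-touch variant (first touch pre-selects, second touch executes) would replace the dwell check with a per-element touch counter.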
3D object 202 in order to access interaction elements on a different region on a different face of the object. InFIG. 2A , thefirst region 212 of the3D object 202 faces the user. To rotate thecube 202, the user touches a corner of thecube 202 in thevirtual 3D image 200 using, for example, a finger. The position of the user's finger is detected using thestereoscopic camera 131 and theevaluation unit 135 of the input unit 130 (FIG. 1 ). -
- FIG. 2B shows an example of how the user may rotate the cube 202 in FIG. 2A. With the user's finger 150 on a first corner 220 of the cube 202, the user may perform a gesture by moving the finger 150 in the direction in which the 3D object 202 is to be rotated. This movement is indicated by an arrow A in FIG. 2B. The gesture is recognized by the input unit 130 by tracking the location and/or shape of the finger 150, and is interpreted as a command to rotate the 3D object 202 in the corresponding direction. The command is then communicated to the display control unit 120 (in FIG. 1), which issues corresponding commands to the 3D display unit 101 (in FIG. 1). The 3D display unit 101 controls the display 102 to generate the corresponding virtual image showing the 3D object 202 rotated. The virtual image is generated by displaying the different views for binocular perception by the user 140 as the virtual image 200.
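- Interpreted as code, the corner-grab-and-drag gesture needs only the grab position, the cube's corner positions and the subsequent finger displacement. The sketch below uses assumed thresholds and a four-direction rotation vocabulary; the patent leaves the exact parameters open.

```python
import math

CORNER_GRAB_DISTANCE_M = 0.015   # assumed grab threshold at a cube corner
DRAG_TRIGGER_M = 0.03            # assumed drag length that triggers a rotation

def rotation_command(fingertip, grab_point, corners):
    """Interpret a corner grab followed by a drag as a rotation command.

    corners: cube-corner positions in the virtual image, metres.
    Returns 'yaw_left', 'yaw_right', 'pitch_up', 'pitch_down' or None.
    """
    if not any(math.dist(grab_point, c) < CORNER_GRAB_DISTANCE_M
               for c in corners):
        return None                  # the drag did not start on a corner
    dx = fingertip[0] - grab_point[0]
    dy = fingertip[1] - grab_point[1]
    if max(abs(dx), abs(dy)) < DRAG_TRIGGER_M:
        return None                  # the finger has not moved far enough yet
    if abs(dx) >= abs(dy):
        return 'yaw_right' if dx > 0 else 'yaw_left'
    return 'pitch_up' if dy > 0 else 'pitch_down'
```

The returned token is what the input unit 130 would forward to the display control unit 120, which then re-renders the views with the cube rotated about the corresponding axis.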
- FIG. 2C illustrates the result of the rotation of the cube 202. The 3D object 202 is shown oriented with the face having the second region 214 now facing the user. In example implementations, the interaction elements located in the region facing the user are selectable while the interaction elements in other regions are disabled in order to prevent accidental activation. In other example implementations, the interaction elements of all regions may be active.
- The vehicle user interface unit 100 may generally be operated by the driver of the vehicle or by a passenger. In a vehicle, the general locations of the driver and passengers are usually known, and the virtual image 200 may be generated at different spatial positions so that users can observe the virtual image 200 from the different viewing angles of the different driver and passenger positions in the vehicle. The vehicle user interface unit 100 may be provided with a way of determining which user is trying to input a command at any given time, in order to correctly determine the position of the virtual image 200 seen by the respective user. Sensors may be provided in the vehicle for determining the vehicle occupancy, and information obtained from these sensors may be used to identify the user that is trying to input a command. Because the positions of the driver and the other passengers are generally predefined by the corresponding seat positions, the position of the virtual image 200 may be suitably determined based on the predefined positions without information from additional sources.
- The determination of the position of the virtual image 200 may be enhanced by determining the position of the head of the user 140. The position of the user's head may be determined from the images acquired by the stereoscopic camera 131 (in FIG. 1), or by providing one or more additional cameras 132, which may be non-stereoscopic cameras. The camera 132 may be arranged inside the vehicle cabin at a location that allows the camera 132 to monitor the passenger's head. A camera for monitoring the passenger's head position may already be provided inside the vehicle, for example as part of a safety system, and information obtained from such a camera may be used by the evaluation unit 135.
- The evaluation unit 135 may perform head tracking or face tracking of the user's head in order to determine its position. Based on the head position, the evaluation unit 135 may determine the angle along which the user observes the display 102. This information, together with the information on the 3D image 200 generated by the display control unit 120, may be used by the evaluation unit 135 to determine more precisely the spatial position at which the user 140 observes the virtual 3D image 200. The spatial location of the interaction elements on the 3D object 202 may then be determined to make user activation of the interaction elements using the user-controlled object (in this example, the user's finger) more robust and accurate.
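- A minimal version of that viewing-angle computation is shown below. It assumes the head position and display centre are available in a common cabin coordinate frame and that the display normal is known; the lateral-shift estimate for the perceived image is a deliberately rough small-angle approximation for illustration, not a formula from the patent.

```python
import math

def observation_angle(head_pos, display_center, display_normal=(0.0, 0.0, 1.0)):
    """Angle (radians) between the user's line of sight and the display normal."""
    sight = tuple(h - d for h, d in zip(head_pos, display_center))
    norm = math.sqrt(sum(c * c for c in sight))
    cos_angle = sum(s * n for s, n in zip(sight, display_normal)) / norm
    return math.acos(max(-1.0, min(1.0, cos_angle)))

def perceived_image_shift(pop_out_m, angle_rad):
    """Rough lateral shift of a virtual image floating pop_out_m in front
    of the display when it is viewed off-axis by angle_rad."""
    return pop_out_m * math.tan(angle_rad)

# Example: a head 30 cm to the side of a display 70 cm away shifts a
# 20 cm pop-out image by roughly 8-9 cm in this simplified model.
angle = observation_angle((0.3, 0.0, 0.7), (0.0, 0.0, 0.0))
print(perceived_image_shift(0.20, angle))
```

Feeding this corrected image position into the proximity tests of the earlier sketches is what makes the virtual-touch detection robust across seating positions.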
- Referring to FIG. 1, the vehicle user interface unit 100 may be configured to provide acoustic feedback to the user 140 when the user-controlled object 150 is within a predetermined distance of an element of the 3D object 111 in the virtual 3D image 110. The acoustic feedback may reduce the attention the user 140 must pay to the display 102 while operating the vehicle user interface 100. Example implementations may use a variety of techniques for providing user feedback, including outputting different sounds for different events. For example, a first audio signal may be provided when the finger of the user approaches the 3D object 111, and may be generated with a frequency that changes according to the distance from the 3D object 111. A second sound may be provided when the finger of the user virtually touches an interaction element, and a third sound when the finger of the user reaches a corner or an edge of a face of the 3D object 111. Additional acoustic signals may be provided for preselection, activation or execution of an interaction element, or for rotating the 3D object 111. The added acoustic feedback, with variations in the generated sounds whose meanings are understood by the user, may substantially reduce the attention to the 3D object 111 required of the user to operate the vehicle user interface unit 100. The acoustic feedback signals may be generated by the evaluation unit 135 and played out using an amplifier and a loudspeaker, neither of which is illustrated in the figures.
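- As an illustration, the event tones and the distance-dependent approach tone might be organized as below. Every frequency, range and event name here is an assumption made for the sketch; the patent only requires that the different events be acoustically distinguishable.

```python
BASE_FREQ_HZ = 400.0      # assumed tone at the edge of the feedback zone
MAX_FREQ_HZ = 1200.0      # assumed tone when the finger reaches the object
FEEDBACK_RANGE_M = 0.10   # assumed distance at which feedback begins

EVENT_TONES_HZ = {        # distinct cues for discrete events (values assumed)
    'virtual_touch': 880.0,
    'edge_or_corner': 660.0,
    'preselect': 740.0,
    'rotate': 520.0,
}

def approach_tone(distance_m):
    """Tone frequency that rises as the finger approaches the 3D object;
    returns None (silence) outside the feedback zone."""
    if distance_m >= FEEDBACK_RANGE_M:
        return None
    closeness = 1.0 - distance_m / FEEDBACK_RANGE_M
    return BASE_FREQ_HZ + closeness * (MAX_FREQ_HZ - BASE_FREQ_HZ)

# Example: halfway into the feedback zone the tone sits at 800 Hz.
print(approach_tone(0.05))
```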
- As described above with reference to FIGS. 1 and 2A-2C, the vehicle user interface unit 100 provides a virtual, three-dimensional graphical user interface on which a plurality of interaction elements, such as menu items or information elements, may be clearly arranged and easily accessed. Each face of the 3D object may also display a part of a menu structure, such as a menu or a submenu, to allow the user to access a menu item of a submenu without the need to browse through hierarchically higher menus.
- It is to be understood by those of ordinary skill in the art that the vehicle user interface unit 100 may include other components, such as mechanical control elements for user interaction, further display components and similar components. The functional units shown in FIG. 1 may be implemented in a variety of ways. The evaluation unit 135 may be implemented using a microprocessor, which may be the same microprocessor that performs functions for the display control unit 120 as described above or, in other implementations, a separate microprocessor. The display control unit 120 and the evaluation unit 135 may be implemented as software functions running on a microprocessor.
- The microprocessor may be the microprocessor of the vehicle electronic device that uses the user interface unit 100 for user interaction. The vehicle electronic device may be a head unit that controls vehicular functions and other electronic devices, which may include a multimedia or a navigation system; the vehicle electronic device may also be a less complex system, such as a car stereo. The vehicle user interface unit 100 may also be provided as a component that is separate from the vehicle electronic device. For example, the vehicle user interface unit 100 may be implemented inside a headrest and communicate with the vehicle electronic device using a wired or wireless communication interface. By providing the vehicle user interface unit 100 in the headrest, a passenger in the rear passenger compartment of a vehicle may make use of the vehicle user interface unit 100. Multiple vehicle user interface units 100 may also be provided in the vehicle compartment.
- FIG. 3 is a flow chart illustrating operation of an example of a method 300 for operating a vehicle user interface unit. The vehicle user interface unit 100 of FIG. 1 may be configured to perform the method described with reference to FIG. 3. In the method 300 shown in FIG. 3, an image is generated for perception by the user as a virtual 3D image in step 302. The virtual 3D image may be generated using the 3D display unit 101 in FIG. 1, for example. In step 304, the image is generated as a virtual 3D image having at least two regions (the regions 112 and 113 of FIG. 1, for example), each having a plurality of interaction elements. Each region is arranged in a different spatial plane, which multiplies the amount of information that can be presented to the user. In step 306, the location of the user's hand and index finger is detected using, for example, the input unit 130 in FIG. 1.
- In step 310, the position in space at which the user observes the virtual 3D image may be determined by making use of an additional camera to track the user's face. The position at which the user observes the virtual 3D image provides a more precise determination of the viewing angle along which the user observes the display. In addition, the relative positions of the tip of the index finger of the user's hand and the elements provided in the virtual image may be determined with more precision.
- In step 312, the motion of the tip of the user's finger to within a predetermined distance from an interaction element arranged on the 3D object may be detected and interpreted as a selection of the corresponding interaction element. The function associated with the interaction element may then be executed. Examples of such functions include the adjustment of a parameter such as a volume setting or a temperature setting, the selection of a destination in a navigation application, the selection and playback of a media file, the initiation of a communication via a mobile telephony network or a car-to-car communication system, or any other desired function.
- The user may access an interaction element located on a different face of the 3D object by using a finger to initiate a command to change the spatial arrangement of the at least two regions having the interaction elements.
In step 314, a command to change the spatial arrangement of the object may be performed in response to the movement of the index finger of the user's hand to within a predetermined distance from a corner or an edge of one of the regions, followed by another movement of the index finger in a predetermined direction. As described above with reference to FIGS. 2A-2C, the user may, for example, place a finger on the corner of the cube and drag it in one direction, resulting in the rotation of the cube. The user may rotate the cube so that the desired interaction element faces the user. The interaction elements of particular submenus on other faces may be in view of the user via the partial transparency of the faces, and are easily accessed by the simple gesture used to rotate the cube. The gesture described with reference to step 314 precludes the need for the user to browse through a plurality of menu levels in the hierarchy of menus to find the desired function.
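- Pulling the steps of FIG. 3 together, the method maps naturally onto a per-frame loop. The sketch below is glue code only: every object and method on it (`display_unit.render`, `input_unit.locate_fingertip`, and so on) is an assumed interface standing in for the units of FIG. 1, and `TouchSelector` and `rotation_command` refer to the earlier sketches.

```python
def run_interface(display_unit, input_unit, object3d, selector):
    """Single-threaded event loop sketching the flow of method 300 (FIG. 3)."""
    while True:
        display_unit.render(object3d)                     # steps 302 and 304
        fingertip = input_unit.locate_fingertip()         # step 306
        head = input_unit.locate_head()                   # step 310
        fingertip = input_unit.refine(fingertip, head)    # sharper viewing angle
        action = selector.update(fingertip,
                                 object3d.front_face_elements())
        if action and action[0] == 'execute':             # step 312
            action[1].execute()                           # run the element's function
        rotation = rotation_command(fingertip,
                                    input_unit.grab_point(),
                                    object3d.corners())   # step 314
        if rotation:
            object3d.rotate(rotation)                     # re-arrange the regions
```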
- The examples of implementations described above may be modified in a variety of ways without departing from the scope of the invention. For example, the 3D object generated by the display control unit 120 in FIG. 1 may be configured as another type of polyhedron or as a sphere, with spherical caps forming the regions in which the interaction elements are placed. Some regions may include interaction elements in the form of information elements, which may, for example, display the current status of vehicle electronic systems or other vehicle systems, navigation information or other information. Other regions may include interaction elements in the form of menu items for executing functions, entering further submenus, adjusting parameters, and performing other functions. Both types of interaction elements may also be combined on a region. As noted above, the functional units of the vehicle user interface unit may be implemented in a variety of ways, for example as common or separate integrated circuits, as software code running on a microprocessor, or as a combination of hardware and software components.
- It is to be understood that in the above description of example implementations, the partitioning of the system into functional blocks or units as shown in the drawings is not to be construed as indicating that these units are necessarily implemented as physically separate units. Rather, functional blocks or units shown or described may be implemented as separate units, circuits, chips or circuit elements, and one or more functional blocks or units may also be implemented in a common circuit, chip, circuit element or unit.
- It will be understood, and is appreciated by persons skilled in the art, that one or more processes, sub-processes, or process steps described in connection with FIGS. 1-3 may be performed by hardware and/or software. If the process is performed by software, the software may reside in software memory (not shown) in a suitable electronic processing component or system, such as one or more of the functional components or modules schematically depicted in FIG. 1. The software in software memory may include an ordered listing of executable instructions for implementing logical functions (that is, "logic" that may be implemented either in digital form, such as digital circuitry or source code, or in analog form, such as analog circuitry or an analog source such as an analog electrical, sound or video signal), and may selectively be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that may selectively fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this disclosure, a "computer-readable medium" is any means that may contain, store or communicate the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable medium may selectively be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device. More specific examples, though a non-exhaustive list, of computer-readable media would include the following: a portable computer diskette (magnetic), a RAM (electronic), a read-only memory ("ROM") (electronic), an erasable programmable read-only memory (EPROM or Flash memory) (electronic) and a portable compact disc read-only memory ("CDROM") (optical). Note that the computer-readable medium may even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
- The foregoing description of implementations has been presented for purposes of illustration and description. It is not exhaustive and does not limit the claimed inventions to the precise form disclosed. Modifications and variations are possible in light of the above description or may be acquired from practicing the invention. The claims and their equivalents define the scope of the invention.
Claims (24)
1. A vehicle user interface unit for a vehicle electronic device, comprising:
a three-dimensional (“3D”) display unit having a display, the 3D display unit being configured to display an image perceivable by a user as a virtual 3D image at least partially located in front of the display when the user observes the display;
a display control unit configured to control generation of the image by the 3D display unit where the virtual 3D image includes a 3D object, the 3D object having at least two regions located in different spatial planes, each region of the at least two regions having interaction elements;
a detector configured to determine a position of a user's head and determine a location of the virtual 3D image based on the determined position of the user's head; and
an input unit configured to:
determine an angle along which the user perceives the display based on the determined position of the user's head and based on the determined location of the virtual 3D image;
detect, from the determined perception of the user, a location of a finger of the user within a predetermined distance from at least one of the interaction elements; and
interpret, as a selection of the at least one of the interaction elements, the detection of the location of the finger within the predetermined distance.
2. The vehicle user interface unit of claim 1 , where the display control unit is configured to generate the image indicating an active region of the at least two regions in which the interaction elements are selectable and interaction elements of the at least one other region are not selectable, where the active region is a region of the at least two regions that is located closer to an observing user in the virtual 3D image than the at least one other region.
3. The vehicle user interface unit of claim 1 , where the display control unit is configured to generate the image where the 3D object is a cube or a cuboid.
4. The vehicle user interface unit of claim 1 , where the display control unit is configured to generate the image where the 3D object is oriented with a face facing the user, the face facing the user being an active region in which the interaction elements are selectable.
5. The vehicle user interface unit of claim 1 , where the input unit is configured to detect a second predefined variation of the finger and to interpret the second predefined variation as a command to change a spatial arrangement of the at least two regions.
6. The vehicle user interface unit of claim 1 , where the display control unit is configured to change a spatial arrangement of the at least two regions by rotating the 3D object to orient a different face of the 3D object having different interaction elements to face the user.
7. The vehicle user interface unit of claim 5 , where the input unit is configured to detect positioning of the finger of the user at a boundary or a corner of one of the at least two regions and movement of the finger over a predetermined distance as the second predefined variation of the finger.
8. The vehicle user interface unit of claim 1 , where the vehicle user interface unit is configured to provide an acoustical feedback to the user when the finger comes to within a predetermined distance of an element of the 3D object or when a selection of one of the interaction elements is detected.
9. The vehicle user interface unit of claim 1 , where the input unit includes a stereoscopic camera configured to monitor an area adjacent to a position at which the virtual 3D image is created for detecting the location and a variation of the finger.
10. The vehicle user interface unit of claim 1 , further comprising:
a camera for monitoring a viewpoint from which the user observes the display, where the user interface unit is configured to determine a position at which the virtual 3D image is seen by the user observing the display based on the viewpoint.
11. The vehicle user interface unit of claim 1 , where the 3D display unit is an autostereoscopic 3D display unit.
12. The vehicle user interface unit of claim 1 , where the 3D display unit is housed in a face plate of the vehicle electronic device.
13. A method of operating a vehicle user interface unit of a vehicle electronic device, the method comprising:
generating, by a processor, a virtual three-dimensional (“3D”) image at least partially located in front of a display of a 3D display unit used to generate the virtual 3D image;
controlling, by the processor, the generation of the virtual 3D image to include a 3D object, the 3D object having at least two regions located in different spatial planes, each region of the at least two regions having interaction elements;
determining, by the processor, a position of a user's head;
determining, by the processor, a location of the virtual 3D image from a perception of the user based on the determined position of the user's head;
determining, by the processor, an angle along which the user perceives the display based on the determined position of the user's head and based on the determined location of the virtual 3D image; and
detecting, by the processor, from the determined perception of the user, a location of a finger of the user and interpreting a detection of a predefined variation of the finger as a selection of one of the interaction elements in the virtual 3D image, where the predefined variation of the finger includes movement of the finger to within a predetermined distance of an interaction element in the virtual 3D image.
14. A vehicle infotainment system comprising:
an infotainment control system for performing infotainment functions; and
a vehicle user interface unit for providing user access to control of the infotainment functions, the vehicle user interface unit having:
a three-dimensional (“3D”) display unit having a display, the 3D display unit being configured to display an image configured for perception by a user as a virtual 3D image at least partially located in front of the display when the user observes the display;
a display control unit configured to control generation of the image by the 3D display unit where the virtual 3D image includes a 3D object, the 3D object having at least two regions located in different spatial planes, each region of the at least two regions having interaction elements;
a detector configured to determine a position of a user's head and determine a location of the virtual 3D image based on the determined position of the user's head; and
an input unit configured to determine an angle along which the user perceives the display based on the determined position of the user's head and based on the determined location of the virtual 3D image, to detect, from the determined perception of the user, a location of a finger of the user within a predetermined distance from at least one of the interaction elements, and to interpret a detection of a predefined variation of the finger from the determined perception of the user as a selection of one of the interaction elements in the virtual 3D image.
15. The vehicle infotainment system of claim 14 , where the display control unit is configured to generate the image indicating an active region of the at least two regions in which the interaction elements are selectable and interaction elements of the at least one other region are not selectable, where the active region is a region of the at least two regions that is located closer to an observing user in the virtual 3D image than the at least one other region.
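For illustration, the active-region rule of claim 15 (only the spatial plane drawn nearest the observer accepts input) might be reduced to the following sketch, under the assumption that each region can be summarised by a 3D centre point:

```python
from dataclasses import dataclass, field
import math


@dataclass
class Region:
    center: tuple                 # 3D centre of the region's spatial plane
    elements: list = field(default_factory=list)
    selectable: bool = False      # only the active region accepts input


def mark_active_region(regions, user_pos):
    """Enable selection only on the region closest to the observing user."""
    nearest = min(regions, key=lambda r: math.dist(r.center, user_pos))
    for region in regions:
        region.selectable = region is nearest
    return nearest
```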
16. The vehicle infotainment system of claim 14 , where the display control unit is configured to generate the image where the 3D object is a cube or a cuboid.
17. The vehicle infotainment system of claim 14 , where the display control unit is configured to generate the image where the 3D object is oriented with a face facing the user, the face facing the user being an active region in which the interaction elements are selectable.
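In the simplest dashboard case, orienting the 3D object so that one face points at the user (claim 17) amounts to a yaw rotation about the vertical axis. A sketch assuming a y-up, z-forward coordinate frame:

```python
import math


def face_toward_user_yaw(cube_center, user_pos):
    """Yaw (radians) turning the cube's front-face normal toward the user."""
    dx = user_pos[0] - cube_center[0]   # lateral offset of the user's head
    dz = user_pos[2] - cube_center[2]   # forward offset of the user's head
    return math.atan2(dx, dz)           # zero when the user sits straight ahead
```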
18. The vehicle infotainment system of claim 14 , where the input unit includes a stereoscopic camera configured to monitor an area adjacent to a position at which the virtual 3D image is created for detecting the location and a variation of the finger.
19. A vehicle cabin comprising:
an electronic device having a vehicle user interface unit for providing user access to control of functions of the electronic device, the vehicle user interface unit having:
a three-dimensional (“3D”) display unit having a display, the 3D display unit being configured to display an image configured for perception by a user as a virtual 3D image at least partially located in front of the display when the user observes the display;
a display control unit configured to control generation of the image by the 3D display unit where the virtual 3D image includes a 3D object, the 3D object having at least two regions located in different spatial planes, each region of the at least two regions having interaction elements;
a detector configured to determine a position of a user's head and determine a location of the virtual 3D image based on the determined position of the user's head; and
an input unit configured to determine an angle along which the user perceives the display based on the determined position of the user's head and based on the determined location of the virtual 3D image, to detect, from the determined perception of the user, a location of a finger of the user within a predetermined distance from at least one of the interaction elements, and to interpret a detection of a predefined variation of the finger from the determined perception of the user as a selection of one of the interaction elements in the virtual 3D image.
20. The vehicle cabin of claim 19 , where the predefined variation of the finger includes a movement of the finger to within a predetermined distance from at least one of the interaction elements, the input unit being further configured to interpret a detection of the movement as a selection of a corresponding interaction element.
21. The vehicle cabin of claim 19 , where the display control unit is configured to generate the image indicating an active region of the at least two regions in which the interaction elements are selectable and interaction elements of the at least one other region are not selectable, where the active region is a region of the at least two regions located closer to an observing user in the virtual 3D image than the at least one other region.
22. The vehicle cabin of claim 19 , where the display control unit is configured to generate the image where the 3D object is a cube or a cuboid.
23. The vehicle cabin of claim 19 , where the display control unit is configured to generate the image where the 3D object is oriented with a face facing the user, the face facing the user being an active region in which the interaction elements are selectable.
24. The vehicle cabin of claim 19 , where the input unit includes a stereoscopic camera configured to monitor an area adjacent to a position at which the virtual 3D image is created for detecting the location and a variation of the finger.
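Claims 9, 18 and 24 locate the finger with a stereoscopic camera. The textbook pinhole-stereo relation, depth = focal length × baseline / disparity, is the usual basis for such detection; the helper below is that generic formulation, not the patent's specific implementation.

```python
def fingertip_depth(disparity_px, focal_px, baseline_m):
    """Depth of a fingertip seen by a calibrated stereo pair (metres).

    disparity_px: horizontal pixel offset of the fingertip between views
    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the two camera centres in metres
    """
    if disparity_px <= 0:
        raise ValueError("fingertip must be visible in both views")
    return focal_px * baseline_m / disparity_px
```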
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/698,692 US20150234569A1 (en) | 2010-03-30 | 2015-04-28 | Vehicle user interface unit for a vehicle electronic device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10003477A EP2372512A1 (en) | 2010-03-30 | 2010-03-30 | Vehicle user interface unit for a vehicle electronic device |
EP10003477.6 | 2010-03-30 | ||
US13/076,243 US9030465B2 (en) | 2010-03-30 | 2011-03-30 | Vehicle user interface unit for a vehicle electronic device |
US14/698,692 US20150234569A1 (en) | 2010-03-30 | 2015-04-28 | Vehicle user interface unit for a vehicle electronic device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/076,243 Continuation US9030465B2 (en) | 2010-03-30 | 2011-03-30 | Vehicle user interface unit for a vehicle electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150234569A1 (en) | 2015-08-20
Family
ID=42320535
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/076,243 Active 2031-11-13 US9030465B2 (en) | 2010-03-30 | 2011-03-30 | Vehicle user interface unit for a vehicle electronic device |
US14/698,692 Abandoned US20150234569A1 (en) | 2010-03-30 | 2015-04-28 | Vehicle user interface unit for a vehicle electronic device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/076,243 Active 2031-11-13 US9030465B2 (en) | 2010-03-30 | 2011-03-30 | Vehicle user interface unit for a vehicle electronic device |
Country Status (6)
Country | Link |
---|---|
US (2) | US9030465B2 (en) |
EP (1) | EP2372512A1 (en) |
JP (1) | JP2011210239A (en) |
KR (1) | KR20110109974A (en) |
CN (1) | CN102207770A (en) |
CA (1) | CA2730379C (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140168415A1 (en) * | 2012-12-07 | 2014-06-19 | Magna Electronics Inc. | Vehicle vision system with micro lens array |
US20160239080A1 (en) * | 2015-02-13 | 2016-08-18 | Leap Motion, Inc. | Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments |
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control |
US9697643B2 (en) | 2012-01-17 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US9823764B2 (en) * | 2014-12-03 | 2017-11-21 | Microsoft Technology Licensing, Llc | Pointer projection for natural user input |
US9934580B2 (en) | 2012-01-17 | 2018-04-03 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9996638B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Predictive information for free space gesture control and communication |
US10353532B1 (en) | 2014-12-18 | 2019-07-16 | Leap Motion, Inc. | User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments |
US10429923B1 (en) | 2015-02-13 | 2019-10-01 | Ultrahaptics IP Two Limited | Interaction engine for creating a realistic experience in virtual reality/augmented reality environments |
US10585193B2 (en) | 2013-03-15 | 2020-03-10 | Ultrahaptics IP Two Limited | Determining positional information of an object in space |
US10627913B2 (en) * | 2016-05-13 | 2020-04-21 | Visteon Global Technologies, Inc. | Method for the contactless shifting of visual information |
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US10846942B1 (en) | 2013-08-29 | 2020-11-24 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11099653B2 (en) | 2013-04-26 | 2021-08-24 | Ultrahaptics IP Two Limited | Machine responsiveness to dynamic user movements and gestures |
US11354787B2 (en) | 2018-11-05 | 2022-06-07 | Ultrahaptics IP Two Limited | Method and apparatus for correcting geometric and optical aberrations in augmented reality |
US11353962B2 (en) | 2013-01-15 | 2022-06-07 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US11567578B2 (en) | 2013-08-09 | 2023-01-31 | Ultrahaptics IP Two Limited | Systems and methods of free-space gestural interaction |
US11720180B2 (en) | 2012-01-17 | 2023-08-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US11740705B2 (en) | 2013-01-15 | 2023-08-29 | Ultrahaptics IP Two Limited | Method and system for controlling a machine according to a characteristic of a control object |
US11778159B2 (en) | 2014-08-08 | 2023-10-03 | Ultrahaptics IP Two Limited | Augmented reality with motion sensing |
US11775033B2 (en) | 2013-10-03 | 2023-10-03 | Ultrahaptics IP Two Limited | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation |
Families Citing this family (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8933876B2 (en) | 2010-12-13 | 2015-01-13 | Apple Inc. | Three dimensional user interface session control |
US9035876B2 (en) | 2008-01-14 | 2015-05-19 | Apple Inc. | Three-dimensional user interface session control |
US8872762B2 (en) | 2010-12-08 | 2014-10-28 | Primesense Ltd. | Three dimensional user interface cursor control |
WO2012144666A1 (en) * | 2011-04-19 | 2012-10-26 | Lg Electronics Inc. | Display device and control method therof |
US8881051B2 (en) | 2011-07-05 | 2014-11-04 | Primesense Ltd | Zoom-based gesture user interface |
US9459758B2 (en) | 2011-07-05 | 2016-10-04 | Apple Inc. | Gesture-based interface with enhanced features |
US9377865B2 (en) | 2011-07-05 | 2016-06-28 | Apple Inc. | Zoom-based gesture user interface |
US9030498B2 (en) | 2011-08-15 | 2015-05-12 | Apple Inc. | Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface |
US9218063B2 (en) * | 2011-08-24 | 2015-12-22 | Apple Inc. | Sessionless pointing user interface |
CN103999018B (en) * | 2011-12-06 | 2016-12-28 | 汤姆逊许可公司 | The user of response three-dimensional display object selects the method and system of posture |
DE102012000201A1 (en) | 2012-01-09 | 2013-07-11 | Daimler Ag | Method and device for operating functions displayed on a display unit of a vehicle using gestures executed in three-dimensional space as well as related computer program product |
EP2802476B1 (en) * | 2012-01-09 | 2017-01-11 | Audi AG | Method and device for generating a 3d representation of a user interface in a vehicle |
DE102012000274A1 (en) * | 2012-01-10 | 2013-07-11 | Daimler Ag | A method and apparatus for operating functions in a vehicle using gestures executed in three-dimensional space and related computer program product |
US9229534B2 (en) | 2012-02-28 | 2016-01-05 | Apple Inc. | Asymmetric mapping for tactile and non-tactile user interfaces |
KR101318244B1 (en) | 2012-02-29 | 2013-10-15 | 한국과학기술연구원 | System and Method for Implemeting 3-Dimensional User Interface |
CN103365483B (en) * | 2012-04-09 | 2016-09-21 | 深圳泰山体育科技股份有限公司 | Realize the system and method for virtual screen |
DE102012007761A1 (en) * | 2012-04-20 | 2013-10-24 | Marquardt Gmbh | operating device |
US9069455B2 (en) * | 2012-06-22 | 2015-06-30 | Microsoft Technology Licensing, Llc | 3D user interface for application entities |
DE102012216193B4 (en) * | 2012-09-12 | 2020-07-30 | Continental Automotive Gmbh | Method and device for operating a motor vehicle component using gestures |
EP2752730B1 (en) * | 2013-01-08 | 2019-04-03 | Volvo Car Corporation | Vehicle display arrangement and vehicle comprising a vehicle display arrangement |
US20150363072A1 (en) * | 2013-01-10 | 2015-12-17 | Fox Sports Productions, Inc. | System, method and interface for viewer interaction relative to a 3d representation of a vehicle |
CN107264283B (en) * | 2013-03-29 | 2019-07-19 | 株式会社斯巴鲁 | Transporting equipment display device |
US20140368425A1 (en) * | 2013-06-12 | 2014-12-18 | Wes A. Nagara | Adjusting a transparent display with an image capturing device |
EP2821884B1 (en) * | 2013-07-01 | 2018-09-05 | Airbus Operations GmbH | Cabin management system having a three-dimensional operating panel |
CN103529947A (en) * | 2013-10-31 | 2014-01-22 | 京东方科技集团股份有限公司 | Display device and control method thereof and gesture recognition method |
JP2017521970A (en) * | 2014-04-30 | 2017-08-03 | ビステオン グローバル テクノロジーズ インコーポレイテッド | System and method for calibrating stereoscopic display alignment in a vehicle |
US20160165197A1 (en) * | 2014-05-27 | 2016-06-09 | Mediatek Inc. | Projection processor and associated method |
CN104199556B (en) * | 2014-09-22 | 2018-01-16 | 联想(北京)有限公司 | A kind of information processing method and device |
EP3007050A1 (en) * | 2014-10-08 | 2016-04-13 | Volkswagen Aktiengesellschaft | User interface and method for adapting a menu bar on a user interface |
GB2533777A (en) * | 2014-12-24 | 2016-07-06 | Univ Of Hertfordshire Higher Education Corp | Coherent touchless interaction with steroscopic 3D images |
US10247941B2 (en) * | 2015-01-19 | 2019-04-02 | Magna Electronics Inc. | Vehicle vision system with light field monitor |
EP3088991B1 (en) * | 2015-04-30 | 2019-12-25 | TP Vision Holding B.V. | Wearable device and method for enabling user interaction |
ES2819239T3 (en) | 2015-05-30 | 2021-04-15 | Leia Inc | Vehicle display system |
KR101910383B1 (en) * | 2015-08-05 | 2018-10-22 | 엘지전자 주식회사 | Driver assistance apparatus and vehicle including the same |
US20170161950A1 (en) * | 2015-12-08 | 2017-06-08 | GM Global Technology Operations LLC | Augmented reality system and image processing of obscured objects |
US20170161949A1 (en) * | 2015-12-08 | 2017-06-08 | GM Global Technology Operations LLC | Holographic waveguide hud side view display |
US10078884B2 (en) * | 2015-12-21 | 2018-09-18 | Siemens Aktiengesellschaft | System and method for processing geographical information with a central window and frame |
EP3249497A1 (en) * | 2016-05-24 | 2017-11-29 | Harman Becker Automotive Systems GmbH | Eye tracking |
DE102016216577A1 (en) * | 2016-09-01 | 2018-03-01 | Volkswagen Aktiengesellschaft | A method of interacting with image content displayed on a display device in a vehicle |
CN106980362A (en) * | 2016-10-09 | 2017-07-25 | 阿里巴巴集团控股有限公司 | Input method and device based on virtual reality scenario |
DE102016220075A1 (en) * | 2016-10-14 | 2018-04-19 | Audi Ag | Motor vehicle and method for 360 ° field detection |
WO2018105552A1 (en) * | 2016-12-09 | 2018-06-14 | 株式会社ソニー・インタラクティブエンタテインメント | Sound control device, sound control method, and program |
JP6673288B2 (en) * | 2017-04-27 | 2020-03-25 | 株式会社デンソー | Display device for vehicles |
CN107202426A (en) * | 2017-05-18 | 2017-09-26 | 珠海格力电器股份有限公司 | control device and method, water heater |
IT201700091628A1 (en) * | 2017-08-08 | 2019-02-08 | Automotive Lighting Italia Spa | Virtual man-machine interface system and corresponding virtual man-machine interface procedure for a vehicle. |
EP3466761B1 (en) * | 2017-10-05 | 2020-09-09 | Ningbo Geely Automobile Research & Development Co. Ltd. | A display system and method for a vehicle |
JP2019086911A (en) * | 2017-11-02 | 2019-06-06 | 三菱自動車工業株式会社 | In-vehicle user interface device |
US10267960B1 (en) | 2018-02-05 | 2019-04-23 | GM Global Technology Operations LLC | Cloaking device and apparatus |
JP6730552B2 (en) * | 2018-05-14 | 2020-07-29 | 株式会社ユピテル | Electronic information system and its program |
JP7076331B2 (en) * | 2018-08-10 | 2022-05-27 | 本田技研工業株式会社 | Vehicle system |
WO2020032307A1 (en) * | 2018-08-10 | 2020-02-13 | 엘지전자 주식회사 | Vehicular display system |
US10696239B2 (en) | 2018-11-28 | 2020-06-30 | Toyota Motor Engineering & Manufacturing North America, Inc. | Use of a lenticular lens array to apply a display inside of a vehicle |
EP3670228B1 (en) * | 2018-12-17 | 2022-04-13 | Audi Ag | A display device and a vehicle comprising the display device |
DE112019006103B4 (en) * | 2019-01-10 | 2022-10-20 | Mitsubishi Electric Corporation | Information display control device, method, program and recording medium |
DE102019105764B3 (en) | 2019-03-07 | 2020-08-06 | Gestigon Gmbh | Method for calibrating a user interface and user interface |
DE102019127183A1 (en) * | 2019-10-09 | 2021-04-15 | Audi Ag | Display system for displaying an operating instruction for an operating element in a motor vehicle |
CN112061137B (en) * | 2020-08-19 | 2022-01-14 | 一汽奔腾轿车有限公司 | Man-vehicle interaction control method outside vehicle |
CN113895229B (en) * | 2021-10-11 | 2022-05-13 | 黑龙江天有为电子有限责任公司 | Display method of automobile instrument information |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6704114B1 (en) * | 1998-11-16 | 2004-03-09 | Robert Bosch Gmbh | Device for detecting whether a vehicle seat is occupied by means of a stereoscopic image recording sensor |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07114451A (en) * | 1993-10-19 | 1995-05-02 | Canon Inc | Method and device for selecting three-dimension menu |
JPH0935584A (en) | 1995-07-21 | 1997-02-07 | Yazaki Corp | Display device for vehicle |
US5678015A (en) * | 1995-09-01 | 1997-10-14 | Silicon Graphics, Inc. | Four-dimensional graphical user interface |
JPH09134269A (en) * | 1995-11-10 | 1997-05-20 | Matsushita Electric Ind Co Ltd | Display controller |
DE69728108T2 (en) * | 1996-04-19 | 2004-09-30 | Koninklijke Philips Electronics N.V. | DATA PROCESSING SYSTEM WITH SOFT KEYBOARD SWITCHES BETWEEN DIRECT AND INDIRECT CHARACTERS |
JPH09298759A (en) * | 1996-05-08 | 1997-11-18 | Sanyo Electric Co Ltd | Stereoscopic video display device |
US6215898B1 (en) * | 1997-04-15 | 2001-04-10 | Interval Research Corporation | Data processing system and method |
US6064354A (en) | 1998-07-01 | 2000-05-16 | Deluca; Michael Joseph | Stereoscopic user interface method and apparatus |
JP2000075991A (en) * | 1998-08-28 | 2000-03-14 | Aqueous Research:Kk | Information input device |
JP3390677B2 (en) * | 1998-10-26 | 2003-03-24 | 三菱電機株式会社 | Menu display method and menu display device |
JP2001092579A (en) * | 1999-09-24 | 2001-04-06 | Toshiba Corp | Information display device |
JP2002259989A (en) * | 2001-03-02 | 2002-09-13 | Gifu Prefecture | Pointing gesture detecting method and its device |
JP2004318325A (en) * | 2003-04-14 | 2004-11-11 | Dainippon Printing Co Ltd | Information input system, its program, and electronic form |
JP2004334590A (en) | 2003-05-08 | 2004-11-25 | Denso Corp | Operation input device |
JP2005138755A (en) * | 2003-11-07 | 2005-06-02 | Denso Corp | Device and program for displaying virtual images |
JP2005196530A (en) * | 2004-01-08 | 2005-07-21 | Alpine Electronics Inc | Space input device and space input method |
DE102005017313A1 (en) * | 2005-04-14 | 2006-10-19 | Volkswagen Ag | Method for displaying information in a means of transport and instrument cluster for a motor vehicle |
JP4318047B2 (en) * | 2005-06-06 | 2009-08-19 | ソニー株式会社 | 3D object display device, 3D object switching display method, and 3D object display program |
US8279168B2 (en) * | 2005-12-09 | 2012-10-02 | Edge 3 Technologies Llc | Three-dimensional virtual-touch human-machine interface system and method therefor |
US9075441B2 (en) * | 2006-02-08 | 2015-07-07 | Oblong Industries, Inc. | Gesture based control using three-dimensional information extracted over an extended depth of field |
DE102006032117A1 (en) | 2006-07-12 | 2008-01-24 | Volkswagen Ag | Information system for transport medium, particularly motor vehicles, has input unit and indicator with display, where input unit has device to record position of object before display with in transport medium |
WO2008132724A1 (en) * | 2007-04-26 | 2008-11-06 | Mantisvision Ltd. | A method and apparatus for three dimensional interaction with autosteroscopic displays |
DE102007039442A1 (en) * | 2007-08-21 | 2009-02-26 | Volkswagen Ag | Method for displaying information in a vehicle and display device for a vehicle |
JP4645678B2 (en) * | 2008-05-08 | 2011-03-09 | ソニー株式会社 | Information input / output device, information input / output method, and computer program |
US20100050129A1 (en) | 2008-08-19 | 2010-02-25 | Augusta Technology, Inc. | 3D Graphical User Interface For Simultaneous Management Of Applications |
US20100128112A1 (en) * | 2008-11-26 | 2010-05-27 | Samsung Electronics Co., Ltd | Immersive display system for interacting with three-dimensional content |
2010
- 2010-03-30 EP EP10003477A patent/EP2372512A1/en not_active Ceased

2011
- 2011-01-27 CA CA2730379A patent/CA2730379C/en active Active
- 2011-02-21 JP JP2011035242A patent/JP2011210239A/en active Pending
- 2011-03-29 KR KR1020110028079A patent/KR20110109974A/en not_active Application Discontinuation
- 2011-03-30 US US13/076,243 patent/US9030465B2/en active Active
- 2011-03-30 CN CN2011100777543A patent/CN102207770A/en active Pending

2015
- 2015-04-28 US US14/698,692 patent/US20150234569A1/en not_active Abandoned
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6704114B1 (en) * | 1998-11-16 | 2004-03-09 | Robert Bosch Gmbh | Device for detecting whether a vehicle seat is occupied by means of a stereoscopic image recording sensor |
Non-Patent Citations (1)
Title |
---|
Surman et al., "Head Tracked Single and Multi-user Autostereoscopic Displays," CVMP 2006, Nov. 2006, IEEE. *
Cited By (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9934580B2 (en) | 2012-01-17 | 2018-04-03 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9697643B2 (en) | 2012-01-17 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US10699155B2 (en) | 2012-01-17 | 2020-06-30 | Ultrahaptics IP Two Limited | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US10565784B2 (en) | 2012-01-17 | 2020-02-18 | Ultrahaptics IP Two Limited | Systems and methods for authenticating a user according to a hand of the user moving in a three-dimensional (3D) space |
US9741136B2 (en) | 2012-01-17 | 2017-08-22 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US9778752B2 (en) | 2012-01-17 | 2017-10-03 | Leap Motion, Inc. | Systems and methods for machine control |
US11720180B2 (en) | 2012-01-17 | 2023-08-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control |
US10410411B2 (en) | 2012-01-17 | 2019-09-10 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US11308711B2 (en) | 2012-01-17 | 2022-04-19 | Ultrahaptics IP Two Limited | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US10366308B2 (en) | 2012-01-17 | 2019-07-30 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US20140168415A1 (en) * | 2012-12-07 | 2014-06-19 | Magna Electronics Inc. | Vehicle vision system with micro lens array |
US20210291751A1 (en) * | 2012-12-07 | 2021-09-23 | Magna Electronics Inc. | Vehicular driver monitoring system with camera having micro lens array |
US11874970B2 (en) | 2013-01-15 | 2024-01-16 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US11740705B2 (en) | 2013-01-15 | 2023-08-29 | Ultrahaptics IP Two Limited | Method and system for controlling a machine according to a characteristic of a control object |
US11353962B2 (en) | 2013-01-15 | 2022-06-07 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US10585193B2 (en) | 2013-03-15 | 2020-03-10 | Ultrahaptics IP Two Limited | Determining positional information of an object in space |
US11693115B2 (en) | 2013-03-15 | 2023-07-04 | Ultrahaptics IP Two Limited | Determining positional information of an object in space |
US11099653B2 (en) | 2013-04-26 | 2021-08-24 | Ultrahaptics IP Two Limited | Machine responsiveness to dynamic user movements and gestures |
US11567578B2 (en) | 2013-08-09 | 2023-01-31 | Ultrahaptics IP Two Limited | Systems and methods of free-space gestural interaction |
US11282273B2 (en) | 2013-08-29 | 2022-03-22 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US10846942B1 (en) | 2013-08-29 | 2020-11-24 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11776208B2 (en) | 2013-08-29 | 2023-10-03 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11461966B1 (en) | 2013-08-29 | 2022-10-04 | Ultrahaptics IP Two Limited | Determining spans and span lengths of a control object in a free space gesture control environment |
US11775033B2 (en) | 2013-10-03 | 2023-10-03 | Ultrahaptics IP Two Limited | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation |
US11010512B2 (en) | 2013-10-31 | 2021-05-18 | Ultrahaptics IP Two Limited | Improving predictive information for free space gesture control and communication |
US9996638B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Predictive information for free space gesture control and communication |
US11868687B2 (en) | 2013-10-31 | 2024-01-09 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11568105B2 (en) | 2013-10-31 | 2023-01-31 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11778159B2 (en) | 2014-08-08 | 2023-10-03 | Ultrahaptics IP Two Limited | Augmented reality with motion sensing |
US9823764B2 (en) * | 2014-12-03 | 2017-11-21 | Microsoft Technology Licensing, Llc | Pointer projection for natural user input |
US10353532B1 (en) | 2014-12-18 | 2019-07-16 | Leap Motion, Inc. | User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments |
US11599237B2 (en) | 2014-12-18 | 2023-03-07 | Ultrahaptics IP Two Limited | User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments |
US10921949B2 (en) | 2014-12-18 | 2021-02-16 | Ultrahaptics IP Two Limited | User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments |
US11392212B2 (en) | 2015-02-13 | 2022-07-19 | Ultrahaptics IP Two Limited | Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments |
US11237625B2 (en) | 2015-02-13 | 2022-02-01 | Ultrahaptics IP Two Limited | Interaction engine for creating a realistic experience in virtual reality/augmented reality environments |
US10936080B2 (en) | 2015-02-13 | 2021-03-02 | Ultrahaptics IP Two Limited | Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments |
US10429923B1 (en) | 2015-02-13 | 2019-10-01 | Ultrahaptics IP Two Limited | Interaction engine for creating a realistic experience in virtual reality/augmented reality environments |
US10261594B2 (en) | 2015-02-13 | 2019-04-16 | Leap Motion, Inc. | Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments |
US9696795B2 (en) * | 2015-02-13 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments |
US20160239080A1 (en) * | 2015-02-13 | 2016-08-18 | Leap Motion, Inc. | Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments |
US10627913B2 (en) * | 2016-05-13 | 2020-04-21 | Visteon Global Technologies, Inc. | Method for the contactless shifting of visual information |
US11354787B2 (en) | 2018-11-05 | 2022-06-07 | Ultrahaptics IP Two Limited | Method and apparatus for correcting geometric and optical aberrations in augmented reality |
US11798141B2 (en) | 2018-11-05 | 2023-10-24 | Ultrahaptics IP Two Limited | Method and apparatus for calibrating augmented reality headsets |
Also Published As
Publication number | Publication date |
---|---|
CN102207770A (en) | 2011-10-05 |
US9030465B2 (en) | 2015-05-12 |
KR20110109974A (en) | 2011-10-06 |
US20110242102A1 (en) | 2011-10-06 |
CA2730379C (en) | 2013-05-14 |
EP2372512A1 (en) | 2011-10-05 |
CA2730379A1 (en) | 2011-09-30 |
JP2011210239A (en) | 2011-10-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9030465B2 (en) | Vehicle user interface unit for a vehicle electronic device | |
US11714592B2 (en) | Gaze-based user interactions | |
US11643047B2 (en) | Image display device, image display system, image display method and program | |
JP5781080B2 (en) | 3D stereoscopic display device and 3D stereoscopic display processing device | |
US9697347B2 (en) | Mobile terminal and control method thereof | |
US10821831B2 (en) | Method for interacting with image contents displayed on a display device in a transportation vehicle | |
US10133407B2 (en) | Display apparatus, display system, method for controlling display apparatus, and program | |
US9756319B2 (en) | Virtual see-through instrument cluster with live video | |
CN104765445B (en) | Eye vergence detection on a display | |
US9256288B2 (en) | Apparatus and method for selecting item using movement of object | |
US8923686B2 (en) | Dynamically configurable 3D display | |
EP2752730B1 (en) | Vehicle display arrangement and vehicle comprising a vehicle display arrangement | |
JP5889408B2 (en) | Information processing apparatus, method, and program | |
JP5860144B2 (en) | Information processing apparatus, method, and program | |
US20180157324A1 (en) | Method and Device for Interacting with a Graphical User Interface | |
CN110968187A (en) | Remote touch detection enabled by a peripheral device | |
WO2019217081A1 (en) | Selecting a text input field using eye gaze | |
US9304670B2 (en) | Display device and method of controlling the same | |
US20130187845A1 (en) | Adaptive interface system | |
JP6638392B2 (en) | Display device, display system, display device control method, and program | |
WO2019002673A1 (en) | Methods, apparatus, systems, computer programs for enabling consumption of virtual content for mediated reality | |
JP5465334B2 (en) | 3D stereoscopic display device | |
US11068054B2 (en) | Vehicle and control method thereof | |
EP3088991B1 (en) | Wearable device and method for enabling user interaction | |
JP6740613B2 (en) | Display device, display device control method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HARMAN BECKER AUTOMOTIVE SYSTEMS GMBH, GERMANY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HESS, WOLFGANG;REEL/FRAME:035555/0361
Effective date: 20100209
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |