US20150153936A1 - Integrated multimedia device for vehicle - Google Patents

Integrated multimedia device for vehicle Download PDF

Info

Publication number
US20150153936A1
Authority
US
United States
Prior art keywords
display device
input
touch
user input
display
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/522,242
Inventor
Mi Jung Lim
Dong A Oh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hyundai Mobis Co Ltd
Original Assignee
Hyundai Mobis Co Ltd
Application filed by Hyundai Mobis Co Ltd filed Critical Hyundai Mobis Co Ltd
Assigned to HYUNDAI MOBIS CO., LTD. reassignment HYUNDAI MOBIS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LIM, MI JUNG, OH, DONG A
Publication of US20150153936A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W 50/08: Interaction between the driver and the control system
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00: Arrangement of adaptations of instruments
    • B60K 35/10
    • B60K 35/213
    • B60K 35/29
    • B60K 35/60
    • B60K 35/81
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 11/00: Arrangements for holding or mounting articles, not otherwise provided for
    • B60R 11/02: Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R 16/02: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for: electric constitutive elements
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W 50/08: Interaction between the driver and the control system
    • B60W 50/10: Interpretation of driver requests or demands
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • B60K 2360/11
    • B60K 2360/113
    • B60K 2360/115
    • B60K 2360/1438
    • B60K 2360/184
    • B60K 2360/782

Definitions

  • the memories referenced in the Description below may be any medium that participates in providing code to the one or more software, hardware, and/or firmware components for execution. Such memories may be implemented in any suitable form, including, but not limited to, non-volatile media, volatile media, and transmission media.
  • Non-volatile media include, for example, optical or magnetic disks.
  • Volatile media include dynamic memory.
  • Transmission media include coaxial cables, copper wire and fiber optics. Transmission media can also take the form of acoustic, optical, or electromagnetic waves.
  • Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a compact disk-read only memory (CD-ROM), a rewriteable compact disk (CDRW), a digital video disk (DVD), a rewriteable DVD (DVD-RW), any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a random-access memory (RAM), a programmable read only memory (PROM), an erasable programmable read only memory (EPROM), a FLASH-EPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which information may be read by, for example, a controller/processor.
  • FIGS. 3A and 3B are views of an analog instrument cluster, according to exemplary embodiments.
  • the cluster 110 may display a speed, an RPM, a traveling distance, an amount of fuel (or fuel level) of the vehicle, whether a lamp is operated, an opened state of a door, a temperature of the engine, a position of the transmission, and/or the like.
  • the speed and the RPM of the vehicle may be displayed in an analog manner, e.g., via an analog gauge.
  • the fuel level and the temperature of the engine may also be displayed in an analog manner.
  • the cluster 110 may include one or more screens to output other information, e.g., the traveling distance, the position of the transmission, etc.
  • the cluster 110 may be operated in conjunction with the display device 130, and may display a radio frequency that is being received or available for tuning.
  • the cluster 110 may be operated in conjunction with the navigation feature of the display device 130, and may display a current position and direction of the vehicle.
  • the cluster 110 may display a map of the navigation feature of the display device 130, as well as one or more of the informational elements previously described in association with FIG. 3A.
  • FIGS. 3C and 3D are views of a digital instrument cluster, according to exemplary embodiments.
  • the cluster 110 may display a speed, an RPM, a traveling distance, an amount of fuel of the vehicle, whether a lamp is operated, an opened state of a door, a temperature of the engine, a position of the transmission, and/or the like. It is noted, however, that the various informational elements (e.g., the speed of the vehicle, the RPM of the vehicle, etc.) may be displayed in a digital manner, such as via one or more screens.
  • the cluster 110 may be operated in conjunction with the display device 130, and may display a radio frequency that is being received or available for tuning.
  • the cluster 110 may be operated in conjunction with the navigation feature of the display device 130, and may display a current position and direction of the vehicle.
  • the cluster 110 may display a map of the navigation feature of the display device 130, as well as one or more of the informational elements previously described in association with FIG. 3C.
  • FIGS. 4A, 4B, and 4C are views of a display device, according to exemplary embodiments.
  • a map of the navigation feature may be displayed in an upper region of the display unit 133 of the display device 130.
  • a radio frequency, which is being received, may be displayed in a middle region of the display unit 133 of the display device 130.
  • a plurality of radio frequencies, which are stored in advance, may be displayed in the middle region of the display unit 133 of the display device 130.
  • a region for adjusting volume, a region for displaying content that is currently being activated (or played), and a region for setting gesture modes may be displayed in a lower region of the display unit 133 of the display device 130.
  • when a drag input in a downward direction is received after a touch input to the region where the navigation map is displayed, the navigation map may be enlarged and displayed, such as illustrated in FIG. 4B.
  • a quick setting mode screen of the display device may be displayed, such as illustrated in FIG. 4C.
  • the quick setting mode screen may include screens for setting media files, adjusting volume, adjusting sound, adjusting a screen, adjusting a watch, setting Bluetooth, setting wireless fidelity (Wi-Fi), setting an air conditioner, etc.
  • the aforementioned settings may be performed via the quick setting mode screen through one or more touch inputs of the user.
  • FIG. 5A is a view of a first user input unit, according to exemplary embodiments.
  • the user input unit 510 may be mounted as part of a steering wheel of the vehicle, e.g., at a left side of the steering wheel, so that a driver may easily manipulate the user input unit 510 using the thumb of their left hand.
  • the user input unit 510 may include first to fourth buttons 511, 512, 513, and 514, cluster type conversion buttons 515 and 516, a home button 517, and a back key 518.
  • the screen displayed on the cluster 110 may be converted from the main menu screen to the map screen of the navigation feature.
  • the screen displayed on the cluster 110 may be converted from the map screen of the navigation feature to the main menu screen of the user interface.
  • the screen displayed on the cluster 110 may be converted into another menu.
  • the screen displayed on the cluster 110 may be converted from the analog cluster display to the digital cluster display, or from the digital cluster display to the analog cluster display.
  • the screen displayed on the cluster 110 may be converted into the main menu screen.
  • the screen displayed on the cluster 110 may be converted into a previous screen. One way to model these conversions in code is sketched below.
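  • The button unit 510 effectively drives a small state machine over cluster screens. The sketch below models the conversions listed above; the specific button-to-screen bindings (e.g., which of buttons 511-514 selects the navigation map) are hypothetical assumptions, since the text does not spell them out.

```python
# Hypothetical model of the cluster-screen conversions driven by the button
# unit 510. Button bindings are illustrative assumptions, not taken from the
# patent text.
class ClusterScreenController:
    def __init__(self):
        self.screen = "main_menu"   # screen currently shown on the cluster 110
        self.style = "analog"       # analog or digital cluster rendering
        self.history = []           # enables the back key 518

    def _convert_to(self, screen):
        self.history.append(self.screen)
        self.screen = screen

    def press_menu_button(self):          # e.g., one of buttons 511-514
        self._convert_to("navigation_map" if self.screen == "main_menu"
                         else "main_menu")

    def press_cluster_type_button(self):  # buttons 515 and 516
        self.style = "digital" if self.style == "analog" else "analog"

    def press_home_button(self):          # home button 517
        self._convert_to("main_menu")

    def press_back_key(self):             # back key 518
        if self.history:
            self.screen = self.history.pop()


controller = ClusterScreenController()
controller.press_menu_button()          # main menu -> navigation map
controller.press_cluster_type_button()  # analog -> digital rendering
controller.press_back_key()             # back to the main menu
print(controller.screen, controller.style)  # main_menu digital
```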
  • FIGS. 5B, 5C, 5D, and 5E are views of a second user input unit, according to exemplary embodiments.
  • the user input unit may be a touch pad 530 mounted in a region of the steering wheel.
  • the user may manipulate an operation of the cluster 110 using the touch pad 530 .
  • the user is an occupant of the vehicle.
  • a map displayed via the cluster 110 may zoom in or zoom out.
  • the cluster screen may display information regarding a next file to be played or a previously played file.
  • the cluster screen may be converted into a navigation screen, such as illustrated in FIG. 5C.
  • the cluster screen may be converted into a DMB screen or a radio screen, such as illustrated in FIG. 5D.
  • a main menu may be displayed via the cluster screen, such as illustrated in FIG. 5E. A sketch of this drag-based conversion of the cluster screen follows below.
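  • A minimal sketch of how drag inputs on the touch pad 530 could be classified and mapped to the cluster-screen conversions of FIGS. 5C, 5D, and 5E. The coordinate convention, jitter threshold, and direction-to-screen bindings are assumptions for illustration.

```python
# Classify a single-finger drag on the steering-wheel touch pad by comparing
# its start and end coordinates, then convert the cluster screen. Values are
# illustrative; the patent describes behavior, not thresholds.
def classify_drag(x0, y0, x1, y1, min_travel=30):
    """Return 'left', 'right', 'up', 'down', or None for a too-short drag."""
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_travel:  # ignore taps and jitter
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"       # screen y grows downward (assumed)

# Hypothetical direction-to-screen bindings, cf. FIGS. 5C-5E.
DRAG_TO_CLUSTER_SCREEN = {
    "up": "navigation",
    "left": "dmb",
    "right": "radio",
    "down": "main_menu",
}

direction = classify_drag(100, 200, 180, 205)
print(direction, DRAG_TO_CLUSTER_SCREEN.get(direction))  # right radio
```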
  • FIGS. 6A, 6B, and 6C are views of the display device to illustrate a method of controlling the integrated multimedia device of FIG. 1, according to exemplary embodiments.
  • the display device 130 may display a radio frequency, which is being received, in a middle region of the display unit 133.
  • the occupant may move their left hand in a leftward direction in a state in which their left hand is opened.
  • the vision sensor 140 may sense and recognize the gesture as a valid user input, and the display device 130 may change the menu from a radio menu to a media menu.
  • the vision sensor 140 may sense and recognize the gesture as a valid user input, and the display device 130 may change the menu from the radio menu to a DMB menu.
  • the display device 130 may display a radio frequency, which is being received, in the middle region of the display unit 133.
  • the occupant may move their left hand downward in a state in which their left hand is opened.
  • the vision sensor 140 may sense and recognize the gesture as a valid user input, and the display device 130 may enlarge and display the navigation map.
  • the vision sensor 140 may sense and recognize the gesture as a valid user input. In this manner, the display device 130 may display a quick menu setting screen.
  • the display device 130 may display a radio frequency, which is being received, in the middle region of the display unit 133.
  • the vision sensor 140 may sense and recognize the gesture as a valid user input. In this manner, content provided from the display device 130 to the cluster 110 may be displayed via the cluster 110 .
  • FIGS. 7A and 7B are views of the display device to illustrate another method of controlling the integrated multimedia device of FIG. 1, according to exemplary embodiments.
  • the display device 130 may change the menu from the radio menu to the media menu.
  • the display device 130 may change the menu from the radio menu to the DMB menu.
  • the display device 130 may display a radio frequency, which is being received, in the middle region of the display unit 133.
  • a touch screen of the display device 130 may sense the touch gesture. In this manner, content provided from the display device 130 to the cluster 110 may be displayed via the cluster 110. A sketch of the contact-count check behind this gesture follows below.
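  • The five-point gesture of FIGS. 7A and 7B reduces to counting simultaneous contacts before honoring the follow-up drag. A sketch under that reading, with an assumed contact threshold:

```python
# Treat a drag as a cluster-mirroring command only when it follows a
# multi-touch of three or more (e.g., five) simultaneous contacts.
# The threshold and names are illustrative assumptions.
def is_cluster_mirror_gesture(touch_points, drag_followed, min_contacts=3):
    """touch_points: list of (x, y) contacts sensed by the touch screen."""
    return len(touch_points) >= min_contacts and drag_followed

five_fingers = [(12, 40), (30, 38), (48, 41), (66, 39), (84, 42)]
if is_cluster_mirror_gesture(five_fingers, drag_followed=True):
    print("display content from display device 130 via cluster 110")
```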

Abstract

An integrated multimedia device for a vehicle includes a display device, an instrument cluster, and a user input device. The display device is configured to provide at least one of a digital multimedia broadcasting (DMB) function and a navigation function. The display device is further configured to detect touch inputs. The instrument cluster is configured to display content from the display device in response to a user input. The user input device is configured to manipulate at least one of the display device and the instrument cluster in response to detection of a touch input.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority from and the benefit of Korean Patent Application No. 10-2013-0146839, filed on Nov. 29, 2013, which is incorporated by reference for all purposes as if set forth herein.
  • BACKGROUND
  • 1. Field
  • Exemplary embodiments relate to an integrated multimedia device for a vehicle, and, more particularly, to an integrated multimedia device for a vehicle that may be operated in conjunction with an instrument cluster and a method of controlling the same.
  • 2. Discussion of the Background
  • A vehicle may include a multimedia device configured to provide a digital multimedia broadcasting (DMB) function and a navigation function. A display unit of the multimedia device may be incorporated as part of a center fascia of the vehicle. With the display unit being incorporated as part of the center fascia, various functions may be provided, such as, for example, the DMB function, the navigation function, a temperature adjustment function, a multimedia playback function, etc. It is noted, however, that because a screen of the display unit may be relatively large, and because the display unit and an instrument cluster of the vehicle do not communicate information with one another, the attention of a driver may be distracted during operation or information retrieval, which may cause an accident. For instance, the driver may need to perform a touch input while watching the screen and may need to bend their body downward to manipulate the screen. In this manner, the risk of an accident may increase, especially when the driver attempts to drive the vehicle and manipulate the display unit at the same time.
  • It is also noted that, in conventional vehicles, only simple control keys, such as a key for adjusting volume and a mute key associated with a multimedia device, are provided on a steering wheel. In this manner, it may be inconvenient for the driver to manipulate the multimedia device. As the frequency of use of the multimedia device for performing, for example, navigation features, phone calls, multimedia functions, etc., increases while the vehicle is in transit, it may be necessary to operate the multimedia device and the instrument cluster in conjunction with one another.
  • The above information disclosed in this Background section is only for enhancement of understanding of the background of the inventive concept, and, therefore, it may contain information that does not form the prior art that is already known in this country to a person of ordinary skill in the art.
  • SUMMARY
  • Exemplary embodiments provide an integrated multimedia device for a vehicle operated in conjunction with an instrument cluster.
  • Additional aspects will be set forth in the detailed description which follows, and, in part, will be apparent from the disclosure, or may be learned by practice of the inventive concept.
  • According to exemplary embodiments, an integrated multimedia device for a vehicle includes a display device, an instrument cluster, and a user input device. The display device is configured to provide at least one of a digital multimedia broadcasting (DMB) function and a navigation function. The display device is configured to detect touch inputs. The instrument cluster is configured to display content from the display device in response to a user input. The user input device is configured to manipulate at least one of the display device and the instrument cluster in response to detection of a touch input.
  • According to exemplary embodiments, a method includes: causing, at least in part, a user input to a system of a vehicle to be detected, the system being configured to provide at least one of a digital multimedia broadcasting (DMB) function and a navigation function; determining, in response to detection of the user input, an operation corresponding to the user input; and generating, in accordance with the operation, a control signal configured to affect the display of content via at least one of a display device and an instrument cluster of the system.
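  • The method above reduces to a three-stage pipeline: detect an input, determine the corresponding operation, and generate a control signal for the display device and/or the cluster. A minimal sketch of that pipeline follows; every identifier in it (InputEvent, ControlSignal, the operation table) is a hypothetical stand-in, not an API defined by this disclosure.

```python
# Illustrative detect -> determine -> generate pipeline for the claimed
# method. Sources, gesture names, and targets are assumptions for the sketch.
from dataclasses import dataclass
from typing import Optional

@dataclass
class InputEvent:
    source: str  # "touch_pad", "touch_screen", or "vision_sensor"
    kind: str    # e.g., "pinch_out", "drag_down", "open_hand_hold"

@dataclass
class ControlSignal:
    target: str     # "display_device", "cluster", or "both"
    operation: str  # operation the target should perform

OPERATION_TABLE = {
    ("touch_pad", "pinch_out"): ControlSignal("cluster", "zoom_in_map"),
    ("touch_pad", "pinch_in"): ControlSignal("cluster", "zoom_out_map"),
    ("touch_screen", "drag_down"): ControlSignal("display_device", "show_full_map"),
    ("vision_sensor", "open_hand_hold"): ControlSignal("both", "mirror_content_on_cluster"),
}

def handle_user_input(event: InputEvent) -> Optional[ControlSignal]:
    """Determine the operation for a detected input; generate its signal."""
    return OPERATION_TABLE.get((event.source, event.kind))

# A held open hand pushes display-device content onto the instrument cluster.
print(handle_user_input(InputEvent("vision_sensor", "open_hand_hold")))
```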
  • According to exemplary embodiments, user convenience of a multimedia device may be improved via enablement of various types of user inputs, such as touch inputs to a screen of the multimedia device, touch inputs to a touch pad mounted on a steering wheel of the vehicle, and gesture inputs detected via one or more sensors. In this manner, a driver of the vehicle may perform an input action without directly interacting with the screen of the multimedia device, which may increase user convenience and safety when the driver is driving the vehicle, as well as reduce the potential for driver distraction. Furthermore, because the instrument cluster may output content received from the multimedia device, the driver may drive the vehicle and acquire information from the multimedia device without directing attention to the screen of the multimedia device, and, thereby, away from a direction of travel.
  • The foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the claimed subject matter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are included to provide a further understanding of the inventive concept, and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the inventive concept, and, together with the description, serve to explain principles of the inventive concept.
  • FIG. 1 is a view of a cockpit of a vehicle including an integrated multimedia device, according to exemplary embodiments.
  • FIG. 2 is a block diagram of the integrated multimedia device of FIG. 1, according to exemplary embodiments.
  • FIGS. 3A and 3B are views of an analog instrument cluster, according to exemplary embodiments.
  • FIGS. 3C and 3D are views of a digital instrument cluster, according to exemplary embodiments.
  • FIGS. 4A, 4B, and 4C are views of a display device, according to exemplary embodiments.
  • FIG. 5A is a view of a first user input unit, according to exemplary embodiments.
  • FIGS. 5B, 5C, 5D, and 5E are views of a second user input unit, according to exemplary embodiments.
  • FIGS. 6A, 6B, and 6C are views of the display device to illustrate a method of controlling the integrated multimedia device of FIG. 1, according to exemplary embodiments.
  • FIGS. 7A and 7B are views of the display device to illustrate another method of controlling the integrated multimedia device of FIG. 1, according to exemplary embodiments.
  • DETAILED DESCRIPTION OF THE ILLUSTRATED EMBODIMENTS
  • In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of various exemplary embodiments. It is apparent, however, that various exemplary embodiments may be practiced without these specific details or with one or more equivalent arrangements. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring various exemplary embodiments. In the accompanying figures, the size and relative sizes of various components, devices, elements, etc., may be exaggerated for clarity and descriptive purposes. Also, like reference numerals denote like elements.
  • When a component, device, element, etc., is referred to as being “on,” “connected to,” or “coupled to” another component, device, element, etc., it may be directly on, connected to, or coupled to the other component, device element, etc., or intervening components, devices, elements, etc., may be present. When, however, a component, device, element, etc., is referred to as being “directly on,” “directly connected to,” or “directly coupled to” another component, device, element, etc., there are no intervening components, devices, elements, etc., present. For the purposes of this disclosure, “at least one of X, Y, and Z” and “at least one selected from the group consisting of X, Y, and Z” may be construed as X only, Y only, Z only, or any combination of two or more of X, Y, and Z, such as, for instance, XYZ, XYY, YZ, and ZZ. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
  • Although the terms first, second, etc., may be used herein to describe various components, devices, elements, regions, etc., these components, devices, elements, regions, etc., are not to be limited by these terms. These terms are used to distinguish one component, device, element, region, etc., from another component, device, element, region, etc. In this manner, a first component, device, element, region, etc., discussed below may be termed a second component, device, element, region, etc., without departing from the teachings of the present disclosure.
  • Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper,” and the like, may be used herein for descriptive purposes, and, thereby, to describe one component, device, element, or feature's relationship to another component(s), device(s), element(s), or feature(s) as illustrated in the drawings. Spatially relative terms are intended to encompass different orientations of an apparatus in use, operation, and/or manufacture in addition to the orientation depicted in the drawings. For example, if an apparatus in the drawings is turned over, components described as “below” or “beneath” other components would then be oriented “above” the other components. In this manner, the exemplary term “below” can encompass both an orientation of above and below. Furthermore, an apparatus may be otherwise oriented (e.g., rotated 90 degrees or at other orientations), and, as such, the spatially relative descriptors used herein are to be interpreted accordingly.
  • The terminology used herein is for the purpose of describing exemplary embodiments and is not intended to be limiting. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Moreover, the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, components, devices, regions, and/or groups thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, devices, regions, and/or groups thereof.
  • Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. Terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense, unless expressly so defined herein.
  • FIG. 1 is a view of a cockpit of a vehicle including an integrated multimedia device, according to exemplary embodiments. FIG. 2 is a block diagram of the integrated multimedia device of FIG. 1.
  • Referring to FIGS. 1 and 2, the integrated multimedia device for a vehicle may include an instrument cluster 110, a user input unit 120, a display device 130, a vision (or optical) sensor 140, and a control unit 150. Although specific reference will be made to this particular implementation, it is also contemplated that the integrated multimedia device may embody many forms and include multiple and/or alternative components. For example, it is contemplated that the components of the integrated multimedia device may be combined, located in separate structures, and/or separate locations.
  • The instrument (or gauge) cluster 110 is a module that displays various types of information regarding a vehicle. Hereinafter, instrument cluster 110 will be referred to as simply cluster 110. For instance, the cluster 110 may display information regarding (or otherwise associated with) a speed of the vehicle and a revolution per minute (RPM) of an engine of the vehicle. In addition, the cluster 110 may display trip information regarding a traveling distance, fuel consumption, and/or the like. The cluster 110 may display information regarding a travelable distance and fuel consumption, such as instantaneous fuel consumption and average fuel consumption. It is also contemplated that the cluster 110 may display information regarding various types of states and/or operational components of the vehicle, such as opened states of a door and/or a trunk, an operating state of a lamp, an operating state of a side (or emergency) brake, an operating state of a wiper, a warning signal associated with a component (e.g., airbag, door, engine, gate, ignition, light, sunroof, tire, window, etc.) of the vehicle, an operating state of a feature (e.g., a cruise control feature, a blind spot monitoring feature, a transmission position feature, a traction control feature, a fluid monitoring feature, etc.), and/or the like. In addition, the cluster 110 may provide information concerning vehicle maintenance, date and/or time information, and/or the like.
  • According to exemplary embodiments, the cluster 110 may output and display contents provided (or otherwise received) from the display device 130. For example, the cluster 110 may display information regarding navigation, digital multimedia broadcasting (DMB), multimedia file playing, radio, etc., information, that are provided from the display device 130.
  • In exemplary embodiments, a user of the vehicle may manipulate an operation of the cluster 110 using, for instance, the user input unit 120. The user may be a driver or other type of occupant of the vehicle. The user input unit 120 may be a touch pad mounted on a steering wheel. It is also contemplated that the user input unit 120 may be disposed (or otherwise integrated) as part of a console unit, an integrated control system, a fascia, etc.
  • It is contemplated that a user of the vehicle may interact with the user input unit 120 to control aspects of the instrument cluster 110 and/or the display device 130. For example, a user may perform a multi-touch input to the user input unit 120, and, thereafter, perform a pinch-in input or a pinch-out input in a state in which navigational information is displayed via the cluster 110. In this manner, a map displayed via the cluster 110 may zoom in or zoom out. As another example, a user may perform a multi-touch input to the user input unit 120, and thereafter, perform a drag input in a predetermined direction in a state in which information regarding a playing multimedia file is displayed via the cluster 110. In this manner, a screen of the cluster 110 may display information regarding a next file to be played or a previous file that was already played. In yet another example, a user may perform a touch input to the user input unit 120, and, thereafter, perform a drag input in a predetermined direction in a state in which navigational information is displayed via the cluster 110. In this manner, the screen of the cluster 110 may be converted to a DMB screen or a radio screen. It is also contemplated that a user may perform a multi-touch input to the user input unit 120 using three or more (e.g., five) fingers, and, thereafter, perform a drag input collecting the respective touch inputs of the fingers to one position, in a state in which predetermined contents are displayed via the cluster 110. As such, a main menu may be displayed via the cluster 110. It is noted, however, that various other and/or alternative inputs may be performed to control the display of information via cluster 110.
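  • Pinch detection on the touch pad can be reduced to comparing the finger separation at the start and end of a two-finger gesture. The sketch below shows one such reduction; the threshold is an assumed value, not a parameter given by this disclosure.

```python
import math

# Decide whether a two-finger gesture on the steering-wheel touch pad is a
# pinch-out (zoom the cluster map in) or a pinch-in (zoom it out). The
# 20-pixel threshold is an illustrative assumption.
def pinch_direction(p0_start, p1_start, p0_end, p1_end, threshold=20.0):
    """Return 'zoom_in', 'zoom_out', or None from two finger tracks."""
    separation_start = math.dist(p0_start, p1_start)
    separation_end = math.dist(p0_end, p1_end)
    if separation_end - separation_start > threshold:
        return "zoom_in"   # fingers moved apart: pinch-out
    if separation_start - separation_end > threshold:
        return "zoom_out"  # fingers moved together: pinch-in
    return None            # separation barely changed: not a pinch

print(pinch_direction((0, 0), (40, 0), (-20, 0), (60, 0)))  # zoom_in
```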
  • According to exemplary embodiments, the display device 130 may include a display device input unit 131, a display device control unit 132, and a display unit 133. The display unit 133 and the display device input unit 131 may mutually have a layered structure, and may be configured as a touch screen. In this manner, the display unit 133 may serve as the display device input unit 131. It is contemplated, however, that additional and/or alternative forms of display device input units 131 may be utilized in association with display device 130, such as, for example, buttons, dials, levers, touch surfaces, wheels, etc. For descriptive purposes, however, the display unit 133 (and associated functions thereof) will be, hereinafter, considered in association with a touch screen implementation. The display device control unit 132 receives input from a user to the display device input unit 131 or the display unit 133, and controls the display device input unit 131 or the display unit 133 in order to perform a predetermined operation or function.
  • The display device 130 may be mounted in a region of a center fascia (or dashboard) and may be configured to display various types of information. For example, the display device 130 may display operating states of various types of buttons of (or positioned on or in) the center fascia. For instance, when a user operates an air conditioner or a heater by manipulating a button of the center fascia, the display device 130 may display an operating state of the air conditioner or the heater. The display device 130 may include a navigation function, a DMB function, a radio function, a multimedia file playing function, and/or the like. In response to detecting a user interaction with (or manipulation of) an input device associated with the display device (e.g., the display device input unit 131), the navigation, DMB, radio, multimedia file playing, etc., functions may be performed.
  • In exemplary embodiments, the interaction or manipulation of an input device may be a touch-based input to the display unit 133. For example, a user may perform a touch input to a map region displayed via display unit 133, and, thereafter, may perform a drag input in a downward direction in a state in which a map of the navigation feature is displayed in an upper region of the display unit 133 and the main menu is displayed in a middle region of the display unit 133. In response thereto, the display device control unit 132 may control the display unit 133 to display the navigation map on an overall screen of the display unit 133. As another example, when a user performs a drag input after a multi-touch input, the display device control unit 132 may control the menu screen to be changed. In this manner, the menu screen may be changed while having three-dimensional spatial characteristics. According to another example, a user may perform a pinch-in input or a pinch-out input in a state in which the navigation map is displayed via display unit 133. As such, the display device control unit 132 may control the map so that the map zooms in or zooms out. In a further example, a user may perform a multi-touch input at three or more (e.g., five) points of the display unit 133, and, thereafter, perform a drag input in a state in which a predetermined operation screen is displayed. As such, the control unit 150 may control the cluster 110 so that content provided from the display device 130 may be displayed via the cluster 110.
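  • In code, the display device control unit 132 behaves like a dispatcher keyed on the touched region, the follow-up gesture, and the contact count. A sketch under that reading, with hypothetical region and operation names:

```python
# Route a completed touch sequence on the display unit 133 to a display
# operation. Region, gesture, and operation names are illustrative.
def dispatch_display_gesture(region, gesture, contact_count=1):
    if gesture == "drag" and contact_count >= 3:    # e.g., five touch points
        return "mirror_content_on_cluster"
    if gesture == "drag" and contact_count == 2:    # multi-touch then drag
        return "change_menu_screen"                 # 3-D menu transition
    if region == "map" and gesture == "drag_down":
        return "show_map_full_screen"               # enlarge the navigation map
    if region == "map" and gesture == "pinch_out":
        return "zoom_in_map"
    if region == "map" and gesture == "pinch_in":
        return "zoom_out_map"
    return "ignore"                                 # unrecognized input

print(dispatch_display_gesture("map", "drag_down"))  # show_map_full_screen
print(dispatch_display_gesture("menu", "drag", 5))   # mirror_content_on_cluster
```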
  • According to exemplary embodiments, the vision sensor 140 may include one or more cameras, and may be configured to sense a gesture of the user. It is also contemplated that the vision sensor 140 may include a motion sensor, a charge-coupled device, etc. The vision sensor 140 transmits the sensed gesture of the user to the control unit 150. The control unit 150 transmits a control signal to the display device 130 so that the display device 130 may perform an operation that corresponds (or is otherwise mapped) to the gesture of the user that is received from the vision sensor 140. The display device 130, which has received the control signal from the control unit 150, may perform an operation that corresponds to the gesture of the user.
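  • The signal path just described (vision sensor 140 → control unit 150 → display device 130) can be pictured with a minimal callback sketch; all class names and the string-valued control signals are editorial assumptions.

```python
# Hypothetical sketch of the gesture signal path.
from typing import Callable

class DisplayDevice:
    def execute(self, control_signal: str) -> None:
        print(f"performing {control_signal}")

class ControlUnit:
    def __init__(self, display_device: DisplayDevice) -> None:
        self.display_device = display_device

    def on_gesture(self, gesture: str) -> None:
        # Translate the sensed gesture into a control signal.
        self.display_device.execute(f"operation_for_{gesture}")

class VisionSensor:
    def __init__(self, on_gesture: Callable[[str], None]) -> None:
        self._on_gesture = on_gesture

    def frame_captured(self, recognized_gesture: str) -> None:
        # Real recognition would run on camera frames; the label is
        # passed in directly to keep the sketch self-contained.
        self._on_gesture(recognized_gesture)

sensor = VisionSensor(ControlUnit(DisplayDevice()).on_gesture)
sensor.frame_captured("open_hand_wave_down")
# prints "performing operation_for_open_hand_wave_down"
```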
  • In exemplary embodiments, the vision sensor 140 senses an operation of the user when the user waves a hand (e.g., a right hand or a left hand) in an upward, downward, leftward, or rightward direction. The hand may be an opened hand, as will become more apparent below. For example, a gesture of the user may be sensed by the vision sensor 140 when the user lowers their hand in a downward direction in a state in which their hand is opened. In this manner, the control unit 150 may transmit a control signal, which corresponds to the gesture, to the display device 130. The display device 130, which has received the control signal, may perform an operation of displaying the map of the navigational feature on the display unit 133 in correspondence with the gesture. It is contemplated, however, that any other suitable operation/function may be performed.
  • For example, a gesture of the user may be sensed by the vision sensor 140 when the user raises their hand in an upward direction in a state in which the hand of the user is opened. As such, the control unit 150 may transmit a control signal, which corresponds to the gesture, to the display device 130. The display device 130, which has received the control signal, may perform an operation of displaying a quick menu screen on the display unit 133 in correspondence with the gesture. As another example, a gesture of the user may be sensed by the vision sensor 140 when the user moves their hand in a leftward or rightward direction in a state in which the hand of the user is opened. In this manner, the control unit 150 may transmit a control signal, which corresponds to the gesture, to the display device 130. As such, the display device 130 may perform an operation of changing the menu, which is currently activated, to another menu or other function. In another example, the vision sensor 140 may sense a gesture of the user when the user stops moving their hand for a predetermined (or set) period of time in a state in which the hand of the user is opened. As such, the control unit 150 may control the cluster 110 so that content provided from the display device 130 to the cluster 110 is displayed via the cluster 110.
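  • One plausible (purely illustrative) way to distinguish the four wave directions and the stationary hand described above is to compare the displacement of the hand centroid over a short window; the axis conventions and thresholds below are assumptions.

```python
# Hypothetical open-hand wave classifier over hand-centroid positions.
def classify_wave(trajectory: list[tuple[float, float]],
                  min_move: float = 40.0) -> str:
    """Return 'up', 'down', 'left', 'right', or 'hold'."""
    if len(trajectory) < 2:
        return "hold"
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    if abs(dx) < min_move and abs(dy) < min_move:
        return "hold"  # hand essentially stationary over the window
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    # Image coordinates grow downward: positive dy is a downward wave.
    return "down" if dy > 0 else "up"

# Per the text: 'down' -> display the navigation map, 'up' -> quick menu,
# 'left'/'right' -> change the active menu, 'hold' -> show display
# device content via the cluster.
```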
  • According to exemplary embodiments, the control unit (or controller) 150 controls operations of the cluster 110, the user input unit 120, the display device 130, and the vision sensor 140. When the control unit 150 receives an input signal from the user input unit 120, the vision sensor 140, or the display device input unit 131, the control unit 150 may transmit a control signal, which corresponds to the input signal, to the display device 130, the cluster 110, and/or any other suitable component of the vehicle.
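  • Purely by way of illustration, the controller's routing role may be sketched as a lookup from (input source, input signal) to (target component, control signal); every entry below is an assumed example, not a mapping recited in the disclosure.

```python
# Hypothetical routing table for control unit 150.
ROUTES: dict[tuple[str, str], tuple[str, str]] = {
    ("user_input_unit", "pinch_out"):      ("cluster", "zoom_map_in"),
    ("vision_sensor", "wave_down"):        ("display_device", "show_navigation_map"),
    ("display_input_unit", "five_finger"): ("cluster", "mirror_display_content"),
}

def route(source: str, signal: str) -> tuple[str, str]:
    """Return (target component, control signal) for a received input signal."""
    return ROUTES.get((source, signal), ("none", "ignore"))
```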
  • In exemplary embodiments, the cluster 110, the user input unit 120, the display device 130, the vision sensor 140, the control unit 150, and/or one or more components thereof, may be implemented via one or more general purpose and/or special purpose components, such as one or more discrete circuits, digital signal processing chips, integrated circuits, application specific integrated circuits, microprocessors, processors, programmable arrays, field programmable arrays, instruction set processors, and/or the like. As such, the features, functions, processes, etc., described herein may be implemented via software, hardware (e.g., general processor, digital signal processing (DSP) chip, an application specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), etc.), firmware, or a combination thereof. In this manner, the cluster 110, the user input unit 120, the display device 130, the vision sensor 140, the control unit 150, and/or one or more components thereof may include or otherwise be associated with one or more memories (not shown) including code (e.g., instructions) configured to cause the cluster 110, the user input unit 120, the display device 130, the vision sensor 140, the control unit 150, and/or one or more components thereof to perform one or more of the features, functions, processes, etc., described herein.
  • Although not illustrated, the memories may be any medium that participates in providing code to the one or more software, hardware, and/or firmware components for execution. Such memories may be implemented in any suitable form, including, but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, optical or magnetic disks. Volatile media include dynamic memory. Transmission media include coaxial cables, copper wire, and fiber optics. Transmission media can also take the form of acoustic, optical, or electromagnetic waves. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a compact disk-read only memory (CD-ROM), a rewriteable compact disk (CDRW), a digital video disk (DVD), a rewriteable DVD (DVD-RW), any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a random-access memory (RAM), a programmable read only memory (PROM), an erasable programmable read only memory (EPROM), a FLASH-EPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which information may be read by, for example, a controller/processor.
  • FIGS. 3A and 3B are views of an analog instrument cluster, according to exemplary embodiments.
  • As illustrated in FIG. 3A, the cluster 110 may display a speed, an RPM, a traveling distance, an amount of fuel (or fuel level) of the vehicle, whether a lamp is operated, an opened state of a door, a temperature of the engine, a position of the transmission, and/or the like. The speed and the RPM of the vehicle may be displayed in an analog manner, e.g., via an analog gauge. The fuel level and the temperature of the engine may also be displayed in an analog manner. It is further contemplated that the cluster 110 may include one or more screens to output other information, e.g., the traveling distance, the position of the transmission, etc. The cluster 110 may be operated in conjunction with the display device 130, and may display a radio frequency that is being received or available for tuning. The cluster 110 may be operated in conjunction with the navigation feature of the display device 130, and may display a current position and direction of the vehicle.
  • As illustrated in FIG. 3B, the cluster 110 may display a map of the navigation feature of the display device 130, as well as one or more of the informational elements previously described in association with FIG. 3A.
  • FIGS. 3C and 3D are views of a digital instrument cluster, according to exemplary embodiments.
  • As illustrated in FIG. 3C, the cluster 110 may display a speed, an RPM, a traveling distance, an amount of fuel of the vehicle, whether a lamp is operated, an opened state of a door, a temperature of the engine, a position of the transmission, and/or the like. It is noted, however, that the various informational elements (e.g., the speed of the vehicle, the RPM of the vehicle, etc.) may be displayed in a digital manner, such as, via one or more screens. In addition, the cluster 110 may be operated in conjunction with the display device 130, and may display a radio frequency that is being received or available for tuning. The cluster 110 may be operated in conjunction with the navigation feature of the display device 130, and may display a current position and direction of the vehicle.
  • As illustrated in FIG. 3D, the cluster 110 may display a map of the navigation feature of the display device 130, as well as one or more of the informational elements previously described in association with FIG. 3C.
  • FIGS. 4A, 4B, and 4C are views of a display device, according to exemplary embodiments.
  • As illustrated in FIG. 4A, a map of the navigation feature may be displayed in an upper region of the display unit 133 of the display device 130. A radio frequency, which is being received, may be displayed in a middle region of the display unit 133 of the display device 130. A plurality of radio frequencies, which are stored in advance, may be displayed in the middle region of the display unit 133 of the display device 130. A region for adjusting volume, a region for displaying content that is currently being activated (or played), and a region for setting gesture modes may be displayed in a lower region of the display unit 133 of the display device 130.
  • In exemplary embodiments, when a drag input in a downward direction is received after a touch input to the region where the navigation map is displayed, the navigation map may be enlarged and displayed, such as illustrated in FIG. 4B. When, for example, a drag input in an upward direction is received in the screen illustrated in FIG. 4A after a touch input to a quick setting region 410 is provided, a quick setting mode screen of the display device may be displayed, such as illustrated in FIG. 4C. The quick setting mode screen may include screens for setting media files, adjusting volume, adjusting sound, adjusting a screen, adjusting a watch, setting Bluetooth, setting wireless fidelity (Wi-Fi), setting an air conditioner, and the like. The aforementioned settings may be performed via the quick setting mode screen through one or more touch inputs of the user.
  • FIG. 5A is a view of a first user input unit, according to exemplary embodiments.
  • Referring to FIG. 5A, the user input unit 510 may be mounted as part of a steering wheel of the vehicle, e.g., at a left side of the steering wheel, so that a driver may easily manipulate the user input unit 510 using the thumb of their left hand. The user input unit 510 may include first to fourth buttons 511, 512, 513, and 514, cluster type conversion buttons 515 and 516, a home button 517, and a back key 518.
  • When an input of the first button 511 is received, the screen displayed on the cluster 110 may be converted from the main menu screen to the map screen of the navigation feature. When an input of the second button 512 is received, the screen displayed on the cluster 110 may be converted from the map screen of the navigation feature to the main menu screen of the user interface. When an input of the third button 513 or the fourth button 514 is received, the screen displayed on the cluster 110 may be converted into another menu. When an input of the cluster type conversion button 515 or 516 is received, the screen displayed on the cluster 110 may be converted from the analog cluster to the digital cluster, or from the digital cluster to the analog cluster. When an input of the home button 517 is received, the screen displayed on the cluster 110 may be converted into the main menu screen. When an input of the back key 518 is received, the screen displayed on the cluster 110 may be converted into a previous screen.
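  • For illustration, the button behavior of FIG. 5A reduces to a lookup keyed by the reference numerals above; the action names are editorial placeholders rather than disclosed identifiers.

```python
# Hypothetical steering-wheel button handler (numerals from FIG. 5A).
BUTTON_ACTIONS = {
    511: "show_navigation_map",   # main menu -> navigation map
    512: "show_main_menu",        # navigation map -> main menu
    513: "convert_to_other_menu",
    514: "convert_to_other_menu",
    515: "toggle_analog_digital_cluster",
    516: "toggle_analog_digital_cluster",
    517: "show_main_menu",        # home button
    518: "show_previous_screen",  # back key
}

def on_button(button_id: int) -> str:
    return BUTTON_ACTIONS.get(button_id, "ignore")
```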
  • FIGS. 5B, 5C, 5D, and 5E are views of a second user input unit, according to exemplary embodiments.
  • As illustrated in FIG. 5B, the user input unit may be a touch pad 530 mounted in a region of the steering wheel. The user may manipulate an operation of the cluster 110 using the touch pad 530. Again, the user is an occupant of the vehicle.
  • According to exemplary embodiments, when the occupant performs a multi-touch input to the touch pad 530, and, thereafter, performs a pinch-in input or a pinch-out input in a state in which the navigation feature is displayed via the cluster 110, a map displayed via the cluster 110 may zoom in or zoom out. As another example, when the occupant performs a multi-touch input to the touch pad 530, and, thereafter, performs a drag input in a predetermined direction in a state in which information regarding a playing multimedia file is displayed via the cluster 110, the cluster screen may display information regarding a next file to be played or a previously played file. In another example, when the occupant performs a touch input to the touch pad 530, and, thereafter, performs a drag input in an upward direction in a state in which a radio receiving frequency is displayed via the cluster 110, such as illustrated in FIG. 5D, the cluster screen may be converted into a navigation screen, such as illustrated in FIG. 5C.
  • According to exemplary embodiments, when the occupant performs a touch input to the touch pad 530, and, thereafter, performs a drag input in a downward direction in a state in which the navigation feature is displayed via the cluster 110, such as illustrated in FIG. 5C, the cluster screen may be converted into a DMB screen or a radio screen, such as illustrated in FIG. 5D. In another example, when the occupant performs a multi-touch input to the touch pad 530 using three or more (e.g., five) fingers, and, thereafter, performs a drag input to collect the respective touch inputs to one position in a state in which predetermined content is displayed via the cluster 110, a main menu may be displayed via the cluster screen, such as illustrated in FIG. 5E.
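  • The multi-finger "collect" input above can be illustrated (again, as an assumption rather than the disclosed method) by testing whether the touch points converge toward their common centroid between touch-down and release.

```python
# Hypothetical detector for the multi-finger collect gesture of FIG. 5E.
import math

def is_collect_gesture(start: list[tuple[float, float]],
                       end: list[tuple[float, float]],
                       shrink_ratio: float = 0.4) -> bool:
    if len(start) < 3 or len(start) != len(end):
        return False  # needs three or more tracked fingers

    def spread(points: list[tuple[float, float]]) -> float:
        cx = sum(p[0] for p in points) / len(points)
        cy = sum(p[1] for p in points) / len(points)
        return sum(math.dist(p, (cx, cy)) for p in points) / len(points)

    # Collected when the mean distance to the centroid drops sharply.
    return spread(end) < spread(start) * shrink_ratio
```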
  • FIGS. 6A, 6B, and 6C are views of the display device to illustrate a method of controlling the integrated multimedia device of FIG. 1, according to exemplary embodiments.
  • Referring to FIG. 6A, the display device 130 may display a radio frequency, which is being received, in a middle region of the display unit 133. As such, the occupant may move their left hand in a leftward direction in a state in which their left hand is opened. In this manner, the vision sensor 140 may sense and recognize the gesture as a valid user input, and the display device 130 may change the menu from a radio menu to a media menu. When the occupant opens the left hand and moves the opened left hand in a rightward direction in a state in which a radio frequency is displayed, the vision sensor 140 may sense and recognize the gesture as a valid user input, and the display device 130 may change the menu from the radio menu to a DMB menu.
  • Referring to FIG. 6B, the display device 130 may display a radio frequency, which is being received, in the middle region of the display unit 133. As such, the occupant may move their left hand downward in a state in which their left hand is opened. As such, the vision sensor 140 may sense and recognize the gesture as a valid user input, and the display device 130 may enlarge and display the navigation map. When the occupant opens their left hand and moves their opened left hand in an upward direction in a state in which a radio frequency is displayed, the vision sensor 140 may sense and recognize the gesture as a valid user input. In this manner, the display device 130 may display a quick menu setting screen.
  • Referring to FIG. 6C, the display device 130 may display a radio frequency, which is being received, in the middle region of the display unit 133. When, for example, the occupant stops moving their left hand for a predetermined period of time in a state in which their left hand is opened, the vision sensor 140 may sense and recognize the gesture as a valid user input. In this manner, content provided from the display device 130 to the cluster 110 may be displayed via the cluster 110.
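  • The stationary-hand trigger of FIG. 6C amounts to a dwell detector; the sketch below is an editorial illustration, with the hold time and movement radius chosen arbitrarily.

```python
# Hypothetical dwell detector: reports True once the hand centroid has
# stayed within a small radius for the required hold time.
import math
import time

class DwellDetector:
    def __init__(self, hold_seconds: float = 1.5, radius: float = 15.0):
        self.hold_seconds = hold_seconds
        self.radius = radius
        self._anchor: tuple[float, float] | None = None  # dwell start point
        self._anchor_time = 0.0

    def update(self, x: float, y: float) -> bool:
        """Feed one hand position per frame; True once the dwell completes."""
        now = time.monotonic()
        if self._anchor is None or math.dist(self._anchor, (x, y)) > self.radius:
            self._anchor, self._anchor_time = (x, y), now  # restart the dwell
            return False
        return now - self._anchor_time >= self.hold_seconds
```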
  • FIGS. 7A and 7B are views of the display device to illustrate another method of controlling the integrated multimedia device of FIG. 1, according to exemplary embodiments.
  • Referring to FIG. 7A, when a drag input in a leftward direction is received after a touch input is received in a state in which the radio receiving frequency is displayed, the display device 130 may change the menu from the radio menu to the media menu. When a drag input in a rightward direction is received after a touch input is received in a state in which the radio receiving frequency is displayed, the display device 130 may change the menu from the radio menu to the DMB menu.
  • Referring to FIG. 7B, the display device 130 may display a radio frequency, which is being received, in the middle region of the display unit 133. When the occupant inputs a multi-touch gesture for a predetermined period of time, a touch screen of the display device 130 may sense the touch gesture. In this manner, content provided from the display device 130 to the cluster 110 may be displayed via the cluster 110.
  • Although certain exemplary embodiments and implementations have been described herein, other embodiments and modifications will be apparent from this description. Accordingly, the inventive concept is not limited to such embodiments, but rather is intended to cover the broader scope of the presented claims, as well as various obvious modifications and equivalent arrangements.

Claims (20)

What is claimed is:
1. An integrated multimedia device for a vehicle, comprising:
a display device configured to provide at least one of a digital multimedia broadcasting (DMB) function and a navigation function, the display device being further configured to detect touch inputs;
an instrument cluster configured to display content from the display device in response to a user input; and
a user input device configured to manipulate at least one of the display device and the instrument cluster in response to detection of a touch input.
2. The integrated multimedia device of claim 1, further comprising:
a sensor configured to sense a gesture of an occupant of the vehicle,
wherein, in response to sensing of the gesture, the display device is configured to perform an operation corresponding to the gesture.
3. The integrated multimedia device of claim 2, wherein the gesture corresponds to an open-hand wave by the occupant in an upward, downward, leftward, or rightward direction.
4. The integrated multimedia device of claim 3, wherein, in response to detection of a termination of movement of the gesture for a set period of time, the instrument cluster is configured to output content received from the display device.
5. The integrated multimedia device of claim 1, wherein:
the user input device comprises a touch pad mounted on a steering wheel of the vehicle; and
the touch pad is configured to detect touch inputs.
6. The integrated multimedia device of claim 2, further comprising:
a controller configured to transmit a control signal to the display device or the instrument cluster in response to reception of an input signal via the user input device or the sensor, the control signal corresponding to the input signal.
7. The integrated multimedia device of claim 1, wherein the display device comprises a touch screen configured to detect touch inputs and display information.
8. The integrated multimedia device of claim 7, wherein, in response to detection of a multi-touch input for a set duration via the touch screen, the control unit is configured to control content displayed via the display device to be displayed via the instrument cluster.
9. The integrated multimedia device of claim 5, wherein the touch pad is configured to receive at least one of a touch input, a multi-touch input, a drag input, a pinch-in input, and a pinch-out input.
10. The integrated multimedia device of claim 1, wherein the instrument cluster is configured to display at least one of a navigation screen, a DMB screen, and a radio screen.
11. The integrated multimedia device of claim 1, wherein:
the vehicle comprises a steering wheel and a driver's seat; and
the steering wheel is disposed between the instrument cluster and the driver's seat.
12. A method, comprising:
causing, at least in part, a user input to a system of a vehicle to be detected, the system being configured to provide at least one of a digital multimedia broadcasting (DMB) function and a navigation function;
determining, in response to detection of the user input, an operation corresponding to the user input; and
generating, in accordance with the operation, a control signal configured to affect the display of content via at least one of a display device and an instrument cluster of the system.
13. The method of claim 12, further comprising:
causing, at least in part, the control signal to be transmitted to at least one of the display device and the instrument cluster.
14. The method of claim 12, wherein detection of the user input comprises:
causing, at least in part, a gesture to be detected via a sensor; and
determining that the gesture corresponds to a valid user input to the system.
15. The method of claim 14, wherein the gesture corresponds to an open-hand wave in an upward, downward, leftward, or rightward direction.
16. The method of claim 15, further comprising:
causing, at least in part, a termination of the open-hand wave to be detected via the sensor for a set period of time;
causing, at least in part, content to be transmitted to the instrument cluster via the display device; and
causing, at least in part, the content to be displayed via the instrument cluster.
17. The method of claim 12, wherein the user input is detected via a touch pad mounted on a steering wheel of the vehicle.
18. The method of claim 17, wherein the user input is at least one of a touch input, a multi-touch input, a drag input, a pinch-in input, and a pinch-out input.
19. The method of claim 12, wherein:
the user input is a multi-touch input to a touch screen of the display device for a set duration of time; and
the control signal is configured to cause content displayed via the display device to be displayed via the instrument cluster.
20. The method of claim 12, wherein the instrument cluster is configured to display at least one of a navigation screen, a DMB screen, and a radio screen.
US14/522,242 2013-11-29 2014-10-23 Integrated multimedia device for vehicle Abandoned US20150153936A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130146839A KR20150062317A (en) 2013-11-29 2013-11-29 Multimedia apparatus of an autombile
KR10-2013-0146839 2013-11-29

Publications (1)

Publication Number Publication Date
US20150153936A1 (en)

Family ID=53058591

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/522,242 Abandoned US20150153936A1 (en) 2013-11-29 2014-10-23 Integrated multimedia device for vehicle

Country Status (4)

Country Link
US (1) US20150153936A1 (en)
KR (1) KR20150062317A (en)
CN (1) CN104679404A (en)
DE (1) DE102014115376A1 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6543185B2 (en) * 2015-12-22 2019-07-10 クラリオン株式会社 In-vehicle device
KR20170109283A (en) * 2016-03-21 2017-09-29 현대자동차주식회사 Vehicle and method for controlling vehicle
US11400997B2 (en) 2016-05-23 2022-08-02 Indian Motorcycle International, LLC Display systems and methods for a recreational vehicle
KR101876739B1 (en) * 2016-12-16 2018-07-10 현대자동차주식회사 In-vehicle command input system and method of controlling thereof
CN107817002A (en) * 2017-10-24 2018-03-20 重庆长安汽车股份有限公司 Automobile middle control entertainment systems carry out navigate interactive display device and method with instrument cubicle
KR101977092B1 (en) 2017-12-11 2019-08-28 엘지전자 주식회사 Vehicle control device mounted on vehicle and method for controlling the vehicle
CN110525212B (en) * 2018-05-24 2021-04-13 广州小鹏汽车科技有限公司 Vehicle central control large screen control method, device and system
CN109186626A (en) * 2018-08-13 2019-01-11 上海擎感智能科技有限公司 Language and characters display methods, system, storage medium and vehicle device
WO2020111308A1 (en) * 2018-11-28 2020-06-04 전자부품연구원 Intuitive interaction method and system for augmented reality display for vehicle
CN109631936A (en) * 2019-01-15 2019-04-16 重庆德科电子仪表有限公司 A kind of automobile instrument display vehicle device navigation optimization method
US11383731B2 (en) 2019-06-04 2022-07-12 Lg Electronics Inc. Image output device
KR102246498B1 (en) * 2019-08-22 2021-05-03 주식회사 이에스피 Cockpit module for autonomous vehicle and method for controlling display module position
CN111469859A (en) * 2020-03-27 2020-07-31 一汽奔腾轿车有限公司 Automobile gesture interaction system
CN112158141A (en) * 2020-09-27 2021-01-01 深圳口袋码农科技有限公司 Control method and device for vehicle-mounted display equipment
CN113147598B (en) * 2021-05-06 2022-03-25 黑龙江天有为电子有限责任公司 Cabin system applied to vehicle, control method of cabin system and vehicle
KR102631094B1 (en) * 2022-03-25 2024-01-30 케이지모빌리티 주식회사 Vehicle information display device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100575906B1 (en) * 2002-10-25 2006-05-02 미츠비시 후소 트럭 앤드 버스 코포레이션 Hand pattern switching apparatus

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7683771B1 (en) * 2007-03-26 2010-03-23 Barry Loeb Configurable control panel and/or dashboard display
US20080293469A1 (en) * 2007-05-23 2008-11-27 Nintendo Co., Ltd. Storage medium storing game program and game apparatus
US20090027337A1 (en) * 2007-07-27 2009-01-29 Gesturetek, Inc. Enhanced camera-based input
US8406961B2 (en) * 2009-04-16 2013-03-26 Panasonic Corporation Reconfigurable vehicle user interface system
US8593398B2 (en) * 2010-06-25 2013-11-26 Nokia Corporation Apparatus and method for proximity based input
US20120013548A1 (en) * 2010-07-19 2012-01-19 Honda Motor Co., Ltd. Human-Machine Interface System
US20140062858A1 (en) * 2012-08-29 2014-03-06 Alpine Electronics, Inc. Information system

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160264070A1 (en) * 2015-03-13 2016-09-15 Yazaki Corporation Vehicle operation system
US20170212633A1 (en) * 2016-01-26 2017-07-27 Samsung Electronics Co., Ltd. Automotive control system and method for operating the same
US10086702B2 (en) * 2016-09-21 2018-10-02 Lg Electronics Inc. Dashboard display indicating speed and vehicle having the same
US20180129467A1 (en) * 2016-11-09 2018-05-10 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicle multimedia display system
US10402147B2 (en) * 2016-11-09 2019-09-03 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicle multimedia display system
EP3378692A1 (en) * 2017-03-22 2018-09-26 Kwang Yang Motor Co., Ltd. Vehicle dashboard structure
US20190359059A1 (en) * 2018-05-22 2019-11-28 Kubota Corporation Work vehicle having display unit for displaying fuel consumption amount
US11065947B2 (en) * 2018-08-23 2021-07-20 Hyundai Motor Company Apparatus and method for tracking location of sunroof blind
US10940760B2 (en) * 2018-10-16 2021-03-09 Hyundai Motor Company Device for controlling vehicle display device, system having the same, and method for controlling vehicle display device
USD941337S1 (en) 2019-02-08 2022-01-18 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD941852S1 (en) * 2019-02-08 2022-01-25 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD941338S1 (en) 2019-02-08 2022-01-18 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD941323S1 (en) * 2019-02-08 2022-01-18 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD941340S1 (en) 2019-02-08 2022-01-18 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD941321S1 (en) * 2019-02-08 2022-01-18 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD941339S1 (en) 2019-02-08 2022-01-18 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD941322S1 (en) * 2019-02-08 2022-01-18 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD941866S1 (en) 2019-02-08 2022-01-25 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD942482S1 (en) * 2019-08-06 2022-02-01 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD944276S1 (en) 2019-08-06 2022-02-22 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD944278S1 (en) 2019-08-06 2022-02-22 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
USD944277S1 (en) 2019-08-06 2022-02-22 Nissan Motor Co., Ltd. Display screen or portion thereof with graphical user interface
JP2021102357A (en) * 2019-12-24 2021-07-15 トヨタ自動車株式会社 On-vehicle equipment operation device
US20220252219A1 (en) * 2021-02-09 2022-08-11 Hyundai Mobis Co., Ltd. Vehicle control apparatus and method using swivel operation of smart device
US11941317B2 (en) 2021-07-26 2024-03-26 Faurecia Clarion Electronics Co., Ltd. Display controlling method
US11919463B1 (en) * 2022-10-21 2024-03-05 In Motion Mobility LLC Comprehensive user control system for vehicle

Also Published As

Publication number Publication date
CN104679404A (en) 2015-06-03
KR20150062317A (en) 2015-06-08
DE102014115376A1 (en) 2015-06-03

Similar Documents

Publication Publication Date Title
US20150153936A1 (en) Integrated multimedia device for vehicle
CN108349423B (en) User interface for in-vehicle system
EP3377358B1 (en) Dynamic reconfigurable display knobs
US9718360B2 (en) Vehicle, display device for vehicle, and method for controlling the vehicle display device
KR101575648B1 (en) User interface apparatus, Vehicle having the same and method for controlling the same
US20140062872A1 (en) Input device
JP2007106353A (en) Vehicular information display device, and vehicular information display system
JP2012006551A (en) In-vehicle image display apparatus
JP2008260519A (en) Operation unit and operating method
US9802484B2 (en) Method and display device for transitioning display information
WO2016084360A1 (en) Display control device for vehicle
WO2015141147A1 (en) Notification control device for vehicle and notification control system for vehicle
EP2703226A1 (en) Vehicle-mounted apparatus control device and program
JP2009090690A (en) Touch panel device
WO2017111075A1 (en) On-board device, display area dividing method, program, and information control device
WO2014129197A1 (en) Display control device and display control program
JP4760245B2 (en) Vehicle information display device
JP2015080994A (en) Vehicular information-processing device
US8626387B1 (en) Displaying information of interest based on occupant movement
JP5902936B2 (en) In-vehicle device operation system
JP2007076383A (en) Information display unit for vehicle
JP2007076382A (en) Information display device for vehicle
US10052955B2 (en) Method for providing an operating device in a vehicle and operating device
JP2012027538A (en) Electronic apparatus
US20180022217A1 (en) Method for driving an operating arrangement for a motor vehicle with provision of two operating modes for an operating element, operating arrangement and motor vehicle

Legal Events

Date Code Title Description
AS Assignment

Owner name: HYUNDAI MOBIS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIM, MI JUNG;OH, DONG A;REEL/FRAME:034031/0992

Effective date: 20141024

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION