US20150081167A1 - Interactive vehicle window display system with vehicle function control - Google Patents

Interactive vehicle window display system with vehicle function control

Info

Publication number
US20150081167A1
US20150081167A1 (application US14/180,563)
Authority
US
United States
Prior art keywords
user
vehicle
recited
subsystem
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/180,563
Inventor
James T. Pisz
Jason A. Schulz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toyota Motor Sales USA Inc
Original Assignee
Toyota Motor Sales USA Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US14/180,563 priority Critical patent/US20150081167A1/en
Application filed by Toyota Motor Sales USA Inc filed Critical Toyota Motor Sales USA Inc
Assigned to TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC. reassignment TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PISZ, JAMES T., SCHULZ, JASON A.
Assigned to TOYOTA MOTOR SALES, U.S.A., INC. reassignment TOYOTA MOTOR SALES, U.S.A., INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
Priority to US14/333,638 priority patent/US20150077561A1/en
Priority to US14/447,465 priority patent/US20150081133A1/en
Priority to US14/461,427 priority patent/US9902266B2/en
Priority to US14/461,422 priority patent/US9400564B2/en
Priority to US14/469,041 priority patent/US9760698B2/en
Priority to KR1020207006527A priority patent/KR102227424B1/en
Priority to KR1020167009954A priority patent/KR20160057458A/en
Priority to EP14780691.3A priority patent/EP3047236B1/en
Priority to JP2016542865A priority patent/JP6457535B2/en
Priority to CN201480050903.9A priority patent/CN105556246B/en
Priority to PCT/US2014/055752 priority patent/WO2015042005A1/en
Priority to US14/639,695 priority patent/US9807196B2/en
Publication of US20150081167A1 publication Critical patent/US20150081167A1/en
Abandoned legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60HARRANGEMENTS OF HEATING, COOLING, VENTILATING OR OTHER AIR-TREATING DEVICES SPECIALLY ADAPTED FOR PASSENGER OR GOODS SPACES OF VEHICLES
    • B60H1/00Heating, cooling or ventilating [HVAC] devices
    • B60H1/00642Control systems or circuits; Control members or indication devices for heating, cooling or ventilating devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/10
    • B60K35/22
    • B60K35/23
    • B60K35/60
    • B60K35/65
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K37/00Dashboards
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/2661Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic mounted on parts having other functions
    • B60Q1/268Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic mounted on parts having other functions on windscreens or windows
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/503Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking using luminous text or symbol displays in or on the vehicle, e.g. static text
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/503Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking using luminous text or symbol displays in or on the vehicle, e.g. static text
    • B60Q1/5035Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking using luminous text or symbol displays in or on the vehicle, e.g. static text electronic displays
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/543Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking for indicating other states or conditions of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/545Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking for indicating other traffic conditions, e.g. fog, heavy traffic
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/25Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the sides of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/10Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/365Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3688Systems comprising multiple parts or multiple output devices (not client-server), e.g. detachable faceplates, key fobs or multiple output screens
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3697Output of additional, non-guidance related information, e.g. low fuel level
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/20Individual registration on entry or exit involving the use of a pass
    • G07C9/22Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • B60K2360/334
    • B60K2360/741
    • B60K2360/785
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/043Identity of occupants
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • B60W2556/50External transmission of data to or from the vehicle for navigation systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • the present disclosure relates to a vehicle and more particularly to systems and methods therefor.
  • Vehicles often include various systems such as infotainment and navigation systems. These systems are generally provided with a display around which mechanical control elements are arranged to provide a user interface mounted in a dashboard of a vehicle cabin for driver and front passenger access. Alternatively, the display combines at least some of the control elements into a touch panel display.
  • a vehicle head unit is a hardware interface located in the vehicle dashboard that enables user control of vehicle systems including, but not limited to, the vehicle's entertainment media such as AM/FM radio, satellite radio, CDs, MP3s, and video; navigation systems such as GPS navigation; climate controls; communication systems such as cellular phones, text, and email; and vehicle control functions such as lights, door chimes, speed control and others.
  • vehicle head unit refers to such a hardware interface, or to any control module configured to control vehicular systems.
  • dashboard mount location may limit display size and restrict the functionality thereof.
  • a system for a vehicle includes an interactive display subsystem which can generate output for display on a vehicle window and a control subsystem which can enable a user to control at least one vehicle function through the interactive display subsystem.
  • a method of controlling a vehicle function in a non-limiting embodiment of the present disclosure includes receiving instructions for controlling the vehicle function through an interactive display subsystem operable to generate output for display on a vehicle window and executing the instructions using a control subsystem.
  • a system for a vehicle includes an interactive display subsystem.
  • the interactive display subsystem includes a projection display and is operable to enable a vehicle rear-seat passenger to control at least one vehicle function.
  • FIG. 1 is a pictorial representation of an example vehicle for use with an interactive vehicle window display system
  • FIG. 2 is a schematic block diagram of the interactive vehicle window display system according to one non-limiting embodiment
  • FIG. 3 is a partial interior view of the vehicle with the interactive vehicle window display system
  • FIG. 4 is a top view of the vehicle illustrating an exterior user identification subsystem of the interactive vehicle window display system
  • FIG. 5 is a pictorial representation of the vehicle illustrating user identification via a skeletal joint relationship, key fob and/or user gesture
  • FIG. 6 is a schematic block diagram of an algorithm for operation of the system according to one non-limiting embodiment
  • FIG. 7 is a pictorial representation of an example skeletal joint relationship recognizable by the system.
  • FIG. 8 is an illustration of an example user gesture recognizable by the system
  • FIG. 9 is an example landing page displayed by the interactive vehicle window display system
  • FIG. 10 is an example route page displayed by the interactive vehicle window display system
  • FIG. 11 is an example calendar page displayed by the interactive vehicle window display system
  • FIG. 12 is an example weather page displayed by the interactive vehicle window display system
  • FIG. 13 is an example vehicle status page displayed by the interactive vehicle window display system
  • FIG. 14 is an example to-do page displayed by the interactive vehicle window display system
  • FIG. 15 is a partial interior view of a vehicle cabin illustrating an interactive environment for the driver and/or passengers to utilize functionalities of a vehicle head unit;
  • FIG. 16 is a partial interior view of the vehicle cabin illustrating discrimination of a driver and/or passenger to selectively permit utilization of functionalities of a vehicle head unit during vehicle operation;
  • FIG. 17 is a pictorial representation of a vehicle passenger facial map for use with the system to track occupant location.
  • FIG. 18 is an overhead interior view of the vehicle illustrating a sensor arrangement to track occupant location within the vehicle cabin.
  • FIG. 1 schematically illustrates a vehicle 20 with a window 22 and an interactive vehicle window display system 30 .
  • although the window 22 is here shown as a driver's side passenger window of a minivan-type vehicle in the disclosed, non-limiting embodiment, it should be appreciated that various vehicle types and windows will also benefit herefrom.
  • the system 30 generally includes an interactive display subsystem 32 , a control subsystem 34 , a user input subsystem 36 , a user identification subsystem 38 , and a user location subsystem 39 . It should be appreciated that although particular subsystems are separately defined, each or any of the subsystems may be combined or segregated via hardware and/or software of the system 30 . Additionally, each or any of the subsystems can be implemented using one or more computing devices including conventional central processing units or other devices capable of manipulating or processing information.
  • the interactive display subsystem 32 can include any device or devices capable of displaying images on a vehicle window 22 under the control of system 30 , and can be adapted for viewing from outside the vehicle, inside the vehicle, or both.
  • the interactive display subsystem 32 can include a display device integral to the window 22 , such as an LCD.
  • Such a display can be illuminated by ambient light or by one or more light sources under the control of system 30 .
  • Such light sources can be mounted at any operable locations enabling light emission onto a window from inside or outside the vehicle, depending on whether the display is to be viewed by a user located outside or inside the vehicle. Examples of such mounting locations can include in the floor, in the vehicle headliner, within the vehicle door structure, or in the exterior door panel.
  • the interactive display subsystem 32 can include a coating 40 and a projector 42 .
  • the coating 40 may be a polymer dispersed liquid crystal (PDLC) film, applied to the window 22 to provide both transparency when inactive and partial or complete opacity when active.
  • the window 22 treated with the coating 40 is thereby operable to display content as a projection page visible from outside and/or inside the vehicle 20 ( FIG. 1 ).
  • the projector 42 can be mounted in the floor ( FIG. 3 ) or other locations within the vehicle 20 , such as the vehicle headliner or within the vehicle door structure as well as in locations on the vehicle exterior such as in an exterior door panel.
  • the illustrated shaded area extending from the projector 42 toward the window 22 schematically represents the projection of output in the form of content pages provided by the projector 42 .
  • the coating 40 changes from transparent to opaque so that the projector 42 may project the output onto the window 22 .
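  • As a minimal illustrative sketch (not the disclosed implementation), this activation sequence can be thought of as two coordinated steps: switch the PDLC coating 40 opaque, then drive the projector 42. The class and method names below are assumptions.

```python
# Minimal sketch of the coating/projector sequence described above.
# PdlcCoating, Projector, and WindowDisplay are illustrative names,
# not part of the patent disclosure.

class PdlcCoating:
    """Polymer dispersed liquid crystal film applied to the window 22."""
    def __init__(self):
        self.opaque = False

    def activate(self):
        # Energizing the film renders it partially or completely opaque
        # so it can serve as a projection surface.
        self.opaque = True

    def deactivate(self):
        # De-energized, the film returns to transparent.
        self.opaque = False


class Projector:
    """Stands in for projector 42 mounted in the floor, headliner, or door."""
    def project(self, page: str) -> None:
        print(f"projecting content page: {page}")


class WindowDisplay:
    """Coordinates coating 40 and projector 42 to display a content page."""
    def __init__(self, coating: PdlcCoating, projector: Projector):
        self.coating = coating
        self.projector = projector

    def show(self, page: str) -> None:
        self.coating.activate()       # window becomes opaque
        self.projector.project(page)  # output projected onto the window

    def dismiss(self) -> None:
        self.coating.deactivate()     # window returns to transparent

# Hypothetical usage:
# WindowDisplay(PdlcCoating(), Projector()).show("landing page 200")
```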
  • the displayed content can include personalized information or entertainment content such as videos, games, maps, navigation, vehicle diagnostics, calendar information, weather information, vehicle climate controls, vehicle entertainment controls, email, internet browsing, or any other interactive applications associated with the recognized user, whether the information originates onboard and/or off board the vehicle 20 .
  • the control subsystem 34 generally includes a control module 50 with a processor 52 , a memory 54 , and an interface 56 .
  • the processor 52 may be any type of microprocessor having desired performance characteristics.
  • the memory 54 may include any type of computer readable medium which stores the data and control algorithms described herein such as a user support system algorithm 58 .
  • the functions of the algorithm 58 are disclosed in terms of functional block diagrams ( FIG. 6 ) and representative pages ( FIGS. 9-14 ), and it should be understood by those skilled in the art with the benefit of this disclosure that these functions may be enacted in either dedicated hardware circuitry or programmed software routines capable of execution in a microprocessor based electronics control embodiment.
  • control module 50 may be a portion of a central vehicle control, a stand-alone unit, or other system such as a cloud-based system.
  • Other operational software for the processor 52 may also be stored in the memory 54 .
  • the interface 56 facilitates communication with other subsystems such as the interactive display subsystem 32 , the user input subsystem 36 , the user identification subsystem 38 , and the user location subsystem 39 . It should be understood that the interface 56 may also communicate with other onboard vehicle systems and offboard vehicle systems.
  • Onboard systems include, but are not limited to, a vehicle head unit 300 which communicates with vehicle sensors that provide, for example, vehicle tire pressure, fuel level and other vehicle diagnostic information.
  • Offboard vehicle systems can provide information which includes but is not limited to, weather reports, traffic, and other information which may be provided via cloud 70 .
  • the user input subsystem 36 can include one or more input sensors including onboard input sensors 60 , offboard input devices, or both.
  • Onboard input sensors 60 can include one or more motion cameras or other light sensors configured to detect gesture commands, one or more touch sensors configured to detect touch commands, one or more microphones configured to detect voice commands, or other onboard devices configured to detect user input.
  • the user input subsystem can also include offboard input devices such as a key fob 62 and/or a personal electronic device 63 of the user, e.g. a tablet, smart phone, or other mobile device.
  • At least one onboard input sensor 60 or offboard input device can be integrated into, or operate in conjunction with, the interactive display subsystem 32 .
  • the interactive display subsystem 32 includes an LCD display integrated into a window 22 and can operate in conjunction with one or more touch sensors integrated into the window 22 , causing the window to function as a touchscreen.
  • the interactive display subsystem 32 includes a projector 42 and coating 40 on the window 22 and can operate in conjunction with one or more motion detectors configured to detect user gesture commands, causing the window to operate as a gesture-based interactive display.
  • Subsystem combinations involving the interactive display subsystem 32 and the user input subsystem and enabling user interaction with a display on a vehicle window 22 will be referred to herein as an interactive window display.
  • the user identification subsystem 38 includes one or more identification sensors 64 such as a closed-circuit television (CCTV) camera, infrared, thermal or other sensor mounted to the vehicle 20 to provide a desired field of view external to the vehicle 20 as shown in FIG. 4 , internal to the vehicle, or both.
  • One example user identification subsystem 38 can recognize the driver and/or passenger based on image data captured by identification sensors 64 , e.g. a skeletal joint relationship 66 and/or other user form data ( FIG. 5 ), separate from, or along with, wireless devices such as the key fob 62 associated with that particular driver and/or passenger.
  • based at least in part on this identification, the system 30 provides access to interactive interfaces on the interactive display subsystem 32 associated with the particular driver and/or passenger.
  • the system 30 can store user profiles of known users, the user profiles including identification information relevant to individual users.
  • a user profile can contain skeleton joint relationship data or facial recognition data useable by the user identification subsystem 38 to identify or authenticate a user.
  • a user profile can additionally contain personal interest information, such as personal calendar and event information, driving/destination history, web browsing history, entertainment preferences, climate preferences, etc.
  • any or all information contained in a user profile can be stored on or shared with a personal electronic device 63 , remote server, or other cloud 70 based system.
  • Such offboard storage or sharing of user profile data can facilitate utilization of user profile data in other vehicles such as any additional vehicles owned by the user, rental vehicles, etc.
  • Such user profile data can be secured by being accessible through a password protected application running on the cloud 70 based system, by biometric authentication, or by other effective means.
  • a user profile can additionally contain user access information, i.e., data pertaining to whether the user is allowed to control a given vehicle function.
  • the user profile associated with a user can indicate full user access, or function control rights for that user. This can be analogous to the control rights of the administrator of a personal computer.
  • a user profile can alternatively indicate restricted user access.
  • the user profile associated with a child can be set to block the user from accessing certain audio or video controls, the navigation system, altering user profiles, or the like.
  • Registration of various user profiles with the system 30 can be completed in any manner, for example, over the internet or with a direct vehicle interface.
  • User profiles can be based on the identities of individual users known to or registered with the system, or to user categories, such as “unknown user”, or “valet”.
  • a default user category such as “unknown user” or “valet” can be associated with limited, default access, or can be associated with no access, i.e. complete prohibition of access to the system 30 .
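  • As one hedged sketch of how such profiles and default categories might be represented (field names and the example granted functions are assumptions, not part of the disclosure):

```python
# Illustrative user profile records carrying identification data and access
# information, plus default categories for unrecognized persons.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    name: str
    skeletal_data: bytes = b""            # skeleton joint relationship template
    facial_data: bytes = b""              # facial recognition template
    allowed_functions: set = field(default_factory=set)
    full_access: bool = False             # administrator-like control rights

# Default categories associated with limited or no access.
DEFAULT_PROFILES = {
    "unknown user": UserProfile("unknown user"),                       # no access
    "valet": UserProfile("valet", allowed_functions={"door_unlock"}),  # limited default access
}

def can_control(profile: UserProfile, function: str) -> bool:
    """A restricted profile may control only functions explicitly granted to it."""
    return profile.full_access or function in profile.allowed_functions
```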
  • the user location subsystem 39, operable to determine the location of one or more users inside or outside the vehicle, includes one or more location sensors 66 such as a pressure sensor, temperature sensor, or camera deployed inside or outside the vehicle.
  • a device can serve as both an identification sensor 64 and a location sensor 66 .
  • a camera mounted within the vehicle can provide information on a user's specific identity, by means described above, and on the user's location within the vehicle, such as the driver's seat or the front-row passenger's seat.
  • elements of the interactive display subsystem 32 can also operate as location sensors 66 within the user location subsystem 39 .
  • pressure sensors within a smartscreen or motion detectors operating as part of an interactive display can be used to obtain user location information.
  • user access can be based on user location as determined by the user location subsystem 39 .
  • second or third row passengers can be allowed or disallowed access to various vehicle functions such as the navigation system.
  • a user with a user profile that is associated with unlimited access per the access information associated with the user profile can specify such settings.
  • user access can be based on a combination of the user profile as applied by the user identification subsystem 38 , and the user location as detected by the user location subsystem 39 . For example, a user with unlimited access as specified by the applied user profile can nonetheless be blocked from accessing certain vehicle functions when occupying the driver's seat of a moving vehicle.
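  • One possible, purely illustrative expression of that combined check, with hypothetical function and menu item names:

```python
# Even a profile with unlimited access is blocked from certain functions when
# the user occupies the driver's seat of a moving vehicle. All names are
# illustrative assumptions.

DRIVER_LOCKOUT = {"destination_entry", "video_playback"}

def access_granted(profile_allows: bool, function: str,
                   seat: str, vehicle_moving: bool) -> bool:
    # First gate: the user profile, as applied by user identification subsystem 38.
    if not profile_allows:
        return False
    # Second gate: the user location, as detected by user location subsystem 39,
    # combined with vehicle state.
    if seat == "driver" and vehicle_moving and function in DRIVER_LOCKOUT:
        return False
    return True

# e.g. access_granted(True, "destination_entry", seat="driver", vehicle_moving=True) -> False
```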
  • operation of the system 30 generally includes a sleeper mode 100 , a watcher mode 102 and a user mode 104 . It should be appreciated that other modes may additionally or alternatively be provided.
  • if the system 30 is active but has yet to detect a user, the system 30 will be in sleeper mode 100 until awakened by the user identification subsystem 38. After detection but prior to identification by the system 30, the watcher mode 102 may be utilized to interact with authenticated as well as un-authenticated persons. For example, when a person approaches the vehicle 20, the system 30 recognizes the direction from which the person has approached and then activates the interactive display subsystem 32 to display an avatar, eyes or other graphic. The graphic may be directed specifically toward the direction from which the person approaches, e.g., the graphical eyes “look” toward their approach.
  • an audio capability allows the system 30 to respond to commands and initiate interaction from a blind side of the vehicle 20 , i.e., a side without the interactive display subsystem 32 .
  • the watcher mode 102 utilizes the user identification subsystem 38 to discriminate between authenticated and un-authenticated persons.
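  • The mode progression can be pictured as a small state machine, sketched below under assumed event names (person detection and authentication outcomes):

```python
# Sleeper mode 100 -> watcher mode 102 on detection of any person;
# watcher mode 102 -> user mode 104 only after authentication succeeds.
SLEEPER, WATCHER, USER = "sleeper_100", "watcher_102", "user_104"

def next_mode(current: str, person_detected: bool, user_authenticated: bool) -> str:
    if current == SLEEPER and person_detected:
        return WATCHER        # e.g. display avatar/eyes toward the approach direction
    if current == WATCHER and user_authenticated:
        return USER           # personalized content becomes available
    if current in (WATCHER, USER) and not person_detected:
        return SLEEPER        # person has left the vicinity of the vehicle
    return current
```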
  • the user mode 104 allows a user with a known operator and/or passenger user profile in the system 30 to make decisions on approach to the vehicle 20 so that certain vehicle interactions need not await entry into the vehicle 20.
  • the user mode 104 reduces distractions through the reduction of travel-associated decisions from the driver's cognitive, visual and manual workload streams once within the vehicle 20 .
  • the user is presented with an overview of information to include, for example, weather, traffic, calendar events and vehicle health.
  • predictive functions of the system 30 identify likely actions, and offer optimal paths to completion, such as planning an efficient route.
  • a maximum range of content provision by the interactive display subsystem 32 may be associated with a maximum distance at which that content can be effectively interacted with by the user.
  • the maximum range of each content feature is prioritized with respect to legibility range of content displayed by the interactive display subsystem 32 . This range metric facilitates the determination of the order in which content appears in the walkup experience. Access to prioritized content with greater maximum range allows the walkup experience to begin further from the vehicle 20 to provide the user with more overall time to interact with the system 30 .
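  • A hedged sketch of that range-based ordering, with purely illustrative feature names and distances:

```python
# Each content feature carries a maximum range at which it remains legible and
# interactable; the walkup experience presents longer-range features first.
CONTENT_MAX_RANGE_FT = {
    "welcome_graphic": 40,
    "weather_summary": 30,
    "calendar_reminder": 25,
    "route_selection": 15,
}

def walkup_order(user_distance_ft: float) -> list:
    """Content reachable at this distance, ordered farthest-range first."""
    reachable = [(rng, name) for name, rng in CONTENT_MAX_RANGE_FT.items()
                 if user_distance_ft <= rng]
    return [name for rng, name in sorted(reachable, reverse=True)]

# e.g. walkup_order(28) -> ["welcome_graphic", "weather_summary"]
```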
  • the system 30 utilizes a multi-factor authentication for security and authorization.
  • Example multi-factor authentication may include the key fob 62 , skeleton joint relationship recognition ( FIG. 5 ), and/or a gesture password ( FIG. 8 ).
  • the user may be provisionally identified with one of these factors, but at least two factors may be required to fully authenticate the user prior to display of certain content. That is, the user will not be granted access to all the features in user mode 104 until multi-factor authentication is passed and the user is within a predetermined range of the vehicle 20. This authentication process ensures the security of the vehicle and the personal information embedded in the system 30.
  • the first authentication factor is the key fob 62 and the second is the skeleton joint relationship ( FIG. 7 ) of the user. If the user does not have their key fob 62 , the skeleton joint relationship may become the first authentication factor and a gesture password such as a wave or particular arm movement ( FIG. 8 ) becomes the second.
  • the key fob 62 in one disclosed non-limiting embodiment may be encrypted to uniquely identify each user to the system 30 . Additional security protocols such as a rolling time key to ensure that even the encrypted key cannot be intercepted and re-used by unauthorized devices may additionally be utilized.
  • the user will be welcomed and pre-authenticated to allow limited access to selected content in the user mode 104. This will provide the user with enough time to cycle through multiple content features during the walkup experience, yet maintain security with respect to other content features, e.g., a destination.
  • all content features, e.g., a destination entered during the pre-authenticated state, are validated for display once full authentication is complete. If the authentication fails, the user will not be granted access to the vehicle 20 or any sensitive information.
  • the system 30 in this disclosed non-limiting embodiment allows pre-authenticated access at about 30-40 feet and full access at about 15-25 feet from the vehicle.
  • the system 30 is operable to recognize a user by his skeleton joint relationships. Skeleton joint relationships in this disclosed non-limiting embodiment facilitate pre-authentication but not full authentication that grants full access to the vehicle 20 . However, if the user has been pre-authenticated via the key fob 62 , a matching skeleton joint relationship will fully authenticate the user. That is, the user identification subsystem 38 may utilize skeleton joint relationships as the second point of identification.
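  • The two-factor flow and range gating described above might be condensed as follows; the thresholds and names are illustrative, since the disclosure gives only approximate ranges:

```python
# Key fob 62 is the first factor when present; otherwise the skeleton joint
# relationship is promoted to the first factor and a gesture password becomes
# the second. A single matching factor yields pre-authentication only.
def access_level(fob_ok: bool, skeleton_ok: bool, gesture_ok: bool,
                 distance_ft: float) -> str:
    factors = 0
    if fob_ok:
        factors += 1
    if skeleton_ok:
        factors += 1
    if not fob_ok and gesture_ok:
        factors += 1                      # gesture password as second factor
    if factors >= 2 and distance_ft <= 25:
        return "full_access"              # user mode 104, all content validated
    if factors >= 1 and distance_ft <= 40:
        return "pre_authenticated"        # limited walkup content only
    return "no_access"
```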
  • the “landing” or “home” page 200 provides a summary of alerts and important information to the user.
  • the landing page 200 provides the user with a readily reviewable overview of the status of the vehicle and how it may affect his schedule and activities.
  • the content includes time information, vehicle diagnostic information, and personal calendar information.
  • a low fuel warning is provided in addition to a traffic-based route update for use by the vehicle navigation system and a calendar event reminder to “Pick up kids in 20 minutes.”
  • the system 30 will include a fuel station as a stop during route guidance if the destination is a distance greater than the available fuel range.
  • preferred fuel stations or other stops may be predefined in the user profile.
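  • One way the fuel-stop insertion could work, as a sketch only (the station data structure and helper names are assumptions standing in for the navigation system):

```python
# If the distance to the destination exceeds the remaining fuel range, insert
# a fuel station as a stop, preferring stations predefined in the user profile.
def plan_stops(destination_mi: float, fuel_range_mi: float,
               preferred_stations: list, nearby_stations: list) -> list:
    if destination_mi <= fuel_range_mi:
        return []                                  # no stop needed
    in_range = lambda s: s["distance_mi"] <= fuel_range_mi
    candidates = [s for s in preferred_stations if in_range(s)] \
                 or [s for s in nearby_stations if in_range(s)]
    if not candidates:
        return []                                  # nothing reachable; warn the user instead
    # Hand the chosen station to the vehicle navigation system as a waypoint.
    return [min(candidates, key=lambda s: s["distance_mi"])]
```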
  • the landing page 200 further displays a plurality of icons to indicate additional content pages that can be viewed by the authorized user.
  • the landing page 200 itself may be accessed on each content page as an icon such as a vehicle manufacturer mark icon on each content page.
  • the landing page 200 allows the authorized user to understand what vehicle systems or personal user profile items may require further attention and provides access to additional content feature details with regard to these items in the form of navigable icons that lead to additional content pages.
  • the landing page 200 can additionally or alternatively integrate an interactive display, for example, a smart page or video game. Other interactive vehicle display page configurations are also possible.
  • Selection of content is accomplished with, for example, the key fob 62 , user gestures, voice commands, touch inputs, etc.
  • the user utilizes the key fob 62 to cycle through various pages displayed by the interactive display subsystem 32 .
  • the key fob 62 may include a four button directional pad and two auxiliary buttons.
  • hand gestures may be used to “swipe” between pages. It should be appreciated that although particular pages are illustrated in the disclosed non-limiting embodiment, various alternative or additional pages may be provided.
  • a route page 202 defaults to the predicted best route for the user with respect to an explicit or inferred next destination. Any alternate destinations or routes that can be explicit or inferred with confidence from, for example, a user personal electronic device, are presented to permit user selection by scrolling through the options.
  • the suggested route screen is here shown accessed using the folded-map icon, however, other icons may be utilized.
  • a calendar page 204 displays the user's calendar.
  • the view is near-term, and shows only the next 2-3 upcoming appointments. If the event includes location information the user is also given the option to use the event for destination selection.
  • the calendar page 204 provides content with respect to the next appointment highlighted for the user and provides a reminder to “Pick Up kids.”
  • the calendar screen is here shown accessed using a flip calendar icon, however, other icons may be utilized.
  • a weather page 206 leverages information about the route to provide relevant weather information—this may be especially effective when the user is travelling away from home. For example, the system 30 determines whether it is more valuable to present the user with local weather information, destination weather information, or both, depending on the settings selected by the user or the type of weather information available.
  • the weather forecast is chronological.
  • the weather page 206 can be accessed with a sun icon, however, other icons may be utilized.
  • weather conditions can be utilized to generate a reminder for display on the landing screen 200 that, for example, suggests an umbrella be placed in the vehicle if rain is forecasted.
  • a vehicle status page 208 provides the user with a view of impending vehicle maintenance needs that require attention. Notifications can include source details of the notification, severity, and options to resolve the potential issue. For example, given the notification of “Low Fuel,” the system 30 can suggest a route to a nearby fuel station within the range of the vehicle.
  • the vehicle status page 208 is here shown accessed with a vehicle icon, however, other icons may be utilized.
  • a to-do list page 210 presents the authorized user with information from any associated to-do list available on, for example, that user's personal electronic device 63 , remote device, or web service.
  • the recognized user is tasked to “Send Package,” “Submit Taxes,” and “Renew Car Registration,” among other items.
  • the to-do list page 210 can alternatively be integrated into the route selection page if location information is included in a given list item in the personal electronic device to-do list.
  • An example of this integration includes the provision of route details to a dry cleaner if the dry cleaning pickup is on the to-do list and the current route is proximate to the dry cleaner's location.
  • the to-do list page is here shown accessed using a check-mark icon, however, other icons may be utilized.
  • information of this nature can in some variations be stored on or shared with a personal electronic device 63 , remote server, or other cloud 70 based system, facilitating utilization in more than one vehicle. Any such information can be secured by being accessible through a password protected application running on the cloud 70 based system, by biometric authentication, or by other effective means.
  • a first user can be granted partial or complete access to a second user's profile by password sharing, for example.
  • Such sharing of access could enable a first user to write reminders or tasks from a remote location to the user profile of a second user, such as a family member, such that the reminders or tasks written by the first user will be displayed on a window when the second user approaches or enters the vehicle, or any vehicle equipped with system 30 enabled to access the user profile of the second user.
  • user access to various vehicle functions can include direct or remote access to utilize functionalities of a vehicle head unit 300 .
  • a front-seat passenger can be offered more menu selections than the driver, while 2nd and 3rd row passengers can be offered even greater menu selections than the front-seat passenger. In these embodiments, the passengers can take over portions of the driver workload.
  • the vehicle passengers may, for example, interact with the system 30 and thereby the vehicle head unit 300 via an interactive window display or through a personal electronic device such as a smart phone or tablet which communicates therewith, through Bluetooth, RFID or other wireless technology standards to exchange data.
  • the system 30 may permit the formation of personal area networks (PANs) for vehicle passengers to share information.
  • a passenger's personal electronic device may include a mapping app operable to communicate with the vehicle navigation system on the vehicle head unit 300 with no features locked out such that the passenger can search destinations and selectively send to the vehicle navigation system via the vehicle head unit 300 .
  • Interaction of the system 30 with the vehicle head unit 300 also allows the driver and/or passengers to select content for other vehicle passengers and/or the driver.
  • one of the passengers can select a destination to display on the navigation system for the driver while the vehicle is in motion.
  • the driver can select entertainment content for display to child passengers.
  • the passenger can control infotainment or climate control features controlled by the vehicle head unit 300 .
  • the system 30, by utilizing the user location subsystem 39, is operable to track the location or position of the vehicle occupants within the vehicle cabin 400 ( FIG. 18 ) through skeletal position ( FIG. 16 ), facial map data ( FIG. 17 ), pressure sensors, interactive window display input sensors, or others.
  • for a three row vehicle, for example, three distinct areas are tracked: front row, middle row and rear row.
  • at least two sensors 402 per row are required to track a state of each occupant within the vehicle 20 . In some instances, each individual seat in the vehicle 20 can be tracked.
  • the data from all sensors 402 may alternatively or additionally be combined to create one central map ( 2 D or 3 D) for use by the system 30 . It should be appreciated that the sensors 402 may communicate with, or be a portion of, the user identification subsystem 38 , the user location subsystem 39 , or both.
  • the multi-point skeletal joint relationship and facial recognition map data provides a relatively accurate position of each occupant captured on an XYZ axis map that can track, to a desired level of precision, the state of each occupant at a specific snapshot in time.
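  • As an illustrative sketch, the per-sensor readings could be merged into one central occupant map keyed by seat, each entry a time-stamped snapshot of that occupant's state (field names are assumptions):

```python
import time

def build_occupant_map(sensor_readings: list) -> dict:
    """sensor_readings: dicts such as
       {"seat": "front_left", "joints_xyz": [...], "face_id": "user_17"}."""
    occupant_map = {}
    for reading in sensor_readings:
        occupant_map[reading["seat"]] = {
            "joints_xyz": reading.get("joints_xyz"),   # skeletal joint positions
            "face_id": reading.get("face_id"),         # facial recognition match
            "timestamp": time.time(),                  # snapshot in time
        }
    return occupant_map
```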
  • the state of each occupant facilitates further tailored operations for various vehicle functions.
  • the user location subsystem 39 detects and discriminates between a driver's hand and that of a vehicle front row passenger to selectively unlock various head unit functionality such as navigation route selection ( FIG. 16 ).
  • Depending, for example, on which user (driver or passenger) is attempting to access the system 30 and whether the vehicle is in motion, content menu items of the vehicle head unit 300 are selectively displayed. For example, certain content such as route selection may be color coded for passenger-only access, while other content such as zooming and scrolling may always be available regardless of user.
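  • A minimal sketch of that selective display, assuming hypothetical menu item names and a simple driver/passenger distinction:

```python
# Which head unit items are shown depends on who is interacting (driver or
# passenger, per the hand discrimination above) and on whether the vehicle
# is in motion.
ALWAYS_AVAILABLE = {"zoom", "scroll"}
RESTRICTED_WHILE_DRIVING = {"route_selection", "destination_entry"}

def visible_menu_items(requester: str, vehicle_moving: bool) -> set:
    items = set(ALWAYS_AVAILABLE)
    if requester == "passenger" or not vehicle_moving:
        items |= RESTRICTED_WHILE_DRIVING
    return items

# e.g. visible_menu_items("driver", vehicle_moving=True) -> {"zoom", "scroll"}
```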
  • upon approach to the vehicle, the system 30 beneficially recognizes a user with a first and second point of identification to display information for that particular, authorized user. This authentication process ensures the security of the vehicle and the personal information embedded in the system 30, yet permits vehicle interaction prior to user entry into the vehicle cabin. The system 30 also beneficially discriminates passengers from the driver to selectively permit access to personalized content or specific vehicle system interfaces.

Abstract

A system for a vehicle includes an interactive display subsystem operable to generate output for display on a vehicle window, the display visible from inside the vehicle, outside the vehicle, or both. The system also includes a control subsystem operable to enable a user to control at least one vehicle function through the interactive display subsystem. A method of controlling a vehicle function includes receiving instructions for controlling the vehicle function through an interactive display subsystem operable to generate output for display on a vehicle window and executing the instructions using a control subsystem.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present disclosure claims priority to U.S. Provisional Patent Disclosure Ser. No. 61/878,898, filed 17 Sep. 2013.
  • BACKGROUND
  • The present disclosure relates to a vehicle and more particularly to systems and methods therefor.
  • Vehicles often include various systems such as infotainment and navigation systems. These systems are generally provided with a display around which mechanical control elements are arranged to provide a user interface mounted in a dashboard of a vehicle cabin for driver and front passenger access. Alternatively, the display combines at least some of the control elements into a touch panel display.
  • Conventionally, a vehicle head unit is a hardware interface located in the vehicle dashboard that enables user control of vehicle systems including, but not limited to, the vehicle's entertainment media such as AM/FM radio, satellite radio, CDs, MP3s, and video; navigation systems such as GPS navigation; climate controls; communication systems such as cellular phones, text, and email; and vehicle control functions such as lights, door chimes, speed control and others. As used herein, the term vehicle head unit refers to such a hardware interface, or to any control module configured to control vehicular systems.
  • Due to the numerous functions typically available, operation may require a relatively substantial amount of time, e.g. to find the desired mechanical control element or to browse through menus and submenus to access a desired function. Further, the dashboard mount location may limit display size and restrict the functionality thereof.
  • Although effective, such display and control elements necessarily require the user to be within the vehicle to operate the system and thereby increase total travel time in the vehicle. In particular, the multitude of control options presented to a driver can result in significant distraction from the driver's primary task of safely operating the vehicle. Such driver distraction can lengthen travel times and decrease the safety of the driver and any passengers. Various regulatory agencies may also require lock out of the system to prevent driver interaction when the vehicle is in motion.
  • SUMMARY
  • A system for a vehicle according to one disclosed non-limiting embodiment of the present disclosure includes an interactive display subsystem which can generate output for display on a vehicle window and a control subsystem which can enable a user to control at least one vehicle function through the interactive display subsystem.
  • A method of controlling a vehicle function in a non-limiting embodiment of the present disclosure includes receiving instructions for controlling the vehicle function through an interactive display subsystem operable to generate output for display on a vehicle window and executing the instructions using a control subsystem.
  • A system for a vehicle according to another disclosed non-limiting embodiment of the present disclosure includes an interactive display subsystem. The interactive display subsystem includes a projection display and is operable to enable a vehicle rear-seat passenger to control at least one vehicle function.
  • The foregoing features and elements may be combined in various combinations without exclusivity, unless expressly indicated otherwise. These features and elements as well as the operation thereof will become more apparent in light of the following description and the accompanying drawings. It should be understood, however, that the following description and drawings are intended to be exemplary in nature and non-limiting.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Various features will become apparent to those skilled in the art from the following detailed description of the disclosed non-limiting embodiment. The drawings that accompany the detailed description can be briefly described as follows:
  • FIG. 1 is a pictorial representation of an example vehicle for use with an interactive vehicle window display system;
  • FIG. 2 is a schematic block diagram of the interactive vehicle window display system according to one non-limiting embodiment;
  • FIG. 3 is a partial interior view of the vehicle with the interactive vehicle window display system;
  • FIG. 4 is a top view of the vehicle illustrating an exterior user identification subsystem of the interactive vehicle window display system;
  • FIG. 5 is a pictorial representation of the vehicle illustrating user identification via a skeletal joint relationship, key fob and/or user gesture;
  • FIG. 6 is a schematic block diagram of an algorithm for operation of the system according to one non-limiting embodiment;
  • FIG. 7 is a pictorial representation of an example skeletal joint relationship recognizable by the system;
  • FIG. 8 is an illustration of an example user gesture recognizable by the system;
  • FIG. 9 is an example landing page displayed by the interactive vehicle window display system;
  • FIG. 10 is an example route page displayed by the interactive vehicle window display system;
  • FIG. 11 is an example calendar page displayed by the interactive vehicle window display system;
  • FIG. 12 is an example weather page displayed by the interactive vehicle window display system;
  • FIG. 13 is an example vehicle status page displayed by the interactive vehicle window display system;
  • FIG. 14 is an example to-do page displayed by the interactive vehicle window display system;
  • FIG. 15 is a partial interior view of a vehicle cabin illustrating an interactive environment for the driver and/or passengers to utilize functionalities of a vehicle head unit;
  • FIG. 16 is a partial interior view of the vehicle cabin illustrating discrimination of a driver and/or passenger to selectively permit utilization of functionalities of a vehicle head unit during vehicle operation;
  • FIG. 17 is a pictorial representation of a vehicle passenger facial map for use with the system to track occupant location; and
  • FIG. 18 is an overhead interior view of the vehicle illustrating a sensor arrangement to track occupant location within the vehicle cabin.
  • DETAILED DESCRIPTION
  • FIG. 1 schematically illustrates a vehicle 20 with a window 22 and an interactive vehicle window display system 30. Although the window 22 is here shown as a driver's side passenger window of a minivan type vehicle in the disclosed, non-limiting embodiment, it should be appreciated that various vehicle types and windows will also benefit herefrom.
  • With reference to FIG. 2, selected portions of the system 30 are schematically illustrated. The system 30 generally includes an interactive display subsystem 32, a control subsystem 34, a user input subsystem 36, a user identification subsystem 38, and a user location subsystem 39. It should be appreciated that although particular subsystems are separately defined, each or any of the subsystems may be combined or segregated via hardware and/or software of the system 30. Additionally, each or any of the subsystems can be implemented using one or more computing devices including conventional central processing units or other devices capable of manipulating or processing information.
  • The interactive display subsystem 32 can include any device or devices capable of displaying images on a vehicle window 22 under the control of system 30, and can be adapted for viewing from outside the vehicle, inside the vehicle, or both. In one non-limiting example the interactive display subsystem 32 can include a display device integral to the window 22, such as an LCD. Such a display can be illuminated by ambient light or by one or more light sources under the control of system 30. Such light sources can be mounted at any operable locations enabling light emission onto a window from inside or outside the vehicle, depending on whether the display is to be viewed by a user located outside or inside the vehicle. Examples of such mounting locations can include in the floor, in the vehicle headliner, within the vehicle door structure, or in the exterior door panel.
  • In another non-limiting example, the interactive display subsystem 32 can include a coating 40 and a projector 42. The coating 40, for example, may be a polymer dispersed liquid crystal (PDLC) film, applied to the window 22 to provide both transparency when inactive and partial or complete opacity when active. The window 22 treated with the coating 40 is thereby operable to display content as a projection page visible from outside and/or inside the vehicle 20 (FIG. 1). The projector 42 can be mounted in the floor (FIG. 3) or other locations within the vehicle 20, such as the vehicle headliner or within the vehicle door structure as well as in locations on the vehicle exterior such as in an exterior door panel. The illustrated shaded area extending from the projector 42 toward the window 22 schematically represents the projection of output in the form of content pages provided by the projector 42. In response to the approach of a recognized user, the coating 40 changes from transparent to opaque so that the projector 42 may project the output onto the window 22.
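As an illustration only, not drawn from the disclosure itself, the activation behavior described above can be sketched as follows: the coating is switched from transparent to opaque when a recognized user approaches, after which the projector renders a content page onto the window. The class and method names are assumptions made for this sketch.

```python
# Minimal sketch (assumed names) of coordinating coating 40 and projector 42.

class PDLCCoating:
    def __init__(self):
        self.opaque = False

    def set_opaque(self, opaque: bool) -> None:
        self.opaque = opaque  # in hardware this would drive the film's control signal


class WindowProjector:
    def project(self, page: str) -> None:
        print(f"Projecting page: {page}")


class InteractiveDisplaySketch:
    """Illustrative-only coordination of a PDLC coating and a projector."""

    def __init__(self, coating: PDLCCoating, projector: WindowProjector):
        self.coating = coating
        self.projector = projector

    def on_user_approach(self, user_recognized: bool, page: str = "landing") -> None:
        if not user_recognized:
            return                      # window stays transparent for unknown persons
        self.coating.set_opaque(True)   # transparent -> opaque
        self.projector.project(page)    # render the walkup content

    def on_user_departure(self) -> None:
        self.coating.set_opaque(False)  # return the window to transparency


if __name__ == "__main__":
    display = InteractiveDisplaySketch(PDLCCoating(), WindowProjector())
    display.on_user_approach(user_recognized=True)
```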
  • As will be further described, the displayed content can include personalized information or entertainment content such as videos, games, maps, navigation, vehicle diagnostics, calendar information, weather information, vehicle climate controls, vehicle entertainment controls, email, internet browsing, or any other interactive applications associated with the recognized user, whether the information originates onboard and/or off board the vehicle 20.
  • The control subsystem 34 generally includes a control module 50 with a processor 52, a memory 54, and an interface 56. The processor 52 may be any type of microprocessor having desired performance characteristics. The memory 54 may include any type of computer readable medium which stores the data and control algorithms described herein such as a user support system algorithm 58. The functions of the algorithm 58 are disclosed in terms of functional block diagrams (FIG. 6) and representative pages (FIGS. 9-14), and it should be understood by those skilled in the art with the benefit of this disclosure that these functions may be enacted in either dedicated hardware circuitry or programmed software routines capable of execution in a microprocessor based electronics control embodiment.
  • With continued reference to FIG. 2, the control module 50 may be a portion of a central vehicle control, a stand-alone unit, or other system such as a cloud-based system. Other operational software for the processor 52 may also be stored in the memory 54. The interface 56 facilitates communication with other subsystems such as the interactive display subsystem 32, the user input subsystem 36, the user identification subsystem 38, and the user location subsystem 39. It should be understood that the interface 56 may also communicate with other onboard and offboard vehicle systems. Onboard systems include, but are not limited to, a vehicle head unit 300, which communicates with vehicle sensors that provide, for example, tire pressure, fuel level, and other vehicle diagnostic information. Offboard vehicle systems can provide information including, but not limited to, weather reports, traffic conditions, and other information which may be provided via the cloud 70.
  • The user input subsystem 36 can include one or more input sensors including onboard input sensors 60, offboard input devices, or both. Onboard input sensors 60 can include one or more motion cameras or other light sensors configured to detect gesture commands, one or more touch sensors configured to detect touch commands, one or more microphones configured to detect voice commands, or other onboard devices configured to detect user input. The user input subsystem can also include offboard input devices such as a key fob 62 and/or a personal electronic device 63 of the user, e.g. a tablet, smart phone, or other mobile device.
  • In some instances, at least one onboard input sensor 60 or offboard input device can be integrated into, or operate in conjunction with, the interactive display subsystem 32. In one non-limiting example, the interactive display subsystem 32 includes an LCD display integrated into a window 22 and can operate in conjunction with one or more touch sensors integrated into the window 22, causing the window to function as a touchscreen. In another non-limiting example, the interactive display subsystem 32 includes a projector 42 and coating 40 on the window 22 and can operate in conjunction with one or more motion detectors configured to detect user gesture commands, causing the window to operate as a gesture-based interactive display. Subsystem combinations involving the interactive display subsystem 32 and the user input subsystem and enabling user interaction with a display on a vehicle window 22 will be referred to herein as an interactive window display.
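The pairing of the display subsystem with touch or gesture sensors can be pictured with a short sketch. This is a minimal, assumed model, not the patent's API: the GestureEvent type and page names are illustrative, and the sketch only shows how input events might drive page navigation on an interactive window display.

```python
# Hedged sketch of an "interactive window display": a display paired with input
# sensors so that touch or gesture events cycle through content pages.

from dataclasses import dataclass
from typing import List


@dataclass
class GestureEvent:
    kind: str  # e.g. "swipe_left", "swipe_right", "tap" (illustrative kinds)


class InteractiveWindowDisplaySketch:
    def __init__(self, pages: List[str]):
        self.pages = pages
        self.index = 0

    def current_page(self) -> str:
        return self.pages[self.index]

    def handle(self, event: GestureEvent) -> str:
        # Cycle through content pages in response to user input.
        if event.kind == "swipe_left":
            self.index = (self.index + 1) % len(self.pages)
        elif event.kind == "swipe_right":
            self.index = (self.index - 1) % len(self.pages)
        return self.current_page()


display = InteractiveWindowDisplaySketch(["landing", "route", "calendar", "weather"])
assert display.handle(GestureEvent("swipe_left")) == "route"
```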
  • The user identification subsystem 38 includes one or more identification sensors 64 such as a closed-circuit television (CCTV) camera, infrared, thermal or other sensor mounted to the vehicle 20 to provide a desired field of view external to the vehicle 20 as shown in FIG. 4, internal to the vehicle, or both. One example user identification subsystem 38 can recognize the driver and/or passenger based on image data captured by identification sensors 64, e.g. a skeletal joint relationship 66 and/or other user form data (FIG. 5), separate from, or along with, wireless devices such as the key fob 62 associated with that particular driver and/or passenger. Based at least in part on this identification, the system 30 provides access to interactive interfaces on the interactive display subsystem 32 associated with the particular driver and/or passenger.
  • The system 30 can store user profiles of known users, the user profiles including identification information relevant to individual users. For example, a user profile can contain skeleton joint relationship data or facial recognition data useable by the user identification subsystem 38 to identify or authenticate a user. A user profile can additionally contain personal interest information, such as personal calendar and event information, driving/destination history, web browsing history, entertainment preferences, climate preferences, etc. In some variations, any or all information contained in a user profile can be stored on or shared with a personal electronic device 63, remote server, or other cloud 70 based system. Such offboard storage or sharing of user profile data can facilitate utilization of user profile data in other vehicles such as any additional vehicles owned by the user, rental vehicles, etc. Such user profile data can be secured by being accessible through a password protected application running on the cloud 70 based system, by biometric authentication, or by other effective means.
  • In some instances, a user profile can additionally contain user access information: data pertaining to whether the user is allowed to control a given vehicle function. For example, the user profile associated with a user can indicate full user access, or full function control rights, for that user. This can be analogous to the control rights of the administrator of a personal computer. A user profile can alternatively indicate restricted user access. For example, the user profile associated with a child can be set to block the user from accessing certain audio or video controls, the navigation system, the ability to alter user profiles, or the like.
  • Registration of various user profiles with the system 30 can be completed in any manner, for example, over the internet or with a direct vehicle interface. User profiles can be based on the identities of individual users known to or registered with the system, or to user categories, such as “unknown user”, or “valet”. In different variations, a default user category such as “unknown user” or “valet” can be associated with limited, default access, or can be associated with no access, i.e. complete prohibition of access to the system 30.
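A hedged sketch of how user profiles with access information and default categories such as "unknown user" and "valet" might be modeled is shown below; the field names and access levels are assumptions, not the disclosed data schema.

```python
# Illustrative-only model of user profiles: identification data plus access
# information, with default categories mapped to limited or no access.

from dataclasses import dataclass, field
from enum import Enum


class Access(Enum):
    FULL = "full"
    RESTRICTED = "restricted"
    NONE = "none"


@dataclass
class UserProfile:
    name: str
    access: Access
    skeleton_data: bytes = b""                      # data for the identification subsystem
    blocked_functions: set = field(default_factory=set)


DEFAULT_PROFILES = {
    "unknown user": UserProfile("unknown user", Access.NONE),
    "valet": UserProfile("valet", Access.RESTRICTED,
                         blocked_functions={"navigation", "user_profiles"}),
}

# A restricted child profile blocking certain controls, as described above.
child = UserProfile("child", Access.RESTRICTED,
                    blocked_functions={"navigation", "audio_video", "user_profiles"})
```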
  • The user location subsystem 39, operable to determine the location of one or more users inside or outside the vehicle, includes one or more location sensors 66 such as a pressure sensor, temperature sensor, or camera deployed inside or outside the vehicle. In some cases, a device can serve as both an identification sensor 64 and a location sensor 66. For example, a camera mounted within the vehicle can provide information on a user's specific identity, by means described above, and on the user's location within the vehicle, such as the driver's seat or the front-row passenger's seat. In some cases, elements of the interactive display subsystem 32 can also operate as location sensors 66 within the user location subsystem 39. For example, pressure sensors within a smartscreen or motion detectors operating as part of an interactive display can be used to obtain user location information.
  • In some instances, user access can be based on user location as determined by the user location subsystem 39. For example, second or third row passengers can be allowed or disallowed access to various vehicle functions such as the navigation system. Optionally, a user whose profile grants unlimited access can specify such settings. In some instances, user access can be based on a combination of the user profile as applied by the user identification subsystem 38 and the user location as detected by the user location subsystem 39. For example, a user with unlimited access as specified by the applied user profile can nonetheless be blocked from accessing certain vehicle functions when occupying the driver's seat of a moving vehicle.
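One way such a combined identity-plus-location check could be expressed, purely as an assumed sketch, is shown below; the function lockout set and seat labels are illustrative.

```python
def may_control(access_level: str, blocked: set, function: str,
                seat: str, vehicle_moving: bool) -> bool:
    """Illustrative access check combining profile access and occupant location."""
    if access_level == "none" or function in blocked:
        return False
    # Driver lockout while the vehicle is in motion, regardless of profile rights.
    if seat == "driver" and vehicle_moving and function in {"navigation", "internet"}:
        return False
    return True


# Full-access user: allowed as a passenger, blocked in the driver's seat while moving.
assert may_control("full", set(), "navigation", seat="front_passenger", vehicle_moving=True)
assert not may_control("full", set(), "navigation", seat="driver", vehicle_moving=True)
```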
  • With reference to FIG. 6, operation of the system 30 according to one disclosed non-limiting embodiment generally includes a sleeper mode 100, a watcher mode 102 and a user mode 104. It should be appreciated that other modes may additionally or alternatively be provided.
  • If the system 30 is active but has yet to detect a user, the system 30 will remain in the sleeper mode 100 until awakened by the user identification subsystem 38. After detection but prior to identification by the system 30, the watcher mode 102 may be utilized to interact with authenticated as well as un-authenticated persons. For example, when a person approaches the vehicle 20, the system 30 recognizes the direction from which the person has approached and then activates the interactive display subsystem 32 to display an avatar, eyes, or another graphic. The graphic may be directed specifically toward the direction from which the person approaches, e.g., the graphical eyes "look" toward their approach. Alternatively, an audio capability allows the system 30 to respond to commands and initiate interaction from a blind side of the vehicle 20, i.e., a side without the interactive display subsystem 32. The watcher mode 102 utilizes the user identification subsystem 38 to discriminate between authenticated and un-authenticated persons.
  • The user mode 104 allows a user with a known operator and/or passenger user profile in the system 30 to make decisions on approach to the vehicle 20 so that certain vehicle interactions need not await entry into the vehicle 20. The user mode 104 reduces distractions by removing travel-associated decisions from the driver's cognitive, visual, and manual workload streams once within the vehicle 20. In furtherance of this, the user is presented with an overview of information including, for example, weather, traffic, calendar events, and vehicle health. As will be further described, predictive functions of the system 30 identify likely actions and offer optimal paths to completion, such as planning an efficient route.
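The sleeper, watcher, and user modes can be viewed as a small state machine. The following sketch is illustrative only; the transition triggers are simplified assumptions based on the description above.

```python
# Hedged sketch of the sleeper / watcher / user mode progression.

from enum import Enum, auto


class Mode(Enum):
    SLEEPER = auto()   # system active, no person detected
    WATCHER = auto()   # person detected, not yet authenticated
    USER = auto()      # user authenticated; personalized content available


class SystemModesSketch:
    def __init__(self):
        self.mode = Mode.SLEEPER

    def person_detected(self, approach_side: str) -> None:
        if self.mode is Mode.SLEEPER:
            self.mode = Mode.WATCHER
            print(f"Watcher mode: graphic 'looks' toward the {approach_side} side")

    def user_authenticated(self) -> None:
        if self.mode is Mode.WATCHER:
            self.mode = Mode.USER

    def person_left(self) -> None:
        self.mode = Mode.SLEEPER


modes = SystemModesSketch()
modes.person_detected("driver")
modes.user_authenticated()
assert modes.mode is Mode.USER
```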
  • A maximum range of content provision by the interactive display subsystem 32 may be associated with a maximum distance at which that content can be effectively interacted with by the user. In one disclosed non-limiting embodiment, the maximum range of each content feature is prioritized with respect to legibility range of content displayed by the interactive display subsystem 32. This range metric facilitates the determination of the order in which content appears in the walkup experience. Access to prioritized content with greater maximum range allows the walkup experience to begin further from the vehicle 20 to provide the user with more overall time to interact with the system 30.
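A minimal sketch of range-based content prioritization, assuming made-up legibility ranges per content feature, might order the walkup experience as follows.

```python
# Illustrative ordering of walkup content by maximum legibility range, so that
# content usable from farther away appears first. Ranges are invented values.

content_ranges_ft = {
    "welcome_graphic": 50,
    "weather_summary": 40,
    "calendar_alerts": 30,
    "route_details": 20,
}


def walkup_order(user_distance_ft: float) -> list:
    """Return the content currently legible, farthest-range items first."""
    visible = [(rng, name) for name, rng in content_ranges_ft.items()
               if rng >= user_distance_ft]
    return [name for rng, name in sorted(visible, reverse=True)]


print(walkup_order(35))   # ['welcome_graphic', 'weather_summary']
print(walkup_order(15))   # all four items, broadest-range first
```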
  • In one disclosed non-limiting embodiment, the system 30 utilizes multi-factor authentication for security and authorization. Example authentication factors may include the key fob 62, skeleton joint relationship recognition (FIG. 5), and/or a gesture password (FIG. 8). The user may be provisionally identified with one of these factors, but at least two factors may be required to fully authenticate the user prior to display of certain content. That is, the user will not be granted access to all the features in the user mode 104 until a multi-factor authentication is passed and the user is within a predetermined range of the vehicle 20. This authentication process ensures the security of the vehicle and the personal information embedded in the system 30. In one disclosed non-limiting embodiment, the first authentication factor is the key fob 62 and the second is the skeleton joint relationship (FIG. 7) of the user. If the user does not have their key fob 62, the skeleton joint relationship may become the first authentication factor and a gesture password such as a wave or particular arm movement (FIG. 8) becomes the second.
  • The key fob 62 in one disclosed non-limiting embodiment may be encrypted to uniquely identify each user to the system 30. Additional security protocols such as a rolling time key to ensure that even the encrypted key cannot be intercepted and re-used by unauthorized devices may additionally be utilized.
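The disclosure does not specify the encryption scheme, but a rolling time key can be illustrated with a simple HMAC over the current time window, so that a captured token cannot be replayed later. The shared secret, window length, and token format below are assumptions made only for this sketch.

```python
# Simplified, assumption-laden illustration of a rolling time key for a fob.

import hashlib
import hmac
import time

SECRET = b"per-fob-shared-secret"   # provisioned to fob and vehicle (assumed)
WINDOW_S = 30                       # token validity window in seconds, illustrative


def fob_token(fob_id: str, now: float) -> str:
    window = int(now // WINDOW_S)
    msg = f"{fob_id}:{window}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()


def verify(fob_id: str, token: str, now: float) -> bool:
    # Accept the current or immediately previous window to tolerate clock skew.
    for window in (int(now // WINDOW_S), int(now // WINDOW_S) - 1):
        msg = f"{fob_id}:{window}".encode()
        expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
        if hmac.compare_digest(token, expected):
            return True
    return False


now = time.time()
assert verify("fob-42", fob_token("fob-42", now), now)
```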
  • Once the key fob 62 is recognized, the user will be welcomed and pre-authenticated to allow limited access to selected content in the user mode 104. This provides the user with enough time to cycle through multiple content features during the walkup experience, yet maintains security with respect to other content features, e.g., a destination. Once the user has been fully authenticated, all content features, e.g., destination selections made during the pre-authenticated state, are validated for display. If the authentication fails, the user will not be granted access to the vehicle 20 or any sensitive information. The system 30 in this disclosed non-limiting embodiment allows pre-authenticated access at about 30-40 feet and full access at about 15-25 feet from the vehicle.
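The staged access described above, pre-authenticated access at roughly 30-40 feet with one factor and full access within about 15-25 feet once two factors pass, can be sketched as a simple decision function. The thresholds follow the text; the decision logic itself is an assumption.

```python
def access_level(factors_passed: int, distance_ft: float) -> str:
    """Illustrative staged-access decision based on factors passed and distance."""
    if factors_passed >= 2 and distance_ft <= 25:
        return "full"                   # all content features validated for display
    if factors_passed >= 1 and distance_ft <= 40:
        return "pre-authenticated"      # limited, non-sensitive content only
    return "none"


assert access_level(1, 35) == "pre-authenticated"   # key fob recognized on walkup
assert access_level(2, 20) == "full"                # fob plus skeleton joint match
assert access_level(0, 10) == "none"                # authentication failed
```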
  • With respect to FIG. 7, to provide further authentication, the system 30 is operable to recognize a user by his skeleton joint relationships. Skeleton joint relationships in this disclosed non-limiting embodiment facilitate pre-authentication but not full authentication that grants full access to the vehicle 20. However, if the user has been pre-authenticated via the key fob 62, a matching skeleton joint relationship will fully authenticate the user. That is, the user identification subsystem 38 may utilize skeleton joint relationships as the second point of identification.
  • With reference to FIG. 9, once authenticated, the “landing” or “home” page 200 provides a summary of alerts and important information to the user. The landing page 200 provides the user with a readily reviewable overview of the status of the vehicle and how it may affect his schedule and activities. In this example, the content includes time information, vehicle diagnostic information, and personal calendar information. Here shown, a low fuel warning is provided in addition to a traffic-based route update for use by the vehicle navigation system and a calendar event reminder to “Pick up Kids in 20 minutes.” In another example, the system 30 will include a fuel station as a stop during route guidance if the destination is a distance greater than the available fuel range. Notably, preferred fuel stations or other stops may be predefined in the user profile.
  • The landing page 200 further displays a plurality of icons to indicate additional content pages that can be viewed by the authorized user. The landing page 200 itself may be accessed on each content page as an icon such as a vehicle manufacturer mark icon on each content page. The landing page 200 allows the authorized user to understand what vehicle systems or personal user profile items may require further attention and provides access to additional content feature details with regard to these items in the form of navigable icons that lead to additional content pages. The landing page 200 can additionally or alternatively integrate an interactive display, for example, a smart page or video game. Other interactive vehicle display page configurations are also possible.
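A short, assumed sketch of assembling landing-page alerts from vehicle diagnostics and calendar data, including the low-fuel stop behavior mentioned above, could look like this; the data structures are illustrative, not the disclosed implementation.

```python
def build_landing_page(fuel_range_mi: float, trip_distance_mi: float,
                       next_event: str, minutes_until: int) -> list:
    """Illustrative aggregation of alerts for a landing-page summary."""
    alerts = []
    if fuel_range_mi < trip_distance_mi:
        alerts.append("Low fuel: a fuel station stop will be added to the route")
    alerts.append(f"{next_event} in {minutes_until} minutes")
    return alerts


for line in build_landing_page(fuel_range_mi=40, trip_distance_mi=55,
                               next_event="Pick up Kids", minutes_until=20):
    print(line)
```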
  • Selection of content is accomplished with, for example, the key fob 62, user gestures, voice commands, touch inputs, etc. In one example, the user utilizes the key fob 62 to cycle through the various pages displayed by the interactive display subsystem 32; such a key fob 62 may include a four-button directional pad and two auxiliary buttons. Alternatively, hand gestures may be used to "swipe" between pages. It should be appreciated that although particular pages are illustrated in the disclosed non-limiting embodiment, various alternative or additional pages may be provided.
  • With reference to FIG. 10, a route page 202 defaults to the predicted best route for the user with respect to an explicit or inferred next destination. Any alternate destinations or routes that are explicitly provided, or that can be inferred with confidence from, for example, the user's personal electronic device, are presented so the user can scroll through and select among the options. The suggested route screen is here shown accessed using the folded-map icon; however, other icons may be utilized.
  • With reference to FIG. 11, a calendar page 204 displays the user's calendar. In this example, the view is near-term, and shows only the next 2-3 upcoming appointments. If the event includes location information the user is also given the option to use the event for destination selection. Here shown, the calendar page 204 provides content with respect to the next appointment highlighted for the user and provides a reminder to “Pick Up Kids.” The calendar screen is here shown accessed using a flip calendar icon, however, other icons may be utilized.
  • With reference to FIG. 12, a weather page 206 leverages information about the route to provide relevant weather information, which may be especially useful when the user is travelling away from home. For example, the system 30 determines whether it is more valuable to present the user with local weather information, destination weather information, or both, depending on the settings selected by the user or the type of weather information available. Here shown, the weather forecast is chronological. The weather page 206 can be accessed with a sun icon; however, other icons may be utilized. In addition, weather conditions can be utilized to generate a reminder for display on the landing page 200 that, for example, suggests an umbrella be placed in the vehicle if rain is forecast.
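The weather-page selection logic can be illustrated with an assumed decision rule; the actual criteria used by the system are not specified beyond the description above, so the preference handling and reminder text here are assumptions.

```python
def weather_content(away_from_home: bool, local: dict, destination: dict,
                    user_pref: str = "auto") -> dict:
    """Illustrative choice of local vs. destination weather plus a rain reminder."""
    if user_pref == "both" or (user_pref == "auto" and away_from_home):
        selected = {"local": local, "destination": destination}
    else:
        selected = {"local": local}
    reminders = []
    if any(report.get("rain") for report in selected.values()):
        reminders.append("Rain forecast: consider placing an umbrella in the vehicle")
    return {"weather": selected, "reminders": reminders}


page = weather_content(True, {"rain": False, "temp_f": 72},
                             {"rain": True, "temp_f": 60})
print(page["reminders"])
```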
  • With reference to FIG. 13, a vehicle status page 208 provides the user with a view of impending vehicle maintenance needs that require attention. Notifications can include source details of the notification, severity, and options to resolve the potential issue. For example, given the notification of "Low Fuel," the system 30 can suggest a route to a nearby fuel station within the range of the vehicle. The vehicle status page 208 is here shown accessed with a vehicle icon; however, other icons may be utilized.
  • With reference to FIG. 14, a to-do list page 210 presents the authorized user with information from any associated to-do list available on, for example, that user's personal electronic device 63, remote device, or web service. Here shown, the recognized user is tasked to "Send Package," "Submit Taxes," and "Renew Car Registration," among other items. The to-do list page 210 can alternatively be integrated into the route selection page if location information is included in a given list item on the personal electronic device to-do list. An example of this integration is the provision of route details to a dry cleaner if the dry cleaning pickup is on the to-do list and the current route passes near the dry cleaner's location. The to-do list page is here shown accessed using a check-mark icon; however, other icons may be utilized.
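Folding location-tagged to-do items into route guidance can be sketched as a simple proximity test; the straight-line distance approximation, coordinates, and one-mile threshold below are assumptions made only for illustration.

```python
import math


def near_route(route_points, item_location, threshold_mi=1.0) -> bool:
    """Illustrative check: is the item's location within threshold of any route point?"""
    def dist(a, b):
        # Rough conversion from degrees to miles; adequate for a sketch only.
        return math.hypot(a[0] - b[0], a[1] - b[1]) * 69.0
    return any(dist(p, item_location) <= threshold_mi for p in route_points)


route = [(42.33, -83.04), (42.35, -83.06), (42.37, -83.08)]
todo = [{"task": "Dry cleaning pickup", "location": (42.351, -83.061)},
        {"task": "Submit Taxes", "location": None}]

stops = [t["task"] for t in todo
         if t["location"] and near_route(route, t["location"])]
print(stops)   # ['Dry cleaning pickup']
```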
  • As noted above, information of this nature, which can be included in a user profile, can in some variations be stored on or shared with a personal electronic device 63, remote server, or other cloud 70 based system, facilitating utilization in more than one vehicle. Any such information can be secured by being accessible through a password protected application running on the cloud 70 based system, by biometric authentication, or by other effective means. In some such variations, a first user can be granted partial or complete access to a second user's profile by password sharing, for example. Such sharing of access could enable a first user to write reminders or tasks from a remote location to the user profile of a second user, such as a family member, such that the reminders or tasks written by the first user will be displayed on a window when the second user approaches or enters the vehicle, or any vehicle equipped with system 30 enabled to access the user profile of the second user.
  • With reference to FIG. 15, user access to various vehicle functions can include direct or remote access to utilize functionalities of a vehicle head unit 300.
  • With the interactivity between the vehicle head unit 300 and the system 30, and in particular between the vehicle head unit 300 and various interactive window displays, passengers can make selections with regard to vehicle systems that are typically controlled by the driver, and in some cases only when the vehicle is stationary. Allowing only passengers to interact with certain vehicle systems while the vehicle is in motion increases safety by minimizing driver distraction. Passenger interaction can also enable greater functionality for the system 30. For example, a front-seat passenger can be offered more menu selections than the driver, while second and third row passengers can be offered even greater menu selections than the front-seat passenger. In these embodiments, the passengers can take over portions of the driver workload.
  • The vehicle passengers may, for example, interact with the system 30, and thereby the vehicle head unit 300, via an interactive window display or through a personal electronic device such as a smart phone or tablet which communicates therewith through Bluetooth, RFID, or other wireless technology standards to exchange data. Further, the system 30 may permit the formation of personal area networks (PANs) for vehicle passengers to share information. For example, a passenger's personal electronic device may include a mapping app operable to communicate with the vehicle navigation system on the vehicle head unit 300 with no features locked out, such that the passenger can search destinations and selectively send them to the vehicle navigation system via the vehicle head unit 300.
  • Interaction of the system 30 with the vehicle head unit 300 also allows the driver and/or passengers to select content for other vehicle passengers and/or the driver. For example, one of the passengers can select a destination to display on the navigation system for the driver while the vehicle is in motion. In another example, the driver can select entertainment content for display to child passengers. In yet another example, the passenger can control infotainment or climate control features controlled by the vehicle head unit 300.
  • With reference to FIG. 16, and in one non-limiting example of the operation of the user location subsystem 39, to still further increase safety through driver distraction minimization, the system 30 is operable to track the location or position of the vehicle occupants within the vehicle cabin 400 (FIG. 18) through skeletal position (FIG. 16), facial map data (FIG. 17), pressure sensors, interactive window display input sensors, or otherwise. For a three row vehicle, for example, three distinct areas are tracked: front row, middle row, and rear row. Typically, at least two sensors 402 per row are required to track a state of each occupant within the vehicle 20. In some instances, each individual seat in the vehicle 20 can be tracked. The data from all sensors 402 may alternatively or additionally be combined to create one central map (2D or 3D) for use by the system 30. It should be appreciated that the sensors 402 may communicate with, or be a portion of, the user identification subsystem 38, the user location subsystem 39, or both.
  • Given that the vehicle occupants are typically seated and belted, the multi-point skeletal joint relationship and facial recognition map data provide a relatively accurate position of each occupant captured on an XYZ axis map that can track, to a desired level of precision, the state of each occupant at a specific snapshot in time. The state of each occupant facilitates further tailored operations for various vehicle functions. For example, the user location subsystem 39 detects and discriminates a driver's hand from that of a front row passenger to selectively unlock various head unit functionality such as navigation route selection (FIG. 16). Dependent, for example, on which user (driver or passenger) is attempting to access the system 30 and whether the vehicle is in motion, content menu items of the vehicle head unit 300 are selectively displayed. For example, certain content such as route selection may be color coded for passenger-only access, while other content such as zooming and scrolling may always be available regardless of user.
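Discriminating a driver's hand from a passenger's hand and gating head-unit functions while the vehicle is in motion can be sketched as follows; the one-dimensional geometry and the attribution rule are purely illustrative assumptions, not the tracked skeletal or facial-map processing itself.

```python
def touch_owner(touch_x: float, driver_x: float, passenger_x: float) -> str:
    """Attribute a head-unit touch to the nearer tracked occupant (x in meters)."""
    return "driver" if abs(touch_x - driver_x) < abs(touch_x - passenger_x) else "passenger"


def allow_route_selection(owner: str, vehicle_moving: bool) -> bool:
    # Route selection is locked out for the driver while the vehicle is moving.
    return owner == "passenger" or not vehicle_moving


owner = touch_owner(touch_x=0.1, driver_x=-0.4, passenger_x=0.4)
print(owner, allow_route_selection(owner, vehicle_moving=True))   # passenger True
```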
  • Upon approach to the vehicle, the system 30 beneficially recognizes a user with a first and second point of identification to display information for that particular, authorized user. This authentication process ensures the security of the vehicle and the personal information embedded in the system 30 yet permits vehicle interaction prior to user entry into the vehicle cabin. The system 30 also beneficially discriminates passengers from the driver to selectively permit access to personalized content or specific vehicle system interfaces.
  • The use of the terms "a" and "an" and "the" and similar references in the context of description (especially in the context of the following claims) is to be construed to cover both the singular and the plural, unless otherwise indicated herein or specifically contradicted by context. The modifier "about" used in connection with a quantity is inclusive of the stated value and has the meaning dictated by the context (e.g., it includes the degree of error associated with measurement of the particular quantity). All ranges disclosed herein are inclusive of the endpoints. It should be appreciated that relative positional terms such as "forward," "aft," "upper," "lower," "above," "below," and the like are with reference to the normal operational attitude of the vehicle and should not be considered otherwise limiting.
  • Although the different non-limiting embodiments have specific illustrated components, the embodiments of this invention are not limited to those particular combinations. It is possible to use some of the components or features from any of the non-limiting embodiments in combination with features or components from any of the other non-limiting embodiments.
  • It should be appreciated that like reference numerals identify corresponding or similar elements throughout the several drawings. It should also be appreciated that although a particular component arrangement is disclosed in the illustrated embodiment, other arrangements will benefit herefrom.
  • Although particular step sequences are shown, described, and claimed, it should be understood that steps may be performed in any order, separated or combined unless otherwise indicated and will still benefit from the present disclosure.
  • The foregoing description is exemplary rather than defined by the limitations within. Various non-limiting embodiments are disclosed herein; however, one of ordinary skill in the art would recognize that various modifications and variations in light of the above teachings will fall within the scope of the appended claims. It is therefore to be appreciated that within the scope of the appended claims, the disclosure may be practiced other than as specifically described. For that reason the appended claims should be studied to determine true scope and content.

Claims (24)

What is claimed is:
1. A system for a vehicle, comprising:
an interactive display subsystem operable to generate output for display on a vehicle window; and
a control subsystem operable to enable a user to control at least one vehicle function through the interactive display subsystem.
2. The system as recited in claim 1, wherein the vehicle function is a navigation function.
3. The system as recited in claim 1, wherein the vehicle function is an audio/video function.
4. The system as recited in claim 1, wherein the vehicle function is a climate control function.
5. The system as recited in claim 1, wherein the vehicle function is an internet access function.
6. The system as recited in claim 1, wherein the interactive display subsystem comprises motion sensors.
7. The system as recited in claim 1, further comprising a vehicle head unit in communication with the interactive display subsystem.
8. The system as recited in claim 1, further comprising:
a user location subsystem operable to detect a location of the user and in communication with the control subsystem, the control subsystem further operable to control user access to a vehicle head unit based on the location of the user.
9. The system as recited in claim 8, further comprising:
a user identification subsystem operable to identify the user and in communication with the control subsystem, the control subsystem further operable to control user access to a vehicle head unit based on the identity of the user.
10. The system as recited in claim 9, wherein the user identification subsystem includes one or more user profiles, each user profile associated with at least one user.
11. The system as recited in claim 10, wherein each user profile includes user access information.
12. The system as recited in claim 11, wherein the user access information indicates full access or restricted access.
13. The system as recited in claim 10, wherein the user access information is based on both the user location and the identity of the user.
14. A method of controlling a vehicle function, comprising:
receiving instructions for controlling the vehicle function through an interactive display subsystem operable to generate output for display on a vehicle window; and
executing the instructions using a control subsystem.
15. The method as recited in claim 14, further comprising:
determining a user location; and
regulating user access to controlling the vehicle function based on the user location.
16. The method as recited in claim 15, further comprising:
determining a user identity; and
regulating user access to controlling the vehicle function based on the user identity.
17. The method as recited in claim 14, wherein the vehicle function is a navigation function.
18. The method as recited in claim 14, wherein the vehicle function is an audio/video function.
19. The method as recited in claim 14, wherein the vehicle function is a climate control function.
20. The method as recited in claim 14, wherein the vehicle function is an internet access function.
21. The method as recited in claim 14, wherein the interactive display subsystem comprises an interactive touch screen with touch sensors.
22. The method as recited in claim 14, wherein the interactive display subsystem comprises a projection display.
23. The method as recited in claim 14, wherein the interactive display subsystem comprises motion sensors.
24. A system for a vehicle, comprising:
an interactive display subsystem including a projection display, the interactive display subsystem operable to enable a vehicle rear-seat passenger to control at least one vehicle function.
US14/180,563 2013-09-17 2014-02-14 Interactive vehicle window display system with vehicle function control Abandoned US20150081167A1 (en)

Priority Applications (13)

Application Number Priority Date Filing Date Title
US14/180,563 US20150081167A1 (en) 2013-09-17 2014-02-14 Interactive vehicle window display system with vehicle function control
US14/333,638 US20150077561A1 (en) 2013-09-17 2014-07-17 Monitoring a back row seat area of a vehicle
US14/447,465 US20150081133A1 (en) 2013-09-17 2014-07-30 Gesture-based system enabling children to control some vehicle functions in a vehicle
US14/461,427 US9902266B2 (en) 2013-09-17 2014-08-17 Interactive vehicle window display system with personal convenience reminders
US14/461,422 US9400564B2 (en) 2013-09-17 2014-08-17 Interactive vehicle window display system with a safe driving reminder system
US14/469,041 US9760698B2 (en) 2013-09-17 2014-08-26 Integrated wearable article for interactive vehicle control system
PCT/US2014/055752 WO2015042005A1 (en) 2013-09-17 2014-09-16 Interactive vehicle window display system with user identification and vehicle function control
KR1020207006527A KR102227424B1 (en) 2013-09-17 2014-09-16 Interactive vehicle window display system with user identification and vehicle function control
CN201480050903.9A CN105556246B (en) 2013-09-17 2014-09-16 Interactive vehicle window display system with user's identification and vehicle functions control
KR1020167009954A KR20160057458A (en) 2013-09-17 2014-09-16 Interactive vehicle window display system with user identification and vehicle function control
EP14780691.3A EP3047236B1 (en) 2013-09-17 2014-09-16 Interactive vehicle window display system with user identification and vehicle function control
JP2016542865A JP6457535B2 (en) 2013-09-17 2014-09-16 Interactive vehicle window display system with user identification and vehicle function control
US14/639,695 US9807196B2 (en) 2013-09-17 2015-03-05 Automated social network interaction system for a vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361878898P 2013-09-17 2013-09-17
US14/180,563 US20150081167A1 (en) 2013-09-17 2014-02-14 Interactive vehicle window display system with vehicle function control

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/175,862 Continuation-In-Part US9340155B2 (en) 2013-09-17 2014-02-07 Interactive vehicle window display system with user identification

Related Child Applications (4)

Application Number Title Priority Date Filing Date
US14/175,862 Continuation-In-Part US9340155B2 (en) 2013-09-17 2014-02-07 Interactive vehicle window display system with user identification
US14/175,862 Continuation US9340155B2 (en) 2013-09-17 2014-02-07 Interactive vehicle window display system with user identification
US14/447,465 Continuation-In-Part US20150081133A1 (en) 2013-09-17 2014-07-30 Gesture-based system enabling children to control some vehicle functions in a vehicle
US14/469,041 Continuation-In-Part US9760698B2 (en) 2013-09-17 2014-08-26 Integrated wearable article for interactive vehicle control system

Publications (1)

Publication Number Publication Date
US20150081167A1 true US20150081167A1 (en) 2015-03-19

Family

ID=52667494

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/175,862 Active US9340155B2 (en) 2013-09-17 2014-02-07 Interactive vehicle window display system with user identification
US14/180,563 Abandoned US20150081167A1 (en) 2013-09-17 2014-02-14 Interactive vehicle window display system with vehicle function control

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/175,862 Active US9340155B2 (en) 2013-09-17 2014-02-07 Interactive vehicle window display system with user identification

Country Status (6)

Country Link
US (2) US9340155B2 (en)
EP (1) EP3047236B1 (en)
JP (1) JP6457535B2 (en)
KR (2) KR20160057458A (en)
CN (1) CN105556246B (en)
WO (1) WO2015042005A1 (en)

Cited By (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140310594A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc Configuration of haptic feedback and visual preferences in vehicle user interfaces
US20140309868A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc User interface and virtual personality presentation based on user profile
US20150070155A1 (en) * 2012-05-21 2015-03-12 Lawo Informationssysteme Gmbh Passenger information system and method
US20150168721A1 (en) * 2013-12-18 2015-06-18 Fuji Jukogyo Kabushiki Kaisha Onboard image display device for vehicle
US20150268840A1 (en) * 2014-03-20 2015-09-24 Nokia Corporation Determination of a program interaction profile based at least in part on a display region
US20160012654A1 (en) * 2014-07-09 2016-01-14 Toyota Motor Engineering & Manufacturing North America, Inc. Hands Free Access System for a Vehicle Closure
US9340155B2 (en) 2013-09-17 2016-05-17 Toyota Motor Sales, U.S.A., Inc. Interactive vehicle window display system with user identification
US9387824B2 (en) 2013-09-17 2016-07-12 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with user identification and image recording
US9400564B2 (en) 2013-09-17 2016-07-26 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with a safe driving reminder system
US20170160915A1 (en) * 2015-12-03 2017-06-08 Hyundai Motor Company System and method for setting a three-dimensional effect
US9760698B2 (en) 2013-09-17 2017-09-12 Toyota Motor Sales, U.S.A., Inc. Integrated wearable article for interactive vehicle control system
US9807196B2 (en) 2013-09-17 2017-10-31 Toyota Motor Sales, U.S.A. Automated social network interaction system for a vehicle
US20180024733A1 (en) * 2015-01-02 2018-01-25 Volkswagen Ag User interface and method for the hybrid use of a display unit of a transportation means
US9902266B2 (en) 2013-09-17 2018-02-27 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with personal convenience reminders
US9928734B2 (en) 2016-08-02 2018-03-27 Nio Usa, Inc. Vehicle-to-pedestrian communication systems
US9946906B2 (en) 2016-07-07 2018-04-17 Nio Usa, Inc. Vehicle with a soft-touch antenna for communicating sensitive information
US9963106B1 (en) 2016-11-07 2018-05-08 Nio Usa, Inc. Method and system for authentication in autonomous vehicles
US9984572B1 (en) 2017-01-16 2018-05-29 Nio Usa, Inc. Method and system for sharing parking space availability among autonomous vehicles
US10031521B1 (en) 2017-01-16 2018-07-24 Nio Usa, Inc. Method and system for using weather information in operation of autonomous vehicles
CN108340852A (en) * 2016-01-26 2018-07-31 通用汽车环球科技运作有限责任公司 The system and method for guarantee for vehicle ride safety and people and property
US10074223B2 (en) 2017-01-13 2018-09-11 Nio Usa, Inc. Secured vehicle for user use only
US10166995B2 (en) 2016-01-08 2019-01-01 Ford Global Technologies, Llc System and method for feature activation via gesture recognition and voice command
US10234302B2 (en) 2017-06-27 2019-03-19 Nio Usa, Inc. Adaptive route and motion planning based on learned external and internal vehicle environment
US20190095711A1 (en) * 2017-09-26 2019-03-28 Toyota Research Institute, Inc. Systems and methods for generating three dimensional skeleton representations
US10249104B2 (en) 2016-12-06 2019-04-02 Nio Usa, Inc. Lease observation and event recording
US10286915B2 (en) 2017-01-17 2019-05-14 Nio Usa, Inc. Machine learning for personalized driving
US10369974B2 (en) 2017-07-14 2019-08-06 Nio Usa, Inc. Control and coordination of driverless fuel replenishment for autonomous vehicles
US10369966B1 (en) 2018-05-23 2019-08-06 Nio Usa, Inc. Controlling access to a vehicle using wireless access devices
US10410250B2 (en) 2016-11-21 2019-09-10 Nio Usa, Inc. Vehicle autonomy level selection based on user context
US10410064B2 (en) 2016-11-11 2019-09-10 Nio Usa, Inc. System for tracking and identifying vehicles and pedestrians
US10464530B2 (en) 2017-01-17 2019-11-05 Nio Usa, Inc. Voice biometric pre-purchase enrollment for autonomous vehicles
US10471829B2 (en) 2017-01-16 2019-11-12 Nio Usa, Inc. Self-destruct zone and autonomous vehicle navigation
US20190366916A1 (en) * 2018-05-29 2019-12-05 Valeo North America, Inc. Vehicle light with dual projection film
US10606274B2 (en) 2017-10-30 2020-03-31 Nio Usa, Inc. Visual place recognition based self-localization for autonomous vehicles
US10635109B2 (en) 2017-10-17 2020-04-28 Nio Usa, Inc. Vehicle path-planner monitor and controller
US10694357B2 (en) 2016-11-11 2020-06-23 Nio Usa, Inc. Using vehicle sensor data to monitor pedestrian health
US10692126B2 (en) 2015-11-17 2020-06-23 Nio Usa, Inc. Network-based system for selling and servicing cars
US10708547B2 (en) 2016-11-11 2020-07-07 Nio Usa, Inc. Using vehicle sensor data to monitor environmental and geologic conditions
US10710633B2 (en) 2017-07-14 2020-07-14 Nio Usa, Inc. Control of complex parking maneuvers and autonomous fuel replenishment of driverless vehicles
US10717412B2 (en) 2017-11-13 2020-07-21 Nio Usa, Inc. System and method for controlling a vehicle using secondary access methods
WO2020158969A1 (en) * 2019-01-30 2020-08-06 엘지전자 주식회사 Device provided in vehicle and method for controlling same
CN111497736A (en) * 2018-12-28 2020-08-07 株式会社小糸制作所 Marker light system
US10837790B2 (en) 2017-08-01 2020-11-17 Nio Usa, Inc. Productive and accident-free driving modes for a vehicle
US10897469B2 (en) 2017-02-02 2021-01-19 Nio Usa, Inc. System and method for firewalls between vehicle networks
US10935978B2 (en) 2017-10-30 2021-03-02 Nio Usa, Inc. Vehicle self-localization using particle filters and visual odometry
CN113766505A (en) * 2017-10-03 2021-12-07 谷歌有限责任公司 System and method for multi-factor authentication and access control in a vehicle environment
CN114179610A (en) * 2021-12-23 2022-03-15 奇瑞汽车股份有限公司 Interface control method and device for heating, ventilating and adjusting automobile seat
USD948570S1 (en) * 2020-05-15 2022-04-12 Barel Ip, Inc Computing device display screen or portion thereof with an icon
USD961619S1 (en) * 2020-05-15 2022-08-23 Barel Ip, Inc. Computing device display screen or portion thereof with an icon
CN115503783A (en) * 2022-09-23 2022-12-23 中车青岛四方机车车辆股份有限公司 Information interaction system based on transparent display vehicle window
CN116279552A (en) * 2023-05-10 2023-06-23 浙大宁波理工学院 Semi-active interaction method and device for vehicle cabin and vehicle
USD1012944S1 (en) 2021-08-03 2024-01-30 Icon Vehicle Dynamics Llc Display screen with graphical user interface
US11955522B2 (en) 2020-02-13 2024-04-09 Vanguard International Semiconductor Corporation Semiconductor structure and method of forming the same
USD1024092S1 (en) * 2021-08-03 2024-04-23 Icon Vehicle Dynamics Llc Display screen with graphical user interface

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9972150B2 (en) * 2014-07-15 2018-05-15 Huf North America Automotive Parts Mfg. Corp. Method of verifying user intent in activation of a device in a vehicle
KR101673305B1 (en) * 2014-12-11 2016-11-22 현대자동차주식회사 Head unit for providing streaming service between different device and streaming control method the same, and computer-readable medium storing program for executing the same
US10013620B1 (en) * 2015-01-13 2018-07-03 State Farm Mutual Automobile Insurance Company Apparatuses, systems and methods for compressing image data that is representative of a series of digital images
KR101641834B1 (en) * 2015-06-12 2016-07-22 한양대학교 산학협력단 Method and System for Providing Real Time Information using Interactive Window in Vehicle
KR20170015112A (en) * 2015-07-30 2017-02-08 삼성전자주식회사 Autonomous Vehicle and Operation Method thereof
JP6441763B2 (en) * 2015-07-31 2018-12-19 Necプラットフォームズ株式会社 Display device, display control method, and program therefor
WO2017030255A1 (en) 2015-08-18 2017-02-23 Samsung Electronics Co., Ltd. Large format display apparatus and control method thereof
US10055111B2 (en) * 2015-08-28 2018-08-21 Here Global B.V. Method and apparatus for providing notifications on reconfiguration of a user environment
KR101716218B1 (en) * 2015-11-10 2017-03-15 현대자동차주식회사 Vehicle And Control Method Thereof
US9701315B2 (en) 2015-11-13 2017-07-11 At&T Intellectual Property I, L.P. Customized in-vehicle display information
DE102016100064B4 (en) * 2016-01-04 2017-11-23 Volkswagen Aktiengesellschaft Adative display in B-pillar
US20170210285A1 (en) * 2016-01-26 2017-07-27 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Flexible led display for adas application
US9830755B2 (en) 2016-02-17 2017-11-28 Jvis-Usa, Llc System including a hand-held communication device having low and high power settings for remotely controlling the position of a door of a land vehicle and key fob for use in the system
US10284822B2 (en) * 2016-02-17 2019-05-07 Jvis-Usa, Llc System for enhancing the visibility of a ground surface adjacent to a land vehicle
US9707912B1 (en) 2016-03-22 2017-07-18 Ford Global Technologies, Llc Human scanning for interior preferences setup
US9898142B2 (en) * 2016-04-01 2018-02-20 Ford Global Technologies, Llc Touch detection on a curved surface
JP6680604B2 (en) * 2016-04-21 2020-04-15 林テレンプ株式会社 Shade device for vehicle
US9873396B2 (en) 2016-06-16 2018-01-23 Ford Global Technologies, Llc Method and apparatus for vehicle occupant location detection
US10049512B2 (en) * 2016-06-20 2018-08-14 Ford Global Technologies, Llc Vehicle puddle lights for onboard diagnostics projection
EP3546259B1 (en) * 2016-07-06 2020-06-03 Audi AG Method for operating an interactive screening means, a window assembly, and a motor vehicle
US20180018179A1 (en) * 2016-07-12 2018-01-18 Ford Global Technologies, Llc Intelligent pre-boot and setup of vehicle systems
DE102016214273B4 (en) 2016-08-02 2019-10-10 Audi Ag A method of controlling a display device for a vehicle and vehicle with a display device
DE102016215434A1 (en) * 2016-08-18 2018-02-22 Continental Automotive Gmbh Display arrangement for a vehicle and vehicle with such a display arrangement
US10346119B2 (en) * 2016-09-27 2019-07-09 Intel Corporation Trusted vehicle messaging via transparent display
KR20180037828A (en) 2016-10-05 2018-04-13 현대자동차주식회사 Method and apparatus for controlling vehicular user interface under driving circumstance
KR101901799B1 (en) 2016-10-05 2018-11-07 현대자동차주식회사 Method and apparatus for controlling vehicular user interface
KR101795416B1 (en) * 2016-10-07 2017-11-09 현대자동차 주식회사 Door glass welcome system
KR20180051842A (en) * 2016-11-09 2018-05-17 엘지전자 주식회사 Display apparatus and method for controlling thereof
KR102529901B1 (en) * 2016-12-07 2023-05-08 현대자동차주식회사 Apparatus for management personalized payment user in vehicle and method thereof
JP2018096718A (en) * 2016-12-08 2018-06-21 アイシン精機株式会社 Detection sensor
US11494950B2 (en) 2017-06-16 2022-11-08 Honda Motor Co., Ltd. Experience providing system, experience providing method, and experience providing program
US20190143905A1 (en) * 2017-11-15 2019-05-16 Toyota Research Institute, Inc. Image capture with a vehicle object sensor device based on user input
CN108189788B (en) * 2018-01-11 2021-07-06 蔚来(安徽)控股有限公司 Automatic vehicle adjustment system and automatic vehicle adjustment method
JP7005112B2 (en) * 2018-03-05 2022-01-21 矢崎総業株式会社 Display system and in-vehicle system
JP2019164477A (en) * 2018-03-19 2019-09-26 本田技研工業株式会社 Information provision system, information provision method and program
CN109177924A (en) * 2018-09-20 2019-01-11 安徽信息工程学院 Vehicle recognition of face system for unlocking
JP2020050199A (en) * 2018-09-27 2020-04-02 豊田合成株式会社 Vehicular user interface system
KR102634349B1 (en) * 2018-10-11 2024-02-07 현대자동차주식회사 Apparatus and method for controlling display of vehicle
DE102018128904A1 (en) 2018-11-16 2020-05-20 Huf Hülsbeck & Fürst Gmbh & Co. Kg Vehicle with display device
WO2020163801A1 (en) * 2019-02-08 2020-08-13 Warner Bros. Entertainment Inc. Intra-vehicle games
WO2020204213A1 (en) * 2019-03-29 2020-10-08 엘지전자 주식회사 Method for voice interaction and vehicle using same
CN111818110A (en) * 2019-04-10 2020-10-23 上海博泰悦臻电子设备制造有限公司 Shared data recording method, shared terminal, and shared data recording system
US11501495B2 (en) 2019-05-10 2022-11-15 Qualcomm Incorporated Virtual models for communications between autonomous vehicles and external observers
CN110341648A (en) * 2019-07-16 2019-10-18 奇瑞汽车股份有限公司 Car door unlocking method, device and storage medium
JP7415394B2 (en) * 2019-09-24 2024-01-17 スズキ株式会社 vehicle display device
TWI709069B (en) * 2019-10-09 2020-11-01 晨豐光電股份有限公司 Touch sensing window
CN110774864B (en) * 2019-11-26 2021-09-28 盐城吉研智能科技有限公司 In-vehicle temperature adjusting device and method based on big data
US20220410709A1 (en) * 2019-12-02 2022-12-29 Hutchinson Window frame element for a motor vehicle
DE102020100044A1 (en) * 2020-01-03 2021-07-08 Bayerische Motoren Werke Aktiengesellschaft Method and vehicle for the display of information by a vehicle
JP7247901B2 (en) * 2020-01-07 2023-03-29 トヨタ自動車株式会社 MOBILE BODY CONTROL DEVICE, MOBILE BODY CONTROL METHOD, AND PROGRAM
JP2021124597A (en) * 2020-02-05 2021-08-30 株式会社東海理化電機製作所 Display device and display system
CN111703302B (en) * 2020-06-18 2021-07-02 北京航迹科技有限公司 Vehicle window content display method and device, electronic equipment and readable storage medium
CN111703301B (en) * 2020-06-18 2022-03-04 北京航迹科技有限公司 Vehicle window content display method and device, electronic equipment and readable storage medium
CN112550309B (en) * 2020-12-21 2022-05-10 阿波罗智联(北京)科技有限公司 Autonomous vehicle, information display method, information display apparatus, device, and storage medium
CN113190199B (en) * 2021-04-29 2023-07-21 Oppo广东移动通信有限公司 Content display method, device, terminal, vehicle and readable storage medium
ES2948649B2 (en) * 2022-02-17 2024-04-03 Seat Sa Optical signaling system for vehicles
KR102530411B1 (en) * 2022-03-07 2023-05-10 금문산업(주) Lighting Apparatus For Vehicle
DE102022109323A1 (en) * 2022-04-14 2023-10-19 Bayerische Motoren Werke Aktiengesellschaft Method and device for providing a showroom mode
WO2023218799A1 (en) * 2022-05-13 2023-11-16 JVCKenwood Corporation Video display device and video display method

Family Cites Families (131)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3129862A (en) 1959-11-02 1964-04-21 Melvin R Cone Key holding apparel belt
US3713090A (en) 1970-09-03 1973-01-23 C Dickinson System for use in conducting aircraft check lists
US6507779B2 (en) 1995-06-07 2003-01-14 Automotive Technologies International, Inc. Vehicle rear seat monitor
DE8417687U1 (en) 1984-06-09 1985-10-10 Schulte-Schlagbaum Ag, 5620 Velbert key
DK594787A (en) 1986-11-27 1988-05-28 Hoffmann La Roche Lactone degradation product
US4818048A (en) 1987-01-06 1989-04-04 Hughes Aircraft Company Holographic head-up control panel
US4942841A (en) 1989-06-15 1990-07-24 Drucker Jr Jack L Vehicle service reminder display
US5454074A (en) 1991-09-18 1995-09-26 The Boeing Company Electronic checklist system
DE4204821A1 (en) 1992-02-18 1993-08-19 Burkhard Katz Method and device for presenting presentations before passengers of moving vehicles
JPH06197888A (en) 1993-01-06 1994-07-19 Mitsubishi Motors Corp Doze warning device for vehicle
US5638202A (en) 1994-06-10 1997-06-10 Rofe; Michael E. Liquid crystal windshield display
JP3624465B2 (en) 1995-05-26 2005-03-02 株式会社デンソー Head-up display device
US5705977A (en) 1995-07-20 1998-01-06 Jones; James L. Maintenance reminder
US5652564A (en) 1995-07-26 1997-07-29 Winbush; Solomon Lanair Bold thief security system
US5774591A (en) 1995-12-15 1998-06-30 Xerox Corporation Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images
US6148251A (en) 1999-01-12 2000-11-14 Trw Inc. Touchtone electronic steering wheel
US6227862B1 (en) 1999-02-12 2001-05-08 Advanced Drivers Education Products And Training, Inc. Driver training system
KR100335946B1 (en) 1999-07-31 2002-05-09 이계안 Navigation system having a function which stops voice guidance
US7050606B2 (en) 1999-08-10 2006-05-23 Cybernet Systems Corporation Tracking and gesture recognition system particularly suited to vehicular control applications
AU2747001A (en) 1999-11-03 2001-06-12 Ericsson Inc. System and device for assisting flight scheduling by a traveller
FR2800782B1 (en) 1999-11-10 2001-12-21 Valeo Securite Habitacle Motor vehicle equipped with a hands-free access and/or starting system
US8818647B2 (en) 1999-12-15 2014-08-26 American Vehicular Sciences Llc Vehicular heads-up display system
JP2001304896A (en) * 2000-04-25 2001-10-31 Mitsubishi Motors Corp Vehicular navigation device
WO2002001508A1 (en) 2000-06-23 2002-01-03 Automated Car Rental, L.L.C. System and method for the automated release of vehicles from a motor pool
US6393348B1 (en) 2000-07-14 2002-05-21 Douglas K. Ziegler Passenger monitoring vehicle safety seat and monitoring device
DE10037573B4 (en) 2000-08-02 2005-05-19 Robert Bosch Gmbh Navigation method in a motor vehicle
AUPQ968200A0 (en) 2000-08-25 2000-09-21 Robert Bosch Gmbh A security system
US7135961B1 (en) 2000-09-29 2006-11-14 International Business Machines Corporation Method and system for providing directions for driving
US6603405B2 (en) 2000-12-05 2003-08-05 User-Centric Enterprises, Inc. Vehicle-centric weather prediction system and method
US6362734B1 (en) 2001-01-31 2002-03-26 Ford Global Technologies, Inc. Method and apparatus for monitoring seat belt use of rear seat passengers
US6654070B1 (en) 2001-03-23 2003-11-25 Michael Edward Rofe Interactive heads up display (IHUD)
DE60133052T2 (en) 2001-07-30 2009-04-30 Nippon Seiki Co. Ltd., Nagaoka Vehicle display device
US20030076968A1 (en) 2001-10-23 2003-04-24 Rast Rodger H. Method and system of controlling automotive equipment remotely
US20040052418A1 (en) 2002-04-05 2004-03-18 Bruno Delean Method and apparatus for probabilistic image analysis
US7369685B2 (en) 2002-04-05 2008-05-06 Identix Corporation Vision-based operating method and system
US20030204526A1 (en) 2002-04-24 2003-10-30 Saeid Salehi-Had Interlocking smart fob enabling secure access and tracking for electronic devices
US6791462B2 (en) 2002-09-18 2004-09-14 Sang J. Choi Sleepy alarm system activated by heart pulse meter
US6696943B1 (en) 2003-03-18 2004-02-24 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Video monitoring system for car seat
GB2400667B (en) 2003-04-15 2006-05-31 Hewlett Packard Development Co Attention detection
US7398140B2 (en) 2003-05-14 2008-07-08 Wabtec Holding Corporation Operator warning system and method for improving locomotive operator vigilance
US7126853B2 (en) 2003-08-14 2006-10-24 Mosel Vitelic, Inc. Electronic memory having impedance-matched sensing
US7757076B2 (en) 2003-12-08 2010-07-13 Palo Alto Research Center Incorporated Method and apparatus for using a secure credential infrastructure to access vehicle components
US7561966B2 (en) 2003-12-17 2009-07-14 Denso Corporation Vehicle information display system
US7680574B2 (en) 2004-03-04 2010-03-16 Gm Global Technology Operations, Inc. Vehicle information system with steering wheel controller
US7507438B2 (en) 2004-09-03 2009-03-24 Donnelly Corporation Display substrate with diffuser coating
US7053866B1 (en) 2004-12-18 2006-05-30 Emile Mimran Portable adaptor and software for use with a heads-up display unit
US20100311399A1 (en) 2005-03-31 2010-12-09 United Video Properties, Inc. Systems and methods for generating audible reminders on mobile user equipment
JP4353162B2 (en) 2005-09-26 2009-10-28 トヨタ自動車株式会社 Vehicle surrounding information display device
KR20070049338A (en) * 2005-11-08 2007-05-11 김종호 Picture image display system installed in a car
US20090278915A1 (en) 2006-02-08 2009-11-12 Oblong Industries, Inc. Gesture-Based Control System For Vehicle Interfaces
JP2007210457A (en) * 2006-02-09 2007-08-23 Fujitsu Ten Ltd Automatic vehicle setting device and setting method
US7764247B2 (en) 2006-02-17 2010-07-27 Microsoft Corporation Adaptive heads-up user interface for automobiles
US7897888B2 (en) 2006-03-30 2011-03-01 Strattec Security Corporation Key fob device and method
US7976386B2 (en) 2006-06-12 2011-07-12 Tran Bao Q Mesh network game controller with voice transmission, search capability, motion detection, and/or position detection
US8096069B2 (en) 2006-09-06 2012-01-17 The Invention Science Fund I, Llc Repeatably displaceable emanating element display
JP4309909B2 (en) 2006-12-01 2009-08-05 矢崎総業株式会社 Vehicle display device and display position adjustment support method thereof
JP2008143220A (en) * 2006-12-06 2008-06-26 Tokai Rika Co Ltd Individual authentication system
US20080167892A1 (en) 2007-01-10 2008-07-10 Neil Clark System for ride sharing and method therefor
US7692552B2 (en) 2007-01-23 2010-04-06 International Business Machines Corporation Method and system for improving driver safety and situational awareness
KR101331827B1 (en) * 2007-01-31 2013-11-22 최윤정 Display device for car and display method using the same
CN101652789A (en) 2007-02-12 2010-02-17 肖恩·奥沙利文 Share transportation system and service network
JP2008225889A (en) * 2007-03-13 2008-09-25 Pioneer Electronic Corp Information providing device and information providing method
GB2447484B (en) 2007-03-15 2012-01-18 Jaguar Cars Security system for a motor vehicle
GB0705120D0 (en) 2007-03-16 2007-04-25 Pilkington Group Ltd Vehicle glazing
US20080238667A1 (en) 2007-03-30 2008-10-02 Proxwear, Llc Clothing and Accessories that Operate Radio Frequency Identification Enabled Security Devices
JP2008261749A (en) 2007-04-12 2008-10-30 Takata Corp Occupant detection device, actuator control system, seat belt system, and vehicle
US7982620B2 (en) 2007-05-23 2011-07-19 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for reducing boredom while driving
US7983206B2 (en) 2007-09-10 2011-07-19 Robert Bosch Gmbh Integrated system and method for interactive communication and multimedia support in vehicles
US7966109B2 (en) 2007-09-14 2011-06-21 Les Innovations Cd Invenio Inc. Reminder device for eliciting behavioral response in a vehicle
WO2009055573A1 (en) * 2007-10-23 2009-04-30 Viaclix, Inc. Multimedia administration, advertising, content & services system
US8120651B2 (en) 2007-10-31 2012-02-21 Motocam 360, L.L.C. Video capture assembly
DE102007053422A1 (en) 2007-11-09 2009-05-14 Robert Bosch Gmbh computing device
US20090140878A1 (en) 2007-11-30 2009-06-04 Ryan Ii Thomas E Sound customization for operating actions of automobiles
US20090146947A1 (en) 2007-12-07 2009-06-11 James Ng Universal wearable input and authentication device
US8232897B2 (en) 2008-04-16 2012-07-31 Delphi Technologies, Inc. Vehicle locator key fob with range and bearing measurement
US20090290021A1 (en) 2008-05-23 2009-11-26 Rudesill Della J Rear seat passenger surveillance system
US20100045451A1 (en) 2008-08-25 2010-02-25 Neeraj Periwal Speed reduction, alerting, and logging system
JP2010070180A (en) * 2008-09-22 2010-04-02 Asmo Co Ltd Vehicular system equipment driver
US8126450B2 (en) 2008-09-24 2012-02-28 Embarq Holdings Company Llc System and method for key free access to a vehicle
US8516561B2 (en) 2008-09-29 2013-08-20 At&T Intellectual Property I, L.P. Methods and apparatus for determining user authorization from motion of a gesture-based control unit
US8344870B2 (en) 2008-10-07 2013-01-01 Cisco Technology, Inc. Virtual dashboard
US8395529B2 (en) 2009-04-02 2013-03-12 GM Global Technology Operations LLC Traffic infrastructure indicator on head-up display
US8384531B2 (en) 2009-04-02 2013-02-26 GM Global Technology Operations LLC Recommended following distance on full-windshield head-up display
US8317329B2 (en) 2009-04-02 2012-11-27 GM Global Technology Operations LLC Infotainment display on full-windshield head-up display
JP5198346B2 (en) 2009-04-23 2013-05-15 本田技研工業株式会社 Vehicle periphery monitoring device
JP2010257249A (en) * 2009-04-24 2010-11-11 Autonetworks Technologies Ltd On-vehicle security device
GB0908444D0 (en) 2009-05-16 2009-06-24 Quintal Line Lost object locator system
US8577543B2 (en) 2009-05-28 2013-11-05 Intelligent Mechatronic Systems Inc. Communication system with personal information management and remote vehicle monitoring and control features
US8942888B2 (en) 2009-10-15 2015-01-27 Airbiquity Inc. Extensible scheme for operating vehicle head unit as extended interface for mobile device
US9002574B2 (en) 2009-10-15 2015-04-07 Airbiquity Inc. Mobile integration platform (MIP) integrated handset application proxy (HAP)
US8659658B2 (en) * 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
US8930439B2 (en) 2010-04-30 2015-01-06 Nokia Corporation Method and apparatus for providing cooperative user interface layer management with respect to inter-device communications
US8463488B1 (en) 2010-06-24 2013-06-11 Paul Hart Vehicle profile control and monitoring
US8862299B2 (en) 2011-11-16 2014-10-14 Flextronics Ap, Llc Branding of electrically propelled vehicles via the generation of specific operating output
EP2441635B1 (en) 2010-10-06 2015-01-21 Harman Becker Automotive Systems GmbH Vehicle User Interface System
US8606430B2 (en) * 2010-10-08 2013-12-10 GM Global Technology Operations LLC External presentation of information on full glass display
US8560013B2 (en) 2010-12-14 2013-10-15 Toyota Motor Engineering & Manufacturing North America, Inc. Automatic status update for social networking
JP2012126251A (en) 2010-12-15 2012-07-05 Toyota Motor Corp Light source device for vehicle
US8633979B2 (en) 2010-12-29 2014-01-21 GM Global Technology Operations LLC Augmented road scene illustrator system on full windshield head-up display
DE102011003976B3 (en) 2011-02-11 2012-04-26 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Sound input device for use in e.g. music instrument input interface in electric guitar, has classifier interrupting output of sound signal over sound signal output during presence of condition for period of sound signal passages
US9171273B2 (en) 2011-02-16 2015-10-27 The Boeing Company Integrated electronic checklist display system
KR20130115368A (en) * 2011-02-17 2013-10-21 Volkswagen Aktiengesellschaft Operating device in a vehicle
US8947203B2 (en) 2011-03-07 2015-02-03 John Clinton Kolar Aftermarket sound activated wireless vehicle door unlocker
US20120249291A1 (en) * 2011-03-29 2012-10-04 Denso Corporation Systems and methods for vehicle passive entry
US20120265814A1 (en) 2011-04-14 2012-10-18 Stilianos George Roussis Software Application for Managing Personal Matters and Personal Interactions through a Personal Network
EP2511750A1 (en) * 2011-04-15 2012-10-17 Volvo Car Corporation Vehicular information display system
DE102011018555A1 (en) 2011-04-26 2012-10-31 Continental Automotive Gmbh Interface for data transmission in a motor vehicle and computer program product
WO2012159083A2 (en) 2011-05-18 2012-11-22 Triangle Software Llc System for providing traffic data and driving efficiency data
US20130030645A1 (en) 2011-07-28 2013-01-31 Panasonic Corporation Auto-control of vehicle infotainment system based on extracted characteristics of car occupants
CN102914317A (en) * 2011-08-05 2013-02-06 Shanghai Pateo Yuezhen Electronic Equipment Manufacturing Co., Ltd. Navigation system and method employing front window projection of car
GB2494398B (en) 2011-09-05 2015-03-11 Jaguar Land Rover Ltd Security system and device therefor
US20130063336A1 (en) 2011-09-08 2013-03-14 Honda Motor Co., Ltd. Vehicle user interface system
US8941690B2 (en) * 2011-12-09 2015-01-27 GM Global Technology Operations LLC Projected rear passenger entertainment system
JP5832667B2 (en) 2011-12-29 2015-12-16 インテル・コーポレーション Reconfigurable personalized vehicle display
DE102012203535A1 (en) 2012-03-06 2013-09-12 Bayerische Motoren Werke Aktiengesellschaft Keyless car key with gesture recognition
US8942881B2 (en) 2012-04-02 2015-01-27 Google Inc. Gesture-based automotive controls
US8552847B1 (en) 2012-05-01 2013-10-08 Racing Incident Pty Ltd. Tactile based performance enhancement system
US20140007618A1 (en) 2012-07-05 2014-01-09 Charles H. Brown, III Integrated Handcuff Key Bracelet
US20140068713A1 (en) 2012-08-31 2014-03-06 Tweddle Group, Inc. Systems, methods and articles for providing communications and services involving automobile head units and user preferences
US9091715B2 (en) 2013-02-25 2015-07-28 Google Technology Holdings LLC Wearable device with capacitive sensor and method of operation therefor
US9639508B2 (en) 2013-06-14 2017-05-02 Worldmate, Ltd. Systems and methods for providing a contextual user interface element
CN103273885A (en) 2013-06-14 2013-09-04 Suzhou Xuyusheng Electronics Co., Ltd. Automotive touch display system
US20150081133A1 (en) 2013-09-17 2015-03-19 Toyota Motor Sales, U.S.A., Inc. Gesture-based system enabling children to control some vehicle functions in a vehicle
US9902266B2 (en) 2013-09-17 2018-02-27 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with personal convenience reminders
US20150077561A1 (en) 2013-09-17 2015-03-19 Toyota Motor Sales, U.S.A., Inc. Monitoring a back row seat area of a vehicle
US9387824B2 (en) 2013-09-17 2016-07-12 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with user identification and image recording
US9807196B2 (en) 2013-09-17 2017-10-31 Toyota Motor Sales, U.S.A. Automated social network interaction system for a vehicle
US9400564B2 (en) 2013-09-17 2016-07-26 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with a safe driving reminder system
US9760698B2 (en) 2013-09-17 2017-09-12 Toyota Motor Sales, U.S.A., Inc. Integrated wearable article for interactive vehicle control system
US9340155B2 (en) 2013-09-17 2016-05-17 Toyota Motor Sales, U.S.A., Inc. Interactive vehicle window display system with user identification
US20150220991A1 (en) 2014-02-05 2015-08-06 Harman International Industries, Incorporated External messaging in the automotive environment

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5240636A (en) * 1988-04-11 1993-08-31 Kent State University Light modulating materials comprising liquid crystal microdroplets dispersed in a birefringent polymeric matrix and method of making light modulating materials
US4994204A (en) * 1988-11-04 1991-02-19 Kent State University Light modulating materials comprising a liquid crystal phase dispersed in a birefringent polymeric phase
US5589958A (en) * 1995-02-09 1996-12-31 Lieb; Joseph A. Kitchen ensemble having windows with controllable opacity
US5867802A (en) * 1995-08-16 1999-02-02 Dew Engineering And Development Limited Biometrically secured control system for preventing the unauthorized use of a vehicle
US6249720B1 (en) * 1997-07-22 2001-06-19 Kabushikikaisha Equos Research Device mounted in vehicle
US20020091473A1 (en) * 2000-10-14 2002-07-11 Gardner Judith Lee Method and apparatus for improving vehicle operator performance
US20060012679A1 (en) * 2004-07-14 2006-01-19 Ressler Galen E Multifunction vehicle interior imaging system
US20060145825A1 (en) * 2005-01-05 2006-07-06 Mccall Clark E Virtual keypad for vehicle entry control
US8527146B1 (en) * 2012-01-30 2013-09-03 Google Inc. Systems and methods for updating vehicle behavior and settings based on the locations of vehicle passengers
US20140282931A1 (en) * 2013-03-18 2014-09-18 Ford Global Technologies, Llc System for vehicular biometric access and personalization

Cited By (86)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160244011A1 (en) * 2012-03-14 2016-08-25 Autoconnect Holdings Llc User interface and virtual personality presentation based on user profile
US20150070155A1 (en) * 2012-05-21 2015-03-12 Lawo Informationssysteme Gmbh Passenger information system and method
US20140309868A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc User interface and virtual personality presentation based on user profile
US20140310594A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc Configuration of haptic feedback and visual preferences in vehicle user interfaces
US9760698B2 (en) 2013-09-17 2017-09-12 Toyota Motor Sales, U.S.A., Inc. Integrated wearable article for interactive vehicle control system
US9902266B2 (en) 2013-09-17 2018-02-27 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with personal convenience reminders
US9340155B2 (en) 2013-09-17 2016-05-17 Toyota Motor Sales, U.S.A., Inc. Interactive vehicle window display system with user identification
US9387824B2 (en) 2013-09-17 2016-07-12 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with user identification and image recording
US9400564B2 (en) 2013-09-17 2016-07-26 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with a safe driving reminder system
US9807196B2 (en) 2013-09-17 2017-10-31 Toyota Motor Sales, U.S.A. Automated social network interaction system for a vehicle
US10042163B2 (en) * 2013-12-18 2018-08-07 Subaru Corporation Onboard image display device for vehicle
US20150168721A1 (en) * 2013-12-18 2015-06-18 Fuji Jukogyo Kabushiki Kaisha Onboard image display device for vehicle
US20150268840A1 (en) * 2014-03-20 2015-09-24 Nokia Corporation Determination of a program interaction profile based at least in part on a display region
US9598049B2 (en) * 2014-07-09 2017-03-21 Toyota Motor Engineering & Manufacturing North America, Inc. Hands free access system for a vehicle closure
US20160012654A1 (en) * 2014-07-09 2016-01-14 Toyota Motor Engineering & Manufacturing North America, Inc. Hands Free Access System for a Vehicle Closure
US20180024733A1 (en) * 2015-01-02 2018-01-25 Volkswagen Ag User interface and method for the hybrid use of a display unit of a transportation means
US10838604B2 (en) * 2015-01-02 2020-11-17 Volkswagen Ag User interface and method for the hybrid use of a display unit of a transportation means
US11715143B2 (en) 2015-11-17 2023-08-01 Nio Technology (Anhui) Co., Ltd. Network-based system for showing cars for sale by non-dealer vehicle owners
US10692126B2 (en) 2015-11-17 2020-06-23 Nio Usa, Inc. Network-based system for selling and servicing cars
US10067663B2 (en) * 2015-12-03 2018-09-04 Hyundai Motor Company System and method for setting a three-dimensional effect
US20170160915A1 (en) * 2015-12-03 2017-06-08 Hyundai Motor Company System and method for setting a three-dimensional effect
US10166995B2 (en) 2016-01-08 2019-01-01 Ford Global Technologies, Llc System and method for feature activation via gesture recognition and voice command
CN108340852A (en) * 2016-01-26 2018-07-31 通用汽车环球科技运作有限责任公司 The system and method for guarantee for vehicle ride safety and people and property
US10699326B2 (en) 2016-07-07 2020-06-30 Nio Usa, Inc. User-adjusted display devices and methods of operating the same
US9984522B2 (en) 2016-07-07 2018-05-29 Nio Usa, Inc. Vehicle identification or authentication
US10679276B2 (en) 2016-07-07 2020-06-09 Nio Usa, Inc. Methods and systems for communicating estimated time of arrival to a third party
US10672060B2 (en) 2016-07-07 2020-06-02 Nio Usa, Inc. Methods and systems for automatically sending rule-based communications from a vehicle
US10685503B2 (en) 2016-07-07 2020-06-16 Nio Usa, Inc. System and method for associating user and vehicle information for communication to a third party
US10354460B2 (en) 2016-07-07 2019-07-16 Nio Usa, Inc. Methods and systems for associating sensitive information of a passenger with a vehicle
US10032319B2 (en) 2016-07-07 2018-07-24 Nio Usa, Inc. Bifurcated communications to a third party through a vehicle
US9946906B2 (en) 2016-07-07 2018-04-17 Nio Usa, Inc. Vehicle with a soft-touch antenna for communicating sensitive information
US10388081B2 (en) 2016-07-07 2019-08-20 Nio Usa, Inc. Secure communications with sensitive user information through a vehicle
US10262469B2 (en) 2016-07-07 2019-04-16 Nio Usa, Inc. Conditional or temporary feature availability
US11005657B2 (en) 2016-07-07 2021-05-11 Nio Usa, Inc. System and method for automatically triggering the communication of sensitive information through a vehicle to a third party
US10304261B2 (en) 2016-07-07 2019-05-28 Nio Usa, Inc. Duplicated wireless transceivers associated with a vehicle to receive and send sensitive information
US9928734B2 (en) 2016-08-02 2018-03-27 Nio Usa, Inc. Vehicle-to-pedestrian communication systems
US10083604B2 (en) 2016-11-07 2018-09-25 Nio Usa, Inc. Method and system for collective autonomous operation database for autonomous vehicles
US10031523B2 (en) 2016-11-07 2018-07-24 Nio Usa, Inc. Method and system for behavioral sharing in autonomous vehicles
US9963106B1 (en) 2016-11-07 2018-05-08 Nio Usa, Inc. Method and system for authentication in autonomous vehicles
US11024160B2 (en) 2016-11-07 2021-06-01 Nio Usa, Inc. Feedback performance control and tracking
US10708547B2 (en) 2016-11-11 2020-07-07 Nio Usa, Inc. Using vehicle sensor data to monitor environmental and geologic conditions
US10694357B2 (en) 2016-11-11 2020-06-23 Nio Usa, Inc. Using vehicle sensor data to monitor pedestrian health
US10410064B2 (en) 2016-11-11 2019-09-10 Nio Usa, Inc. System for tracking and identifying vehicles and pedestrians
US11710153B2 (en) 2016-11-21 2023-07-25 Nio Technology (Anhui) Co., Ltd. Autonomy first route optimization for autonomous vehicles
US10970746B2 (en) 2016-11-21 2021-04-06 Nio Usa, Inc. Autonomy first route optimization for autonomous vehicles
US10515390B2 (en) 2016-11-21 2019-12-24 Nio Usa, Inc. Method and system for data optimization
US11922462B2 (en) 2016-11-21 2024-03-05 Nio Technology (Anhui) Co., Ltd. Vehicle autonomous collision prediction and escaping system (ACE)
US10699305B2 (en) 2016-11-21 2020-06-30 Nio Usa, Inc. Smart refill assistant for electric vehicles
US10410250B2 (en) 2016-11-21 2019-09-10 Nio Usa, Inc. Vehicle autonomy level selection based on user context
US10949885B2 (en) 2016-11-21 2021-03-16 Nio Usa, Inc. Vehicle autonomous collision prediction and escaping system (ACE)
US10249104B2 (en) 2016-12-06 2019-04-02 Nio Usa, Inc. Lease observation and event recording
US10074223B2 (en) 2017-01-13 2018-09-11 Nio Usa, Inc. Secured vehicle for user use only
US10471829B2 (en) 2017-01-16 2019-11-12 Nio Usa, Inc. Self-destruct zone and autonomous vehicle navigation
US9984572B1 (en) 2017-01-16 2018-05-29 Nio Usa, Inc. Method and system for sharing parking space availability among autonomous vehicles
US10031521B1 (en) 2017-01-16 2018-07-24 Nio Usa, Inc. Method and system for using weather information in operation of autonomous vehicles
US10464530B2 (en) 2017-01-17 2019-11-05 Nio Usa, Inc. Voice biometric pre-purchase enrollment for autonomous vehicles
US10286915B2 (en) 2017-01-17 2019-05-14 Nio Usa, Inc. Machine learning for personalized driving
US11811789B2 (en) 2017-02-02 2023-11-07 Nio Technology (Anhui) Co., Ltd. System and method for an in-vehicle firewall between in-vehicle networks
US10897469B2 (en) 2017-02-02 2021-01-19 Nio Usa, Inc. System and method for firewalls between vehicle networks
US10234302B2 (en) 2017-06-27 2019-03-19 Nio Usa, Inc. Adaptive route and motion planning based on learned external and internal vehicle environment
US10710633B2 (en) 2017-07-14 2020-07-14 Nio Usa, Inc. Control of complex parking maneuvers and autonomous fuel replenishment of driverless vehicles
US10369974B2 (en) 2017-07-14 2019-08-06 Nio Usa, Inc. Control and coordination of driverless fuel replenishment for autonomous vehicles
US10837790B2 (en) 2017-08-01 2020-11-17 Nio Usa, Inc. Productive and accident-free driving modes for a vehicle
US10607079B2 (en) * 2017-09-26 2020-03-31 Toyota Research Institute, Inc. Systems and methods for generating three dimensional skeleton representations
US20190095711A1 (en) * 2017-09-26 2019-03-28 Toyota Research Institute, Inc. Systems and methods for generating three dimensional skeleton representations
US11856399B2 (en) 2017-10-03 2023-12-26 Google Llc Multi-factor authentication and access control in a vehicular environment
CN113766505A (en) * 2017-10-03 2021-12-07 Google LLC System and method for multi-factor authentication and access control in a vehicle environment
US11726474B2 (en) 2017-10-17 2023-08-15 Nio Technology (Anhui) Co., Ltd. Vehicle path-planner monitor and controller
US10635109B2 (en) 2017-10-17 2020-04-28 Nio Usa, Inc. Vehicle path-planner monitor and controller
US10606274B2 (en) 2017-10-30 2020-03-31 Nio Usa, Inc. Visual place recognition based self-localization for autonomous vehicles
US10935978B2 (en) 2017-10-30 2021-03-02 Nio Usa, Inc. Vehicle self-localization using particle filters and visual odometry
US10717412B2 (en) 2017-11-13 2020-07-21 Nio Usa, Inc. System and method for controlling a vehicle using secondary access methods
US10369966B1 (en) 2018-05-23 2019-08-06 Nio Usa, Inc. Controlling access to a vehicle using wireless access devices
US20190366916A1 (en) * 2018-05-29 2019-12-05 Valeo North America, Inc. Vehicle light with dual projection film
US11420552B2 (en) 2018-05-29 2022-08-23 Valeo North America, Inc. Vehicle light with dual projection film
US10899272B2 (en) * 2018-05-29 2021-01-26 Valeo North America, Inc. Vehicle light with dual projection film
CN111497736A (en) * 2018-12-28 2020-08-07 Koito Manufacturing Co., Ltd. Marker light system
WO2020158969A1 (en) * 2019-01-30 2020-08-06 LG Electronics Inc. Device provided in vehicle and method for controlling same
US11955522B2 (en) 2020-02-13 2024-04-09 Vanguard International Semiconductor Corporation Semiconductor structure and method of forming the same
USD961619S1 (en) * 2020-05-15 2022-08-23 Barel Ip, Inc. Computing device display screen or portion thereof with an icon
USD948570S1 (en) * 2020-05-15 2022-04-12 Barel Ip, Inc Computing device display screen or portion thereof with an icon
USD1012944S1 (en) 2021-08-03 2024-01-30 Icon Vehicle Dynamics Llc Display screen with graphical user interface
USD1024092S1 (en) * 2021-08-03 2024-04-23 Icon Vehicle Dynamics Llc Display screen with graphical user interface
CN114179610A (en) * 2021-12-23 2022-03-15 Chery Automobile Co., Ltd. Interface control method and device for heating, ventilating and adjusting an automobile seat
CN115503783A (en) * 2022-09-23 2022-12-23 CRRC Qingdao Sifang Co., Ltd. Information interaction system based on a transparent-display vehicle window
CN116279552A (en) * 2023-05-10 2023-06-23 NingboTech University Semi-active interaction method and device for vehicle cabin and vehicle

Also Published As

Publication number Publication date
KR102227424B1 (en) 2021-03-12
KR20160057458A (en) 2016-05-23
JP2016539346A (en) 2016-12-15
KR20200028497A (en) 2020-03-16
WO2015042005A1 (en) 2015-03-26
US9340155B2 (en) 2016-05-17
US20150077327A1 (en) 2015-03-19
JP6457535B2 (en) 2019-01-23
CN105556246B (en) 2018-04-27
EP3047236A1 (en) 2016-07-27
EP3047236B1 (en) 2020-01-15
CN105556246A (en) 2016-05-04

Similar Documents

Publication Publication Date Title
US9340155B2 (en) Interactive vehicle window display system with user identification
US9902266B2 (en) Interactive vehicle window display system with personal convenience reminders
US9400564B2 (en) Interactive vehicle window display system with a safe driving reminder system
EP2985571B1 (en) Method for remote communication with and through a vehicle
US9387824B2 (en) Interactive vehicle window display system with user identification and image recording
US9760698B2 (en) Integrated wearable article for interactive vehicle control system
WO2016032990A1 (en) Integrated wearable article for interactive vehicle control system
US9807196B2 (en) Automated social network interaction system for a vehicle
US9977593B2 (en) Gesture recognition for on-board display
US10171529B2 (en) Vehicle and occupant application integration
US8979159B2 (en) Configurable hardware unit for car systems
US20160269469A1 (en) Vehicle Supervising of Occupant Applications

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PISZ, JAMES T.;SCHULZ, JASON A.;REEL/FRAME:032283/0705

Effective date: 20140210

AS Assignment

Owner name: TOYOTA MOTOR SALES, U.S.A., INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.;REEL/FRAME:032655/0227

Effective date: 20140401

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION