WO2014172369A2 - Intelligent vehicle for assisting vehicle occupants and incorporating vehicle crate for blade processors - Google Patents

Info

Publication number
WO2014172369A2
WO2014172369A2 (PCT application PCT/US2014/034194)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
user
occupant
sensors
Prior art date
Application number
PCT/US2014/034194
Other languages
French (fr)
Other versions
WO2014172369A3 (en)
Inventor
Christopher P. Ricci
Octavian Chincisan
Alisher Yusupov
Gary Fong
Original Assignee
Flextronics Ap, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Flextronics Ap, Llc filed Critical Flextronics Ap, Llc
Priority to PCT/US2014/034194
Priority to US14/253,506 (granted as US9082239B2)
Publication of WO2014172369A2
Publication of WO2014172369A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/951Indexing; Web crawling techniques
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/10Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device
    • B60R25/1004Alarm systems characterised by the type of sensor, e.g. current sensing means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D23/00Control of temperature
    • G05D23/19Control of temperature characterised by the use of electric means
    • G05D23/1917Control of temperature characterised by the use of electric means using digital means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25Integrating or interfacing systems involving database management systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/02Reservations, e.g. for tickets, services or events
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/20Administration of product repair or maintenance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/01Customer relationship services
    • G06Q30/012Providing warranty services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0265Vehicular advertisement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0633Lists, e.g. purchase orders, compilation or processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0639Item locations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0645Rental transactions; Leasing transactions
    • G06Q50/40
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639Details of the system layout
    • G08B13/19647Systems specially adapted for intrusion detection in or around a vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/016Personal emergency signalling and security systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41422Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance located in transportation means, e.g. personal vehicle
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06Authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06Authentication
    • H04W12/068Authentication using credential vaults, e.g. password manager applications or one time password [OTP] applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/08Access security
    • H04W12/084Access security using delegated authorisation, e.g. open authorisation [OAuth] protocol
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/08Access security
    • H04W12/088Access security using filters or firewalls
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/12Messaging; Mailboxes; Announcements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/20Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • H04W4/21Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel for social networking applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/40Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/44Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/40Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/46Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/60Subscription-based services using application servers or record carriers, e.g. SIM application toolkits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W48/00Access restriction; Network selection; Access point selection
    • H04W48/02Access restriction performed under specific conditions
    • H04W48/04Access restriction performed under specific conditions based on user or terminal location or mobility data, e.g. moving direction, speed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W76/00Connection management
    • H04W76/10Connection setup
    • H04W76/11Allocation or use of connection identifiers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/52Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking for indicating emergencies
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/15Biometric patterns based on physiological signals, e.g. heartbeat, blood flow
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/06Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B29/00Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
    • G08B29/18Prevention or correction of operating errors
    • G08B29/185Signal analysis techniques for reducing or preventing false alarms or for enhancing the reliability of the system
    • G08B29/188Data fusion; cooperative systems, e.g. voting among different detectors
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096733Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
    • G08G1/096741Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where the source of the transmitted information selects which information to transmit to each vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096775Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/60Context-dependent security
    • H04W12/68Gesture-dependent or behaviour-dependent
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W36/00Hand-off or reselection arrangements
    • H04W36/34Reselection control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/005Moving wireless networks

Definitions

  • One way to instill comfort in a vehicle is to create an environment within the vehicle similar to that of an individual's home. Integrating features in a vehicle that are associated with comfort found in an individual's home can ease a traveler's transition from home to vehicle.
  • Several manufacturers have added comfort features in vehicles such as the following: leather seats, adaptive and/or personal climate control systems, music and media players, ergonomic controls, and, in some cases, Internet connectivity.
  • While these manufacturers have added features to a conveyance, they have built comfort around a vehicle and failed to build a vehicle around comfort.
  • a method can be provided by the present disclosure that includes the steps of:
  • first and second operating systems executing on a common microprocessor, wherein the first operating system comprises one or more applications performing a critical vehicle task, function, and/or operation and the second operating system comprises one or more applications performing an infotainment task, function, and/or operation;
  • first and second operating systems can execute on a common microprocessor
  • the first operating system can comprise one or more applications performing a critical vehicle task, function, and/or operation
  • the second operating system can comprise one or more applications performing an infotainment task, function, and/or operation
  • a tangible and non-transient computer readable medium in the vehicle can include microprocessor executable and readable instructions that, when executed by the microprocessor, can perform operations including:
  • a vehicle can include:
  • a first computer having a first operating system executing one or more applications performing a critical vehicle task, function, and/or operation;
  • a second computer having a different second operating system executing one or more applications performing at least one of an infotainment task, function, and/or operation and a less critical task, function, and/or operation, wherein at least one of the following can be true:
  • a computer control module collects one or more metrics regarding an operation of the first and/or second computers and/or a network in communication with the first and/or second computer, determines whether the collected one or more metrics are normal and/or abnormal, and applies one or more of the following rules:
  • Each member of the set of critical vehicle tasks, functions, and/or operations can be different from each member of the set of infotainment tasks, functions, and/or operations, whereby the intersection of the two sets is null.
  • the critical task, function, or operation can be one or more of: monitoring, controlling, and/or operating the ECU, TCU, door settings, window settings, and/or blind spot monitor; monitoring, controlling, and/or operating the safety equipment; monitoring and/or controlling certain critical sensors, such as the power source controller and energy output sensor, engine temperature, oil pressure sensing, hydraulic pressure sensors, sensors for headlights and other lights, vehicle control system sensors, and/or steering/torque sensor; controlling the operation of the engine, head light control unit, power steering, display panel, switch state control unit, power control unit, and/or brake control unit; and/or issuing alerts to a user and/or remote monitoring entity of potential problems with a vehicle operation.
  • the infotainment task, function or operation can be one or more of receiving, processing, and/or providing media and/or multimedia content.
  • the first operating system can be simulated by a first virtual machine and the second operating system by a second virtual machine, and the computer control module can be implemented in, or configured as, a hypervisor module.
  • the first and second operating systems can concurrently and independently run on a common kernel.
  • a first requirement applied to the first operating system to determine abnormal operation can be different from a second requirement applied to the second operating system to determine abnormal operation.
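  • As an illustration of applying different requirements per operating system, the following minimal sketch (in Python; the metric names and thresholds are hypothetical and not taken from the disclosure) shows a control module checking a critical-vehicle OS against stricter abnormal-operation limits than an infotainment OS:

        # Hypothetical sketch: a control module applying different abnormality
        # requirements to a critical-vehicle OS and an infotainment OS.
        from dataclasses import dataclass

        @dataclass
        class OsMetrics:
            cpu_load: float        # 0.0 - 1.0, from the monitoring window
            missed_deadlines: int  # scheduling deadlines missed in the window

        def is_abnormal(metrics: OsMetrics, max_load: float, max_missed: int) -> bool:
            """True when the collected metrics violate the OS-specific limits."""
            return metrics.cpu_load > max_load or metrics.missed_deadlines > max_missed

        critical = OsMetrics(cpu_load=0.62, missed_deadlines=0)
        infotainment = OsMetrics(cpu_load=0.95, missed_deadlines=3)

        # The critical OS is held to a stricter requirement than the infotainment OS.
        print(is_abnormal(critical, max_load=0.75, max_missed=0))       # False
        print(is_abnormal(infotainment, max_load=0.98, max_missed=10))  # False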
  • the present disclosure can include a method, vehicle, and/or tangible and non- transient computer readable medium comprising the steps, operations, and/or functions of:
  • a type of the computational device can determine a specific predetermined area and/or zone of the vehicle, from among a plurality of predetermined areas and/or zones, to be used in applying the rules.
  • the computational device can be one or more of a tablet computer, laptop, smart phone, and personal digital assistant.
  • the specific predetermined area and/or zone of the vehicle can be at least part of the passenger compartment.
  • the device discovery daemon can perform sub-steps of step (a) including:
  • the determining step/operation/function (b) can base the determination of whether the computational device is located within the predetermined area and/or zone of the vehicle on one or more of: signal strength of a signal from the computational device as received by an access point of the vehicle, a received satellite-based position of the computational device, triangulation based on relative received signal strengths of a signal from the computational device as received by multiple access points of the vehicle, image processing of images of the predetermined area and/or zone, occupant presence and/or location information received by an on board vehicle sensor, whether the computational device is attempting to connect to the network and/or communication subsystem wirelessly or by hard wire connection, whether the computational device has moved relative to a selected access point during a defined time interval, whether the received signal strength of signaling from the computational device at a selected access point varies temporally, a type or service of the computational device, and input received from a user of the computational device.
  • When the computational device is determined to be located within the predetermined area and/or zone of the vehicle, it can be permitted to access or attempt to access the vehicle network and/or communication subsystem, and the device discovery daemon can determine a set of tasks, functions, and/or operations that can be performed and a set of tasks, functions, and/or operations that cannot be performed based on the determined location of the computational device.
  • the device discovery daemon can determine a level of confidence that the computational device is located within the predetermined area and/or zone and wherein the device discovery daemon determines that the computational device is located within the predetermined area and/or zone when the level of confidence has at least a threshold value.
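  • To illustrate the confidence determination, here is a minimal sketch (the weights, evidence sources, and threshold are hypothetical) that fuses several of the location signals listed above into a single confidence value and grants access only when it meets a threshold:

        # Hypothetical sketch: fusing location evidence into a confidence score.
        EVIDENCE_WEIGHTS = {
            "signal_strength": 0.4,   # strength at a vehicle access point
            "triangulation": 0.3,     # relative strengths at multiple access points
            "cabin_camera": 0.2,      # image processing of the zone
            "user_input": 0.1,        # input from the device's user
        }

        def zone_confidence(evidence: dict) -> float:
            """Weighted sum of per-source in-zone probabilities (each 0.0-1.0)."""
            return sum(EVIDENCE_WEIGHTS[k] * p for k, p in evidence.items())

        def permit_access(evidence: dict, threshold: float = 0.7) -> bool:
            return zone_confidence(evidence) >= threshold

        print(permit_access({"signal_strength": 0.9, "triangulation": 0.8,
                             "cabin_camera": 0.7, "user_input": 1.0}))  # True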
  • a method, vehicle, and/or computer executable instructions can be provided that perform at least the following steps, operations, and functions:
  • the user can be determined to be driving the vehicle when the user is determined to be in a driver's seat of the vehicle.
  • the user can be determined to be driving the vehicle when the vehicle is at least one of in gear and in motion.
  • the method/vehicle/instructions can further perform the following steps, operations, and functions:
  • (d) in response to step (c), the media controller subsystem can perform one or more of the following steps:
  • the user can be determined not to be driving the vehicle when the vehicle is at least one of not in gear, parked, and not in motion.
  • the user can be determined to be in a driver's seat of the vehicle when the user is located within a predetermined area and/or zone.
  • the determining step (a) can base the determination of whether the user is located within the predetermined area and/or zone of the vehicle on one or more of: signal strength of a signal from a computational device associated with the user as received by an access point of the vehicle, a received satellite-based position of the computational device, triangulation based on relative received signal strengths of a signal from the computational device as received by multiple access points of the vehicle, image processing of images of the predetermined area and/or zone, user presence and/or location information received by an on board vehicle sensor, and input received from a user of the computational device.
  • a media controller subsystem can include:
  • a microprocessor executable distributed network control server operable to access selected content on a public network external to a vehicle comprising the media controller subsystem, the distributed network control server having at least one of an assigned Internet Protocol address and global unicast address;
  • a microprocessor executable media server operable to receive requests for content from a vehicle occupant and provide requested content to a portable computational device associated with the vehicle occupant, the media server having a contactable electronic address on a local area network maintained by the vehicle;
  • a microprocessor readable memory to store content; and
  • a microprocessor executable virtual network console operable to provide the computational device with remote access to the media server.
  • the distributed network control server, media server, and virtual network console can be on a common media processing board mounted on the vehicle.
  • the portable computational device can be discrete from, remote from, and in wireless communication with the media processing board.
  • the virtual network console can operate in accordance with the Remote Frame Buffer protocol on top of the TCP/IP suite of protocols, thereby causing the remote computational device to appear to a computer mounted in the vehicle as if the remote computational device is part of the on board vehicle control system comprising the computer.
  • the vehicle occupant, through the remote computational device, can control one or more of the following media presentation features of a screen and sound system mounted on the vehicle: volume, contrast, resolution, and channel selection.
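  • As a rough illustration of this split (class names, addresses, and the URL below are hypothetical; the disclosure contemplates a Remote Frame Buffer/VNC session over TCP/IP rather than this simplified call structure), the network control server reaches the public network via a global address while the media server serves occupants on the vehicle LAN:

        # Hypothetical sketch of the addressing split: a control server fetches
        # public content; a media server caches and serves it on the vehicle LAN.
        class NetworkControlServer:
            def __init__(self, global_address: str):
                self.global_address = global_address  # e.g. a global unicast address

            def fetch(self, url: str) -> bytes:
                # A real system would retrieve content from the public network.
                return f"content-from:{url}".encode()

        class MediaServer:
            def __init__(self, lan_address: str, store: dict):
                self.lan_address = lan_address  # contactable address on the vehicle LAN
                self.store = store              # microprocessor-readable content memory

            def request(self, title: str, upstream: NetworkControlServer) -> bytes:
                if title not in self.store:
                    self.store[title] = upstream.fetch(f"https://example.invalid/{title}")
                return self.store[title]

        upstream = NetworkControlServer("2001:db8::10")
        media = MediaServer("192.168.1.5", store={})
        print(media.request("road-trip-playlist", upstream))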
  • a method, vehicle, and instructions can perform at least the following steps, operations, and functions:
  • the portable computational device can be discrete from, remote from, and in wireless communication with the media processing board.
  • the filter and/or restriction can be applied to a media request of the vehicle occupant before and/or after the requested media content is accessed.
  • the filter and/or restriction can be one or more of an age-related content filter and/or restriction, a vehicle occupant seating location filter and/or restriction, and a privacy filter and/or restriction.
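  • A minimal sketch of such filtering (the specific rules and request fields are illustrative, not from the disclosure), applying an age-related filter and a seating-location filter to a media request:

        # Hypothetical sketch: age- and seat-based filters on a media request.
        def age_filter(request, occupant_age, min_age=13):
            return occupant_age >= min_age or request["rating"] == "all-ages"

        def seat_filter(request, seat):
            # e.g. block video playback on the driver's screen
            return not (seat == "driver" and request["kind"] == "video")

        def allow(request, occupant_age, seat):
            return age_filter(request, occupant_age) and seat_filter(request, seat)

        print(allow({"rating": "mature", "kind": "video"},
                    occupant_age=10, seat="rear"))     # False (age filter)
        print(allow({"rating": "all-ages", "kind": "audio"},
                    occupant_age=10, seat="driver"))   # True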
  • a method, vehicle, and tangible and non-transient computer readable medium can be provided to perform steps, operations, and functions, including:
  • a microprocessor executable vehicle control system can apply the rules based on the account corresponding to the vehicle occupant and the area and/or zone occupied by the vehicle occupant. The vehicle occupant can perform a first set of vehicle tasks, functions, and operations when in a first area and/or zone and a different set of vehicle tasks, functions, and operations when in a different second area and/or zone.
  • the microprocessor executable vehicle control system can apply the rules based on the account corresponding to the vehicle occupant and the operating state of the vehicle.
  • the vehicle occupant can perform a first set of vehicle tasks, functions and operations when the vehicle is in a first operating state and a different set of vehicle tasks, functions, and operations when the vehicle is in a second operating state.
  • When rule (i) applies, the vehicle occupant can control the requested vehicle task, function, and/or operation using one or more of a cell phone, laptop, tablet computer, and personal digital assistant.
  • a different authentication procedure can be used for the vehicle occupant when requesting a first set of vehicle tasks, operations, and functions than when requesting a second set of vehicle tasks, operations, and functions.
  • the account can include rights and privileges for the vehicle occupant with respect to a vehicle task, function or operation, security and/or authentication requirements and/or credentials for the vehicle occupant, and personal settings of the vehicle occupant.
  • the personal settings of the vehicle occupant can include a plurality of a seat setting, climate control setting, lighting setting, configuration of an instrument cluster on a screen, rear view mirror setting, driving mode, media channel setting or preset, media delivery preference, music genre preference, scheduled program, playlist, synchronization with cloud-based data associated with the vehicle occupant, application-specific personalization and selections, and a display setting and configuration.
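  • A minimal sketch of account-, zone-, and state-dependent permissions (the rule table below is illustrative, not from the disclosure):

        # Hypothetical sketch: per-account permissions varying with the occupied
        # zone and the vehicle's operating state.
        PERMISSIONS = {
            ("owner", "driver_zone", "parked"): {"climate", "navigation", "media"},
            ("owner", "driver_zone", "moving"): {"climate", "navigation"},
            ("guest", "rear_zone",   "moving"): {"media"},
        }

        def permitted(account: str, zone: str, state: str, task: str) -> bool:
            return task in PERMISSIONS.get((account, zone, state), set())

        print(permitted("owner", "driver_zone", "moving", "media"))  # False
        print(permitted("guest", "rear_zone", "moving", "media"))    # True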
  • a method/vehicle/computer readable instructions can perform at least the following steps/operations/functions:
  • activate a window in at least part of the displayed content by hovering over the window with a mouse cursor;
  • enable blind typing on the graphical user interface; and
  • remove one or more selectable objects from the displayed content.
  • a method/vehicle/computer readable instructions can perform at least the following steps/operations/functions:
  • mapping, by a microprocessor executable media controller subsystem on board and/or in communication with a computer network controlled by a computer on board the vehicle, a graphical user interface with a segmented control surface, whereby each segment corresponds to an item of content displayed on the graphical user interface;
  • the control surface can be electrically, magnetically, and electromagnetically nonresponsive to contact of the user's body part.
  • the control surface can be optically nonresponsive to contact of the user's body part.
  • the control surface can be part of an arm rest and/or dashboard and wherein the graphical user interface is virtual and not displayed to the user.
  • the control surface can be part of an arm rest and/or dashboard and wherein the graphical user interface is displayed to the user on a screen on board the vehicle.
  • a method/vehicle/computer readable instructions can perform at least the following steps/operations/functions:
  • the impairment can be one or more of a vision impairment, hearing impairment, dexterity impairment, mobility impairment, language impairment, and communication impairment.
  • the communication interface can be a screen on board the vehicle.
  • the location of the vehicle occupant can be determined, and the screen corresponding to the determined occupant location used as the communication interface.
  • the alteration can be one or more of:
  • enable blind typing on the graphical user interface;
  • enable text and/or a visual alternative to an audio channel associated with the displayed content;
  • a method/vehicle/computer readable instructions can perform at least the following steps/operations/functions:
  • the notification can be one or more of a visual notification, an audible notification, and a haptic notification.
  • the notification can be an audible notification.
  • the audible notification can be one or more of a portion of the frequency range of the received sound, a frequency shifted portion of the received sound, and a phase shifted portion of the sound.
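  • A minimal signal-processing sketch of those options (the parameters and the plain-FFT approach are illustrative; a production system would use proper filtering), keeping only part of the received sound's frequency range and shifting it:

        # Hypothetical sketch: derive an audible notification from a received
        # sound by band-limiting and crudely frequency-shifting it.
        import numpy as np

        def band_and_shift(x: np.ndarray, fs: float, lo: float, hi: float,
                           shift_hz: float) -> np.ndarray:
            spectrum = np.fft.rfft(x)
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            spectrum[(freqs < lo) | (freqs > hi)] = 0.0   # keep a frequency band
            bins = int(round(shift_hz * len(x) / fs))
            spectrum = np.roll(spectrum, bins)            # crude frequency shift
            if bins > 0:
                spectrum[:bins] = 0.0                     # drop wrapped-around bins
            return np.fft.irfft(spectrum, n=len(x))

        fs = 8000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        siren = np.sin(2 * np.pi * 700 * t)               # e.g. a detected siren
        notification = band_and_shift(siren, fs, lo=500, hi=1000, shift_hz=200)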
  • a method/vehicle/computer readable instructions can perform at least the following step/operation/function:
  • the interface can be one or more of a steering wheel, pedal, a graphical user interface, and a setting and/or configuration of an automated vehicle response system.
  • the automated vehicle response system can be a collision avoidance system.
  • a method/vehicle/computer readable instructions can perform at least the following step/operation/function:
  • the vehicle-related information comprises at least one of a current and/or future vehicle location and path of vehicle travel.
  • the action performed can depend on a seating position of the vehicle occupant.
  • the persona can include one or more of bioinformatics, medical information, driving history, personal information, private information, travel information, and Internet browsing history and/or browsed content.
  • the vehicle related information can include one or more of vehicle context, state, external surroundings, location, past, current, and/or intended path of travel, waypoint, and destination.
  • the action can be one or more of making an appointment, making a reservation, purchasing an item on line, adding a waypoint or destination to path of travel on a navigation system of the vehicle, adding an entry into the occupant's electronic calendar, changing a destination or path of travel on the navigation system, warning the occupant, notifying the occupant, and sending a message to a person at a waypoint or destination of the vehicle regarding an arrival time.
  • the steps/functions/operations can determine relevant information other than the persona of the vehicle occupant and the vehicle-related information.
  • the relevant information can include one or more of a persona of a selected person not currently in the vehicle, a message from a friend or family member of the vehicle occupant, a current activity of the friend and/or family member, a location, hours of operation, and/or descriptive information about a point and/or location of interest near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a vehicle service facility near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a hotel and/or motel near the vehicle and/or the vehicle's path of travel, a current location of the friend or family member near the vehicle and/or the vehicle's path of travel, and a road condition along a path of travel of the vehicle.
  • the vehicle can obtain the persona from one or more of a different vehicle driven by the occupant and a home computer.
  • the vehicle can synchronize with the different vehicle and/or home computer when the vehicle is parked in proximity thereto and an ignition of the vehicle is turned off.
  • the maintaining of the persona can include collecting information from one or more vehicle sensors and/or from a remote information source.
  • a type of information collected can depend on a seating position of the occupant.
  • the type of information collected can depend on one or more of an identity of the occupant, an age of the occupant, and an association of the occupant with the vehicle.
  • the persona can be defined in a format (e.g., grammar, syntax, and/or semantics) that can be processed by vehicles of different manufacturers.
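  • As an illustration only, such a persona could be serialized in a self-describing, manufacturer-neutral format; the schema name and fields below are hypothetical:

        # Hypothetical sketch: a persona serialized in a manufacturer-neutral
        # format so vehicles from different makers can parse it.
        import json

        persona = {
            "schema": "vehicle-persona/1.0",   # agreed grammar/syntax version
            "occupant_id": "occupant-42",
            "seat_settings": {"position": 7, "lumbar": 3},
            "climate": {"temp_c": 21.5, "fan": 2},
            "media": {"presets": ["news", "jazz"], "volume": 11},
        }

        blob = json.dumps(persona)             # exchanged between vehicles / home PC
        restored = json.loads(blob)
        assert restored["climate"]["temp_c"] == 21.5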
  • a vehicle backplane assembly can include at least the following components:
  • a plurality of blade processors in a vehicle that includes a first set of blade processors installed prior to vehicle sale and not modifiable and/or replaceable by the vehicle owner and a second set of blade processors installable after vehicle sale and modifiable and/or replaceable by the vehicle owner.
  • Each blade processor includes a microprocessor, a memory, and a network interface, and each blade processor performs a different set of functions.
  • the restricted access by the customer to the first set of blade processors can be done through restrictions on software rights and privileges (e.g., read only rights and privileges to machine code with no right or privilege to modify same) and physical access to the first set of blade processors (e.g., through a locked housing containing one or more components of the blade processors).
  • the assembly can include a third set of blade processors that interfaces with the first and second set of blade processors to provide input to and/or receive output from a corresponding one of the first and second blade processors.
  • a satellite receiving system can be a member of the third set of blade processors and a navigation system a member of one of the first and second set of blade processors.
  • Each of the first and second sets of blade processors can include a corresponding Universal Serial Bus (“USB”) hub, the USB hub comprising a plurality of ports to permit devices to connect to the USB hub.
  • the first set of blades can include a master blade processor operable to inventory hardware and/or software in communication with the backplane assembly, assign blade processors to applications attempting to execute, and/or determine a health state of a selected blade processor.
  • a vehicle can include a plurality of blade processors, each blade processor performing a function that is at least one of a vehicle task, function, or operation and an infotainment task, function, or operation.
  • the blade processors can have a backplane.
  • the backplane can have a first communication zone defining a trusted network within the vehicle to connect with trusted computational devices and/or modules provided or certified by the vehicle manufacturer but not untrusted computational devices and/or modules provided by vehicle occupants. Whether or not a computational device and/or module is certified by a manufacturer can be determined using known license check procedures, such as a unique identifier, a unique credential (e.g., password or encrypted or unencrypted key), and the like.
  • When the license check succeeds, the computational device and/or module is deemed to be certified by the manufacturer.
  • When the license check fails, the computational device and/or module is not deemed to be certified by the manufacturer.
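  • A minimal sketch of such a license check (the HMAC-based credential scheme and key handling are assumptions for illustration, not the manufacturer's actual procedure):

        # Hypothetical sketch: a module proves certification with a credential
        # derived from its unique identifier.
        import hmac, hashlib

        MANUFACTURER_KEY = b"demo-key"  # placeholder; a real key would be protected

        def expected_credential(module_id: str) -> str:
            return hmac.new(MANUFACTURER_KEY, module_id.encode(),
                            hashlib.sha256).hexdigest()

        def is_certified(module_id: str, credential: str) -> bool:
            return hmac.compare_digest(expected_credential(module_id), credential)

        good = expected_credential("nav-module-01")
        print(is_certified("nav-module-01", good))     # True  -> trusted zone
        print(is_certified("nav-module-01", "bogus"))  # False -> untrusted zone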
  • the backplane can have a second communication zone defining an untrusted network to connect with the untrusted computational devices.
  • the backplane can have a third communication zone providing power and data transmission to the plurality of blade processors.
  • the first and second communication zones can be connected logically on opposing sides of a firewall blade processor.
  • a master blade processor can enable a connection to a computational device connected to the first communication zone upon verification that the connected computational device is certified by a vehicle manufacturer.
  • the first and second communication zones can be configured as separate Ethernet switches.
  • the first and second communication zones are typically not in signal communication with one another.
  • a firewall can have a dedicated slot that bridges the first and second communication zones.
  • An output of each of the plurality of blade processors can be an IP message framed into an Ethernet packet.
  • a sensor and/or controller of the vehicle can communicate by a Controller Area Network ("CAN") protocol.
  • An Ethernet bus controller can terminate a CAN bus to the sensor and/or controller and a CAN Ethernet controller subsystem can translate an Ethernet message from a blade processor to a CAN protocol-based message.
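  • To illustrate the translation, a minimal sketch (the 5-byte framing is hypothetical, not a standard) that wraps a CAN frame into an IP-style payload and unpacks it again:

        # Hypothetical sketch: wrap a CAN frame into an IP-style message so
        # blade processors can consume it, and translate back.
        import struct

        def can_to_ip_payload(can_id: int, data: bytes) -> bytes:
            assert len(data) <= 8                       # classic CAN data limit
            return struct.pack("!IB", can_id, len(data)) + data

        def ip_payload_to_can(payload: bytes) -> tuple[int, bytes]:
            can_id, length = struct.unpack("!IB", payload[:5])
            return can_id, payload[5:5 + length]

        msg = can_to_ip_payload(0x1A0, b"\x01\x42")     # e.g. a sensor reading
        print(ip_payload_to_can(msg))                   # (416, b'\x01B')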
  • the plurality of blade processors can be configured as a crate having the backplane connector and on-backplane Ethernet interfaces enabling the blade processors to communicate with one another using Ethernet messages.
  • a vehicle can include a plurality of blade processors, each blade processor performing a function that is at least one of a vehicle task, function, or operation and an infotainment task, function, or operation.
  • the plurality of blade processors can include a master blade processor operable to assign a component and/or module requiring a blade processor for execution to a selected blade processor.
  • the master blade processor can inventory hardware and software capabilities of components and/or modules in communication with the master blade processor to provide an application resource table comprising component and/or module identity, functional description, and/or computational resource requirements and/or capabilities needed for execution of the component and/or module.
  • a blade processor can be assigned to execute the component and/or module when the blade processor satisfies the corresponding computational resource requirements and/or capabilities.
  • the master blade processor can assign a network address to each blade processor and communication device in communication with the master blade processor.
  • the master blade processor can assign a component and/or module to a selected blade processor when the selected blade processor is available or, though unavailable, is executing a component and/or module having a lower priority than the component and/or module to be assigned to the selected blade processor.
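  • A minimal sketch of this assignment policy (the data structures and priorities are illustrative): assign a module to a free blade if one exists, otherwise preempt a blade running lower-priority work.

        # Hypothetical sketch: master-blade assignment with priority preemption.
        def assign(blades: dict, module: str, priority: int):
            """blades maps blade-id -> None (free) or (module, priority)."""
            for bid, job in blades.items():
                if job is None:                       # prefer an available blade
                    blades[bid] = (module, priority)
                    return bid
            victim = min(blades, key=lambda b: blades[b][1])
            if blades[victim][1] < priority:          # preempt lower priority only
                blades[victim] = (module, priority)
                return victim
            return None                               # nothing preemptible

        blades = {"blade-1": ("infotainment-ui", 2), "blade-2": None}
        print(assign(blades, "collision-avoidance", 9))  # blade-2 (was free)
        print(assign(blades, "nav-reroute", 5))          # blade-1 (preempts prio 2)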
  • the present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration. For example, operating multiple operating systems on a common on board computer platform for vehicle control and infotainment tasks, functions, and operations can provide a high level of vehicle security and successfully block intrusive attacks.
  • the network security configurations can enable both infotainment and critical vehicle tasks, functions, and operations to be performed automatically and concurrently in a vehicle without increasing driver danger from computer-targeted attacks by an outside source.
  • Using a common computer to run safely both the first and second operating systems can not only reduce vehicle manufacturing costs but also provide a simpler computational architecture.
  • the device discovery daemon, by locating the portable device seeking access to the vehicle network, can not only provide higher levels of security from intrusive attacks via a computational device external to the vehicle but also protect the privacy and resources of the vehicle network.
  • the application of accessibility technologies to the driver's screen can successfully reduce driver distractions while enabling the driver to perform permissible computational tasks.
  • the availability of accessibility technologies in the vehicle can not only comply with prevailing legal requirements but also enhance passenger enjoyment.
  • the use of network control and media servers and virtual network console functionality on a common processing board can provide a simple yet effective way to stream media to devices within the vehicle network.
  • the use of media filters and restrictions based on one or more of the identity of the vehicle occupant requesting media content, the identity of a portable computational device associated with the vehicle occupant, and the spatial location of the vehicle occupant and/or remote computational device can enable compliance with prevailing laws regarding driver distractions while providing the vehicle owner with the ability to control media access by one or more selected vehicle occupants.
  • the use of user accounts to control passenger access to vehicle tasks, functions, and operations can prevent a non-driving passenger, for instance, from controlling a critical vehicle task, function or operation and confusing or frustrating the driver.
  • the inactive control surface can enable a vehicle occupant to perform tasks, functions and operations without viewing his or her screen.
  • the blade processor architecture can take advantage of the pervasiveness of TCP/IP and the large volume of components and/or modules made for IP-based distributed systems, such as Ethernet.
  • the architecture can be a modular distributed system that can be upgraded and/or extended over time without changing the basic processing architecture. Because the various subsystems can communicate with one another by protocol-based messages, there is no requirement that all software be written in a specific language or execute on a specific operating system. This can allow developers to use the most appropriate run time environment for the subsystem being developed.
  • a subsystem can be developed using a real-time operating system ("RTOS") without constraining the runtime environment of less demanding subsystems (e.g., a console subsystem).
  • each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • the term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more,” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising,” “including,” and “having” can be used interchangeably.
  • Accessible technology refers to any computer technology that users can adjust and/or employ to accommodate their vision, dexterity, hearing, cognitive, language, learning, and/or speech needs. Accessibility technology can be in the form of accessibility features or settings built into software programs, or of specialty hardware devices or software programs.
  • the term "assistive technology" refers to any technology that users can adjust and/or employ to accommodate their vision, dexterity, hearing, cognitive, language, learning, and/or speech needs.
  • Examples of assistive technology include accessible technology, adjuncts, peripherals, plug-ins, and add-ins.
  • the term "disability” refers to the consequence of an impairment that may be physical, cognitive, intellectual, mental, sensory, emotional, developmental, or some combination of these.
  • Section 503 defines “disability” as an impairment that substantially limits a major life activity, even if it were not to limit any other major life activity, or an impairment that is episodically active or in remission and would substantially limit a major life activity when active.
  • the term "automotive navigation system” can refer to a satellite navigation system designed for use in vehicles. It typically uses a GPS navigation device to acquire position data to locate the user on a road in the unit's map database. Using the road database, the unit can give directions to other locations along roads also in its database. Dead reckoning using distance data from sensors attached to the drivetrain, a gyroscope and an accelerometer can be used for greater reliability, as GPS signal loss and/or multipath can occur due to urban canyons or tunnels.
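  • As a hedged sketch of the dead reckoning described above, the following Python function propagates a last-known position from drivetrain distance and gyroscope yaw rate when GPS is unavailable; the function name, units, and update model are illustrative assumptions.

    # Illustrative dead-reckoning step for an automotive navigation unit:
    # when GPS is unavailable (e.g., tunnels, urban canyons), advance the
    # last known position using drivetrain distance and gyroscope heading.
    import math

    def dead_reckon(x: float, y: float, heading_rad: float,
                    distance_m: float, yaw_rate_rad_s: float, dt_s: float):
        """Advance (x, y, heading) by odometry distance and gyro yaw rate."""
        heading_rad += yaw_rate_rad_s * dt_s       # integrate gyro yaw rate
        x += distance_m * math.cos(heading_rad)    # project wheel distance
        y += distance_m * math.sin(heading_rad)
        return x, y, heading_rad

    # Example: 10 m traveled over 1 s while turning at 0.05 rad/s.
    # x, y, h = dead_reckon(0.0, 0.0, 0.0, 10.0, 0.05, 1.0)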
  • the term "bus” and variations thereof, as used herein, can refer to a subsystem that transfers information and/or data between various components.
  • a bus generally refers to the collection of communication hardware interfaces, interconnects, bus architectures, standards, and/or protocols defining the communication scheme for a communication system and/or communication network.
  • a bus may also refer to a part of the communication hardware that interfaces the communication hardware with the interconnects that connect to other components of the corresponding communication network.
  • the bus may be for a wired network, such as a physical bus, or a wireless network, such as part of an antenna or hardware that couples the communication hardware with the antenna.
  • a bus architecture supports a defined format in which information and/or data is arranged when sent and received through a communication network.
  • a protocol may define the format and rules of communication of a bus architecture.
  • the term "communication device" and variations thereof can be used interchangeably and may include any type of device capable of communicating with one or more other devices and/or across a communications network, via a communications protocol, and the like.
  • exemplary communication devices may include but are not limited to smartphones, handheld computers, laptops, netbooks, notebook computers, subnotebooks, tablet computers, scanners, portable gaming devices, phones, pagers, GPS modules, portable music players, and other Internet-enabled and/or network-connected devices.
  • a "communication modality" can refer to any protocol- or standard-defined or specific communication session or interaction, such as Voice-Over-Internet-Protocol ("VoIP"), cellular communications (e.g., IS-95, 1G, 2G, 3G, 3.5G, 4G, 4G/IMT-Advanced standards, 3GPP, WiMAX™, GSM, CDMA, CDMA2000, EDGE, 1xEVDO, iDEN, GPRS, HSDPA, TDMA, UMA, UMTS, ITU-R, and 5G), Bluetooth™, text or instant messaging (e.g., AIM, Blauk, eBuddy, Gadu-Gadu, IBM Lotus Sametime, ICQ, iMessage, IMVU, Lync, MXit, Paltalk, Skype, Tencent QQ, Windows Live Messenger™ or MSN Messenger™, Wireclub, Xfire, and Yahoo! Messenger™), email, Twitter (e.g., tweeting), Digital Service Protocol (DSP), and the like.
  • the term "communication system" or "communication network" and variations thereof, as used herein, can refer to a collection of communication components capable of one or more of transmitting, relaying, interconnecting, controlling, or otherwise manipulating information or data from at least one transmitter to at least one receiver.
  • the communication system may include a range of systems supporting point-to-point communication or broadcasting of the information or data.
  • a communication system may refer to the collection of individual communication hardware as well as the interconnects associated with and connecting the individual communication hardware.
  • Communication hardware may refer to dedicated communication hardware or may refer to a processor coupled with a communication means (e.g., an antenna) and running software capable of using the communication means to send and/or receive a signal within the communication system.
  • An interconnect refers to some type of wired or wireless communication link that connects various components, such as communication hardware, within a communication system.
  • a communication network may refer to a specific setup of a communication system with the collection of individual communication hardware and interconnects having some definable network topology.
  • a communication network may include a wired and/or wireless network having a pre-set or an ad hoc network structure.
  • Non-volatile media includes, for example, non-volatile random access memory (NVRAM), or magnetic or optical disks.
  • Volatile media includes dynamic memory, such as main memory.
  • Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, magneto-optical medium, a compact disc read only memory (CD-ROM), any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a random access memory (RAM), a programmable read only memory (PROM), an erasable programmable read only memory (EPROM), a FLASH-EPROM, a solid state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
  • a digital file attachment to an e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium.
  • In one configuration, the computer-readable media is configured as a database.
  • the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior art-recognized equivalents and successor media, in which the software implementations of the present disclosure are stored.
  • the terms "dash" and "dashboard" may be used interchangeably and can refer to any panel and/or area of a vehicle disposed adjacent to an operator, user, and/or passenger. Dashboards may include, but are not limited to, one or more control panel(s), instrument housing(s), head unit(s), indicator(s), gauge(s), meter(s), light(s), audio equipment, computer(s), screen(s), display(s), HUD unit(s), and graphical user interface(s).
  • the term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.
  • the term "desktop” refers to a metaphor used to portray systems.
  • a desktop is generally considered a "surface” that may include pictures, called icons, widgets, folders, etc. that can activate and/or show applications, windows, cabinets, files, folders, documents, and other graphical items.
  • the icons are generally selectable to initiate a task through user interface interaction to allow a user to execute applications and/or conduct other operations.
  • the term "display" refers to a portion of a physical screen used to display the output of a computer to a user.
  • the term "displayed image" refers to an image produced on the display.
  • a typical displayed image is a window or desktop.
  • the displayed image may occupy all or a portion of the display.
  • the term "display orientation” refers to the way in which a rectangular display is oriented for viewing.
  • the two most common types of display orientations are portrait and landscape.
  • In landscape mode, the display is oriented such that the width of the display is greater than the height of the display (such as a 4:3 ratio, which is 4 units wide and 3 units tall, or a 16:9 ratio, which is 16 units wide and 9 units tall).
  • the longer dimension of the display is oriented substantially horizontal in landscape mode while the shorter dimension of the display is oriented substantially vertical.
  • In portrait mode, the display is oriented such that the width of the display is less than the height of the display. Stated differently, the shorter dimension of the display is oriented substantially horizontal in portrait mode while the longer dimension of the display is oriented substantially vertical.
  • a multi-screen display can have one composite display that encompasses all the screens.
  • the composite display can have different display characteristics based on the various orientations of the device.
  • the term "electronic address" can refer to any contactable address, including a telephone number, instant message handle, e-mail address, Uniform Resource Locator ("URL"), Global Universal Identifier ("GUID"), Universal Resource Identifier ("URI"), Address of Record ("AOR"), electronic alias in a database, etc., or combinations thereof.
  • the term "gesture” refers to a user action that expresses an intended idea, action, meaning, result, and/or outcome.
  • the user action can include manipulating a device (e.g., opening or closing a device, changing a device orientation, moving a trackball or wheel, etc.), movement of a body part in relation to the device, movement of an implement or tool in relation to the device, audio inputs, etc.
  • a gesture may be made on a device (such as on the screen) or with the device to interact with the device.
  • the term "gesture capture" refers to the sensing or otherwise detecting of an instance and/or type of user gesture.
  • the gesture capture can be received by sensors in three-dimensional space. Further, the gesture capture can occur in one or more areas of a screen, for example, on a touch-sensitive display or a gesture capture region.
  • a gesture region can be on the display, where it may be referred to as a touch sensitive display, or off the display, where it may be referred to as a gesture capture area.
  • Infotainment may provide media and/or multimedia content.
  • An example is information-based media content or programming that also includes entertainment content.
  • a "multi-screen application” refers to an application that is capable of producing one or more windows that may simultaneously occupy one or more screens.
  • a multiscreen application commonly can operate in single-screen mode in which one or more windows of the application are displayed only on one screen or in multi-screen mode in which one or more windows are displayed simultaneously on multiple screens.
  • a "single-screen application” refers to an application that is capable of producing one or more windows that may occupy only a single screen at a time.
  • the term "online community" can mean a group of people who interact via a computer network, for social, professional, educational, and/or other purposes.
  • the interaction can use a variety of media formats, including wikis, blogs, chat rooms, Internet forums, instant messaging, email, and other forms of electronic media.
  • Many media formats may be used in social software separately and/or in combination, including text-based chat rooms and forums that use voice, video, text, or avatars.
  • the term "satellite positioning system receiver" can refer to a wireless receiver or transceiver to receive and/or send location signals from and/or to a satellite positioning system (SPS), such as the Global Positioning System ("GPS") (US), GLONASS (Russia), Galileo positioning system (EU), Compass navigation system (China), and Regional Navigational Satellite System (India).
  • the term "social network service" may include a service provider that builds online communities of people who share interests and/or activities, or who are interested in exploring the interests and/or activities of others.
  • Social network services can be network-based and may provide a variety of ways for users to interact, such as e-mail and instant messaging services.
  • social network can refer to a network-based social network.
  • the term "touch screen" refers to a physical structure that enables the user to interact with the computer by touching areas on the screen and that provides information to a user through a display.
  • the touch screen may sense user contact in a number of different ways, such as by a change in an electrical parameter (e.g., resistance or capacitance), acoustic wave variations, infrared radiation proximity detection, light variation detection, and the like.
  • In a resistive touch screen, normally separated conductive and resistive metallic layers in the screen pass an electrical current. When a user touches the screen, the two layers make contact in the contacted location, whereby a change in electrical field is noted and the coordinates of the contacted location are calculated.
  • In a capacitive touch screen, a capacitive layer stores electrical charge, which is discharged to the user upon contact with the touch screen, causing a decrease in the charge of the capacitive layer. The decrease is measured, and the contacted location coordinates are determined.
  • In a surface acoustic wave touch screen, an acoustic wave is transmitted through the screen, and the acoustic wave is disturbed by user contact.
  • A receiving transducer detects the user contact instance and determines the contacted location coordinates.
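  • By way of a hedged illustration, the sketch below shows how a resistive touch controller might convert measured layer voltages into display coordinates; the reference voltage, display resolution, and function name are assumptions for illustration and not part of the disclosure.

    # Sketch of converting resistive touch layer voltages into screen
    # coordinates. The reference voltage and resolution are hypothetical.
    def resistive_touch_coords(v_x: float, v_y: float, v_ref: float = 3.3,
                               width_px: int = 800, height_px: int = 480):
        """Map the voltage divider readings of the X and Y layers to pixels."""
        x = int((v_x / v_ref) * (width_px - 1))    # X layer voltage ratio
        y = int((v_y / v_ref) * (height_px - 1))   # Y layer voltage ratio
        return x, y

    # Example: mid-scale readings land near the center of the display.
    # print(resistive_touch_coords(1.65, 1.65))  # -> (399, 239)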
  • the term "window" refers to a, typically rectangular, displayed image on at least part of a display that contains or provides content different from the rest of the screen.
  • the window may obscure the desktop.
  • the dimensions and orientation of the window may be configurable either by another module or by a user. When the window is expanded, the window can occupy substantially all of the display space on a screen or screens.
  • the term "vehicle" can refer to a device or structure for transporting animate and/or inanimate objects (e.g., persons and/or things), such as a self-propelled conveyance.
  • a vehicle as used herein can include any conveyance or model of a conveyance, where the conveyance was originally designed for the purpose of moving one or more tangible objects, such as people, animals, cargo, and the like.
  • the term "vehicle" does not require that the conveyance moves or is capable of movement.
  • Typical vehicles may include but are in no way limited to cars, trucks, motorcycles, buses, automobiles, trains, railed conveyances, boats, ships, marine conveyances, submarine conveyances, airplanes, spacecraft, flying machines, human-powered conveyances, and the like.
  • the term "profile" can refer to any data structure, data store, and/or database that includes one or more items of information associated with a vehicle, a vehicle system, a device (e.g., a mobile device, laptop, mobile phone, etc.), or a person.
  • the term "in communication with,” as used herein, refers to any coupling, connection, or interaction using electrical signals to exchange information or data, using any system, hardware, software, protocol, or format, regardless of whether the exchange occurs wirelessly or over a wired connection.
  • FIG. 1 depicts an embodiment of a vehicle operating environment
  • FIG. 2 is a block diagram of an embodiment of a vehicle system
  • FIG. 3 is a block diagram of an embodiment of a vehicle control system environment
  • Fig. 4 is a block diagram of an embodiment of a vehicle communications subsystem
  • Fig. 5A is a first block diagram of an embodiment of a vehicle interior environment separated into areas and/or zones;
  • Fig. 5B is a second block diagram of an embodiment of a vehicle interior environment separated into areas and/or zones;
  • Fig. 5C is a third block diagram of an embodiment of a vehicle interior environment separated into areas and/or zones;
  • Fig. 6A depicts an embodiment of a sensor configuration for a vehicle
  • Fig. 6B depicts an embodiment of a sensor configuration for a zone of a vehicle
  • FIG. 7A is a block diagram of an embodiment of interior sensors for a vehicle
  • Fig. 7B is a block diagram of an embodiment of exterior sensors for a vehicle
  • Fig. 8A is a block diagram of an embodiment of a media subsystem for a vehicle
  • Fig. 8B is a block diagram of an embodiment of a user and device interaction subsystem for a vehicle
  • Fig. 8C is a block diagram of an embodiment of a Navigation subsystem for a vehicle
  • FIG. 9 is a block diagram of an embodiment of a communications subsystem for a vehicle
  • Fig. 10 is a block diagram of an embodiment of a software architecture for the vehicle control system
  • Fig. 11A is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11B is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11C is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11D is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11E is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11F is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11G is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11H is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11I is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11J is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 11K is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system
  • Fig. 12A is a diagram of an embodiment of a data structure for storing information about a user of a vehicle
  • FIG. 12B is a diagram of an embodiment of a data structure for storing information about a device associated with or in a vehicle;
  • Fig. 12C is a diagram of an embodiment of a data structure for storing information about a system of a vehicle
  • Fig. 12D is a diagram of an embodiment of a data structure for storing information about a vehicle
  • Fig. 13 is a flow or process diagram of a method for storing one or more settings associated with a user
  • Fig. 14 is a flow or process diagram of a method for establishing one or more settings associated with a user
  • Fig. 15 is a flow or process diagram of a method for storing one or more settings associated with a user
  • Fig. 16 is a flow or process diagram of a method for storing one or more gestures associated with a user
  • Fig. 17 is a flow or process diagram of a method for reacting to a gesture performed by a user
  • Fig. 18 is a flow or process diagram of a method for storing health data associated with a user
  • Fig. 19 is a flow or process diagram of a method for reacting to a gesture performed by a user
  • FIG. 20 depicts a flow diagram according to an embodiment
  • Fig. 21 depicts an on board media system according to an embodiment
  • Fig. 22 depicts a flow diagram according to an embodiment
  • FIG. 23 depicts a flow diagram according to an embodiment
  • Fig. 24 depicts a computational system according to an embodiment
  • Fig. 25 depicts a computational system according to an embodiment
  • Fig. 26 depicts a flow diagram according to an embodiment
  • Fig. 27 depicts a flow diagram according to an embodiment
  • Fig. 28 depicts a screen shot according to an embodiment
  • Fig. 29 depicts a screen shot according to an embodiment
  • Fig. 30 depicts a screen shot according to an embodiment
  • FIG. 31 depicts a flow diagram according to an embodiment
  • Fig. 32 depicts a flow diagram according to an embodiment
  • Fig. 33 depicts a segmented control surface according to an embodiment
  • Fig. 34 depicts a mapped display according to an embodiment
  • Fig. 35 depicts a flow diagram according to an embodiment
  • Fig. 36 depicts a flow diagram according to an embodiment
  • Figs. 37A-B depict a flow diagram according to an embodiment
  • Fig. 38 depicts a flow diagram according to an embodiment
  • Fig. 39 depicts a flow diagram according to an embodiment.
  • the embodiments may relate to an automobile and/or an automobile environment.
  • the automobile environment can include systems associated with the automobile and devices or other systems in communication with the automobile and/or automobile systems.
  • the systems can relate to communications systems and/or devices and may be capable of communicating with other devices and/or to an individual or group of individuals. Further, the systems can receive user input in unique ways.
  • the overall design and functionality of the systems provide for an enhanced user experience making the automobile more useful and more efficient.
  • the automobile systems may be electrical, mechanical, electro-mechanical, software- based, and/or combinations thereof.
  • a vehicle environment 100 that may contain a vehicle ecosystem is shown in Fig. 1.
  • the vehicle environment 100 can contain areas associated with a vehicle or conveyance 104.
  • the vehicle 104 is shown as a car but can be any type of conveyance.
  • the environment 100 can include at least three zones.
  • a first zone 108 may be inside a vehicle 104.
  • the zone 108 includes any interior space, trunk space, engine compartment, or other associated space within or associated with the vehicle 104.
  • the interior zone 108 can be defined by one or more techniques, for example, geo-fencing.
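  • As a hedged sketch of geo-fencing, the following Python function tests whether a sensed position lies inside a circular fence approximating an interior zone; real implementations may use polygons or 3D volumes, and the circular model and function name are assumptions for illustration.

    # Minimal geo-fencing sketch: test whether a sensed position lies inside
    # a circular fence approximating the interior zone 108.
    import math

    def inside_geofence(px: float, py: float,
                        cx: float, cy: float, radius_m: float) -> bool:
        """True if point (px, py) is within radius_m of fence center (cx, cy)."""
        return math.hypot(px - cx, py - cy) <= radius_m

    # Example: a device 1.2 m from the vehicle centroid, fence radius 2.5 m.
    # print(inside_geofence(1.2, 0.0, 0.0, 0.0, 2.5))  # -> True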
  • a second zone 112 may be delineated by line 120.
  • the zone 112 is created by a range of one or more sensors associated with the vehicle 104.
  • the area 112 is exemplary of the range of those sensors and what can be detected by those sensors associated with the vehicle 104.
  • While the sensor range is shown as a fixed and continuous oval, the sensor range may be dynamic and/or discontinuous, for example, in the case of a ranging sensor (e.g., radar, lidar, ladar, etc.).
  • the rest of the environment includes all space beyond the range of the sensors and is represented by space 116.
  • the environment 100 may have an area 116 that includes all areas beyond the sensor range 112.
  • the area 116 may include locations of travel that the vehicle 104 may proceed to in the future.
  • the vehicle system 200 may comprise hardware and/or software that conduct various operations for or with the vehicle 104.
  • the operations can include, but are not limited to, providing information to the user 216, receiving input from the user 216, and controlling the functions or operation of the vehicle 104, etc.
  • the vehicle system 200 can include a vehicle control system 204.
  • the vehicle control system 204 can be any type of computing system operable to conduct the operations as described herein.
  • An example of a vehicle control system may be as described in conjunction with Fig. 3.
  • the vehicle control system 204 may interact with a memory or storage system 208 that stores system data.
  • System data 208 may be any type of data needed for the vehicle control system 204 to effectively control the vehicle 104.
  • the system data 208 can represent any type of database or other storage system.
  • the system data 208 can be a flat file data system, an object-oriented data system, or some other data system that may interface with the vehicle control system 204.
  • the vehicle control system 204 may communicate with a device or user interface 212, 248.
  • the user interface 212, 248 may be operable to receive user input either through touch input, on one or more user interface buttons, via voice command, via one or more image sensors, or through a graphical user interface that may include a gesture capture region, as described in conjunction with the other figures provided herein.
  • the symbol 212, 248 can represent a device that is located or associated with the vehicle 104.
  • the device 212, 248 can be a mobile device, including, but not limited to, a mobile telephone, a mobile computer, or other type of computing system or device that is either permanently located in or temporarily associated with, but not necessarily connected to, the vehicle 104.
  • the vehicle control system 204 can interface with the device 212, 248 and leverage the device's computing capability to provide one or more of the features or functions as described herein.
  • the device or user interface 212, 248 can receive input or provide information to a user 216.
  • the user 216 may thus interact with the vehicle control system 204 through the interface or device 212, 248.
  • the device 212, 248 may include or have access to device data 220 and/or profile data 252.
  • the device data 220 can be any type of data that is used in conjunction with the device 212, 248 including, but not limited to, multimedia data, preferences data, device identification information, or other types of data.
  • the profile data 252 can be any type of data associated with at least one user 216 including, but in no way limited to, bioinformatics, medical information, driving history, personal information (e.g., home physical address, business physical address, contact addresses, likes, dislikes, hobbies, size, weight, occupation, business contacts - including physical and/or electronic addresses, personal contacts - including physical and/or electronic addresses, family members, and personal information related thereto, etc.), other user characteristics, advertising information, user settings and feature preferences, travel information, associated vehicle preferences, communication preferences, historical information (e.g., including historical, current, and/or future travel destinations), Internet browsing history, or other types of data.
  • the data may be stored as device data 220 and/or profile data 252 in a storage system similar to that described in conjunction with Figs. 12A-12D.
  • the profile data 252 may include one or more user profiles.
  • User profiles may be generated based on data gathered from one or more of vehicle preferences (e.g., seat settings, HVAC settings, dash configurations, and the like), recorded settings, geographic location information (e.g., provided by a satellite positioning system (e.g., GPS), Wi-Fi hotspot, cell tower data, etc.), mobile device information (such as mobile device electronic addresses, Internet browsing history and content, application store selections, user settings and enabled and disabled features, and the like), private information (such as user information from a social network, user presence information, user business account, and the like), secure data, biometric information, audio information from on board microphones, video information from on board cameras, Internet browsing history and browsed content using an on board computer and/or the local area network enabled by the vehicle 104, geographic location information (e.g., a vendor storefront, roadway name, city name, etc.), and the like.
  • the profile data 252 may include one or more user accounts.
  • User accounts may include access and permissions to one or more settings and/or feature preferences associated with the vehicle 104, communications, infotainment, content, etc.
  • a user account may allow access to certain settings for a particular user, while another user account may deny access to the settings for another user, and vice versa.
  • the access controlled by the user account may be based on at least one of a user account priority, role, permission, age, family status, a group priority (e.g., the user account priority of one or more users, etc.), a group age (e.g., the average age of users in the group, a minimum age of the users in the group, a maximum age of the users in the group, and/or combinations thereof, etc.).
  • a user 216 may be allowed to purchase applications (e.g., software, etc.) for the vehicle 104 and/or a device associated with the vehicle 104 based on information associated with the user account.
  • This user account information may include a preferred payment method, permissions, and/or other account information.
  • the user account information may be part of the user profile and/or other data stored in the profile data 252.
  • In one example, an adult user (e.g., a user with an age of 18 years old and/or over, etc.) may be located in an area of a vehicle 104, such as a rear passenger area.
  • a child user (e.g., a user with an age of 17 years old and/or less, etc.) may be located in the same, or close, area.
  • the user account information in the profile data 252 associated with both the adult user and the child user may be used by the vehicle 104 in determining whether content is appropriate for the area given the age of the child user.
  • In this example, a graphic movie containing violence (e.g., a movie associated with a mature rating, such as a Motion Picture Association of America (MPAA) rating of "R," "NC-17," etc.) may be determined to be inappropriate for presentation in the area occupied by the child user.
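  • As a hedged sketch of the age-based media filtering described above, the following Python function withholds mature-rated content from an area when any occupant there does not meet the rating's minimum age; the rating-to-age table and function name are illustrative assumptions, not the patented method.

    # Sketch of an age-based media filter: content with a mature rating is
    # withheld from a zone if any occupant is below the rating's minimum age.
    # The rating table and thresholds are assumptions for illustration.
    RATING_MIN_AGE = {"G": 0, "PG": 0, "PG-13": 13, "R": 17, "NC-17": 18}

    def content_allowed(rating: str, occupant_ages: list) -> bool:
        """Allow playback only if every occupant meets the rating's minimum age."""
        min_age = RATING_MIN_AGE.get(rating, 18)  # unknown ratings treated strictly
        return all(age >= min_age for age in occupant_ages)

    # Example: adult (35) and child (9) share the rear passenger area.
    # print(content_allowed("R", [35, 9]))   # -> False
    # print(content_allowed("PG", [35, 9]))  # -> True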
  • the vehicle control system 204 may also communicate with or through a communication network 224.
  • the communication network 224 can represent any type of wireless and/or wired communication system that may be included within the vehicle 104 or operable to communicate outside the vehicle 104.
  • the communication network 224 can include a local area communication capability and a wide area communication capability.
  • the communication network 224 can include a Bluetooth® wireless system, an 802.11x wireless system (e.g., 802.11g, 802.11n, 802.11ac, or the like), a CAN bus, an Ethernet network within the vehicle 104, or other types of communication networks that may function with or be associated with the vehicle 104.
  • the communication network 224 can also include wide area communication capabilities, including one or more of, but not limited to, a cellular communication capability, satellite telephone communication capability, a wireless wide area network communication capability, or other types of communication capabilities that allow for the vehicle control system 204 to communicate outside the vehicle 104.
  • the vehicle control system 204 may communicate through the communication network 224 to a server 228 that may be located in a facility that is not within physical proximity to the vehicle 104.
  • the server 228 may represent a cloud computing system or cloud storage that allows the vehicle control system 204 to either gain access to further computing capabilities or to storage at a location outside of the vehicle 104.
  • the server 228 can include a computer processor and memory and be similar to any computing system as understood to one skilled in the art.
  • the server 228 may be associated with stored data 232.
  • the stored data 232 may be stored in any system or by any method, as described in conjunction with system data 208, device data 220, and/or profile data 252.
  • the stored data 232 can include information that may be associated with one or more users 216 or associated with one or more vehicles 104.
  • the stored data 232, being stored in a cloud or in a distant facility, may be exchanged among vehicles 104 or may be used by a user 216 in different locations or with different vehicles 104.
  • the server may be associated with profile data 252 as provided herein. It is anticipated that the profile data 252 may be accessed across the communication network 224 by one or more components of the system 200. Similar to the stored data 232, the profile data 252, being stored in a cloud or in a distant facility, may be exchanged among vehicles 104 or may be used by a user 216 in different locations or with different vehicles 104.
  • the vehicle control system 204 may also communicate with one or more sensors 236, 242, which are either associated with the vehicle 104 or communicate with the vehicle 104.
  • Vehicle sensors 242 may include one or more sensors for providing information to the vehicle control system 204 that determine or provide information about the environment 100 in which the vehicle 104 is operating. Embodiments of these sensors may be as described in conjunction with Figs. 6A-7B.
  • Non-vehicle sensor 236 can be any type of sensor that is not currently associated with the vehicle 104.
  • non-vehicle sensor 236 can be sensors in a traffic system operated by a third party that provides data to the vehicle control system 204.
  • non-vehicle sensor(s) 236 can be other types of sensors which provide information about the distant environment 116 or other information about the vehicle 104 or the environment 100. These non-vehicle sensors 236 may be operated by third parties but provide information to the vehicle control system 204. Examples of information provided by the sensors 236 and that may be used by the vehicle control system 204 may include weather tracking data, traffic data, user health tracking data, vehicle maintenance data, or other types of data, which may provide environmental or other data to the vehicle control system 204. The vehicle control system 204 may also perform signal processing of signals received from one or more sensors 236, 242.
  • Such signal processing may include estimation of a measured parameter from a single sensor, such as multiple measurements of a range state parameter from the vehicle 104 to an obstacle, and/or the estimation, blending, or fusion of a measured state parameter from multiple sensors such as multiple radar sensors or a combination of a ladar/lidar range sensor and a radar sensor.
  • Signal processing of such sensor signal measurements may comprise stochastic signal processing, adaptive signal processing, and/or other signal processing techniques known to those skilled in the art.
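  • One simple form of the estimation and blending mentioned above is inverse-variance weighting of two independent range measurements, sketched below in Python; the sensor variances, example values, and function name are illustrative assumptions.

    # Illustrative fusion of range measurements from two sensors (e.g., a
    # radar and a lidar) by inverse-variance weighting.
    def fuse_ranges(r1: float, var1: float, r2: float, var2: float):
        """Return the minimum-variance blend of two independent range estimates."""
        w1 = 1.0 / var1
        w2 = 1.0 / var2
        fused = (w1 * r1 + w2 * r2) / (w1 + w2)    # weighted mean
        fused_var = 1.0 / (w1 + w2)                # combined uncertainty
        return fused, fused_var

    # Example: radar reads 25.0 m (var 0.4), lidar reads 24.6 m (var 0.1).
    # print(fuse_ranges(25.0, 0.4, 24.6, 0.1))  # -> (24.68, 0.08)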
  • the various sensors 236, 242 may include one or more sensor memory 244. Embodiments of the sensor memory 244 may be configured to store data collected by the sensors 236, 242. For example, a temperature sensor may collect temperature data associated with a vehicle 104, user 216, and/or environment, over time.
  • the temperature data may be collected incrementally, in response to a condition, or at specific time periods.
  • the temperature data may be stored in the sensor memory 244.
  • the data may be stored along with an identification of the sensor and a collection time associated with the data.
  • this stored data may include multiple data points and may be used to track changes in sensor measurements over time.
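  • As a hedged sketch of what the sensor memory 244 might store, the following Python data structure pairs each reading with a sensor identification and a collection time so that changes can be tracked over time; the field names and example values are assumptions.

    # Sketch of a timestamped record that the sensor memory 244 might store.
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import List

    @dataclass
    class SensorRecord:
        sensor_id: str           # identification of the sensor
        value: float             # e.g., cabin temperature in deg C
        collected_at: datetime   # collection time associated with the data

    # Example: append one hypothetical temperature reading to the history.
    history: List[SensorRecord] = []
    history.append(SensorRecord("temp_zone_A", 22.5,
                                datetime.now(timezone.utc)))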
  • the sensor memory 244 can represent any type of database or other storage system.
  • the diagnostic communications module 256 may be configured to receive and transmit diagnostic signals and information associated with the vehicle 104.
  • diagnostics signals and information may include, but is in no way limited to, vehicle system warnings, sensor data, vehicle component status, service information, component health, maintenance alerts, recall notifications, predictive analysis, and the like.
  • Embodiments of the diagnostic communications module 256 may handle warning/error signals in a predetermined manner.
  • the signals for instance, can be presented to one or more of a third party, occupant, vehicle control system 204, and a service provider (e.g., manufacturer, repair facility, etc.).
  • the diagnostic communications module 256 may be utilized by a third party (i.e., a party other than the user 216, etc.) in communicating vehicle diagnostic information.
  • a manufacturer may send a signal to a vehicle 104 to determine a status associated with one or more components associated with the vehicle 104.
  • the diagnostic communications module 256 may communicate with the vehicle control system 204 to initiate a diagnostic status check. Once the diagnostic status check is performed, the information may be sent via the diagnostic communications module 256 to the manufacturer. This example may be especially useful in determining whether a component recall should be issued based on the status check responses returned from a certain number of vehicles.
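  • As a hedged sketch of this manufacturer-initiated diagnostic flow, the following Python handler builds a response for the components named in a status-check request; the request/response schema, field names, and function name are illustrative assumptions.

    # Sketch of the diagnostic flow: a request names components to check,
    # the vehicle gathers their status, and a response is returned.
    def handle_diagnostic_request(request: dict, component_status: dict) -> dict:
        """Build a diagnostic response for the components named in the request."""
        results = {name: component_status.get(name, "unknown")
                   for name in request.get("components", [])}
        return {"vehicle_id": request.get("vehicle_id"),
                "op": "diagnostic_status",
                "results": results}

    # Example: a manufacturer queries the brake and battery subsystems.
    # req = {"vehicle_id": "VIN123", "components": ["brakes", "battery"]}
    # print(handle_diagnostic_request(req, {"brakes": "ok", "battery": "degraded"}))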
  • Wired/wireless transceiver/communications ports 260 may be included.
  • the wired/wireless transceiver/communications ports 260 may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices.
  • Examples of wired/wireless transceiver/communications ports 260 include Ethernet ports, Universal Serial Bus (USB) ports, Institute of Electrical and Electronics Engineers (IEEE) 1394, or other interface ports.
  • An embodiment of a vehicle control environment 300 including a vehicle control system 204 may be as shown in Fig. 3. Beyond the vehicle control system 204, the vehicle control environment 300 can include one or more of, but is not limited to, a power source and/or power control module 316, a data storage module 320, user interface(s)/input interface(s) 324, vehicle subsystems 328, user interaction subsystem 332, Navigation subsystem 336, sensor(s)/sensor subsystem 340, communication subsystem 344, media subsystem 348, and/or device interaction subsystem 352.
  • the subsystems, modules, components, etc. 316-352 may include hardware, software, firmware, computer readable media, displays, input devices, output devices, etc. or combinations thereof.
  • the system, subsystems, modules, components, etc. 204, 316-352 may communicate over a network or bus 356.
  • This communication bus 356 may be bidirectional and perform data communications using any known or future-developed standard or protocol.
  • An example of the communication bus 356 may be as described in conjunction with Fig. 4.
  • the vehicle control system 204 can include a processor 304, memory 308, and/or an input/output (I/O) module 312.
  • the vehicle control system 204 may be a computer system, which can comprise hardware elements that may be electrically coupled.
  • the hardware elements may include one or more central processing units (CPUs) 304; one or more components of the I/O module 312 including input devices (e.g., a mouse, a keyboard, etc.) and/or one or more output devices (e.g., a display device, a printer, etc.).
  • the processor 304 may comprise a general purpose programmable processor or controller for executing application programming or instructions.
  • the processor 304 may, optionally, include multiple processor cores, and/or implement multiple virtual processors. Additionally or alternatively, the processor 304 may include multiple physical processors.
  • the processor 304 may comprise a specially configured application specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like.
  • the processor 304 generally functions to run programming code or instructions implementing various functions of the vehicle control system 204.
  • the input/output module 312 and associated ports may be included to support communications over wired or wireless networks or links, for example with other communication devices, server devices, and/or peripheral devices.
  • Examples of an input/output module 312 include an Ethernet port, a Universal Serial Bus (USB) port, Institute of Electrical and Electronics Engineers (IEEE) 1394, or other interface.
  • the vehicle control system 204 may also include one or more storage devices 308.
  • storage devices 308 may be disk drives, optical storage devices, solid-state storage devices such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like.
  • the vehicle control system 204 may additionally include a computer-readable storage media reader; a communications system (e.g., a modem, a network card (wireless or wired), an infra-red communication device, etc.); and working memory 308, which may include RAM and ROM devices as described above.
  • the vehicle control system 204 may also include a processing acceleration unit, which can include a digital signal processor (DSP), a special-purpose processor, and/or the like.
  • the computer-readable storage media reader can further be connected to a computer-readable storage medium, together (and, optionally, in combination with storage device(s)) comprehensively representing remote, local, fixed, and/or removable storage devices plus storage media for temporarily and/or more permanently containing computer-readable information.
  • the communications system may permit data to be exchanged with an external or internal network and/or any other computer or device described herein.
  • the term "storage medium” may represent one or more devices for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices, and/or other machine readable mediums for storing information.
  • the vehicle control system 204 may also comprise software elements including an operating system and/or other code, as described in conjunction with Fig. 10. It should be appreciated that alternate embodiments of the vehicle control system 204 may have numerous variations from that described herein. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software (including portable software, such as applets), or both. Further, connection to other computing devices such as network input/output devices may be employed.
  • the power source and/or power control module 316 can include any type of power source, including, but not limited to, batteries, alternating current sources (from connections to a building power system or power line), solar cell arrays, etc.
  • One or more components or modules may also be included to control the power source or change the characteristics of the provided power signal.
  • Such modules can include one or more of, but are not limited to, power regulators, power filters, alternating current (AC) to direct current (DC) converters, DC to AC converters, receptacles, wiring, other converters, etc.
  • the power source and/or power control module 316 functions to provide the vehicle control system 204 and any other system with power.
  • the data storage 320 can include any module for storing, retrieving, and/or managing data in one or more data stores and/or databases.
  • the database or data stores may reside on a storage medium local to (and/or resident in) the vehicle control system 204 or in the vehicle 104. Alternatively, some of the data storage capability may be remote from the vehicle control system 204 or automobile, and in communication (e.g., via a network) to the vehicle control system 204.
  • the database or data stores may reside in a storage-area network ("SAN") familiar to those skilled in the art.
  • any necessary files for performing the functions attributed to the vehicle control system 204 may be stored locally on the respective vehicle control system 204 and/or remotely, as appropriate.
  • the databases or data stores may be a relational database, and the data storage module 320 may be adapted to store, update, and retrieve data in response to specifically-formatted commands.
  • the data storage module 320 may also perform data management functions for any flat file, object oriented, or other type of database or data store.
  • a first data store that may be part of the vehicle control environment 300 is a profile data store 252 for storing data about user profiles and data associated with the users.
  • a system data store 208 can include data used by the vehicle control system 204 and/or one or more of the components 324-352 to facilitate the functionality described herein.
  • the data stores 208 and/or 252 may be as described in conjunction with Figs. 1 and/or 12A-12D.
  • Vehicle systems 328 can include any of the mechanical, electrical, electromechanical, computer, or other systems associated with the function of the vehicle 104.
  • vehicle systems 328 can include one or more of, but are not limited to, the steering system, the braking system, the engine and engine control systems, the electrical system, the suspension, the drive train, the cruise control system, the radio, the heating, ventilation, and air conditioning (HVAC) system, the windows and/or doors, etc.
  • Examples of the other systems and subsystems 324-352 may be as described further herein.
  • the user interface(s)/input interface(s) 324 may be as described in Figs. 2 and 8B; the vehicle subsystems 328 may be as described in Figs. 6A et seq.; the user interaction subsystem 332 may be as described in conjunction with the user/device interaction subsystem 817 of Fig. 8B; the Navigation subsystem 336 may be as described in Figs. 6A and 8C; the sensor(s)/sensor subsystem 340 may be as described in Figs. 7A and 7B; the communication subsystem 344 may be as described in Figs. 4 and 9; the media subsystem 348 may be as described in Fig. 8A; and the device interaction subsystem 352 may be as described in Fig. 2 and in conjunction with the user/device interaction subsystem 817 of Fig. 8B.
  • Fig. 4 illustrates an optional communications channel architecture 400 and associated communications components. Fig. 4 illustrates some of the optional components that can be interconnected via the communication channels/zones 404.
  • Communication channels/zones 404 can carry information on one or more of a wired and/or wireless communications link; in the illustrated example, there are three communications channels/zones 408, 412, and 416.
  • This optional environment 400 can also include an IP router 420, an operator cluster 424, one or more storage devices 428, one or more blades, such as master blade 432, and computational blades 436 and 440. Additionally, the communications channels/zones 404 can interconnect one or more displays, such as, remote display 1 444, remote display N 448, and console display 452. The communications channels/zones 404 also interconnect an access point 456, a Bluetooth® access point/USB hub 460, a Femtocell 464, and a storage controller 468.
  • the environment 400 optionally includes a firewall 484 which will be discussed hereinafter in greater detail.
  • Other components that could also share the communications channel/zones 404 include GPS 488, media controller 492, which is connected to one or more media sources 496, and one or more subsystems, such as subsystem switches 498.
  • the communications channels/zones 404 can be viewed as an I/O network or bus where the communications channels are carried on the same physical media.
  • the communication channels 404 can be split amongst one or more physical media and/or combined with one or more wireless communications protocols.
  • the communications channels 404 can be based on wireless protocols with no physical media interconnecting the various elements described herein.
  • the environment 400 shown in Fig. 4 can include a collection of blade processors that are housed in a "crate.”
  • the crate can have a PC-style backplane connector 408 and a backplane Ethernet 408 that allows the various blades to communicate with one another.
  • FIG. 4 Various other functional elements illustrated in Fig. 4 can be integrated into this crate architecture with, as discussed hereinafter, various zones utilized for security.
  • the backplane 404/408 can have two separate Ethernet zones that may or may not be on the same communications channel.
  • Optionally, the zones exist on a single communications channel on the I/O network/bus 408.
  • Alternatively, the zones are actually on different communications channels, e.g., 412, 416; however, the implementation is not restricted to any particular type of configuration.
  • This backplane or I/O network/bus 408 also optionally can provide power distribution to the various modules and blades illustrated in Fig. 4.
  • the red and green Ethernet zones, 417 and 413 respectively, can be implemented as Ethernet switches, with one on each side of the firewall 484. In accordance with an optional embodiment, the two Ethernets (untrusted and trusted) are not connected to one another.
  • the connector geometry for the firewall can be different for the Ethernet zones than for the blades that are a part of the system.
  • the red zone 417 only needs to go from the modular connector to the input side of the backplane connector of the firewall 484. While Fig. 4 indicates that there are five external red zone connectors to the firewall 484, provisions can be made for any number of ports with the connections being made at the access point 456, the Bluetooth® access point (combo controller) 460, Femtocell 464, storage controller 468, and/or firewall 484. Optionally, the external port connections can be made through a manufacturer-configurable modular connector panel, and one or more of the red zone Ethernet ports could be available through a customer supplied crate which allows, for example, wired Ethernet connections from a bring-your-own-device (BYOD) to the firewall 484.
  • the green zone 413 goes from the output side of the firewall 484 and generally defines the trusted Ethernet.
  • the Ethernet on the backplane 408 essentially implements an Ethernet switch for the entire system, defining the Ethernet backbone of the vehicle 104. All other modules, e.g., blades, etc., can connect to a standard backplane bus and the trusted Ethernet.
  • switch ports can be reserved to connect to an output modular connector panel to distribute the Ethernet throughout the vehicle 104, e.g., connecting such elements as the console display 452, remote displays 444, 448, GPS 488, etc.
  • Optionally, only trusted components, either provided or approved by the manufacturer, can connect to the trusted side of the network.
  • the environment 400 utilizes IPv6 over Ethernet connections wherever possible.
  • Using, for example, the Broadcom single-twisted-pair Ethernet technology, wiring harnesses are simplified and data transmission speeds are maximized.
  • While the Broadcom single-twisted-pair Ethernet technology can be used, in general, the systems and methods can work comparably well with any type of well-known Ethernet technology or other comparable communications technology.
  • the I/O network/bus 408 is a split-bus concept that contains three independent bus structures:
  • The green zone 413 - the trusted Ethernet environment. This zone 413 can be used to connect manufacturer certified devices, such as GPS units, remote displays, subsystem switches, and the like, to the vehicle network 404.
  • Manufacturer certified devices can be implemented by vendors that allow the vehicle software system to validate whether or not a device is certified to operate with the vehicle 104.
  • only certified devices are allowed to connect to the trusted side of the network.
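  • As a hedged sketch of certified-device validation for the trusted green zone, the following Python check admits a device only if the fingerprint of its certificate appears in a manufacturer-provisioned allow-list; the fingerprint scheme, placeholder value, and function name are assumptions for illustration.

    # Illustrative gate for the trusted (green zone 413) Ethernet: a device
    # is admitted only if the SHA-256 fingerprint of its certificate appears
    # in a manufacturer-provisioned allow-list.
    import hashlib

    CERTIFIED_FINGERPRINTS = {
        # hypothetical fingerprint provisioned by the manufacturer
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def admit_to_green_zone(device_cert_der: bytes) -> bool:
        """Allow a device onto the trusted network only if it is certified."""
        fingerprint = hashlib.sha256(device_cert_der).hexdigest()
        return fingerprint in CERTIFIED_FINGERPRINTS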
  • The I/O bus 409 - the I/O bus may be used to provide power and data transmission to bus-based devices, such as the vehicle solid state drive, the media controller blade 492, the computational blades 436, 440, and the like.
  • the split-bus structure can have the following minimum configuration:
  • One slot can implement, for example, WiFi, Bluetooth®, and/or USB connectivity 456, which can be provided in, for example, the customer crate;
  • An expansion controller slot allows the I/O bus to be extended and provides additional Ethernet switch ports for one or more of the red or green zones (which may require that the basic green zone Ethernet switch implementation support additional ports beyond the initial three needed for the basic exemplary system).
  • the red zone 417 can be implemented as an 8-port Ethernet switch that has three actual bus ports within the crate with the remaining five ports being available on the customer crate.
  • the crate implements red zone slots for the firewall controller 484, the combo controller which includes WiFi, Bluetooth®, and the USB hub (456, 460), and the IP router 420.
  • the firewall controller 484 can have a dedicated slot that bridges the red zone 417, green zone 413, and uses the I/O bus for power connections.
  • the firewall 484 can be implemented by a dummy module that simply bridges the red zone 417 and the green zone 413 without necessarily providing any firewall functionality.
  • the combo controller 460 that includes the WiFi, Bluetooth®, and USB hub can be provided for consumer device connections. This controller can also implement the IPv6 (un-routable) protocol to ensure that all information is packetized for transmission via IP over the Ethernet in the I/O network/bus 408.
  • the combo controller 460 with the USB hub can have ports in the customer crate.
  • the combo controller 460 can implement USB discovery functions and packetizes the information for transmission via IP over Ethernet.
  • the combo controller 460 can also facilitate installation of the correct USB driver for the discovered device, such as a BYOD from the user.
  • the combo controller 460 and USB hub can then map the USB address to a "local" IPv6 address for interaction with one or more of the computational blades which is generally going to be the media controller 492.
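  • As a hedged sketch of this USB-to-IPv6 mapping, the following Python function derives a "local" (un-routable) IPv6 address for a discovered USB device from its bus number and device address; the unique-local prefix and packing scheme are illustrative assumptions.

    # Sketch of the combo controller's mapping from a discovered USB device
    # to a "local" IPv6 address so blades (e.g., the media controller 492)
    # can reach it over the Ethernet backbone.
    import ipaddress

    def usb_to_local_ipv6(bus: int, device_addr: int) -> ipaddress.IPv6Address:
        """Derive a unique local (un-routable) IPv6 address for a USB device."""
        prefix = ipaddress.IPv6Network("fd00:cafe::/64")  # hypothetical ULA prefix
        host_part = (bus << 8) | device_addr              # pack bus/address
        return prefix[host_part]

    # Example: USB bus 1, device address 4.
    # print(usb_to_local_ipv6(1, 4))  # -> fd00:cafe::104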
  • the IP router 420 can implement Internet access through a manufacturer provided service. This service can allow, for example, a manufacturer to offer value- added services to be integrated into the vehicle information systems. The existence of the manufacturer provided Internet access can also allow the "e-Call" function and other vehicle data recorder functions to be implemented. IP router 420 also allows, for example, WiMax, 4G LTE, and other connections to the Internet through a service provider that can be, for example, contracted by the manufacturer. Internally, the IP router 420 can allow cellular handset connections to the Internet through a Femtocell 464 that is part of the IP router implementation. The IP router 420, with the Femtocell 464, can also allow a cone of silence functionality to be implemented.
  • the IP router 420 can be an optional component for a vehicle provided by, for example, the manufacturer, a dealer, or installed by a user. In the absence of the IP router 420, it is possible to connect a consumer handheld device to the I/O network/bus 408 using, for example, either WiFi or Bluetooth® 456, 460. While functionality may be somewhat reduced when using a handheld device instead of a built-in Ethernet connection, systems and methods of this invention can also work utilizing this consumer handheld device which then connects to the Internet via, for example, WiMax, 4G, 4G LTE, or the like.
  • Figs. 5A-5C show configurations of a vehicle 104.
  • a vehicle 104 may provide functionality based at least partially on one or more areas, zones, and distances associated with the vehicle 104. Non-limiting examples of this functionality are provided herein below.
  • the sensor arrangement 500 can include one or more areas 508 within the vehicle.
  • An area can be a larger part of the environment inside or outside of the vehicle 104.
  • area one 508A may include the area within the trunk space or engine space of the vehicle 104 and/or the front passenger compartment.
  • Area two 508B may include a portion of the interior space 108 (e.g., a passenger compartment, etc.) of the vehicle 104.
  • the area N, 508N may include the trunk space or rear compartment area, when included within the vehicle 104.
  • the interior space 108 may also be divided into other areas.
  • one area may be associated with the front passenger's and driver's seats, a second area may be associated with the middle passengers' seats, and a third area may be associated with a rear passenger's seat.
  • Each area 508 may include one or more sensors that are positioned or operate to provide environmental information about that area 508.
  • Each area 508 may be further separated into one or more zones 512 within the area 508.
  • area 1 508A may be separated into zone A 512A, and zone B 512B.
  • Each zone 512 may be associated with a particular portion of the interior occupied by a passenger.
  • zone A 512A may be associated with a driver.
  • Zone B 512B may be associated with a front passenger.
  • Each zone 512 may include one or more sensors that are positioned or configured to collect information about the environment or ecosystem associated with that zone or person.
  • a passenger area 508B may include more than two zones as described in conjunction with area 508A.
  • area 508B may include three zones, 512C, 512D, and 512E. These three separate zones 512C, 512D, and 512E may be associated with three passenger seats typically found in the rear passenger area of a vehicle 104.
  • An area 508N may include a single zone 512N, as there may be no separate passenger areas but instead a single trunk area within the vehicle 104.
  • the number of zones 512 within an area is unlimited, as is the number of areas 508 inside the vehicle 104. Further, it should be noted that there may be one or more areas 508 or zones 512 located outside the vehicle 104 that may have a specific set of sensors associated therewith. One possible data model for these areas and zones is sketched below.
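Purely as an illustrative aid, the Python sketch below models areas 508 containing zones 512, each with its own sensor list; the Zone and Area class names are invented for this example.

```python
from dataclasses import dataclass, field

@dataclass
class Zone:
    """A zone 512: a portion of an area associated with one occupant."""
    name: str
    sensors: list = field(default_factory=list)

@dataclass
class Area:
    """An area 508: a larger part of the environment inside or outside
    the vehicle 104, optionally divided into zones 512."""
    name: str
    zones: list = field(default_factory=list)

# Example layout following Fig. 5A: area 1 with driver/passenger zones,
# area 2 with three rear-seat zones, area N as a single trunk zone.
vehicle_areas = [
    Area("area 1 (508A)", [Zone("zone A (512A): driver"),
                           Zone("zone B (512B): front passenger")]),
    Area("area 2 (508B)", [Zone("zone C (512C)"), Zone("zone D (512D)"),
                           Zone("zone E (512E)")]),
    Area("area N (508N)", [Zone("zone N (512N): trunk")]),
]
```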
  • each area/access point 508, 456, 516, 520, and/or zone 512, associated with a vehicle 104 may comprise one or more sensors to determine a presence of a user 216 and/or device 212, 248 in and/or adjacent to each area 508, 456, 516, 520, and/or zone 512.
  • the sensors may include vehicle sensors 242 and/or non-vehicle sensors 236 as described herein. It is anticipated that the sensors may be configured to communicate with a vehicle control system 204 and/or the diagnostic communications module 256. Additionally or alternatively, the sensors may communicate with a device 212, 248. The communication of sensors with the vehicle 104 may initiate and/or terminate the control of device 212, 248 features. For example, a vehicle operator may be located in a second outside area 520 associated with a vehicle 104. As the operator approaches the first outside area 516, associated with the vehicle 104, the vehicle control system 204 may determine to control features associated with one or more device 212, 248 and diagnostic communications module 256.
  • the location of the device 212, 248 relative to the vehicle 104 may determine vehicle functionality and/or features to be provided and/or restricted to a user 216.
  • a device 212, 248 associated with a user 216 may be located at a second outside area 520 from the vehicle 104.
  • the vehicle 104 may lock one or more features (e.g., ignition access, vehicle access, communications ability, etc.) associated with the vehicle 104.
  • the vehicle 104 may provide an alert based on the distance of the device 212, 248 from the vehicle 104.
  • the vehicle 104 may unlock a door of the vehicle 104.
  • the various sensors 236, 242 may determine that the user 216 is in an area 508 and/or zone 512.
  • features of the vehicle 104, device 212, 248, and/or other components may be controlled based on rules stored in a memory, as sketched below.
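The sketch below is one hypothetical encoding of such stored rules, mapping a device's detected region (second outside area 520, first outside area 516, or inside the vehicle) to feature locks and alerts; the rule values and function name are assumptions, not the disclosed implementation.

```python
# Hypothetical rule table: detected region -> vehicle actions.
RULES = {
    "second_outside_area_520": {"lock": ["ignition", "doors"], "alert": False},
    "first_outside_area_516":  {"lock": ["ignition"], "alert": True},
    "inside_area_508":         {"lock": [], "alert": False, "unlock_doors": True},
}

def apply_region_rules(region: str) -> dict:
    """Return the actions for a device 212, 248 detected in `region`;
    unknown regions default to fully locked."""
    return RULES.get(region, {"lock": ["ignition", "doors"], "alert": False})

# A device approaching the vehicle moves 520 -> 516 -> inside:
for region in ("second_outside_area_520", "first_outside_area_516",
               "inside_area_508"):
    print(region, "->", apply_region_rules(region))
```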
  • Fig. 5B illustrates optional internal vehicle communications between one or more of the vehicle and one or more devices or between devices.
  • Various communications can occur utilizing one or more of Bluetooth®, NFC, WiFi, mobile hot spot, point-to-point communications, point-to-multipoint communications, an ad hoc network, or in general any known communications protocol over any known communications media or media-types.
  • various types of internal vehicle communications can be facilitated using an access point 456 that utilizes one or more of Bluetooth®, NFC, WiFi, wireless Ethernet, mobile hot spot technology, or the like.
  • Upon being connected with, and optionally authenticated to, the access point 456, the connected device is able to communicate with one or more of the vehicle systems and one or more other devices within the vehicle 104.
  • the type of connection to the access point 456 can be based on, for example, the zone 512, in which the device is located.
  • the user may identify their zone 512 in conjunction with an authentication procedure to the access point 456.
  • a driver in zone A 512A, upon authenticating to the access point 456, can cause the access point 456 to send a query to a device asking the device's user in which zone 512 they are located.
  • the zone 512 the user device is located in may have an impact on the type of communications, available bandwidth, the types of other devices or vehicle systems or subsystems the device could communicate with, and the like.
  • internal communications with zone A 512A may be given preferential treatment over communications originating from area 2 508B, which in turn may receive preferential treatment over communications originating within area N 508N.
  • the device in zone A 512A can include profile information that governs the other devices that are allowed to connect to the access point 456 and what those devices have access to, how they can communicate, how much bandwidth they are allocated, and the like. While, optionally, the device associated with zone A 512A will be considered the "master" controller of the profile that governs the internal vehicle communications, it should be appreciated that this choice is arbitrary, being premised on the assumption that there will always be a driver present in zone A 512A.
  • zone A 512A may not have a communications device in which case a device associated with one of the other areas or zones, such as zone B 512B, area 2 508B, or area N 508N could also be associated with or control this master profile.
  • various devices located within the various zones 512 can connect using, for example, ports provided by access point 456 or Bluetooth® access point/USB hub 460 as illustrated in Fig. 4.
  • the device(s) could connect utilizing the Femtocell 464 and optionally be directly connected via, for example, a standard Ethernet port.
  • each one of the areas, area 1 508A, area 2 508B, and area N 508N, can have associated therewith a profile that governs, for example, how many and what types of devices can connect from that area 508, bandwidth allocated to that area 508, the types of media or content available to device(s) within that area 508, the interconnection of devices within that area 508 or between areas 508, or, in general, can control any aspect of communication of an associated device with any one or more other associated devices/vehicle systems within the vehicle 104.
  • area 2 508B devices can be provided with full access to multimedia and infotainment available within the vehicle 104, however, devices in area 2 508B may be restricted from any access to vehicle functions. Only devices in area 1 508A may be able to access vehicle control functions such as when "parents" are located in area 1 508A and the children are located in area 2 508B. Optionally, devices found in zone E 512E of area 2 508B may be able to access limited vehicle control functionality such as climate control within area 2. Similarly, devices in area N 508N may be able to control climate features within zone N 512N.
  • profiles can be established that allow management of communications within each of the areas 508, and further optionally within each of the zones 512.
  • the profile can be granular in nature, controlling not only what type of devices can connect within each zone 512, but how those devices can communicate with other devices and/or the vehicle and the types of information that can be communicated, as sketched below.
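Offered only as an assumption-laden illustration (the Profile class and its fields are invented here), granular per-zone profiles might be represented and consulted as follows:

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """Hypothetical per-zone communications profile."""
    zone: str
    allowed_device_types: tuple          # e.g., ("phone", "tablet")
    bandwidth_kbps: int                  # allocated bandwidth
    may_access_vehicle_controls: bool    # e.g., climate control
    blocked_content: tuple               # content kept from this zone

PROFILES = {
    "zone A 512A": Profile("zone A 512A", ("phone",), 5000, True,
                           ("text messages", "multimedia")),  # protect the driver
    "zone E 512E": Profile("zone E 512E", ("phone", "tablet"), 2000, True, ()),
    "zone N 512N": Profile("zone N 512N", ("phone", "tablet"), 500, False, ()),
}

def may_send(content_type: str, dest_zone: str) -> bool:
    """Check whether `content_type` may be delivered to a device in `dest_zone`."""
    profile = PROFILES.get(dest_zone)
    return profile is not None and content_type not in profile.blocked_content

print(may_send("text messages", "zone A 512A"))  # False: driver is protected
print(may_send("multimedia", "zone E 512E"))     # True
```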
  • One optional technique involves one or more of the vehicle sensors detecting the presence of an individual within one of the zones 512. Upon detection of an individual in a zone 512, communications subsystems 344 and the access point 456 can cooperate to not only associate the device within the zone 512 with the access point 456 but to also determine the location of the device within an area, and optionally within a zone 512.
  • a profile associated with the vehicle 104 can store information identifying that device and/or a person, and optionally associating it with a particular zone 512 as a default. As discussed, there can be a master profile optionally associated with the device in zone A 512A; this master profile can govern communications with the communications subsystems 340 and where communications within vehicle 104 are to occur.
  • the areas 508 and zones 512 can have associated therewith restrictions as to which other users' devices a given user's device is permitted to connect with.
  • a first user's device can connect with any other user device in area 2 508B or area N 508N, but is restricted from connecting with a user device in area 1 508A, zone A 512A.
  • the first user device may be able to communicate with another user's device that is located in area 1 508A, zone B 512B.
  • These communications can include any type of standard communications such as sharing content, exchanging messages, forwarding or sharing multimedia or infotainment, or in general can include any communications that would ordinarily be available between two devices and/or the vehicle and vehicle systems.
  • for the device in area 1 508A, zone A 512A, there may be restrictions on the type of communications that can be sent to that device.
  • the user's device in area 1 508A, zone A 512A may be restricted from receiving one or more of text messages, multimedia, infotainment, or in general anything that can be envisioned as a potential distraction to the driver.
  • the communications between the various devices and the various zones 512 need not necessarily occur with the assistance of access point 456, but could also occur directly between the device(s), for example over an ad hoc network.
  • Fig. 5C outlines optional internal vehicle communications between one or more of the vehicle and one or more devices. More specifically, Fig. 5C illustrates an example of vehicle communications where the vehicle 104 is equipped with the necessary transceivers to provide a mobile hot spot functionality to any user device(s) therein, such as user devices 248A and 248N.
  • one or more user devices can connect to the access point 456.
  • This access point 456 is equipped to handle communications routing to not only the communication network/buses 224 for intra-vehicle communications, but optionally can also communicate with, for example, the Internet or the cloud, in cooperation with transceiver 260.
  • a firewall 484 can be provided that has the capability of not only blocking certain types of content, such as malicious content, but can also operate to exclude certain types of communications from emanating from the vehicle 104 and transceiver 260.
  • various profiles could be established in the firewall 484 that control not only the type of communications that can be received at the vehicle 104, but also the type of communications that can be sent from the vehicle 104.
  • the transceiver 260 can be any type of well-known wireless transceiver that communicates using a known communications protocol such as WiMax, 4G, 4G LTE, 3G, or the like.
  • the user devices can communicate via, for example, WiFi link 248 with the access point 456, with the transceiver 260 providing Internet connectivity to the various user devices.
  • the account can be established on a month-to-month basis with an associated fee, but billing could also be based on the amount of data transmitted or received, or determined in any other manner.
  • one or more of the user's devices and access point 456 can maintain profile information that governs how the user's devices are able to communicate with other devices, and optionally the Internet.
  • a profile can exist that only allows the user's devices to communicate with other user's devices and/or the vehicle, multimedia and/or the vehicle infotainment system, and may not be allowed access to the Internet via transceiver 260.
  • the profile can stipulate that the user's device could connect to the Internet via transceiver 260 for a specified period of time and/or up to a certain amount of data usage.
  • the user's device can have full access to the Internet via transceiver 260 with no limit on time or data usage. This would reduce the data usage of the user's device, since it is connected via WiFi to the access point 456, but would increase the data usage by transceiver 260, thereby shifting the billing for that data usage to the transceiver 260 instead of the user's device.
  • the various profiles may stipulate which user's device has priority for use of the bandwidth provided by the transceiver 260. For example, a user's device located in area 1 508A, zone A 512A may be given preferential routing treatment of data above that of a user's device in zone N 512N. In this manner, for example, a driver would be given priority for Internet access above that of the passengers. This could become important, for example, when the driver is trying to obtain traffic or direction information or, for example, when the vehicle is performing a download to update various software features.
  • the optional firewall 484 can cooperate with the access point 456 and the various profiles that are associated with the various devices within the vehicle 104, and can fully implement communications restrictions, control bandwidth limits, Internet accessibility, malicious software blocking, and the like. Moreover, the optional firewall 484 can be accessed by an administrator, with one or more of these configuration settings edited through an administrator's control panel. For example, in a scenario where parents are always in area 1 508A, it may be appropriate to give all of the user's devices in area 1 508A full access to the Internet utilizing transceiver 260, while restricting access and/or bandwidth to any other user devices within the vehicle 104. As the user's device and profile would be known by the firewall 484, upon the user's device being associated with the access point 456, the firewall 484 and transceiver 260 can be configured to allow communications in accordance with the stored profile. One possible realization of such priority and access rules is sketched below.
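For illustration only (the POLICY table, its priority scheme, and route_request are invented for this sketch), per-zone Internet-access and bandwidth-priority rules might look like this:

```python
# Hypothetical per-zone firewall/priority policy for transceiver 260.
POLICY = {
    # zone/area       internet access   priority (lower = served first)
    "zone A 512A": {"internet": True,  "priority": 0},  # driver first
    "area 1 508A": {"internet": True,  "priority": 0},  # e.g., parents
    "area 2 508B": {"internet": True,  "priority": 1},
    "zone N 512N": {"internet": False, "priority": 2},
}

def route_request(requests):
    """Order pending Internet requests by zone priority and drop
    requests from zones whose profile denies Internet access."""
    allowed = [r for r in requests if POLICY.get(r["zone"], {}).get("internet")]
    return sorted(allowed, key=lambda r: POLICY[r["zone"]]["priority"])

pending = [{"zone": "zone N 512N", "url": "video"},
           {"zone": "zone A 512A", "url": "traffic update"}]
print(route_request(pending))  # driver's traffic request served; zone N dropped
```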
  • a set of sensors or vehicle components 600 associated with the vehicle 104 may be as shown in Fig. 6A.
  • the vehicle 104 can include, among many other components common to vehicles, wheels 607, a power source 609 (such as an engine, motor, or energy storage system (e.g., battery or capacitive energy storage system)), a manual or automatic transmission 612, a manual or automatic transmission gear controller 616, a power controller 620 (such as a throttle), a vehicle control system 204, the display device 212, a braking system 636, a steering wheel 640, a power source activation/deactivation switch 644 (e.g., an ignition), an occupant seating system 648, a wireless signal receiver 653 to receive wireless signals from signal sources such as roadside beacons and other electronic roadside devices, and a satellite positioning system receiver 657 (e.g., a Global Positioning System ("GPS") (US), GLONASS (Russia), Galileo positioning system (EU), Compass navigation system (China), or Regional Navigational Satellite System (India) receiver).
  • the vehicle 104 can include a number of sensors in wireless or wired communication with the vehicle control system 204 and/or display devices 212, 248 to collect sensed information regarding the vehicle state, configuration, and/or operation.
  • Exemplary sensors may include one or more of, but are not limited to, wheel state sensor 660 to sense one or more of vehicle speed, acceleration, deceleration, wheel rotation, wheel speed (e.g., wheel revolutions-per-minute), wheel slip, and the like, a power source energy output sensor 664 to sense a power output of the power source 609 by measuring one or more of current engine speed (e.g., revolutions-per-minute), energy input and/or output (e.g., voltage, current, fuel consumption, and torque) (e.g., turbine speed sensor, input speed sensor, crankshaft position sensor, manifold absolute pressure sensor, mass flow sensor, and the like), and the like, a switch state sensor 668 to determine a current activation or deactivation state of the power source activation/deactivation switch 644, a transmission setting sensor 670 to determine a current setting of the transmission (e.g., gear selection or setting), and the like.
  • Examples of other sensors include safety system state sensors to determine a current state of a vehicular safety system (e.g., air bag setting (deployed or undeployed) and/or seat belt setting (engaged or not engaged)), light setting sensor (e.g., current headlight, emergency light, brake light, parking light, fog light, interior or passenger compartment light, and/or tail light state (on or off)), brake control (e.g., pedal) setting sensor, accelerator pedal setting or angle sensor, clutch pedal setting sensor, emergency brake pedal setting sensor, door setting (e.g., open, closed, locked or unlocked) sensor, engine temperature sensor, passenger compartment or cabin temperature sensor, window setting (open or closed) sensor, and one or more interior-facing or exterior-facing cameras or other imaging sensors (which commonly convert an optical image into an electronic signal but may include other devices for detecting objects, such as an electromagnetic radiation emitter/receiver that emits electromagnetic radiation and receives electromagnetic waves reflected by the object) to sense objects, such as other vehicles and pedestrians, and optionally determine the distance, speed, and/or trajectory of such objects.
  • the various sensors described above can be included in the depicted vehicle embodiment.
  • the signal carrier network 224 can be a network of signal conductors, a wireless network (e.g., a radio frequency, microwave, or infrared communication system using a communications protocol, such as Wi-Fi), or a combination thereof.
  • the vehicle control system 204 may also provide signal processing of one or more sensors, sensor fusion of similar and/or dissimilar sensors, signal smoothing in the case of erroneous "wild point" signals, and/or sensor fault detection. For example, ranging measurements provided by one or more RF sensors may be combined with ranging measurements from one or more IR sensors to determine one fused estimate of vehicle range to an obstacle target.
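As a simple sketch of such a fused range estimate (an inverse-variance weighted average is only one of many possible fusion rules; the disclosure does not mandate a particular one):

```python
def fuse_ranges(rf_range_m, rf_var, ir_range_m, ir_var):
    """Fuse an RF and an IR ranging measurement into one estimate of
    vehicle range to an obstacle target, weighting each sensor by the
    inverse of its measurement variance."""
    w_rf = 1.0 / rf_var
    w_ir = 1.0 / ir_var
    fused = (w_rf * rf_range_m + w_ir * ir_range_m) / (w_rf + w_ir)
    fused_var = 1.0 / (w_rf + w_ir)
    return fused, fused_var

# RF says 12.0 m (noisy), IR says 11.4 m (more precise):
print(fuse_ranges(12.0, 0.5, 11.4, 0.1))  # estimate lands nearer the IR reading
```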
  • the control system 204 may receive and read sensor signals, such as wheel and engine speed signals, as a digital input comprising, for example, a pulse width modulated (PWM) signal.
  • the processor 304 can be configured, for example, to read each of the signals into a port configured as a counter or configured to generate an interrupt on receipt of a pulse, such that the processor 304 can determine, for example, the engine speed in revolutions per minute (RPM) and the speed of the vehicle in miles per hour (MPH) and/or kilometers per hour (KPH).
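The following minimal sketch, with plain arithmetic standing in for the counter/interrupt hardware (an assumption of this example), illustrates how engine RPM and vehicle speed could be derived from pulse counts:

```python
def pulses_to_rpm(pulse_count, window_s, pulses_per_rev=1):
    """Convert pulses counted over `window_s` seconds into revolutions
    per minute; `pulses_per_rev` depends on the sensor (assumed here)."""
    revs = pulse_count / pulses_per_rev
    return revs * (60.0 / window_s)

def wheel_pulses_to_kph(pulse_count, window_s, wheel_circumference_m=2.0,
                        pulses_per_rev=4):
    """Convert wheel-speed pulses into km/h (example wheel geometry)."""
    revs_per_s = (pulse_count / pulses_per_rev) / window_s
    return revs_per_s * wheel_circumference_m * 3.6

# e.g., 25 crankshaft pulses in a 0.5 s counting window -> 3000.0 RPM
print(pulses_to_rpm(25, 0.5))
# e.g., 10 wheel pulses in 0.5 s with the assumed geometry -> 36.0 km/h
print(wheel_pulses_to_kph(10, 0.5))
```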
  • the current engine speed and vehicle speed can be received in a communication packet as numeric values from a conventional dashboard subsystem comprising a tachometer and a speedometer.
  • the transmission speed sensor signal can be similarly received as a digital input comprising a signal coupled to a counter or interrupt signal of the processor 304 or received as a value in a communication packet on a network or port interface from an existing subsystem of the vehicle 104.
  • the ignition sensor signal can be configured as a digital input, wherein a HIGH value represents that the ignition is on and a LOW value represents that the ignition is OFF.
  • Three bits of the port interface can be configured as a digital input to receive the gear shift position signal, representing eight possible gear shift positions.
  • the gear shift position signal can be received in a communication packet as a numeric value on the port interface.
  • the throttle position signal can be received as an analog input value, typically in the range 0-5 volts.
  • the throttle position signal can be received in a communication packet as a numeric value on the port interface.
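To make the digital and analog input handling above concrete, here is a small sketch; the sample register values and the 5-volt full scale are illustrative assumptions:

```python
def decode_ignition(bit: int) -> str:
    """HIGH (1) represents ignition ON; LOW (0) represents ignition OFF."""
    return "ON" if bit else "OFF"

def decode_gear(bits: int) -> int:
    """Three bits of the port interface encode eight gear shift positions."""
    return bits & 0b111  # positions 0..7

def decode_throttle(voltage: float, full_scale_v: float = 5.0) -> float:
    """Map the 0-5 V analog throttle signal to a percentage."""
    return max(0.0, min(voltage / full_scale_v, 1.0)) * 100.0

print(decode_ignition(1))    # ON
print(decode_gear(0b011))    # gear shift position 3
print(decode_throttle(2.5))  # 50.0 percent throttle
```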
  • the output of other sensors can be processed in a similar fashion.
  • Other sensors may be included and positioned in the interior space 108 of the vehicle 104. Generally, these interior sensors obtain data about the health of the driver and/or passenger(s), data about the safety of the driver and/or passenger(s), and/or data about the comfort of the driver and/or passenger(s).
  • the health data sensors can include sensors in the steering wheel that can measure various health telemetry for the person (e.g., heart rate, temperature, blood pressure, blood presence, blood composition, etc.). Sensors in the seats may also provide for health telemetry (e.g., presence of liquid, weight, weight shifts, etc.). Infrared sensors could detect a person's temperature; optical sensors can determine a person's position and whether the person has become unconscious. Other health sensors are possible and included herein.
  • Safety sensors can measure whether the person is acting safely. Optical sensors can determine a person's position and focus. If the person stops looking at the road ahead, the optical sensor can detect the lack of focus. Sensors in the seats may detect if a person is leaning forward or may be injured by a seat belt in a collision. Other sensors can detect that the driver has at least one hand on a steering wheel. Other safety sensors are possible and contemplated as if included herein.
  • Comfort sensors can collect information about a person's comfort. Temperature sensors may detect a temperature of the interior cabin. Moisture sensors can determine a relative humidity. Audio sensors can detect loud sounds or other distractions. Audio sensors may also receive input from a person through voice data. Other comfort sensors are possible and contemplated as if included herein.
  • Fig. 6B shows an optional interior sensor configuration for one or more zones 512 of a vehicle 104.
  • the areas 508 and/or zones 512 of a vehicle 104 may include sensors that are configured to collect information associated with the interior 108 of a vehicle 104.
  • the various sensors may collect environmental information, user information, and safety information, to name a few. Embodiments of these sensors may be as described in conjunction with Figs. 7A-8B.
  • the sensors may include one or more of optical, or image, sensors 622A-B (e.g., cameras, etc.), motion sensors 624A-B (e.g., utilizing RF, IR, and/or other sound/image sensing, etc.), steering wheel user sensors 642 (e.g., heart rate, temperature, blood pressure, sweat, health, etc.), seat sensors 677 (e.g., weight, load cell, moisture, electrical, force transducer, etc.), safety restraint sensors 679 (e.g., seatbelt, airbag, load cell, force transducer, etc.), interior sound receivers 692A-B, environmental sensors 694 (e.g., temperature, humidity, air, oxygen, etc.), and the like.
  • the image sensors 622A-B may be used alone or in combination to identify objects, users 216, and/or other features, inside the vehicle 104.
  • a first image sensor 622A may be located in a different position within a vehicle 104 from a second image sensor 622B.
  • the image sensors 622A-B may combine captured images to form, among other things, stereo and/or three-dimensional (3D) images.
  • the stereo images can be recorded and/or used to determine depth associated with objects and/or users 216 in a vehicle 104.
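As an illustrative aside (the focal length, baseline, and disparity numbers are invented for this example), depth can be recovered from a stereo pair of image sensors 622A-B using the classic pinhole relation Z = f·B/d:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: depth Z = f * B / d, where f is
    the focal length in pixels, B the distance between the two image
    sensors, and d the pixel disparity of the same feature in the two
    captured images."""
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the two images")
    return focal_px * baseline_m / disparity_px

# Two in-cabin sensors 0.30 m apart, f = 800 px, feature shifted 240 px:
print(stereo_depth_m(800, 0.30, 240))  # -> 1.0 m to the detected occupant
```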
  • the image sensors 622A-B used in combination may determine the complex geometry associated with identifying characteristics of a user 216.
  • the image sensors 622A-B may be used to determine dimensions between various features of a user's face (e.g., the depth/distance from a user's nose to a user's cheeks, a linear distance between the center of a user's eyes, and more). These dimensions may be used to verify, record, and even modify characteristics that serve to identify a user 216.
  • utilizing stereo images can allow for a user 216 to provide complex gestures in a 3D space of the vehicle 104. These gestures may be interpreted via one or more of the subsystems as disclosed herein.
  • the image sensors 622A-B may be used to determine movement associated with objects and/or users 216 within the vehicle 104. It should be appreciated that the number of image sensors used in a vehicle 104 may be increased to provide greater dimensional accuracy and/or views of a detected image in the vehicle 104.
  • the vehicle 104 may include one or more motion sensors 624A-B. These motion sensors 624A-B may detect motion and/or movement of objects inside the vehicle 104. Optionally, the motion sensors 624A-B may be used alone or in combination to detect movement. For example, a user 216 may be operating a vehicle 104 (e.g., while driving, etc.) when a passenger in the rear of the vehicle 104 unbuckles a safety belt and proceeds to move about the vehicle 104. In this example, the movement of the passenger could be detected by the motion sensors 624A-B. Optionally, the user 216 could be alerted of this movement by one or more of the devices 212, 248 in the vehicle 104.
  • a passenger may attempt to reach for one of the vehicle control features (e.g., the steering wheel 640, the console, icons displayed on the head unit and/or device 212, 248, etc.).
  • the movement (i.e., reaching) of the passenger may be detected by the motion sensors 624A-B.
  • the path, trajectory, anticipated path, and/or some other direction of movement/motion may be determined using the motion sensors 624A-B.
  • the passenger may be prevented from interfacing with and/or accessing at least some of the vehicle control features (e.g., the features represented by icons may be hidden from a user interface, the features may be locked from use by the passenger, combinations thereof, etc.).
  • the user 216 may be alerted of the movement/motion such that the user 216 can act to prevent the passenger from interfering with the vehicle 104 controls.
  • the number of motion sensors in a vehicle 104, or areas of a vehicle 104 may be increased to increase an accuracy associated with motion detected in the vehicle 104.
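A toy sketch of the reach-detection response described above; the event fields, DemoUI stand-in, and feature names are assumptions for illustration:

```python
def on_motion_event(event: dict, vehicle_ui) -> None:
    """React to a motion sensor 624A-B event: if a passenger's detected
    trajectory heads toward driver controls, hide/lock those features
    and alert the user 216."""
    if event["source_zone"] != "zone A 512A" and \
            event["anticipated_target"] == "vehicle controls":
        vehicle_ui.lock_features(["console icons", "head unit controls"])
        vehicle_ui.alert_driver("Passenger reaching for vehicle controls")

class DemoUI:  # stand-in for the head unit / device 212, 248
    def lock_features(self, names): print("locked:", names)
    def alert_driver(self, msg): print("alert:", msg)

on_motion_event({"source_zone": "zone D 512D",
                 "anticipated_target": "vehicle controls"}, DemoUI())
```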
  • the interior sound receivers 692A-B may include, but are not limited to, microphones and other types of acoustic-to-electric transducers or sensors.
  • the interior sound receivers 692A-B may be configured to receive and convert sound waves into an equivalent analog or digital signal.
  • the interior sound receivers 692A-B may serve to determine one or more locations associated with various sounds in the vehicle 104. The location of the sounds may be determined based on a comparison of volume levels, intensity, and the like, between sounds detected by two or more interior sound receivers 692A-B. For instance, a first interior sound receiver 692A may be located in a first area of the vehicle 104 and a second interior sound receiver 692B may be located in a second area of the vehicle 104.
  • if a sound is detected at a higher intensity by the second interior sound receiver 692B than by the first interior sound receiver 692A, the sound may be determined to be closer to the second area of the vehicle 104.
  • the number of sound receivers used in a vehicle 104 may be increased (e.g., more than two, etc.) to increase measurement accuracy surrounding sound detection and location, or source, of the sound (e.g., via triangulation, etc.).
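To illustrate the volume-comparison idea (the decibel values are made up for the example), a minimal two-receiver comparison might be:

```python
def closer_area(level_a_db: float, level_b_db: float) -> str:
    """Compare sound levels detected by interior sound receiver 692A
    (first area) and 692B (second area); the louder reading suggests
    the sound source is closer to that receiver's area."""
    if level_a_db > level_b_db:
        return "first area (nearer receiver 692A)"
    if level_b_db > level_a_db:
        return "second area (nearer receiver 692B)"
    return "equidistant (or add receivers for triangulation)"

print(closer_area(52.0, 61.5))  # sound judged closer to the second area
```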
  • Seat sensors 677 may be included in the vehicle 104.
  • the seat sensors 677 may be associated with each seat and/or zone 512 in the vehicle 104.
  • the seat sensors 677 may provide health telemetry and/or identification via one or more of load cells, force transducers, weight sensors, moisture detection sensors, electrical conductivity/resistance sensors, and the like.
  • the seat sensors 677 may determine that a user 216 weighs 180 lbs. This value may be compared to user data stored in memory to determine whether a match exists between the detected weight and a user 216 associated with the vehicle 104. In another example, if the seat sensors 677 detect that a user 216 is fidgeting, or moving, in a seemingly uncontrollable manner, the system may determine that the user 216 has suffered a nervous and/or muscular system issue (e.g., seizure, etc.).
  • the vehicle control system 204 may then cause the vehicle 104 to slow down; in addition or alternatively, the automobile controller 8104 (described below) can safely take control of the vehicle 104 and bring the vehicle 104 to a stop in a safe location (e.g., out of traffic, off a freeway, etc.).
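A minimal sketch of the weight-matching step described above; the tolerance value and stored user data are invented for illustration:

```python
KNOWN_USERS = {"driver_1": 180.0, "teen_passenger": 120.0}  # lbs, stored in memory

def match_user_by_weight(detected_lbs: float, tolerance_lbs: float = 5.0):
    """Compare a seat sensor 677 weight reading against stored user data
    and return the matching user 216, if any."""
    for user, stored in KNOWN_USERS.items():
        if abs(detected_lbs - stored) <= tolerance_lbs:
            return user
    return None

print(match_user_by_weight(182.4))  # -> 'driver_1'
```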
  • Health telemetry and other data may be collected via the steering wheel user sensors 642.
  • the steering wheel user sensors 642 may collect heart rate, temperature, blood pressure, and the like, associated with a user 216 via at least one contact disposed on or about the steering wheel 640.
  • the safety restraint sensors 679 may be employed to determine a state associated with one or more safety restraint devices in a vehicle 104.
  • the state associated with one or more safety restraint devices may serve to indicate a force observed at the safety restraint device, a state of activity (e.g., retracted, extended, various ranges of extension and/or retraction, deployment, buckled, unbuckled, etc.), damage to the safety restraint device, and more.
  • Environmental sensors 694 including one or more of temperature, humidity, air, oxygen, carbon monoxide, smoke, and other environmental condition sensors may be used in a vehicle 104. These environmental sensors 694 may be used to collect data relating to the safety, comfort, and/or condition of the interior space 108 of the vehicle 104. Among other things, the data collected by the environmental sensors 694 may be used by the vehicle control system 204 to alter functions of a vehicle.
  • the environment may correspond to an interior space 108 of a vehicle 104 and/or specific areas 508 and/or zones 512 of the vehicle 104. It should be appreciated that an environment may correspond to a user 216.
  • a low oxygen environment may be detected by the environmental sensors 694 and associated with a user 216 who is operating the vehicle 104 in a particular zone 512.
  • at least one of the subsystems of the vehicle 104 may alter the environment, especially in the particular zone 512, to increase the amount of oxygen in the zone 512.
  • the environmental sensors 694 may be used to report conditions associated with a vehicle (e.g., fire detected, low oxygen, low humidity, high carbon monoxide, etc.). The conditions may be reported to a user 216 and/or a third party via at least one communications module as provided herein.
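For illustration (the thresholds below are invented; real limits would come from applicable safety standards), a check over environmental sensor 694 readings might look like:

```python
# Hypothetical alert thresholds for environmental sensors 694.
THRESHOLDS = {"oxygen_pct": (19.5, None),   # alert below 19.5 %
              "co_ppm":     (None, 35.0),   # alert above 35 ppm
              "temp_c":     (0.0, 45.0)}

def check_environment(readings: dict) -> list:
    """Return the list of out-of-range conditions to report to a user
    216 and/or a third party via a communications module."""
    alerts = []
    for name, value in readings.items():
        low, high = THRESHOLDS.get(name, (None, None))
        if low is not None and value < low:
            alerts.append(f"{name} low: {value}")
        if high is not None and value > high:
            alerts.append(f"{name} high: {value}")
    return alerts

print(check_environment({"oxygen_pct": 18.9, "co_ppm": 12.0, "temp_c": 22.0}))
```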
  • the sensors as disclosed herein may communicate with each other, with devices 212, 248, and/or with the vehicle control system 204 via the signal carrier network 224. Additionally or alternatively, the sensors disclosed herein may serve to provide data relevant to more than one category of sensor information including, but not limited to, combinations of environmental information, user information, and safety information to name a few.
  • Figs. 7A-7B show block diagrams of various sensors that may be associated with a vehicle 104. Although depicted as interior and exterior sensors, it should be appreciated that any of the one or more of the sensors shown may be used in both the interior space 108 and the exterior space of the vehicle 104. Moreover, sensors having the same symbol or name may include the same, or substantially the same, functionality as those sensors described elsewhere in the present disclosure. Further, although the various sensors are depicted in conjunction with specific groups (e.g., environmental 708, 708E, user interface 712, safety 716, 716E, etc.) the sensors should not be limited to the groups in which they appear.
  • the sensors may be associated with other groups or combinations of groups and/or disassociated from one or more of the groups shown.
  • the sensors as disclosed herein may communicate with each other, the devices 212, 248, and/or the vehicle control system 204 via one or more communications channel(s) 356.
  • In Fig. 7A, a block diagram of an embodiment of interior sensors 340 for a vehicle 104 is provided.
  • the interior sensors 340 may be arranged into one or more groups, based at least partially on the function of the interior sensors 340.
  • the interior space 108 of a vehicle 104 may include an environmental group 708, a user interface group 712, and a safety group 716. Additionally or alternatively, there may be sensors associated with various devices inside the vehicle (e.g., devices 212, 248, smart phones, tablets, mobile computers, etc.).
  • the environmental group 708 may comprise sensors configured to collect data relating to the internal environment of a vehicle 104. It is anticipated that the environment of the vehicle 104 may be subdivided into areas 508 and zones 512 in an interior space 108 of a vehicle 104. In this case, each area 508 and/or zone 512 may include one or more of the environmental sensors. Examples of environmental sensors associated with the environmental group 708 may include, but are not limited to, oxygen/air sensors 724, temperature sensors 728, humidity sensors 732, light/photo sensors 736, and more.
  • the oxygen/air sensors 724 may be configured to detect a quality of the air in the interior space 108 of the vehicle 104 (e.g., ratios and/or types of gasses comprising the air inside the vehicle 104, dangerous gas levels, safe gas levels, etc.). Temperature sensors 728 may be configured to detect temperature readings of one or more objects, users 216, and/or areas 508 of a vehicle 104. Humidity sensors 732 may detect an amount of water vapor present in the air inside the vehicle 104. The light/photo sensors 736 can detect an amount of light present in the vehicle 104. Further, the light/photo sensors 736 may be configured to detect various levels of light intensity associated with light in the vehicle 104.
  • the user interface group 712 may comprise sensors configured to collect data relating to one or more users 216 in a vehicle 104.
  • the user interface group 712 may include sensors that are configured to collect data from users 216 in one or more areas 508 and zones 512 of the vehicle 104.
  • each area 508 and/or zone 512 of the vehicle 104 may include one or more of the sensors in the user interface group 712.
  • Examples of user interface sensors associated with the user interface group 712 may include, but are not limited to, infrared sensors 740, motion sensors 744, weight sensors 748, wireless network sensors 752, biometric sensors 756, camera (or image) sensors 760, audio sensors 764, and more.
  • Infrared sensors 740 may be used to measure IR light irradiating from at least one surface, user 216, or other object in the vehicle 104. Among other things, the infrared sensors 740 may be used to measure temperatures, form images (especially in low light conditions), identify users 216, and even detect motion in the vehicle 104.
  • the motion sensors 744 may be similar to the motion detectors 624A-B, as described in conjunction with Fig. 6B.
  • Weight sensors 748 may be employed to collect data relating to objects and/or users 216 in various areas 508 of the vehicle 104. In some cases, the weight sensors 748 may be included in the seats and/or floor of a vehicle 104.
  • the vehicle 104 may include a wireless network sensor 752.
  • This sensor 752 may be configured to detect one or more wireless network(s) inside the vehicle 104.
  • wireless networks may include, but are not limited to, wireless communications utilizing Bluetooth®, Wi-FiTM, ZigBee, IEEE 802.11, and other wireless technology standards.
  • a mobile hotspot may be detected inside the vehicle 104 via the wireless network sensor 752.
  • the vehicle 104 may determine to utilize and/or share the mobile hotspot detected via/with one or more other devices 212, 248 and/or components associated with the vehicle 104.
  • Biometric sensors 756 may be employed to identify and/or record characteristics associated with a user 216. It is anticipated that biometric sensors 756 can include at least one of image sensors, IR sensors, fingerprint readers, weight sensors, load cells, force transducers, heart rate monitors, blood pressure monitors, and the like as provided herein.
  • the camera sensors 760 may be similar to image sensors 622A-B, as described in conjunction with Fig. 6B.
  • the camera sensors may record still images, video, and/or combinations thereof.
  • the audio sensors 764 may be similar to the interior sound receivers 692A-B, as described in conjunction with Figs. 6A-6B.
  • the audio sensors may be configured to receive audio input from a user 216 of the vehicle 104.
  • the audio input from a user 216 may correspond to voice commands, conversations detected in the vehicle 104, phone calls made in the vehicle 104, and/or other audible expressions made in the vehicle 104.
  • the safety group 716 may comprise sensors configured to collect data relating to the safety of a user 216 and/or one or more components of a vehicle 104.
  • the vehicle 104 may be subdivided into areas 508 and/or zones 512 in an interior space 108 of a vehicle 104 where each area 508 and/or zone 512 may include one or more of the safety sensors provided herein.
  • Examples of safety sensors associated with the safety group 716 may include, but are not limited to, force sensors 768, mechanical motion sensors 772, orientation sensors 776, restraint sensors 780, and more.
  • the force sensors 768 may include one or more sensors inside the vehicle 104 configured to detect a force observed in the vehicle 104.
  • a force sensor 768 may include a force transducer that converts measured forces (e.g., force, weight, pressure, etc.) into output signals.
  • Mechanical motion sensors 772 may correspond to encoders, accelerometers, damped masses, and the like.
  • the mechanical motion sensors 772 may be adapted to measure the force of gravity (i.e., G-force) as observed inside the vehicle 104. Measuring the G-force observed inside a vehicle 104 can provide valuable information related to a vehicle's acceleration, deceleration, collisions, and/or forces that may have been suffered by one or more users 216 in the vehicle 104.
  • the mechanical motion sensors 772 can be located in an interior space 108 or an exterior of the vehicle 104.
  • Orientation sensors 776 can include accelerometers, gyroscopes, magnetic sensors, and the like that are configured to detect an orientation associated with the vehicle 104. Similar to the mechanical motion sensors 772, the orientation sensors 776 can be located in an interior space 108 or an exterior of the vehicle 104.
  • the restraint sensors 780 may be similar to the safety restraint sensors 679 as described in conjunction with Figs. 6A-6B. These sensors 780 may correspond to sensors associated with one or more restraint devices and/or systems in a vehicle 104. Seatbelts and airbags are examples of restraint devices and/or systems.
  • the restraint devices and/or systems may be associated with one or more sensors that are configured to detect a state of the device/system.
  • the state may include extension, engagement, retraction, disengagement, deployment, and/or other electrical or mechanical conditions associated with the device/system.
  • the associated device sensors 720 can include any sensors that are associated with a device 212, 248 in the vehicle 104.
  • typical devices 212, 248 may include smart phones, tablets, laptops, mobile computers, and the like. It is anticipated that the various sensors associated with these devices 212, 248 can be employed by the vehicle control system 204.
  • a typical smart phone can include, an image sensor, an IR sensor, audio sensor, gyroscope, accelerometer, wireless network sensor, fingerprint reader, and more. It is an aspect of the present disclosure that one or more of these associated device sensors 720 may be used by one or more subsystems of the vehicle system 200.
  • In Fig. 7B, a block diagram of an embodiment of exterior sensors 340 for a vehicle 104 is shown.
  • the exterior sensors may include sensors that are identical, or substantially similar, to those previously disclosed in conjunction with the interior sensors of Fig. 7A.
  • the exterior sensors 340 may be configured to collect data relating to one or more conditions, objects, users 216, and other events that are external to the interior space 108 of the vehicle 104.
  • the oxygen/air sensors 724 may measure a quality and/or composition of the air outside of a vehicle 104.
  • the motion sensors 744 may detect motion outside of a vehicle 104.
  • the external environmental group 708E may comprise sensors configured to collect data relating to the external environment of a vehicle 104.
  • the external environmental group 708E may include additional sensors, such as vehicle sensors 750, biological sensors 754, and wireless signal sensors 758.
  • Vehicle sensors 750 can detect vehicles that are in an environment surrounding the vehicle 104.
  • the vehicle sensors 750 may detect vehicles in a first outside area 516, a second outside area 520, and/or combinations of the first and second outside areas 516, 520.
  • the vehicle sensors 750 may include one or more of RF sensors, IR sensors, image sensors, and the like to detect vehicles, people, hazards, etc. that are in an environment exterior to the vehicle 104.
  • the vehicle sensors 750 can provide distance/directional information relating to a distance (e.g., distance from the vehicle 104 to the detected object) and/or a direction (e.g., direction of travel, etc.) associated with the detected object.
  • the biological sensors 754 may determine whether one or more biological entities (e.g., an animal, a person, a user 216, etc.) is in an external environment of the vehicle 104. Additionally or alternatively, the biological sensors 754 may provide distance information relating to a distance of the biological entity from the vehicle 104. Biological sensors 754 may include at least one of RF sensors, IR sensors, image sensors and the like that are configured to detect biological entities. For example, an IR sensor may be used to determine that an object, or biological entity, has a specific temperature, temperature pattern, or heat signature.
  • the determined heat signature may be compared to known heat signatures associated with recognized biological entities (e.g., based on shape, locations of temperature, and combinations thereof, etc.) to determine whether the heat signature is associated with a biological entity or an inanimate, or non-biological, object, as sketched below.
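A toy nearest-match sketch of that comparison; the signature encoding (mean temperature plus a coarse aspect ratio) and the distance threshold are assumptions of this example:

```python
# Hypothetical reference heat signatures: mean surface temperature (C)
# and a coarse aspect ratio of the warm region.
KNOWN_SIGNATURES = {"person": (33.0, 3.0), "animal": (35.0, 1.2)}

def classify_heat_signature(mean_temp_c, aspect_ratio, max_distance=2.0):
    """Return the closest known biological signature, or None for an
    inanimate/non-biological object."""
    best, best_d = None, float("inf")
    for label, (t_ref, a_ref) in KNOWN_SIGNATURES.items():
        d = abs(mean_temp_c - t_ref) + abs(aspect_ratio - a_ref)
        if d < best_d:
            best, best_d = label, d
    return best if best_d <= max_distance else None

print(classify_heat_signature(32.5, 2.8))  # -> 'person'
print(classify_heat_signature(15.0, 0.5))  # -> None (e.g., a mailbox)
```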
  • the wireless signal sensors 758 may include one or more sensors configured to receive wireless signals from signal sources such as Wi-FiTM hotspots, cell towers, roadside beacons, other electronic roadside devices, and satellite positioning systems.
  • the wireless signal sensors 758 may detect wireless signals from one or more of a mobile phone, mobile computer, keyless entry device, RFID device, near field communications (NFC) device, and the like.
  • the external safety group 716E may comprise sensors configured to collect data relating to the safety of a user 216 and/or one or more components of a vehicle 104.
  • Examples of safety sensors associated with the external safety group 716E may include, but are not limited to, force sensors 768, mechanical motion sensors 772, orientation sensors 776, vehicle body sensors 782, and more.
  • the exterior safety sensors 716E may be configured to collect data relating to one or more conditions, objects, vehicle components, and other events that are external to the vehicle 104.
  • the force sensors 768 in the external safety group 716E may detect and/or record force information associated with the outside of a vehicle 104. For instance, if an object strikes the exterior of the vehicle 104, the force sensors 768 from the exterior safety group 716E may determine a magnitude, location, and/or time associated with the strike.
  • the vehicle 104 may include a number of vehicle body sensors 782.
  • the vehicle body sensors 782 may be configured to measure characteristics associated with the body (e.g., body panels, components, chassis, windows, etc.) of a vehicle 104.
  • two vehicle body sensors 782, including a first body sensor and a second body sensor, may be located at some distance apart.
  • the first body sensor may be configured to send an electrical signal across the body of the vehicle 104 to the second body sensor, or vice versa.
  • the second body sensor may record a detected current, voltage, resistance, and/or combinations thereof associated with the received electrical signal. Values (e.g., current, voltage, resistance, etc.) for the sent and received electrical signal may be stored in a memory.
  • the vehicle body sensors 782 may communicate with each other, a vehicle control system 204, and/or systems of the vehicle system 200 via a communications channel 356.
  • vehicle body sensors 782 may use sound waves and/or light to perform a similar function.
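Illustrative only (the baseline value and deviation threshold are assumptions): comparing a received body-signal measurement against stored values could flag a possible change to the vehicle body:

```python
BASELINE = {"resistance_ohm": 0.85}  # stored value from a known-good body

def body_signal_anomaly(measured_ohm: float, rel_tolerance: float = 0.10) -> bool:
    """Compare the resistance seen by the second body sensor 782 for a
    signal sent by the first; a large deviation from the stored baseline
    may indicate a changed or damaged body panel or connection."""
    baseline = BASELINE["resistance_ohm"]
    return abs(measured_ohm - baseline) / baseline > rel_tolerance

print(body_signal_anomaly(0.87))  # False: within tolerance
print(body_signal_anomaly(1.40))  # True: body characteristics changed
```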
  • Fig. 8A is a block diagram of an embodiment of a media controller subsystem 348 for a vehicle 104.
  • the media controller subsystem 348 may include, but is not limited to, a media controller 804, a media processor 808, a match engine 812, an audio processor 816, a speech synthesis module 820, a network transceiver 824, a signal processing module 828, memory 832, and a language database 836.
  • the media controller subsystem 348 may be configured as a dedicated blade that implements the media-related functionality of the system 200.
  • the media controller subsystem 348 can provide voice input, voice output, library functions for multimedia, and display control for various areas 508 and/or zones 512 of the vehicle 104.
  • the media controller subsystem 348 may include a local IP address (e.g., IPv4, IPv6, combinations thereof, etc.) and even a routable, global unicast address.
  • the routable, global unicast address may allow for direct addressing of the media controller subsystem 348 for streaming data from Internet resources (e.g., cloud storage, user accounts, etc.). It is anticipated, that the media controller subsystem 348 can provide multimedia via at least one Internet connection, or wireless network communications module, associated with the vehicle 104.
  • the media controller subsystem 348 may be configured to service multiple independent clients simultaneously.
  • the media processor 808 may comprise a general purpose programmable processor or controller for executing application programming or instructions related to the media subsystem 348.
  • the media processor 808 may include multiple processor cores, and/or implement multiple virtual processors.
  • the media processor 808 may include multiple physical processors.
  • the media processor 808 may comprise a specially configured application specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like.
  • the media processor 808 generally functions to run programming code or instructions implementing various functions of the media controller 804.
  • the match engine 812 can receive input from one or more components of the vehicle system 800 and perform matching functions.
  • the match engine 812 may receive audio input provided via a microphone 886 of the system 800.
  • the audio input may be provided to the media controller subsystem 348 where the audio input can be decoded and matched, via the match engine 812, to one or more functions available to the vehicle 104. Similar matching operations may be performed by the match engine 812 relating to video input received via one or more image sensors, cameras 878, and the like.
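A minimal sketch of the matching step; the command vocabulary is invented here, and real decoding would involve speech recognition upstream of the match engine 812:

```python
# Hypothetical vocabulary mapping decoded utterances to vehicle functions.
COMMANDS = {"call home": "phone.dial(home)",
            "navigate work": "nav.route(work)",
            "play music": "media.play()"}

def match(decoded_utterance: str):
    """Match decoded audio input (e.g., from microphone 886) to one of
    the functions available to the vehicle 104; None if no match."""
    text = decoded_utterance.lower().strip()
    for phrase, function in COMMANDS.items():
        if phrase in text:
            return function
    return None

print(match("Please navigate work"))  # -> 'nav.route(work)'
```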
  • the media controller subsystem 348 may include a speech synthesis module 820 configured to provide audio output to one or more speakers 880, or audio output devices, associated with the vehicle 104.
  • the speech synthesis module 820 may be configured to provide audio output based at least partially on the matching functions performed by the match engine 812.
  • the coding/decoding, the analysis of audio input/output, and/or other operations associated with the match engine 812 and speech synthesis module 820 may be performed by the media processor 808 and/or a dedicated audio processor 816.
  • the audio processor 816 may comprise a general purpose programmable processor or controller for executing application programming or instructions related to audio processing. Further, the audio processor 816 may be similar to the media processor 808 described herein.
  • the network transceiver 824 can include any device configured to transmit and receive analog and/or digital signals.
  • the media controller subsystem 348 may utilize a network transceiver 824 in one or more communication networks associated with the vehicle 104 to receive and transmit signals via the communications channel 356.
  • the network transceiver 824 may accept requests from one or more devices 212, 248 to access the media controller subsystem 348.
  • One example of the communication network is a local-area network (LAN).
  • the functionality associated with the network transceiver 824 may be built into at least one other component of the vehicle 104 (e.g., a network interface card, communications module, etc.).
  • the signal processing module 828 may be configured to alter audio/multimedia signals received from one or more input sources (e.g., microphones 886, etc.) via the communications channel 356. Among other things, the signal processing module 828 may alter the signals received electrically, mathematically, combinations thereof, and the like.
  • the media controller 804 may also include memory 832 for use in connection with the execution of application programming or instructions by the media processor 808, and for the temporary or long term storage of program instructions and/or data.
  • the memory 832 may comprise RAM, DRAM, SDRAM, or other solid state memory.
  • the language database 836 may include the data and/or libraries for one or more languages, as are used to provide the language functionality as provided herein.
  • the language database 836 may be loaded on the media controller 804 at the point of manufacture.
  • the language database 836 can be modified, updated, and/or otherwise changed to alter the data stored therein.
  • additional languages may be supported by adding the language data to the language database 836.
  • this addition of languages can be performed via accessing administrative functions on the media controller 804 and loading the new language modules via wired (e.g., USB, etc.) or wireless communication.
  • the administrative functions may be available via a vehicle console device 248, a user device 212, 248, and/or other mobile computing device that is authorized to access administrative functions (e.g., based at least partially on the device's address, identification, etc.).
  • One or more video controllers 840 may be provided for controlling the video operation of the devices 212, 248, 882 associated with the vehicle.
  • the video controller 840 may include a display controller for controlling the operation of touch sensitive screens, including input (touch sensing) and output (display) functions.
  • Video data may include data received in a stream and unpacked by a processor and loaded into a display buffer.
  • the processor and video controller 840 can optimize the display based on the characteristics of a screen of a display device 212, 248, 882.
  • the functions of a touch screen controller may be incorporated into other components, such as a media processor 808 or display subsystem.
  • the audio controller 844 can provide control of the audio entertainment system (e.g., radio, subscription music service, multimedia entertainment, etc.), and other audio associated with the vehicle 104 (e.g., navigation systems, vehicle comfort systems, convenience systems, etc.).
  • the audio controller 844 may be configured to translate digital signals to analog signals and vice versa.
  • the audio controller 844 may include device drivers that allow the audio controller 844 to communicate with other components of the system 800 (e.g., processors 816, 808, audio I/O 874, and the like).
  • the system 800 may include a profile identification module 848 to determine whether a user profile is associated with the vehicle 104.
  • the profile identification module 848 may receive requests from a user 216, or device 212, 228, 248, to access a profile stored in a profile database 856 or profile data 252. Additionally or alternatively, the profile identification module 848 may request profile information from a user 216 and/or a device 212, 228, 248, to access a profile stored in a profile database 856 or profile data 252.
  • the profile identification module 848 may be configured to create, modify, retrieve, and/or store user profiles in the profile database 856 and/or profile data 252.
  • the profile identification module 848 may include rules for profile identification, profile information retrieval, creation, modification, and/or control of components in the system 800.
  • a user 216 may enter the vehicle 104 with a smart phone or other device 212.
  • the profile identification module 848 may determine that a user profile is associated with the user's smart phone 212.
  • the system 800 may receive information about a user 216 (e.g., from a camera 878, microphone 886, etc.), and, in response to receiving the user information, the profile identification module 848 may refer to the profile database 856 to determine whether the user information matches a user profile stored in the database 856. It is anticipated that the profile identification module 848 may communicate with the other components of the system to load one or more preferences, settings, and/or conditions based on the user profile. Further, the profile identification module 848 may be configured to control components of the system 800 based on user profile information.
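As a final illustration (the database layout and matching keys are invented for this example), the profile lookup might be sketched as:

```python
PROFILE_DB = [  # stand-in for profile database 856 / profile data 252
    {"user": "alice", "device_id": "phone-212-A", "seat_position": 3,
     "preferred_temp_c": 21.0},
]

def identify_profile(device_id=None, face_match=None):
    """Return the stored user profile matching a detected device 212 or
    a recognized occupant (e.g., from camera 878), else None."""
    for profile in PROFILE_DB:
        if device_id and profile["device_id"] == device_id:
            return profile
        if face_match and profile["user"] == face_match:
            return profile
    return None

profile = identify_profile(device_id="phone-212-A")
if profile:  # load preferences/settings based on the user profile
    print("set cabin temperature to", profile["preferred_temp_c"], "C")
```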
  • data storage 852 may be provided. Like the memory 832, the data storage 852 may comprise a solid state memory device or devices. Alternatively or in addition, the data storage 852 may comprise a hard disk drive or other random access memory. Similar to the data storage 852, the profile database 856 may comprise a solid state memory device or devices.
  • An input/output module 860 and associated ports may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices.
  • Examples of an input/output module 860 include an Ethernet port, a Universal Serial Bus (USB) port, CAN Bus, Institute of Electrical and Electronics Engineers (IEEE) 1394, or other interface. Users may bring their own devices (e.g., Bring Your Own Device (BYOD), device 212, etc.) into the vehicle 104 for use with the various systems disclosed.
  • the input/output module 860 can provide the necessary connection of one or more devices to the vehicle systems described herein.
  • a video input/output interface 864 can be included to receive and transmit video signals between the various components in the system 800.
  • the video input/output interface 864 can operate with compressed and uncompressed video signals.
  • the video input/output interface 864 can support high data rates associated with image capture devices. Additionally or alternatively, the video input/output interface 864 may convert analog video signals to digital signals.
  • the infotainment system 870 may include information media content and/or entertainment content, informational devices, entertainment devices, and the associated programming therefor.
  • the infotainment system 870 may be configured to handle the control of one or more components of the system 800 including, but in no way limited to, radio, streaming audio/video devices, audio devices 880, 882, 886, video devices 878, 882, travel devices (e.g., GPS, navigational systems, etc.), wireless communication devices, network devices, and the like.
  • the infotainment system 870 can provide the functionality associated with other infotainment features as provided herein.
  • An audio input/output interface 874 can be included to provide analog audio to an interconnected speaker 880 or other device, and to receive analog audio input from a connected microphone 886 or other device.
  • the audio input/output interface 874 may comprise an associated amplifier and analog to digital converter.
  • the devices 212, 248 can include integrated audio input/output devices 880, 886 and/or an audio jack for interconnecting an external speaker 880 or microphone 886.
  • an integrated speaker 880 and an integrated microphone 886 can be provided, to support near talk, voice commands, spoken information exchange, and/or speaker phone operations.
  • the system 800 may include devices that are part of the vehicle 104 and/or part of a device 212, 248 that is associated with the vehicle 104.
  • these devices may be configured to capture images, display images, capture sound, and present sound.
  • the system 800 may include at least one of image sensors/cameras 878, display devices 882, audio input devices/microphones 886, and audio output devices/speakers 880.
  • the cameras 878 can be included for capturing still and/or video images.
  • image sensors 878 can include a scanner or code reader.
  • An image sensor/camera 878 can include or be associated with additional elements, such as a flash or other light source.
  • the display device 882 may include an audio input device and/or an audio output device in addition to providing video functions.
  • the display device 882 may be a console, monitor, a tablet computing device, and/or some other mobile computing device.
  • Fig. 8B is a block diagram of an embodiment of a user/device interaction subsystem 817 in a vehicle system 800.
  • the user/device interaction subsystem 817 may comprise hardware and/or software that conduct various operations for or with the vehicle 104.
  • the user/device interaction subsystem 817 may include at least one user interaction subsystem 332 and device interaction subsystem 352 as previously described. These operations may include, but are not limited to, providing information to the user 216, receiving input from the user 216, and controlling the functions or operation of the vehicle 104, etc.
  • the user/device interaction subsystem 817 may include a computing system operable to conduct the operations as described herein.
  • the user/device interaction subsystem 817 can include one or more of the components and modules provided herein.
  • the user/device interaction subsystem 817 can include one or more of a video input/output interface 864, an audio input/output interface 874, a sensor module 814, a device interaction module 818, a user identification module 822, a vehicle control module 826, an environmental control module 830, and a gesture control module 834.
  • the user/device interaction subsystem 817 may be in communication with other devices, modules, and components of the system 800 via the communications channel 356.
  • the user/device interaction subsystem 817 may be configured to receive input from a user 216 and/or device via one or more components of the system.
  • a user 216 may provide input to the user/device interaction subsystem 817 via wearable devices 802, 806, 810, video input (e.g., via at least one image sensor/camera 878, etc.) audio input (e.g., via the microphone, audio input source, etc.), gestures (e.g., via at least one image sensor 878, motion sensor 888, etc.), device input (e.g., via a device 212, 248 associated with the user, etc.), combinations thereof, and the like.
  • the wearable devices 802, 806, 810 can include heart rate monitors, blood pressure monitors, glucose monitors, pedometers, movement sensors, wearable computers, and the like. Examples of wearable computers may be worn by a user 216 and configured to measure user activity, determine energy spent based on the measured activity, track user sleep habits, determine user oxygen levels, monitor heart rate, provide alarm functions, and more. It is anticipated that the wearable devices 802, 806, 810 can communicate with the user/device interaction subsystem 817 via wireless communications channels or direct connection (e.g., where the device docks, or connects, with a USB port or similar interface of the vehicle 104).
  • a sensor module 814 may be configured to receive and/or interpret input provided by one or more sensors in the vehicle 104.
  • the sensors may be associated with one or more user devices (e.g., wearable devices 802, 806, 810, smart phones 212, mobile computing devices 212, 248, and the like).
  • the sensors may be associated with the vehicle 104, as described in conjunction with Figs. 6A-7B.
  • the device interaction module 818 may communicate with the various devices as provided herein.
  • the device interaction module 818 can provide content, information, data, and/or media associated with the various subsystems of the vehicle system 800 to one or more devices 212, 248, 802, 806, 810, 882, etc. Additionally or alternatively, the device interaction module 818 may receive content, information, data, and/or media associated with the various devices provided herein.
  • the user identification module 822 may be configured to identify a user 216 associated with the vehicle 104. The identification may be based on user profile information that is stored in profile data 252. For instance, the user identification module 822 may receive characteristic information about a user 216 via a device, a camera, and/or some other input. The received characteristics may be compared to data stored in the profile data 252. Where the characteristics match, the user 216 is identified. As can be appreciated, where the characteristics do not match a user profile, the user identification module 822 may communicate with other subsystems in the vehicle 104 to obtain and/or record profile information about the user 216. This information may be stored in a memory and/or the profile data storage 252.
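  • As a hypothetical illustration of the identify-or-enroll flow described above, reusing the identify_profile sketch given earlier (all names remain assumptions):

      def identify_or_enroll(characteristics, profile_data):
          """Identify the user 216 from received characteristics, or
          record a new profile when no stored profile matches."""
          profile = identify_profile(characteristics, profile_data)
          if profile is not None:
              return profile                  # characteristics matched
          new_profile = dict(characteristics) # no match: enroll the user
          profile_data.append(new_profile)
          return new_profile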
  • the vehicle control module 826 may be configured to control settings, features, and/or the functionality of a vehicle 104.
  • the vehicle control module 826 can communicate with the vehicle control system 204 to control critical functions (e.g., driving system controls, braking, accelerating, etc.) and/or noncritical functions (e.g., driving signals, indicator/hazard lights, mirror controls, window actuation, etc.) based at least partially on user/device input received by the user/device interaction subsystem 817.
  • the environmental control module 830 may be configured to control settings, features, and/or other conditions associated with the environment, especially the interior environment, of a vehicle 104.
  • the environmental control module 830 may communicate with the climate control system (e.g., changing cabin temperatures, fan speeds, air direction, etc.), the oxygen and/or air quality control system, and the like.
  • the environmental control module 830 may control the environment based at least partially on user/device input received by the user/device interaction subsystem 817.
  • the gesture control module 834 is configured to interpret gestures provided by a user 216 in the vehicle 104.
  • the gesture control module 834 may provide control signals to one or more of the vehicle systems 300 disclosed herein.
  • a user 216 may provide gestures to control the environment, critical and/or noncritical vehicle functions, the infotainment system, communications, networking, and more.
  • gestures may be provided by a user 216 and detected via one or more of the sensors as described in conjunction with Figs. 6B-7A.
  • one or more motion sensors 888 may receive gesture input from a user 216 and provide the gesture input to the gesture control module 834.
  • the gesture input is interpreted by the gesture control module 834. This interpretation may include comparing the gesture input to gestures stored in a memory.
  • the gestures stored in memory may include one or more functions and/or controls mapped to specific gestures. When a match is determined between the detected gesture input and the stored gesture information, the gesture control module 834 can provide a control signal to any of the systems/subsystems as disclosed herein.
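  • A minimal sketch of the comparison step above, assuming gestures are reduced to numeric feature vectors; the distance metric and tolerance are illustrative assumptions, not values from this disclosure:

      import math

      def match_gesture(observed, stored_gestures, tolerance=0.15):
          """stored_gestures maps a gesture name to a pair of
          (template_vector, control_signal) as stored in memory."""
          for name, (template, control_signal) in stored_gestures.items():
              # A match is a template within the tolerance distance.
              if math.dist(observed, template) <= tolerance:
                  return control_signal   # dispatched to a system/subsystem
          return None                     # no stored gesture matched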
  • Fig. 8C illustrates a GPS/Navigation subsystem(s) 336.
  • the Navigation subsystem(s) 336 can be any present or future-built navigation system that may use location data, for example, from the Global Positioning System (GPS), to provide navigation information or control the vehicle 104.
  • the Navigation subsystem(s) 336 can include several components or modules, such as, one or more of, but not limited to, a GPS Antenna/receiver 892, a location module 896, a maps database 8100, an automobile controller 8104, a vehicle systems transceiver 8108, a traffic controller 8112, a network traffic transceiver 8116, a vehicle-to-vehicle transceiver 8120, a traffic information database 8124, etc.
  • the several components or modules 892-8124 may be hardware, software, firmware, computer readable media, or combinations thereof.
  • a GPS Antenna/receiver 892 can be any antenna, GPS puck, and/or receiver capable of receiving signals from a GPS satellite or other navigation system, as mentioned hereinbefore.
  • the signals may be demodulated, converted, interpreted, etc. by the GPS Antenna/receiver 892 and provided to the location module 896.
  • Antenna/receiver 892 may convert the time signals from the GPS system and provide a location (e.g., coordinates on a map) to the location module 896.
  • the location module 896 can interpret the time signals into coordinates or other location information.
  • the location module 896 can be the controller of the satellite navigation system designed for use in automobiles.
  • the location module 896 can acquire position data, as from the GPS Antenna/receiver 892, to locate the user or vehicle 104 on a road in the unit's map database 8100. Using the road database 8100, the location module 896 can give directions to other locations along roads also in the database 8100.
  • the location module 896 may apply dead reckoning to estimate distance data from sensors 242 including one or more of, but not limited to, a speed sensor attached to the drive train of the vehicle 104, a gyroscope, an accelerometer, etc. GPS signal loss and/or multipath can occur due to urban canyons, tunnels, and other obstructions.
  • the location module 896 may use known locations of Wi-Fi hotspots, cell tower data, etc. to determine the position of the vehicle 104, such as by using time difference of arrival (TDOA) and/or frequency difference of arrival (FDOA) techniques.
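  • A minimal sketch of one dead-reckoning step of the kind described above, advancing the last known fix from the drive-train speed sensor and gyroscope of sensors 242; units and names are illustrative assumptions:

      import math

      def dead_reckon(x, y, heading, speed_mps, yaw_rate, dt):
          """Advance the last GPS fix during signal loss (tunnels,
          urban canyons) using speed and gyroscope yaw rate."""
          heading += yaw_rate * dt              # integrate heading change
          x += speed_mps * dt * math.cos(heading)
          y += speed_mps * dt * math.sin(heading)
          return x, y, heading                  # estimated position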
  • the maps database 8100 can include any hardware and/or software to store information about maps, geographical information system information, location information, etc.
  • the maps database 8100 can include any data definition or other structure to store the information.
  • the maps database 8100 can include a road database that may include one or more vector maps of areas of interest. Street names, street numbers, house numbers, and other information can be encoded as geographic coordinates so that the user can find some desired destination by street address. Points of interest (waypoints) can also be stored with their geographic coordinates. For example, a point of interest may include speed cameras, fuel stations, public parking, and "parked here" (or "you parked here") information.
  • the map database contents can be produced or updated by a server connected through a wireless system in communication with the Internet, even as the vehicle 104 is driven along existing streets, yielding an up-to-date map.
  • An automobile controller 8104 can be any hardware and/or software that can receive instructions from the location module 896 or the traffic controller 8112 and operate the vehicle 104.
  • the automobile controller 8104 receives this information and data from the sensors 242 to operate the vehicle 104 without driver input.
  • the automobile controller 8104 can drive the vehicle 104 along a route provided by the location module 896. The route may be adjusted by information sent from the traffic controller 8112. Discrete and real-time driving can occur with data from the sensors 242.
  • the automobile controller 8104 can communicate with a vehicle systems transceiver 8108.
  • the vehicle systems transceiver 8108 can be any present or future-developed device that can comprise a transmitter and/or a receiver, which may be combined and can share common circuitry or a single housing.
  • the vehicle systems transceiver 8108 may communicate or instruct one or more of the vehicle control subsystems 328.
  • the vehicle systems transceiver 8108 may send steering commands, as received from the automobile controller 8104, to an electronic steering system, to adjust the steering of the vehicle 104 in real time.
  • the automobile controller 8104 can determine the effect of the commands based on received sensor data 242 and can adjust the commands as need be.
  • the vehicle systems transceiver 8108 can also communicate with the braking system, the engine and drive train to speed or slow the car, the signals (e.g., turn signals and brake lights), the headlights, the windshield wipers, etc. Any of these communications may occur over the components or function as described in conjunction with Fig. 4.
  • a traffic controller 8112 can be any hardware and/or software that can communicate with an automated traffic system and adjust the function of the vehicle 104 based on instructions from the automated traffic system.
  • An automated traffic system is a system that manages the traffic in a given area. This automated traffic system can instruct cars to drive in certain lanes, instruct cars to raise or lower their speed, instruct a car to change their route of travel, instruct cars to communicate with other cars, etc. To perform these functions, the traffic controller 8112 may register the vehicle 104 with the automated traffic system and then provide other information including the route of travel. The automated traffic system can return registration information and any required instructions.
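  • The register-then-instruct exchange described above might be sketched as follows; the transceiver object and message fields are hypothetical assumptions for illustration:

      def register_with_traffic_system(transceiver, vehicle_id, route):
          """Register the vehicle 104 with the automated traffic system
          and collect any required instructions (lanes, speeds, reroutes)."""
          transceiver.send({"type": "register",
                            "vehicle": vehicle_id,
                            "route": route})
          reply = transceiver.receive()          # registration information
          return reply.get("registration"), reply.get("instructions", [])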
  • the communications between the automated traffic system and the traffic controller 8112 may be received and sent through a network traffic transceiver 8116.
  • the network traffic transceiver 8116 can be any present or future-developed device that can comprise a transmitter and/or a receiver, which may be combined and can share common circuitry or a single housing.
  • the network traffic transceiver 8116 may communicate with the automated traffic system using any known or future-developed protocol, standard, frequency, bandwidth range, etc.
  • the network traffic transceiver 8116 enables the sending of information between the traffic controller 8112 and the automated traffic system.
  • the traffic controller 8112 can also communicate with another vehicle, which may be in physical proximity (i.e., within range of a wireless signal), using the vehicle-to-vehicle transceiver 8120.
  • the vehicle-to-vehicle transceiver 8120 can be any present or future-developed device that can comprise a transmitter and/or a receiver, which may be combined and can share common circuitry or a single housing.
  • the vehicle-to-vehicle transceiver 8120 enables communication between the vehicle 104 and any other vehicle.
  • the vehicle 104 may receive traffic or safety information, control or be controlled by another vehicle, establish an alternative communication path to communicate with the automated traffic system, establish a node including two or more vehicles that can function as a unit, etc.
  • the vehicle-to-vehicle transceiver 8120 may communicate with the other vehicles using any known or future-developed protocol, standard, frequency, bandwidth range, etc.
  • the traffic controller 8112 can control functions of the automobile controller 8104 and communicate with the location module 896.
  • the location module 896 can provide current location information and route information that the traffic controller 8112 may then provide to the automated traffic system.
  • the traffic controller 8112 may receive route adjustments from the automated traffic system that are then sent to the location module 896 to change the route.
  • the traffic controller 8112 can also send driving instructions to the automobile controller 8104 to change the driving characteristics of the vehicle 104. For example, the traffic controller 8112 can instruct the automobile controller 8104 to accelerate or decelerate to a different speed, change lanes, or perform another driving maneuver.
  • the traffic controller 8112 can also manage vehicle-to-vehicle communications and store information about the communications or other information in the traffic information database 8124.
  • the traffic information database 8124 can be any type of database, such as relational, hierarchical, object-oriented, and/or the like.
  • the traffic information database 8124 may reside on a storage medium local to (and/or resident in) the vehicle control system 204 or in the vehicle 104.
  • the traffic information database 8124 may be adapted to store, update, and retrieve information about communications with other vehicles or any active instructions from the automated traffic system. This information may be used by the traffic controller 8112 to instruct or adjust the performance of driving maneuvers.
  • Fig. 9 illustrates an optional communications architecture where the host device 908 may include one or more routing profiles, permission modules, and rules that control how communications within the vehicle 104 are to occur.
  • This communications architecture can be used in conjunction with the routing tables, rules and permissions associated with access point 456 and optional firewall 484, or can be in lieu thereof.
  • the host device 908 acts as a mobile hot spot to one or more other devices within vehicle 104, such as, other device 1 912, other device 2 916, other device 3 920, and other device N 924.
  • one or more of the other devices 912 can communicate directly with the host device 908 which then provides Internet access to those devices 912 via the device 908.
  • the host device 908 can act as a mobile hot spot for any one or more of the other devices 912, which may not need to communicate over the network/communications buses 224/404, but could instead connect directly to the host device 908 via, for example, NFC, Bluetooth®, WiFi, or the like.
  • the device 908 can include one or more routing profiles, permissions, and rules modules, and can also act as a firewall for the various inter- and intra-vehicle communications.
  • there could be alternative host devices, such as host 904, which could also act as, for example, a co-host in association with device 908.
  • one or more of the routing profile, permission information, and rules could be shared between the co-host devices 904, 908, both of those devices being usable for Internet access for one or more of the other devices, 912-924.
  • the other devices 912-924 need not necessarily connect to one or more of host device 908 and the other device 904 via a direct communications link, but could also interface with those devices 904, 908 utilizing the network/communications buses 224/404 associated with the vehicle 104.
  • one or more of the other devices can connect to the network/communications buses 224/404 utilizing the various networks and/or buses discussed herein which would therefore enable, for example, regulation of the various communications based on the Ethernet zone that the other device 912 is associated with.
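  • A minimal sketch of the routing-profile check a host device 908 might apply before forwarding another device's traffic; the profile fields and return values are hypothetical assumptions:

      def route_request(routing_profile, device_id, destination):
          """Decide whether the host forwards an in-vehicle device's
          request to its Internet uplink."""
          rules = routing_profile.get(device_id, {})
          if not rules.get("internet_access", False):
              return "blocked"                    # no hot-spot rights
          if destination in rules.get("blocked_hosts", []):
              return "blocked"                    # firewall-style rule
          return "forwarded"                      # pass to host uplink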
  • FIG. 10 An embodiment of one or more modules that may be associated with the vehicle control system 204 may be as shown in Fig. 10.
  • the modules can include a communication subsystem interface 1008 in communication with an operating system 1004.
  • the communications may pass through a firewall 1044.
  • the firewall 1044 can be any software that can control the incoming and outgoing communications by analyzing the data packets and determining whether the packets should be allowed through the firewall, based on an applied rule set.
  • a firewall 1044 can establish a "barrier" between a trusted, secure internal network and another network (e.g., the Internet) that is not assumed to be secure and trusted.
  • the firewall 1044 may establish security zones that are implemented by running system services and/or applications in restricted user groups and accounts. A set of configuration files and callbacks may then be linked to an IP table firewall.
  • the IP table firewall can be configured to notify a custom filter application at any of the layers of the Ethernet packet.
  • the different users/group rights to access the system may include: system users, which may have exclusive rights over all device firewall rules and running software; a big-brother user, which may have access to on board device (OBD) control data, may be able to communicate with the vehicle subsystem 328, and may be able to alter the parameters in the vehicle control system 204; a dealer user, which can have rights to read OBD data for diagnostics and repairs; a dashboard user, which can have rights to launch dashboard applications and/or authenticate guest users and change their permissions to trusted/friend/family, and can read but cannot write into OBD diagnostic data; a world wide web (WWW) data user, which can have HTTP rights to respond to HTTP requests (the HTTP requests also can target different user data, but may be filtered by default user accounts); a guest user, which may have no rights; and a family/friend user, which may have rights to play media from the media subsystem 348 and/or to stream media to the media subsystem 348.
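  • The user groups above could be represented as a rights table that the firewall 1044 consults; a minimal sketch, with permission names that are illustrative assumptions:

      # Hypothetical rights table mirroring the user groups above.
      USER_RIGHTS = {
          "system": {"firewall_rules", "run_software"},
          "big_brother": {"obd_read", "obd_write", "alter_vehicle_params"},
          "dealer": {"obd_read"},
          "dashboard": {"launch_apps", "authenticate_guests", "obd_read"},
          "www_data": {"http_respond"},
          "guest": set(),
          "family_friend": {"media_play", "media_stream"},
      }

      def is_allowed(group, permission):
          """True when the group's rights include the permission."""
          return permission in USER_RIGHTS.get(group, set())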
  • the operating system 1004 can be a collection of software that manages computer hardware resources and provides common services for applications and other programs.
  • the operating system 1004 may schedule time-sharing for efficient use of the system.
  • the operating system 1004 can act as an intermediary between applications or programs and the computer hardware. Examples of operating systems that may be deployed as operating system 1004 include Android, BSD, iOS, Linux, OS X, QNX, Microsoft Windows, Windows Phone, IBM z/OS, etc.
  • the operating system 1004 can include one or more sub-modules.
  • a desktop manager 1012 can manage one or more graphical user interfaces (GUI) in a desktop environment.
  • a command-line interface (CLI) may be used if full control over the operating system (OS) 1004 is required.
  • a kernel 1028 can be a computer program that manages input/output requests from software and translates them into data processing instructions for the processor 304 and other components of the vehicle control system 204.
  • the kernel 1028 is the fundamental component of the operating system 1004 that can execute many of the functions associated with the OS 1004.
  • the kernel 1028 can include other software functions, including, but not limited to, driver(s) 1056, communication software 1052, and/or Internet Protocol software 1048.
  • a driver 1056 can be any computer program that operates or controls a particular type of device that is attached to a vehicle control system 204.
  • a driver 1056 can communicate with the device through the bus 356 or communications subsystem 1008 to which the hardware connects. When a calling program invokes a routine in the driver 1056, the driver 1056 may issue one or more commands to the device. Once the device sends data back to the driver 1056, the driver 1056 may invoke routines in the original calling program.
  • Drivers can be hardware-dependent and operating-system-specific.
  • Driver(s) 1056 can provide the interrupt handling required for any necessary asynchronous time- dependent hardware interface.
  • the IP module 1048 can conduct any IP addressing, which may include the assignment of IP addresses and associated parameters to host interfaces.
  • the address space may include networks and sub-networks.
  • the IP module 1048 can perform the designation of network or routing prefixes and may conduct IP routing, which transports packets across network boundaries. Thus, the IP module 1048 may perform all functions required for IP multicast operations.
  • the communications module 1052 may conduct all functions for communicating over other systems or using other protocols not serviced by the IP module 1048. Thus, the communications module 1052 can manage multicast operations over other busses or networks not serviced by the IP module 1048. Further, the communications module 1052 may perform or manage communications to one or more devices, systems, data stores, services, etc. that are in communication with the vehicle control system 204 or other subsystems through the firewall 1044. Thus, the communications module 1052 can conduct communications through the communication subsystem interface 1008.
  • a file system 1016 may be any data handling software that can control how data is stored and retrieved.
  • the file system 1016 can separate the stored data into individual pieces and give each piece a name, so that the pieces of data can be easily separated and identified. Each piece of data may be considered a "file".
  • the file system 1016 can construct data structure and logic rules used to manage the information and the identifiers for the information. The structure and logic rules can be considered a "file system.”
  • a device discovery daemon 1020 may be a computer program that runs as a background process that can discover new devices that connect with the network 356 or communication subsystem 1008 or devices that disconnect from the network 356 or communication subsystem 1008.
  • the device discovery daemon 1020 can ping the network 356 (the local subnet) when the vehicle 104 starts, when a vehicle door opens or closes, or upon the occurrence of other events. Additionally or alternatively, the device discovery daemon 1020 may force Bluetooth®, USB, and/or wireless detection. For each device that responds to the ping, the device discovery daemon 1020 can populate the system data 208 with device information and capabilities, using any of one or more protocols, including one or more of, but not limited to, IPv6 Hop-by-Hop Option (HOPOPT), Internet Control Message Protocol (ICMP), Internet Group Management Protocol (IGMP), Gateway-to-Gateway Protocol (GGP), Internet Protocol (IP), Stream Protocol (ST), Transmission Control Protocol (TCP), Exterior Gateway Protocol (EGP), CHAOS, User Datagram Protocol (UDP), etc.
  • the device discovery daemon 1020 can determine device capabilities based on the opened ports the device exposes. If a camera exposes port 80, then the device discovery daemon 1020 can determine that the camera is using a Hypertext Transfer Protocol (HTTP). Alternatively, if a device is supporting Universal Plug and Play (UPnP), the system data 208 can include more information, for example, a camera control universal resource locator (URL), a camera zoom URL, etc. When a scan stops, the device discovery daemon 1020 can trigger a dashboard refresh to ensure the user interface reflects the new devices on the desktop.
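  • The port-to-capability inference described above (e.g., a camera exposing port 80 treated as HTTP-capable) might be sketched as follows; the port map is an illustrative assumption:

      # Hypothetical mapping from well-known ports to capability tags.
      KNOWN_PORTS = {80: "http", 554: "rtsp", 1900: "upnp"}

      def infer_capabilities(open_ports):
          """Tag a responding device by its exposed ports before the
          daemon records it in the system data 208."""
          return {KNOWN_PORTS[p] for p in open_ports if p in KNOWN_PORTS}

      # Example: a camera exposing ports 80 and 22 is tagged HTTP-capable.
      assert infer_capabilities([80, 22]) == {"http"}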
  • a desktop manager 1012 may be a computer program that manages the user interface of the vehicle control system 204.
  • the desktop environment may be designed to be customizable and allow the definition of the desktop configuration look-and-feel for a wide range of appliances or devices from computer desktops, mobile devices, computer tablets, etc.
  • Launcher(s), panels, desktop areas, the desktop background, notifications, panes, etc. can be configured from a dashboard configuration file managed by the desktop manager 1012.
  • the graphical elements that the desktop manager 1012 controls can include launchers, the desktop, notification bars, etc.
  • the desktop may be an area of the display where the applications are running.
  • the desktop can have a custom background.
  • the desktop may be divided into two or more areas.
  • the desktop may be divided into an upper half of a display and a lower half of the display.
  • Each application can be configured to run in a portion of the desktop.
  • Extended settings can be added to the desktop configuration file, such that some objects may be displayed over the whole desktop or at a custom size outside of the divided areas.
  • the notification bar may be a part of a bar display system, which may provide notifications by displaying, for example, icons and/or pop-up windows that may be associated with sound notifications.
  • the notification mechanism can be designed for separate plug-ins, which run in separate processes and may subscribe to a system
  • the desktop manager 1012 may include a windows manager 1032, an application launcher 1036, and/or a panel launcher 1040. Each of these components can control a different aspect of the user interface.
  • the desktop manager 1012 can use a root window to create panels that can include functionality for one or more of, but not limited to:
  • the windows manager 1032 may be software that controls the placement and appearance of windows within a graphical user interface presented to the user. Generally, the windows manager 1032 can provide the desktop environment used by the vehicle control system 204. The windows manager 1032 can communicate with the kernel 1028 to interface with the graphical system that provides the user interface(s) and supports the graphics hardware, pointing devices, keyboard, touch-sensitive screens, etc.
  • the windows manager 1032 may be a tiling window manager (i.e., a window manager with an organization of the screen into mutually non-overlapping frames, as opposed to a coordinate-based stacking of overlapping objects (windows) that attempts to fully emulate the desktop metaphor).
  • the windows manager 1032 may read and store configuration files, in the system data 208, which can control the position of the application windows at precise positions.
  • An application manager 1036 can control the function of any application over the lifetime of the process.
  • the process or application can be launched from a panel launcher 1040 or from a remote console.
  • the application manager 1036 can intercept the process name and may take appropriate action to manage that process. If the process is not running, the application manager 1036 can load the process and may bring the process to a foreground in a display. The application manager 1036 may also notify the windows manager 1032 to bring the associated window(s) to a top of a window stack for the display.
  • the application manager 1036 can scan files to match the process name with the entry name provided. When a match is found, the application manager 1036 can configure the process according to a settings file.
  • the application manager 1036 may restrict an application as singleton (i.e., restricts the instantiation of a class to one object). If an application is already running and the application manager 1036 is asked to run the application again, the application manager 1036 can bring the running process to a foreground on a display. There can be a notification event exchange between the windows manager 1032 and the application manager 1036 for activating the appropriate window for the foreground process. Once an application is launched, the application may not be terminated or killed. The application can be sent to the background, except, possibly, for some applications (e.g., media player, Bluetooth®, notifications, etc.), which may be given a lowest process priority.
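  • A minimal sketch of the singleton restriction described above; the names and the foreground callback are illustrative assumptions:

      # Hypothetical singleton launch logic for the application manager.
      running = {}   # process name -> process handle

      def launch(name, load_process, bring_to_foreground):
          """Run each application at most once; re-launching an already
          running process only raises its window to the foreground."""
          if name not in running:
              running[name] = load_process(name)   # first launch: load it
          bring_to_foreground(running[name])       # always raise the window
          return running[name]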
  • some applications e.g., media player, Bluetooth®, notifications, etc.
  • the panel launcher 1040 can be a widget configured to be placed along a portion of the display.
  • the panel launcher 1040 may be built from desktop files from a desktop folder.
  • the desktop folder location can be configured by a configuration file stored in system data 208.
  • the panel launcher 1040 can allow for the launching or executing of applications or processes by receiving inputs from a user interface to launch programs.
  • a desktop plugin 1024 may be a software component that allows for
  • One or more gestures used to interface with the vehicle control system 204 may be as described in conjunction with Figs. 11A through 11K.
  • Figs. 11 A through 11H depict various graphical representations of gesture inputs that may be recognized by the devices 212, 248.
  • the gestures may be performed not only by a user's body part, such as a digit, but also by other devices, such as a stylus, that may be sensed by the contact sensing portion(s) of a screen associated with the device 212, 248.
  • gestures are interpreted differently, based on where the gestures are performed (either directly on a display or in a gesture capture region). For example, gestures in a display may be directed to a desktop or application, and gestures in a gesture capture region may be interpreted as for the system.
  • a first type of gesture is substantially stationary on a portion (e.g., a screen, a display, etc.) of a device 212, 248 for a selected length of time.
  • a circle 1128 represents a touch or other contact type received at particular location of a contact sensing portion of the screen.
  • the circle 1128 may include a border 1132, the thickness of which indicates a length of time that the contact is held substantially stationary at the contact location. For instance, a tap 1120 (or short press) has a thinner border 1132A than the border 1132B for a long press 1124 (or for a normal press).
  • the long press 1124 may involve a contact that remains substantially stationary on the screen for a longer time period than that of a tap 1120.
  • differently defined gestures may be registered depending upon the length of time that the touch remains stationary prior to contact cessation or movement on the screen.
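  • A minimal sketch of classifying a stationary contact by hold time, as described above; the 0.5 s threshold is an illustrative assumption, not a value from this disclosure:

      def classify_press(hold_seconds, moved):
          """Distinguish a tap 1120 from a long press 1124 by the time
          the contact stays stationary before cessation or movement."""
          if moved:
              return None   # handled as a drag/flick, not a press
          return "long_press" if hold_seconds >= 0.5 else "tap"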
  • a drag gesture 1100 on the screen is an initial contact (represented by circle 1128) with contact movement 1136 in a selected direction.
  • the initial contact 1128 may remain stationary on the screen for a certain amount of time represented by the border 1132.
  • the drag gesture typically requires the user to contact an icon, window, or other displayed image at a first location followed by movement of the contact in a drag direction to a new second location desired for the selected displayed image.
  • the contact movement need not be in a straight line but may have any path of movement so long as the contact is substantially continuous from the first to the second location.
  • a flick gesture 1104 on the screen is an initial contact (represented by circle 1128) with truncated contact movement 1136 (relative to a drag gesture) in a selected direction.
  • a flick may have a higher exit velocity for the last movement in the gesture compared to the drag gesture.
  • the flick gesture can, for instance, be a finger snap following initial contact.
  • a flick gesture generally does not require continual contact with the screen from the first location of a displayed image to a predetermined second location. The contacted displayed image is moved by the flick gesture in the direction of the flick gesture to the predetermined second location.
  • While both gestures commonly can move a displayed image from a first location to a second location, the temporal duration and distance of travel of the contact on the screen are generally less for a flick than for a drag gesture.
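  • The drag/flick distinction above might be sketched as a simple discriminator on exit velocity; the threshold is an illustrative assumption:

      def classify_motion(exit_velocity_px_s, flick_threshold=1000.0):
          """Truncated movement released at high velocity reads as a
          flick 1104; sustained contact to a chosen location reads as a
          drag 1100."""
          if exit_velocity_px_s >= flick_threshold:
              return "flick"
          return "drag"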
  • a pinch gesture 1108 on the screen is depicted.
  • the pinch gesture 1108 may be initiated by a first contact 1128A to the screen by, for example, a first digit and a second contact 1128B to the screen by, for example, a second digit.
  • the first and second contacts 1128A,B may be detected by a common contact sensing portion of a common screen, by different contact sensing portions of a common screen, or by different contact sensing portions of different screens.
  • the first contact 1128A is held for a first amount of time, as represented by the border 1132A.
  • the second contact 1128B is held for a second amount of time, as represented by the border 1132B.
  • the first and second amounts of time are generally substantially the same, and the first and second contacts 1128A,B generally occur substantially simultaneously.
  • the first and second contacts 1128A,B generally also include corresponding first and second contact movements 1136A,B, respectively.
  • the first and second contact movements 1136A,B are generally in opposing directions. Stated another way, the first contact movement 1136A is towards the second contact 1136B, and the second contact movement 1136B is towards the first contact 1136A.
  • the pinch gesture 1108 may be accomplished by a user's digits touching the screen in a pinching motion.
  • a spread gesture 1110 on the screen is depicted.
  • the spread gesture 1110 may be initiated by a first contact 1128A to the screen by, for example, a first digit, and a second contact 1128B to the screen by, for example, a second digit.
  • the first and second contacts 1128A,B may be detected by a common contact sensing portion of a common screen, by different contact sensing portions of a common screen, or by different contact sensing portions of different screens.
  • the first contact 1128A is held for a first amount of time, as represented by the border 1132A.
  • the second contact 1128B is held for a second amount of time, as represented by the border 1132B.
  • the first and second amounts of time are generally substantially the same, and the first and second contacts 1128A,B generally occur substantially simultaneously.
  • the first and second contacts 1128A,B generally also include corresponding first and second contact movements 1136A,B, respectively.
  • the first and second contact movements 1136A,B are generally in an opposing direction. Stated another way, the first and second contact movements 1136A,B are away from the first and second contacts 1128A,B.
  • the spread gesture 1110 may be accomplished by a user's digits touching the screen in a spreading motion.
  • the gestures described above may be combined in any manner, such as those shown by Figs. 11G and 11H, to produce a determined functional result.
  • a tap gesture 1120 is combined with a drag or flick gesture 1112 in a direction away from the tap gesture 1120.
  • in Fig. 11H, a tap gesture 1120 is combined with a drag or flick gesture 1116 in a direction towards the tap gesture 1120.
  • the functional result of receiving a gesture can vary depending on a number of factors, including a state of the vehicle 104, display, or screen of a device, a context associated with the gesture, or sensed location of the gesture, etc.
  • the state of the vehicle 104 commonly refers to one or more of a configuration of the vehicle 104, a display orientation, and user and other inputs received by the vehicle 104.
  • Context commonly refers to one or more of the particular application(s) selected by the gesture and the portion(s) of the application currently executing, whether the application is a single- or multi-screen application, and whether the application is a multi-screen application displaying one or more windows.
  • a sensed location of the gesture commonly refers to whether the sensed set(s) of gesture location coordinates are on a touch sensitive display or a gesture capture region of a device 212, 248, whether the sensed set(s) of gesture location coordinates are associated with a common or different display, or screen, or device 212, 248, and/or what portion of the gesture capture region contains the sensed set(s) of gesture location coordinates.
  • a tap when received by a touch sensitive display of a device 212, 248, can be used, for instance, to select an icon to initiate or terminate execution of a corresponding application, to maximize or minimize a window, to reorder windows in a stack, and/or to provide user input such as by keyboard display or other displayed image.
  • a drag when received by a touch sensitive display of a device 212, 248, can be used, for instance, to relocate an icon or window to a desired location within a display, to reorder a stack on a display, or to span both displays (such that the selected window occupies a portion of each display simultaneously).
  • a flick when received by a touch sensitive display of a device 212, 248 or a gesture capture region, can be used to relocate a window from a first display to a second display or to span both displays (such that the selected window occupies a portion of each display simultaneously). Unlike the drag gesture, however, the flick gesture is generally not used to move the displayed image to a specific user-selected location but to a default location that is not configurable by the user.
  • the pinch gesture, when received by a touch sensitive display or a gesture capture region of a device 212, 248, can be used to minimize or otherwise decrease the displayed area or size of a window (typically when received entirely by a common display), to switch windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (a "pop-up window" that displays the windows in the stack).
  • the spread gesture, when received by a touch sensitive display or a gesture capture region of a device 212, 248, can be used to maximize or otherwise increase the displayed area or size of a window, to switch windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (typically when received by an off-screen gesture capture region on the same or different screens).
  • the combined gestures of Fig. 11G when received by a common display capture region in a common display or screen of a device 212, 248, can be used to hold a first window location constant for a display receiving the gesture while reordering a second window location to include a window in the display receiving the gesture.
  • the combined gestures of Fig. 11H, when received by different display capture regions in a common display or screen of a device 212, 248 or in different displays or screens of one or more devices 212, 248, can be used to hold a first window location for a display receiving the tap part of the gesture while reordering a second window location to include a window in the display receiving the flick or drag gesture.
  • Gestures that may be completed in three-dimensional space and not on a touch sensitive screen or gesture capture region of a device 212, 248 may be as shown in Figs. 11I-11K.
  • the gestures may be completed in an area where a sensor, such as an optical sensor, infrared sensor, or other type of sensor, may detect the gesture.
  • the gesture 1140 in Fig. 11I may be executed by a person when the person opens their hand 1164 and moves the hand in a back and forth direction 1148 as a gesture 1140 to complete some function with the vehicle 104.
  • gesture 1140 may change the station of the radio in the vehicle 104.
  • the sensors 242 may both determine the configuration of the hand 1164 and the vector of the movement. The vector and hand configuration can be interpreted to mean certain things to the vehicle control system 204 and produce different results.
  • a user may configure their hand 1164 to extend two fingers and move the hand 1164 in an up and down operation 1156.
  • This gesture 1152 may control the volume of the radio or some other function.
  • this gesture 1152 may be configured to place the vehicle in a "valet" mode to, among other things, restrict access to certain features associated with the vehicle.
  • the sensors 242 may determine how the person has configured their hand 1164, and the vector of the movement.
  • a user may extend their middle three fingers at an angle that is substantially 45° from straight vertical and circle the hand in a counter-clockwise motion 1166.
  • This gesture 1160 may cause the automobile to change the heat setting or do some other function.
  • the configurations of the hand and the types of movement are variable.
  • the user may configure the hand 1164 in any way imaginable and may also move that hand 1164 in any direction with any vector in three- dimensional space.
  • the gestures 1140, 1152, 1160 may occur in a predetermined volume of space within the vehicle 104.
  • a sensor may be configured to identify such gestures 1140, 1152, 1160 between the front passenger's and front driver's seats over a console area within the passenger compartment of the vehicle 104.
  • the gestures 1140, 1152, 1160 may be made within area 1 508A between zones A 512A and B 512B. However, there may be other areas 508 where a user may use certain gestures, where sensors 242 may be able to determine a certain function is desired.
  • Gestures that may be similar but used in different areas within the vehicle 104 may cause different functions to be performed.
  • the gesture 1140 in Fig. 11I, if used in zone E 512E, may change the heat provided in zone E 512E, but may change the station of a radio if used in zone A 512A and/or zone B 512B.
  • the gestures may be made with other body parts or, for example, different expressions of a person's face and may be used to control functions in the vehicle 104.
  • the user may use two hands in some circumstances or do other types of physical movements that can cause different reactions in the vehicle 104.
  • Figs. 12A-12D show various embodiments of a data structure 1200 to store different settings.
  • the data structure 1200 may include one or more of data files or data objects 1204, 1250, 1270, 1280.
  • the data structure 1200 may represent different types of databases or data storage, for example, object-oriented databases, flat file data structures, relational databases, or other types of data storage arrangements.
  • Embodiments of the data structure 1200 disclosed herein may be separate, combined, and/or distributed.
  • the data file 1204 may include several portions 1208-1242 representing different types of data. Each of these types of data may be associated with a user, as shown in portion 1208.
  • the user can be any person that uses or rides within the vehicle or conveyance 104.
  • the user may be identified in portion 1212.
  • the user may include a set of one or more features that may identify the user. These features may be the physical characteristics of the person that may be identified by facial recognition or some other type of system. In other situations, the user may provide a unique code to the vehicle control system 204 or provide some other type of data that allows the vehicle control system 204 to identify the user.
  • the features or characteristics of the user are then stored in portion 1212.
  • Each user may have a different set of settings for each area 508 and/or each zone 512 within the vehicle 104.
  • each set of settings may also be associated with a predetermined zone 512 or area 508.
  • the zone 512 is stored in portion 1220, and the area 508 is stored in portion 1216.
  • One or more settings may be stored in portion 1224. These settings 1224 may be the configurations of different functions within the vehicle 104 that are specified by or for that user. For example, the settings 1224 may be the position of a seat, the position of a steering wheel, the position of accelerator and/or brake pedals, positions of mirrors, a heating/cooling setting, a radio setting, a cruise control setting, or some other type of setting associated with the vehicle 104. Further, in vehicles adapted to have a configurable heads-up display, dash, or console, the settings 1224 may also provide for how that heads-up display, dash, or console is configured for this particular user.
  • Each setting 1224 may be associated with a different area 508 or zone 512.
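  • One hypothetical in-memory shape for the per-user, per-area, per-zone settings described above; the field names and values are illustrative assumptions, not the patent's storage format:

      # Hypothetical representation of data file 1204; keys mirror the
      # portions described above (1208, 1212, 1224).
      profile_1204 = {
          "user": "driver_1",                       # portion 1208
          "features": {"face_id": "f3a9"},          # portion 1212
          "settings": {                             # portion 1224, keyed
              ("area_1", "zone_A"): {"seat": 3,     # by area 508/zone 512
                                     "radio": "98.5 FM"},
              ("area_2", "zone_E"): {"temperature_c": 21},
          },
      }

      def setting_for(profile, area, zone, key):
          """Look up one setting for a given area 508 and zone 512."""
          return profile["settings"].get((area, zone), {}).get(key)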
  • the sensors 242 within the vehicle 104 may be able to either obtain or track health data in portion 1228.
  • Health data 1228 may include any type of physical characteristic associated with the user. For example, a heart rate, a blood pressure, a temperature, or other types of health data may be obtained and stored in portion 1228.
  • the user may have this health data tracked over a period of time to allow for statistical analysis of the user's health while operating the vehicle 104. In this way, if some function of the user's health deviates from a norm (e.g., a baseline measurement, average measurements taken over time, and the like), the vehicle 104 may be able to determine there is a problem with the person and react to that data.
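  • A minimal sketch of the deviation-from-norm check described above; the two-sigma rule and function names are illustrative assumptions:

      import statistics

      def deviates_from_norm(history, latest, sigmas=2.0):
          """Flag a health reading outside the user's tracked baseline
          (requires at least two historical samples)."""
          mean = statistics.mean(history)
          stdev = statistics.stdev(history)
          return stdev > 0 and abs(latest - mean) > sigmas * stdev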
  • One or more gestures may be stored in portion 1232.
  • the gestures used and described in conjunction with Figs. 11A through 11K may be configurable. These gestures may be determined or created by the user and stored in portion 1232.
  • a user may have different gestures for each zone 512 or area 508 within the vehicle. The gestures that do certain things while driving may do other things while in a different area 508 of the vehicle 104. Thus, the user may use a first set of gestures while driving and a second set while a passenger.
  • one or more users may share gestures, as shown in portion 1232. Each driver may have a common set of gestures that they use in zone A 512A. Each of these gestures may be determined or captured and then stored with their characteristics (e.g., vector, position of gesture, etc.) in portion 1232.
  • Safety parameters 1236 may be common operating characteristics for this driver/passenger, or for all drivers/passengers, that, if deviated from, may indicate a problem with the driver/passenger or the vehicle 104. For example, a certain route may be taken repeatedly and an average speed or mean speed may be determined. If the mean speed deviates by some number of standard deviations, a problem with the vehicle 104 or the user may be determined. In another example, the health characteristics or driving experience of the user may be determined.
  • the vehicle control system 204 may determine that the safety parameter includes the user's face or head being within this certain portion of the vehicle interior space. If the user's head deviates from that interior space for some amount of time, the vehicle control system 204 can determine that something is wrong with the driver and change the function or operation of the vehicle 104 to assist the driver. This may happen, for example, when a user falls asleep at the wheel.
  • the vehicle control system 204 can determine that the driver has fallen asleep and may take control of the operation of the vehicle 104, and the automobile controller 8104 may steer the vehicle 104 to the side of the road. In other examples, if the user's reaction time is too slow or some other safety parameter is not nominal, the vehicle control system 204 may determine that the user is inebriated or having some other medical problem. The vehicle control system 204 may then assume control of the vehicle to ensure that the driver is safe.
  • Information corresponding to a user and/or a user profile may be stored in the profile information portion 1238.
  • the profile information 1238 may include data relating to at least one of current data, historical data, a user preference, user habit, user routine, observation, location data (e.g., programmed and/or requested destinations, locations of parking, routes traveled, average driving time, etc.), social media connections, contacts, brand recognition (e.g., determined via one or more sensors associated with the vehicle 104, a device 212, 248, etc.), audible recording data, text data, email data, political affiliation, preferred retail locations/sites (e.g., physical locations, web-based locations, etc.), recent purchases, behavior associated with the aforementioned data, and the like.
  • the data in the profile information portion 1238 may be stored in one or more of the data structures 1200 provided herein. As can be appreciated, these one or more data structures may be stored in one or more memory locations. Examples of various memory locations are described in conjunction with Fig. 2.
  • One or more additional data fields may be stored in the linked data portion 1242 as data and/or locations of data.
  • the linked data 1242 may include at least one of pointers, addresses, location identification, data source information, and other information corresponding to additional data associated with the data structure 1200.
  • the linked data portion 1242 may refer to data stored outside of a particular data structure 1200.
  • the linked data portion 1242 may include a link/locator to the external data. Continuing this example, the link/locator may be resolved (e.g., via one or more of the methods and/or systems provided herein, etc.) to access the data stored outside of the data structure 1200.
  • the linked data portion 1242 may include information configured to link the data objects 1204 to other data files or data objects 1250, 1270, 1280.
  • the data object 1204 relating to a user may be linked to at least one of a device data object 1250, a vehicle system data object 1270, and a vehicle data object 1280, to name a few.
  • FIG. 12B An embodiment of a data structure 1200 to store information associated with one or more devices is shown in Fig. 12B.
  • the data file 1250 may include several portions 1216-1262 representing different types of data. Each of these types of data may be associated with a device, as shown in portion 1252.
  • the device may be any device that is associated with the vehicle 104.
  • a device may be associated with a vehicle 104 when that device is physically located within the interior space 108 of the vehicle 104.
  • a device may be associated with a vehicle 104 when the device registers with the vehicle 104. Registration may include pairing the device with the vehicle 104 and/or one or more of the vehicle systems (e.g., as provided in Fig. 3). In some cases, the registration of a device with a vehicle 104 may be performed manually and/or automatically.
  • An example of automatic registration may include detecting, via one or more of the vehicle systems, that a device is inside the vehicle 104. Upon detecting that the device is inside the vehicle 104, the vehicle system may identify the device and determine whether the device is or should be registered. Registration may be performed outside of a vehicle 104 via providing a unique code to the vehicle 104 and/or at least one of the vehicle systems.
  • The device may be identified in portion 1256. Among other things, the device identification may be based on the hardware associated with the device (e.g., Media Access Control (MAC) address, Burned-In Address (BIA), Ethernet Hardware Address (EHA), physical address, hardware address, and the like).
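To make the identification-and-registration flow concrete, here is a small sketch keyed on the device hardware (MAC) address mentioned above. The registry layout, helper name, and example addresses are hypothetical stand-ins.

```python
# Registration records as might be held in portion 1262 (layout assumed).
registrations = {
    "a4:5e:60:f2:10:9b": {"code": "REG-1234", "expires": "2015-01-01"},
}

def register_if_needed(mac_address: str) -> dict:
    """Identify a detected device by its hardware address and register it
    with the vehicle if no prior registration exists."""
    record = registrations.get(mac_address)
    if record is None:
        # Automatic registration: pair the device with the vehicle and
        # store a new registration record for later reference.
        record = {"code": "REG-NEW", "expires": None}
        registrations[mac_address] = record
    return record

print(register_if_needed("a4:5e:60:f2:10:9b"))  # already registered
print(register_if_needed("00:11:22:33:44:55"))  # newly registered
```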
  • a device may be associated with one or more users.
  • a tablet and/or graphical user interface (GUI) associated with the vehicle 104 may be used by multiple members of a family.
  • the GUI may be located in a particular area 508 and/or zone 512 of the vehicle 104.
  • the device may include various settings, features, priorities, capabilities, and the like, based on an identification of the family member.
  • the user may be identified in portion 1254.
  • the user identification portion 1254 may include a set of one or more features that may identify a particular user.
  • These features may be the physical characteristics of the person that may be identified by facial recognition, or some other type of system, associated with the device and/or the vehicle 104.
  • the user may provide a unique code to the device, or provide some other type of data, that allows the device to identify the user. The features or characteristics of the user are then stored in portion 1254.
  • Each device identified in the device identification portion 1256 may have a different set of settings for each area 508 and/or each zone 512, and/or each user of the device. Thus, each set of settings may also be associated with a predetermined zone 512, area 508, and/or user.
  • the zone 512 is stored in portion 1220 and the area 508 is stored in portion 1216.
  • One or more settings may be stored in portion 1224. These settings 1224 may be similar and/or identical to those previously described. Further, the settings 1224 may also provide for how a device is configured for a particular user. Each setting 1224 may be associated with a different area 508 or zone 512. Thus, there may be more restrictive settings 1224 (e.g., restricted multimedia, texting, limited access to device functions, and the like) for the device when the user is the driver and in zone A 512A of area 1 508A.
  • the settings 1224 may provide unrestricted access to one or more features of the device (e.g., allowing texting, multimedia, etc.).
  • the capabilities of a device may be stored in portion 1258.
  • Examples of device capabilities may include, but are not limited to, a communications ability (e.g., via wireless network, EDGE, 3G, 4G, LTE, wired, Bluetooth®, Near Field Communications (NFC), Infrared (IR), etc.), hardware associated with the device (e.g., cameras, gyroscopes, accelerometers, touch interface, processor, memory, display, etc.), software (e.g., installed, available, revision, release date, etc.), firmware (e.g., type, revision, etc.), operating system, system status, and the like.
  • the various capabilities associated with a device may be controlled by one or more of the vehicle systems provided herein. Among other things, this control allows the vehicle 104 to leverage the power and features of various devices to collect, transmit, and/or receive data.
  • One or more priorities may be stored in portion 1260.
  • the priority may correspond to a value, or combination of values, configured to determine how a device interacts with the vehicle 104 and/or its various systems.
  • the priority may be based on a location of the device (e.g., as stored in portions 1216, 1220).
  • a default priority can be associated with each area 508 and/or zone 512 of a vehicle 104.
  • the default priority associated with a device found in zone 1 512A of area 1 508A (e.g., a vehicle operator position) may be higher than the default priorities associated with devices found in other areas 508 and/or zones 512 of the vehicle 104.
  • the vehicle 104 may determine that, although other devices are found in the vehicle, the device, having the highest priority, controls features associated with the vehicle 104. These features may include vehicle control features, critical and/or non-critical systems, communications, and the like.
  • the priority may be based on a particular user associated with the device.
  • the priority may be used to determine which device will control a particular signal in the event of a conflict.
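A hedged sketch of how the priorities in portion 1260 might decide such a conflict follows; the zone priority values and the additive tie-breaking rule are assumptions made for illustration.

```python
# Assumed default priorities per (area, zone); the operator position wins.
DEFAULT_ZONE_PRIORITY = {
    ("area_1", "zone_1"): 100,   # vehicle operator position
    ("area_1", "zone_2"): 50,
    ("area_2", "zone_1"): 25,
}

def controlling_device(devices):
    """Given (device_id, area, zone, user_priority) tuples, return the
    device that controls a contested vehicle feature."""
    def priority(dev):
        _, area, zone, user_priority = dev
        return DEFAULT_ZONE_PRIORITY.get((area, zone), 0) + user_priority
    return max(devices, key=priority)

winner = controlling_device([
    ("drivers_phone", "area_1", "zone_1", 10),
    ("kids_tablet", "area_2", "zone_1", 5),
])
print(winner[0])  # -> drivers_phone: highest priority controls the feature
```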
  • Registration data may be stored in portion 1262.
  • data related to the registration may be stored in the registration data portion 1262.
  • data may include, but is not limited to, registration information, registration codes, initial registration time, expiration of registration, registration timers, and the like.
  • one or more systems of the vehicle 104 may refer to the registration data portion 1262 to determine whether a device has been previously registered with the vehicle 104. As shown in Fig. 12B, User 4 of Device 2 has not been registered. In this case, the registration data field 1262, for this user, may be empty, contain a null value, or other information/indication that there is no current registration information associated with the user.
  • the data structure 1200 may include a profile information portion 1238 and/or a linked data portion 1242.
  • While the profile information portion 1238 and/or the linked data portion 1242 may include different information from that described above, it should be appreciated that the portions 1238, 1242 may be similar, or identical, to those as previously disclosed.
  • An embodiment of a data structure 1200 to store information associated with one or more vehicle systems is shown in Fig. 12C.
  • the data file 1270 may include several portions 1216-1279 representing different types of data. Each of these types of data may be associated with a vehicle system, as shown in portion 1272.
  • the vehicle systems may be any system and/or subsystem that is associated with the vehicle 104. Examples of various systems are described in conjunction with Fig. 3 and other related figures (e.g., systems 324-352, etc.).
  • One example of a system associated with the vehicle 104 is the vehicle control system 204.
  • Other systems may include communications subsystems 344, vehicle subsystems 328, and media subsystems 348, to name a few. It should be appreciated that the various systems may be associated with the interior space 108 and/or the exterior of the vehicle 104.
  • Each system may include one or more components.
  • the components may be identified in portion 1274. Identification of the one or more components may be based on hardware associated with the component. This identification may include hardware addresses similar to those described in conjunction with the devices of Fig. 12B.
  • a component can be identified by one or more signals sent via the component.
  • signals may include an Internet Protocol (IP), or similar, address as part of the signal.
  • IP Internet Protocol
  • the signal may identify the component sending the signal via one or more of a header, a footer, a payload, and/or an identifier associated with the signal (e.g., a packet of a signal, etc.).
  • Each system and/or component may include priority type information in portion 1276.
  • the priority type information stored in portion 1276 may be used by the various methods and systems provided herein to differentiate between critical and non-critical systems.
  • critical systems may correspond to those systems used to control the vehicle 104, such as steering control, engine control, throttle control, braking control, and/or navigation informational control (e.g., speed measurement, fuel measurement, etc.).
  • Non-critical systems may include other systems that are not directly related to the control of the vehicle 104.
  • non- critical systems may include media presentation, wireless communications, comfort settings systems (e.g., climate control, seat position, seat warmers, etc.), and the like.
  • the priority type of a system may change (e.g., from critical to non-critical, from non-critical to critical, etc.) depending on the scenario.
  • Although the interior climate control system may be classified as a non-critical system at a first point in time, it may be subsequently classified as a critical system when a temperature inside/outside of the vehicle 104 is measured at a dangerous level (e.g., subzero Fahrenheit, greater than 90 degrees Fahrenheit, etc.).
  • the priority type may be associated with temperature conditions, air quality, times of the day, condition of the vehicle 104, and the like.
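The temperature-driven reclassification described above reduces to a small rule. The function below is an illustrative assumption; only the example threshold values (subzero, above 90 degrees Fahrenheit) come from the text.

```python
def climate_control_priority_type(measured_temp_f: float) -> str:
    """Classify the climate control system as critical when the measured
    temperature reaches a dangerous level, otherwise non-critical."""
    if measured_temp_f <= 0 or measured_temp_f > 90:
        return "critical"
    return "non-critical"

assert climate_control_priority_type(72.0) == "non-critical"
assert climate_control_priority_type(95.0) == "critical"
```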
  • Each system may be associated with a particular area 508 and/or zone 512 of a vehicle 104.
  • the location of a system may be used to assess a state of the system and/or provide how the system interacts with one or more users of the vehicle 104.
  • each system may have a different set of settings for each area 508 and/or each zone 512, and/or each user of the system.
  • each set of settings may also be associated with a predetermined zone 512, area 508, system, and/or user.
  • the zone 512 is stored in portion 1220 and the area 508 is stored in portion 1216.
  • One or more settings may be stored in portion 1224. These settings 1224 may be similar and/or identical to those previously described. Further, the settings 1224 may also provide for how a system is configured for a particular user. Each setting 1224 may be associated with a different area 508 or zone 512. For instance, a climate control system may be associated with more than one area 508 and/or zone 512. As such, a first user seated in zone 1 512A of area 1 508A may store settings related to the climate control of that zone 512A that are different from other users and/or zones 512 of the vehicle 104. Optionally, the settings may not be dependent on a user. For instance, specific areas 508 and/or zones 512 of a vehicle 104 may include different, default, or the same settings based on the information stored in portion 1224.
  • the various systems and/or components may be able to obtain or track health status data of the systems and/or components in portion 1278.
  • the health status 1278 may include any type of information related to a state of the systems. For instance, an operational condition, manufacturing date, update status, revision information, time in operation, fault status, state of damage detected, inaccurate data reporting, and other types of component/system health status data may be obtained and stored in portion 1278.
  • Each component and/or system may be configured to communicate with users, systems, servers, vehicles, third parties, and/or other endpoints via one or more communication type.
  • At least one communication ability and/or type associated with a system may be stored in the communication type portion 1279.
  • communication types contained in this portion 1279 may be ordered in a preferential order of communication types.
  • a system may be configured to preferably communicate via a wired communication protocol over one or more wired communication channels (e.g., due to information transfer speeds, reliability, and the like). However, in this instance, if the one or more wired communication channels fail, the system may transfer information via an alternative communication protocol and channel (e.g., a wireless communication protocol and wireless communication channel, etc.).
  • the methods and systems provided herein may take advantage of the information stored in the communication type portion 1279 to open available communication channels.
  • a component or system may be restricted from communicating via a particular communication type. In this event, the component or system may be forced by the vehicle control system 204 to use an alternate communication type where available, or to cease communications.
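The preferential ordering in the communication type portion 1279 and the fallback behavior just described can be sketched as follows. The `Channel` stub and the error handling are assumptions, not the disclosed implementation.

```python
class Channel:
    """Stub standing in for a wired or wireless communication channel."""
    def __init__(self, name, healthy=True):
        self.name, self.healthy = name, healthy
    def send(self, message):
        if not self.healthy:
            raise ConnectionError(self.name)

def send_with_fallback(message, channels):
    """Try each channel in preferential order (portion 1279); fall back to
    the next communication type when one fails."""
    for channel in channels:               # most preferred first
        try:
            channel.send(message)
            return channel.name
        except ConnectionError:
            continue                       # channel failed; try alternative
    raise RuntimeError("no available communication type")

used = send_with_fallback("telemetry", [Channel("wired", healthy=False), Channel("wireless")])
print(used)  # -> wireless: the wired channel failed, so the system fell back
```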
  • the data structure 1200 may include a profile information portion 1238 and/or a linked data portion 1242.
  • While the profile information portion 1238 and/or the linked data portion 1242 may include different information from that described above, it should be appreciated that the portions 1238, 1242 may be similar, or identical, to those as previously disclosed.
  • the data file 1280 may include several portions 1216-1286 representing different types of data. Each of these types of data may be associated with a vehicle, as shown in portion 1282.
  • the vehicle 104 can be any vehicle or conveyance 104 as provided herein.
  • the vehicle 104 may be identified in portion 1282. Additionally or alternatively, the vehicle 104 may be identified by one or more systems and/or subsystems.
  • the various systems of a vehicle 104 may be identified in portion 1284. For example, various features or characteristics of the vehicle 104 and/or its systems may be stored in portion 1284.
  • the vehicle 104 may be identified via a unique code or some other type of data that allows the vehicle 104 to be identified.
  • Each system may be associated with a particular area 508 and/or zone 512 of a vehicle 104.
  • the location of a system may be used to assess a state of the system and/or provide how the system interacts with one or more users of the vehicle 104.
  • each system may have a different set of settings for each area 508 and/or each zone 512, and/or each user of the system.
  • each set of settings may also be associated with a predetermined zone 512, area 508, system, and/or user.
  • the zone 512 is stored in portion 1220 and the area 508 is stored in portion 1216.
  • One or more settings may be stored in portion 1224. These settings 1224 may be similar and/or identical to those previously described. Further, the settings 1224 may also provide for how a vehicle and/or its systems are configured for one or more users. Each setting 1224 may be associated with a different area 508 or zone 512. Optionally, the settings may not be dependent on a particular user. For instance, specific areas 508 and/or zones 512 of a vehicle 104 may include different, default, or the same settings based on the information stored in portion 1224.
  • the various systems and/or components may be able to obtain or track health status data of the systems and/or components in portion 1278.
  • the health status 1278 may include any type of information related to a state of the systems. For instance, an operational condition, manufacturing date, update status, revision information, time in operation, fault status, state of damage detected, inaccurate data reporting, and other types of component/system health status data may be obtained and stored in portion 1278.
  • the warnings data 1286 may include warnings generated by the vehicle 104, systems of the vehicle 104, manufacturer of the vehicle, federal agency, third party, and/or a user associated with the vehicle. For example, several components of the vehicle may provide health status information (e.g., stored in portion 1278) that, when considered together, may suggest that the vehicle 104 has suffered some type of damage and/or failure. Recognition of this damage and/or failure may be stored in the warnings data portion 1286.
  • the data in portion 1286 may be communicated to one or more parties (e.g., a manufacturer, maintenance facility, user, etc.).
  • a manufacturer may issue a recall notification for a specific vehicle 104, system of a vehicle 104, and/or a component of a vehicle 104. It is anticipated that the recall notification may be stored in the warning data field 1286. Continuing this example, the recall notification may then be communicated to the user of the vehicle 104 notifying the user of the recall issued by the manufacturer.
  • the data structure 1200 may include a profile information portion 1238 and/or a linked data portion 1242.
  • While the profile information portion 1238 and/or the linked data portion 1242 may include different information from that described above, it should be appreciated that the portions 1238, 1242 may be similar, or identical, to those as previously disclosed.
  • An embodiment of a method 1300 for storing settings for a user 216 associated with vehicle 104 is shown in Fig. 13. While a general order for the steps of the method 1300 is shown in Fig. 13, the method 1300 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 13. Generally, the method 1300 starts with a start operation 1304 and ends with an end operation 1336. The method 1300 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1300 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-12.
  • a person may enter the vehicle space 108.
  • One or more sensors 242 may then identify that a person is sitting within the vehicle 104, in step 1308. For example, sensors 242 in a seat may determine that some new amount of weight has been registered. The amount of weight may fall within predetermined parameters (e.g., over a threshold, in a specific range, etc.). This weight may then be determined to be a person by one or more optical or other sensors 242.
  • the vehicle control system 204 may then determine that a person is in a certain zone 512 or area 508. For example, the sensors 242 may send signals to the vehicle control system 204 that an event has occurred. This information may be sent to the vehicle control system processor 304 to determine the zone 512 and area 508 where the event occurred. Further, the vehicle control system 204 may then identify the person, in step 1312.
  • the vehicle control system 204 can receive the information from the sensors 242 and use that information to search the database 1200 that may be stored within the system data 208.
  • the sensor data may be compared to ID characteristics 1212 to determine if the person has already been identified.
  • the vehicle control system 204 may also send the characteristic data from the sensors to the communication network 224 to a server 228 to compare the sensor data to stored data 232 that may be stored in a cloud system.
  • the person's features can be compared to stored features 1212 to determine if the person in the vehicle 104 can be identified.
  • If the person can be identified, the method 1300 proceeds YES to step 1316, where that person may be identified.
  • the information associated with that person 1240 may be retrieved and provided to the vehicle control system 204 for further action.
  • If the person cannot be identified, the method 1300 proceeds NO to step 1320, where the vehicle control system 204, using an application, may create a new record in table 1200 for the user. This new record may store a user identifier and their characteristics 1212. It may also store the area 508 and zone 512 in data portions 1216 and 1220. The new record may then be capable of receiving new settings data for this particular user. In this way, the vehicle 104 can automatically identify or characterize a person so that settings may be established for the person in the vehicle 104.
  • the input module 312 may then determine if settings are to be stored, in step 1324.
  • Settings might be any configuration of the vehicle 104 that may be associated with the user. The determination may be made after receiving a user input from the user. For example, the user may make a selection on a touch sensitive display indicating that settings currently made are to be stored. In other situations, a period of time may elapse after the user has made a configuration. After determining that the user is finished making changes to the settings, based on the length of the period of time since the setting was established, the vehicle control system 204 can save the setting. Thus, the vehicle control system 204 can save settings automatically based on reaching a steady state of the settings for the user.
  • the vehicle control system 204 may then store the settings for the person, in step 1328.
  • the user interaction subsystem 332 can make a new entry for the user 1208 in data structure 1204.
  • the new entry may be either for a new user or for new settings listed in portion 1224.
  • the settings may be stored based on the area 508 and zone 512. As explained previously, the settings can be any kind of configuration of the vehicle 104 that may be associated with the user in that area 508 and the zone 512.
  • the settings may also be stored in cloud storage, in step 1332.
  • the vehicle control system 204 can send the new settings to the server 228 to be stored in storage 232. In this way, these new settings may be ported to other vehicles for the user. Further, the settings in storage system 232 may be retrieved, if local storage does not include the settings in storage system 208.
  • the settings may be stored in profile data 252.
  • the profile data 252 may be associated with one or more devices 212, 248, servers 228, vehicle control systems 204, and the like.
  • the settings in profile data 252 may be retrieved in response to conditions. For instance, the settings may be retrieved from at least one source having the profile data if local storage does not include the settings in storage system 208.
  • a user 216 may wish to transfer settings stored in profile data 252 to the system data 208. In any event, the retrieval and transfer of settings may be performed automatically via one or more devices 204, 212, 248, associated with the vehicle 104.
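Steps 1308 through 1332 of method 1300 amount to identifying the occupant and persisting settings keyed by user, area, and zone. The sketch below assumes a simple dictionary layout for both local storage (standing in for system data 208) and cloud storage (standing in for storage 232).

```python
local_store = {}   # stands in for system data 208
cloud_store = {}   # stands in for server-side storage 232

def store_settings(user_id, area, zone, settings, sync_to_cloud=True):
    """Persist settings for an identified occupant (steps 1324-1332)."""
    key = (user_id, area, zone)
    local_store[key] = settings
    if sync_to_cloud:
        # Step 1332: port the settings so other vehicles can retrieve them.
        cloud_store[key] = settings

store_settings("user_1", "area_1", "zone_1",
               {"seat_position": 4, "climate_f": 68, "radio": "101.5 FM"})
print(cloud_store)
```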
  • An embodiment of a method 1400 to configure the vehicle 104 based on stored settings is shown in Fig. 14.
  • a general order for the steps of the method 1400 is shown in Fig. 14.
  • the method 1400 starts with a start operation 1404 and ends with an end operation 1428.
  • the method 1400 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 14.
  • the method 1400 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium.
  • the method 1400 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-13.
  • the vehicle control system 204 can determine if a person is in a zone 512 or area 508, in step 1408. This determination may be made by receiving data from one or more sensors 242.
  • the vehicle 104 can use facial recognition, weight sensors, heat sensors, or other sensors to determine whether a person is occupying a certain zone 512.
  • the vehicle control system 204 can identify the person, in step 1412. The vehicle control system 204 can obtain characteristics for the person from the sensor data and compare those characteristics to the features stored in portion 1212 to identify the person.
  • the settings in portion 1224 may be retrieved by identifying the correct zone 512, area 508, and characteristics for the user.
  • the vehicle control system 204 can first determine if there are settings associated with the identified person for that zone 512 and/or area 508, in step 1416. After identifying the user by matching characteristics with the features in portion 1212, the vehicle control system 204 can determine if there are settings for the user for the area 1216 and zone 1220 the user currently occupies. If there are settings, then the vehicle control system 204 can make the determination that there are settings in portion 1224, and the vehicle control system 204 may then read and retrieve those settings, in step 1420. The settings may be then used to configure or react to the presence of the user, in step 1424.
  • these settings may be obtained to change the configuration of the vehicle 104, for example, how the position of the seats or mirrors are set, how the dash, console, or heads up display is configured, how the heat or cooling is configured, how the radio is configured, or how other different configurations are made.
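The retrieval side of method 1400 can be pictured as a lookup with a cloud fallback. The store layouts and sample values below are assumptions continuing the storage sketch above.

```python
local_store = {}   # system data 208 (empty here to force the fallback)
cloud_store = {    # server-side storage 232
    ("user_1", "area_1", "zone_1"): {"seat_position": 4, "climate_f": 68},
}

def retrieve_settings(user_id, area, zone):
    """Steps 1416-1420: look up settings for the identified occupant and
    zone, falling back to cloud storage when no local entry exists."""
    key = (user_id, area, zone)
    return local_store.get(key) or cloud_store.get(key)

settings = retrieve_settings("user_1", "area_1", "zone_1")
if settings:
    # Step 1424: configure seats, mirrors, climate, radio, and so on.
    print(f"Applying settings: {settings}")
```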
  • Embodiments of a method 1500 for storing settings in cloud storage are shown in Fig. 15.
  • a general order for the steps of the method 1500 is shown in Fig. 15.
  • the method 1500 starts with a start operation 1504 and ends with an end operation 1540.
  • the method 1500 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 15.
  • the method 1500 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium.
  • the method 1500 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-14.
  • the vehicle control system 204 can determine if a person is in a zone 512 or area 508, in step 1508. As explained previously, the vehicle control system 204 can receive vehicle sensor data from vehicle sensors 242 that show a person has occupied a zone 512 or an area 508 of the vehicle 104. Using the vehicle sensor data, the vehicle control system 204 can determine characteristics of the person, in step 1512. These characteristics are compared to the features in portion 1212 of the data structure 1204. From this comparison, the vehicle control system 204 can determine if the person is identified within the data structure 1204, in step 1516. If there is a comparison and the person can be identified, the method 1500 proceeds YES to step 1520. However, if the person cannot be identified, the method 1500 proceeds NO, to step 1524.
  • In step 1520, the person is identified in portion 1208 by the successful comparison of the characteristics and the features. It should be noted that there may be a degree of variability between the characteristics and the features in portion 1212. Thus, the comparison may not be an exact comparison but may use methods known in the art to make a statistically significant comparison between the characteristics received from the sensors 242 and the features stored in portion 1212.
  • In step 1524, the characteristics received from sensors 242 are used to characterize the person. In this way, the received characteristics may be used as an ID, in portion 1212, for a new entry for a new user in portion 1208.
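The statistically significant (rather than exact) comparison between sensed characteristics and stored features can be pictured as a tolerance-based match. The normalized-distance rule and the tolerance value below are stand-in assumptions.

```python
import math

def matches(sensor_characteristics, stored_features, tolerance=0.15):
    """Fuzzy comparison against the features in portion 1212; some
    variability is allowed, so exact equality is not required."""
    distance = math.dist(sensor_characteristics, stored_features)
    scale = math.dist(stored_features, [0.0] * len(stored_features)) or 1.0
    return distance / scale <= tolerance

# Step 1516: identified if within tolerance (YES -> step 1520);
# otherwise a new record is created (NO -> step 1524).
print(matches([0.98, 1.71, 0.42], [1.00, 1.70, 0.40]))  # True
```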
  • the user may make one or more settings for the vehicle 104.
  • the vehicle control system 204 may determine if the settings are to be stored, in step 1528. If the settings are to be stored, the method 1500 proceeds YES to step 1536. If the settings are not to be stored or if there are no settings to be stored, the method 1500 proceeds NO to step 1532. In step 1532, the vehicle control system 204 can retrieve the settings in the portion 1224 of the data structure 1204. Retrieval of the settings may be as described in conjunction with Fig. 14. If settings are to be stored, the vehicle control system 204 can send those settings to server 228 to be stored in data storage 232, in step 1536.
  • Data storage 232 acts as cloud storage that can be used to retrieve information on the settings from other vehicles or from other sources. Thus, the cloud storage 232 allows for permanent and more robust storage of user preferences for the settings of the vehicle 104.
  • An embodiment of a method 1600 for storing gestures associated with the user is shown in Fig. 16.
  • a general order for the steps of the method 1600 is shown in Fig. 16.
  • the method 1600 starts with a start operation 1604 and ends with an end operation 1640.
  • the method 1600 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 16.
  • the method 1600 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium.
  • the method 1600 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-15.
  • Vehicle control system 204 may receive sensor data from sensors 242 to determine a person is occupying a zone 512 in an area 508 of the vehicle 104, in step 1608.
  • the sensor data may provide characteristics for the person, in step 1612.
  • the vehicle control system 204 may then use the characteristics to determine if the person can be identified, in step 1616.
  • the vehicle control system 204 may compare the characteristics received from the sensors 242 to the features stored in portion 1212 of the data structure 1204.
  • In step 1620, the person may be identified by the vehicle control system 204.
  • the vehicle control system 204 can characterize the person in step 1624 by establishing a new record in data structure 1204 using the characteristics, received from the sensors 242, for the features in portion 1212.
  • the vehicle control system 204 may determine if gestures are to be stored and associated with the user, in step 1628.
  • the vehicle control system 204 may receive user input on a touch sensitive display or some other type of gesture capture region which acknowledges that the user wishes to store one or more gestures.
  • the user may create their own gestures, such as those described in conjunction with Figs. 11A-11K. These gestures may then be characterized and stored in data structure 1204. If there are gestures to be stored, the method 1600 proceeds YES to step 1636. If gestures are not to be stored, the method 1600 may proceed NO to step 1632.
  • the vehicle control system 204 can retrieve current gestures from portion 1232, which are associated with user 1240. These gestures may then be used to configure how the vehicle 104 will react if a gesture is received. If gestures are to be stored, the vehicle control system 204 may store characteristics, in step 1636, as received from sensors 242 or from one or more user interface inputs. These characteristics may then be used to create the stored gestures 1232, in data structure 1204. The characteristics may include what the gesture looks like and what effect the gesture should have. This information may then be used to change the configuration or operation of the vehicle 104 based on the gesture if it is received at a later time.
  • An embodiment of a method 1700 for receiving a gesture and configuring the vehicle 104 based on the gesture may be as provided in Fig. 17.
  • a general order for the steps of the method 1700 is shown in Fig. 17.
  • the method 1700 starts with a start operation 1704 and ends with an end operation 1728.
  • the method 1700 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 17.
  • the method 1700 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium.
  • the method 1700 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-16.
  • a vehicle control system 204 can receive sensor data from vehicle sensors 242.
  • the vehicle sensor data can be used by the vehicle control system 204 to determine that a person is in a zone 512 or area 508, in step 1708.
  • the vehicle sensor data may then be used to compare against feature characteristics 1212 to identify a person, in step 1712.
  • the vehicle control system 204 thereafter may receive a gesture, in step 1716.
  • the gesture may be perceived by vehicle sensors 242 or received in a gesture capture region.
  • the gesture may be as described in conjunction with Figs. 11A-11K.
  • the vehicle control system 204 can compare the gesture to gesture characteristics in portion 1232, in step 1720. The comparison may be made so that a statistically significant correlation between the sensor data or gesture data and the stored gesture characteristics can be determined.
  • the vehicle control system 204 can configure the vehicle 104 and/or react to the gesture, in step 1724.
  • the configuration or reaction to the gesture may be as prescribed in the gesture characteristic 1232.
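Steps 1716 through 1724 can be sketched as matching a received gesture against the stored gesture characteristics 1232 and applying the prescribed effect. Cosine similarity stands in here for the statistically significant correlation; the gesture vectors, names, and threshold are assumptions.

```python
stored_gestures = {
    # gesture name -> (characteristic vector, prescribed vehicle reaction)
    "swipe_right": ([1.0, 0.0], "next_radio_station"),
    "palm_down":   ([0.0, -1.0], "lower_volume"),
}

def react_to_gesture(observed, threshold=0.9):
    """Return the configured reaction for the closest stored gesture,
    requiring a sufficiently strong correlation."""
    def similarity(a, b):  # cosine similarity as a stand-in correlation
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0
    name, (vector, effect) = max(stored_gestures.items(),
                                 key=lambda kv: similarity(observed, kv[1][0]))
    return effect if similarity(observed, vector) >= threshold else None

print(react_to_gesture([0.95, 0.05]))  # -> next_radio_station
```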
  • An embodiment of a method 1800 for storing health data may be as shown in Fig. 18.
  • a general order for the steps of the method 1800 is shown in Fig. 18.
  • the method 1800 starts with a start operation 1804 and ends with an end operation 1844.
  • the method 1800 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 18.
  • the method 1800 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium.
  • the method 1800 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-17.
  • Vehicle control system 204 can receive sensor data from sensors 242. The sensor data may be used to determine that a person is in a zone 512 or area 508, in step 1808. The sensor data may then be used to determine characteristics of the person, in step 1812. From the characteristics, the vehicle control system 204 can determine if a person may be identified in data structure 1204, in step 1816. If it is determined that the person can be identified in step 1816, the method 1800 proceeds YES to step 1820. If the person cannot be identified, the method 1800 proceeds NO to step 1824. A person may be identified by matching the characteristics of a person from the sensor data to the features shown in portion 1212. If these comparisons are statistically significant, the person may be identified in portion 1208, in step 1820. However, if the person is not identified in portion 1208, the vehicle control system 204 can characterize the person using the vehicle sensor data, in step 1824. In this way, the vehicle control system 204 can create a new record for a new user in data structure 1204.
  • the vehicle control system 204 may receive health and/or safety data from the vehicle sensors 242, in step 1828.
  • the vehicle control system 204 can determine if the health or safety data is to be stored, in step 1832. The determination is made as to whether or not there is sufficient health data or safety parameters, in portions 1228 and 1236, to provide a reasonable baseline data pattern for the user 1240. If there is data to be received and stored, the vehicle control system 204 can store the data for the person in portions 1228 and 1236 of the data structure 1204, in step 1832.
  • the vehicle control system 204 may then wait a period of time, in step 1836.
  • the period of time may be any amount of time from seconds to minutes to days.
  • the vehicle control system 204 can receive new data from vehicle sensors 242, in step 1828.
  • the vehicle control system 204 can receive data periodically and update or continue to refine the health data and safety parameters in data structure 1204.
  • the vehicle control system 204 may optionally store the health and safety data in cloud storage 232 by sending it through the communication network 224 to the server 228, in step 1840.
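The store-wait-refine loop of steps 1828 through 1840 amounts to folding periodic sensor readings into a baseline. The rolling-window update rule below is an assumption; the disclosure does not prescribe a particular statistic.

```python
from collections import deque

class HealthBaseline:
    """Rolling baseline as might be kept in portions 1228/1236."""
    def __init__(self, window=100):
        self.readings = deque(maxlen=window)

    def add(self, reading):
        self.readings.append(reading)

    @property
    def mean(self):
        return sum(self.readings) / len(self.readings)

baseline = HealthBaseline()
for heart_rate in (62, 64, 61, 63):   # samples received periodically
    baseline.add(heart_rate)
print(round(baseline.mean, 1))        # refined baseline, e.g. 62.5
```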
  • An embodiment of a method 1900 for monitoring the health of a user may be as shown in Fig. 19.
  • a general order for the steps of the method 1900 is shown in Fig. 19.
  • the method 1900 starts with a start operation 1904 and ends with an end operation 1928.
  • the method 1900 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 19.
  • the method 1900 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium.
  • the method 1900 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-18.
  • the vehicle control system 204 can receive health data from sensors 242. The health data may be received in step 1908. The vehicle control system 204 may then compare the received health data to stored health parameters in portion 1228 or portion 1236, in step 1912. The comparison may check if there is statistically significant separation or disagreement between the received health data and the stored health data. Thus, the vehicle control system 204 can make a health comparison of the user based on a baseline of health data previously stored. A statistically significant comparison may include determining if there are any parameters more than three standard deviations from the average or norm, any parameter that is increasing or decreasing over a period of eight different measurements, a measurement that is more than two standard deviations from the norm more than three measurements consecutively, or other types of statistical comparisons.
  • the vehicle control system 204 can determine whether the health data is within acceptable limits, in step 1916. If the health data is within acceptable limits, the method 1900 proceeds YES back to receiving new health data, in step 1908. In this way, the health data is periodically or continually monitored to ensure that the driver is in a healthy state and able to operate the vehicle. If the health data is not within acceptable parameters, the method 1900 may proceed NO to step 1924 where the vehicle control system 204 may react to the change in the health data.
  • the reaction may include any measure to provide for the safety of the user, such as stopping the vehicle, beginning to drive the vehicle, driving the vehicle to a new location, such as a hospital, waking the driver with an alarm or other noise, or performing some other function that may help maintain the health or safety of the user.
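The statistical checks listed above translate directly into code. The thresholds (three standard deviations, an eight-measurement trend, two standard deviations on three consecutive measurements) come from the text; the function itself is an illustrative assumption.

```python
import statistics

def health_anomaly(baseline, recent):
    """Return a description of the first anomaly found, else None."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    if any(abs(x - mean) > 3 * sd for x in recent):
        return "more than three standard deviations from the norm"
    last8 = recent[-8:]
    if len(last8) == 8 and (sorted(last8) == last8 or
                            sorted(last8, reverse=True) == last8):
        return "increasing or decreasing over eight measurements"
    last3 = recent[-3:]
    if len(last3) == 3 and all(abs(x - mean) > 2 * sd for x in last3):
        return "two standard deviations exceeded three times consecutively"
    return None

baseline = [62, 63, 61, 64, 62, 63, 61, 62]
print(health_anomaly(baseline, [62, 63, 110]))
# -> more than three standard deviations from the norm
```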
  • the health data received may be a reaction from the driver.
  • the driver may call for help or ask the vehicle for assistance.
  • the driver or passenger may say that they are having a medical emergency and ask the car to perform some function to help.
  • the function to help may include driving the person to a hospital or stopping the car and calling for emergency assistance.
  • Firewall 484 can provide security but, due to the seriousness of the consequences to public safety of a security breach, additional security measures are desirable.
  • Network security can use a common computer to simultaneously run first and second operating systems for, respectively, critical and non-critical tasks, functions, and operations of each processing module of the vehicle control system 204, whereby, in the event of a security breach, the second operating system is disabled or its execution discontinued.
  • a network topology (or MIB) of a component in the network controlled by the first computer may describe or include information regarding one or more components of the network controlled by the second computer or the second computer itself.
  • the infotainment network (or the distributed processing network containing any component (such as one or more of the IP router 420, IP module 1048, communications module 1052, device discovery daemon 1020, media controller 492, media processor 808, access point 456, match engine 812, video controller 840, audio controller 844, speech synthesis module 820, network transceiver 824, signal processing module 828, device interaction module 818, and combo controller 460) participating in, providing, or handling infotainment content) can be prohibited from communicating with a vehicle non-infotainment control network containing any component participating in, providing, or handling signaling for controlling non-infotainment vehicle control tasks, functions, or operations, such as critical tasks, functions or operations (such as one or more of the vehicle control module 826, location module 896, automobile controller 8104, vehicle systems transceiver 8108, traffic controller 8112, network traffic transceiver 8116, and environmental control module 830).
  • the first computer can have no contactable address (such as a local IP address or global unicast address) on any wireless or non-wireless network, including the infotainment network, and is not addressable, contactable, or otherwise able to communicate with the second computer, a component in the network controlled by the second computer, or any portable computational device (such as the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312).
  • Critical tasks, functions or operations can include one or more of (depending on the particular vehicle) monitoring, controlling, and/or operating the ECU, TCU, door settings, window settings, and/or blind spot monitor, monitoring, controlling, and/or operating the safety equipment (e.g., airbag deployment control unit, collision sensor, nearby object sensing system, seat belt control unit, sensors for setting the seat belt, etc.), monitoring and/or controlling certain critical sensors such as the power source controller and energy output sensor, engine temperature, oil pressure sensing, hydraulic pressure sensors, sensors for headlight and other lights (e.g., emergency light, brake light, parking light, fog light, interior or passenger compartment light, and/or tail light state (on or off)), vehicle control system sensors (e.g., steering/torque sensor), controlling the operation of the engine (e.g., ignition), head light control unit, power steering, display panel, switch state control unit, power control unit, and/or brake control unit, and/or issuing alerts to a user and/or remote monitoring entity of potential problems with a vehicle operation.
  • Less critical tasks, functions or operations to vehicle operation can, depending on the application, further include one or more of (depending on the particular vehicle) monitoring, controlling, and/or operating a wireless network sensor (e.g., Wi-Fi and/or Bluetooth sensor), cellular data sensor, emissions control, seating system controller and sensor, monitoring certain non-critical sensors such as ambient (outdoor) weather readings (e.g., temperature, precipitation, wind speed, and the like), odometer reading sensor, trip mileage reading sensor, road condition sensors (e.g., wet, icy, etc.), radar transmitter/receiver output, brake wear sensor, oxygen sensor, ambient lighting sensor, vision system sensor, ranging sensor, parking sensor, heating, venting, and air conditioning (HVAC) system and sensor, water sensor, air-fuel ratio meter, hall effect sensor, microphone, radio frequency (RF) sensor, and/or infrared (IR) sensor.
  • Infotainment tasks, functions, or operations can include one or more of monitoring, controlling, and/or operating a wireless network sensor (e.g., Wi-Fi and/or Bluetooth sensor), cellular data sensor, emissions control, entertainment system, receiving, processing, and/or providing media and/or multimedia content.
  • Infotainment tasks, functions, or operations are typically performed by one or more of the IP router 420, IP module 1048, communications module 1052, device discovery daemon 1020, media controller 492, media processor 808, access point 456, match engine 812, video controller 840, audio controller 844, speech synthesis module 820, network transceiver 824, signal processing module 828, device interaction module 818, desktop manager 1012, windows manager 1032, application manager 1036, panel launcher 1040, desktop plugin 1024, and combo controller 460.
  • FIG. 23 shows a common computer 2300 having a microprocessor 2320 executing first and second virtual machines 2308 and 2312.
  • each of the first and second virtual machines is an isolated software container with its own operating system 1004 that executes one or more applications.
  • the first virtual machine can simulate the first computer while the second virtual machine can simulate the second computer.
  • Applications stored in each virtual machine would be those applications responsible for critical tasks, functions, and operations in the first virtual machine 2308 and non-critical tasks, functions, and operations in the second virtual machine 2312.
  • the applications stored in the first virtual machine 2308 can be those controlling vehicle tasks, functions, and operations while those stored in the second virtual machine 2312 can be those controlling infotainment tasks, functions, and operations.
  • the first virtual machine 2308 can include one or more of vehicle control module 826, location module 896, automobile controller 8104, vehicle systems transceiver 8108, traffic controller 8112, network traffic transceiver 8116, and environmental control module 830 while the second virtual machine 2312 can include one or more of the IP module 1048, communications module 1052, device discovery daemon 1020, media controller 492, media processor 808, access point 456, match engine 812, video controller 840, audio controller 844, speech synthesis module 820, network transceiver 824, signal processing module 828, device interaction module 818, and combo controller 460.
  • a hypervisor module 2316 decouples the first and second virtual machines 2308 and 2312 from the host and dynamically allocates computing resources to each of the first and second virtual machines as needed.
  • Computing resources include, for example, interrupt requests, input/output memory locations, direct memory access, memory capacity allocated for each operating system, scheduling of central processing units, and the like.
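One way to picture the split is a static assignment of the modules listed above to the two virtual machines; the dictionary below is an illustrative assumption, not the disclosed configuration format.

```python
VM_ASSIGNMENT = {
    "vm_1_critical": [           # first virtual machine 2308
        "vehicle_control_module_826", "location_module_896",
        "automobile_controller_8104", "traffic_controller_8112",
    ],
    "vm_2_infotainment": [       # second virtual machine 2312
        "media_controller_492", "audio_controller_844",
        "video_controller_840", "speech_synthesis_module_820",
    ],
}

def host_vm(module_name: str) -> str:
    """Return which virtual machine executes a given module."""
    for vm, modules in VM_ASSIGNMENT.items():
        if module_name in modules:
            return vm
    raise KeyError(module_name)

print(host_vm("media_controller_492"))  # -> vm_2_infotainment
```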
  • a common computer 2400 includes first and second operating systems 1004a and 1004b stored in common memory 2412.
  • Each of the first and second operating systems 1004a and 1004b executing by the microprocessor 2320 can have one or more of a libraries layer 2408, application framework layer 2416, and application layer 2420.
  • Applications are executed in the application layer, supported by the application framework layer of the respective operating system.
  • the application framework layer may include a window manager, activity manager, package manager, resource manager, telephony manager, gesture controller, and/or other managers and services.
  • Each operating system can rely on a common kernel 1028 for process isolation, memory management, and threading support.
  • The libraries layer includes user libraries that implement common functions such as input/output and string manipulation, graphics libraries, database libraries, communication libraries, and/or other libraries.
  • the first operating system and environment can simulate the first computer while the second operating system and environment can simulate the second computer.
  • Applications executing in each operating system and environment would be those applications responsible for critical and non-critical tasks, functions, and operations. More specifically, the applications stored in the application framework layer of the first operating system 1004a can be those controlling vehicle tasks, functions, and operations while those stored in the second operating system 1004b can be those controlling infotainment tasks, functions, and operations.
  • the first operating system 1004a can include one or more of vehicle control module 826, location module 896, automobile controller 8104, vehicle systems transceiver 8108, traffic controller 8112, network traffic transceiver 8116, and environmental control module 830 while the second operating system 1004b can include one or more of the IP module 1048, communications module 1052, device discovery daemon 1020, media controller 492, media processor 808, access point 456, match engine 812, video controller 840, audio controller 844, speech synthesis module 820, network transceiver 824, signal processing module 828, device interaction module 818, and combo controller 460.
  • the first operating system 1004a can run in a separate execution environment from the second operating system 1004b.
  • one operating system (typically the first operating system 1004a) can run in a primary execution environment, while the other operating system (typically the second operating system 1004b) can run in a secondary execution environment.
  • Processes and applications running on the second operating system can access user libraries, manager(s), and service(s) in the secondary execution environment.
  • applications for one operating system do not run on the other operating system.
  • the kernel 1028 is shared such that the first and second operating systems run concurrently and independently on the shared kernel 1028. Specifically, both operating systems 1004a,b interface to the shared kernel 1028 through the same kernel interface, such as by making system calls to the shared kernel 1028.
  • the shared kernel 1028 manages task scheduling for processes of both operating systems and resource allocation between the two operating systems.
  • Fig. 25 depicts an operation of the hypervisor module 2316 or common kernel 1028 (collectively “computer control module”).
  • In step 2500, the computer control module detects a stimulus, such as a potential security breach, attack, or other intrusion or attempted intrusion notification, a clock signal, an interrupt, and the like.
  • breaches, attacks, and other intrusions include denial-of-service attacks, spoofing, man-in-the-middle attacks, ARP spoofing, smurf attacks, buffer overflows, heap overflows, format string attacks, SQL injection, and other cyber attacks.
  • the computer control module collects metrics regarding operations of the first and/or second operating system (e.g., of first or second virtual machine) and/or of the computer or associated therewith (e.g., of the network controlled by the first or second operating system).
  • Metrics include network metrics, computational device metrics, and the like. Examples of network metrics include link utilization, number of hops, speed of the path, packet loss, latency, path reliability, path bandwidth, network overhead, maximum transmission unit (MTU), throughput, and the like. Examples of computer metrics include load, transaction times, database access times, available memory, central processing unit ("CPU") usage, memory usage, and the like.
  • the computer control module determines whether or not the collected metrics indicate a normal or abnormal operating state of the computer and/or of the first or second operating system and/or associated therewith.
  • a normal operating state is deemed to occur when the collected metrics are within normal operational ranges.
  • An abnormal operating state is deemed to occur when one or more of the collected metrics are outside of normal operational ranges. The ranges may be predetermined or derived from historical operation.
  • the metrics may be ranked or scored, with the cumulative score being the basis for the normal or abnormal operating state determination.
  • the computer control module acts differently depending on whether the abnormality impacts the first or second operating system.
  • If the first operating system responsible for critical vehicle functions, tasks, or operations is behaving abnormally, the driver can be warned of the potential problem and asked to terminate operation of the vehicle.
  • Execution of certain critical vehicle tasks, functions or operations can be discontinued in favor of manual control of the vehicle tasks, functions, or operations.
  • Execution of certain critical vehicle tasks, functions, or operations can be discontinued while other more critical tasks, functions, or operations continue to be executed. For example, steering, brakes, headlights, wipers, and vehicle acceleration can continue to be executed while environmental or climate control operations are discontinued.
  • the determination as to whether the first or second operating system is behaving abnormally can be different.
  • the first operating system can require more metrics to be abnormal than the second operating system or vice versa. Because the second operating system is more likely to be subject to an attack from the Internet or a portable device, a lesser deviation from normal operational ranges can be considered to be abnormal operation of the second operating system compared to the deviation required for the first operating system to be considered to be operating abnormally. Stated another way, a lower threshold is required to find abnormal operation of the second operating system when compared to the first operating system.
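Steps 2500 through 2512 can be sketched as a scored metric check with a lower abnormality threshold for the second (infotainment) operating system, per the asymmetry just described. The metric names, ranges, and threshold counts are assumptions.

```python
THRESHOLDS = {"os_1_critical": 5, "os_2_infotainment": 2}  # lower bar for OS 2

def check_and_react(os_name, metrics, normal_ranges):
    """Count metrics outside their normal ranges and react when the count
    reaches the threshold for the given operating system."""
    abnormal = sum(
        1 for name, value in metrics.items()
        if not (normal_ranges[name][0] <= value <= normal_ranges[name][1])
    )
    if abnormal >= THRESHOLDS[os_name]:
        if os_name == "os_2_infotainment":
            # Step 2512: disable the second OS; critical tasks continue.
            return "shut down second operating system"
        return "warn driver and discontinue selected critical tasks"
    return "normal operation; record metrics (step 2516)"

print(check_and_react(
    "os_2_infotainment",
    {"cpu_usage": 0.99, "latency_ms": 900, "packet_loss": 0.4},
    {"cpu_usage": (0.0, 0.9), "latency_ms": (0, 200), "packet_loss": (0.0, 0.05)},
))  # -> shut down second operating system
```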
  • the computer control module in step 2512, shuts down or disables the second virtual machine, operating system, or computer so that the first virtual machine, operating system, or computer continues to perform critical and/or automotive tasks, functions or operations while shutting down, terminating, or discontinuing execution of infotainment tasks, functions, or operations.
  • the second, but not the first, operating system can be shut down, particularly when the second operating system (but not the first operating system) is controlling and/or receiving input from the network.
  • After step 2512 or when the operating state is normal, the computer control module, in step 2516, records the metrics and findings and returns to step 2500.
  • Critical tasks, functions, and operations can be performed by an application executing in the first operating system and less critical tasks, functions, and operations (and infotainment tasks, functions, and operations) can be performed by an application executing in the second operating system.
  • vehicle controls (e.g., steering, braking, indicating, etc.) can be considered critical and performed in the first operating system environment, while environmental controls, radio stations, and entertainment presentations can be considered non-critical and performed in the second operating system environment.
  • the critical tasks, functions, and operations can be performed not only by the first operating system but also alternatively by a redundant operating system.
  • the redundant operating system may be configured to run in tandem with the first operating system performing critical tasks, functions, and operations and replace, or receive a "hot" handoff from, the first operating system in the event of a failure or malfunction of the first operating system or an application executing in the first operating system. In a "hot" handoff, the redundant operating system may replace the mission critical operating system without interruption of operation.
  • the first operating system may utilize one or more of error checking communication protocols, secure communication standards, encrypted communications, high speed communications protocols, and the like.
  • the vehicle control system 204 can act as an artificially intelligent assistant that, based on an awareness of a selected vehicle occupant's persona (e.g., his or her physical, psychological, mental and other characteristics, needs, desires, behavior, personality, goals, habits, biometric information, likes or interests, dislikes, preferences, and the like) and/or the persona of family and friends of the selected vehicle occupant, proactively (without a preceding request of the selected vehicle occupant) provides feedback, suggestions, reminders, recommendations, and/or other types of assistance to the selected vehicle occupant.
  • the vehicle control system 204 can characterize, define, and/or analyze a person's persona based on profile data 252 of the person, device data 220 associated with the person, stored data 232, system data 208, and information related to the person as collected by vehicle interior and exterior sensors.
  • a vehicle occupant's persona can include the information in one or more of the profile data 252 of the person, device data 220 associated with the person, stored data 232, system data 208, and information related to the person as collected by vehicle interior and exterior sensors.
  • Based on the persona of the selected vehicle occupant and vehicle-related information (such as vehicle context, state, external surroundings, location, past, current, and/or intended path of travel, waypoints, and destination of the vehicle), the vehicle control system can proactively provide suggestions, reminders, recommendations, and/or other types of assistance to the selected vehicle occupant.
  • the persona can include a user selected and/or configured avatar, which is provided to the user via any graphical user interface by which the vehicle control system provides suggestions, reminders, recommendations and/or other types of assistance to the selected user as a vehicle occupant.
  • an "avatar” is the graphical and/or voice representation of an object that can be the user or the user's alter ego or character or of a person selected by the user. It may take either a three-dimensional form, as in games or virtual worlds, or a two-dimensional form as an icon in Internet forums and other online communities. It can have not only a visual image but also distinct voice, mannerisms, and/or other behavior. It is commonly an object selected and/or configured by the user. Avatars can be used as virtual embodiments of embodied agents, which are driven more or less by artificial intelligence rather than real people.
  • Each occupant of a vehicle can have a differently configured and/or appearing avatar at his or her display interacting with him or her.
  • the avatar may accompany the user from vehicle-to-vehicle as part of the user's persona, which can be shared from vehicle to vehicle.
  • the vehicle control system, based on an awareness of a selected person's persona and the vehicle-related information, can note vehicle-related alarms and bring to a selected person's attention instant messages, emails, voice mails, and missed phone calls from friends, the current activities and/or locations of friends, and points of interest near the vehicle or its path of travel.
  • the vehicle control system notes that a vehicle driver's anniversary is next week, that the driver's spouse has pinned a bracelet that she likes, that the bracelet is on sale at a specific store, and that the driver has an appointment near the store and, in response, alerts the driver to the sale and spousal interest in the bracelet and asks the driver, via any one of the user interfaces 212, 248, input/output module 312, and user interface/input interfaces 324, whether the vehicle control system can (i) add a stop to the driver's calendar and/or waypoint for the store to the driver's path of travel and/or (ii) make the purchase of the bracelet online for the driver.
  • the vehicle control system notes that a vehicle warning has been triggered, such as an actual or potential engine or other vehicle malfunction (e.g., low battery charge voltage, low oil pressure, high engine pressure, low fuel level, low tire pressure, air bag initiator malfunction, etc.), identifies vehicle service facilities in proximity to the current vehicle position or path of travel that is capable of repairing or addressing the source of the warning (e.g., a service garage, fuel station, dealership, etc.), contacts the facility to provide details on the malfunction to determine service cost, determine possible appointment times, notifies the driver of the possible appointment times, service cost, and corresponding facility location, and, at the option of the driver, schedules an appointment at the facility and adds a stop to the driver's calendar and/or waypoint for the facility to the driver's path of travel.
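The proactive-assistance examples above share a common pattern: rules inspect the persona and the vehicle context and may yield a suggested action. The sketch below illustrates one possible form of that pattern; the persona and context field names are invented for illustration.

```python
# Hedged sketch of rule-based proactive assistance: each rule inspects
# persona and vehicle context and may yield a suggestion to the occupant.

def anniversary_rule(persona, context):
    if persona.get("anniversary_in_days", 99) <= 7 and context.get("gift_on_sale_nearby"):
        return "Add a waypoint to the store and offer to buy the gift online?"

def service_rule(persona, context):
    if context.get("engine_warning"):
        return "Schedule an appointment at a nearby service facility?"

RULES = [anniversary_rule, service_rule]

def proactive_suggestions(persona, context):
    # Collect the suggestion from every rule that fires.
    return [s for rule in RULES if (s := rule(persona, context))]

print(proactive_suggestions(
    {"anniversary_in_days": 5},
    {"gift_on_sale_nearby": True, "engine_warning": False},
))
```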
  • the vehicle control system notes that a friend or family member of a vehicle occupant is in spatial proximity to the current vehicle position or path of travel, notifies the vehicle occupant of the identity and current location of the friend or family member, and, at the option of the occupant, contacts the friend or family member to notify them of the vehicle occupant's current position, arranges a meeting at a specified location near the current vehicle location or path of travel, and adds a stop to the driver's calendar and/or a waypoint for the meeting location to the driver's path of travel. This is particularly useful where the friend or family member is mobile too, such as in a vehicle or on a bike.
  • the vehicle control system notes that the vehicle is on a long distance trip (based on the driver's or other occupant's electronic calendar and/or current location relative to the driver's or other occupant's home location), identifies hotels or motels in proximity to the current vehicle position or path of travel, contacts the hotel or motel to determine possible room types (e.g., one or two beds, bed size, etc.), cost, and availabilities, notifies the driver of the accommodations available and cost, and, at the option of the driver, books an accommodation at a selected hotel or motel and adds a stop to the driver's calendar and/or waypoint for the hotel or motel to the driver's path of travel.
  • the vehicle control system notes that the vehicle is on a long distance trip, identifies a point or location of interest based on a persona of a vehicle occupant in proximity to the current vehicle position or path of travel, collects and provides information about the point or location of interest to the vehicle occupant, optionally contacts the point or location of interest to determine business hours and costs to visit the point or location of interest, notifies the occupant of the business hours and cost, and, at the option of the driver, books an accommodation at the point or location of interest and adds a stop to the occupant's calendar and/or waypoint for the point or location of interest to the driver's path of travel.
  • the point or location of interest can be a historical landmark, museum, library, church, store, restaurant, coffee shop, mall, healthcare facility, and the like.
  • the vehicle control system notes, from the electronic calendar of the driver and/or historic driver behavior, that the destination of the vehicle is a meeting, place of employment, or other point of interest and the location thereof, determines that the path of travel of the vehicle will not arrive at the destination timely such as due to traffic, road construction, or other delays, determines a different route that will arrive timely at the destination, proposes the alternate route to the driver, and, at the driver's option, changes the on board navigation to redirect the driver along the alternate route.
  • the vehicle control system can also send a message, such as an email or instant message, to one or more meeting participants or his or her place of employment to let them know that he or she will or may be late.
  • an interior or exterior vehicle sensor 340 senses a condition or activity, such as of a vehicle passenger (e.g., a child passenger unbuckling a seat belt, children fighting in the back seat, an inappropriate gesture of a vehicle occupant, and the like) and notifies a selected vehicle occupant, such as the driver, of the condition or activity.
  • a vehicle control system based on information received from a sensor, such as a camera sensor 760, can identify and/or distinguish an occupant's actions, clothing, held objects or objects in the possession of the user, and the like and perform an action. For instance, the vehicle control system can direct the occupant to a current sale of an identified object worn by or in the possession of the occupant that is in spatial proximity to the current vehicle location. Information regarding objects associated with the user can also be identified by radio frequency identification readers and tags, barcode readers, quick response code readers and the like.
  • the vehicle control system determines that the occupant is driving the vehicle to a specific store and provides an alternate store site based on a busyness of one or more store locations.
  • Parking information (e.g., how many spaces are occupied or free based on parking sensors, metered parking information, etc.) can also be taken into account by the vehicle control system.
  • a user may be driving down a street and pull into a StarbucksTM parking lot where the user and five other vehicles cannot find a parking spot. In this case, the user may spend five minutes waiting for the five other vehicles to clear the lot before the user can travel to another location.
  • the vehicle control system knowing that the parking lot associated with that particular StarbucksTM was fully occupied, may cause the vehicle to alter the route (or suggest an alternate route) to go to another StarbucksTM location based on the traffic and/or lack of parking.
  • the vehicle control system may use image sensors to translate signs (e.g., road signs, store signs, advertisements, billboards, etc.).
  • the translation process may be provided via the processing power of the mobile device, via processing on the cloud, combinations thereof, etc., and provide the translated signage message to the occupant.
  • the vehicle control system may present translated product availability information and/or assistance to guide a user in finding products traditionally purchased on trips. This guidance may be based upon the past behavior associated with the user persona. For instance, when a user travels to a new place, the user may always (or mostly) shop for toothpaste, water, and ibuprofen. As can be appreciated, some foreign countries may not offer all of the products at a single store, as North American supermarkets do.
  • the vehicle may provide a translation for the purchasing behavior by at least providing one or more shopping outlets where a user can procure toothpaste, water, and ibuprofen.
  • the vehicle control system may state in the user's own language "You can find toothpaste and ibuprofen at the premise on Dipplestrasse in downtown Munich. For water you may wish to visit the Tarêt on Maybach Gaenge two blocks south of Dipplestrasse.”
  • the user may query for translation and visual assistance, such as "What does Aspirin look like in Germany?"
  • the system may provide a picture or image and a translation text, voice, etc. via the mobile device and/or the vehicle.
  • the vehicle control system and/or the mobile device may present travel and tourism options based on popular sites and/or destinations and, if selected, add the selected option as a waypoint or destination.
  • the vehicle control system may provide the popular sites and/or destinations with a message, such as "75% of travelers visit the following destinations.”
  • the vehicle control system may develop a "perfect itinerary" based on the desired destinations and the timeframe. In some cases, the itinerary may be based on desired locations and/or locations to be avoided. For instance, a user may not wish to visit an industrial neighborhood, and as such the itinerary may provide routes that avoid industrial areas.
  • the vehicle control system can use a vehicle location-based trigger to activate and/or deactivate features and/or settings at a waypoint or destination, such as a home, a garage, and the like.
  • the user may be returning home from work (this determination may be predicted based on prior trips made at the same or similar time of day by the same user); at a certain distance from the home, the vehicle may send a signal to a home automation system to set appropriate levels of lighting and thermostat, initiate and/or deactivate other devices, and/or open a garage door or entrance gate.
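One possible form of such a distance-based trigger is sketched below using the standard haversine great-circle distance. The home coordinates, trigger radius, and home-automation payload are assumptions for illustration only.

```python
# Sketch of a location-based home-automation trigger: when the vehicle
# comes within a set radius of home, send a (hypothetical) command set.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

HOME = (39.7392, -104.9903)   # illustrative coordinates
TRIGGER_RADIUS_KM = 3.0

def maybe_trigger_home(vehicle_pos, send):
    if haversine_km(*vehicle_pos, *HOME) <= TRIGGER_RADIUS_KM:
        send({"lights": "evening", "thermostat_f": 70, "garage_door": "open"})

# A position about 1.3 km from home fires the trigger.
maybe_trigger_home((39.7500, -104.9850), send=print)
```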
  • a user may be traveling home from work and the vehicle control system may send a message, such as a text message, phone call, email, etc.
  • the message may be identified as being sent from the vehicle (e.g., a vehicle icon with the message).
  • the third party may respond to the message, which is then relayed, visually and or audibly, to the user via the components of the vehicle (e.g., dashboard, head unit, speakers, combinations thereof, etc.).
  • a third party may state "Please pick up some eggs and fabric softener before you come home, thanks.”
  • the vehicle control system can learn which routes one or more users travel at any given time period (hours, days, weeks, months, etc.). This information may be used to adjust routes based on traffic, weather, mood, etc. For instance, a user may plan to travel (or be traveling) to work on a Monday morning when heavy traffic is detected ahead on the user's "usual" Monday morning route. Upon detecting the traffic along the usual route, the vehicle may present the user with alternate route selections or automatically adjust the route presented to the user. Routes may be altered by the vehicle control system based on past purchases, patterns, preferences, and the like associated with a user profile.
  • the route guidance may alter the individual's route to pass several McDonalds locations.
  • the route alteration may be terminated once the user stops to purchase products at the McDonalds.
  • businesses may pay for route alteration and suggestion as a form of advertising.
  • the vehicle control system centers around interactions and learns from user preferences. For instance, the vehicle control system learns that the user tends to like it warmer in the morning than in the afternoon and adjusts temperature settings in the vehicle accordingly. Further, using cameras and facial recognition, it can know the user's preferences and apply them no matter what seat the user is in. In another instance, the user has been surfing the Internet looking for new bikes, and the vehicle control system informs the user that there is a bike shop around the corner and adds the bike shop as a waypoint or destination.
  • the vehicle control system synchronizes with an electronic calendar of the user to create (i) smarter alarms and (ii) updates.
  • Smart alarms mean that instead of a standard 15 minute warning before a meeting, if it is an offsite meeting with an address entered, the system can determine how much time it will take based upon traffic, previous driving habits, the amount of time it generally takes to exit the office and get to the car, etc., and change the warning accordingly.
  • the updates can be triggered based upon the time-of-arrival determination from the GPS or as calculated above and can send SMS notices to other attendees or prompt the user to call into the meeting.
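A sketch of how such a traffic- and habit-aware warning time might be computed follows. All durations and the traffic factor are illustrative assumptions, not values from the specification.

```python
# Sketch of the "smart alarm" timing: instead of a fixed 15-minute
# warning, derive the lead time from the traffic-adjusted drive time,
# the user's typical office-to-car time, and a buffer.
from datetime import datetime, timedelta

def smart_alarm_time(meeting_start: datetime,
                     drive_minutes: float,
                     traffic_factor: float = 1.0,
                     exit_to_car_minutes: float = 8.0,
                     buffer_minutes: float = 5.0) -> datetime:
    lead = drive_minutes * traffic_factor + exit_to_car_minutes + buffer_minutes
    return meeting_start - timedelta(minutes=lead)

meeting = datetime(2014, 4, 15, 14, 0)
# Heavy traffic multiplies the nominal 25-minute drive by 1.6.
print(smart_alarm_time(meeting, drive_minutes=25, traffic_factor=1.6))
# -> warning at 13:07 instead of a fixed 13:45
```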
  • the vehicle control system knows a child's curfew and also the distance from home. Given that information, it could inform the child driver that he or she must leave to make it home in time to avoid a curfew violation. Further, since it knows the occupants, it could potentially include time to deliver them home too.
  • the user is training for a marathon. He has a wearable device attached to his clothing that monitors his biometric data (e.g., heart rate, body temperature, energy levels, sleeping habits, etc.) throughout the day. It also has a GPS that tracks location and terrain and an MP3 player. On his way to work, the wearable device has synchronized with the vehicle control system. The user verbally checks off his breakfast from his marathon widget on the home page displayed on the on board console and confirms his goals to be realized that day as well as his calendar appointments.
  • the vehicle control system notes that it is near lunch time, reviews the restaurants in spatial proximity to the vehicle, determines which restaurants have menu items consistent with the user's dietary goals that day, and recommends a nearby restaurant. The user accepts the recommendation, and the vehicle control system books an advanced reservation and adds the restaurant as a waypoint. On his way back to the office, the vehicle control system notes that the user's heart rate is too low, notifies the user, and, upon the user's confirmation, adjusts the oxygen levels in the vehicle, seat and lumbar settings, and temperature levels to inhibit drowsiness.
  • a user and his child enter the vehicle while the child is watching programming (such as a movie on NetflixTM) on a tablet computer.
  • the vehicle control system recognizes the user, determines his role in the car or seating position (driver), and applies the user's personal settings.
  • the user's phone is automatically paired with the multimedia controller via the local or on board wireless access point.
  • the vehicle control system determines that the child is in the back seat and recognizes his identity.
  • the vehicle control system asks the user if the programming is to be continued to be provided to the child on the child's rear seat display subsystem.
  • the vehicle control system determines, based on the user's electronic calendar, that the user is behind schedule for a next appointment due to an accident on the current path of travel, recommends an alternate path of travel and sends a message to the meeting participants that the user will be late and providing an estimated arrival time and reason for the late arrival.
  • the message may be automatically generated or dictated by the user via the vehicle infotainment system.
  • the user drives home during a baseball game.
  • the persona of the user informs the vehicle control system that the user is a baseball fan and recommends to the user that he tune into the game on radio.
  • the infotainment system tunes automatically into the game broadcast on the radio.
  • the occupant persona is not limited to information obtained and stored directly by the vehicle. It can include information obtained by the vehicle from other sources, such as from a server 228 over a communication network, such as the Internet, and IP router 420.
  • An example would be from a social network profile or page of a selected vehicle occupant, a web or home page of the selected vehicle occupant, a telecommunications presence server associated with the selected vehicle occupant, and the like.
  • Another example would be from another vehicle driven or occupied by the selected vehicle occupant.
  • Another example would be from a home computer network of the selected vehicle occupant.
  • the information can be obtained not only via the Internet but also directly via synchronization when the vehicles are parked in a common garage of the occupant's home even when the ignition of the vehicle(s) is/are turned off.
  • Another example would be from a portable communication or computational device of an occupant, such as from a device or user interface 212, 248.
  • a transceiver of the vehicle can provide a mobile hot spot functionality not only to any user device(s) therein but also to a computer of another vehicle located in proximity to the selected vehicle. Synchronization with another vehicle's computer or with a home computer can be done, for example, by the HTTP server capabilities of the vehicles.
  • the persona can be stored as part of a template locally, remotely, or both; for example, the persona can be stored in the cloud, on a personal device of the user, and/or on the vehicle.
  • the template may be universally applied by all vehicle manufacturers to enable the persona to be portable among vehicles so that the intelligent capability of the vehicle appears to follow the associated user from vehicle to vehicle even when manufactured by different companies.
  • the template can even follow the user from seat position to seat position within a vehicle, automatically populating vehicle settings, such as seat and console settings and features associated with a new seating position, when the user is identified as having moved to the new seat.
  • a console may not support multimedia entertainment, particularly video.
  • Multimedia content is typically displayed on remote display subsystems dedicated to a specific seating position.
  • user personas may be presented (or uploaded) to a vehicle in advance of a user reaching a vehicle (rental cars, etc.).
  • Hertz® car rental agency can maintain a persona for each customer and upload the persona onto the car rented to the customer at the time the leasing is consummated.
  • the car is automatically adjusting all of the vehicle settings as set forth in the user persona. In this way, the user need not manually select any settings when he or she begins driving the car.
  • a rental car system may determine where a user is staying (e.g., hotel, motel, building, etc.), a mobile device number associated with the user, and the like.
  • the system may send a text message, email, phone call, or other signal to the mobile device number to request access to the user's persona template.
  • the template can be incorporated with the vehicle and settings associated with the vehicle may be updated prior to the user reaching the vehicle.
  • administrative assistants can reserve a vehicle on behalf of another individual and link the user template to the reservation based on the user's mobile device phone number.
  • the mobile device can act as the key. For instance, a phone number may be embedded in the private key of the mobile device. The public key may be presented to the rental car. In some cases a combination of phone number and device ID (e.g., EIN, MAC address, etc.) may be used as part of the authentication.
  • the vehicle control system can apply rule sets or templates to trigger actions such as those described above.
  • the vehicle control system and/or mobile device may determine patterns, such as, destinations, purchases, etc. to categorize data, advertising, and the like.
  • the artificial or computational intelligence can, for example, have symbolic, sub-symbolic, or hybrid components and/or modules. Examples include neural networks (e.g., acyclic or feedforward neural networks and recurrent neural networks), fuzzy systems, and evolutionary computation.
  • An example of a home page 3000 displayed by a device or user interface 212, 248, user interface(s)/input interface(s) 324, and/or I/O module 312 is shown in Fig. 30.
  • the layout of the home page 3000 is defined by the user profile. When the user is signed in as a guest user, a default layout is provided.
  • the various objects in the layout can be configured as widgets or a lightweight version of an application or item of functionality.
  • the home page 3000 includes, in an upper area 3004 of the screen, user selectable icons for various applications (which, if selected, open, initiate, or recall the application), such as navigation ("Nav" icon) 3016, media ("Media" icon) 3018, phone ("Phone" icon) 3020, rear view (to provide the driver with a video stream from a backup camera) ("Rear View" icon) 3022, other applications 3024, and "my dash" (to provide the user with a display having typical dash indicators, such as speedometer, tachometer, oil pressure, warning lights, battery charge indicator, fuel level, and the like) 3026; an upper information bar 3028 including a greeting to the identified driver 3030, satellite reception indicator 3032, WiFi connectivity indicator 3034, cellular connectivity indicator 3036, weather information 3038, and date 3040; and, in a central area 3008 of the screen, various types of information, such as navigation input fields 3042 (e.g., a "set destination" field, which is a drop target into which location-enabled objects can be dragged, etc.).
  • When a suggested, favorite, or recent location, people, or points-of-interest object or object identifier is dragged and dropped by a sensed gesture, stylus, or mouse into the "set destination" or "set waypoint" fields, the navigation subsystem 336, based on the current satellite-based location of the vehicle, automatically determines a route from the vehicle's current location to the stored location associated with the dragged-and-dropped object. As more objects are dragged and dropped into either the "set destination" or "set waypoint" fields, the route is altered to accommodate the various associated locations. A traceroute on a map and/or directions can be provided to the vehicle operator. The traceroute or directions can be updated to show the vehicle's current location.
  • the traceroute or directions can be altered to reflect traffic and/or road conditions (when the user selects “quickest” route), surface street or highway preferences, roadside aesthetics (when the user selects “most scenic” route), and the like. This information can be displayed by the user selection of the "Nav" icon 3016.
  • the screen can provide a map in the background to assist navigation
  • the traceroute can be updated on the background map.
  • the various location-enabled objects and the "set destination" and "set waypoint" fields are overlays of the map.
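A minimal sketch of the drop-to-route behavior described above follows, using straight-line leg estimates as a stand-in for a real routing engine. The object format and the degrees-to-kilometers conversion are illustrative assumptions.

```python
# Sketch of the drop-target behavior: each location-enabled object
# dropped on "set waypoint" extends the route, which is recomputed
# from the vehicle's current position.
import math

def leg_km(a, b):
    # Rough straight-line placeholder: degrees scaled to kilometers.
    return math.dist(a, b) * 111.0

class Route:
    def __init__(self, current_position):
        self.points = [current_position]

    def drop_object(self, obj):
        """Called when an object is dropped on the set-waypoint field."""
        self.points.append(obj["location"])

    def total_km(self):
        return sum(leg_km(a, b) for a, b in zip(self.points, self.points[1:]))

route = Route(current_position=(39.74, -104.99))
route.drop_object({"name": "coffee shop", "location": (39.75, -105.00)})
route.drop_object({"name": "office", "location": (39.76, -104.98)})
print(f"{route.total_km():.1f} km over {len(route.points) - 1} legs")
```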
  • the screen configuration and content, such as the arrangement of the various icons and displayed information, can be rearranged or reconfigured by the user to reflect his or her needs or desires.
  • Because the screen content is complex and/or content-heavy and can distract the operator when driving, the screen may be altered when the vehicle is in a driving mode relative to a parked mode.
  • In the driving mode, the wheels are in motion and the gear shift is out of the "parked" position.
  • In the parked mode, the wheels are not in motion and the gear shift is in the "parked" position.
  • the home screen of Fig. 30 therefore can be closed automatically on the driver's console display and disabled when the vehicle is in the driving mode and opened automatically and enabled when the vehicle is in the parked mode.
  • Certain applications can be launched automatically when the vehicle is in the driving mode.
  • An example would be the navigation application so that the console display shows the display for the navigation application instead of the home screen of Fig. 30.
  • One or more other applications can be selected by the user as the display in driving mode.
  • the rear-view application, when the vehicle is placed in reverse gear with the home screen active, can be launched automatically to display, in the rear-view application display, live video from one or more rear cameras on the back of the vehicle.
  • when the vehicle is taken out of reverse gear, the rear-view application can be automatically closed to terminate the rear-view application display, and the navigation application automatically opened to provide the navigation display.
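A compact sketch of this mode-gated display selection follows; the gear and state names are assumptions for illustration.

```python
# Sketch of mode-gated console display: the home screen is enabled only
# in parked mode, reverse gear launches rear view, and driving mode
# falls back to a restricted display such as navigation.
def select_console_display(wheels_moving: bool, gear: str) -> str:
    parked = (not wheels_moving) and gear == "park"
    if gear == "reverse":
        return "rear_view"       # live video from rear cameras
    if parked:
        return "home_screen"     # full home page of Fig. 30
    return "navigation"          # restricted driving-mode display

assert select_console_display(False, "park") == "home_screen"
assert select_console_display(False, "reverse") == "rear_view"
assert select_console_display(True, "drive") == "navigation"
```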
  • In step 3600, the profile identification module 848 detects a stimulus, such as the vehicle changing operational state, expiration of a selected time period, a door opening or closing, and the like.
  • In step 3604, the profile identification module 848 selects a current vehicle occupant, which can be the driver or a passenger.
  • the profile identification module 848 determines a seating position of the selected vehicle occupant; that is, the module 848 determines which of the driver's seat, front passenger seat, driver's side rear passenger seat, middle rear passenger seat, and passenger's side rear passenger seat the selected vehicle occupant is occupying. This can be done by the user identification module 822.
  • the profile identification module 848 determines the types of information to collect for the persona of the selected occupant.
  • the type of information to be collected can vary by occupant identity, age of occupant, an association of the occupant with the vehicle, and/or occupant seating position.
  • the driving history during the recent vehicle use may be collected for a driver but not a passenger.
  • the driving history during the recent vehicle use may be collected for an adult passenger but not for a child passenger.
  • the driving history during the recent vehicle use may be collected for a vehicle owner but not for a guest driver or driver having a default profile.
  • In step 3616, the profile identification module 848 collects the selected types of information.
  • In step 3620, the profile identification module 848 determines whether there is a next occupant to be selected. If so, the profile identification module 848 returns to step 3604. If not, the profile identification module 848 returns to step 3600 and awaits the next stimulus instance.
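The collection loop of steps 3600 through 3620 can be summarized in the following sketch. The collection rules keyed on role and occupant type are invented for illustration; the specification only requires that the types of information vary by identity, age, association with the vehicle, and seating position.

```python
# Compact sketch of the persona-collection loop (steps 3600-3620).
COLLECTION_RULES = {
    ("driver", "owner"): ["driving_history", "preferences", "calendar"],
    ("driver", "guest"): ["preferences"],
    ("passenger", "adult"): ["driving_history", "preferences"],
    ("passenger", "child"): ["preferences"],
}

def on_stimulus(occupants, collect):
    for occupant in occupants:                       # step 3604
        seat = occupant["seat"]                      # seating position
        key = (occupant["role"], occupant["kind"])   # types to collect
        for info_type in COLLECTION_RULES.get(key, []):
            collect(occupant["name"], seat, info_type)  # step 3616
    # step 3620: all occupants processed; await the next stimulus

on_stimulus(
    [{"name": "Ann", "role": "driver", "kind": "owner", "seat": "driver"},
     {"name": "Ben", "role": "passenger", "kind": "child", "seat": "rear_middle"}],
    collect=lambda *args: print("collect:", args),
)
```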
  • In step 3700, the vehicle control system 204 detects a stimulus, such as the vehicle changing operational state, expiration of a selected time period, a door opening or closing, and the like. In step 3704, the vehicle control system 204 selects a current vehicle occupant.
  • In step 3708, the vehicle control system 204 determines a seating position of the selected occupant.
  • In step 3712, the profile identification module 848 accesses, at the request of the vehicle control system 204, the persona of the selected occupant.
  • the vehicle control system 204 determines vehicle related information.
  • This information typically includes any information collected by an interior or exterior vehicle sensor 104, 242, including a satellite navigation system, such as GPS.
  • the vehicle control system 204 determines other relevant types of information, such as the information referenced above in the examples and illustrations.
  • the other relevant types of information can include, for instance, a persona of a selected person not currently in the vehicle, an instant message, email, voice mail from, missed phone call from, current activity of and/or current location of a friend and/or family member of the selected vehicle occupant, a location, hours of operation, and/or descriptive information about a point and/or location of interest near the vehicle and/or its path of travel, a location, hours of operation, and/or descriptive information about a vehicle service facility near the vehicle and/or its path of travel, a location, hours of operation, and descriptive information about a hotel and/or motel near the vehicle and/or its path of travel, a current location of the friend or family member near the vehicle and/or its path of travel, a road condition such as traffic, road construction, or other delay near the vehicle and/or its path of travel, and the like.
  • This information can be collected from the server 228 over the communication network.
  • the vehicle control system 204 determines whether or not to perform an action. This determination can be based on a rule-based or template- based analysis of the persona of the selected occupant, vehicle related information, and other relevant types of information. The particular rules selected to apply to the collected information to determine whether or not to perform an action and what action to perform can depend on the identification of the selected vehicle occupant and/or seating position of the selected vehicle occupant.
  • Actions that can be performed include those referenced above, including making an appointment or reservation, purchasing an item on line, adding a waypoint to a path of travel on a navigation system, adding an entry into the selected occupant's calendar, changing a destination or path of travel on a navigation system, warning or notifying the occupant, and sending a message to a person regarding an arrival time at a waypoint or destination.
  • the vehicle control system 204 determines if there is a next occupant to be selected in the vehicle. If not, the vehicle control system returns to step 3700 and, if so, to step 3704.
  • In step 3732, the vehicle control system 204 optionally prompts the occupant whether or not the action is to be performed.
  • the profile identification module 848 updates the persona of the selected vehicle occupant to reflect the action performed and the surrounding context.
  • the vehicle control system 204 determines if there is a next occupant to be selected in the vehicle. If not, the vehicle control system returns to step 3700 and, if so, to step 3704.
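The per-occupant action loop of steps 3700 through 3744 can likewise be sketched as follows. The rule, prompt, and perform callables are illustrative stand-ins for the rule-based analysis, user-interface prompt, and action execution the specification describes.

```python
# Sketch of the per-occupant action loop (steps 3700-3744): access the
# persona, gather context, apply rules, optionally prompt, perform the
# action, and update the persona to reflect what was done.
def process_occupants(occupants, rules, prompt, perform):
    for occ in occupants:                                  # steps 3704-3708
        persona = occ["persona"]                           # step 3712
        context = occ["context"]                           # steps 3716-3720
        for rule in rules:                                 # rule analysis
            action = rule(persona, context)
            if action and prompt(occ["name"], action):     # step 3732
                perform(action)
                persona.setdefault("history", []).append(action)  # update

lunch_rule = lambda p, c: ("suggest_restaurant"
                           if c.get("near_lunch_time") and p.get("diet_goals")
                           else None)

process_occupants(
    [{"name": "Ann",
      "persona": {"diet_goals": "low_carb"},
      "context": {"near_lunch_time": True}}],
    rules=[lunch_rule],
    prompt=lambda name, a: True,     # stand-in for the user-interface prompt
    perform=lambda a: print("performing:", a),
)
```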
  • the device discovery daemon 1020 can discover automatically computational devices within the vehicle that connect or attempt to connect to network 356 or communication subsystem 1008 or that disconnect or attempt to disconnect from the network 356 or communication subsystem 1008.
  • the device discovery daemon 1020 discovers automatically computational devices, such as portable user communication devices, located within the vehicle and connects, wirelessly, the portable user communication devices to the network 356 or communication subsystem 1008.
  • the device discovery daemon 1020 may intercept emitted signals from one or more devices in or about the vehicle to pair a device with the network 356 or communication subsystem 1008.
  • the network 356 or communication subsystem 1008 may utilize certain receivers to "listen" for cell tower registration signals, sent messages, sent packets (packet sniffing), etc. From this information, the device discovery daemon 1020 may isolate a MAC address, or other identifier, associated with a device and register the device with the vehicle, a vehicle zone, a user, etc. In some cases, and upon detecting a device signal, the device discovery daemon 1020 may request permission from a user before pairing the device. In one example, the pairing may be initiated by the device discovery daemon 1020 (upon a first registration) to a user's device. Subsequent pairings may be initiated by a user's device to the network 356 or communication subsystem 1008. One or more of Bluetooth, Near Field Communications (NFC), and other protocols may be used to pair a device with the network 356 or communication subsystem 1008.
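A sketch of this passive-discovery and first-registration flow follows. The frame format, registry structure, and permission callback are assumptions for illustration.

```python
# Sketch of passive device discovery: extract a MAC-like identifier from
# an overheard frame, ask permission on first sight, then register the
# device so subsequent pairings need no prompt.
known_devices = {}   # mac -> registration record

def on_sniffed_frame(frame, ask_permission):
    mac = frame["src_mac"]
    if mac in known_devices:
        return known_devices[mac]          # subsequent pairing: no prompt
    if ask_permission(mac):                # first registration: ask user
        known_devices[mac] = {"mac": mac, "zone": frame.get("zone", "unknown")}
        return known_devices[mac]
    return None

print(on_sniffed_frame({"src_mac": "aa:bb:cc:dd:ee:ff", "zone": "zone_2"},
                       ask_permission=lambda mac: True))
```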
  • Fig. 20 depicts an operation of the device discovery daemon 1020.
  • the device discovery daemon 1020 detects a computational device attempting to connect to or otherwise connectable with the vehicle network 356 or communication subsystem 1008 and, applying rules, determines when it is appropriate to connect with the computational device.
  • the computational device may be a module installed on a backplane, a vehicle component and/or module that is attempting to be installed, a portable user communication device, and the like.
  • the daemon can ping the local communication network each time the vehicle starts or each time a vehicle door opens or closes.
  • the daemon can perform a BluetoothTM and WiFiTM device discovery too.
  • the daemon can use plural protocols, such as HOPOPT, ICMP, IGMP, GGP, IP, ST, TCP, EGP, CHAOS, and UDP, to populate a device's folder in the device data with device capabilities.
  • Device capabilities are determined by the opened port(s) the device exposes.
  • the device discovery daemon 1020 in conjunction with the combo controller can determine a spatial location of the computational device as a prerequisite to permitting the computational device, even if properly authenticated, to connect to the vehicle network 356 or communication subsystem 1008.
  • the spatial location can be required to be in an area 508 or set of areas 508 and/or zone or set of zones.
  • the computational device, for example, may be required to be in area two 508B for connection to be accepted. This can be the case for a portable personal communication device, such as a wireless or cellular phone, tablet computer, personal digital assistant, laptop, and the like.
  • the computational device, for example, may be required to be in area one 508A (such as the engine space) and/or area N 508n (such as the trunk).
  • computational devices can have different location requirements as a prerequisite for connectivity to be enabled or accepted. This can be the case for an engine control module, vehicle sensor, or other device that controls or senses a vehicle task, function or operation that, for instance, either connects wirelessly or by a wire-line connection, to the red zone 417, green zone 413, or I/O backplane on the network bus 408.
  • the device discovery daemon 1020 detects a computational device connectable with the vehicle network 356 or communication subsystem 1008.
  • Connectable can refer to the computational device being physically within the vehicle, the computational device having a threshold signal strength or connection quality with the vehicle network, the computational device being configured properly to connect with the vehicle network, and the like. "Connection” can be done either wirelessly or by wired connection.
  • Detection can be done by pinging the computational device and/or receiving a ping from the computational device.
  • To connect to a Wi-Fi LAN, a computational device must be equipped with a wireless network interface controller.
  • the combination of a computer, such as the computer in the computational device, and interface controller is called a station. All stations share a single radio frequency communication channel. Transmissions on this channel are received by all stations within range.
  • the hardware does not signal the user that the transmission was delivered and is therefore called a best-effort delivery mechanism.
  • a carrier wave is used to transmit the data in packets, referred to as "Ethernet frames".
  • Each station is constantly tuned in on the radio frequency communication channel to pick up available transmissions.
  • Device detection can be in response to or triggered by a sensed event other than receipt of a ping from the computational device.
  • device detection can be receipt of information from one or more on board sensors that a new occupant has entered the vehicle. Exemplary information includes a door opening or closing, a successful authentication of an occupant or computational device, a sensed load in a seat, detection of movement within the vehicle, and detection of initiation of a vehicle task, function or operation, such as a key inserted in an ignition, engine start up, and the like.
  • the device discovery daemon 1020 accesses configuration and other information and capabilities of the connected computational device. This information can include device type and/or class, manufacturer name, product and/or device identifier or name, firmware identification, services running, device operating system, network address(es), capabilities, user credentials, and the like.
  • the device discovery daemon 1020 can determine presence of the computational device on any input channel, such as Ethernet, USB, WiFi, Bluetooth, and the like.
  • the device discovery daemon 1020 determines whether or not the computational device is recognized and/or authenticated properly. Recognition can be based on any of the configuration and other information and is deemed to occur when selected fields of the information map to and are the same as similar fields previously encountered and/or recorded in system data 208 by the device discovery daemon 1020. Authentication is deemed to occur when the credentials match stored credentials for the recognized computational device.
  • the device discovery daemon 1020 determines whether the computational device is in a predetermined area or zone of the vehicle for the type of computational device corresponding to the computational device.
  • some wireless protocols, particularly WiFiTM, use an access point (or hotspot) having a defined range, which can be commensurate with the area of the vehicle passenger cabin.
  • the passenger cabin can be surrounded by walls that substantially block radio waves to prevent computational devices external to the passenger cabin from connecting to the network. Multiple overlapping access points can be used to cover larger passenger cabins, such as buses, trains, planes and the like.
  • This determination can be based on received signal strength from the computational device, a satellite-based position of the computational device compared to a satellite-based position of the vehicle or sub-area of the passenger cabin, triangulation based on the relative received signal strengths of multiple access points, camera feedback or image processing of camera or video feed of the passenger compartment, occupant information received from other sensors such as occupant location or presence information received from an infrared sensor 740, motion sensor 744, weight sensor 748, biometric sensor 756, camera sensor 760, audio sensor 764, and associated device sensor 720, user input and the like.
  • User input can be received by a screen pop-up on the computational device requesting the user to designate whether or not and what vehicle the user is currently in and/or what seat position the user has in that vehicle.
  • a level of confidence can be assigned by the device discovery daemon 1020 to the computational device indicating a likelihood or probability that the computational device is located within the passenger compartment. This level of confidence can be based on consideration and analysis of multiple of the factors identified above. When multiple factors indicate that the user is within the passenger compartment, a higher likelihood is assigned to that determination; likewise, when multiple factors indicate that the user is outside of the passenger compartment, a lower likelihood is assigned to the determination. When no or an unsatisfactory response is received from the user in response to a request for user input and/or the level of confidence is below a selected threshold, the user is deemed to be located outside of the predetermined area and/or zone.
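One way such a confidence level could be computed is as a weighted combination of the factors enumerated above. The factor names, weights, and threshold below are assumptions chosen for illustration.

```python
# Sketch of a multi-factor confidence score for "device is inside the
# passenger compartment". Weights and threshold are invented.
FACTOR_WEIGHTS = {
    "strong_signal": 0.25,
    "gps_matches_vehicle": 0.30,
    "camera_sees_device": 0.25,
    "seat_weight_sensor": 0.20,
}
THRESHOLD = 0.5

def inside_confidence(factors: dict) -> float:
    """factors maps factor name -> True (supports inside) / False."""
    return sum(w for name, w in FACTOR_WEIGHTS.items() if factors.get(name))

factors = {"strong_signal": True, "gps_matches_vehicle": True,
           "camera_sees_device": False, "seat_weight_sensor": True}
score = inside_confidence(factors)
print(score, "inside" if score >= THRESHOLD else "outside")  # 0.75 inside
```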
  • a further spatial determination can be made by the device discovery daemon 1020 based on how the computational device connected to the automotive network. If the computational device is hard wire connected, or plugged in, to the automotive network (which can be determined by known techniques), the spatial location of the computational device can be assumed to be within the vehicle regardless of other location mechanisms employed by the device discovery module.
  • a further spatial determination can be made by the device discovery daemon 1020 based on whether or not the computational device has moved within a defined time interval. Movement can indicate that the computational device is not within the vehicle, as a computational device within the vehicle is less likely to move relative to the vehicle than a computational device located outside of the vehicle, where the person carrying the computational device is often moving relative to the position of the vehicle.
  • a further spatial determination can be made by the device discovery daemon 1020 based on whether the signal strength or signal-to-noise ratio associated with a ping or other signal from the computational device varies significantly over a selected time interval.
  • the signal-to-noise ratio associated with signaling from a computational device located within the vehicle would remain relatively constant while the signal-to-noise ratio associated with signaling from a computational device located outside of the vehicle would likely vary significantly as the person associated with the computational device moves relative to the vehicle and/or network 356 or communication subsystem 1008.
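A sketch of this variability heuristic, using the sample variance of SNR readings over a window, follows. The variance threshold is an assumption for illustration.

```python
# Sketch of the signal-variability heuristic: a device whose
# signal-to-noise ratio stays nearly constant over a window is more
# likely to be inside the vehicle.
from statistics import pvariance

SNR_VARIANCE_THRESHOLD = 4.0   # dB^2, illustrative

def likely_inside(snr_samples_db):
    return pvariance(snr_samples_db) < SNR_VARIANCE_THRESHOLD

print(likely_inside([30.1, 29.8, 30.3, 30.0]))   # steady -> True
print(likely_inside([31.0, 24.5, 35.2, 19.8]))   # swinging -> False
```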
  • a further spatial determination can be made by the device discovery daemon 1020 based on an analysis of signal characteristics.
  • Signal characteristics can include, but are not limited to, signal attenuation, where a signal with shrinking or increasing strength may be determined to be moving in relative proximity or position to the vehicle; any Doppler shift in the frequency, which may indicate movement with reference to the vehicle and/or network 356 or communication subsystem 1008; and any kind of delay between receiving the same signal at the various transceivers 260 and/or sensors 242A-N.
  • a difference in the time of receipt can be used to triangulate where the location of the signal originated and if that location is outside the vehicle or inside the vehicle.
  • a further spatial determination can be made by the device discovery daemon 1020 based on an analysis of location information. Beyond the signal characteristics, the device discovery daemon 1020 may receive information from one or more sensors 242A-N to determine a location of the vehicle. If the location of the vehicle is in an area where there is not a likelihood of signal congestion, for example, in the driveway of someone's home, then all received signals may be determined to have originated inside the vehicle. Thus, as signals are received and if the location has changed, the device discovery daemon 1020 may determine whether the current location is an area where more of the received signals may originate outside the vehicle or whether the signals received have changed.
  • the device discovery daemon 1020 may access historical signal data (which is a record of devices that have previously been granted access to the vehicle communications network) to determine if the signals have been received from this device or from this person before.
  • the signal may identify a person documented in the signal data, and the device discovery daemon 1020 may determine if that person has used or connected with the device discovery daemon 1020 previously. Further, the device discovery daemon 1020 can determine if there is movement of the vehicle. If a signal remains within the car after the vehicle moves, then that signal can be determined to be inside the vehicle. For example, if the signal is received at the beginning of a route and continues to be received some time thereafter, then the signal may be determined to be inside the vehicle.
  • sensor data may be analyzed by the device discovery daemon 1020.
  • Sensor data may include such things as determining if there are people and the number of people within a car. Thus, if there are three people in the vehicle and three signals are received, all three signals may be determined to be inside the vehicle. Further, it may be possible for the sensors 242 to determine if a device is currently being used inside the vehicle. For example, if an optical sensor can view a device within its field of vision and/or if an electromagnetic field sensor determines that there is EMF radiation emanating from a location in the vehicle, then the device discovery daemon 1020 can determine that that signal is originating inside a vehicle.
  • a further spatial determination can be made based on the type (or service) of computational device attempting to interact with the automotive network. If the computational device is not the type (or service) of device expected to be positioned within the passenger compartment, its attempt to connect to the automotive network can be disregarded completely.
  • the computational device is the type (or service) of device that can be connected to the vehicle network but is not the type (or service) of device expected to be associated with an occupant (such as a non-infotainment vehicle task, function, or operation control module, sensor module, a module plugged into a backplane (such as an XM radio module), or other type of device not permitted or expected to be associated with an occupant), the spatial determination may be deemed to be satisfied by the computational device being located somewhere within the vehicle even if outside of the passenger compartment. Conversely, if the computational device is the type (or service) of device that would normally be carried by a vehicle occupant, the spatial location query would be satisfied only when the computational device location were within the passenger compartment.
  • mapping of type of computational device to permissible location (e.g., area and/or zone) within the vehicle can be effected in a look up table or other set of rules.
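Such a lookup table might take the following form; the device types and area identifiers below are illustrative stand-ins for the areas 508A-N discussed above.

```python
# Sketch of a lookup table mapping device type to the areas/zones in
# which a connection attempt is acceptable.
PERMISSIBLE_AREAS = {
    "portable_phone": {"area_2"},                        # passenger cabin only
    "engine_control_module": {"area_1"},                 # engine space
    "trunk_sensor": {"area_n"},                          # trunk
    "backplane_module": {"area_1", "area_2", "area_n"},  # anywhere in vehicle
}

def location_requirement_met(device_type: str, detected_area: str) -> bool:
    return detected_area in PERMISSIBLE_AREAS.get(device_type, set())

print(location_requirement_met("portable_phone", "area_2"))   # True
print(location_requirement_met("portable_phone", "area_1"))   # False
```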
  • the device discovery daemon 1020 may receive one or more of these analyses and resolve the information. Thus, device discovery daemon 1020 can cross-correlate information from different analyses to determine if the signal is within the vehicle.
  • Whether or not a communication device is enabled to connect to the vehicle network 356 or communication subsystem 1008 can be based solely on successful authentication.
  • the vehicle on board computer can connect automatically to the owner's home virtual private network to upload and/or download information, settings, and other information, such as user input into the vehicle computer, vehicle driving history (e.g., miles traveled, travel traceroutes, speeds traveled, and locations visited), vehicle service information (such as gas and fluid levels, engine problems, alarms or warnings activated, and the like), input received by on board applications from the user (such as scheduled appointments, notes, documents, and the like), applications downloaded, and the like.
  • the device discovery daemon 1020 determines a spatial location and/or seat position of the computational device to determine appropriate rights and/or privileges and/or restrictions of the user of the computational device as defined by user account or profile associated with the user, the device type of the computational device, the seat position of the user, or area or zone in which the user and/or computational device is located.
  • the device discovery daemon 1020 permits or enables connection of the computational device with the vehicle network 356 or communication subsystem 1008 and optionally stipulates or defines what set of tasks, functions, and/or operations the user of the computational device can perform using the computational device, such as based on the location of the computational device within the vehicle and/or based on the authentication credentials (e.g., the identity of the computational device user).
  • When the device discovery daemon 1020 determines that the computational device is outside of a defined area and/or zone of the vehicle (e.g., outside the passenger compartment or cabin) or does not recognize and/or authenticate the computational device successfully, the device discovery daemon 1020, in step 2024, denies the computational device access to the vehicle network 356 or communication subsystem 1008.
  • the device discovery daemon 1020 operation will now be discussed. Assume that a vehicle owner purchases an XM radio module off the shelf from a vendor. The user then plugs in the module to the vehicle network. In attempting to enable the functionality of the XM radio, the device discovery daemon 1020 first attempts to recognize and/or authenticate the module. During the process, the XM radio module can be permitted to communicate through the vehicle network or an antenna internal to the module with a remote website to confirm that the XM radio was purchased by the vehicle owner and/or perform licensing validation checks. The vehicle owner's XM radio account is likely already associated with the XM radio module when purchased.
  • the trigger event occurs, and the device discovery daemon 1020 and combo controller commence operation.
  • the combo controller downloads, from the user's XM account, the user's XM preferences (e.g., favorites) and causes the XM functionality in the vehicle to be configured accordingly.
  • the combo controller can provide an installation-in-process dialog to the user and, upon successful completion, an installation-complete dialog.
  • a new application icon for the XM radio appears in the upper area of the screen of Fig. 30.
  • the new application icon can have a different appearance than other previously installed application icons to inform the user that the new application is available.
  • When the new application icon is selected, the application screen is shown. The application screen of any application depends on the application and user preferences.
  • the environment 400 of Fig. 4 can interface with three different types of blade processors.
  • a first type of blade processors is installed prior to sale by a manufacturer or vendor. These blade processors are in a manufacturer "crate" and are not replaceable or upgradeable by the customer but only by a certified service facility.
  • a second type of blade processors can be installed post-sale by or on behalf of the customer. These blade processors are in a customer "crate".
  • a third type of blade processors interfaces with the first and second types of blade processors. These blade processors are in an accessory "crate".
  • the blade processors generally refer to executable instructions executing on a microprocessor.
  • Each blade typically includes a corresponding set of functions (such as 802.11 WiFiTM interface, BluetoothTM radio, cellphone radio, storage unit, or satellite position system (such as GPS)), a flash memory, a microprocessor, and a network interface.
  • the function is software or a combination of hardware and software. Examples include a media controller blade, video controller blade, audio controller blade, profile identification module blade, and the like.
  • This configuration may also be used for a wide variety of other software, such as applications, drivers, and the like, and devices, such as an infotainment system, satellite receiving system, and the like and combinations thereof.
  • an aftermarket infotainment system upgrade or component and/or module can be installed by the customer.
  • the infotainment system upgrade or component and/or module searches for, locates, and connects to a suitable antenna installed prior to sale.
  • the output of the blade processor is an IP message framed into an Ethernet packet.
  • the blade processor can be compatible with the Controller Area Network ("CAN") protocol.
  • the CAN control system, which was intended for sensor feedback and controller control signaling through a relatively slow serial bus, can be integrated into the blade processor architecture by terminating CAN buses in one or more Ethernet bus controllers.
  • the output is configured for a discrete Ethernet-based sensor or controller, with the translation to the CAN protocol being made at a CAN Ethernet controller subsystem.
  • a "crate" commonly has a backplane connector and on-backplane Ethernet interfaces that enable blade processors to communicate with one another using Ethernet messages.
  • the crate can be a USB hub having a series of ports to permit devices to connect to the USB hub; therefore, each "crate" generally has a corresponding USB hub and ports connected thereto.
  • a master blade processor can be provided.
  • the master blade processor can be substantially the same as the other blade processors except for scheduling and control functionality.
  • the master blade processor can include a service manager that inventories hardware and software capabilities of components and/or modules available (or connected) through hard wire connections within the vehicle and/or computational devices connected wirelessly to the vehicle local area network (e.g., through a wireless access point or local hot spot maintained by the vehicle control system).
  • the output of the inventory step is a network topology comprising assigned network addresses, component and/or module descriptions and capabilities, operational status, and "next hop" addresses indicating which components and/or modules are adjacent to one another.
  • An application resource table can be maintained that maps application identity, functional description, and/or application type against computational resource requirements and/or capabilities. Requests to the master blade processor allow various applications to access distributed services or automatically disable features not supported by the vehicle's computer system.
  • the master blade processor can coordinate a health check for the various components and/or modules in or connected wirelessly to the vehicle's computer system and be responsible for sending keep-alive messages (status queries or interrogation messages sent at periodic intervals) to the various components and/or modules to evaluate health or identify malfunctions. If no response is received within a predetermined time period after keep-alive message transmission, a malfunction is assumed to exist, as sketched below.
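  • A minimal sketch of such a keep-alive loop follows; the component names, the timeout value, and the transport are assumptions for illustration, and a real implementation would send the queries over the vehicle network:

    import time

    KEEP_ALIVE_TIMEOUT = 2.0  # seconds; the "predetermined time period"

    class Component:
        def __init__(self, name):
            self.name = name
            self.healthy = True

        def query_status(self) -> bool:
            """Stand-in for sending a keep-alive message and awaiting a reply."""
            return True  # pretend the component responded

    def health_check(components):
        for component in components:
            sent_at = time.monotonic()
            responded = component.query_status()
            elapsed = time.monotonic() - sent_at
            # No response within the predetermined period: assume a malfunction.
            component.healthy = responded and elapsed < KEEP_ALIVE_TIMEOUT
        return [c.name for c in components if not c.healthy]

    print("malfunctioning:", health_check([Component("media-blade"), Component("gps-blade")]))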
  • the master blade processor can create a high availability set with a required number of blade processors. A priority table controls which blades are capable of becoming the master blade processor in the event of a malfunction of the current master blade processor. Shadow copies of each high availability component (e.g., device) and/or module (e.g., application) are created and maintained for backup purposes in the event of a malfunction of the corresponding active blade processor. High availability applications write all variables to a file created at application initiation residing in vehicle memory. In the event of a malfunction of a high availability application, the master blade processor reassigns the high availability application to a new blade processor.
  • the application can be reinitiated or refreshed or synchronized (using the shadow file) on the newly assigned blade processor.
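  • As a minimal sketch of this failover behavior (file locations, the JSON format, and class names are assumptions; the patent only specifies that variables are written to a file at application initiation):

    import json
    import os
    import tempfile

    class HAApplication:
        def __init__(self, name: str, blade: str):
            self.name, self.blade = name, blade
            self.state = {}
            # "write all variables to a file created at application initiation"
            self.shadow_path = os.path.join(tempfile.gettempdir(), name + ".shadow")
            self._write_shadow()

        def _write_shadow(self):
            with open(self.shadow_path, "w") as f:
                json.dump(self.state, f)

        def set_var(self, key, value):
            self.state[key] = value
            self._write_shadow()  # keep the shadow copy current

        def fail_over(self, new_blade: str):
            # The master reassigns the app, then synchronizes it from the shadow file.
            with open(self.shadow_path) as f:
                self.state = json.load(f)
            self.blade = new_blade

    app = HAApplication("climate-control", blade="blade-2")
    app.set_var("target_temp_c", 21)
    app.fail_over("blade-5")      # blade-2 malfunctioned
    print(app.blade, app.state)   # blade-5 {'target_temp_c': 21}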
  • the master blade processor is able to redistribute applications to blade processors and/or stop execution of applications to realize a selected level of performance for the entire vehicle computer system.
  • a table can be maintained mapping a network address (e.g., a MAC and/or IPv6 address) of each component and/or module against information about the corresponding component and/or module and/or a pointer to a variable file for the component and/or module in memory. Entries on the table may have a corresponding state of health or be added when the corresponding component and/or module passes a health check or removed when the corresponding component and/or module fails to pass a health check.
  • Upon start up, the master blade processor checks the application resource requirement entries in an application resource table and assigns each application to a blade processor having the most applicable and/or available resources, as sketched below. If an application has specific requirements for execution (e.g., a multimedia infotainment application), the application is assigned to and locked to a specific blade processor (such as a multimedia controller).
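  • A sketch of this start-up assignment pass; the table contents, blade names, and the single "cpu" resource dimension are invented for illustration:

    app_resource_table = [
        {"app": "navigation",   "cpu": 2, "lock_to": None},
        {"app": "infotainment", "cpu": 4, "lock_to": "multimedia-controller"},
        {"app": "diagnostics",  "cpu": 1, "lock_to": None},
    ]
    free_cpu = {"blade-1": 4, "multimedia-controller": 8, "blade-3": 2}

    def assign_all(apps, free):
        assignments = {}
        for entry in apps:
            if entry["lock_to"]:
                blade = entry["lock_to"]  # e.g., multimedia app locked to its controller
            else:
                # pick the blade with the most available resources that still fits
                candidates = [b for b, cpu in free.items() if cpu >= entry["cpu"]]
                blade = max(candidates, key=lambda b: free[b])
            free[blade] -= entry["cpu"]
            assignments[entry["app"]] = blade
        return assignments

    print(assign_all(app_resource_table, free_cpu))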
  • In step 3800, the blade processor detects a stimulus.
  • the stimulus can be a system boot sequence.
  • the master blade processor identifies and inventories enabled on board computational resources (such as each blade processor, IP router, vehicle sensors, database management systems, and the like) and the capabilities thereof.
  • On board components and/or modules are typically physically connected to a network bus.
  • the master blade processor identifies and inventories enabled portable computational resources and the capabilities thereof.
  • Portable components, such as tablet computers and smart phones, are typically wirelessly connected, via a wireless access point, to a network bus.
  • In step 3812, the master blade processor assigns a network address to each enabled resource.
  • In step 3816, the master blade processor maintains and stores the resulting network topology in system data 208.
  • a blade processor is not identified as the "master" blade processor prior to system booting.
  • An application executes on a designated blade processor (such as a blade processor handling multimedia system or media subsystem functions) at boot time to detect the presence of a blade processor having specific capabilities.
  • a variable associated with the designated blade processor is changed from "master" to "slave" and the system reboots, thereby transferring system control to the identified master blade processor.
  • a table can be created controlling a priority of which blade processors are capable of becoming master blade processors in the event of master blade processor malfunction. When a master blade processor malfunctions, a next in priority blade processor is selected as the new master blade processor.
  • the master blade processor detects a stimulus.
  • exemplary stimuli include determining that a component and/or module has failed a health check, receipt of a request for a component and/or module for blade processor resources to execute, and the like.
  • the master blade processor determines the requirements for execution of a selected component and/or module, such as set forth in the application resource requirement table.
  • the selected component and/or module can, for example, be a blade processor, software, such as an application, a component, such as a sensor, a satellite receiving system, and the like, and/or a combination thereof.
  • the master blade processor determines whether a blade processor is available to execute the selected component and/or module. This determination can be made by any number of techniques, and can be based on one or more of: a degree of compatibility of the selected component and/or module with each of the blade processors; a current availability of a blade processor to execute the selected component and/or module (e.g., whether the blade processor is currently executing another component and/or module); a relative importance or priority of the selected component and/or module compared to that of a currently executing component and/or module on a blade processor; whether the blade processor can satisfy the requirements of the selected component and/or module; a current available bandwidth of a blade processor which may be executing another component and/or module; and the like. One possible weighting of these factors is sketched below.
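  • A possible scoring of these availability factors; the weights, field names, and data are assumptions, since the patent does not prescribe a formula:

    def blade_score(blade, module):
        if module["requirements"] - blade["capabilities"]:
            return None  # blade cannot satisfy the module's requirements
        score = 2.0 if blade["current_module"] is None else 0.0  # idle blade preferred
        score += blade["free_bandwidth"]                          # spare capacity
        if blade["current_module"] is not None:
            # an occupied blade is usable only if the new module outranks the current one
            if module["priority"] <= blade["current_priority"]:
                return None
            score -= 1.0  # preemption is possible but costly
        return score

    blades = [
        {"name": "blade-1", "capabilities": {"audio", "video"},
         "current_module": None, "current_priority": 0, "free_bandwidth": 0.8},
        {"name": "blade-2", "capabilities": {"audio"},
         "current_module": "radio", "current_priority": 3, "free_bandwidth": 0.2},
    ]
    module = {"name": "rear-camera", "requirements": {"video"}, "priority": 5}

    scored = [(blade_score(b, module), b["name"]) for b in blades]
    print(max(s for s in scored if s[0] is not None))  # blade-1: idle and video-capable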
  • In step 3912, which is performed when a blade processor is available to execute the selected component and/or module, the master blade processor assigns the selected component and/or module to a blade processor for execution.
  • the assignment is recorded in a resource allocation table which maps application identity and/or description against an identifier of the assigned blade processor. If an application has specific requirements for execution (e.g., a multimedia infotainment application or firewall application), the master blade processor may permanently assign the application (or lock the assignment) to a specific blade processor.
  • the master blade processor can, for unlocked assignments, redistribute or reassign components or modules to blade processors or stop components or modules from executing to realize a maximum performance level for the system and/or ensure that critical vehicle, less critical vehicle, or infotainment tasks, functions, or operations are performed.
  • the master blade processor can stop a lower importance or priority component and/or module from executing in favor of a higher importance or priority component and/or module requiring a blade processor to execute.
  • a critical task, function or operation has a higher level of importance or priority than a less critical vehicle task, function or operation.
  • a less critical vehicle task, function or operation can have a higher level of importance or priority than an infotainment task, function or operation or vice versa.
  • The master blade processor, in next step 3916, updates the network topology.
  • In step 3920, which is performed when a blade processor is unavailable to execute the selected component and/or module, the master blade processor denies the selected component and/or module permission to execute.
  • When the master blade processor determines that a minimum set of blade processor resources and/or capabilities is not operational, it sends a system reset message to clear operator warning circuitry within an allocated timeframe.
  • An integrated circuit board configuration can be provided to enable media streaming to computational devices wirelessly connected to the vehicle network via an access point 456 of the vehicle. In this manner, different media can be provided simultaneously to different user interfaces (e.g., on board display subsystems) of different vehicle occupants.
  • Fig. 21 depicts a configuration of the media controller subsystem 348 for simultaneously streaming multiple channels of media (e.g., multimedia or infotainment content).
  • Media content 2100, such as audio, video, picture, and/or text media, is received by the network transceiver 824 and provided to a common media processing board 2104.
  • the processing board 2104 comprises a network control server 2108, such as a distributed network control ("DNC") server, a media server 2112, such as a digital living network alliance ("DLNA") server, and virtual network console ("VNC") software 2116.
  • the network control server 2108 has an IP address and/or a routable, global unicast address while the media server has a contactable electronic address (such as a media access control or MAC address) on the vehicle network 356 or communication subsystem 1008.
  • each of the network control and media servers 2108 and 2112 comprises a microprocessor and memory for buffering media content, executing application programming, long term storage of program instructions and/or data, and the like.
  • the media server 2112 comprises the functionality of the media controller 804, media processor 808, and signal processing module 828.
  • a user portable (remote) or Bring Your Own Device (BYOD) computational device 2120 is in wireless or wired communication with the board 2104, such as via a wireless protocol (e.g., Bluetooth™, WiFi™, and the like).
  • the VNC software 2116 effectively provides the remote computational device 2120 (e.g., portable communication device) with remote access to the board 2104, which is typically part of the vehicle on board computer (e.g., the console computer).
  • the VNC software can operate in accordance with the RFB (Remote Frame Buffer) protocol on top of the TCP/IP suite of protocols. The result is that the remote computational device appears to the on board computer as if it were part of the on board vehicle computer or vehicle control system 204.
  • This enables the remote computational device to provide commands to the media controller subsystem 348 for desired multimedia content and media presentation features and settings (such as volume, contrast, resolution, channel selection, and the like).
  • the remote computational device must download an application from the vehicle to enable multimedia requests from the remote computational device to pull information from the multimedia controller, where the media can be streamed to its final destination.
  • the commands are provided, by the VNC software 2116, indirectly to the media server 2112 for delivery to the network control server 2108 or directly to the network control server 2108, which retrieves the requested content from an external network (e.g., an Internet or radio network) accessible source.
  • the media may then be provided to a display at the seat position of the user and/or to the BYOD computational device 2120 (via the wireless access point).
  • the media content is provided from local storage in preference to accessing the media content via the Internet.
  • If the requested media is not available locally, it is obtained via the Internet and cached in local storage in parallel with streaming to the user's display and/or BYOD computational device 2120.
  • Each display at each seat position can independently provide a corresponding user with a playlist of audio or video from independent sources.
  • In step 2200, the VNC software 2116 receives a command or request for content from the user of the BYOD computational device.
  • the device has been previously connected successfully by the device discovery daemon or combo controller to the vehicle network 356 or communication subsystem 1008.
  • the combo controller has ports in the crate, implements certain device discovery functions, and packetizes information for transmission via IP over Ethernet.
  • the combo controller commonly installs the correct driver for the discovered device. It then maps the device (e.g., USB) address to a local TCP/IP address for interaction with one of the blade processors.
  • the media server determines, from the system data 208, portable computational device information and capabilities, the identity of the user of the device, an identity of the portable computational device associated with the user, and/or the spatial location of the device (e.g., area and/or zone in which the device is located). This information has been previously determined by the device discovery daemon or combo controller but may be updated by the media server using the location techniques discussed above.
  • the media server applies appropriate filters and restrictions based on the identity of the user (e.g., user account) and/or spatial location of the device (e.g., user context).
  • the filters or restrictions can include, for example, age-related content filters and/or restrictions, seating location filters and/or restrictions, and privacy filters and/or restrictions.
  • the filters and/or restrictions can ban the type of content requested or the source of the content (e.g., a blacklist of web sites), redirect the request to a different source of the content, and/or limit the user to content from approved sources, such as permitted by a whitelist, as sketched below.
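  • A sketch of that filter chain; the lists, account fields, and host names are placeholders:

    BLACKLIST = {"badsite.example"}
    WHITELIST = {"kids.example", "school.example"}
    REDIRECTS = {"video.example": "video-lite.example"}

    def filter_request(source_host: str, account: dict):
        if source_host in BLACKLIST:
            return None                                        # ban the source outright
        source_host = REDIRECTS.get(source_host, source_host)  # redirect if mapped
        if account.get("whitelist_only") and source_host not in WHITELIST:
            return None                                        # limit to approved sources
        return source_host

    child_account = {"whitelist_only": True}
    print(filter_request("kids.example", child_account))     # kids.example
    print(filter_request("badsite.example", child_account))  # None (banned)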
  • Other user account restrictions can be applied by the media server before the content request is passed to the network control server 2108.
  • the owner of the vehicle can configure the multimedia controller menus.
  • the menu allows for all seat positions to access any media that is available through the multimedia controller.
  • the owner can create profiles for different potential vehicle occupants. By creating the profiles, the owner is able to allow or disallow access to media from selected seat positions, including access to media menus (e.g., audio or video playlists, software applications (such as Internet browsers, applications to control passenger compartment climate control functions, games, and the like)).
  • the owner can also limit media and/or media menu access based on a MAC address of a computational device 2120 of an occupant, as in the sketch below. Only devices having a permitted MAC address can access media and/or menus while those not having a permitted MAC address cannot. This can ensure that control is not only position-based but also device-based.
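  • A sketch of this device-based gate; the allowlist contents are hypothetical:

    PERMITTED_MACS = {"a4:5e:60:12:34:56"}  # configured by the vehicle owner

    def device_may_access_media(mac_address: str) -> bool:
        # Only devices whose MAC address appears in the owner-configured
        # allowlist may reach the media and/or media menus.
        return mac_address.lower() in PERMITTED_MACS

    print(device_may_access_media("A4:5E:60:12:34:56"))  # True
    print(device_may_access_media("de:ad:be:ef:00:01"))  # False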
  • the network control server 2108 accesses the filtered content request (which may be altered to reflect user account restrictions) and retrieves the permissible content via the public network.
  • the network control server 2108 can access the media content 2100 from a media source accessed through an external network, such as a radio network or other type of untrusted public network (such as the Internet).
  • the filtered content request may, for example, be for different content than that originally requested, differently formatted content than that originally requested, differently sourced content (e.g., from a different web server or site) than that originally requested, and the like.
  • the network control server 2108 receives the permissible content from a publicly networked source (e.g., web server or site) and passes the retrieved content to the media server 2112. Based on user account restrictions and/or user context, the media server applies appropriate filters and restrictions to remove or filter out impermissible content.
  • the filters or restrictions can include, for example, age-related content filters and/or restrictions, seating location filters and/or restrictions, and privacy filters and/or restrictions.
  • the combo controller connects via IP over Ethernet and assigns a local IP address to the BYOD device connected to a port in the consumer crate.
  • the USB layer of the combo controller packetizes information from the BYOD device and communicates with a USB access program that executes on a multimedia blade processor.
  • the USB access program de-packetizes the information and connects to the USB I/O subsystem of the multimedia blade processor to transfer the information to either local storage or directly to one of the output devices associated with the appropriate seating position.
  • Other user account restrictions can be applied by the media server before the content is passed by the media server to the user. For example, when the user is driving the vehicle, video can be removed automatically from the content to avoid distracting the driver, so that only the audio channel accompanying the video channel is provided to the user. In another example, in response to determining that the requesting user is driving the vehicle rather than being a (non-driving) passenger, screen magnification can be applied automatically to the visual content to assist the driver of the vehicle; that is, the user is the driver and larger font is easier to see than smaller font.
  • In response to determining that the requesting user is driving the vehicle rather than being a (non-driving) passenger, the visual content is presented or rendered automatically in a large font and/or icon size (e.g., without changing screen resolution) compared to the font and/or icon size for the content displayed to a (non-driving) passenger, to reduce driver distraction.
  • the media server 2112 determines that the user is driving the car rather than being a (non-driving) passenger and automatically enables or disables removal of all unnecessary animations from the content to be displayed.
  • the media server 2112 determines that the user is driving the car rather than being a (non-driving) passenger and enables automatic removal of background images from, while leaving one or more displayed foreground images in, the content to be displayed. In yet another example, the media server 2112 determines that the user is driving the car rather than being a (non-driving) passenger and automatically enables or renders high contrast (e.g., between text and background colors) to make the displayed content more visible to the user.
  • the feature and/or setting and/or filters could be applied to the displayed content only when the vehicle is in motion or in forward or reverse gear and not parked even when the user is the driver.
  • any of the above features and/or settings and/or filters can be a standard feature and/or setting and/or filter and can be applied to the on board vehicle display for the driver but not to the on board vehicle displays of other (non- driving) passengers.
  • the feature or setting or filter would not be applied to the on board vehicle display of a (non-driving) passenger.
  • the feature and/or setting and/or filter would be applied to the on board vehicle display for the driver only when the vehicle is in motion or in forward or reverse gear but not when the vehicle is parked.
  • the media server 2112 configures the filtered media for the capabilities of the user device and/or on board vehicle display and in accordance with user preferences.
  • the media server 2112 directly provides, or streams, the media stream or indirectly provides, or streams, the media stream via the VNC software 2116 to the user computational device 2120 and/or on board vehicle display for presentation to the user. Stated another way, the network control server 2108 provides the media content 2100 while the media server 2112 provides the channel for the media content 2100 to be provided to the user computational device 2120.
  • the media controller subsystem 348 can handle multiple (e.g., eight or more) media and/or multimedia streams simultaneously by differing channels.
  • the streams of information may be separated and served on different channels via one or more of WiFi, Bluetooth, NFC, and other communications protocols.
  • Multiple BYOD computational devices can be served simultaneously and can mix and match or otherwise control what is streamed and where it is streamed.
  • the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 can be reconfigured when the vehicle changes state from parked or substantially motionless to being in motion or in forward or reverse gear, to facilitate the driver's ability to view visual content and avoid driver distractions.
  • Other devices or user interfaces 212, 248, user interface (s)/input interface(s) 324 and/or I/O modules 312 of other vehicle occupants or passengers typically are not reconfigured even when the vehicle is in gear or motion.
  • When the vehicle returns to the parked or substantially motionless state, the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 of the driver returns to a different configuration, which is typically the same configuration as the devices or user interfaces 212, 248, user interface(s)/input interface(s) 324 and/or I/O modules 312 of other vehicle occupants or passengers.
  • the media controller subsystem 348 can remove video from the content displayed to the driver.
  • a still image derived from one or more frames of the video content can replace the video component in the displayed content.
  • the media controller subsystem 348 applies screen magnification automatically to the visual content to assist the driver of the vehicle; that is, the user is the driver and larger font is easier to see than smaller font.
  • the screen magnifier is software that interfaces with a computer's graphical output to present enlarged screen content.
  • the simplest form of magnification presents an enlarged portion of the original screen content, the focus, so that it covers some or all of the full screen. This enlarged portion should include the content of interest to the user and the pointer or cursor, also suitably enlarged. As the user moves the pointer or cursor the screen magnifier should track with it and show the new enlarged portion.
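  • A minimal model of that focus-tracking behavior; the screen dimensions and zoom factor are arbitrary:

    SCREEN_W, SCREEN_H = 1280, 720
    ZOOM = 2.0

    def focus_rect(pointer_x, pointer_y):
        # The source region that, scaled by ZOOM, fills the whole screen.
        w, h = SCREEN_W / ZOOM, SCREEN_H / ZOOM
        # Center the focus on the pointer, clamped to the screen edges,
        # so the magnified view tracks pointer movement.
        x = min(max(pointer_x - w / 2, 0), SCREEN_W - w)
        y = min(max(pointer_y - h / 2, 0), SCREEN_H - h)
        return (x, y, w, h)

    print(focus_rect(100, 100))  # near the corner: focus pinned to (0, 0)
    print(focus_rect(640, 360))  # centered pointer: centered focus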
  • the media controller subsystem 348 presents or renders the content automatically in a large font and/or icon size (e.g., without changing screen resolution) to reduce driver distraction.
  • The large font and/or icon size, unlike screen magnification, applies to all, and not simply a portion of, displayed content.
  • the media controller subsystem 348 initiates a screen reader.
  • a screen reader is a software application that attempts to identify and interpret what is being displayed on the screen (or, more accurately, sent to standard output, whether a video monitor is present or not). This interpretation is then represented to the user with text-to-speech and/or sound icons. Screen readers can be used in combination with the screen magnifier to read the content within the enlarged portion of the screen content.
  • screen narration in the vehicle can describe the layout of a current display, including a current position, page, functionality, etc., of displayed content and the position of a digit of the user and/or cursor relative to displayed objects.
  • the names of the displayed objects in proximity to the current digit and/or cursor location can be audibly identified.
  • a user can identify a displayed object to be located, and the media controller subsystem 348 can use a "hotter", "hot", "colder", and "cold" game to inform the user where his or her digit is, and how his or her digit is moving, relative to the identified object.
  • the media controller subsystem 348 can tell the user that "you're getting warmer" as the user's digit moves closer to the identified object.
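  • A sketch of that "hotter/colder" narration; the thresholds and phrases are illustrative, and audio output is left as a print statement:

    import math

    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def narrate(previous_pos, current_pos, target_pos) -> str:
        before = distance(previous_pos, target_pos)
        now = distance(current_pos, target_pos)
        if now < 10:
            return "you're on it"
        if now < before:
            # a big improvement earns the stronger phrase
            return "you're getting warmer" if now < before * 0.5 else "warmer"
        return "colder"

    print(narrate((300, 300), (120, 110), (100, 100)))  # you're getting warmer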
  • the media controller subsystem 348 determines that the user is driving the car rather than being a (non-driving) passenger and automatically removes all unnecessary animation effects, such as fading effects (e.g., fade windows), from the displayed content.
  • the media controller subsystem 348 automatically removes background images from the displayed content while leaving one or more displayed foreground images.
  • the removed background image can be replaced with a uniform background, such as a commonly colored or white background.
  • the media controller subsystem 348 enables
  • the media controller subsystem 348 automatically disables automatic arrangement of windows when the mouse cursor is moved to the edge of the screen.
  • the media controller subsystem 348 automatically enables activating a window by hovering over it with the mouse cursor.
  • the media controller subsystem 348 automatically enables keyboard web page navigation.
  • the media controller subsystem 348 automatically enables or renders high contrast (e.g., between text and background colors) to make the displayed content more visible to the user.
  • the media controller subsystem 348 automatically changes the color and transparency of window borders (making the borders easier to see).
  • the media controller subsystem 348 automatically changes the thickness of focus rectangle (around the currently selected item in a dialog box).
  • the media controller subsystem 348 automatically changes the color, size, and/or thickness of the on-screen mouse pointer.
  • the media controller subsystem 348 automatically changes the keyboard settings (e.g., how long a key must be pressed before the keyboard character starts repeating, the speed at which keyboard characters repeat, the rate at which the cursor blinks, etc.).
  • In yet another example, the media controller subsystem 348 automatically ignores colors, font styles, and font sizes used on web pages, or formats web pages using a user-specified or default style sheet.
  • the media controller subsystem 348 automatically enables or renders in the content an increased size of a mouse-selectable screen element or object to provide a larger target.
  • the media controller subsystem 348 automatically enables mouse keys to move the mouse pointer or cursor.
  • the media controller subsystem 348 automatically enables one or more of sticky keys (which allow the user to enter a combination of multiple keys without having to hold a first key down when he or she depresses a second key), toggle keys (which play an alert each time the user presses the Caps Lock, Num Lock, or Scroll Lock keys), and filter keys (which cause the computer to ignore keystrokes that occur in rapid succession or keystrokes held down for several seconds unintentionally).
  • the media controller subsystem 348 automatically causes the driver's device or user interface 212, 248, user interface (s)/input interface(s) 324 and/or I/O module 312 to be configured to enable blind typing.
  • With blind typing, the driver can write naturally with a finger on the interface, and the interface can determine, from the natural written script, a command or request of the driver. This configuration typically requires the interface to be configured as a touchpad. While other systems require the driver to focus away from the road, either by selecting letters from menus or on a displayed keyboard, blind typing permits the driver to input commands or requests to the vehicle control system without having to visually focus on any screen.
  • any of the above features and/or settings and/or filters can be a standard feature and/or setting and/or filter and can be applied to the on board vehicle display for the driver but not to the on board vehicle displays of other (non- driving) passengers.
  • the feature or setting or filter would not be applied to the on board vehicle display of a (non-driving) passenger.
  • the feature and/or setting and/or filter would be applied to the on board vehicle display for the driver only when the vehicle is in motion or in forward or reverse gear and not when the vehicle is parked.
  • a user interface such as the driver's device or user interface 212, 248, user interface (s)/input interface(s) 324 and/or I/O module 312, may be provided that includes an area for navigation by a user.
  • the user may be provided with one or more of sounds, audio narration, haptic/tactile feedback, visual indicator (e.g., not associated with the UI), and the like.
  • a location in the UI area may represent a button and/or other icon.
  • a sound and/or tactile feedback may be presented to the user to indicate where the user's pointer (e.g., one or more finger, hand, etc.) is located.
  • the sound presented may be a ring (e.g., a bell), and a vibration may be provided to the UI.
  • screen narration by a screen reader can be provided, which informs conversationally the user where his or her digit is located.
  • the narration can be to recite the name corresponding to the button.
  • a frequency of sound may change to indicate a position in the UI area (e.g., the lower left portion of the UI area may provide a low pitch and/or frequency sound, while a position in the upper right portion of the UI area may have a higher pitch and/or frequency sound, and portions in between can have different pitches and/or frequencies, etc., similar to the Theremin musical instrument), as sketched below.
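  • A sketch of that position-to-pitch mapping; the frequency range and the equal blending of the two axes are assumptions:

    LOW_HZ, HIGH_HZ = 220.0, 880.0

    def position_to_pitch(x: float, y: float, width: float, height: float) -> float:
        # Normalize so (0, 0) is the lower left and (width, height) the upper
        # right, then blend the two axes equally into a single frequency.
        t = ((x / width) + (y / height)) / 2.0
        return LOW_HZ + t * (HIGH_HZ - LOW_HZ)

    print(position_to_pitch(0, 0, 800, 480))      # 220.0 Hz, lower left
    print(position_to_pitch(800, 480, 800, 480))  # 880.0 Hz, upper right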
  • a graphical UI or GUI may allow a user to interact with certain displayed buttons, features, and/or menus while a vehicle is stationary, but may restrict access and/or interaction when the vehicle is moving.
  • This restricted interaction can involve greying out buttons associated with a phone feature when the vehicle moves.
  • the display can be completely turned off or blanked out when moving, and a user may still be able to interact with features. Because the user will not be distracted by the visual presentation of elements, the user is free to drive while interacting.
  • a GUI may be used where the display of the GUI is turned off to provide the above described functionality.
  • a display navigation mode can be selectively triggered by the user to implement any of the above configurations and/or features.
  • the user can provide a command, such as by voice or gesture, to enter that mode.
  • any of the above assistive configurations or features can be implemented.
  • voice navigation is activated and the residence time of the user's touch on the touch screen required to select a contacted icon is lengthened to avoid incidental or accidental icon selection as the user moves his or her hand/digit around the touch screen.
  • the icon will be activated after touch contact for the longer residence time.
  • the touch screen can return to normal operating mode with a shorter residence time for icon activation.
  • the user can alternatively use a roller or track ball and selector to effect narrated screen navigation.
  • the user contacts an arbitrary and substantially planar or flat control surface, such as an arm rest, dash, steering wheel, glove box door, center or rear center console, window (e.g., windshield), passenger compartment roof, seat surface, door panel, or other electrically, magnetically, electromagnetically, and optically nonreactive surface to activate icons on the activated or deactivated vehicle computer center or rear center display.
  • the control surface can be any arbitrary surface selected by the user. This can be done by optically or visually separating the surface into segments, each segment corresponding to a segment on the display. The movement of the user's hand or digit is tracked optically over the surface with reference to the segments.
  • the user can be audibly told where his or her hand is relative to the display.
  • the user can leave his or her hand/digit at that location for a selected residence time to cause the icon in the corresponding segment of the display to be activated.
  • the activated or deactivated vehicle computer center or rear center display can be any of the driver's devices or user interfaces 212, 248, user interface(s)/input interface(s) 324 and/or I/O modules 312 that include an area for navigation by a user.
  • Figure 34 depicts the display of Fig. 30 with defined segments overlaid on the display. This is done to show the process of mapping the control surface to the displayed objects.
  • Fig. 33 shows the same defined segments positioned on the control surface 3300. Each segment of the control surface 3300 corresponds to an information-containing and/or activatable or selectable object of the current display.
  • the type, identity, and/or arrangement of displayed information and objects changes, requiring the display to be remapped to the control surface 3300. Remapping typically requires the defined segments to be redefined and reconfigured to reflect the new display configuration.
  • cell 3304a corresponds to display field comprising driver name 3028, cell 3304b to satellite reception indicator 3032, cell 3304c to WiFi connectivity indicator 3034, cell 3304d to cellular connectivity indicator 3036, cell 3304e to display field comprising weather information 3038, cell 3304f to display field comprising date 3040, cell 3034g to icon 3380, cell 3034h to navigation ("Nav" icon) 3016, cell 3034i to media ("Media" icon) 3018, cell 3034j to phone ("Phone" icon) 3020, cell 3034k to rear view ("Rear View" icon) 3022, cell 3034l to other applications icon 3024, cell 3034m to icon 3384, cell 3034n to "my dash" icon 3026, cell 3034o to navigation display field 3088, cell 3034zz to suggested icon, cell 3034p to set destination icon, cell 3034q to display
  • the user's digit points at 3304h which corresponds to the navigation icon 3016 of the home page 3000 of Fig. 30.
  • the media controller subsystem 348 determines that the user desires to activate the navigation icon 3016, which causes a different display to be rendered.
  • the new display is remapped to the control surface so that the dividing lines on the control surface mirror the dividing lines overlain on the new display.
  • the user's digit points at 3304f which corresponds to the display field for date and time of the home page 3000 of Fig. 30.
  • a screen reader reads the date and time to the user.
  • the user's digit points at 3304dd which corresponds to the display field for appointment of the home page 3000 of Fig. 30.
  • a screen reader reads the displayed appointment information to the user.
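  • A sketch of this cell-to-object resolution; the grid size, surface dimensions, and object names are illustrative, and a real system would rebuild CELL_MAP whenever the display is remapped:

    CELL_MAP = {
        (0, 0): "driver name", (0, 1): "satellite indicator",
        (1, 0): "Nav icon",    (1, 1): "Media icon",
    }
    SURFACE_W, SURFACE_H = 300.0, 200.0  # arbitrary control surface, in mm
    ROWS, COLS = 2, 2

    def resolve(digit_x: float, digit_y: float) -> str:
        # Quantize the tracked digit position into a grid cell, then look
        # up the displayed object bound to that cell.
        row = min(int(digit_y / (SURFACE_H / ROWS)), ROWS - 1)
        col = min(int(digit_x / (SURFACE_W / COLS)), COLS - 1)
        return CELL_MAP[(row, col)]

    print(resolve(200.0, 150.0))  # "Media icon"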
  • Optical tracking of the user's digit enables the location of the digit to be determined. Optical tracking typically determines in real-time the position of the digit by tracking the positions of either active or passive infrared markers attached to the object. The position of the point of reflection is determined using the camera sensors 760, motion sensors 744, and/or infrared sensors 740. In one configuration, the user's digit is equipped with markers. Markers can be light reflectors ("passive markers", e.g., retroreflectors) or light emitters ("active markers", e.g., LEDs). To also measure the orientation of a body, several (greater than or equal to 3) of these markers have to be arranged at a known geometry.
  • the camera sensors 760 scan a certain volume and detect the light that comes from the markers. These images are processed to identify and calculate potential marker positions (in image coordinates). The result of each measurement is a set of coordinates that describe the position of the markers, and hence the position and orientation of the body carrying the markers.
  • the cell dividing lines can be projected onto the control surface by the camera sensors 760.
  • the position of a selected digit of the user can be tracked using image processing of images of the control surface with the cell boundary or dividing lines in the images.
  • Examples of other techniques include video tracking, such as blob tracking, kernel-based tracking, contour tracking, visual feature matching, match moving, motion capture, motion estimation, Swistrack, and single particle tracking.
  • the media controller subsystem 348 detects a stimulus, such as ignition of the engine, motion of the vehicle, the vehicle being placed in gear, a gesture or other activation command from the user, and the like.
  • the media controller subsystem 348 maps the current display of a selected one of the driver's devices or user interfaces 212, 248, user interface(s)/input interface(s) 324 and/or I/O modules 312 to a selected control surface. This can include proportionally sizing the control surface to the display and determining cell and cell boundary locations. The sizes of the cells are commonly proportional to the sizes of the corresponding displayed objects.
  • the media controller subsystem 348 determines, by one or more of the motion sensors 744, infrared sensors 740, and camera sensors 760, whether a user's digit has moved relative to the stationary cell boundaries and cells on the mapped control surface.
  • the media controller subsystem 348 detects or determines, by one or more of the motion sensors 744, infrared sensors 740, and camera sensors 760, the position of the digit on the control surface relative to the cells and cell boundaries.
  • In step 3516, the detected position of the digit is mapped to the currently displayed content. This is done by mapping the cell where the tip of the digit is positioned to the corresponding displayed content on the display.
  • In step 3520, the media controller subsystem 348 performs an action based on the detected digit position.
  • the action depends on the type of displayed content.
  • the media controller subsystem 348 activates the corresponding task, function or operation.
  • a screen reader in the media controller subsystem 348 reads the displayed content to the user.
  • Some operations are drag-and-drop operations where content is moved by the user's digit from one location to another, such as a destination or waypoint being dragged from one location and dropped into a set destination or set waypoint box.
  • the UI may not include any display at all. Because any surface can receive input, as provided herein, a display is not needed. Examples of this type of display-less UI can be implemented on any surface that has a substantially flat or planar area or surface.
  • the UI area may include a capacitive and/or resistive touch interface region.
  • the UI area may be a surface that includes an area visible from at least one image sensor (e.g., camera, etc.).
  • Other examples of a display-less UI may include a volume of space in which a user can move a pointer.
  • a user may navigate the non-displayed UI elements (e.g., icons, buttons, etc.) using sound, haptic feedback, etc.
  • UI elements e.g., icons, buttons, etc.
  • a stimulus is detected by the media controller subsystem 348 based on input from one or more vehicle sensors.
  • An exemplary stimulus includes a change in vehicle operating state. Common changes in vehicle operating state include a change from “parked” to “forward gear” and vice versa, from “parked” to “reverse gear” and vice versa, from “forward gear” to “reverse gear” and vice versa, and from stationary to in motion and vice versa.
  • the media controller subsystem 348 selects a device or user interface 212, 248, user interface (s)/input interface(s) 324 and/or I/O module 312 and determines whether or not it is associated with the driver. This is typically determined by determining whether or not the selected device or user interface 212, 248, user interface (s)/input interface(s) 324 and/or I/O module 312 is located in an area and/or zone occupied by the driver. Location of the device or user interface 212, 248, user interface (s)/input interface(s) 324 and/or I/O module 312 can be determined using the techniques noted above.
  • When the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 is located in the area and/or zone of the driver, the media controller subsystem 348, in step 2608, reconfigures the user interface to enable or disable (depending on the initial and final states) one of the features, settings, or filters discussed above.
  • Otherwise, the media controller subsystem 348 does not reconfigure the user interface to enable or disable (depending on the initial and final states) one of the features, settings, or filters discussed above.
  • the media controller subsystem 348 selects a next device or user interface 212, 248, user interface (s)/input interface(s) 324 and/or I/O module 312 and returns to decision diamond 2604.
  • the media controller subsystem 348 can audibly warn of or describe an image detected via an infrared sensor, motion sensor or camera sensor that is in the path of the vehicle.
  • image processing and image recognition techniques can be used to identify the object image and compare the image to a projected path of the vehicle.
  • An exemplary technique acquires the image, pre-processes the acquired image, performs feature extraction from the pre-processed image, performs detection and/or segmentation using the extracted features, and performs high level processing and decision making.
  • the path of the vehicle as determined by a steering wheel setting and a projected path of the center line of the vehicle for that setting along with a width of the vehicle can be compared against the position of the object.
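  • A rough version of that comparison follows; the geometry is simplified to a straight-line projection, and the vehicle width, lookahead distance, and coordinate convention are assumptions:

    import math

    VEHICLE_WIDTH = 1.9  # meters, illustrative

    def object_in_path(obj_x, obj_y, heading_rad, lookahead=20.0):
        """Object position is in vehicle coordinates (origin at the front
        bumper); heading_rad is derived from the steering wheel setting."""
        # Project the object onto the center line and its perpendicular.
        along = obj_x * math.cos(heading_rad) + obj_y * math.sin(heading_rad)
        lateral = -obj_x * math.sin(heading_rad) + obj_y * math.cos(heading_rad)
        # In the path if it is ahead of the vehicle and within half the width.
        return 0.0 <= along <= lookahead and abs(lateral) <= VEHICLE_WIDTH / 2

    print(object_in_path(10.0, 0.3, heading_rad=0.0))  # True: ahead, near center
    print(object_in_path(10.0, 2.5, heading_rad=0.0))  # False: outside the width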
  • the vehicle control system 204 can manage user accounts that stipulate, control, and define rights and privileges of each user of the vehicle and vehicle network. Different users and/or devices may include different accounts. In some cases these accounts may include user and/or device-specific privileges.
  • a system user has exclusive rights and privileges over all communication device firewall rules and running software.
  • the system user can control selected vehicle tasks, functions and operations and infotainment via predetermined or preconfigured gestures.
  • a manufacturer or vendor user can have local or remote access to on board control system data and alter the parameters and settings in the vehicle control system and to perform diagnosis and repair.
  • a dashboard user can have rights to launch a dashboard application and authenticate guest or default users and change permissions to trusted friends or family members. Dashboard users can read on board diagnostic system data but cannot overwrite or change it.
  • a user can be both a system and dashboard user.
  • An Internet source interacting with the vehicle has only HTTP rights to respond to HTTP requests received from the vehicle. The requests can target different user data but these are filtered by default user accounts.
  • the back seat displays are part of this group of users.
  • the back seat displays can be smart WiFi removable communication devices that can be removed from the vehicle while within WiFi range of the communication network.
  • a child user may wish to change the temperature in his zone of the vehicle.
  • the child may have an account that allows access to this feature (but not other more critical features).
  • the child may change the temperature via the device or other control.
  • the child may not be entitled to change the feature for an area or zone other than that in which he or she is currently located, as in the sketch below.
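  • A sketch of such a zone-scoped permission check; the field names and zone labels are placeholders:

    child_account = {
        "user": "child",
        "allowed_features": {"climate"},  # but not critical vehicle functions
        "zone": "rear-left",
    }

    def may_control(account: dict, feature: str, target_zone: str) -> bool:
        if feature not in account["allowed_features"]:
            return False
        # The occupant may only change the feature in the zone he or she occupies.
        return target_zone == account["zone"]

    print(may_control(child_account, "climate", "rear-left"))   # True
    print(may_control(child_account, "climate", "front-left"))  # False
    print(may_control(child_account, "steering", "rear-left"))  # False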
  • a driver may have a broader account and privileges. In this case, the driver may control critical features, and even tasks, operations, and functions, of a vehicle. Remote control of a vehicle or infotainment task, operation, or function through the on board vehicle computer and by a portable or handheld computational device, such as a cellular phone or tablet computer, interconnected with the vehicle computer, can be permitted for some users but not others.
  • a privileged user can control an infotainment or climate control function on the dashboard from the back seat of the vehicle.
  • the control of the vehicle or infotainment task, function, or operation may be made via a device or user interface 212, 248, user interface (s)/input interface(s) 324 and/or I/O module 312.
  • occupant control of a (typically non- or less critical) vehicle task, function, or operation can be done using an on board (or vehicle mounted) computational device or a portable computational device discrete from and not mounted on the vehicle.
  • a user in the rear passenger seat of a vehicle may wish to take control of specific features that are allowed in that zone.
  • Such features may include controlling the climate, changing a radio station or infotainment presentation, opening a window, setting a volume level, setting a screen contrast and/or resolution, and/or adjusting a seat position.
  • the control of these features may be provided to the user in the rear passenger seat via a display, gesture control region, or other device associated with the zone in which the rear passenger is seated.
  • Associated devices may include the user's smart-phone, tablet, computer, and/or other device.
  • a vehicle occupant can perform a first set of vehicle tasks, functions and operations when in a first area and/or zone and a different set of vehicle tasks, functions, and operations when in a different second area and/or zone.
  • a vehicle occupant can drive the car or perform another critical vehicle task, function or operation when seated in the driver's seat but not when seated in a different seat.
  • When the vehicle occupant is in the back seat, he or she is unable to be a back seat driver of the vehicle or perform another critical vehicle task, function or operation.
  • a vehicle occupant can perform a first set of vehicle tasks, functions and operations when the vehicle is in a first operating state and a different set of vehicle tasks, functions, and operations when the vehicle is in a second operating state.
  • The vehicle occupant, for instance, can watch a video on the driver's center console or display subsystem when the vehicle is parked and/or stationary (not in motion) but cannot watch the video on the driver's center console or display subsystem when the vehicle is in gear and/or in motion.
  • a different authentication procedure is used for the vehicle occupant when requesting a first set of vehicle tasks, operations, and functions than when requesting a second set of vehicle tasks, operations, and functions.
  • When the vehicle occupant is entitled to drive the vehicle, a different and higher or more stringent set of authentication requirements can be required than when the vehicle occupant is not entitled to drive the vehicle.
  • multiple authentication techniques may be used in the former case but only one in the latter case.
  • the vehicle control system 204 manages user accounts, which contain user credentials, such as user identifiers, passwords, user biometric data (e.g., fingerprints, retina images, facial characteristics, user weight, and other physical characteristics), and rights and privileges of the user in performing vehicle tasks, functions, and operations, both critical and non-critical.
  • the vehicle control system 204 can manage and enforce accounts for each user of the vehicle.
  • Each account specifies rights and privileges for the corresponding user and/or security/authentication requirements and/or personal settings of the user (such as seat and lumbar settings, climate control settings, lighting settings, configuration of instrument cluster, rear view mirror settings, driving modes (such as fuel economy, sport, city, and the like), media channel settings or presets, media delivery preferences, music genre preferences, scheduled programs, playlists, cloud-based data associated with the user (such as iCloud™ of Apple™, Outlook™, and the like, with examples of cloud-based data being electronic calendar, email, contacts, media content, texts, voice mail messages, and the like), application-specific personalization and selections, display settings and configurations, and the like).
  • an owner of the vehicle would have the right and privilege not only to access infotainment media but also to access and control critical and non- or less critical vehicle tasks, functions, and operations. Because he or she is entitled to the highest level of rights and privileges, his or her authentication requirements may be more demanding than those of another vehicle occupant not having such rights and privileges.
  • A child of the vehicle owner, on the other hand, would have a very restricted or no right to access and control vehicle tasks, functions, and operations (except for climate control, seat, and/or window settings) and restricted rights to access infotainment media.
  • a child filter could be configured by the vehicle owner to prevent the child from accessing age- restricted media.
  • a guest of the vehicle owner having no corresponding account would be provided with a default account having default and restricted rights and privileges to access vehicle and infotainment tasks, functions, and operations, and settings and configurations, which could be expanded based on input from the owner. This could be done in response to a determination by the vehicle control system 204 that the vehicle occupant is not recognized (or authenticated).
  • the vehicle control system 204 could provide the owner with multiple default accounts, one for an adult with driving privileges, one for an adult without driving privileges, and one for a child, and request the owner select the appropriate default account for the occupant.
  • An identifier of the occupant and optionally authentication credentials would be inputted by the owner, and the vehicle control system 204 could store image processing information of the occupant's facial or other image, biometric information, and/or occupant weight for purposes of authentication.
  • the owner could define, in an account created or modified by the owner, the rights and privileges of the person associated with the account.
  • An operation of the vehicle control system 204 is shown in Fig. 27.
  • In step 2700, the vehicle control system 204 detects a stimulus, such as a driver prompt, presence of an occupant, and the like.
  • the presence of an occupant can be sensed by seat weight sensors, image processing analysis, motion detection, proximity of a key fob, and/or biometric information.
  • the vehicle control system 204 determines whether or not each of the vehicle occupants has been authenticated successfully. Authentication can be performed by any suitable technique and using any suitable input, such as by one or more of receiving and validating user credentials, sensing an occupant's weight in a seat (with the weight being within a weight range predefined for a known occupant), image processing performed on the facial characteristics of the various vehicle occupants (and comparing the sensed facial characteristics against those for known occupants), biometrics (such as a fingerprint image of the occupant sensed by the occupant touching a touchscreen display, a retinal scan, or heart rhythm as received by conductive elements on a vehicle component such as the steering wheel or gear shift) (which are compared against corresponding stored biometrics for known occupants), and the like.
  • An unauthenticated occupant could be recognized by sensing an occupant's weight in a seat (with the weight being different from weight ranges predefined for each known occupant), image processing performed on the facial characteristics of the various vehicle occupants (with the sensed facial characteristics failing to match stored facial characteristics for known occupants), unrecognized biometrics (which do not correspond to biometrics for known occupants), and the like.
  • Authentication using image processing applied to video images of occupants in the vehicle can be performed by any suitable technique.
  • facial recognition occurs in two stages, namely face detection and recognition.
  • the face detection stage attempts to obtain an acceptable image from the video stream of the vehicle occupants to use in the face recognition stage.
  • the face recognition process uses the image obtained from the detection stage and compares it with a database of known faces. When the facial image obtained from the detection stage matches a known stored facial image, authentication is successful. When the facial image obtained from the detection stage fails to match a known stored facial image, authentication is unsuccessful.
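  • A toy two-stage flow along these lines, using OpenCV's stock Haar cascade for detection and a deliberately crude histogram comparison as a stand-in for a real recognition model (the crop size and the 0.9 threshold are assumptions, not values from the patent):

    import cv2

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def detect_face(frame):
        """Stage 1: return the first acceptable face crop from a video frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return None
        x, y, w, h = faces[0]
        return cv2.resize(gray[y:y + h, x:x + w], (128, 128))

    def recognize(face, known_faces, threshold=0.9):
        """Stage 2: compare the crop against the database of known faces."""
        hist = cv2.calcHist([face], [0], None, [64], [0, 256])
        for name, known in known_faces.items():
            known_hist = cv2.calcHist([known], [0], None, [64], [0, 256])
            if cv2.compareHist(hist, known_hist, cv2.HISTCMP_CORREL) > threshold:
                return name  # authentication successful
        return None          # no match: authentication unsuccessful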
  • image processing can be augmented by other security mechanisms for higher level privileges, such as authentication by user credentials (e.g., pattern lock, PIN, or other password).
  • the vehicle control system 204 retrieves the corresponding account for the successfully authenticated user.
  • the vehicle control system 204 next determines the area and/or zone (e.g., seat position) occupied by the authenticated user, such as by using any of the techniques discussed herein, including without limitation image processing information of the various occupants, sensed seat weight, user input, location coordinates received from a portable computational device, such as a cell phone, tablet computer, or personal digital assistant associated with the user, and the like.
  • the vehicle control system 204 may also determine an operating state of the vehicle, e.g., in motion, parked, in gear, in neutral, and the like.
  • In step 2716, based on the location (e.g., seat position) of the corresponding authenticated user and/or the operating state of the vehicle, the vehicle control system 204 enables the eligible rights, privileges, settings, and configurations set forth in the corresponding account.
  • In step 2720, the vehicle control system 204 creates an account for the unauthenticated user as set forth above.
  • the account can be a default account.
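For illustration only, the rights enablement of step 2716 might be keyed to the occupant's account, seat zone, and vehicle operating state as in the following sketch; the account structure, zone labels, and right names are hypothetical:

```python
# Hypothetical sketch: enable only the rights an authenticated occupant's
# account permits for the occupant's zone and the vehicle's operating state.
from dataclasses import dataclass, field
from typing import Dict, Set, Tuple

@dataclass
class Account:
    user_id: str
    # rights keyed by (zone, vehicle_state), e.g. ("driver_seat", "parked")
    rights: Dict[Tuple[str, str], Set[str]] = field(default_factory=dict)

def enable_rights(account: Account, zone: str, vehicle_state: str) -> Set[str]:
    """Return the set of tasks the occupant may control right now."""
    return account.rights.get((zone, vehicle_state), set())

# Example: a driver gets navigation control only while parked.
acct = Account("alice", {
    ("driver_seat", "parked"): {"navigation", "media_playback"},
    ("driver_seat", "in_motion"): {"media_playback"},
})
assert enable_rights(acct, "driver_seat", "in_motion") == {"media_playback"}
```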
  • Another application of the vehicle control system 204, authentication by facial recognition, is shown in Figs. 28-29.
  • In step 2800, the vehicle control system 204 detects a stimulus or trigger event.
  • the vehicle control system 204 provides, via the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312, the user with a face detection screen, such as that shown in Fig. 29.
  • The face detection screen 2900 includes a dashed outline 2904 of a human face having eye holes 2908a and 2908b to enable the user to align his or her face with the screen for a higher quality video image of the user's face.
  • This screen 2900 prompts the user to align his or her face such that the facial recognition camera can obtain a good image.
  • the screen can show the video feed from the facial recognition camera with the dashed outline as an overlay.
  • The dashed outline is intended to assist users in situating their faces properly within the camera's field of view.
  • An instruction line 2912 can appear at the top of the screen 2900.
  • If a proper facial image is not obtained, the vehicle control system 204 proceeds to a manual sign-in step (discussed below) and provides the user with a face recognition failure notification dialog over the screen 2900 of the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 in step 2812. If a proper facial image is obtained, the vehicle control system 204, in step 2816, searches through a repository of stored facial images to find a match and, during the matching, provides the user with a recognition progress indicator modal on the screen 2900.
  • When recognition is not successful or no match is found, the vehicle control system 204 proceeds to step 2812 and the manual sign-in step discussed below.
  • When a match is found, the vehicle control system 204 proceeds to step 2820 and applies the personal settings of the user, such as any or all of the settings referenced above.
  • a settings application progress modal can be displayed over the displayed screen.
  • In step 2812, the vehicle control system 204 provides the user with a manual sign-in screen (not shown) requesting input of the user's credentials.
  • The manual sign-in screen, for example, can include fields for a user identifier and password and options to sign in or continue as a guest.
  • If the user elects to continue as a guest, the vehicle control system 204 can, in step 2828, create a guest or default account for the user as discussed above and apply default settings.
  • If the user's credentials are validated, the vehicle control system 204, in step 2820, applies the personal settings of the user.
  • After steps 2820, 2824, or 2828, the vehicle control system 204 proceeds to step 2832 and provides a home screen page, such as home page 3000, configured in accordance with the applicable settings.
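Purely as a condensed, hypothetical rendering of the flow of Figs. 28-29 (facial recognition first, manual sign-in as a fallback, a guest account as a last resort, ending at the configured home screen), the following sketch uses placeholder callbacks for the interfaces described above:

```python
# Hypothetical sketch of the sign-in flow; the callbacks stand in for the
# face detection screen, repository search, and credential validation.
def sign_in_flow(obtain_face_image, match_face, prompt_credentials,
                 validate_credentials, apply_settings, show_home_screen):
    image = obtain_face_image()                  # face detection screen 2900
    user = match_face(image) if image else None  # step 2816: search repository
    if user is None:                             # step 2812: manual sign-in
        creds = prompt_credentials()
        user = validate_credentials(creds) if creds else None
    if user is None:                             # step 2828: guest account
        user = "guest"
    apply_settings(user)                         # step 2820 (or defaults)
    show_home_screen(user)                       # step 2832: home page 3000
```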
  • the profile data 252 can include, for each occupant in a corresponding user profile, physical impairment information, such as information regarding a disability.
  • the physical impairments include vision impairments (e.g., low vision, blindness, night and color blindness, and light sensitivity), hearing impairments (such as hearing loss, hard-of-hearing, and deafness), dexterity and mobility impairments (such as caused by arthritis, cerebral palsy, multiple sclerosis, loss of limb or digit, spinal cord injury, and repetitive stress injury), and language and/or communication impairments (such as aphasia, delayed speech, dyslexia, and other conditions resulting in difficulties remembering, solving problems, and/or perceiving sensory information).
  • System data 208 can include rules implemented by the vehicle control system 204 to provide an accessibility or assistive technology. This can include reconfiguring, for example, the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312. Once the user having the impairment is located in the vehicle or correlated with a particular seat position and/or device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312, the accessibility or assistive technology can be applied to the corresponding device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312.
  • the type of accessibility or assistive technology employed can depend not only on the type and capabilities of the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 but also on the seat position of the impaired user. Specifically, when the impaired user is a driver of the vehicle, he or she may receive different accessibility or assistive technology compared to when he or she is a non-driving vehicle occupant or passenger.
  • the type of assistance provided by the vehicle control system 204 depends on the particular impairment and/or disability involved. For example, if the vehicle control system 204 were to determine that the user has a particular impairment and/or disability, the vehicle control system 204 can recommend and/or implement automatically other feature or device settings and/or accessible technology that may provide improved access for the user.
  • Information, commands, warnings, and requests provided by way of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to users with vision impairments can be one or more of the use of screen magnification, high contrast (e.g., between text and background colors, such as white text on a black background), large font size and/or icon size (e.g., without changing screen resolution), color changes on the screen, a screen reader (or other text-to-speech program), speech recognition software (such as to operate the computer and/or software), enablement of a read mode, keyboard web page navigation, and the like.
  • Information, commands, warnings, and requests provided by way of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to users with a hearing impairment can be one or more of the use of text or visual alternatives for sounds, high volume levels, changed computer sounds, sign language interpretation or translation (e.g., by image processing and acquisition based on visual images captured by one or more camera sensors), a text phone application, and the like.
  • Information, commands, warnings, and requests provided by way of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to users with a dexterity and/or mobility impairment can be one or more of the use of particular mouse settings (e.g., mouse button configuration and timing (such as double-click speed), to make the mouse pointer more visible, and to alter the scroll speed of the mouse wheel or how quickly the mouse pointer responds to movements of the mouse), an increased size of a mouse-selectable screen element to provide a larger target, mouse keys to move the mouse pointer, sticky keys, toggle keys, filter keys, keyboard shortcuts, access keys, keyboard settings, an on-screen keyboard, speech recognition software (such as to dictate into almost any application (e.g., the user can dictate documents and email, surf the web, and control selected vehicle tasks, functions, and operations by voice command)), disablement of the automatic arrangement of windows when the mouse cursor is moved to the edge of the screen, enablement of activating a window by hovering over it with the mouse cursor, and the like.
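For illustration only, the selection of assistive technologies from a profile's impairment information and the occupant's seat position might resemble the following sketch; the impairment labels and setting names are hypothetical examples:

```python
# Hypothetical mapping from impairment information and seat position to
# the assistive technologies applied to that occupant's interface.
VISION_DRIVER = {"screen_reader", "high_contrast", "large_font"}
VISION_PASSENGER = VISION_DRIVER | {"screen_magnification"}
HEARING_ANY = {"visual_alerts", "high_volume", "text_phone"}
DEXTERITY_ANY = {"sticky_keys", "large_targets", "speech_recognition"}

def assistive_settings(impairments: set, seat: str) -> set:
    settings = set()
    if "vision" in impairments:
        # drivers and passengers can receive different accommodations
        settings |= VISION_DRIVER if seat == "driver" else VISION_PASSENGER
    if "hearing" in impairments:
        settings |= HEARING_ANY
    if "dexterity" in impairments or "mobility" in impairments:
        settings |= DEXTERITY_ANY
    return settings

assert "screen_reader" in assistive_settings({"vision"}, "driver")
```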

Abstract

Methods, systems, and a computer readable medium are provided for maintaining a persona of a vehicle occupant and, based on the persona of the vehicle occupant and vehicle-related information, performing an action assisting the vehicle occupant. Methods, systems, and a computer readable medium are also provided for a vehicle containing multiple blade processors for performing vehicle and/or infotainment tasks, functions, and operations. The blade processors can be included in a crate having a first communication zone defining a trusted network within the vehicle to connect with trusted computational devices and/or modules provided or certified by the vehicle manufacturer but not untrusted computational devices and/or modules provided by vehicle occupants, a second communication zone defining an untrusted network to connect with the untrusted computational devices and/or modules, and a third communication zone providing power and data transmission to the blade processors. A master blade processor can assign a component and/or module requiring a blade processor for execution to a selected blade processor.

Description

INTELLIGENT VEHICLE FOR ASSISTING VEHICLE OCCUPANTS AND INCORPORATING VEHICLE CRATE FOR BLADE PROCESSORS CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefits of and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Application Serial Nos. 61/811,981, filed on April 15, 2013, entitled "Functional Specification for a Next Generation Automobile"; 61/865,954, filed on August 14, 2013, entitled "Gesture Control of Vehicle Features"; 61/870,698, filed on August 27, 2013, entitled "Gesture Control and User Profiles Associated with Vehicle Features"; 61/891,217, filed on October 15, 2013, entitled "Gesture Control and User Profiles Associated with Vehicle Features"; 61/904,205, filed on November 14, 2013, entitled "Gesture Control and User Profiles Associated with Vehicle Features";
61/924,572, filed on January 7, 2014, entitled "Gesture Control and User Profiles
Associated with Vehicle Features"; and 61/926,749, filed on January 13, 2014, entitled "Method and System for Providing Infotainment in a Vehicle." The entire disclosures of the applications listed above are hereby incorporated by reference, in their entirety, for all that they teach and for all purposes.
[0002] This application is also related to U.S. Patent Application Nos. 13/420,236, filed on March 14, 2012, entitled, "Configurable Vehicle Console"; 13/420,240, filed on March 14, 2012, entitled "Removable, Configurable Vehicle Console"; 13/462,593, filed on May 2, 2012, entitled "Configurable Dash Display"; 13/462,596, filed on May 2, 2012, entitled "Configurable Heads-Up Dash Display"; 13/679,459, filed on November 16, 2012, entitled "Vehicle Comprising Multi-Operating System" (Attorney Docket No. 6583-228); 13/679,234, filed on November 16, 2012, entitled "Gesture Recognition for On-Board Display" (Attorney Docket No. 6583-229); 13/679,412, filed on November 16, 2012, entitled "Vehicle Application Store for Console" (Attorney Docket No. 6583-230);
13/679,857, filed on November 16, 2012, entitled "Sharing Applications/Media Between Car and Phone (Hydroid)" (Attorney Docket No. 6583-231); 13/679,878, filed on
November 16, 2012, entitled "In-Cloud Connection for Car Multimedia" (Attorney Docket No. 6583-232); 13/679,875, filed on November 16, 2012, entitled "Music Streaming" (Attorney Docket No. 6583-233); 13/679,676, filed on November 16, 2012, entitled "Control of Device Features Based on Vehicle State" (Attorney Docket No. 6583-234); 13/678,673, filed on November 16, 2012, entitled "Insurance Tracking" (Attorney Docket No. 6583-235); 13/678,691, filed on November 16, 2012, entitled "Law
Breaking/Behavior Sensor" (Attorney Docket No. 6583-236); 13/678,699, filed on November 16, 2012, entitled "Etiquette Suggestion" (Attorney Docket No. 6583-237); 13/678,710, filed on November 16, 2012, entitled "Parking Space Finder Based on Parking Meter Data" (Attorney Docket No. 6583-238); 13/678,722, filed on November 16, 2012, entitled "Parking Meter Expired Alert" (Attorney Docket No. 6583-239);
13/678,726, filed on November 16, 2012, entitled "Object Sensing (Pedestrian
Avoidance/Accident Avoidance)" (Attorney Docket No. 6583-240); 13/678,735, filed on November 16, 2012, entitled "Proximity Warning Relative to Other Cars" (Attorney Docket No. 6583-241); 13/678,745, filed on November 16, 2012, entitled "Street Side Sensors" (Attorney Docket No. 6583-242); 13/678,753, filed on November 16, 2012, entitled "Car Location" (Attorney Docket No. 6583-243); 13/679,441, filed on November 16, 2012, entitled "Universal Bus in the Car" (Attorney Docket No. 6583-244);
13/679,864, filed on November 16, 2012, entitled "Mobile Hot Spot/Router/Application Share Site or Network" (Attorney Docket No. 6583-245); 13/679,815, filed on November 16, 2012, entitled "Universal Console Chassis for the Car" (Attorney Docket No. 6583- 246); 13/679,476, filed on November 16, 2012, entitled "Vehicle Middleware" (Attorney Docket No. 6583-247); 13/679,306, filed on November 16, 2012, entitled "Method and System for Vehicle Data Collection Regarding Traffic" (Attorney Docket No. 6583-248); 13/679,369, filed on November 16, 2012, entitled "Method and System for Vehicle Data Collection" (Attorney Docket No. 6583-249); 13/679,680, filed on November 16, 2012, entitled "Communications Based on Vehicle Diagnostics and Indications" (Attorney Docket No. 6583-250); 13/679,443, filed on November 16, 2012, entitled "Method and System for Maintaining and Reporting Vehicle Occupant Information" (Attorney Docket No. 6583-251); 13/678,762, filed on November 16, 2012, entitled "Behavioral Tracking and Vehicle Applications" (Attorney Docket No. 6583-252); 13/679,292, filed November 16, 2012, entitled "Branding of Electrically Propelled Vehicles Via the Generation of Specific Operating Output" (Attorney Docket No. 6583-258); 13/679,400, filed November 16, 2012, entitled "Vehicle Climate Control" (Attorney Docket No. 6583-313);
13/840,240, filed on March 15, 2013, entitled "Improvements to Controller Area Network Bus" (Attorney Docket No. 6583-314); 13/678,773, filed on November 16, 2012, entitled "Location Information Exchange Between Vehicle and Device" (Attorney Docket No. 6583-315); 13/679,887, filed on November 16, 2012, entitled "In Car Communication Between Devices" (Attorney Docket No. 6583-316); 13/679,842, filed on November 16, 2012, entitled "Configurable Hardware Unit for Car Systems" (Attorney Docket No. 6583- 317); 13/679,204, filed on November 16, 2012, entitled "Feature Recognition for Configuring a Vehicle Console and Associated Devices" (Attorney Docket No. 6583-318); 13/679,350, filed on November 16, 2012, entitled "Configurable Vehicle Console" (Attorney Docket No. 6583-412); 13/679,358, filed on November 16, 2012, entitled "Configurable Dash Display" (Attorney Docket No. 6583-413); 13/679,363, filed on November 16, 2012, entitled "Configurable Heads-Up Dash Display" (Attorney Docket No. 6583-414); and 13/679,368, filed on November 16, 2012, entitled "Removable, Configurable Vehicle Console" (Attorney Docket No. 6583-415). The entire disclosures of the applications listed above are hereby incorporated by reference, in their entirety, for all that they teach and for all purposes.
BACKGROUND
[0003] Whether using private, commercial, or public transport, the movement of people and/or cargo has become a major industry. In today's interconnected world, daily travel is essential to engaging in commerce. Commuting to and from work can account for a significant portion of a traveler's day. As a result, vehicle manufacturers have begun to focus on making this commute, and other journeys, more enjoyable.
[0004] Currently, vehicle manufacturers attempt to entice travelers to use a specific conveyance based on any number of features. Most of these features focus on vehicle safety or efficiency. From the addition of safety-restraints, air-bags, and warning systems to more efficient engines, motors, and designs, the vehicle industry has worked to appease the supposed needs of the traveler. Recently, however, vehicle manufacturers have shifted their focus to user and passenger comfort as a primary concern. Making an individual more comfortable while traveling instills confidence and pleasure in using a given vehicle, increasing an individual's preference for a given manufacturer and/or vehicle type.
[0005] One way to instill comfort in a vehicle is to create an environment within the vehicle similar to that of an individual's home. Integrating features in a vehicle that are associated with comfort found in an individual's home can ease a traveler's transition from home to vehicle. Several manufacturers have added comfort features in vehicles such as the following: leather seats, adaptive and/or personal climate control systems, music and media players, ergonomic controls, and, in some cases, Internet connectivity. However, because these manufacturers have added features to a conveyance, they have built comfort around a vehicle and failed to build a vehicle around comfort.
SUMMARY
[0006] There is a need for a vehicle ecosystem, which can integrate both physical and mental comforts, while seamlessly communicating with current electronic devices to result in a totally intuitive and immersive user experience. These and other needs are addressed by the various aspects, embodiments, and/or configurations of the present disclosure. Also, while the disclosure is presented in terms of exemplary and optional embodiments, it should be appreciated that individual aspects of the disclosure can be separately claimed.
[0007] A method can be provided by the present disclosure that includes the steps of:
[0008] providing, in a vehicle, first and second operating systems executing on a common microprocessor, wherein the first operating system comprises one or more applications performing a critical vehicle task, function, and/or operation and the second operating system comprises one or more applications performing an infotainment task, function, and/or operation;
[0009] collecting, by a computer control module, one or more metrics regarding an operation of the first and/or second operating system and/or computer and/or a network in communication with the computer;
[0010] determining, by the computer control module, whether the collected one or more metrics are normal and/or abnormal; and
[0011] when the collected one or more metrics are not normal and/or are abnormal, shutting down the second operating system but not the first operating system.
[0012] In a vehicle, first and second operating systems can execute on a common microprocessor, the first operating system can comprise one or more applications performing a critical vehicle task, function, and/or operation and the second operating system can comprise one or more applications performing an infotainment task, function, and/or operation, and a tangible and non-transient computer readable medium in the vehicle can include microprocessor executable and readable instructions that, when executed by the microprocessor, can perform operations including:
[0013] collecting, by a computer control module, one or more metrics regarding an operation of the first and/or second operating system and/or a network in communication with the computer;
[0014] determining, by the computer control module, whether the collected one or more metrics of one or more of the first and/or second operating system and/or a network in communication with the computer are normal and/or abnormal; and
[0015] applying one or more of the following rules:
[0016] when the collected one or more metrics associated with the first operating system are not normal and/or are abnormal, shutting down the first and second operating systems;
[0017] when the collected one or more metrics associated with the second operating system are not normal and/or are abnormal, shutting down the second, but not the first, operating system; and
[0018] when the collected one or more metrics associated with the network are not normal and/or are abnormal, shutting down the second, but not the first, operating system.
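Purely as a hedged illustration of rules [0016]-[0018], the computer control module's shutdown logic might be expressed as follows; the metric keys and shutdown callbacks are hypothetical:

```python
# Hypothetical sketch of the shutdown rules for the dual-OS platform.
def apply_shutdown_rules(metrics: dict, shutdown_os1, shutdown_os2):
    """metrics maps "os1" / "os2" / "network" to True when normal."""
    if not metrics.get("os1", True):        # abnormal critical-vehicle OS
        shutdown_os1()
        shutdown_os2()
    elif not metrics.get("os2", True):      # abnormal infotainment OS
        shutdown_os2()                      # critical OS keeps running
    elif not metrics.get("network", True):  # abnormal network behavior
        shutdown_os2()                      # isolate infotainment only

# Example: only the infotainment operating system is shut down.
events = []
apply_shutdown_rules({"os1": True, "os2": False, "network": True},
                     shutdown_os1=lambda: events.append("os1 down"),
                     shutdown_os2=lambda: events.append("os2 down"))
assert events == ["os2 down"]
```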
[0019] A vehicle can include:
[0020] a first computer having a first operating system executing one or more applications performing a critical vehicle task, function, and/or operation; and
[0021] a second computer having a different second operating system executing one or more applications performing at least one of an infotainment task, function, and/or operation and less critical task, function, and/or operation, wherein at least one of the following can be true:
[0022] (a) a computer control module collects one or more metrics regarding an operation of the first and/or second computers and/or a network in communication with the first and/or second computer, determines whether the collected one or more metrics regarding an operation of the first and/or second computers and/or network are normal and/or abnormal, and applies one or more of the following rules:
[0023] when the collected one or more metrics associated with the first operating system are not normal and/or are abnormal, shutting down the first and second operating systems;
[0024] when the collected one or more metrics associated with the second operating system are not normal and/or are abnormal, shutting down the second, but not the first, operating system; and
[0025] when the collected one or more metrics associated with the network are not normal and/or are abnormal, shutting down the second, but not the first, operating system; and
[0026] (b) the first and second computers are configured to be incapable of
communicating with each other.
[0027] Each member of the set of critical vehicle tasks, functions, and/or operations can be different from each member of the set of infotainment tasks, functions, and/or operations, whereby the intersection of the two sets is null.
[0028] The critical task, function or operation can be one or more of monitoring, controlling, and/or operating the ECU, TCU, door settings, window settings, and/or blind spot monitor, monitoring, controlling, and/or operating the safety equipment, monitoring and/or controlling certain critical sensors such as the power source controller and energy output sensor, engine temperature, oil pressure sensing, hydraulic pressure sensors, sensors for headlight and other lights, vehicle control system sensors, and/or
steering/torque sensor, controlling the operation of the engine, head light control unit, power steering, display panel, switch state control unit, power control unit, and/or brake control unit, and/or issuing alerts to a user and/or remote monitoring entity of potential problems with a vehicle operation.
[0029] The infotainment task, function or operation can be one or more of receiving, processing, and/or providing media and/or multimedia content.
[0030] The first operating system can be simulated by a first virtual machine and the second operating system by a second virtual machine and the computer control module can be in or configured as a hypervisor module.
[0031] The first and second operating systems can concurrently and independently run on a common kernel.
[0032] A first requirement applied to the first operating system to determine abnormal operation can be different from a second requirement applied to the second operating system to determine abnormal operation.
[0033] The present disclosure can include a method, vehicle, and/or tangible and non- transient computer readable medium comprising the steps, operations, and/or functions of:
[0034] (a) determining, by a microprocessor executable device discovery daemon, that a computational device is connected to or attempting to connect to a network and/or communication subsystem of a vehicle;
[0035] (b) in response, determining, by the device discovery daemon, whether the computational device is located within a predetermined area and/or zone of the vehicle; and
[0036] (c) applying, by the device discovery daemon, at least the following rules:
[0037] (C1) when the computational device is located within the predetermined area and/or zone of the vehicle, permitting the computational device to access or attempt to access the vehicle network and/or communication subsystem; and
[0038] (C2) when the computational device is not located within the predetermined area and/or zone of the vehicle, not permitting the computational device to access or attempt to access the vehicle network and/or communication subsystem.
[0039] A type of the computational device can determine a specific predetermined area and/or zone of the vehicle, from among a plurality of predetermined areas and/or zones, to be used in applying the rules.
[0040] The computational device can be one or more of a tablet computer, laptop, smart phone, and personal digital assistant.
[0041] The specific predetermined area and/or zone of the vehicle can be at least part of the passenger compartment.
[0042] The device discovery daemon can perform sub-steps of step (a) including:
[0043] receiving, by the device discovery daemon, information from an on board vehicle sensor that a new occupant has entered the vehicle;
[0044] in response to the receipt of the information, emitting, by the device discovery daemon, a ping to discover the computational device; and
[0045] when a responsive signal is received from the computational device,
determining, by the device discovery daemon, that the computational device is attempting to connect to a network and/or communication subsystem of a vehicle.
[0046] The determining step/operation/function (b) can base the determination on whether the computational device is located within the predetermined area and/or zone of the vehicle on one or more of signal strength of a signal from the computational device as received by an access point of the vehicle, a received satellite-based position of the computational device, triangulation based on relative received signal strengths of a signal from the computational device as received by multiple access points of the vehicle, image processing of images of the predetermined area and/or zone, occupant presence and/or location information received by an on board vehicle sensor, whether the computational device is attempting to connect to the network and/or communication subsystem wirelessly or by hard wire connection, whether the computational device has moved relative to a selected access point during a defined time interval, whether the received signal strength of signaling from the computational device at a selected access point varies temporally, a type or service of the computational device, and input received from a user of the computational device.
[0047] When the computational device is determined to be located within the predetermined area and/or zone of the vehicle and is permitted to access or attempt to access the vehicle network and/or communication subsystem, the device discovery daemon can determine a set of tasks, functions, and/or operations that can be performed and a set of tasks, functions, and/or operations that cannot be performed based on the determined location of the computational device.
[0048] The device discovery daemon can determine a level of confidence that the computational device is located within the predetermined area and/or zone, and the device discovery daemon can determine that the computational device is located within the predetermined area and/or zone when the level of confidence has at least a threshold value.
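For illustration only, the confidence-thresholded admission test of the device discovery daemon might resemble the following sketch; the zone label, threshold value, and data structure are hypothetical:

```python
# Hypothetical sketch of the device discovery daemon's admission test:
# estimate the device's in-vehicle location with a confidence score and
# admit it to the vehicle network only above a threshold.
from dataclasses import dataclass

@dataclass
class LocationEstimate:
    zone: str          # e.g. "passenger_compartment"
    confidence: float  # 0.0 .. 1.0, fused from signal strength, GPS, imaging

def admit_device(estimate: LocationEstimate,
                 permitted_zone: str = "passenger_compartment",
                 threshold: float = 0.8) -> bool:
    return estimate.zone == permitted_zone and estimate.confidence >= threshold

assert admit_device(LocationEstimate("passenger_compartment", 0.9))
assert not admit_device(LocationEstimate("outside_vehicle", 0.99))
```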
[0049] A method, vehicle, and/or computer executable instructions can be provided that perform at least the following steps, operations, and functions:
[0050] (a) determining, by a microprocessor executable media controller subsystem, that a user is driving a vehicle; and
[0051] (b) in response, the microprocessor executable media controller subsystem performing one or more of the following steps:
[0052] (i) removing a video channel but not an audio channel from media content to be displayed on a screen associated with the user;
[0053] (ii) applying screen magnification to content displayed on the screen associated with the user;
[0054] (iii) reconfiguring the screen and/or content to provide a large font and/or icon size for the displayed content compared to the displayed content when the user is not driving the vehicle;
[0055] (iv) removing unnecessary animations from the content to be displayed on the screen;
[0056] (v) removing background images from, while leaving at least one foreground image in, the content to be displayed on the screen;
[0057] (vi) compared to the displayed content when the user is not driving the vehicle, reconfiguring the screen and/or content to provide higher contrast to make the displayed content more visible to the user;
[0058] (vii) initiating a screen reader to audibly describe currently displayed content to the user;
[0059] (viii) compared to the period for notification dialog boxes to remain open on the screen when the user is not driving the vehicle, enabling a longer period for notification dialog boxes to remain open on the screen;
[0060] (ix) changing the color and/or transparency of window borders on the displayed content;
[0061] (x) changing the thickness of a focus rectangle around a currently selected object in a dialog box displayed on the screen;
[0062] (xi) changing the color, size and/or thickness of an on-screen mouse pointer displayed on the screen;
[0063] (xii) changing a keyboard setting of a keyboard displayed on the screen;
[0064] (xiii) formatting a web page, in the content to be displayed, differently from the web page format received from a web server;
[0065] (xiv) compared to the size of a mouse-selectable screen object when the user is not driving the vehicle, increasing a size of a mouse-selectable screen object to provide a larger target;
[0066] (xv) enabling mouse keys to move the mouse cursor on the screen;
[0067] (xvi) enabling one or more of sticky keys, toggle keys, and filter keys; and
[0068] (xvii) enabling the screen to receive input written by the user's finger as an inputted command or request.
[0069] The user can be determined to be driving the vehicle when the user is determined to be in a driver's seat of the vehicle.
[0070] The user can be determined to be driving the vehicle when the vehicle is at least one of in gear and in motion.
[0071] The method/vehicle/instructions can further perform the following steps, operations, and functions:
[0072] (c) determining, by the media controller subsystem, that the user is no longer driving the vehicle; and
[0073] (d) in response to step (c), the media controller subsystem performing one or more of the following steps:
[0074] (i) no longer removing a video channel from media content to be displayed;
[0075] (ii) no longer applying screen magnification to content to be displayed;
[0076] (iii) reconfiguring the display to provide a smaller font and/or icon size for the content to be displayed compared to the font and/or icon size used when the user is driving the vehicle;
[0077] (iv) no longer removing unnecessary animations from the content to be displayed;
[0078] (v) no longer removing background images from the content to be displayed;
[0079] (vi) reconfiguring the screen to provide lower contrast compared to the contrast used when the user is driving the vehicle;
[0080] (vii) disabling the screen reader;
[0081] (viii) compared to the period for notification dialog boxes to remain open on the screen when the user is driving the vehicle, enabling a shorter period for notification dialog boxes to remain open on the screen;
[0082] (ix) compared to the color and/or transparency of window borders on the displayed content when the user is driving the vehicle, changing the color and/or transparency of window borders on the displayed content;
[0083] (x) compared to the thickness of a focus rectangle around a currently selected object in a dialog box displayed on the screen when the user is driving the vehicle, changing the thickness of a focus rectangle around a currently selected object in a dialog box displayed on the screen;
[0084] (xi) compared to the color, size and/or thickness of an on-screen mouse pointer displayed on the screen when the user is driving the vehicle, changing the color, size and/or thickness of an on-screen mouse pointer displayed on the screen;
[0085] (xii) compared to the keyboard setting of a keyboard displayed on the screen when the user is driving the vehicle, changing a keyboard setting of a keyboard displayed on the screen;
[0086] (xiii) no longer formatting a web page, in the content to be displayed, differently from the web page format received from a web server;
[0087] (xiv) compared to the size of a mouse-selectable screen object when the user is driving the vehicle, decreasing a size of a mouse-selectable screen object;
[0088] (xv) disabling mouse keys to move the mouse cursor on the screen;
[0089] (xvi) disabling the one or more of sticky keys, toggle keys, and filter keys; and
[0090] (xvii) disabling the screen from receiving input written by the user's finger as an inputted command or request.
[0091] The user can be determined not to be driving the vehicle when the vehicle is at least one of not in gear, parked, and not in motion.
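For illustration only, the reconfiguration steps above can be viewed as a reversible set of accommodations toggled by the driving determination of paragraphs [0069], [0070], and [0091]; the setting names in this sketch are hypothetical:

```python
# Hypothetical sketch of the media controller subsystem toggling
# driver-distraction accommodations on entering/leaving a driving state.
DRIVING_ACCOMMODATIONS = {
    "remove_video_channel", "screen_magnification", "large_font",
    "no_animations", "high_contrast", "screen_reader", "mouse_keys",
}

def reconfigure_ui(ui_settings: set, in_gear: bool, in_motion: bool) -> set:
    driving = in_gear or in_motion  # treated as driving per [0069]-[0070]
    if driving:
        return ui_settings | DRIVING_ACCOMMODATIONS
    return ui_settings - DRIVING_ACCOMMODATIONS  # revert per [0071]-[0090]

settings = reconfigure_ui(set(), in_gear=True, in_motion=False)
assert "screen_reader" in settings
settings = reconfigure_ui(settings, in_gear=False, in_motion=False)
assert "screen_reader" not in settings
```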
[0092] The user can be determined to be in a driver's seat of the vehicle when the user is located within a predetermined area and/or zone.
[0093] The determining step (a) can base the determination on whether the user is located within the predetermined area and/or zone of the vehicle on one or more of signal strength of a signal from a computational device associated with the user as received by an access point of the vehicle, a received satellite-based position of the computational device, triangulation based on relative received signal strengths of a signal from the computational device as received by multiple access points of the vehicle, image processing of images of the predetermined area and/or zone, user presence and/or location information received by an on board vehicle sensor, and input received from a user of the computational device.
[0094] A media controller subsystem can include:
[0095] a microprocessor executable distributed network control server operable to access selected content on a public network external to a vehicle comprising the media controller subsystem, the distributed network control server having at least one of an assigned Internet Protocol address and global unicast address;
[0096] a microprocessor executable media server operable to receive requests for content from a vehicle occupant and provide requested content to a portable computational device associated with the vehicle occupant, the media server having a contactable electronic address on a local area network maintained by the vehicle;
[0097] a microprocessor readable memory to store content; and
[0098] a microprocessor executable virtual network console operable to provide the computational device with remote access to the media server.
[0099] The distributed network control server, media server, and virtual network console can be on a common media processing board mounted on the vehicle.
[0100] The portable computational device can be discrete from, remote from, and in wireless communication with the media processing board.
[0101] The virtual network console can operate in accordance with the Remote Frame Buffer protocol on top of the TCP/IP suite of protocols, thereby causing the remote computational device to appear to a computer mounted in the vehicle as if the remote computational device is part of the on board vehicle control system comprising the computer.
[0102] The vehicle occupant, through the remote computational device, can control one or more of the following media presentation features of a screen and sound system mounted on the vehicle: volume, contrast, resolution, and channel selection.
[0103] A method, vehicle, and instructions can perform at least the following steps, operations, and functions:
[0104] (a) determining, by a microprocessor executable media server, at least one of an identity of a vehicle occupant requesting media content, an identity of a portable computational device associated with the vehicle occupant, and a spatial location of the vehicle occupant and/or remote computational device;
[0105] (b) based on the at least one of the identity of the vehicle occupant requesting media content, the identity of a portable computational device associated with the vehicle occupant, and the spatial location of the vehicle occupant and/or remote computational device, applying, by the microprocessor executable media server, at least one of a filter and restriction to the requested media content to form filtered and/or permitted media content to be provided to the portable computational device; and
[0106] (c) providing, by a microprocessor executable media server, the filtered and/or permitted media content to the portable computational device.
[0107] The portable computational device can be discrete from, remote from, and in wireless communication with the media processing board.
[0108] The filter and/or restriction can be applied to a media request of the vehicle occupant before and/or after the requested media content is accessed.
[0109] The filter and/or restriction can be one or more of an age-related content filter and/or restriction, a vehicle occupant seating location filter and/or restriction, and a privacy filter and/or restriction.
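Purely as an illustrative sketch of the age-related, seating-location, and privacy-style restrictions, with hypothetical field names and rating labels:

```python
# Hypothetical sketch of the media server applying filters/restrictions
# before requested content is delivered to a portable device.
from dataclasses import dataclass

@dataclass
class MediaRequest:
    occupant_age: int
    seat: str            # e.g. "driver_seat", "rear_seat"
    content_rating: str  # e.g. "G", "PG-13", "R"
    has_video: bool

def filter_request(req: MediaRequest, vehicle_in_motion: bool) -> bool:
    """Return True when the requested content may be delivered."""
    if req.seat == "driver_seat" and vehicle_in_motion and req.has_video:
        return False                      # seating-location restriction
    if req.occupant_age < 17 and req.content_rating == "R":
        return False                      # age-related restriction
    return True

assert not filter_request(MediaRequest(40, "driver_seat", "G", True), True)
assert filter_request(MediaRequest(12, "rear_seat", "G", True), True)
```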
[0110] A method, vehicle, and tangible and non-transient computer readable medium can be provided to perform steps, operations, and functions, including:
[0111] (a) receiving a request from a vehicle occupant to perform a vehicle task, function and/or operation;
[0112] (b) determining that the vehicle occupant has been authenticated successfully;
[0113] (c) accessing an account for the vehicle occupant, the account defining rights and privileges of the user with respect to controlling a vehicle task, function and/or operation;
[0114] (d) determining at least one of an area and/or zone occupied by the vehicle occupant and an operating state of the vehicle; and
[0115] (e) based upon the account corresponding to the vehicle occupant and the at least one of an area and/or zone occupied by the vehicle occupant and an operating state of the vehicle, applying the following rules:
[0116] (i) when the account permits the vehicle occupant to perform the requested vehicle task, function and/or operation, performing or causing to be performed the vehicle task, function, and/or operation; and
[0117] (ii) when the account does not permit the vehicle occupant to perform the requested vehicle task, function and/or operation, not performing or causing to be performed the vehicle task, function, and/or operation.
[0118] In step, operation, or function (e), a microprocessor executable vehicle control system can apply the rules based on the account corresponding to the vehicle occupant and the area and/or zone occupied by the vehicle occupant. The vehicle occupant can perform a first set of vehicle tasks, functions, and operations when in a first area and/or zone and a different set of vehicle tasks, functions, and operations when in a different second area and/or zone.
[0119] In step, operation, or function (e), the microprocessor executable vehicle control system can apply the rules based on the account corresponding to the vehicle occupant and the operating state of the vehicle. The vehicle occupant can perform a first set of vehicle tasks, functions and operations when the vehicle is in a first operating state and a different set of vehicle tasks, functions, and operations when the vehicle is in a second operating state.
[0120] When rule (i) applies, the vehicle occupant can control the requested vehicle task, function and/or operation using one or more of a cell phone, laptop, tablet computer, and personal digital assistant.
[0121] A different authentication procedure can be used for the vehicle occupant when requesting a first set of vehicle tasks, operations, and functions than when requesting a second set of vehicle tasks, operations, and functions.
[0122] The account can include rights and privileges for the vehicle occupant with respect to a vehicle task, function or operation, security and/or authentication requirements and/or credentials for the vehicle occupant, and personal settings of the vehicle occupant.
[0123] The personal settings of the vehicle occupant can include a plurality of a seat setting, climate control setting, lighting setting, configuration of an instrument cluster on a screen, rear view mirror setting, driving mode, media channel setting or preset, media delivery preference, music genre preference, scheduled program, playlist, synchronization with cloud-based data associated with the vehicle occupant, application-specific personalization and selections, and a display setting and configuration.
[0124] A method/vehicle/computer readable instructions can perform at least the following steps/operations/functions:
[0125] (a) detecting, by a microprocessor executable media controller subsystem, a change in state of a vehicle, a driver of the vehicle having a graphical user interface on board and/or in communication with a computer network controlled by a computer on board the vehicle;
[0126] (b) in response, reconfiguring, by the media controller subsystem, the graphical user interface by at least one of the following actions to reduce driver distraction and/or make content displayed by the graphical user interface more visible to the driver:
[0127] apply screen magnification to at least part of the content displayed on the graphical user interface;
[0128] render at least part of the displayed content in a larger font and/or icon size;
[0129] initiate a screen reader to audibly provide and/or describe at least part of the displayed content to the driver;
[0130] initiate haptic feedback to provide and/or describe at least part of the displayed content to the driver;
[0131] disable unnecessary animation effects from at least a part of the displayed content;
[0132] remove a background image from while leaving a foreground image in at least part of the displayed content;
[0133] enable a longer period for notification dialog boxes to remain open in the displayed content;
[0134] enable a longer contact period of a digit of the driver to select a selectable object in the displayed content;
[0135] disable automatic arrangement of windows when a mouse cursor is moved to an edge of the displayed content;
[0136] enable activation of a window in at least part of the displayed content by hovering over the window with a mouse cursor;
[0137] enable keyboard web page navigation in the displayed content;
[0138] enable a high contrast between text and a background color in at least part of the displayed content;
[0139] change a color and transparency of a border of a window in at least part of the displayed content;
[0140] change a thickness of a focus rectangle around a currently selected item in a dialog box in the displayed content;
[0141] change a color, size, and/or thickness of an on-screen mouse pointer in the displayed content;
[0142] change a keyboard setting displayed in the displayed content and/or in communication with the displayed content;
[0143] ignore a color, font style, font size, and/or format of a web page of the displayed content in accordance with a predetermined style sheet;
[0144] increase a size of a selectable object in at least part of the displayed content;
[0145] enable blind typing on the graphical user interface; and
[0146] remove one or more selectable objects from the displayed content.
[0147] A method/vehicle/computer readable instructions can perform at least the following steps/operations/functions:
[0148] (a) mapping, by a microprocessor executable media controller subsystem, a graphical user interface on board and/or in communication with a computer network controlled by a computer on board the vehicle with a segmented control surface, whereby each segment corresponds to an item of content displayed on the graphical user interface;
[0149] (b) tracking, by a microprocessor executable media controller subsystem, a position of a body part of a user on the control surface relative to the control surface segments;
[0150] (c) determining, by a microprocessor executable media controller subsystem, that the body part of the user is located on a first segment; and
[0151] (d) in response, at least one of providing displayed content corresponding to the first segment and selecting a selectable object corresponding to the first segment.
[0152] The control surface can be electrically, magnetically, and electromagnetically nonresponsive to contact of the user's body part.
[0153] The control surface can be optically nonresponsive to contact of the user's body part.
[0154] The control surface can be part of an arm rest and/or dashboard, and the graphical user interface can be virtual and not displayed to the user.
[0155] The control surface can be part of an arm rest and/or dashboard, and the graphical user interface can be displayed to the user on a screen on board the vehicle.
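For illustration only, the mapping of a tracked body-part position on the control surface to a segment of the (real or virtual) graphical user interface might resemble the following sketch, which assumes a hypothetical uniform grid of segments:

```python
# Hypothetical sketch: map a tracked fingertip position on a passive
# control surface (e.g., an arm rest) to a GUI segment index.
def segment_for_position(x: float, y: float,
                         surface_w: float, surface_h: float,
                         cols: int, rows: int) -> int:
    """Divide the control surface into a cols x rows grid and return the
    index of the segment under the tracked body part."""
    col = min(int(x / surface_w * cols), cols - 1)
    row = min(int(y / surface_h * rows), rows - 1)
    return row * cols + col

# A touch near the top-left of a 30cm x 10cm surface with a 4x2 grid
# selects segment 0; the bottom-right corner selects segment 7.
assert segment_for_position(1.0, 1.0, 30.0, 10.0, 4, 2) == 0
assert segment_for_position(29.0, 9.0, 30.0, 10.0, 4, 2) == 7
```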
[0156] A method/vehicle/computer readable instructions can perform at least the following steps/operations/functions:
[0157] (a) determining, by a vehicle control system, that a vehicle occupant has an impairment; and
[0158] (b) in response, altering a communication interface of the vehicle to
accommodate the impairment.
[0159] The impairment can be one or more of a vision impairment, hearing impairment, dexterity impairment, mobility impairment, language impairment, and communication impairment.
[0160] The communication interface can be a screen on board the vehicle.
[0161] The location of the vehicle occupant and the screen corresponding to the determined occupant location can be determined.
[0162] The alteration can be one or more of:
[0163] apply screen magnification to at least part of the content displayed on the screen;
[0164] render at least part of the displayed content in a larger font and/or icon size;
[0165] initiate a screen reader to audibly provide and/or describe at least part of the displayed content to the occupant;
[0166] initiate haptic feedback to provide and/or describe at least part of the displayed content to the occupant;
[0167] disable unnecessary animation effects from at least a part of the displayed content;
[0168] remove a background image from while leaving a foreground image in at least part of the displayed content;
[0169] enable a longer period for notification dialog boxes to remain open in the displayed content;
[0170] enable a longer contact period of a digit of the occupant to select a selectable object in the displayed content;
[0171] disable automatic arrangement of windows when a mouse cursor is moved to an edge of the displayed content;
[0172] enable activation of a window in at least part of the displayed content by hovering over the window with a mouse cursor;
[0173] enable keyboard web page navigation in the displayed content;
[0174] enable a high contrast between text and a background color in at least part of the displayed content;
[0175] change a color and transparency of a border of a window in at least part of the displayed content;
[0176] change a thickness of a focus rectangle around a currently selected item in a dialog box in the displayed content;
[0177] change a color, size, and/or thickness of an on-screen mouse pointer in the displayed content;
[0178] change a keyboard setting displayed in the displayed content and/or in communication with the displayed content;
[0179] ignore a color, font style, font size, and/or format of a web page of the displayed content in accordance with a predetermined style sheet;
[0180] increase a size of a selectable object in at least part of the displayed content;
[0181] enable blind typing on the graphical user interface;
[0182] enable text and/or a visual alternative to an audio channel associated with the displayed content;
[0183] increase a volume setting of the audio channel;
[0184] change a sound provided by the audio channel;
[0185] enable sign language interpretation;
[0186] enable a text phone application;
[0187] change a setting of a mouse associated with the displayed content; and
[0188] remove one or more selectable objects from the displayed content.
[0189] A method/vehicle/computer readable instructions can perform at least the following steps/operations/functions:
[0190] (a) sensing, by a vehicle control system, a sound from a source external to a vehicle;
[0191] (b) identifying, by the vehicle control system, a type and/or source of the sound;
[0192] (c) based on the identifying step, notifying a driver of the vehicle of the type and/or source of the sound.
[0193] The notification can be one or more of a visual notification, an audible notification, and a haptic notification.
[0194] The notification can be an audible notification.
[0195] The audible notification can be one or more of a portion of the frequency range of the received sound, a frequency shifted portion of the received sound, and a phase shifted portion of the sound.
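Purely for illustration, steps (a)-(c) might be wired together as in the following sketch; the modality names and notifier callbacks are hypothetical, and the sound classifier itself is outside the sketch:

```python
# Hypothetical sketch: route a notification about an identified external
# sound to one or more modalities (visual, audible, and/or haptic).
from typing import Callable, Dict, Iterable

def notify_driver(sound_type: str,
                  notifiers: Dict[str, Callable[[str], None]],
                  modalities: Iterable[str] = ("visual", "audible")) -> None:
    message = f"External sound detected: {sound_type}"
    for modality in modalities:
        notifiers[modality](message)

# Example wiring: all three modalities simply print in this sketch.
notifiers = {"visual": print, "audible": print, "haptic": print}
notify_driver("emergency siren approaching from the rear", notifiers,
              modalities=("visual", "haptic"))
```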
[0196] A method/vehicle/computer readable instructions can perform at least the following step/operation/function:
[0197] altering, by a vehicle control system, an interface of a vehicle based on one or more of a user impairment, user medical condition, user age, user physical condition, user driving characteristic and driving history.
[0198] The interface can be one or more of a steering wheel, pedal, a graphical user interface, and a setting and/or configuration of an automated vehicle response system.
[0199] The automated vehicle response system can be a collision avoidance system.
[0200] A method/vehicle/computer readable instructions can perform at least the following step/operation/function:
[0201] maintaining a persona of a vehicle occupant; and
[0202] based on the persona of the vehicle occupant and vehicle-related information, performing an action assisting the vehicle occupant.
[0203] The vehicle-related information can comprise at least one of a current and/or future vehicle location and a path of vehicle travel.
[0204] The action performed can depend on a seating position of the vehicle occupant.
[0205] The persona can include one or more of bioinformatics, medical information, driving history, personal information, private information, travel information, and Internet browsing history and/or browsed content.
[0206] The vehicle related information can include one or more of vehicle context, state, external surroundings, location, past, current, and/or intended path of travel, waypoint, and destination.
[0207] The action can be one or more of making an appointment, making a reservation, purchasing an item on line, adding a waypoint or destination to path of travel on a navigation system of the vehicle, adding an entry into the occupant's electronic calendar, changing a destination or path of travel on the navigation system, warning the occupant, notifying the occupant, and sending a message to a person at a waypoint or destination of the vehicle regarding an arrival time.
[0208] The steps/ functions/operations can determine relevant information other than the persona of the vehicle occupant and the vehicle-related information.
[0209] The relevant information can include one or more of a persona of a selected person not currently in the vehicle, a message from a friend or family member of the vehicle occupant, a current activity of the friend and/or family member, a location, hours of operation, and/or descriptive information about a point and/or location of interest near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a vehicle service facility near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a hotel and/or motel near the vehicle and/or the vehicle's path of travel, a current location of the friend or family member near the vehicle and/or the vehicle's path of travel, and a road condition along a path of travel of the vehicle.
[0210] The vehicle can obtain the persona from one or more of a different vehicle driven by the occupant and a home computer.
[0211] The vehicle can synchronize with the different vehicle and/or home computer when the vehicle is parked in proximity thereto and an ignition of the vehicle is turned off.
[0212] The maintaining of the persona can include collecting information from one or more vehicle sensors and/or from a remote information source.
[0213] A type of information collected can depend on a seating position of the occupant.
[0214] The type of information collected can depend on one or more of an identity of the occupant, an age of the occupant, and an association of the occupant with the vehicle.
[0215] The persona can be defined in a format (e.g., grammar, syntax, and/or semantics) that can be processed by vehicles of different manufacturers.
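For illustration only, a manufacturer-neutral persona might be carried as plain JSON with an agreed field vocabulary, as in the following sketch; the schema tag and field names are hypothetical:

```python
# Hypothetical sketch of a portable persona record: plain JSON with a
# shared vocabulary so vehicles of different manufacturers can parse it.
import json

persona = {
    "schema": "vehicle-persona/1.0",   # hypothetical shared format tag
    "occupant_id": "alice",
    "medical": {"impairments": ["low_vision"]},
    "travel": {"frequent_destinations": ["work", "school"]},
    "preferences": {"music_genre": "jazz", "climate_c": 21},
}

serialized = json.dumps(persona)       # synchronize to another vehicle
restored = json.loads(serialized)      # any manufacturer can parse this
assert restored["preferences"]["climate_c"] == 21
```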
[0216] A vehicle backplane assembly can include at least the following components:
[0217] a plurality of blade processors in a vehicle, including a first set of blade processors installed prior to vehicle sale and not modifiable and/or replaceable by the vehicle owner and a second set of blade processors installable after vehicle sale and modifiable and/or replaceable by the vehicle owner. Each blade processor includes a microprocessor, a memory, and a network interface, and each of the blade processors performs a different set of functions.
[0218] The restricted access by the customer to the first set of blade processors can be done through restrictions on software rights and privileges (e.g., read only rights and privileges to machine code with no right or privilege to modify same) and physical access to the first set of blade processors (e.g., through a locked housing containing one or more components of the blade processors).
[0219] The assembly can include a third set of blade processors that interfaces with the first and second set of blade processors to provide input to and/or receive output from a corresponding one of the first and second blade processors.
[0220] A satellite receiving system can be a member of the third set of blade processors and a navigation system a member of one of the first and second set of blade processors.
[0221] Each of the first and second sets of blade processors can include a corresponding Universal Serial Bus ("USB") hub, the USB hub comprising a plurality of ports to permit devices to connect to the USB hub.
[0222] The first set of blade processors can include a master blade processor operable to inventory hardware and/or software in communication with the backplane assembly, assign blade processors to applications attempting to execute, and/or determine a health state of a selected blade processor.
[0223] A vehicle can include a plurality of blade processors, each blade processor performing a function that is at least one of a vehicle task, function, or operation and an infotainment task, function, or operation.
[0224] The blade processors can have a backplane.
[0225] The backplane can have a first communication zone defining a trusted network within the vehicle to connect with trusted computational devices and/or modules provided or certified by the vehicle manufacturer but not untrusted computational devices and/or modules provided by vehicle occupants. Whether or not a computational device and/or module is certified by a manufacturer can be determined using known license check procedures, such as a unique identifier, a unique credential (e.g., password or encrypted or unencrypted key), and the like. When the check is completed satisfactorily (e.g., the unique identifier or unique credential (e.g., password or encrypted or unencrypted key) provided by the device and/or module matches a stored identifier in the memory of the vehicle control system (or master blade processor)), the computational device and/or module is deemed to be certified by the manufacturer. When the check is not completed satisfactorily, the computational device and/or module is not deemed to be certified by the manufacturer.
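Purely as an illustrative sketch of the license-style certification check, with hypothetical device identifiers and stored credentials:

```python
# Hypothetical sketch: a device presents a credential, and it is deemed
# manufacturer-certified only if the credential matches one stored by the
# vehicle control system (or master blade processor).
import hmac

STORED_CREDENTIALS = {"module-42": "a3f9c2"}  # illustrative stored keys

def is_certified(device_id: str, presented_credential: str) -> bool:
    stored = STORED_CREDENTIALS.get(device_id)
    if stored is None:
        return False
    # constant-time comparison to avoid leaking credential contents
    return hmac.compare_digest(stored, presented_credential)

assert is_certified("module-42", "a3f9c2")
assert not is_certified("module-42", "wrong")
```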
[0226] The backplane can have a second communication zone defining an untrusted network to connect with the untrusted computational devices.
[0227] The backplane can have a third communication zone providing power and data transmission to the plurality of blade processors.
[0228] The first and second communication zones can be connected logically on opposing sides of a firewall blade processor.
[0229] A master blade processor can enable a connection to a computational device connected to the first communication zone upon verification that the connected computational device is certified by a vehicle manufacturer.
[0230] The first and second communication zones can be configured as separate Ethernet switches. The first and second communication zones are typically not in signal communication with one another.
[0231] A firewall can have a dedicated slot that bridges the first and second
communication zones and uses the third communication zone for power connections.
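One way to picture the firewall blade processor bridging the two zones is as a per-message forwarding policy consulted for every frame crossing between them; the rule set below is a purely hypothetical sketch, not the disclosed firewall design.

```python
# Hypothetical sketch: a firewall blade's forwarding decision between the
# trusted (first) and untrusted (second) communication zones.
ALLOWED_UNTRUSTED_TO_TRUSTED = {("media-request", 5004)}  # (msg type, port)

def forward(src_zone: str, msg_type: str, port: int) -> bool:
    if src_zone == "trusted":
        return True  # trusted-to-untrusted traffic is assumed permitted
    return (msg_type, port) in ALLOWED_UNTRUSTED_TO_TRUSTED

print(forward("untrusted", "media-request", 5004))  # True
print(forward("untrusted", "brake-command", 7001))  # False: blocked
```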
[0232] An output of each of the plurality of blade processors can be an IP message framed into an Ethernet packet.
[0233] A sensor and/or controller of the vehicle can communicate by a Controller Area Network ("CAN") protocol. An Ethernet bus controller can terminate a CAN bus to the sensor and/or controller, and a CAN Ethernet controller subsystem can translate an Ethernet message from a blade processor to a CAN protocol-based message.
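A minimal sketch of the translation step described in paragraphs [0232]-[0233], packing a blade processor's Ethernet-delivered message into a classic CAN frame (11-bit identifier, at most 8 data bytes); the byte layout chosen here is an assumption for illustration only.

```python
# Hypothetical sketch: translating a blade processor's message into a
# classic CAN frame (11-bit ID, <= 8 data bytes), as a gateway might.
import struct

def to_can_frame(can_id: int, payload: bytes) -> bytes:
    if not 0 <= can_id <= 0x7FF:
        raise ValueError("classic CAN uses 11-bit identifiers")
    if len(payload) > 8:
        raise ValueError("classic CAN data field is at most 8 bytes")
    # Pack as: 2-byte ID, 1-byte DLC, data padded to 8 bytes.
    return struct.pack(">HB8s", can_id, len(payload), payload.ljust(8, b"\x00"))

def from_can_frame(frame: bytes) -> tuple[int, bytes]:
    can_id, dlc, data = struct.unpack(">HB8s", frame)
    return can_id, data[:dlc]

# e.g., an Ethernet-delivered HVAC setpoint relayed onto the CAN bus:
frame = to_can_frame(0x2A0, struct.pack(">h", 215))  # 21.5 C in tenths
print(from_can_frame(frame))
```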
[0234] The plurality of blade processors can be configured as a crate having the backplane connector and on-backplane Ethernet interfaces enabling the blade processors to communicate with one another using Ethernet messages. [0235] A vehicle can include a plurality of blade processors, each blade processor performing a function that is at least one of a vehicle task, function, or operation and an infotainment task, function, or operation. The plurality of blade processors can include a master blade processor operable to assign a component and/or module requiring a blade processor for execution to a selected blade processor.
[0236] The master blade processor can inventory hardware and software capabilities of components and/or modules in communication with the master blade processor to provide an application resource table comprising component and/or module identity, functional description, and/or computational resource requirements and/or capabilities needed for execution of the component and/or module.
[0237] A blade processor can be assigned to execute the component and/or module when the blade processor satisfies the corresponding computational resource requirements and/or capabilities.
[0238] The master blade processor can assign a network address to each blade processor and communication device in communication with the master blade processor.
[0239] The master blade processor can assign a component and/or module to a selected blade processor when the selected blade processor is available or, though unavailable, is executing a component and/or module having a lower priority than the component and/or module to be assigned to the selected blade processor.
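Paragraphs [0236]-[0239] together suggest an assignment loop: consult the application resource table, match computational requirements, and preempt only lower-priority work. A hypothetical sketch of that logic, with all names assumed:

```python
# Hypothetical sketch of the master blade's assignment logic suggested by
# [0236]-[0239]: match resource needs, preempt only lower-priority work.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    ram_mb: int      # computational resource requirement
    priority: int    # higher value = higher priority

@dataclass
class BladeState:
    blade_id: str
    ram_mb: int
    running: App | None = None

def assign(app: App, blades: list[BladeState]) -> str | None:
    for blade in blades:
        if blade.ram_mb < app.ram_mb:
            continue  # capability mismatch: blade cannot host this app
        if blade.running is None or blade.running.priority < app.priority:
            blade.running = app  # idle blade, or preempt lower priority
            return blade.blade_id
    return None  # no suitable blade available

crate = [BladeState("blade-1", 512), BladeState("blade-2", 2048)]
print(assign(App("nav", 1024, priority=9), crate))       # -> blade-2
print(assign(App("radio", 256, priority=3), crate))      # -> blade-1
print(assign(App("collision", 256, priority=10), crate)) # preempts blade-1
```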
[0240] The present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration. For example, operating multiple operating systems on a common on board computer platform for vehicle control and infotainment tasks, functions, and operations can provide a high level of vehicle security and successfully block intrusive attacks. The network security configurations can enable both infotainment and critical vehicle tasks, functions, and operations to be performed automatically and concurrently in a vehicle without increasing driver danger from computer-targeted attacks by an outside source. Using a common computer to run both the first and second operating systems safely can not only reduce vehicle manufacturing costs but also provide a simpler computational architecture. The device discovery daemon, by locating the portable device seeking access to the vehicle network, can not only provide higher levels of security from intrusive attacks via a computational device external to the vehicle but also protect the privacy and resources of the vehicle network. The application of accessibility technologies to the driver's screen can successfully reduce driver distractions while enabling the driver to perform permissible computational tasks. The availability of accessibility technologies in the vehicle can not only comply with prevailing legal requirements but also enhance passenger enjoyment. The use of network control and media servers and virtual network console functionality on a common processing board can provide a simple yet effective way to stream media to devices within the vehicle network. The use of media filters and restrictions based on one or more of the identity of the vehicle occupant requesting media content, the identity of a portable computational device associated with the vehicle occupant, and the spatial location of the vehicle occupant and/or remote computational device can enable compliance with prevailing laws regarding driver distractions while providing the vehicle owner with the ability to control media access by one or more selected vehicle occupants. The use of user accounts to control passenger access to vehicle tasks, functions, and operations can prevent a non-driving passenger, for instance, from controlling a critical vehicle task, function, or operation and confusing or frustrating the driver. The inactive control surface can enable a vehicle occupant to perform tasks, functions, and operations without viewing his or her screen. This provides yet another convenient method for driver control of vehicle and non-vehicle tasks, functions, and operations without distraction while driving the vehicle. The blade processor architecture can take advantage of the pervasiveness of TCP/IP and the large volume of components and/or modules made for IP-based distributed systems, such as Ethernet-based systems. The architecture can be a modular distributed system that can be upgraded and/or extended over time without changing the basic processing architecture. Because the various subsystems can communicate with one another by protocol-based messages, there is no requirement that all software be written in any specific language or execute on any specific operating system. This can allow developers to use the most appropriate run time environment for the subsystem being developed.
For instance, when a real time operating system ("RTOS") is required because of the timing constraints of the run time environment (i.e., real time control of a function), a subsystem can be developed using an RTOS without constraining the runtime environment of less demanding subsystems (e.g., a console subsystem).
[0241] These and other advantages will be apparent from the disclosure.
[0242] The phrases "at least one," "one or more," and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions "at least one of A, B and C," "at least one of A, B, or C," "one or more of A, B, and C," "one or more of A, B, or C" and "A, B, and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together. [0243] The term "a" or "an" entity refers to one or more of that entity. As such, the terms "a" (or "an"), "one or more," and "at least one" can be used interchangeably herein. It is also to be noted that the terms "comprising," "including," and "having" can be used interchangeably.
[0244] "Accessible technology" refers to any computer technology that users can adjust and/or employ to accommodate his or her vision, dexterity, hearing, cognitive, language, learning, and/or speech needs. Accessibility technology can be in the form of accessibility features or settings built into software programs and specialty hardware devices or software programs.
[0245] "Assistive technology" refers to any technology that users can adjust and/or employ to accommodate his or her vision, dexterity, hearing, cognitive, language, learning, and/or speech needs. Examples of assistive technology include accessible technology, adjuncts, peripherals, plug-ins, and add-ins.
[0246] The term "disability" refers to the consequence of an impairment that may be physical, cognitive, intellectual, mental, sensory, emotional, developmental, or some combination of these. By way of illustration, Section 503 defines "disability" as an impairment that substantially limits a major life activity, even if it were not to limit any other major life activity, or an impairment that is episodically active or in remission and would substantially limit a major life activity when active.
[0247] The term "automatic" and variations thereof, as used herein, refer to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before the performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be "material."
[0248] The term "automotive navigation system" can refer to a satellite navigation system designed for use in vehicles. It typically uses a GPS navigation device to acquire position data to locate the user on a road in the unit's map database. Using the road database, the unit can give directions to other locations along roads also in its database. Dead reckoning using distance data from sensors attached to the drivetrain, a gyroscope and an accelerometer can be used for greater reliability, as GPS signal loss and/or multipath can occur due to urban canyons or tunnels. [0249] The term "bus" and variations thereof, as used herein, can refer to a subsystem that transfers information and/or data between various components. A bus generally refers to the collection communication hardware interface, interconnects, bus architecture, standard, and/or protocol defining the communication scheme for a communication system and/or communication network. A bus may also refer to a part of a communication hardware that interfaces the communication hardware with the interconnects that connect to other components of the corresponding communication network. The bus may be for a wired network, such as a physical bus, or wireless network, such as part of an antenna or hardware that couples the communication hardware with the antenna. A bus architecture supports a defined format in which information and/or data is arranged when sent and received through a communication network. A protocol may define the format and rules of communication of a bus architecture.
[0250] The terms "communication device," "smartphone," and "mobile device," and variations thereof, as used herein, can be used interchangeably and may include any type of device capable of communicating with one or more of another device and/or across a communications network, via a communications protocol, and the like. Exemplary communication devices may include but are not limited to smartphones, handheld computers, laptops, netbooks, notebook computers, subnotebooks, tablet computers, scanners, portable gaming devices, phones, pagers, GPS modules, portable music players, and other Internet-enabled and/or network-connected devices.
[0251] A "communication modality" can refer to any protocol- or standard defined or specific communication session or interaction, such as Voice-Over-Internet-Protocol ("VoIP), cellular communications (e.g., IS-95, 1G, 2G, 3G, 3.5G, 4G, 4G/IMT-Advanced standards, 3 GPP, WIMAX™, GSM, CDMA, CDMA2000, EDGE, lxEVDO, iDEN, GPRS, HSPDA, TDMA, UMA, UMTS, ITU-R, and 5G), Bluetooth™, text or instant messaging (e.g., AIM, Blauk, eBuddy, Gadu-Gadu, IBM Lotus Sametime, ICQ, iMessage, IMVU, Lync, MXit, Paltalk, Skype, Tencent QQ, Windows Live Messenger™ or MSN Messenger™, Wireclub, Xfire, and Yahoo! Messenger™), email, Twitter (e.g., tweeting), Digital Service Protocol (DSP), and the like.
[0252] The term "communication system" or "communication network" and variations thereof, as used herein, can refer to a collection of communication components capable of one or more of transmission, relay, interconnect, control, or otherwise manipulate information or data from at least one transmitter to at least one receiver. As such, the communication may include a range of systems supporting point-to-point or broadcasting of the information or data. A communication system may refer to the collection individual communication hardware as well as the interconnects associated with and connecting the individual communication hardware. Communication hardware may refer to dedicated communication hardware or may refer a processor coupled with a communication means (i.e., an antenna) and running software capable of using the communication means to send and/or receive a signal within the communication system. Interconnect refers some type of wired or wireless communication link that connects various components, such as communication hardware, within a communication system. A communication network may refer to a specific setup of a communication system with the collection of individual communication hardware and interconnects having some definable network topography. A communication network may include wired and/or wireless network having a pre-set to an ad hoc network structure.
[0253] The term "computer-readable medium," as used herein refers to any tangible storage and/or transmission medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, non-volatile random access memory (NVRAM), or magnetic or optical disks. Volatile media includes dynamic memory, such as main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, magneto-optical medium, a compact disc read only memory (CD-ROM), any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a random access memory (RAM), a programmable read only memory (PROM), and erasable programmable read only memory EPROM, a FLASH-EPROM, a solid state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. A digital file attachment to an e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. When the computer-readable media is configured as a database, it is to be understood that the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior art-recognized equivalents and successor media, in which the software
implementations of the present disclosure are stored. It should be noted that any computer readable medium that is not a signal transmission may be considered non-transitory. [0254] The terms dash and dashboard and variations thereof, as used herein, may be used interchangeably and can be any panel and/or area of a vehicle disposed adjacent to an operator, user, and/or passenger. Dashboards may include, but are not limited to, one or more control panel(s), instrument housing(s), head unit(s), indicator(s), gauge(s), meter(s), light(s), audio equipment, computer(s), screen(s), display(s), HUD unit(s), and graphical user interface(s).
[0255] The term "module" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.
[0256] The term "desktop" refers to a metaphor used to portray systems. A desktop is generally considered a "surface" that may include pictures, called icons, widgets, folders, etc. that can activate and/or show applications, windows, cabinets, files, folders, documents, and other graphical items. The icons are generally selectable to initiate a task through user interface interaction to allow a user to execute applications and/or conduct other operations.
[0257] The term "display" refers to a portion of a physical screen used to display the output of a computer to a user.
[0258] The term "displayed image" refers to an image produced on the display. A typical displayed image is a window or desktop. The displayed image may occupy all or a portion of the display.
[0259] The term "display orientation" refers to the way in which a rectangular display is oriented for viewing. The two most common types of display orientations are portrait and landscape. In landscape mode, the display is oriented such that the width of the display is greater than the height of the display (such as a 4:3 ratio, which is 4 units wide and 3 units tall, or a 16:9 ratio, which is 16 units wide and 9 units tall). Stated differently, the longer dimension of the display is oriented substantially horizontal in landscape mode while the shorter dimension of the display is oriented substantially vertical. In the portrait mode, by contrast, the display is oriented such that the width of the display is less than the height of the display. Stated differently, the shorter dimension of the display is oriented
substantially horizontal in the portrait mode while the longer dimension of the display is oriented substantially vertical. A multi-screen display can have one composite display that encompasses all the screens. The composite display can have different display
characteristics based on the various orientations of the device.
[0260] The term "electronic address" can refer to any contactable address, including a telephone number, instant message handle, e-mail address, Uniform Resource Locator ("URL"), Globally Unique Identifier ("GUID"), Universal Resource Identifier ("URI"), Address of Record ("AOR"), electronic alias in a database, etc., and combinations thereof.
[0261] The term "gesture" refers to a user action that expresses an intended idea, action, meaning, result, and/or outcome. The user action can include manipulating a device (e.g., opening or closing a device, changing a device orientation, moving a trackball or wheel, etc.), movement of a body part in relation to the device, movement of an implement or tool in relation to the device, audio inputs, etc. A gesture may be made on a device (such as on the screen) or with the device to interact with the device.
[0262] The term "gesture capture" refers to a sense or otherwise a detection of an instance and/or type of user gesture. The gesture capture can be received by sensors in three-dimensional space. Further, the gesture capture can occur in one or more areas of a screen, for example, on a touch-sensitive display or a gesture capture region. A gesture region can be on the display, where it may be referred to as a touch sensitive display, or off the display, where it may be referred to as a gesture capture area.
[0263] The terms "infotainment" and "infotainment system" may be used
interchangeably and can refer to the hardware/software products, data, content, information, and/or systems, which can be built into or added to vehicles to enhance driver and/or passenger experience. Infotainment may provide media and/or multimedia content. An example is information-based media content or programming that also includes entertainment content.
[0264] A "multi-screen application" refers to an application that is capable of producing one or more windows that may simultaneously occupy one or more screens. A multiscreen application commonly can operate in single-screen mode in which one or more windows of the application are displayed only on one screen or in multi-screen mode in which one or more windows are displayed simultaneously on multiple screens.
[0265] A "single-screen application" refers to an application that is capable of producing one or more windows that may occupy only a single screen at a time.
[0266] The terms "online community," "e-community," or "virtual community" can mean a group of people that interact via a computer network, for social, professional, educational, and/or other purposes. The interaction can use a variety of media formats, including wikis, blogs, chat rooms, Internet forums, instant messaging, email, and other forms of electronic media. Many media formats may be used in social software separately and/or in combination, including text-based chat rooms and forums that use voice, video text or avatars.
[0267] The term "satellite positioning system receiver" can refer to a wireless receiver or transceiver to receive and/or send location signals from and/or to a satellite positioning system (SPS), such as the Global Positioning System ("GPS") (US), GLONASS (Russia), Galileo positioning system (EU), Compass navigation system (China), and Regional Navigational Satellite System (India).
[0268] The term "social network service" may include a service provider that builds online communities of people, who share interests and/or activities, or who are interested in exploring the interests and/or activities of others. Social network services can be network-based and may provide a variety of ways for users to interact, such as e-mail and instant messaging services.
[0269] The term "social network" can refer to a network-based social network.
[0270] The term "screen," "touch screen," "touchscreen," or "touch-sensitive display" refers to a physical structure that enables the user to interact with the computer by touching areas on the screen and provides information to a user through a display. The touch screen may sense user contact in a number of different ways, such as by a change in an electrical parameter (e.g., resistance or capacitance), acoustic wave variations, infrared radiation proximity detection, light variation detection, and the like. In a resistive touch screen, for example, normally separated conductive and resistive metallic layers in the screen pass an electrical current. When a user touches the screen, the two layers make contact in the contacted location, whereby a change in electrical field is noted and the coordinates of the contacted location calculated. In a capacitive touch screen, a capacitive layer stores electrical charge, which is discharged to the user upon contact with the touch screen, causing a decrease in the charge of the capacitive layer. The decrease is measured, and the contacted location coordinates determined. In a surface acoustic wave touch screen, an acoustic wave is transmitted through the screen, and the acoustic wave is disturbed by user contact. A receiving transducer detects the user contact instance and determines the contacted location coordinates.
[0271] The term "window" refers to a, typically rectangular, displayed image on at least part of a display that contains or provides content different from the rest of the screen. The window may obscure the desktop. The dimensions and orientation of the window may be configurable either by another module or by a user. When the window is expanded, the window can occupy substantially all of the display space on a screen or screens.
[0272] The terms "determine," "calculate," and "compute," and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation, or technique.
[0273] It shall be understood that the term "means," as used herein, shall be given its broadest possible interpretation in accordance with 35 U.S.C., Section 112, Paragraph 6 or other applicable law. Accordingly, a claim incorporating the term "means" shall cover all structures, materials, or acts set forth herein, and all of the equivalents thereof. Further, the structures, materials or acts and the equivalents thereof shall include all those described in the summary of the invention, brief description of the drawings, detailed description, abstract, and claims themselves.
[0274] The terms "vehicle," "car," "automobile," and variations thereof may be used interchangeably herein and can refer to a device or structure for transporting animate and/or inanimate or tangible objects (e.g., persons and/or things), such as a self-propelled conveyance. A vehicle as used herein can include any conveyance or model of a conveyance, where the conveyance was originally designed for the purpose of moving one or more tangible objects, such as people, animals, cargo, and the like. The term "vehicle" does not require that a conveyance moves or is capable of movement. Typical vehicles may include but are in no way limited to cars, trucks, motorcycles, busses, automobiles, trains, railed conveyances, boats, ships, marine conveyances, submarine conveyances, airplanes, space craft, flying machines, human-powered conveyances, and the like.
[0275] The term "profile," as used herein, can refer to any data structure, data store, and/or database that includes one or more items of information associated with a vehicle, a vehicle system, a device (e.g., a mobile device, laptop, mobile phone, etc.), or a person.
[0276] The term "in communication with," as used herein, refers to any coupling, connection, or interaction using electrical signals to exchange information or data, using any system, hardware, software, protocol, or format, regardless of whether the exchange occurs wirelessly or over a wired connection.
[0277] The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure and its various aspects, embodiments, and/or configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and/or
configurations of the disclosure are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0278] BRIEF DESCRIPTION OF THE DRAWINGS
[0279] Fig. 1 depicts an embodiment of a vehicle operating environment;
[0280] Fig. 2 is a block diagram of an embodiment of a vehicle system;
[0281] Fig. 3 is a block diagram of an embodiment of a vehicle control system environment;
[0282] Fig. 4 is a block diagram of an embodiment of a vehicle communications subsystem;
[0283] Fig. 5A is a first block diagram of an embodiment of a vehicle interior environment separated into areas and/or zones;
[0284] Fig. 5B is a second block diagram of an embodiment of a vehicle interior environment separated into areas and/or zones;
[0285] Fig. 5C is a third block diagram of an embodiment of a vehicle interior environment separated into areas and/or zones;
[0286] Fig. 6A depicts an embodiment of a sensor configuration for a vehicle;
[0287] Fig. 6B depicts an embodiment of a sensor configuration for a zone of a vehicle;
[0288] Fig. 7A is a block diagram of an embodiment of interior sensors for a vehicle;
[0289] Fig. 7B is a block diagram of an embodiment of exterior sensors for a vehicle;
[0290] Fig. 8A is a block diagram of an embodiment of a media subsystem for a vehicle;
[0291] Fig. 8B is a block diagram of an embodiment of a user and device interaction subsystem for a vehicle;
[0292] Fig. 8C is a block diagram of an embodiment of a Navigation subsystem for a vehicle;
[0293] Fig. 9 is a block diagram of an embodiment of a communications subsystem for a vehicle;
[0294] Fig. 10 is a block diagram of an embodiment of a software architecture for the vehicle control system;
[0295] Fig. 11A is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0296] Fig. 11B is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0297] Fig. 11C is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0298] Fig. 11D is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0299] Fig. 11E is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0300] Fig. 11F is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0301] Fig. 11G is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0302] Fig. 11H is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0303] Fig. 11I is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0304] Fig. 11J is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0305] Fig. 11K is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
[0306] Fig. 12A is a diagram of an embodiment of a data structure for storing information about a user of a vehicle;
[0307] Fig. 12B is a diagram of an embodiment of a data structure for storing information about a device associated with or in a vehicle;
[0308] Fig. 12C is a diagram of an embodiment of a data structure for storing information about a system of a vehicle;
[0309] Fig. 12D is a diagram of an embodiment of a data structure for storing information about a vehicle;
[0310] Fig. 13 is a flow or process diagram of a method for storing one or more settings associated with a user;
[0311] Fig. 14 is a flow or process diagram of a method for establishing one or more settings associated with a user;
[0312] Fig. 15 is a flow or process diagram of a method for storing one or more settings associated with a user;
[0313] Fig. 16 is a flow or process diagram of a method for storing one or more gestures associated with a user; [0314] Fig. 17 is a flow or process diagram of a method for reacting to a gesture performed by a user;
[0315] Fig. 18 is a flow or process diagram of a method for storing health data associated with a user;
[0316] Fig. 19 is a flow or process diagram of a method for reacting to a gesture performed by a user;
[0317] Fig. 20 depicts a flow diagram according to an embodiment;
[0318] Fig. 21 depicts an on board media system according to an embodiment;
[0319] Fig. 22 depicts a flow diagram according to an embodiment;
[0320] Fig. 23 depicts a flow diagram according to an embodiment;
[0321] Fig. 24 depicts a computational system according to an embodiment;
[0322] Fig. 25 depicts a computational system according to an embodiment;
[0323] Fig. 26 depicts a flow diagram according to an embodiment;
[0324] Fig. 27 depicts a flow diagram according to an embodiment;
[0325] Fig. 28 depicts a screen shot according to an embodiment;
[0326] Fig. 29 depicts a screen shot according to an embodiment;
[0327] Fig. 30 depicts a screen shot according to an embodiment;
[0328] Fig. 31 depicts a flow diagram according to an embodiment;
[0329] Fig. 32 depicts a flow diagram according to an embodiment;
[0330] Fig. 33 depicts a segmented control surface according to an embodiment;
[0331] Fig. 34 depicts a mapped display according to an embodiment;
[0332] Fig. 35 depicts a flow diagram according to an embodiment;
[0333] Fig. 36 depicts a flow diagram according to an embodiment;
[0334] Figs. 37A-B depict a flow diagram according to an embodiment;
[0335] Fig. 38 depicts a flow diagram according to an embodiment; and
[0336] Fig. 39 depicts a flow diagram according to an embodiment.
[0337] In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components.
If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference letter or label.
[0338] DETAILED DESCRIPTION
[0339] Presented herein are embodiments of systems, devices, processes, data structures, user interfaces, etc. The embodiments may relate to an automobile and/or an automobile environment. The automobile environment can include systems associated with the automobile and devices or other systems in communication with the automobile and/or automobile systems. Furthermore, the systems can relate to communications systems and/or devices and may be capable of communicating with other devices and/or to an individual or group of individuals. Further, the systems can receive user input in unique ways. The overall design and functionality of the systems provide for an enhanced user experience, making the automobile more useful and more efficient. As described herein, the automobile systems may be electrical, mechanical, electro-mechanical, software-based, and/or combinations thereof.
[0340] A vehicle environment 100 that may contain a vehicle ecosystem is shown in Fig. 1. The vehicle environment 100 can contain areas associated with a vehicle or conveyance 104. The vehicle 104 is shown as a car but can be any type of conveyance. The environment 100 can include at least three zones. A first zone 108 may be inside a vehicle 104. The zone 108 includes any interior space, trunk space, engine compartment, or other associated space within or associated with the vehicle 104. The interior zone 108 can be defined by one or more techniques, for example, geo-fencing.
[0341] A second zone 112 may be delineated by line 120. The zone 112 is created by a range of one or more sensors associated with the vehicle 104. Thus, the area 112 is exemplary of the range of those sensors and what can be detected by those sensors associated with the vehicle 104. Although sensor range is shown as a fixed and
continuous oval, the sensor range may be dynamic and/or discontinuous. For example, a ranging sensor (e.g., radar, lidar, ladar, etc.) may provide a variable range depending on output power, signal characteristics, or environmental conditions (e.g., rain, fog, clear, etc.). The rest of the environment includes all space beyond the range of the sensors and is represented by space 116. Thus, the environment 100 may have an area 116 that includes all areas beyond the sensor range 112. The area 116 may include locations of travel that the vehicle 104 may proceed to in the future.
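By way of illustration, classifying a point of interest into the zones of Fig. 1 reduces to a pair of distance comparisons, with the sensor-range threshold allowed to vary as the paragraph above notes; the radii and function names below are hypothetical.

```python
# Hypothetical sketch: classifying a point into the zones of Fig. 1 by
# distance from the vehicle, with a sensor range that can vary (112/116).
import math

def classify(px: float, py: float, vx: float, vy: float,
             interior_radius_m: float, sensor_range_m: float) -> str:
    d = math.hypot(px - vx, py - vy)
    if d <= interior_radius_m:
        return "zone 108 (interior geo-fence)"
    if d <= sensor_range_m:
        return "zone 112 (within sensor range)"
    return "area 116 (beyond sensor range)"

# Fog might shrink an optical sensor's usable range:
print(classify(30.0, 0.0, 0.0, 0.0, interior_radius_m=3.0, sensor_range_m=50.0))
print(classify(30.0, 0.0, 0.0, 0.0, interior_radius_m=3.0, sensor_range_m=20.0))
```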
[0342] An embodiment of a vehicle system 200 is shown in Fig. 2. The vehicle system 200 may comprise hardware and/or software that conduct various operations for or with the vehicle 104. The operations can include, but are not limited to, providing information to the user 216, receiving input from the user 216, and controlling the functions or operation of the vehicle 104, etc. The vehicle system 200 can include a vehicle control system 204. The vehicle control system 204 can be any type of computing system operable to conduct the operations as described herein. An example of a vehicle control system may be as described in conjunction with Fig. 3.
[0343] The vehicle control system 204 may interact with a memory or storage system 208 that stores system data. System data 208 may be any type of data needed for the vehicle control system 204 to control effectively the vehicle 104. The system data 208 can represent any type of database or other storage system. Thus, the system data 208 can be a flat file data system, an object-oriented data system, or some other data system that may interface with the vehicle control system 204.
[0344] The vehicle control system 204 may communicate with a device or user interface 212, 248. The user interface 212, 248 may be operable to receive user input either through touch input, on one or more user interface buttons, via voice command, via one or more image sensors, or through a graphical user interface that may include a gesture capture region, as described in conjunction with the other figures provided herein. Further, the symbol 212, 248 can represent a device that is located or associated with the vehicle 104. The device 212, 248 can be a mobile device, including, but not limited to, a mobile telephone, a mobile computer, or other type of computing system or device that is either permanently located in or temporarily associated with, but not necessarily connected to, the vehicle 104. Thus, the vehicle control system 204 can interface with the device 212, 248 and leverage the device's computing capability to provide one or more of the features or functions as described herein.
[0345] The device or user interface 212, 248 can receive input or provide information to a user 216. The user 216 may thus interact with the vehicle control system 204 through the interface or device 212, 248. Further, the device 212, 248 may include or have access to device data 220 and/or profile data 252. The device data 220 can be any type of data that is used in conjunction with the device 212, 248 including, but not limited to, multimedia data, preferences data, device identification information, or other types of data. The profile data 252 can be any type of data associated with at least one user 216 including, but in no way limited to, bioinformatics, medical information, driving history, personal information (e.g., home physical address, business physical address, contact addresses, likes, dislikes, hobbies, size, weight, occupation, business contacts - including physical and/or electronic addresses, personal contacts - including physical and/or electronic addresses, family members, and personal information related thereto, etc.), other user characteristics, advertising information, user settings and feature preferences, travel information, associated vehicle preferences, communication preferences, historical information (e.g., including historical, current, and/or future travel destinations), Internet browsing history, or other types of data. In any event, the data may be stored as device data 220 and/or profile data 252 in a storage system similar to that described in
conjunction with Figs. 12A through 12D.
[0346] As an example, the profile data 252 may include one or more user profiles. User profiles may be generated based on data gathered from one or more of vehicle preferences (e.g., seat settings, HVAC settings, dash configurations, and the like), recorded settings, geographic location information (e.g., provided by a satellite positioning system (e.g., GPS), Wi-Fi hotspot, cell tower data, etc.), mobile device information (such as mobile device electronic addresses, Internet browsing history and content, application store selections, user settings and enabled and disabled features, and the like), private information (such as user information from a social network, user presence information, user business account, and the like), secure data, biometric information, audio information from on board microphones, video information from on board cameras, Internet browsing history and browsed content using an on board computer and/or the local area network enabled by the vehicle 104, geographic location information (e.g., a vendor storefront, roadway name, city name, etc.), and the like.
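One plausible, purely hypothetical shape for a stored user profile assembled from the sources listed above; the field names are assumptions for illustration and do not reproduce the data structure layouts of Figs. 12A-12D.

```python
# Hypothetical sketch of a profile record assembled from the sources
# listed above; field names are illustrative, not the Fig. 12A layout.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    seat_settings: dict[str, float] = field(default_factory=dict)
    hvac_settings: dict[str, float] = field(default_factory=dict)
    device_addresses: list[str] = field(default_factory=list)
    recent_destinations: list[str] = field(default_factory=list)

profile = UserProfile(
    user_id="user-216",
    seat_settings={"height_cm": 4.5, "recline_deg": 12.0},
    hvac_settings={"target_c": 21.5},
    device_addresses=["aa:bb:cc:dd:ee:ff"],
    recent_destinations=["work", "school"],
)
print(profile)
```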
[0347] The profile data 252 may include one or more user accounts. User accounts may include access and permissions to one or more settings and/or feature preferences associated with the vehicle 104, communications, infotainment, content, etc. In one example, a user account may allow access to certain settings for a particular user, while another user account may deny access to the settings for another user, and vice versa. The access controlled by the user account may be based on at least one of a user account priority, role, permission, age, family status, a group priority (e.g., the user account priority of one or more users, etc.), and/or a group age (e.g., the average age of users in the group, a minimum age of the users in the group, a maximum age of the users in the group, and/or combinations thereof, etc.).
[0348] For example, a user 216 may be allowed to purchase applications (e.g., software, etc.) for the vehicle 104 and/or a device associated with the vehicle 104 based on information associated with the user account. This user account information may include a preferred payment method, permissions, and/or other account information. As provided herein, the user account information may be part of the user profile and/or other data stored in the profile data 252. [0349] As another example, an adult user (e.g., a user with an age of 18 years old and/or over, etc.) may be located in an area of a vehicle 104, such as a rear passenger area.
Continuing this example, a child user (e.g., a user with an age of 17 years old and/or less, etc.) may be located in the same, or a close, area. In this example, the user account information in the profile data 252 associated with both the adult user and the child user may be used by the vehicle 104 in determining whether content is appropriate for the area given the age of the child user. For instance, a graphic movie containing violence (e.g., a movie associated with a mature rating, such as a Motion Picture Association of America (MPAA) rating of "R," "NC-17," etc.) may be suitable to present to a display device associated with the adult user but may not be acceptable to present to the display device if a 12-year old child user may see and/or hear the content of the movie.
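A minimal sketch of the age-based gating in this example: given the ages of the occupants who can see and/or hear a display, decide whether a rated title may be presented there. The rating-to-age mapping below is an assumption for illustration only, not a rule stated by the disclosure.

```python
# Hypothetical sketch: gate content on the youngest occupant who can see
# or hear the target display, per the R-rated-movie example above.
MIN_AGE_FOR_RATING = {"G": 0, "PG": 0, "PG-13": 13, "R": 17, "NC-17": 18}

def may_present(rating: str, ages_in_view: list[int]) -> bool:
    return min(ages_in_view) >= MIN_AGE_FOR_RATING.get(rating, 18)

print(may_present("R", [41]))      # adult alone: True
print(may_present("R", [41, 12]))  # 12-year-old nearby: False
```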
[0350] The vehicle control system 204 may also communicate with or through a communication network 224. The communication network 224 can represent any type of wireless and/or wired communication system that may be included within the vehicle 104 or operable to communicate outside the vehicle 104. Thus, the communication network 224 can include a local area communication capability and a wide area communication capability. For example, the communication network 224 can include a Bluetooth® wireless system, an 802.11x wireless system (e.g., 802.11G, 802.11N, 802.11AC, or the like), a CAN bus, an Ethernet network within the vehicle 104, or other types of communication networks that may function with or be associated with the vehicle 104. Further, the communication network 224 can also include wide area communication capabilities, including one or more of, but not limited to, a cellular communication capability, satellite telephone communication capability, a wireless wide area network communication capability, or other types of communication capabilities that allow for the vehicle control system 204 to communicate outside the vehicle 104.
[0351] The vehicle control system 204 may communicate through the communication network 224 to a server 228 that may be located in a facility that is not within physical proximity to the vehicle 104. Thus, the server 228 may represent a cloud computing system or cloud storage that allows the vehicle control system 204 to either gain access to further computing capabilities or to storage at a location outside of the vehicle 104. The server 228 can include a computer processor and memory and be similar to any computing system as understood to one skilled in the art.
[0352] Further, the server 228 may be associated with stored data 232. The stored data 232 may be stored in any system or by any method, as described in conjunction with system data 208, device data 220, and/or profile data 252. The stored data 232 can include information that may be associated with one or more users 216 or associated with one or more vehicles 104. The stored data 232, being stored in a cloud or in a distant facility, may be exchanged among vehicles 104 or may be used by a user 216 in different locations or with different vehicles 104. Additionally or alternatively, the server may be associated with profile data 252 as provided herein. It is anticipated that the profile data 252 may be accessed across the communication network 224 by one or more components of the system 200. Similar to the stored data 232, the profile data 252, being stored in a cloud or in a distant facility, may be exchanged among vehicles 104 or may be used by a user 216 in different locations or with different vehicles 104.
[0353] The vehicle control system 204 may also communicate with one or more sensors 236, 242, which are either associated with the vehicle 104 or communicate with the vehicle 104. Vehicle sensors 242 may include one or more sensors for providing information to the vehicle control system 204 that determine or provide information about the environment 100 in which the vehicle 104 is operating. Embodiments of these sensors may be as described in conjunction with Figs. 6A-7B. Non-vehicle sensor 236 can be any type of sensor that is not currently associated with the vehicle 104. For example, non-vehicle sensor 236 can be sensors in a traffic system operated by a third party that provides data to the vehicle control system 204. Further, the non-vehicle sensor(s) 236 can be other types of sensors which provide information about the distant environment 116 or other information about the vehicle 104 or the environment 100. These non-vehicle sensors 236 may be operated by third parties but provide information to the vehicle control system 204. Examples of information provided by the sensors 236 and that may be used by the vehicle control system 204 may include weather tracking data, traffic data, user health tracking data, vehicle maintenance data, or other types of data, which may provide environmental or other data to the vehicle control system 204. The vehicle control system 204 may also perform signal processing of signals received from one or more sensors 236, 242. Such signal processing may include estimation of a measured parameter from a single sensor, such as multiple measurements of a range state parameter from the vehicle 104 to an obstacle, and/or the estimation, blending, or fusion of a measured state parameter from multiple sensors such as multiple radar sensors or a combination of a ladar/lidar range sensor and a radar sensor. Signal processing of such sensor signal measurements may comprise stochastic signal processing, adaptive signal processing, and/or other signal processing techniques known to those skilled in the art.
[0354] The various sensors 236, 242 may include one or more sensor memory 244. Embodiments of the sensor memory 244 may be configured to store data collected by the sensors 236, 242. For example, a temperature sensor may collect temperature data associated with a vehicle 104, user 216, and/or environment, over time. The temperature data may be collected incrementally, in response to a condition, or at specific time periods. In this example, as the temperature data is collected, it may be stored in the sensor memory 244. In some cases, the data may be stored along with an identification of the sensor and a collection time associated with the data. Among other things, this stored data may include multiple data points and may be used to track changes in sensor
measurements over time. As can be appreciated, the sensor memory 244 can represent any type of database or other storage system.
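As one textbook instance of the estimation, blending, or fusion step mentioned in [0353], inverse-variance weighting combines range readings from multiple sensors (e.g., a radar and a lidar). This is offered as an illustrative technique under assumed noise figures, not as the disclosure's specific algorithm.

```python
# Hypothetical sketch: inverse-variance fusion of range readings from
# multiple sensors, one simple form of the blending noted in [0353].
def fuse(measurements: list[tuple[float, float]]) -> tuple[float, float]:
    """measurements: (range_m, variance) pairs -> fused (range, variance)."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused_range = sum(w * r for w, (r, _) in zip(weights, measurements)) / total
    return fused_range, 1.0 / total

radar = (24.3, 0.25)   # noisier
lidar = (24.9, 0.04)   # more precise, so weighted more heavily
print(fuse([radar, lidar]))  # fused range ~24.82 m with reduced variance
```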
[0355] The diagnostic communications module 256 may be configured to receive and transmit diagnostic signals and information associated with the vehicle 104. Examples of diagnostic signals and information may include, but are in no way limited to, vehicle system warnings, sensor data, vehicle component status, service information, component health, maintenance alerts, recall notifications, predictive analysis, and the like.
Embodiments of the diagnostic communications module 256 may handle warning/error signals in a predetermined manner. The signals, for instance, can be presented to one or more of a third party, occupant, vehicle control system 204, and a service provider (e.g., manufacturer, repair facility, etc.).
[0356] Optionally, the diagnostic communications module 256 may be utilized by a third party (i.e., a party other than the user 216, etc.) in communicating vehicle diagnostic information. For instance, a manufacturer may send a signal to a vehicle 104 to determine a status associated with one or more components associated with the vehicle 104. In response to receiving the signal, the diagnostic communications module 256 may communicate with the vehicle control system 204 to initiate a diagnostic status check. Once the diagnostic status check is performed, the information may be sent via the diagnostic communications module 256 to the manufacturer. This example may be especially useful in determining whether a component recall should be issued based on the status check responses returned from a certain number of vehicles.
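The request/response flow of [0356] might be sketched as follows, with the message shapes, component names, and status values all assumed for illustration.

```python
# Hypothetical sketch of the diagnostic flow in [0356]: manufacturer
# request -> status check via the vehicle control system -> response.
def run_status_check(component_ids: list[str]) -> dict[str, str]:
    # Stand-in for the vehicle control system's diagnostic routine.
    known = {"battery": "ok", "brake-pads": "worn"}
    return {cid: known.get(cid, "unknown") for cid in component_ids}

def handle_manufacturer_request(request: dict) -> dict:
    """Diagnostic communications module: receive query, return statuses."""
    statuses = run_status_check(request["components"])
    return {"vehicle_id": request["vehicle_id"], "statuses": statuses}

print(handle_manufacturer_request(
    {"vehicle_id": "VIN-123", "components": ["battery", "brake-pads"]}))
```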
[0357] Wired/wireless transceiver/communications ports 260 may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of wired/wireless transceiver/communications ports 260 include Ethernet ports, Universal Serial Bus (USB) ports, Institute of Electrical and Electronics Engineers (IEEE) 1394, or other interface ports.
[0358] An embodiment of a vehicle control environment 300 including a vehicle control system 204 may be as shown in Fig. 3. Beyond the vehicle control system 204, the vehicle control environment 300 can include one or more of, but is not limited to, a power source and/or power control module 316, a data storage module 320, user
interface(s)/input interface(s) 324, vehicle subsystems 328, user interaction subsystems 332, Global Positioning System (GPS)/Navigation subsystems 336, sensor(s) and/or sensor subsystems 340, communication subsystems 344, media subsystems 348, and/or device interaction subsystems 352. The subsystems, modules, components, etc. 316-352 may include hardware, software, firmware, computer readable media, displays, input devices, output devices, etc. or combinations thereof. The system, subsystems, modules, components, etc. 204, 316-352 may communicate over a network or bus 356. This communication bus 356 may be bidirectional and perform data communications using any known or future-developed standard or protocol. An example of the communication bus 356 may be as described in conjunction with Fig. 4.
[0359] The vehicle control system 204 can include a processor 304, memory 308, and/or an input/output (I/O) module 312. Thus, the vehicle control system 204 may be a computer system, which can comprise hardware elements that may be electrically coupled. The hardware elements may include one or more central processing units (CPUs) 304; one or more components of the I/O module 312 including input devices (e.g., a mouse, a keyboard, etc.) and/or one or more output devices (e.g., a display device, a printer, etc.).
[0360] The processor 304 may comprise a general purpose programmable processor or controller for executing application programming or instructions. The processor 304 may, optionally, include multiple processor cores, and/or implement multiple virtual processors. Additionally or alternatively, the processor 304 may include multiple physical processors. As a particular example, the processor 304 may comprise a specially configured application specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like. The processor 304 generally functions to run programming code or instructions implementing various functions of the vehicle control system 204.
[0361] The input/output module 312 and associated ports may be included to support communications over wired or wireless networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of an input/output module 312 include an Ethernet port, a Universal Serial Bus (USB) port, Institute of Electrical and Electronics Engineers (IEEE) 1394, or other interface.
[0362] The vehicle control system 204 may also include one or more storage devices 308. By way of example, storage devices 308 may be disk drives, optical storage devices, solid-state storage devices such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable and/or the like. The vehicle control system 204 may additionally include a computer-readable storage media reader; a communications system (e.g., a modem, a network card (wireless or wired), an infra-red communication device, etc.); and working memory 308, which may include RAM and ROM devices as described above. The vehicle control system 204 may also include a processing acceleration unit, which can include a digital signal processor (DSP), a special-purpose processor, and/or the like.
[0363] The computer-readable storage media reader can further be connected to a computer-readable storage medium, together (and, optionally, in combination with storage device(s)) comprehensively representing remote, local, fixed, and/or removable storage devices plus storage media for temporarily and/or more permanently containing computer- readable information. The communications system may permit data to be exchanged with an external or internal network and/or any other computer or device described herein. Moreover, as disclosed herein, the term "storage medium" may represent one or more devices for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices, and/or other machine readable mediums for storing information.
[0364] The vehicle control system 204 may also comprise software elements including an operating system and/or other code, as described in conjunction with Fig. 10. It should be appreciated that alternates to the vehicle control system 204 may have numerous variations from that described herein. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software (including portable software, such as applets), or both. Further, connection to other computing devices such as network input/output devices may be employed. [0365] The power source and/or power control module 316 can include any type of power source, including, but not limited to, batteries, alternating current sources (from connections to a building power system or power line), solar cell arrays, etc. One or more components or modules may also be included to control the power source or change the characteristics of the provided power signal. Such modules can include one or more of, but is not limited to, power regulators, power filters, alternating current (AC) to direct current (DC) converters, DC to AC converters, receptacles, wiring, other converters, etc. The power source and/or power control module 316 functions to provide the vehicle control system 204 and any other system with power.
[0366] The data storage 320 can include any module for storing, retrieving, and/or managing data in one or more data stores and/or databases. The database or data stores may reside on a storage medium local to (and/or resident in) the vehicle control system 204 or in the vehicle 104. Alternatively, some of the data storage capability may be remote from the vehicle control system 204 or automobile, and in communication (e.g., via a network) to the vehicle control system 204. The database or data stores may reside in a storage-area network ("SAN") familiar to those skilled in the art. Similarly, any necessary files for performing the functions attributed to the vehicle control system 204 may be stored locally on the respective vehicle control system 204 and/or remotely, as appropriate. The databases or data stores may be a relational database, and the data storage module 320 may be adapted to store, update, and retrieve data in response to specifically-formatted commands. The data storage module 320 may also perform data management functions for any flat file, object oriented, or other type of database or data store.
[0367] A first data store that may be part of the vehicle control environment 300 is a profile data store 252 for storing data about user profiles and data associated with the users. A system data store 208 can include data used by the vehicle control system 204 and/or one or more of the components 324-352 to facilitate the functionality described herein. The data stores 208 and/or 252 may be as described in conjunction with Figs. 1 and/or 12A-12D.
[0368] The user interface/input interfaces 324 may be as described herein for providing information or data and/or for receiving input or data from a user. Vehicle systems 328 can include any of the mechanical, electrical, electromechanical, computer, or other systems associated with the function of the vehicle 104. For example, vehicle systems 328 can include one or more of, but are not limited to, the steering system, the braking system, the engine and engine control systems, the electrical system, the suspension, the drive train, the cruise control system, the radio, the heating, ventilation, air conditioning
(HVAC) system, the windows and/or doors, etc. These systems are well known in the art and will not be described further.
[0369] Examples of the other systems and subsystems 324-352 may be as described further herein. For example, the user interface(s)/input interface(s) 324 may be as described in Figs. 2 and 8B; the vehicle subsystems 328 may be as described in Figs. 6A et seq.; the user interaction subsystem 332 may be as described in conjunction with the user/device interaction subsystem 817 of Fig. 8B; the navigation subsystem 336 may be as described in Figs. 6A and 8C; the sensor(s)/sensor subsystem 340 may be as described in Figs. 7A and 7B; the communication subsystem 344 may be as described in Figs. 2, 4, 5B, 5C, and 9; the media subsystem 348 may be as described in Fig. 8A; and, the device interaction subsystem 352 may be as described in Fig. 2 and in conjunction with the user/device interaction subsystem 817 of Fig. 8B.
[0370] Fig. 4 illustrates an optional communications channel architecture 400 and associated communications components, some of which can be interconnected via the communication channels/zones 404. The communication channels/zones 404 can carry information on one or more wired and/or wireless communications links; in the illustrated example, there are three communications channels/zones: 408, 412, and 416.
[0371] This optional environment 400 can also include an IP router 420, an operator cluster 424, one or more storage devices 428, one or more blades, such as master blade 432, and computational blades 436 and 440. Additionally, the communications channels/zones 404 can interconnect one or more displays, such as, remote display 1 444, remote display N 448, and console display 452. The communications channels/zones 404 also interconnect an access point 456, a Bluetooth® access point/USB hub 460, a Femtocell 464, and a storage controller 468, which is connected to one or more of USB devices 472, DVDs 476, or other storage devices 480. To assist with managing communications within the communication channel, the environment 400 optionally includes a firewall 484, which will be discussed hereinafter in greater detail. Other components that could also share the communications channel/zones 404 include GPS 488, media controller 492, which is connected to one or more media sources 496, and one or more subsystems, such as subsystem switches 498.
[0372] Optionally, the communications channels/zones 404 can be viewed as an I/O network or bus where the communications channels are carried on the same physical media. Optionally, the communication channels 404 can be split amongst one or more physical media and/or combined with one or more wireless communications protocols. Optionally, the communications channels 404 can be based on wireless protocols with no physical media interconnecting the various elements described herein.
[0373] The environment 400 shown in Fig. 4 can include a collection of blade processors that are housed in a "crate." The crate can have a PC-style backplane connector 408 and a backplane Ethernet 408 that allows the various blades to communicate with one another using, for example, Ethernet.
[0374] Various other functional elements illustrated in Fig. 4 can be integrated into this crate architecture with, as discussed hereinafter, various zones utilized for security.
Optionally, as illustrated in Fig. 4, the backplane 404/408 can have two separate Ethernet zones that may or may not be on the same communications channel. Optionally, the zones exist on a single communications channel on the I/O network/bus 408. Optionally, the zones are actually on different communications channels, e.g., 412, 416; however, the implementation is not restricted to any particular type of configuration. Rather, as illustrated in Fig. 4, there can be a red zone 417 and a green zone 413, and the I/O backplane on the network/bus 408 that enables standard I/O operations. This backplane or I/O network/bus 408 also optionally can provide power distribution to the various modules and blades illustrated in Fig. 4. The red and green Ethernet zones, 417 and 413 respectively, can be implemented as Ethernet switches, with one on each side of the firewall 484. In accordance with an optional embodiment, the two Ethernets (untrusted and trusted) are not connected to one another. Optionally, the connector geometry for the firewall can be different for the Ethernet zones than for the blades that are a part of the system.
[0375] The red zone 417 only needs to go from the modular connector to the input side of the backplane connector of the firewall 484. While Fig. 4 indicates that there are five external red zone connectors to the firewall 484, provisions can be made for any number of ports with the connections being made at the access point 456, the Bluetooth® access point (combo controller) 460, Femtocell 464, storage controller 468, and/or firewall 484. Optionally, the external port connections can be made through a manufacturer
configurable modular connector panel, and one or more of the red zone Ethernet ports could be available through a customer supplied crate which allows, for example, wired Ethernet connections from a bring-your-own-device (BYOD) to the firewall 484.
[0376] The green zone 413 goes from the output side of the firewall 484 and generally defines the trusted Ethernet. The Ethernet on the backplane 408 essentially implements an Ethernet switch for the entire system, defining the Ethernet backbone of the vehicle 104. All other modules, e.g., blades, etc., can connect to a standard backplane bus and the trusted Ethernet. Some number of switch ports can be reserved to connect to an output modular connector panel to distribute the Ethernet throughout the vehicle 104, e.g., connecting such elements as the console display 452, remote displays 444, 448, GPS 488, etc. Optionally, only trusted components, either provided or approved by the
manufacturer after testing, can be attached to the green zone 413, which is by definition in the trusted Ethernet environment.
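By way of a non-limiting illustration, the following Python sketch models the zone-admission behavior described above: the firewall 484 keeps untrusted (e.g., BYOD) connections in the red zone 417 and admits only manufacturer-certified components to the green zone 413. The Firewall class, the admit method, and the device identifiers are hypothetical names introduced purely for this example; certification is reduced to a simple set lookup.

    RED, GREEN = "red", "green"  # untrusted / trusted Ethernet zones

    class Firewall:
        """Bridges the red (untrusted) and green (trusted) zones."""
        def __init__(self, certified_ids):
            self.certified_ids = set(certified_ids)

        def admit(self, device_id, requested_zone):
            # Customer-supplied (BYOD) connections terminate in the red zone.
            if requested_zone == RED:
                return RED
            # Only manufacturer-certified components may join the green zone.
            return GREEN if device_id in self.certified_ids else RED

    fw = Firewall(certified_ids={"gps-488", "console-display-452"})
    print(fw.admit("gps-488", GREEN))     # 'green': certified component
    print(fw.admit("byod-phone", GREEN))  # 'red': uncertified device stays untrusted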
[0377] Optionally, the environment 400, shown in Fig. 4, utilizes IPv6 over Ethernet connections wherever possible. Using, for example, the Broadcom single-twisted pair Ethernet technology, wiring harnesses are simplified and data transmission speeds are maximized. However, while the Broadcom single-twisted pair Ethernet technology can be used, in general, systems and methods can work comparably well with any type of well-known Ethernet technology or other comparable communications technology.
[0378] As illustrated in Fig. 4, the I/O network/bus 408 is a split-bus concept that contains three independent bus structures:
[0379] The red zone 417 - the untrusted Ethernet environment. This zone 417 may be used to connect network devices and customer provided devices to the vehicle information system with these devices being on the untrusted side of the firewall 484.
[0380] The green zone 413 - the trusted Ethernet environment. This zone 413 can be used to connect manufacturer certified devices such as GPS units, remote displays, subsystem switches, and the like, to the vehicle network 404. Manufacturer certified devices can be implemented by vendors that allow the vehicle software system to validate whether or not a device is certified to operate with the vehicle 104. Optionally, only certified devices are allowed to connect to the trusted side of the network.
[0381] The I/O bus 409 - the I/O bus may be used to provide power and data
transmission to bus-based devices such as the vehicle solid state drive, the media controller blade 492, the computational blades 436, 440, and the like.
[0382] As an example, the split-bus structure can have the following minimum configuration:
[0383] Two slots for the red zone Ethernet;
[0384] One slot for built-in LTE/WiMax access 420 from the car to other network resources such as the cloud/Internet;
[0385] One slot for user devices or bring-your-own device access; this slot can implement, for example, WiFi, Bluetooth®, and/or USB connectivity 456, which can be provided in, for example, the customer crate;
[0386] One slot for combined red zone and green zone Ethernet; this slot can be reserved for the firewall controller;
[0387] Two slots for computational blades; here, the two computational blades are, as illustratively shown, the optional master blade 432 and the multimedia blade or controller 492, which can be provided as standard equipment; and
[0388] An expansion controller that allows the I/O bus to be extended and provides additional Ethernet switch ports for one or more of the red or green zones, which may require that the basic green zone Ethernet switch implementation support additional ports beyond the initial three needed for the basic exemplary system.
[0389] It should be possible to build Ethernet switches with 8, 16, or more ports that allow for expansion with existing component(s) in a straightforward manner.
[0390] The red zone 417 can be implemented as an 8-port Ethernet switch that has three actual bus ports within the crate, with the remaining five ports being available on the customer crate. The crate implements red zone slots for the firewall controller 484, the combo controller (456, 460), which includes WiFi, Bluetooth®, and a USB hub, and the IP router 420.
[0391] The firewall controller 484 can have a dedicated slot that bridges the red zone 417, green zone 413, and uses the I/O bus for power connections. In accordance with an optional low cost implementation, the firewall 484 can be implemented by a dummy module that simply bridges the red zone 417 and the green zone 413 without necessarily providing any firewall functionality. The combo controller 460 that includes the WiFi, Bluetooth®, and USB hub can be provided for consumer device connections. This controller can also implement the IPv6 (un-routable) protocol to ensure that all information is packetized for transmission via IP over the Ethernet in the I/O network/bus 408.
[0392] The combo controller 460 with the USB hub can have ports in the customer crate. The combo controller 460 can implement USB discovery functions and packetize the information for transmission via IP over Ethernet. The combo controller 460 can also facilitate installation of the correct USB driver for the discovered device, such as a BYOD from the user. The combo controller 460 and USB hub can then map the USB address to a "local" IPv6 address for interaction with one or more of the computational blades, which is generally going to be the media controller 492.
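One possible rendering of this USB-to-IPv6 mapping, as a minimal Python sketch: a discovered USB port number is folded into an un-routable local IPv6 prefix at which the media controller 492 could then address the device. The fd00::/64 prefix and the per-port suffix scheme are assumptions made for the example, not details taken from the disclosure.

    import ipaddress

    # Hypothetical un-routable ("local") prefix for in-crate addressing.
    LOCAL_PREFIX = ipaddress.IPv6Network("fd00::/64")

    def map_usb_to_ipv6(usb_port: int) -> ipaddress.IPv6Address:
        # Derive a per-port local address by indexing into the prefix.
        return LOCAL_PREFIX[usb_port + 1]

    # A BYOD discovered on USB port 3 becomes reachable at fd00::4:
    print(map_usb_to_ipv6(3))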
[0393] The IP router 420 can implement Internet access through a manufacturer provided service. This service can allow, for example, a manufacturer to offer value-added services to be integrated into the vehicle information systems. The existence of the manufacturer provided Internet access can also allow the "e-Call" function and other vehicle data recorder functions to be implemented. IP router 420 also allows, for example, WiMax, 4G LTE, and other connections to the Internet through a service provider that can be, for example, contracted by the manufacturer. Internally, the IP router 420 can allow cellular handset connections to the Internet through a Femtocell 464 that is part of the IP router implementation. The IP router 420, with the Femtocell 464, can also allow a cone of silence functionality to be implemented. The IP router 420 can be an optional component for a vehicle provided by, for example, the manufacturer, a dealer, or installed by a user. In the absence of the IP router 420, it is possible to connect a consumer handheld device to the I/O network/bus 408 using, for example, either WiFi or Bluetooth® 456, 460. While functionality may be somewhat reduced when using a handheld device instead of a built-in Ethernet connection, systems and methods of this invention can also work utilizing this consumer handheld device which then connects to the Internet via, for example, WiMax, 4G, 4G LTE, or the like.
[0394] Figs. 5A-5C show configurations of a vehicle 104. In general, a vehicle 104 may provide functionality based at least partially on one or more areas, zones, and distances, associated with the vehicle 104. Non-limiting examples of this functionality are provided herein below.
[0395] An arrangement or configuration for sensors within a vehicle 104 is as shown in Fig. 5A. The sensor arrangement 500 can include one or more areas 508 within the vehicle. An area can be a larger part of the environment inside or outside of the vehicle 104. Thus, area one 508A may include the area within the trunk space or engine space of the vehicle 104 and/or the front passenger compartment. Area two 508B may include a portion of the interior space 108 (e.g., a passenger compartment, etc.) of the vehicle 104. The area N, 508N, may include the trunk space or rear compartment area, when included within the vehicle 104. The interior space 108 may also be divided into other areas. Thus, one area may be associated with the front passenger's and driver's seats, a second area may be associated with the middle passengers' seats, and a third area may be associated with a rear passenger's seat. Each area 508 may include one or more sensors that are positioned or operate to provide environmental information about that area 508.
[0396] Each area 508 may be further separated into one or more zones 512 within the area 508. For example, area 1 508A may be separated into zone A 512A, and zone B 512B. Each zone 512 may be associated with a particular portion of the interior occupied by a passenger. For example, zone A 512A may be associated with a driver. Zone B 512B, may be associated with a front passenger. Each zone 512 may include one or more sensors that are positioned or configured to collect information about the environment or ecosystem associated with that zone or person.
[0397] A passenger area 508B may include more than two zones as described in conjunction with area 508A. For example, area 508B may include three zones, 512C, 512D, and 512E. These three separate zones 512C, 512D, and 512E may be associated with three passenger seats typically found in the rear passenger area of a vehicle 104. An area 508N may include a single zone 512N, as there may be no separate passenger areas but rather a single trunk area within the vehicle 104. The number of zones 512 within an area is unlimited, as is the number of areas 508 inside the vehicle 104. Further, it should be noted that there may be one or more areas 508 or zones 512 located outside the vehicle 104 that may have a specific set of sensors associated therewith.
[0398] Optionally, each area/access point 508, 456, 516, 520, and/or zone 512, associated with a vehicle 104, may comprise one or more sensors to determine a presence of a user 216 and/or device 212, 248 in and/or adjacent to each area 508, 456, 516, 520, and/or zone 512. The sensors may include vehicle sensors 242 and/or non-vehicle sensors 236 as described herein. It is anticipated that the sensors may be configured to
communicate with a vehicle control system 204 and/or the diagnostic communications module 256. Additionally or alternatively, the sensors may communicate with a device 212, 248. The communication of sensors with the vehicle 104 may initiate and/or terminate the control of device 212, 248 features. For example, a vehicle operator may be located in a second outside area 520 associated with a vehicle 104. As the operator approaches the first outside area 516, associated with the vehicle 104, the vehicle control system 204 may determine to control features associated with one or more devices 212, 248 and the diagnostic communications module 256.
[0399] Optionally, the location of the device 212, 248 relative to the vehicle 104 may determine vehicle functionality and/or features to be provided and/or restricted to a user 216. By way of example, a device 212, 248 associated with a user 216 may be located at a second outside area 520 from the vehicle 104. In this case, and based at least partially on the distance of the device 212, 248 from the vehicle 104 (e.g., provided by detecting the device 212, 248 at or beyond the second outside area 520) the vehicle 104 may lock one or more features (e.g., ignition access, vehicle access, communications ability, etc.) associated with the vehicle 104. Optionally, the vehicle 104 may provide an alert based on the distance of the device 212, 248 from the vehicle 104. Continuing the example above, once the device 212, 248 reaches the first outside area 516 of the vehicle 104 at least one of the vehicle features may be unlocked. For instance, by reaching the first outside area 516, the vehicle 104 may unlock a door of the vehicle 104. In some cases, when the device is detected to be inside the vehicle 104, the various sensors 236, 242 may determine that the user 216 is in an area 508 and/or zone 512. As is further described herein, features of the vehicle 104, device 212, 248, and/or other components may be controlled based on rules stored in a memory.
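The distance-based locking and unlocking described above might be sketched as follows; the area radii, the feature names, and the behavior between the two outside areas are illustrative assumptions only.

    # Hypothetical radii for the outside areas (in meters).
    FIRST_OUTSIDE_AREA_M = 5.0    # area 516: adjacent to the vehicle
    SECOND_OUTSIDE_AREA_M = 30.0  # area 520: far from the vehicle

    def features_for_distance(distance_m: float) -> dict:
        if distance_m >= SECOND_OUTSIDE_AREA_M:
            # Device at or beyond area 520: lock vehicle features.
            return {"doors": "locked", "ignition": "locked"}
        if distance_m > FIRST_OUTSIDE_AREA_M:
            # Between the areas: stay locked; an alert may be raised.
            return {"doors": "locked", "ignition": "locked", "alert": True}
        # Device has reached area 516: unlock a door; the ignition stays
        # locked until the user 216 is detected inside the vehicle.
        return {"doors": "unlocked", "ignition": "locked"}

    print(features_for_distance(42.0))  # everything locked
    print(features_for_distance(3.0))   # door unlocked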
[0400] Fig. 5B illustrates optional internal vehicle communications between one or more of the vehicle and one or more devices or between devices. Various communications can occur utilizing one or more Bluetooth®, NFC, WiFi, mobile hot spot, point-to-point communications, point-to-multipoint communications, an ad hoc network, or in general any known communications protocol over any known communications media or media-types.
[0401] Optionally, various types of internal vehicle communications can be facilitated using an access point 456 that utilizes one or more of Bluetooth®, NFC, WiFi, wireless Ethernet, mobile hot spot technology, or the like. Upon being connected with, and optionally authenticated to the access point 456, the connected device is able to
communicate with one or more of the vehicle and one or more other devices that are connected to the access point 456. The type of connection to the access point 456 can be based on, for example, the zone 512, in which the device is located.
[0402] The user may identify their zone 512 in conjunction with an authentication procedure to the access point 456. For example, a driver in zone A 512A, upon authenticating to the access point 456, can cause the access point 456 to send a query to the device asking the device user in which zone 512 they are located. As discussed hereinafter, the zone 512 the user device is located in may have an impact on the type of communications, available bandwidth, the types of other devices or vehicle systems or subsystems the device could communicate with, and the like. As a brief introduction, internal communications with zone A 512A may be given preferential treatment over those communications originating from area 2 508B, which could itself have preferential treatment over communications originating within area N 508N.
[0403] Moreover, the device in zone A 512A can include profile information that governs the other devices that are allowed to connect to the access point 456 and what those devices have access to, how they can communicate, how much bandwidth they are allocated, and the like. While, optionally, the device associated with zone A 512A will be considered the "master" controller of the profile that governs the internal vehicle communications, it should be appreciated that this was arbitrarily chosen, since it is assumed that there will always be a driver present in zone A 512A. However, the driver in zone A 512A, for example, may not have a communications device, in which case a device associated with one of the other areas or zones, such as zone B 512B, area 2 508B, or area N 508N, could also be associated with or control this master profile.
[0404] Optionally, various devices located within the various zones 512 can connect using, for example, ports provided by access point 456 or Bluetooth® access point/USB hub 460 as illustrated in Fig. 4. Similarly, the device(s) could connect utilizing the Femtocell 464 and optionally be directly connected via, for example, a standard Ethernet port.
[0405] As discussed, each one of the areas, area 1 508A, area 2 508B, and area N 508N, can have associated therewith a profile that governs, for example, how many and what types of devices can connect from that area 508, bandwidth allocated to that area 508, the types of media or content available to device(s) within that area 508, the interconnection of devices within that area 508 or between areas 508, or, in general, can control any aspect of communication of an associated device with any one or more other associated devices/vehicle systems within the vehicle 104.
[0406] Optionally, area 2 508B devices can be provided with full access to multimedia and infotainment available within the vehicle 104; however, devices in area 2 508B may be restricted from any access to vehicle functions. Only devices in area 1 508A may be able to access vehicle control functions, such as when "parents" are located in area 1 508A and the children are located in area 2 508B. Optionally, devices found in zone E 512E of area 2 508B may be able to access limited vehicle control functionality such as climate control within area 2. Similarly, devices in area N 508N may be able to control climate features within zone N 512N.
[0407] As will be appreciated, profiles can be established that allow management of communications within each of the areas 508, and further optionally within each of the zones 512. The profile can be granular in nature, controlling not only what type of devices can connect within each zone 512, but how those devices can communicate with other devices and/or the vehicle, and the types of information that can be communicated.
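As one non-limiting illustration of such granularity, the profiles could be represented as a per-area/per-zone permission structure like the Python sketch below; every key and value shown is an assumption introduced for the example, not a structure taken from the disclosure.

    profiles = {
        "area1/zoneA": {                    # driver
            "vehicle_controls": True,
            "infotainment": True,
            "blocked_content": ["text_messages", "video"],  # distractions
        },
        "area2/zoneE": {                    # rear passenger
            "vehicle_controls": False,      # no access to vehicle functions
            "climate_control_local": True,  # climate within area 2 only
            "infotainment": True,
        },
        "areaN/zoneN": {
            "vehicle_controls": False,
            "climate_control_local": True,  # climate within zone N only
            "infotainment": True,
        },
    }

    def may_use(profile_key: str, feature: str) -> bool:
        return bool(profiles.get(profile_key, {}).get(feature, False))

    print(may_use("area2/zoneE", "vehicle_controls"))  # False
    print(may_use("area1/zoneA", "vehicle_controls"))  # True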
[0408] To assist with identifying a location of a device within a zone 512, a number of different techniques can be utilized. One optional technique involves one or more of the vehicle sensors detecting the presence of an individual within one of the zones 512. Upon detection of an individual in a zone 512, communications subsystems 344 and the access point 456 can cooperate to not only associate the device within the zone 512 with the access point 456 but to also determine the location of the device within an area, and optionally within a zone 512. Once the device is established within a zone 512, a profile associated with the vehicle 104 can store information identifying that device and/or a person and optionally associating it with a particular zone 512 as a default. As discussed, there can be a master profile optionally associated with the device in zone A 512A; this master profile can govern communications with the communications subsystems 344 and where communications within vehicle 104 are to occur.
[0409] Some optional profiles are illustrated below where the Master Profile governs other device connectivity:
[0410] Master Profile: (table reproduced only as an image, imgf000051_0001, in the original publication)
[0412] Secondary Profile, Option 2: (table reproduced only as an image, imgf000052_0001, in the original publication; the surviving table text indicates that Area 2 or Zone N devices receive Internet and Femtocell access with no vehicle controls)
[0417] Optionally, a user's device, such as a SmartPhone, can store, in, for example, a profile, the zone 512 with which the user's device is associated. Then, assuming the user sits in the same zone 512 and area 508 as previously, the user's device can re-establish the same communications protocols with the access point 456 as were previously established.
[0418] In addition or in the alternative, the areas 508 and zones 512 can have associated therewith restrictions as to which other users' devices a given user's device can connect with. For example, a first user's device can connect with any other user device in area 2 508B or area N 508N, but is restricted from connecting with a user device in area 1 508A, zone A 512A. However, the first user device may be able to communicate with another user's device that is located in area 1 508A, zone B 512B. These communications can include any type of standard communications such as sharing content, exchanging messages, forwarding or sharing multimedia or infotainment, or in general can include any communications that would ordinarily be available between two devices and/or the vehicle and vehicle systems. As discussed, there may be restrictions on the type of communications that can be sent to the device in area 1 508A, zone A 512A. For example, the user's device in area 1 508A, zone A 512A may be restricted from receiving one or more of text messages, multimedia, infotainment, or in general anything that can be envisioned as a potential distraction to the driver. Moreover, it should be appreciated that the communications between the various devices and the various zones 512 need not necessarily occur with the assistance of access point 456, but the
communications could also occur directly between the device(s).
[0419] Fig. 5C outlines optional internal vehicle communications between one or more of the vehicle and one or more devices. More specifically, Fig. 5C illustrates an example of vehicle communications where the vehicle 104 is equipped with the necessary transceivers to provide a mobile hot spot functionality to any user device(s) therein, such as user devices 248A and 248N.
[0420] Optionally, and as discussed above, one or more user devices can connect to the access point 456. This access point 456 is equipped to handle communications routing to not only the communication network/buses 224 for intra-vehicle communications, but optionally can also communicate with, for example, the Internet or the cloud, in cooperation with transceiver 260. Optionally included is a firewall 484 that has the capability of not only blocking certain types of content, such as malicious content, but can also operate to exclude certain types of communications from emanating from the vehicle 104 and transceiver 260. As will be appreciated, various profiles could be established in the firewall 484 that control not only the type of communications that can be received at the vehicle 104, but the type of communications that can be sent from the vehicle 104.
[0421] The transceiver 260 can be any type of well-known wireless transceiver that communicates using a known communications protocol such as WiMax, 4G, 4G LTE, 3G, or the like. The user devices can communicate via, for example, WiFi link 248 with the access point 456, with the transceiver 260 providing Internet connectivity to the various user devices. As will be appreciated, there may need to be an account associated with transceiver 260 with a wireless carrier to provide data and/or voice connectivity to enable the user devices to communicate with the Internet. Typically, the account is established on a month-to-month basis with an associated fee, but the account could also be based on the amount of data to be transmitted or received, or established in any other manner.
[0422] Moreover, one or more of the user's devices and access point 456 can maintain profile information that governs how the user's devices are able to communicate with other devices, and optionally the Internet. Optionally, a profile can exist that only allows the user's devices to communicate with other user's devices and/or the vehicle, multimedia and/or the vehicle infotainment system, and may not be allowed access to the Internet via transceiver 260. The profile can stipulate that the user's device could connect to the Internet via transceiver 260 for a specified period of time and/or up to a certain amount of data usage. The user's device can have full access to the Internet via transceiver 260 with no limit on time or data usage, which would reduce the data usage of the user's device since it is connected via WiFi to the access point 456, but would increase the data usage by transceiver 260, and therefore, shift the billing for that data usage to the transceiver 260 instead of the user's device. Still further, and as previously discussed, the various profiles may stipulate which user's device has priority for use of the bandwidth provided by the transceiver 260. For example, a user's device located in area 1 508A, zone A 512A may be given preferential routing treatment of data above that of a user's device in zone N 512N. In this manner, for example, a driver would be given priority for Internet access above that of the passengers. This could become important, for example, when the driver is trying to obtain traffic or direction information or, for example, when the vehicle is performing a download to update various software features.
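A minimal sketch of such preferential routing, assuming a simple weighting scheme (the weights themselves are not taken from the disclosure): bandwidth from the transceiver 260 is divided among active zones in proportion to an assumed per-zone priority.

    # Hypothetical per-zone priorities; the driver's zone weighs most.
    PRIORITY = {"area1/zoneA": 3, "area1/zoneB": 2, "areaN/zoneN": 1}

    def allocate(total_kbps: float, active_zones: list) -> dict:
        weights = {z: PRIORITY.get(z, 1) for z in active_zones}
        total_weight = sum(weights.values())
        return {z: total_kbps * w / total_weight for z, w in weights.items()}

    # The driver receives three times the share of a rear passenger:
    print(allocate(10000, ["area1/zoneA", "areaN/zoneN"]))
    # {'area1/zoneA': 7500.0, 'areaN/zoneN': 2500.0}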
[0423] As will be appreciated, the optional firewall 484 can cooperate with the access point 456 and the various profiles that are associated with the various devices within the vehicle 104 and can fully implement communications restrictions, control bandwidth limits, Internet accessibility, malicious software blocking, and the like. Moreover, the optional firewall 484 can be accessed by an administrator, with one or more of these configuration settings edited through an administrator's control panel. For example, in a scenario where parents are always in area 1 508A, it may be appropriate to give all of the user's devices in area 1 508A full access to the Internet utilizing transceiver 260, while restricting access and/or bandwidth to any other user devices within the vehicle 104. As the user's device and profile would be known by the firewall 484, upon the user's device being associated with the access point 456, the firewall 484 and transceiver 260 can be configured to allow communications in accordance with the stored profile.
[0424] A set of sensors or vehicle components 600 associated with the vehicle 104 may be as shown in Fig. 6A. The vehicle 104 can include, among many other components common to vehicles, wheels 607, a power source 609 (such as an engine, motor, or energy storage system (e.g., battery or capacitive energy storage system)), a manual or automatic transmission 612, a manual or automatic transmission gear controller 616, a power controller 620 (such as a throttle), a vehicle control system 204, the display device 212, a braking system 636, a steering wheel 640, a power source activation/deactivation switch 644 (e.g., an ignition), an occupant seating system 648, a wireless signal receiver 653 to receive wireless signals from signal sources such as roadside beacons and other electronic roadside devices, a satellite positioning system receiver 657 (e.g., a Global Positioning System ("GPS") (US), GLONASS (Russia), Galileo positioning system (EU), Compass navigation system (China), and Regional Navigational Satellite System (India) receiver), and driverless systems (e.g., cruise control systems, automatic steering systems, automatic braking systems, etc.).
[0425] The vehicle 104 can include a number of sensors in wireless or wired
communication with the vehicle control system 204 and/or display device 212, 248 to collect sensed information regarding the vehicle state, configuration, and/or operation. Exemplary sensors may include one or more of, but are not limited to, wheel state sensor 660 to sense one or more of vehicle speed, acceleration, deceleration, wheel rotation, wheel speed (e.g., wheel revolutions-per-minute), wheel slip, and the like, a power source energy output sensor 664 to sense a power output of the power source 609 by measuring one or more of current engine speed (e.g., revolutions-per-minute), energy input and/or output (e.g., voltage, current, fuel consumption, and torque) (e.g., turbine speed sensor, input speed sensor, crankshaft position sensor, manifold absolute pressure sensor, mass flow sensor, and the like), and the like, a switch state sensor 668 to determine a current activation or deactivation state of the power source activation/deactivation switch 644, a transmission setting sensor 670 to determine a current setting of the transmission (e.g., gear selection or setting), a gear controller sensor 672 to determine a current setting of the gear controller 616, a power controller sensor 674 to determine a current setting of the power controller 620, a brake sensor 676 to determine a current state (braking or non-braking) of the braking system 636, a seating system sensor 678 to determine a seat setting and current weight of a seated occupant (if any) in a selected seat of the seating system 648, exterior and interior sound receivers 690 and 692 (e.g., a microphone, sonar, and other type of acoustic-to-electric transducer or sensor) to receive and convert sound waves into an equivalent analog or digital signal. Examples of other sensors (not shown) that may be employed include safety system state sensors to determine a current state of a vehicular safety system (e.g., air bag setting (deployed or undeployed) and/or seat belt setting (engaged or not engaged)), light setting sensor (e.g., current headlight, emergency light, brake light, parking light, fog light, interior or passenger compartment light, and/or tail light state (on or off)), brake control (e.g., pedal) setting sensor, accelerator pedal setting or angle sensor, clutch pedal setting sensor, emergency brake pedal setting sensor, door setting (e.g., open, closed, locked or unlocked) sensor, engine temperature sensor, passenger compartment or cabin temperature sensor, window setting (open or closed) sensor, one or more interior-facing or exterior-facing cameras or other imaging sensors (which commonly convert an optical image into an electronic signal but may include other devices for detecting objects such as an electromagnetic radiation emitter/receiver that emits electromagnetic radiation and receives electromagnetic waves reflected by the object) to sense objects, such as other vehicles and pedestrians and optionally determine the distance, trajectory and speed of such objects, in the vicinity or path of the vehicle, odometer reading sensor, trip mileage reading sensor, wind speed sensor, radar transmitter/receiver output, brake wear sensor, steering/torque sensor, oxygen sensor, ambient lighting sensor, vision system sensor, ranging sensor, parking sensor, heating, venting, and air conditioning (HVAC) sensor, water sensor, air-fuel ratio meter, blind spot monitor, hall effect sensor, microphone, radio frequency (RF) sensor, infrared (IR) sensor, vehicle control system sensors, wireless network sensor (e.g., Wi-Fi and/or
Bluetooth® sensor), cellular data sensor, and other sensors either future-developed or known to those of skill in the vehicle art.
[0426] In the depicted vehicle embodiment, the various sensors can be in
communication with the display device 212, 248 and vehicle control system 204 via signal carrier network 224. As noted, the signal carrier network 224 can be a network of signal conductors, a wireless network (e.g., a radio frequency, microwave, or infrared
communication system using a communications protocol, such as Wi-Fi), or a
combination thereof. The vehicle control system 204 may also provide signal processing of one or more sensors, sensor fusion of similar and/or dissimilar sensors, signal smoothing in the case of erroneous "wild point" signals, and/or sensor fault detection. For example, ranging measurements provided by one or more RF sensors may be combined with ranging measurements from one or more IR sensors to determine one fused estimate of vehicle range to an obstacle target.
[0427] The control system 204 may receive and read sensor signals, such as wheel and engine speed signals, as a digital input comprising, for example, a pulse width modulated (PWM) signal. The processor 304 can be configured, for example, to read each of the signals into a port configured as a counter or configured to generate an interrupt on receipt of a pulse, such that the processor 304 can determine, for example, the engine speed in revolutions per minute (RPM) and the speed of the vehicle in miles per hour (MPH) and/or kilometers per hour (KPH). One skilled in the art will recognize that the two signals can be received from existing sensors in a vehicle comprising a tachometer and a speedometer, respectively. Alternatively, the current engine speed and vehicle speed can be received in a communication packet as numeric values from a conventional dashboard subsystem comprising a tachometer and a speedometer. The transmission speed sensor signal can be similarly received as a digital input comprising a signal coupled to a counter or interrupt signal of the processor 304 or received as a value in a communication packet on a network or port interface from an existing subsystem of the vehicle 104. The ignition sensor signal can be configured as a digital input, wherein a HIGH value represents that the ignition is on and a LOW value represents that the ignition is OFF. Three bits of the port interface can be configured as a digital input to receive the gear shift position signal, representing eight possible gear shift positions. Alternatively, the gear shift position signal can be received in a communication packet as a numeric value on the port interface. The throttle position signal can be received as an analog input value, typically in the range 0-5 volts. Alternatively, the throttle position signal can be received in a communication packet as a numeric value on the port interface. The output of other sensors can be processed in a similar fashion.
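As a worked illustration of the pulse-counting arithmetic described above, the sketch below scales a pulse count accumulated over a fixed sampling window into engine RPM and vehicle speed; the pulses-per-revolution figures, window length, and tire circumference are assumptions chosen only to make the numbers concrete.

    ENGINE_PULSES_PER_REV = 4   # assumed tachometer pulses per revolution
    WHEEL_PULSES_PER_REV = 8    # assumed speedometer pulses per wheel turn
    WINDOW_S = 0.1              # sampling window in seconds
    TIRE_CIRCUMFERENCE_M = 2.0  # assumed rolling circumference

    def engine_rpm(pulse_count: int) -> float:
        revs = pulse_count / ENGINE_PULSES_PER_REV
        return revs * (60.0 / WINDOW_S)   # revolutions per minute

    def vehicle_mph(wheel_pulse_count: int) -> float:
        revs = wheel_pulse_count / WHEEL_PULSES_PER_REV
        meters_per_s = revs * TIRE_CIRCUMFERENCE_M / WINDOW_S
        return meters_per_s * 3600.0 / 1609.344  # m/s -> miles per hour

    print(engine_rpm(13))   # 13 pulses in 0.1 s -> 1950.0 RPM
    print(vehicle_mph(11))  # 11 pulses in 0.1 s -> ~61.5 MPH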
[0428] Other sensors may be included and positioned in the interior space 108 of the vehicle 104. Generally, these interior sensors obtain data about the health of the driver and/or passenger(s), data about the safety of the driver and/or passenger(s), and/or data about the comfort of the driver and/or passenger(s). The health data sensors can include sensors in the steering wheel that can measure various health telemetry for the person (e.g., heart rate, temperature, blood pressure, blood presence, blood composition, etc.). Sensors in the seats may also provide for health telemetry (e.g., presence of liquid, weight, weight shifts, etc.). Infrared sensors could detect a person's temperature; optical sensors can determine a person's position and whether the person has become unconscious. Other health sensors are possible and included herein.
[0429] Safety sensors can measure whether the person is acting safely. Optical sensors can determine a person's position and focus. If the person stops looking at the road ahead, the optical sensor can detect the lack of focus. Sensors in the seats may detect if a person is leaning forward or may be injured by a seat belt in a collision. Other sensors can detect that the driver has at least one hand on a steering wheel. Other safety sensors are possible and contemplated as if included herein.
[0430] Comfort sensors can collect information about a person's comfort. Temperature sensors may detect a temperature of the interior cabin. Moisture sensors can determine a relative humidity. Audio sensors can detect loud sounds or other distractions. Audio sensors may also receive input from a person through voice data. Other comfort sensors are possible and contemplated as if included herein.
[0431] Fig. 6B shows an optional interior sensor configuration for one or more zones 512 of a vehicle 104. Optionally, the areas 508 and/or zones 512 of a vehicle 104 may include sensors that are configured to collect information associated with the interior 108 of a vehicle 104. In particular, the various sensors may collect environmental information, user information, and safety information, to name a few. Embodiments of these sensors may be as described in conjunction with Figs. 7A-8B.
[0432] Optionally, the sensors may include one or more of optical, or image, sensors 622A-B (e.g., cameras, etc.), motion sensors 624A-B (e.g., utilizing RF, IR, and/or other sound/image sensing, etc.), steering wheel user sensors 642 (e.g., heart rate, temperature, blood pressure, sweat, health, etc.), seat sensors 677 (e.g., weight, load cell, moisture, electrical, force transducer, etc.), safety restraint sensors 679 (e.g., seatbelt, airbag, load cell, force transducer, etc.), interior sound receivers 692A-B, environmental sensors 694 (e.g., temperature, humidity, air, oxygen, etc.), and the like.
[0433] The image sensors 622A-B may be used alone or in combination to identify objects, users 216, and/or other features, inside the vehicle 104. Optionally, a first image sensor 622A may be located in a different position within a vehicle 104 from a second image sensor 622B. When used in combination, the image sensors 622A-B may combine captured images to form, among other things, stereo and/or three-dimensional (3D) images. The stereo images can be recorded and/or used to determine depth associated with objects and/or users 216 in a vehicle 104. Optionally, the image sensors 622A-B used in combination may determine the complex geometry associated with identifying characteristics of a user 216. For instance, the image sensors 622A-B may be used to determine dimensions between various features of a user's face (e.g., the depth/distance from a user's nose to a user's cheeks, a linear distance between the center of a user's eyes, and more). These dimensions may be used to verify, record, and even modify
characteristics that serve to identify a user 216. As can be appreciated, utilizing stereo images can allow for a user 216 to provide complex gestures in a 3D space of the vehicle 104. These gestures may be interpreted via one or more of the subsystems as disclosed herein. Optionally, the image sensors 622A-B may be used to determine movement associated with objects and/or users 216 within the vehicle 104. It should be appreciated that the number of image sensors used in a vehicle 104 may be increased to provide greater dimensional accuracy and/or views of a detected image in the vehicle 104.
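The depth determination from combined images can be summarized by the classic stereo relation Z = f·B/d (depth equals focal length times baseline over disparity). The following sketch applies that relation; the focal length and sensor baseline values are assumptions, not parameters of the disclosed sensors 622A-B.

    def depth_m(disparity_px: float,
                focal_px: float = 800.0,    # assumed focal length in pixels
                baseline_m: float = 0.5):   # assumed spacing of sensors 622A-B
        """Pinhole stereo relation: Z = f * B / d."""
        return focal_px * baseline_m / disparity_px

    # A facial feature shifted 400 px between the two captured images
    # lies about one meter from the sensor pair:
    print(depth_m(400.0))  # 1.0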
[0434] The vehicle 104 may include one or more motion sensors 624A-B. These motion sensors 624A-B may detect motion and/or movement of objects inside the vehicle 104. Optionally, the motion sensors 624A-B may be used alone or in combination to detect movement. For example, a user 216 may be operating a vehicle 104 (e.g., while driving, etc.) when a passenger in the rear of the vehicle 104 unbuckles a safety belt and proceeds to move about the vehicle 104. In this example, the movement of the passenger could be detected by the motion sensors 624A-B. Optionally, the user 216 could be alerted of this movement by one or more of the devices 212, 248 in the vehicle 104. In another example, a passenger may attempt to reach for one of the vehicle control features (e.g., the steering wheel 640, the console, icons displayed on the head unit and/or device 212, 248, etc.). In this case, the movement (i.e., reaching) of the passenger may be detected by the motion sensors 624A-B. Optionally, the path, trajectory, anticipated path, and/or some other direction of movement/motion may be determined using the motion sensors 624A-B. In response to detecting the movement and/or the direction associated with the movement, the passenger may be prevented from interfacing with and/or accessing at least some of the vehicle control features (e.g., the features represented by icons may be hidden from a user interface, the features may be locked from use by the passenger, combinations thereof, etc.). As can be appreciated, the user 216 may be alerted of the movement/motion such that the user 216 can act to prevent the passenger from interfering with the vehicle 104 controls. Optionally, the number of motion sensors in a vehicle 104, or areas of a vehicle 104, may be increased to increase an accuracy associated with motion detected in the vehicle 104.
[0435] The interior sound receivers 692A-B may include, but are not limited to, microphones and other types of acoustic-to-electric transducers or sensors. Optionally, the interior sound receivers 692A-B may be configured to receive and convert sound waves into an equivalent analog or digital signal. The interior sound receivers 692A-B may serve to determine one or more locations associated with various sounds in the vehicle 104. The location of the sounds may be determined based on a comparison of volume levels, intensity, and the like, between sounds detected by two or more interior sound receivers 692A-B. For instance, a first interior sound receiver 692A may be located in a first area of the vehicle 104 and a second interior sound receiver 692B may be located in a second area of the vehicle 104. If a sound is detected at a first volume level by the first interior sound receiver 692A and a second, higher, volume level by the second interior sound receiver 692B in the second area of the vehicle 104, the sound may be determined to be closer to the second area of the vehicle 104. As can be appreciated, the number of sound receivers used in a vehicle 104 may be increased (e.g., more than two, etc.) to increase measurement accuracy surrounding sound detection and location, or source, of the sound (e.g., via triangulation, etc.).
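A minimal sketch of this level-comparison approach (triangulation with more receivers would follow the same pattern): the receiver reporting the loudest level marks the area nearest the sound. The receiver labels and decibel values are illustrative assumptions.

    def nearest_receiver(levels_db: dict) -> str:
        # The loudest reading marks the receiver, and hence the area,
        # closest to the sound source.
        return max(levels_db, key=levels_db.get)

    levels = {"692A/area1": 54.0, "692B/area2": 61.5}
    print(nearest_receiver(levels))  # '692B/area2': sound closer to area 2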
[0436] Seat sensors 677 may be included in the vehicle 104. The seat sensors 677 may be associated with each seat and/or zone 512 in the vehicle 104. Optionally, the seat sensors 677 may provide health telemetry and/or identification via one or more of load cells, force transducers, weight sensors, moisture detection sensor, electrical
conductivity/resistance sensor, and the like. For example, the seat sensors 677 may determine that a user 216 weighs 180 lbs. This value may be compared to user data stored in memory to determine whether a match exists between the detected weight and a user 216 associated with the vehicle 104. In another example, if the seat sensors 677 detect that a user 216 is fidgeting, or moving, in a seemingly uncontrollable manner, the system may determine that the user 216 has suffered a nervous and/or muscular system issue (e.g., seizure, etc.). The vehicle control system 204 may then cause the vehicle 104 to slow down and, in addition or alternatively, the automobile controller 8104 (described below) can safely take control of the vehicle 104 and bring the vehicle 104 to a stop in a safe location (e.g., out of traffic, off a freeway, etc.).
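The weight-matching step in this example might look like the sketch below, assuming stored per-user weights and a fixed tolerance; both values are hypothetical, as is the guest fallback.

    KNOWN_USERS = {"alice": 135.0, "bob": 180.0}  # stored weights in lbs
    TOLERANCE_LBS = 5.0                           # assumed match window

    def match_user(measured_lbs: float):
        for name, stored in KNOWN_USERS.items():
            if abs(measured_lbs - stored) <= TOLERANCE_LBS:
                return name
        return None  # no match; treat the occupant as a guest

    print(match_user(180.0))  # 'bob'
    print(match_user(150.0))  # None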
[0437] Health telemetry and other data may be collected via the steering wheel user sensors 642. Optionally, the steering wheel user sensors 642 may collect heart rate, temperature, blood pressure, and the like, associated with a user 216 via at least one contact disposed on or about the steering wheel 640.
[0438] The safety restraint sensors 679 may be employed to determine a state associated with one or more safety restraint devices in a vehicle 104. The state associated with one or more safety restraint devices may serve to indicate a force observed at the safety restraint device, a state of activity (e.g., retracted, extended, various ranges of extension and/or retraction, deployment, buckled, unbuckled, etc.), damage to the safety restraint device, and more.
[0439] Environmental sensors 694, including one or more of temperature, humidity, air, oxygen, carbon monoxide, smoke, and other environmental condition sensors may be used in a vehicle 104. These environmental sensors 694 may be used to collect data relating to the safety, comfort, and/or condition of the interior space 108 of the vehicle 104. Among other things, the data collected by the environmental sensors 694 may be used by the vehicle control system 204 to alter functions of a vehicle. The environment may correspond to an interior space 108 of a vehicle 104 and/or specific areas 508 and/or zones 512 of the vehicle 104. It should be appreciated that an environment may correspond to a user 216. For example, a low oxygen environment may be detected by the environmental sensors 694 and associated with a user 216 who is operating the vehicle 104 in a particular zone 512. In response to detecting the low oxygen environment, at least one of the subsystems of the vehicle 104, as provided herein, may alter the environment, especially in the particular zone 512, to increase the amount of oxygen in the zone 512. Additionally or alternatively, the environmental sensors 694 may be used to report conditions associated with a vehicle (e.g., fire detected, low oxygen, low humidity, high carbon monoxide, etc.). The conditions may be reported to a user 216 and/or a third party via at least one communications module as provided herein.
[0440] Among other things, the sensors as disclosed herein may communicate with each other, with devices 212, 248, and/or with the vehicle control system 204 via the signal carrier network 224. Additionally or alternatively, the sensors disclosed herein may serve to provide data relevant to more than one category of sensor information including, but not limited to, combinations of environmental information, user information, and safety information, to name a few.
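The low-oxygen response in paragraph [0439] might reduce to a threshold check like the following; the 19.5% threshold and the hvac_boost and notify callables are assumptions made for the sketch.

    O2_MIN_PERCENT = 19.5  # assumed minimum acceptable oxygen concentration

    def check_zone_oxygen(zone, o2_percent, hvac_boost, notify):
        if o2_percent < O2_MIN_PERCENT:
            hvac_boost(zone)  # alter the environment in the affected zone 512
            notify(f"low oxygen in {zone}: {o2_percent:.1f}%")  # report it

    check_zone_oxygen("area1/zoneA", 18.9,
                      hvac_boost=lambda z: print(f"increasing fresh air in {z}"),
                      notify=print)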
[0441] Figs. 7A-7B show block diagrams of various sensors that may be associated with a vehicle 104. Although depicted as interior and exterior sensors, it should be appreciated that any one or more of the sensors shown may be used in both the interior space 108 and the exterior space of the vehicle 104. Moreover, sensors having the same symbol or name may include the same, or substantially the same, functionality as those sensors described elsewhere in the present disclosure. Further, although the various sensors are depicted in conjunction with specific groups (e.g., environmental 708, 708E, user interface 712, safety 716, 716E, etc.) the sensors should not be limited to the groups in which they appear. In other words, the sensors may be associated with other groups or combinations of groups and/or disassociated from one or more of the groups shown. The sensors as disclosed herein may communicate with each other, the devices 212, 248, and/or the vehicle control system 204 via one or more communications channel(s) 356.
[0442] Fig. 7A is a block diagram of an embodiment of interior sensors 340 for a vehicle 104. The interior sensors 340 may be arranged into one or more groups, based at least partially on the function of the interior sensors 340. The interior space 108 of a vehicle 104 may include an environmental group 708, a user interface group 712, and a safety group 716. Additionally or alternatively, there may be sensors associated with various devices inside the vehicle (e.g., devices 212, 248, smart phones, tablets, mobile computers, etc.).
[0443] The environmental group 708 may comprise sensors configured to collect data relating to the internal environment of a vehicle 104. It is anticipated that the environment of the vehicle 104 may be subdivided into areas 508 and zones 512 in an interior space 108 of a vehicle 104. In this case, each area 508 and/or zone 512 may include one or more of the environmental sensors. Examples of environmental sensors associated with the environmental group 708 may include, but are not limited to, oxygen/air sensors 724, temperature sensors 728, humidity sensors 732, light/photo sensors 736, and more. The oxygen/air sensors 724 may be configured to detect a quality of the air in the interior space 108 of the vehicle 104 (e.g., ratios and/or types of gasses comprising the air inside the vehicle 104, dangerous gas levels, safe gas levels, etc.). Temperature sensors 728 may be configured to detect temperature readings of one or more objects, users 216, and/or areas 508 of a vehicle 104. Humidity sensors 732 may detect an amount of water vapor present in the air inside the vehicle 104. The light/photo sensors 736 can detect an amount of light present in the vehicle 104. Further, the light/photo sensors 736 may be configured to detect various levels of light intensity associated with light in the vehicle 104.
[0444] The user interface group 712 may comprise sensors configured to collect data relating to one or more users 216 in a vehicle 104. As can be appreciated, the user interface group 712 may include sensors that are configured to collect data from users 216 in one or more areas 508 and zones 512 of the vehicle 104. For example, each area 508 and/or zone 512 of the vehicle 104 may include one or more of the sensors in the user interface group 712. Examples of user interface sensors associated with the user interface group 712 may include, but are not limited to, infrared sensors 740, motion sensors 744, weight sensors 748, wireless network sensors 752, biometric sensors 756, camera (or image) sensors 760, audio sensors 764, and more.
[0445] Infrared sensors 740 may be used to measure IR light irradiating from at least one surface, user 216, or other object in the vehicle 104. Among other things, the infrared sensors 740 may be used to measure temperatures, form images (especially in low light conditions), identify users 216, and even detect motion in the vehicle 104.
[0446] The motion sensors 744 may be similar to the motion detectors 624A-B, as described in conjunction with Fig. 6B. Weight sensors 748 may be employed to collect data relating to objects and/or users 216 in various areas 508 of the vehicle 104. In some cases, the weight sensors 748 may be included in the seats and/or floor of a vehicle 104.
[0447] Optionally, the vehicle 104 may include a wireless network sensor 752. This sensor 752 may be configured to detect one or more wireless network(s) inside the vehicle 104. Examples of wireless networks may include, but are not limited to, wireless communications utilizing Bluetooth®, Wi-Fi™, ZigBee, IEEE 802.11, and other wireless technology standards. For example, a mobile hotspot may be detected inside the vehicle 104 via the wireless network sensor 752. In this case, the vehicle 104 may determine to utilize and/or share the mobile hotspot detected via/with one or more other devices 212, 248 and/or components associated with the vehicle 104.
[0448] Biometric sensors 756 may be employed to identify and/or record characteristics associated with a user 216. It is anticipated that biometric sensors 756 can include at least one of image sensors, IR sensors, fingerprint readers, weight sensors, load cells, force transducers, heart rate monitors, blood pressure monitors, and the like as provided herein.
[0449] The camera sensors 760 may be similar to image sensors 622A-B, as described in conjunction with Fig. 6B. Optionally, the camera sensors may record still images, video, and/or combinations thereof. The audio sensors 764 may be similar to the interior sound receivers 692A-B, as described in conjunction with Figs. 6A-6B. The audio sensors may be configured to receive audio input from a user 216 of the vehicle 104. The audio input from a user 216 may correspond to voice commands, conversations detected in the vehicle 104, phone calls made in the vehicle 104, and/or other audible expressions made in the vehicle 104.
[0450] The safety group 716 may comprise sensors configured to collect data relating to the safety of a user 216 and/or one or more components of a vehicle 104. The vehicle 104 may be subdivided into areas 508 and/or zones 512 in an interior space 108 of a vehicle 104 where each area 508 and/or zone 512 may include one or more of the safety sensors provided herein. Examples of safety sensors associated with the safety group 716 may include, but are not limited to, force sensors 768, mechanical motion sensors 772, orientation sensors 776, restraint sensors 780, and more.
[0451] The force sensors 768 may include one or more sensors inside the vehicle 104 configured to detect a force observed in the vehicle 104. One example of a force sensor 768 may include a force transducer that converts measured forces (e.g., force, weight, pressure, etc.) into output signals.
[0452] Mechanical motion sensors 772 may correspond to encoders, accelerometers, damped masses, and the like. Optionally, the mechanical motion sensors 772 may be adapted to measure the force of gravity (i.e., G-force) as observed inside the vehicle 104. Measuring the G-force observed inside a vehicle 104 can provide valuable information related to a vehicle's acceleration, deceleration, collisions, and/or forces that may have been suffered by one or more users 216 in the vehicle 104. As can be appreciated, the mechanical motion sensors 772 can be located in an interior space 108 or an exterior of the vehicle 104.
[0453] Orientation sensors 776 can include accelerometers, gyroscopes, magnetic sensors, and the like that are configured to detect an orientation associated with the vehicle 104. Similar to the mechanical motion sensors 772, the orientation sensors 776 can be located in an interior space 108 or an exterior of the vehicle 104.

[0454] The restraint sensors 780 may be similar to the safety restraint sensors 679 as described in conjunction with Figs. 6A-6B. These sensors 780 may correspond to sensors associated with one or more restraint devices and/or systems in a vehicle 104. Seatbelts and airbags are examples of restraint devices and/or systems. As can be appreciated, the restraint devices and/or systems may be associated with one or more sensors that are configured to detect a state of the device/system. The state may include extension, engagement, retraction, disengagement, deployment, and/or other electrical or mechanical conditions associated with the device/system.
[0455] The associated device sensors 720 can include any sensors that are associated with a device 212, 248 in the vehicle 104. As previously stated, typical devices 212, 248 may include smart phones, tablets, laptops, mobile computers, and the like. It is anticipated that the various sensors associated with these devices 212, 248 can be employed by the vehicle control system 204. For example, a typical smart phone can include an image sensor, an IR sensor, an audio sensor, a gyroscope, an accelerometer, a wireless network sensor, a fingerprint reader, and more. It is an aspect of the present disclosure that one or more of these associated device sensors 720 may be used by one or more subsystems of the vehicle system 200.
[0456] In Fig. 7B, a block diagram of an embodiment of exterior sensors 340 for a vehicle 104 is shown. The exterior sensors may include sensors that are identical, or substantially similar, to those previously disclosed in conjunction with the interior sensors of Fig. 7A. Optionally, the exterior sensors 340 may be configured to collect data relating to one or more conditions, objects, users 216, and other events that are external to the interior space 108 of the vehicle 104. For instance, the oxygen/air sensors 724 may measure a quality and/or composition of the air outside of a vehicle 104. As another example, the motion sensors 744 may detect motion outside of a vehicle 104.
[0457] The external environmental group 708E may comprise sensors configured to collect data relating to the external environment of a vehicle 104. In addition to including one or more of the sensors previously described, the external environmental group 708E may include additional sensors, such as, vehicle sensors 750, biological sensors, and wireless signal sensors 758. Vehicle sensors 750 can detect vehicles that are in an environment surrounding the vehicle 104. For example, the vehicle sensors 750 may detect vehicles in a first outside area 516, a second outside area 520, and/or combinations of the first and second outside areas 516, 520. Optionally, the vehicle sensors 750 may include one or more of RF sensors, IR sensors, image sensors, and the like to detect vehicles, people, hazards, etc. that are in an environment exterior to the vehicle 104.
Additionally or alternatively, the vehicle sensors 750 can provide distance/directional information relating to a distance (e.g., distance from the vehicle 104 to the detected object) and/or a direction (e.g., direction of travel, etc.) associated with the detected object.
[0458] The biological sensors 754 may determine whether one or more biological entities (e.g., an animal, a person, a user 216, etc.) is in an external environment of the vehicle 104. Additionally or alternatively, the biological sensors 754 may provide distance information relating to a distance of the biological entity from the vehicle 104. Biological sensors 754 may include at least one of RF sensors, IR sensors, image sensors and the like that are configured to detect biological entities. For example, an IR sensor may be used to determine that an object, or biological entity, has a specific temperature, temperature pattern, or heat signature. Continuing this example, the determined heat signature may be compared to known heat signatures associated with recognized biological entities (e.g., based on shape, locations of temperature, and combinations thereof, etc.) to determine whether the heat signature is associated with a biological entity or an inanimate, or non-biological, object.
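By way of a non-limiting illustration, the following Python sketch shows one way the heat-signature comparison described above could be realized. The signature representation (a short vector of per-region temperatures), the reference signatures, and the match threshold are assumptions made for illustration and are not specified by this disclosure.

```python
# Hypothetical sketch: classifying an IR heat signature as biological or not.
# The signature format (a vector of region temperatures) and the reference
# data below are illustrative assumptions.

KNOWN_BIOLOGICAL_SIGNATURES = {
    "person": [36.5, 34.0, 33.0],   # head, torso, limbs (deg C)
    "dog":    [38.5, 37.0, 35.5],
}

MATCH_THRESHOLD = 2.0  # max mean deviation (deg C) to accept a match

def classify_heat_signature(measured):
    """Return the best-matching biological entity, or None for inanimate objects."""
    best_label, best_score = None, float("inf")
    for label, reference in KNOWN_BIOLOGICAL_SIGNATURES.items():
        # Mean absolute temperature difference across sampled regions.
        score = sum(abs(m - r) for m, r in zip(measured, reference)) / len(reference)
        if score < best_score:
            best_label, best_score = label, score
    return best_label if best_score <= MATCH_THRESHOLD else None

print(classify_heat_signature([36.2, 33.8, 32.5]))  # -> "person"
print(classify_heat_signature([20.0, 20.0, 20.0]))  # -> None (inanimate)
```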
[0459] The wireless signal sensors 758 may include one or more sensors configured to receive wireless signals from signal sources such as Wi-Fi™ hotspots, cell towers, roadside beacons, other electronic roadside devices, and satellite positioning systems. Optionally, the wireless signal sensors 758 may detect wireless signals from one or more of a mobile phone, mobile computer, keyless entry device, RFID device, near field communications (NFC) device, and the like.
[0460] The external safety group 716E may comprise sensors configured to collect data relating to the safety of a user 216 and/or one or more components of a vehicle 104.
Examples of safety sensors associated with the external safety group 716E may include, but are not limited to, force sensors 768, mechanical motion sensors 772, orientation sensors 776, vehicle body sensors 782, and more. Optionally, the exterior safety sensors 716E may be configured to collect data relating to one or more conditions, objects, vehicle components, and other events that are external to the vehicle 104. For instance, the force sensors 768 in the external safety group 716E may detect and/or record force information associated with the outside of a vehicle 104. For example, if an object strikes the exterior of the vehicle 104, the force sensors 768 from the exterior safety group 716E may determine a magnitude, location, and/or time associated with the strike.

[0461] The vehicle 104 may include a number of vehicle body sensors 782. The vehicle body sensors 782 may be configured to measure characteristics associated with the body (e.g., body panels, components, chassis, windows, etc.) of a vehicle 104. For example, two vehicle body sensors 782, including a first body sensor and a second body sensor, may be located at some distance apart. Continuing this example, the first body sensor may be configured to send an electrical signal across the body of the vehicle 104 to the second body sensor, or vice versa. Upon receiving the electrical signal from the first body sensor, the second body sensor may record a detected current, voltage, resistance, and/or combinations thereof associated with the received electrical signal. Values (e.g., current, voltage, resistance, etc.) for the sent and received electrical signal may be stored in a memory. These values can be compared to determine whether subsequent electrical signals sent and received between vehicle body sensors 782 deviate from the stored values. When the subsequent signal values deviate from the stored values, the difference may serve to indicate damage and/or loss of a body component. Additionally or alternatively, the deviation may indicate a problem with the vehicle body sensors 782. The vehicle body sensors 782 may communicate with each other, a vehicle control system 204, and/or systems of the vehicle system 200 via a communications channel 356.
Although described using electrical signals, it should be appreciated that alternative embodiments of the vehicle body sensors 782 may use sound waves and/or light to perform a similar function.
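The baseline-comparison logic for the vehicle body sensors 782 might be sketched as follows. The stored baseline values, the field names, and the ten-percent tolerance are illustrative assumptions; an actual implementation would calibrate these per vehicle.

```python
# Hypothetical sketch of the body-sensor comparison described above: a baseline
# reading between two body sensors is stored in memory, and later readings are
# compared against it to flag possible body damage or a sensor fault.

BASELINE = {"current_a": 0.50, "voltage_v": 12.0, "resistance_ohm": 24.0}
TOLERANCE = 0.10  # flag deviations greater than 10% of the stored baseline

def check_body_signal(reading):
    """Compare a new sensor-to-sensor reading to the stored baseline values."""
    deviations = {}
    for key, expected in BASELINE.items():
        measured = reading[key]
        if abs(measured - expected) > TOLERANCE * abs(expected):
            deviations[key] = measured
    return deviations  # an empty dict means the body path looks intact

result = check_body_signal({"current_a": 0.31, "voltage_v": 12.1,
                            "resistance_ohm": 38.5})
if result:
    print("Possible body damage or sensor fault:", result)
```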
[0462] Fig. 8A is a block diagram of an embodiment of a media controller subsystem 348 for a vehicle 104. The media controller subsystem 348 may include, but is not limited to, a media controller 804, a media processor 808, a match engine 812, an audio processor 816, a speech synthesis module 820, a network transceiver 824, a signal processing module 828, memory 832, and a language database 836. Optionally, the media controller subsystem 348 may be configured as a dedicated blade that implements the media-related functionality of the system 200. Additionally or alternatively, the media controller subsystem 348 can provide voice input, voice output, library functions for multimedia, and display control for various areas 508 and/or zones 512 of the vehicle 104.
[0463] Optionally, the media controller subsystem 348 may include a local IP address (e.g., IPv4, IPv6, combinations thereof, etc.) and even a routable, global unicast address. The routable, global unicast address may allow for direct addressing of the media controller subsystem 348 for streaming data from Internet resources (e.g., cloud storage, user accounts, etc.). It is anticipated that the media controller subsystem 348 can provide multimedia via at least one Internet connection, or wireless network communications module, associated with the vehicle 104. Moreover, the media controller subsystem 348 may be configured to service multiple independent clients simultaneously.
[0464] The media processor 808 may comprise a general purpose programmable processor or controller for executing application programming or instructions related to the media subsystem 348. The media processor 808 may include multiple processor cores, and/or implement multiple virtual processors. Optionally, the media processor 808 may include multiple physical processors. By way of example, the media processor 808 may comprise a specially configured application specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like. The media processor 808 generally functions to run programming code or instructions implementing various functions of the media controller 804.
[0465] The match engine 812 can receive input from one or more components of the vehicle system 800 and perform matching functions. Optionally, the match engine 812 may receive audio input provided via a microphone 886 of the system 800. The audio input may be provided to the media controller subsystem 348 where the audio input can be decoded and matched, via the match engine 812, to one or more functions available to the vehicle 104. Similar matching operations may be performed by the match engine 812 relating to video input received via one or more image sensors, cameras 878, and the like.
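A minimal sketch of the matching step performed by the match engine 812 appears below, assuming an upstream speech decoder has already turned the audio input into text. The command table and the function identifiers are invented for illustration.

```python
# A minimal sketch of the matching step: decoded audio (assumed to arrive as
# text from an upstream speech decoder) is matched against functions available
# to the vehicle. The command table and function names are assumptions.

VEHICLE_FUNCTIONS = {
    "open window": "window.open",
    "play music": "media.play",
    "call home": "phone.dial_home",
}

def match_command(decoded_text):
    """Return the vehicle function keyed by the decoded utterance, if any."""
    phrase = decoded_text.strip().lower()
    return VEHICLE_FUNCTIONS.get(phrase)  # None when no function matches

assert match_command("Play Music") == "media.play"
assert match_command("do a barrel roll") is None
```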
[0466] The media controller subsystem 348 may include a speech synthesis module 820 configured to provide audio output to one or more speakers 880, or audio output devices, associated with the vehicle 104. Optionally, the speech synthesis module 820 may be configured to provide audio output based at least partially on the matching functions performed by the match engine 812.
[0467] As can be appreciated, the coding/decoding, the analysis of audio input/output, and/or other operations associated with the match engine 812 and speech synthesis module 820, may be performed by the media processor 808 and/or a dedicated audio processor 816. The audio processor 816 may comprise a general purpose programmable processor or controller for executing application programming or instructions related to audio processing. Further, the audio processor 816 may be similar to the media processor 808 described herein.
[0468] The network transceiver 824 can include any device configured to transmit and receive analog and/or digital signals. Optionally, the media controller subsystem 348 may utilize a network transceiver 824 in one or more communication networks associated with the vehicle 104 to receive and transmit signals via the communications channel 356.
Additionally or alternatively, the network transceiver 824 may accept requests from one or more devices 212, 248 to access the media controller subsystem 348. One example of the communication network is a local-area network (LAN). As can be appreciated, the functionality associated with the network transceiver 824 may be built into at least one other component of the vehicle 104 (e.g., a network interface card, communications module, etc.).
[0469] The signal processing module 828 may be configured to alter audio/multimedia signals received from one or more input sources (e.g., microphones 886, etc.) via the communications channel 356. Among other things, the signal processing module 828 may alter the signals received electrically, mathematically, combinations thereof, and the like.
[0470] The media controller 804 may also include memory 832 for use in connection with the execution of application programming or instructions by the media processor 808, and for the temporary or long term storage of program instructions and/or data. As examples, the memory 832 may comprise RAM, DRAM, SDRAM, or other solid state memory.
[0471] The language database 836 may include the data and/or libraries for one or more languages, as are used to provide the language functionality as provided herein. In one case, the language database 836 may be loaded on the media controller 804 at the point of manufacture. Optionally, the language database 836 can be modified, updated, and/or otherwise changed to alter the data stored therein. For instance, additional languages may be supported by adding the language data to the language database 836. In some cases, this addition of languages can be performed via accessing administrative functions on the media controller 804 and loading the new language modules via wired (e.g., USB, etc.) or wireless communication. In some cases, the administrative functions may be available via a vehicle console device 248, a user device 212, 248, and/or other mobile computing device that is authorized to access administrative functions (e.g., based at least partially on the device's address, identification, etc.).
[0472] One or more video controllers 840 may be provided for controlling the video operation of the devices 212, 248, 882 associated with the vehicle. Optionally, the video controller 840 may include a display controller for controlling the operation of touch sensitive screens, including input (touch sensing) and output (display) functions. Video data may include data received in a stream and unpacked by a processor and loaded into a display buffer. In this example, the processor and video controller 840 can optimize the display based on the characteristics of a screen of a display device 212, 248, 882. The functions of a touch screen controller may be incorporated into other components, such as a media processor 808 or display subsystem.
[0473] The audio controller 844 can provide control of the audio entertainment system (e.g., radio, subscription music service, multimedia entertainment, etc.), and other audio associated with the vehicle 104 (e.g., navigation systems, vehicle comfort systems, convenience systems, etc.). Optionally, the audio controller 844 may be configured to translate digital signals to analog signals and vice versa. As can be appreciated, the audio controller 844 may include device drivers that allow the audio controller 844 to communicate with other components of the system 800 (e.g., processors 816, 808, audio I/O 874, and the like).
[0474] The system 800 may include a profile identification module 848 to determine whether a user profile is associated with the vehicle 104. Among other things, the profile identification module 848 may receive requests from a user 216, or device 212, 228, 248, to access a profile stored in a profile database 856 or profile data 252. Additionally or alternatively, the profile identification module 848 may request profile information from a user 216 and/or a device 212, 228, 248, to access a profile stored in a profile database 856 or profile data 252. In any event, the profile identification module 848 may be configured to create, modify, retrieve, and/or store user profiles in the profile database 856 and/or profile data 252. The profile identification module 848 may include rules for profile identification, profile information retrieval, creation, modification, and/or control of components in the system 800.
[0475] By way of example, a user 216 may enter the vehicle 104 with a smart phone or other device 212. In response to determining that a user 216 is inside the vehicle 104, the profile identification module 848 may determine that a user profile is associated with the user's smart phone 212. As another example, the system 800 may receive information about a user 216 (e.g., from a camera 878, microphone 886, etc.), and, in response to receiving the user information, the profile identification module 848 may refer to the profile database 856 to determine whether the user information matches a user profile stored in the database 856. It is anticipated that the profile identification module 848 may communicate with the other components of the system to load one or more preferences, settings, and/or conditions based on the user profile. Further, the profile identification module 848 may be configured to control components of the system 800 based on user profile information.
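The profile lookup described above might be sketched as follows, first matching on a device identifier and then falling back to sensed user characteristics. The profile schema and the matching rules are assumptions for illustration.

```python
# Hedged sketch of profile identification: look a profile up by the device
# identifier first, then fall back to matching a sensed user characteristic.
# The schema (device_ids, face_id, preferences) is an illustrative assumption.

PROFILE_DATABASE = {
    "profile-1": {"device_ids": {"phone-abc"}, "face_id": "sig-42",
                  "preferences": {"seat_height": 7, "cabin_temp_c": 21}},
}

def identify_profile(device_id=None, face_id=None):
    for profile in PROFILE_DATABASE.values():
        if device_id and device_id in profile["device_ids"]:
            return profile
        if face_id and face_id == profile["face_id"]:
            return profile
    return None  # no match; a new profile could be created here

profile = identify_profile(device_id="phone-abc")
if profile:
    print(profile["preferences"])  # handed to other subsystems to apply
```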
[0476] Optionally, data storage 852 may be provided. Like the memory 832, the data storage 852 may comprise a solid state memory device or devices. Alternatively or in addition, the data storage 852 may comprise a hard disk drive or other random access memory. Similar to the data storage 852, the profile database 856 may comprise a solid state memory device or devices.
[0477] An input/output module 860 and associated ports may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of an input/output module 860 include an Ethernet port, a Universal Serial Bus (USB) port, CAN Bus, Institute of Electrical and Electronics Engineers (IEEE) 1394, or other interface. Users may bring their own devices (e.g., Bring Your Own Device (BYOD), device 212, etc.) into the vehicle 104 for use with the various systems disclosed. Although most BYOD devices can connect to the vehicle systems (e.g., the media controller subsystem 348, etc.) via wireless communications protocols (e.g., Wi-Fi™, Bluetooth®, etc.), many devices may require a direct connection via USB, or similar. In any event, the input/output module 860 can provide the necessary connection of one or more devices to the vehicle systems described herein.
[0478] A video input/output interface 864 can be included to receive and transmit video signals between the various components in the system 800. Optionally, the video input/output interface 864 can operate with compressed and uncompressed video signals. The video input/output interface 864 can support high data rates associated with image capture devices. Additionally or alternatively, the video input/output interface 864 may convert analog video signals to digital signals.
[0479] The infotainment system 870 may include information media content and/or entertainment content, informational devices, entertainment devices, and the associated programming therefor. Optionally, the infotainment system 870 may be configured to handle the control of one or more components of the system 800 including, but in no way limited to, radio, streaming audio/video devices, audio devices 880, 882, 886, video devices 878, 882, travel devices (e.g., GPS, navigational systems, etc.), wireless communication devices, network devices, and the like. Further, the infotainment system 870 can provide the functionality associated with other infotainment features as provided herein.

[0480] An audio input/output interface 874 can be included to provide analog audio to an interconnected speaker 880 or other device, and to receive analog audio input from a connected microphone 886 or other device. As an example, the audio input/output interface 874 may comprise an associated amplifier and analog to digital converter.
Alternatively or in addition, the devices 212, 248 can include integrated audio input/output devices 880, 886 and/or an audio jack for interconnecting an external speaker 880 or microphone 886. For example, an integrated speaker 880 and an integrated microphone 886 can be provided, to support near talk, voice commands, spoken information exchange, and/or speaker phone operations.
[0481] Among other things, the system 800 may include devices that are part of the vehicle 104 and/or part of a device 212, 248 that is associated with the vehicle 104. For instance, these devices may be configured to capture images, display images, capture sound, and present sound. Optionally, the system 800 may include at least one of image sensors/cameras 878, display devices 882, audio input devices/microphones 886, and audio output devices/speakers 880. The cameras 878 can be included for capturing still and/or video images. Alternatively or in addition, image sensors 878 can include a scanner or code reader. An image sensor/camera 878 can include or be associated with additional elements, such as a flash or other light source. In some cases, the display device 882 may include an audio input device and/or an audio output device in addition to providing video functions. For instance, the display device 882 may be a console, monitor, a tablet computing device, and/or some other mobile computing device.
[0482] Fig. 8B is a block diagram of an embodiment of a user/device interaction subsystem 817 in a vehicle system 800. The user/device interaction subsystem 817 may comprise hardware and/or software that conduct various operations for or with the vehicle 104. For instance, the user/device interaction subsystem 817 may include at least one user interaction subsystem 332 and device interaction subsystem 352 as previously described. These operations may include, but are not limited to, providing information to the user 216, receiving input from the user 216, and controlling the functions or operation of the vehicle 104, etc. Among other things, the user/device interaction subsystem 817 may include a computing system operable to conduct the operations as described herein.
[0483] Optionally, the user/device interaction subsystem 817 can include one or more of the components and modules provided herein. For instance, the user/device interaction subsystem 817 can include one or more of a video input/output interface 864, an audio input/output interface 874, a sensor module 814, a device interaction module 818, a user identification module 822, a vehicle control module 826, an environmental control module 830, and a gesture control module 834. The user/device interaction subsystem 817 may be in communication with other devices, modules, and components of the system 800 via the communications channel 356.
[0484] The user/device interaction subsystem 817 may be configured to receive input from a user 216 and/or device via one or more components of the system. By way of example, a user 216 may provide input to the user/device interaction subsystem 817 via wearable devices 802, 806, 810, video input (e.g., via at least one image sensor/camera 878, etc.), audio input (e.g., via the microphone, audio input source, etc.), gestures (e.g., via at least one image sensor 878, motion sensor 888, etc.), device input (e.g., via a device 212, 248 associated with the user, etc.), combinations thereof, and the like.
[0485] The wearable devices 802, 806, 810 can include heart rate monitors, blood pressure monitors, glucose monitors, pedometers, movement sensors, wearable computers, and the like. Wearable computers may be worn by a user 216 and configured to measure user activity, determine energy spent based on the measured activity, track user sleep habits, determine user oxygen levels, monitor heart rate, provide alarm functions, and more. It is anticipated that the wearable devices 802, 806, 810 can communicate with the user/device interaction subsystem 817 via wireless communications channels or direct connection (e.g., where the device docks, or connects, with a USB port or similar interface of the vehicle 104).
[0486] A sensor module 814 may be configured to receive and/or interpret input provided by one or more sensors in the vehicle 104. In some cases, the sensors may be associated with one or more user devices (e.g., wearable devices 802, 806, 810, smart phones 212, mobile computing devices 212, 248, and the like). Optionally, the sensors may be associated with the vehicle 104, as described in conjunction with Figs. 6A-7B.
[0487] The device interaction module 818 may communicate with the various devices as provided herein. Optionally, the device interaction module 818 can provide content, information, data, and/or media associated with the various subsystems of the vehicle system 800 to one or more devices 212, 248, 802, 806, 810, 882, etc. Additionally or alternatively, the device interaction module 818 may receive content, information, data, and/or media associated with the various devices provided herein.
[0488] The user identification module 822 may be configured to identify a user 216 associated with the vehicle 104. The identification may be based on user profile information that is stored in profile data 252. For instance, the user identification module 822 may receive characteristic information about a user 216 via a device, a camera, and/or some other input. The received characteristics may be compared to data stored in the profile data 252. Where the characteristics match, the user 216 is identified. As can be appreciated, where the characteristics do not match a user profile, the user identification module 822 may communicate with other subsystems in the vehicle 104 to obtain and/or record profile information about the user 216. This information may be stored in a memory and/or the profile data storage 252.
[0489] The vehicle control module 826 may be configured to control settings, features, and/or the functionality of a vehicle 104. In some cases, the vehicle control module 826 can communicate with the vehicle control system 204 to control critical functions (e.g., driving system controls, braking, accelerating, etc.) and/or noncritical functions (e.g., driving signals, indicator/hazard lights, mirror controls, window actuation, etc.) based at least partially on user/device input received by the user/device interaction subsystem 817.
[0490] The environmental control module 830 may be configured to control settings, features, and/or other conditions associated with the environment, especially the interior environment, of a vehicle 104. Optionally, the environmental control module 830 may communicate with the climate control system (e.g., changing cabin temperatures, fan speeds, air direction, etc.), oxygen and/or air quality control system (e.g.,
increase/decrease oxygen in the environment, etc.), interior lighting (e.g., changing intensity of lighting, color of lighting, etc.), an occupant seating system 648 (e.g., adjusting seat position, firmness, height, etc.), steering wheel 640 (e.g., position adjustment, etc.), infotainment/entertainment system (e.g., adjust volume levels, display intensity adjustment, change content, etc.), and/or other systems associated with the vehicle environment. Additionally or alternatively, these systems can provide input, set-points, and/or responses to the environmental control module 830. As can be appreciated, the environmental control module 830 may control the environment based at least partially on user/device input received by the user/device interaction subsystem 817.
[0491] The gesture control module 834 is configured to interpret gestures provided by a user 216 in the vehicle 104. Optionally, the gesture control module 834 may provide control signals to one or more of the vehicle systems 300 disclosed herein. For example, a user 216 may provide gestures to control the environment, critical and/or noncritical vehicle functions, the infotainment system, communications, networking, and more.
Optionally, gestures may be provided by a user 216 and detected via one or more of the sensors as described in conjunction with Figs. 6B-7A. As another example, one or more motion sensors 888 may receive gesture input from a user 216 and provide the gesture input to the gesture control module 834. Continuing this example, the gesture input is interpreted by the gesture control module 834. This interpretation may include comparing the gesture input to gestures stored in a memory. The gestures stored in memory may include one or more functions and/or controls mapped to specific gestures. When a match is determined between the detected gesture input and the stored gesture information, the gesture control module 834 can provide a control signal to any of the systems/subsystems as disclosed herein.
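A minimal sketch of this gesture-to-control mapping is shown below; the gesture names, the subsystem targets, and the control-signal format are illustrative assumptions.

```python
# Minimal sketch of the gesture-matching flow described above: detected gesture
# input is compared to gestures stored in memory, each mapped to a control.

STORED_GESTURES = {
    "swipe_left": ("infotainment", "next_track"),
    "palm_down":  ("environment", "lower_volume"),
    "fist":       ("communications", "end_call"),
}

def handle_gesture(detected_gesture):
    """Look up the detected gesture and emit a control signal if it matches."""
    match = STORED_GESTURES.get(detected_gesture)
    if match is None:
        return None  # unrecognized gesture; ignore
    subsystem, action = match
    return {"target": subsystem, "action": action}  # control signal

print(handle_gesture("palm_down"))  # {'target': 'environment', 'action': 'lower_volume'}
```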
[0492] Fig. 8C illustrates the GPS/Navigation subsystem(s) 336. The Navigation subsystem(s) 336 can be any present or future-built navigation system that may use location data, for example, from the Global Positioning System (GPS), to provide navigation information or control the vehicle 104. The Navigation subsystem(s) 336 can include several components or modules, such as, one or more of, but not limited to, a GPS Antenna/receiver 892, a location module 896, a maps database 8100, an automobile controller 8104, a vehicle systems transceiver 8108, a traffic controller 8112, a network traffic transceiver 8116, a vehicle-to-vehicle transceiver 8120, a traffic information database 8124, etc. Generally, the several components or modules 892-8124 may be hardware, software, firmware, computer readable media, or combinations thereof.
[0493] A GPS Antenna/receiver 892 can be any antenna, GPS puck, and/or receiver capable of receiving signals from a GPS satellite or other navigation system, as mentioned hereinbefore. The signals may be demodulated, converted, interpreted, etc. by the GPS Antenna/receiver 892 and provided to the location module 896. Thus, the GPS
Antenna/receiver 892 may convert the time signals from the GPS system and provide a location (e.g., coordinates on a map) to the location module 896. Alternatively, the location module 896 can interpret the time signals into coordinates or other location information.
[0494] The location module 896 can be the controller of the satellite navigation system designed for use in automobiles. The location module 896 can acquire position data, as from the GPS Antenna/receiver 892, to locate the user or vehicle 104 on a road in the unit's map database 8100. Using the road database 8100, the location module 896 can give directions to other locations along roads also in the database 8100. When a GPS signal is not available, the location module 896 may apply dead reckoning to estimate distance data from sensors 242 including one or more of, but not limited to, a speed sensor attached to the drive train of the vehicle 104, a gyroscope, an accelerometer, etc. GPS signal loss and/or multipath can occur due to urban canyons, tunnels, and other obstructions.
Additionally or alternatively, the location module 896 may use known locations of Wi-Fi hotspots, cell tower data, etc. to determine the position of the vehicle 104, such as by using time difference of arrival (TDOA) and/or frequency difference of arrival (FDOA) techniques.
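The dead-reckoning fallback described above might look like the following sketch, which advances the last known fix using the drive-train speed sensor and a gyroscope-derived heading. The flat-earth step and the unit conventions are simplifying assumptions.

```python
# A sketch of dead reckoning: when no GPS fix is available, the last known
# position is advanced using speed, heading, and elapsed time.

import math

def dead_reckon(last_position, speed_mps, heading_deg, elapsed_s):
    """Advance an (east, north) position in meters by speed * time."""
    x, y = last_position
    distance = speed_mps * elapsed_s
    heading = math.radians(heading_deg)        # 0 deg = north, clockwise
    return (x + distance * math.sin(heading),  # east component
            y + distance * math.cos(heading))  # north component

# Vehicle traveling 20 m/s due east for 5 s while GPS is lost in a tunnel:
print(dead_reckon((0.0, 0.0), 20.0, 90.0, 5.0))  # ~(100.0, 0.0)
```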
[0495] The maps database 8100 can include any hardware and/or software to store information about maps, geographical information system information, location information, etc. The maps database 8100 can include any data definition or other structure to store the information. Generally, the maps database 8100 can include a road database that may include one or more vector maps of areas of interest. Street names, street numbers, house numbers, and other information can be encoded as geographic coordinates so that the user can find some desired destination by street address. Points of interest (waypoints) can also be stored with their geographic coordinates. For example, a point of interest may include speed cameras, fuel stations, public parking, and "parked here" (or "you parked here") information. The map database contents can be produced or updated by a server connected through a wireless system in communication with the Internet, even as the vehicle 104 is driven along existing streets, yielding an up-to-date map.
[0496] An automobile controller 8104 can be any hardware and/or software that can receive instructions from the location module 896 or the traffic controller 8112 and operate the vehicle 104. The automobile controller 8104 receives this information and data from the sensors 242 to operate the vehicle 104 without driver input. Thus, the automobile controller 8104 can drive the vehicle 104 along a route provided by the location module 896. The route may be adjusted by information sent from the traffic controller 8112. Discrete and real-time driving can occur with data from the sensors 242. To operate the vehicle 104, the automobile controller 8104 can communicate with a vehicle systems transceiver 8108.
[0497] The vehicle systems transceiver 8108 can be any present or future-developed device that can comprise a transmitter and/or a receiver, which may be combined and can share common circuitry or a single housing. The vehicle systems transceiver 8108 may communicate or instruct one or more of the vehicle control subsystems 328. For example, the vehicle systems transceiver 8108 may send steering commands, as received from the automobile controller 8104, to an electronic steering system, to adjust the steering of the vehicle 104 in real time. The automobile controller 8104 can determine the effect of the commands based on received sensor data 242 and can adjust the commands as needed. The vehicle systems transceiver 8108 can also communicate with the braking system, the engine and drive train to speed or slow the car, the signals (e.g., turn signals and brake lights), the headlights, the windshield wipers, etc. Any of these communications may occur over the components or function as described in conjunction with Fig. 4.
[0498] A traffic controller 8112 can be any hardware and/or software that can communicate with an automated traffic system and adjust the function of the vehicle 104 based on instructions from the automated traffic system. An automated traffic system is a system that manages the traffic in a given area. This automated traffic system can instruct cars to drive in certain lanes, instruct cars to raise or lower their speed, instruct cars to change their route of travel, instruct cars to communicate with other cars, etc. To perform these functions, the traffic controller 8112 may register the vehicle 104 with the automated traffic system and then provide other information including the route of travel. The automated traffic system can return registration information and any required instructions. The communications between the automated traffic system and the traffic controller 8112 may be received and sent through a network traffic transceiver 8116.
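The registration exchange might be sketched as follows; the message fields and the send/receive helpers on the transceiver object are assumptions, and a real system would carry these messages over the network traffic transceiver 8116.

```python
# Hedged sketch of the registration exchange: the traffic controller registers
# the vehicle and its planned route, then applies any returned instructions.
# Message fields and the transceiver's send/receive methods are assumptions.

def register_with_traffic_system(transceiver, vehicle_id, route):
    request = {"type": "register", "vehicle": vehicle_id, "route": route}
    transceiver.send(request)
    reply = transceiver.receive()  # registration info plus any instructions
    for instruction in reply.get("instructions", []):
        if instruction["kind"] == "speed":
            print("set speed:", instruction["mps"])   # would go to controller 8104
        elif instruction["kind"] == "lane":
            print("change to lane:", instruction["lane"])
    return reply.get("registration_id")
```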
[0499] The network traffic transceiver 8116 can be any present or future-developed device that can comprise a transmitter and/or a receiver, which may be combined and can share common circuitry or a single housing. The network traffic transceiver 8116 may communicate with the automated traffic system using any known or future-developed protocol, standard, frequency, bandwidth range, etc. The network traffic transceiver 8116 enables the sending of information between the traffic controller 8112 and the automated traffic system.
[0500] The traffic controller 8112 can also communicate with another vehicle, which may be in physical proximity (i.e., within range of a wireless signal), using the vehicle-to-vehicle transceiver 8120. As with the network traffic transceiver 8116, the vehicle-to-vehicle transceiver 8120 can be any present or future-developed device that can comprise a transmitter and/or a receiver, which may be combined and can share common circuitry or a single housing. Generally, the vehicle-to-vehicle transceiver 8120 enables communication between the vehicle 104 and any other vehicle. These communications allow the vehicle 104 to receive traffic or safety information, control or be controlled by another vehicle, establish an alternative communication path to communicate with the automated traffic system, establish a node including two or more vehicles that can function as a unit, etc. The vehicle-to-vehicle transceiver 8120 may communicate with the other vehicles using any known or future-developed protocol, standard, frequency, bandwidth range, etc.
[0501] The traffic controller 8112 can control functions of the automobile controller 8104 and communicate with the location module 896. The location module 896 can provide current location information and route information that the traffic controller 8112 may then provide to the automated traffic system. The traffic controller 8112 may receive route adjustments from the automated traffic system that are then sent to the location module 896 to change the route. Further, the traffic controller 8112 can also send driving instructions to the automobile controller 8104 to change the driving characteristics of the vehicle 104. For example, the traffic controller 8112 can instruct the automobile controller 8104 to accelerate or decelerate to a different speed, change lanes, or perform another driving maneuver. The traffic controller 8112 can also manage vehicle-to-vehicle communications and store information about the communications or other information in the traffic information database 8124.
[0502] The traffic information database 8124 can be any type of database, such as relational, hierarchical, object-oriented, and/or the like. The traffic information database 8124 may reside on a storage medium local to (and/or resident in) the vehicle control system 204 or in the vehicle 104. The traffic information database 8124 may be adapted to store, update, and retrieve information about communications with other vehicles or any active instructions from the automated traffic system. This information may be used by the traffic controller 8112 to instruct or adjust the performance of driving maneuvers.
[0503] Fig. 9 illustrates an optional communications architecture where the host device 908 may include one or more routing profiles, permission modules, and rules that control how communications within the vehicle 104 are to occur. This communications architecture can be used in conjunction with the routing tables, rules and permissions associated with access point 456 and optional firewall 484, or can be in lieu thereof. For example, the host device 908 acts as a mobile hot spot to one or more other devices within vehicle 104, such as, other device 1 912, other device 2 916, other device 3 920, and other device N 924. Optionally, one or more of the other devices 912 can communicate directly with the host device 908 which then provides Internet access to those devices 912 via the device 908. The host device 908 can act as a mobile hot spot for any one or more of the other devices 912, which may not need to communicate over the network/communications buses 224/404, but could instead connect directly to the host device 908 via, for example, NFC, Bluetooth®, Wi-Fi™, or the like. When the device 908 is acting as the host device, the device 908 can include one or more routing profiles, permissions, rules modules, and can also act as a firewall for the various inter and intra vehicle communications.
[0504] As will be appreciated, there could be alternative host devices, such as, host 904 which could also act as, for example, a co-host in association with device 908. Optionally, one or more of the routing profile, permission information, and rules could be shared between the co-host devices 904, 908, both of those devices being usable for Internet access for one or more of the other devices, 912-924. As will be appreciated, the other devices 912-924 need not necessarily connect to one or more of host device 908 and the other device 904 via a direct communications link, but could also interface with those devices 904, 908 utilizing the network/communications buses 224/404 associated with the vehicle 104. As previously discussed, one or more of the other devices can connect to the network/communications buses 224/404 utilizing the various networks and/or buses discussed herein which would therefore enable, for example, regulation of the various communications based on the Ethernet zone that the other device 912 is associated with.
[0505] An embodiment of one or more modules that may be associated with the vehicle control system 204 may be as shown in Fig. 10. The modules can include a
communication subsystem interface 1008 in communication with an operating system 1004. The communications may pass through a firewall 1044. The firewall 1044 can be any software that can control the incoming and outgoing communications by analyzing the data packets and determining whether the packets should be allowed through the firewall, based on an applied rule set. A firewall 1044 can establish a "barrier" between a trusted, secure internal network and another network (e.g., the Internet) that is not assumed to be secure and trusted.
[0506] In some situations, the firewall 1044 may establish security zones that are implemented by running system services and/or applications in restricted user groups and accounts. A set of configuration files and callbacks may then be linked to an IP table firewall. The IP table firewall can be configured to notify a custom filter application at any of the layers of the Ethernet packet. The different users/group rights to access the system may include: system users, which may have exclusive rights over all device firewall rules and running software; a big-brother user, which may have access to on-board diagnostics (OBD) control data and may be able to communicate with the vehicle subsystem 328 and may be able to alter the parameters in the vehicle control system 204; a dealer user, which can have rights to read OBD data for diagnostics and repairs; a dashboard user, which can have rights to launch dashboard applications and/or authenticate guest users and change their permissions to trusted/friend/family, and can read but cannot write into OBD diagnostic data; a world wide web (WWW) data user, which can have HTTP rights to respond to HTTP requests (the HTTP requests also can target different user data, but may be filtered by default user accounts); a guest user, which may have no rights; a
family/friend user, which may have rights to play media from the media subsystem 348 and/or to stream media to the media subsystem 348.
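The user-group rights described above could be represented as a simple permission table consulted by the firewall 1044, as in the following sketch; the resource/action vocabulary is an assumption, while the group names follow the text.

```python
# Illustrative sketch of the user-group rights as a permission table plus a
# check. Group names follow the description above; the (resource, action)
# vocabulary is an invented simplification.

PERMISSIONS = {
    "system":      {("firewall", "write"), ("software", "run")},
    "big_brother": {("obd", "read"), ("obd", "write"), ("vehicle", "control")},
    "dealer":      {("obd", "read")},
    "dashboard":   {("obd", "read"), ("apps", "launch"), ("guests", "authenticate")},
    "www":         {("http", "respond")},
    "guest":       set(),
    "family":      {("media", "play"), ("media", "stream")},
}

def is_allowed(group, resource, action):
    """Return True when the rule set grants the requested action."""
    return (resource, action) in PERMISSIONS.get(group, set())

assert is_allowed("dealer", "obd", "read")
assert not is_allowed("dealer", "obd", "write")  # dealers read diagnostics only
```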
[0507] The operating system 1004 can be a collection of software that manages computer hardware resources and provides common services for applications and other programs. The operating system 1004 may schedule time-sharing for efficient use of the system. For hardware functions, such as input, output, and memory allocation, the operating system 1004 can act as an intermediary between applications or programs and the computer hardware. Examples of operating systems that may be deployed as operating system 1004 include Android, BSD, iOS, Linux, OS X, QNX, Microsoft Windows, Windows Phone, IBM z/OS, etc.
[0508] The operating system 1004 can include one or more sub-modules. For example, a desktop manager 1012 can manage one or more graphical user interfaces (GUI) in a desktop environment. Desktop GUIs can help the user to easily access and edit files. A command-line interface (CLI) may be used if full control over the operating system (OS) 1004 is required. The desktop manager 1012 is described further hereinafter.
[0509] A kernel 1028 can be a computer program that manages input/output requests from software and translates them into data processing instructions for the processor 304 and other components of the vehicle control system 204. The kernel 1028 is the fundamental component of the operating system 1004 that can execute many of the functions associated with the OS 1004.
[0510] The kernel 1028 can include other software functions, including, but not limited to, driver(s) 1056, communication software 1052, and/or Internet Protocol software 1048. A driver 1056 can be any computer program that operates or controls a particular type of device that is attached to a vehicle control system 204. A driver 1056 can communicate with the device through the bus 356 or communications subsystem 1008 to which the hardware connects. When a calling program invokes a routine in the driver 1056, the driver 1056 may issue one or more commands to the device. Once the device sends data back to the driver 1056, the driver 1056 may invoke routines in the original calling program. Drivers can be hardware-dependent and operating-system-specific. Driver(s) 1056 can provide the interrupt handling required for any necessary asynchronous time-dependent hardware interface.
[0511] The IP module 1048 can conduct any IP addressing, which may include the assignment of IP addresses and associated parameters to host interfaces. The address space may include networks and sub-networks. The IP module 1048 can perform the designation of network or routing prefixes and may conduct IP routing, which transports packets across network boundaries. Thus, the IP module 1048 may perform all functions required for IP multicast operations.
[0512] The communications module 1052 may conduct all functions for communicating over other systems or using other protocols not serviced by the IP module 1048. Thus, the communications module 1052 can manage multicast operations over other busses or networks not serviced by the IP module 1048. Further, the communications module 1052 may perform or manage communications to one or more devices, systems, data stores, services, etc. that are in communication with the vehicle control system 204 or other subsystems through the firewall 1044. Thus, the communications module 1052 can conduct communications through the communication subsystem interface 1008.
[0513] A file system 1016 may be any data handling software that can control how data is stored and retrieved. The file system 1016 can separate the stored data into individual pieces and give each piece a name, so that the pieces of data can be easily separated and identified. Each piece of data may be considered a "file". The file system 1016 can construct the data structures and logic rules used to manage the information and the identifiers for the information. The structure and logic rules can be considered a "file system."
[0514] A device discovery daemon 1020 may be a computer program that runs as a background process that can discover new devices that connect with the network 356 or communication subsystem 1008 or devices that disconnect from the network 356 or communication subsystem 1008. The device discovery daemon 1020 can ping the network 356 (the local subnet) when the vehicle 104 starts, when a vehicle door opens or closes, or upon the occurrence of other events. Additionally or alternatively, the device discovery daemon 1020 may force Bluetooth®, USB, and/or wireless detection. For each device that responds to the ping, the device discovery daemon 1020 can populate the system data 208 with device information and capabilities, using any of one or more protocols, including one or more of, but not limited to, IPv6 Hop-by-Hop Option
(HOPOPT), Internet Control Message Protocol (ICMP), Internet Group Management Protocol (IGMP), Gateway-to-Gateway Protocol (GGP), Internet Protocol (IP), Internet Stream Protocol (ST), Transmission Control Protocol (TCP), Exterior Gateway Protocol (EGP), CHAOS, User Datagram Protocol (UDP), etc.
[0515] For example, the device discovery daemon 1020 can determine device capabilities based on the opened ports the device exposes. If a camera exposes port 80, then the device discovery daemon 1020 can determine that the camera is using the Hypertext Transfer Protocol (HTTP). Alternatively, if a device is supporting Universal Plug and Play (UPnP), the system data 208 can include more information, for example, a camera control universal resource locator (URL), a camera zoom URL, etc. When a scan stops, the device discovery daemon 1020 can trigger a dashboard refresh to ensure the user interface reflects the new devices on the desktop.
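A minimal sketch of this port-based capability detection follows; the probing helper is an assumption, and the port 80 to HTTP inference is the example from the text. Note that UPnP devices would actually announce themselves via SSDP (UDP port 1900) rather than answer a TCP probe, which this sketch omits.

```python
# Minimal sketch of port-based capability detection for a discovered device.

import socket

def probe_port(host, port, timeout=0.5):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def detect_capabilities(host):
    capabilities = {}
    if probe_port(host, 80):
        capabilities["protocol"] = "HTTP"  # e.g., a camera exposing a web interface
    # UPnP devices would instead be found via SSDP announcements (UDP 1900);
    # a UPnP response could add control URLs (camera control, zoom, etc.) here.
    return capabilities
```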
[0516] A desktop manager 1012 may be a computer program that manages the user interface of the vehicle control system 204. The desktop environment may be designed to be customizable and allow the definition of the desktop configuration look-and-feel for a wide range of appliances or devices from computer desktops, mobile devices, computer tablets, etc. Launcher(s), panels, desktop areas, the desktop background, notifications, panes, etc., can be configured from a dashboard configuration file managed by the desktop manager 1012. The graphical elements that the desktop manager 1012 controls can include launchers, the desktop, notification bars, etc.
[0517] The desktop may be an area of the display where the applications are running. The desktop can have a custom background. Further, the desktop may be divided into two or more areas. For example, the desktop may be divided into an upper half of a display and a lower half of the display. Each application can be configured to run in a portion of the desktop. Extended settings can be added to the desktop configuration file, such that some objects may be displayed over the whole desktop or at a custom size outside the context of the divided areas.
[0518] The notification bar may be a part of a bar display system, which may provide notifications by displaying, for example, icons and/or pop-up windows that may be associated with sound notifications. The notification mechanism can be designed for separate plug-ins, which run in separate processes and may subscribe to a system
Intelligent Input Bus (IBUS)/D-BUS event service. The icons on the notifications bar can be accompanied with application short-cuts to associated applications, for example, a Bluetooth® manager, a USB manager, radio volume and/or tone control, a security firewall, etc.

[0519] The desktop manager 1012 may include a windows manager 1032, an application manager 1036, and/or a panel launcher 1040. Each of these components can control a different aspect of the user interface. The desktop manager 1012 can use a root window to create panels that can include functionality for one or more of, but not limited to:
launching applications, managing applications, providing notifications, etc.
[0520] The windows manager 1032 may be software that controls the placement and appearance of windows within a graphical user interface presented to the user. Generally, the windows manager 1032 can provide the desktop environment used by the vehicle control system 204. The windows manager 1032 can communicate with the kernel 1028 to interface with the graphical system that provides the user interface(s) and supports the graphics hardware, pointing devices, keyboard, touch-sensitive screens, etc. The windows manager 1032 may be a tiling window manager (i.e., a window manager with an organization of the screen into mutually non-overlapping frames, as opposed to a coordinate-based stacking of overlapping objects (windows) that attempts to fully emulate the desktop metaphor). The windows manager 1032 may read and store configuration files, in the system data 208, which can control the position of the application windows at precise positions.
[0521] An application manager 1036 can control the function of any application over the lifetime of the process. The process or application can be launched from a panel launcher 1040 or from a remote console. The application manager 1036 can intercept the process name and may take appropriate action to manage that process. If the process is not running, the application manager 1036 can load the process and may bring the process to a foreground in a display. The application manager 1036 may also notify the windows manager 1032 to bring the associated window(s) to a top of a window stack for the display. When a process starts from a shell or a notification out of the context of the desktop, the application manager 1036 can scan files to match the process name with the entry name provided. When a match is found, the application manager 1036 can configure the process according to a settings file.
[0522] In some situations, the application manager 1036 may restrict an application as singleton (i.e., restricts the instantiation of a class to one object). If an application is already running and the application manager 1036 is asked to run the application again, the application manager 1036 can bring the running process to a foreground on a display. There can be a notification event exchange between the windows manager 1032 and the application manager 1036 for activating the appropriate window for the foreground process. Once an application is launched, the application may not be terminated or killed. The application can be sent to the background, except, possibly, for some applications (e.g., media player, Bluetooth®, notifications, etc.), which may be given a lowest process priority.
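The singleton behavior might be sketched as follows, where a second launch request for a running application merely foregrounds its window. The in-memory bookkeeping and the callables passed in are illustrative assumptions.

```python
# Sketch of the singleton restriction described above: a repeated launch
# request brings the running instance to the foreground instead of starting
# a second copy of the process.

running = {}  # process name -> window handle (illustrative bookkeeping)

def launch(name, start_process, bring_to_front):
    """Launch a singleton app, or foreground it if it is already running."""
    if name in running:
        bring_to_front(running[name])   # notify the windows manager
        return running[name]
    window = start_process(name)        # load the process and its window
    running[name] = window
    return window
```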
[0523] The panel launcher 1040 can be a widget configured to be placed along a portion of the display. The panel launcher 1040 may be built from desktop files from a desktop folder. The desktop folder location can be configured by a configuration file stored in system data 208. The panel launcher 1040 can allow for the launching or executing of applications or processes by receiving inputs from a user interface to launch programs.
[0524] A desktop plugin 1024 may be a software component that allows for
customization of the desktop or software interface through the initiation of plug-in applications.
[0525] One or more gestures used to interface with the vehicle control system 204 may be as described in conjunction with Figs. 11A through 11K. Figs. 11A through 11H depict various graphical representations of gesture inputs that may be recognized by the devices 212, 248. The gestures may be performed not only by a user's body part, such as a digit, but also by other devices, such as a stylus, that may be sensed by the contact sensing portion(s) of a screen associated with the device 212, 248. In general, gestures are interpreted differently, based on where the gestures are performed (either directly on a display or in a gesture capture region). For example, gestures in a display may be directed to a desktop or application, and gestures in a gesture capture region may be interpreted as for the system.
[0526] With reference to Figs. 11A-11H, a first type of gesture, a touch gesture 1120, is substantially stationary on a portion (e.g., a screen, a display, etc.) of a device 212, 248 for a selected length of time. A circle 1128 represents a touch or other contact type received at a particular location of a contact sensing portion of the screen. The circle 1128 may include a border 1132, the thickness of which indicates a length of time that the contact is held substantially stationary at the contact location. For instance, a tap 1120 (or short press) has a thinner border 1132A than the border 1132B for a long press 1124 (or for a normal press). The long press 1124 may involve a contact that remains substantially stationary on the screen for a longer time period than that of a tap 1120. As will be appreciated, differently defined gestures may be registered depending upon the length of time that the touch remains stationary prior to contact cessation or movement on the screen.

[0527] With reference to Fig. 11C, a drag gesture 1100 on the screen is an initial contact (represented by circle 1128) with contact movement 1136 in a selected direction. The initial contact 1128 may remain stationary on the screen for a certain amount of time represented by the border 1132. The drag gesture typically requires the user to contact an icon, window, or other displayed image at a first location followed by movement of the contact in a drag direction to a new second location desired for the selected displayed image. The contact movement need not be in a straight line but may have any path of movement so long as the contact is substantially continuous from the first to the second locations.
[0528] With reference to Fig. 11D, a flick gesture 1104 on the screen is an initial contact (represented by circle 1128) with truncated contact movement 1136 (relative to a drag gesture) in a selected direction. A flick may have a higher exit velocity for the last movement in the gesture compared to the drag gesture. The flick gesture can, for instance, be a finger snap following initial contact. Compared to a drag gesture, a flick gesture generally does not require continual contact with the screen from the first location of a displayed image to a predetermined second location. The contacted displayed image is moved by the flick gesture in the direction of the flick gesture to the predetermined second location. Although both gestures commonly can move a displayed image from a first location to a second location, the temporal duration and distance of travel of the contact on the screen are generally less for a flick than for a drag gesture.
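By way of a non-limiting illustration, the following Python sketch classifies a single contact as a tap, long press, drag, or flick from its hold duration, travel distance, and exit velocity, as discussed above. The function name and the numeric thresholds are hypothetical; the disclosure does not specify particular values.

    # Illustrative classifier for the touch gestures of Figs. 11A-11D.
    # The thresholds are arbitrary examples; the disclosure does not fix values.

    def classify_gesture(duration_s, distance_px, exit_velocity_px_s):
        """Classify a contact by hold time, travel distance, and exit velocity."""
        LONG_PRESS_S = 0.5      # hold time separating a tap from a long press
        MOVE_PX = 10.0          # travel below this is treated as stationary
        FLICK_VEL_PX_S = 800.0  # exit velocity separating a flick from a drag

        if distance_px < MOVE_PX:
            # Substantially stationary contact: tap vs. long press by duration.
            return "long_press" if duration_s >= LONG_PRESS_S else "tap"
        if exit_velocity_px_s >= FLICK_VEL_PX_S:
            # Truncated movement with high exit velocity, as for gesture 1104.
            return "flick"
        return "drag"   # continuous movement from first to second location

    assert classify_gesture(0.1, 2.0, 0.0) == "tap"
    assert classify_gesture(0.8, 3.0, 0.0) == "long_press"
    assert classify_gesture(0.6, 150.0, 200.0) == "drag"
    assert classify_gesture(0.2, 40.0, 1500.0) == "flick"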
[0529] With reference to Fig. 11E, a pinch gesture 1108 on the screen is depicted. The pinch gesture 1108 may be initiated by a first contact 1128A to the screen by, for example, a first digit and a second contact 1128B to the screen by, for example, a second digit. The first and second contacts 1128A,B may be detected by a common contact sensing portion of a common screen, by different contact sensing portions of a common screen, or by different contact sensing portions of different screens. The first contact 1128A is held for a first amount of time, as represented by the border 1132A, and the second contact 1128B is held for a second amount of time, as represented by the border 1132B. The first and second amounts of time are generally substantially the same, and the first and second contacts 1128A,B generally occur substantially simultaneously. The first and second contacts 1128A,B generally also include corresponding first and second contact movements 1136A,B, respectively. The first and second contact movements 1136A,B are generally in opposing directions. Stated another way, the first contact movement 1136A is towards the second contact 1128B, and the second contact movement 1136B is towards the first contact 1128A. More simply stated, the pinch gesture 1108 may be accomplished by a user's digits touching the screen in a pinching motion.
[0530] With reference to Fig. 11F, a spread gesture 1110 on the screen is depicted. The spread gesture 1110 may be initiated by a first contact 1128A to the screen by, for example, a first digit, and a second contact 1128B to the screen by, for example, a second digit. The first and second contacts 1128A,B may be detected by a common contact sensing portion of a common screen, by different contact sensing portions of a common screen, or by different contact sensing portions of different screens. The first contact 1128A is held for a first amount of time, as represented by the border 1132A, and the second contact 1128B is held for a second amount of time, as represented by the border 1132B. The first and second amounts of time are generally substantially the same, and the first and second contacts 1128A,B generally occur substantially simultaneously. The first and second contacts 1128A,B generally also include corresponding first and second contact movements 1136A,B, respectively. The first and second contact movements 1136A,B are generally in opposing directions. Stated another way, the first and second contact movements 1136A,B are away from the first and second contacts 1128A,B. More simply stated, the spread gesture 1110 may be accomplished by a user's digits touching the screen in a spreading motion.
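By way of a non-limiting illustration, the pinch and spread gestures of Figs. 11E and 11F may be distinguished by comparing the separation of the two contacts before and after their movements. The following Python sketch assumes simplified (x, y) contact points; the function name and threshold are hypothetical.

    # Illustrative two-contact pinch/spread detector for Figs. 11E and 11F.
    import math

    def classify_two_contact(start_a, end_a, start_b, end_b, threshold=5.0):
        """Compare contact separation before and after movement.

        Each argument is an (x, y) point. If the contacts end closer together
        than they started, the movements oppose toward each other (pinch 1108);
        if farther apart, they oppose away from each other (spread 1110).
        """
        before = math.dist(start_a, start_b)
        after = math.dist(end_a, end_b)
        if before - after > threshold:
            return "pinch"
        if after - before > threshold:
            return "spread"
        return "none"   # separation essentially unchanged

    # Two digits moving toward each other (pinch), then apart (spread).
    assert classify_two_contact((0, 0), (40, 0), (100, 0), (60, 0)) == "pinch"
    assert classify_two_contact((40, 0), (0, 0), (60, 0), (100, 0)) == "spread"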
[0531] The above gestures may be combined in any manner, such as those shown by Figs. 11G and 11H, to produce a determined functional result. For example, in Fig. 11G a tap gesture 1120 is combined with a drag or flick gesture 1112 in a direction away from the tap gesture 1120. In Fig. 11H, a tap gesture 1120 is combined with a drag or flick gesture 1116 in a direction towards the tap gesture 1120.
[0532] The functional result of receiving a gesture can vary depending on a number of factors, including a state of the vehicle 104, display, or screen of a device, a context associated with the gesture, or sensed location of the gesture, etc. The state of the vehicle 104 commonly refers to one or more of a configuration of the vehicle 104, a display orientation, and user and other inputs received by the vehicle 104. Context commonly refers to one or more of the particular application(s) selected by the gesture and the portion(s) of the application currently executing, whether the application is a single- or multi-screen application, and whether the application is a multi-screen application displaying one or more windows. A sensed location of the gesture commonly refers to whether the sensed set(s) of gesture location coordinates are on a touch sensitive display or a gesture capture region of a device 212, 248, whether the sensed set(s) of gesture location coordinates are associated with a common or different display, or screen, or device 212, 248, and/or what portion of the gesture capture region contains the sensed set(s) of gesture location coordinates.
[0533] A tap, when received by a touch sensitive display of a device 212, 248, can be used, for instance, to select an icon to initiate or terminate execution of a corresponding application, to maximize or minimize a window, to reorder windows in a stack, and/or to provide user input such as by keyboard display or other displayed image. A drag, when received by a touch sensitive display of a device 212, 248, can be used, for instance, to relocate an icon or window to a desired location within a display, to reorder a stack on a display, or to span both displays (such that the selected window occupies a portion of each display simultaneously). A flick, when received by a touch sensitive display of a device 212, 248 or a gesture capture region, can be used to relocate a window from a first display to a second display or to span both displays (such that the selected window occupies a portion of each display simultaneously). Unlike the drag gesture, however, the flick gesture is generally not used to move the displayed image to a specific user-selected location but to a default location that is not configurable by the user.
[0534] The pinch gesture, when received by a touch sensitive display or a gesture capture region of a device 212, 248, can be used to minimize or otherwise decrease the displayed area or size of a window (typically when received entirely by a common display), to switch windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (a "pop-up window" that displays the windows in the stack). The spread gesture, when received by a touch sensitive display or a gesture capture region of a device 212, 248, can be used to maximize or otherwise increase the displayed area or size of a window, to switch windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (typically when received by an off-screen gesture capture region on the same or different screens).
[0535] The combined gestures of Fig. 11G, when received by a common display capture region in a common display or screen of a device 212, 248, can be used to hold a first window location constant for a display receiving the gesture while reordering a second window location to include a window in the display receiving the gesture. The combined gestures of Fig. 11H, when received by different display capture regions in a common display or screen of a device 212, 248 or in different displays or screens of one or more devices 212, 248, can be used to hold a first window location for a display receiving the tap part of the gesture while reordering a second window location to include a window in the display receiving the flick or drag gesture. Although specific gestures and gesture capture regions in the preceding examples have been associated with corresponding sets of functional results, it is to be appreciated that these associations can be redefined in any manner to produce differing associations between gestures and/or gesture capture regions and/or functional results.
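By way of a non-limiting illustration, the gesture-to-result associations described in paragraphs [0533]-[0535] may be represented as a lookup table keyed on the gesture and its sensed location. The following Python sketch is hypothetical, and, as noted above, the associations can be redefined in any manner.

    # Illustrative dispatch of a recognized gesture to a functional result
    # based on where it was sensed. The table entries are examples only.

    FUNCTION_TABLE = {
        # (gesture, sensed location) -> functional result
        ("tap", "display"): "select_icon",
        ("drag", "display"): "relocate_window",
        ("flick", "display"): "move_window_to_default_location",
        ("flick", "gesture_capture_region"): "relocate_window_between_displays",
        ("pinch", "display"): "minimize_window",
        ("spread", "display"): "maximize_window",
        ("pinch", "gesture_capture_region"): "show_application_manager",
    }

    def functional_result(gesture, location):
        """Look up the functional result for a gesture and its sensed location."""
        return FUNCTION_TABLE.get((gesture, location), "ignored")

    print(functional_result("pinch", "gesture_capture_region"))  # show_application_manager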
[0536] Gestures that may be completed in three-dimensional space and not on a touch sensitive screen or gesture capture region of a device 212, 248 may be as shown in Figs. 11I-11K. The gestures may be completed in an area where a sensor, such as an optical sensor, infrared sensor, or other type of sensor, may detect the gesture. For example, the gesture 1140 in Fig. 11I may be executed by a person when the person opens their hand 1164 and moves their hand in a back and forth direction 1148 as a gesture 1140 to complete some function with the vehicle 104. For example, gesture 1140 may change the station of the radio in the vehicle 104. The sensors 242 may both determine the configuration of the hand 1164 and the vector of the movement. The vector and hand configuration can be interpreted to mean certain things to the vehicle control system 204 and produce different results.
[0537] In another example of a gesture 1152 in Fig. 11J, a user may configure their hand 1164 to extend two fingers and move the hand 1164 in an up and down operation 1156. This gesture 1152 may control the volume of the radio or some other function. For instance, this gesture 1152 may be configured to place the vehicle in a "valet" mode to, among other things, restrict access to certain features associated with the vehicle. Again, the sensors 242 may determine how the person has configured their hand 1164, and the vector of the movement. In another example of a gesture 1160 shown in Fig. 11K, a user may extend their middle three fingers at an angle of substantially 45° from straight vertical and circle the hand in a counter-clockwise motion 1166. This gesture 1160 may cause the automobile to change the heat setting or do some other function. As can be understood by one skilled in the art, the configurations of the hand and the types of movement are variable. Thus, the user may configure the hand 1164 in any way imaginable and may also move that hand 1164 in any direction with any vector in three-dimensional space.
[0538] The gestures 1140, 1152, 1160, as shown in Figs. 11I-11K, may occur in a predetermined volume of space within the vehicle 104. For example, a sensor may be configured to identify such gestures 1140, 1152, 1160 between the front passenger's and front driver's seats over a console area within the passenger compartment of the vehicle 104. The gestures 1140, 1152, 1160 may be made within area 1 508A between zone A 512A and zone B 512B. However, there may be other areas 508 where a user may use certain gestures, where sensors 242 may be able to determine a certain function is desired.
Gestures that may be similar but used in different areas within the vehicle 104 may cause different functions to be performed. For example, the gesture 1140 in Fig. 11I, if used in zone E 512E, may change the heat provided in zone E 512E, but may change the station of a radio if used in zone A 512A and/or zone B 512B. Further, the gestures may be made with other body parts or with, for example, different expressions of a person's face, and may be used to control functions in the vehicle 104. Also, the user may use two hands in some circumstances or do other types of physical movements that can cause different reactions in the vehicle 104.
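By way of a non-limiting illustration, the following Python sketch resolves a sensed hand configuration, movement, and zone to a vehicle function, so that the same gesture yields different results in different zones. The map entries and names are hypothetical.

    # Illustrative zone-dependent mapping of three-dimensional gestures, as in
    # Figs. 11I-11K. Hand configurations, motions, and zone names are examples;
    # the disclosure leaves both the gesture vocabulary and its effects variable.

    ZONE_GESTURE_MAP = {
        # (hand configuration, motion, zone) -> vehicle function
        ("open_hand", "back_and_forth", "zone_A"): "change_radio_station",
        ("open_hand", "back_and_forth", "zone_E"): "change_zone_heat",
        ("two_fingers", "up_down", "zone_A"): "adjust_radio_volume",
        ("three_fingers_45deg", "ccw_circle", "zone_A"): "change_heat_setting",
    }

    def handle_3d_gesture(hand_configuration, motion, zone):
        """Resolve a sensed hand configuration and movement vector to a function.

        The same gesture sensed in different zones may map to different
        functions, mirroring the zone E vs. zone A example in the text.
        """
        return ZONE_GESTURE_MAP.get((hand_configuration, motion, zone))

    print(handle_3d_gesture("open_hand", "back_and_forth", "zone_A"))  # change_radio_station
    print(handle_3d_gesture("open_hand", "back_and_forth", "zone_E"))  # change_zone_heat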
[0539] Figs. 12A-12D show various embodiments of a data structure 1200 to store different settings. The data structure 1200 may include one or more of data files or data objects 1204, 1250, 1270, 1280. Thus, the data structure 1200 may represent different types of databases or data storage, for example, object-oriented databases, flat file data structures, relational databases, or other types of data storage arrangements. Embodiments of the data structure 1200 disclosed herein may be separate, combined, and/or distributed. As indicated in Figs. 12A-12D, there may be more or fewer portions in the data structure 1200, as represented by ellipses 1244. Further, there may be more or fewer files in the data structure 1200, as represented by ellipses 1248.
[0540] Referring to Fig. 12A, a first data structure is shown. The data file 1204 may include several portions 1208-1242 representing different types of data. Each of these types of data may be associated with a user, as shown in portion 1208.
[0541] There may be one or more user records 1240 and associated data stored within the data file 1204. As provided herein, the user can be any person that uses or rides within the vehicle or conveyance 104. The user may be identified in portion 1212. For the vehicle 104, the user may include a set of one or more features that may identify the user. These features may be the physical characteristics of the person that may be identified by facial recognition or some other type of system. In other situations, the user may provide a unique code to the vehicle control system 204 or provide some other type of data that allows the vehicle control system 204 to identify the user. The features or characteristics of the user are then stored in portion 1212.

[0542] Each user, identified in portion 1208, may have a different set of settings for each area 508 and/or each zone 512 within the vehicle 104. Thus, each set of settings may also be associated with a predetermined zone 512 or area 508. The zone 512 is stored in portion 1220, and the area 508 is stored in portion 1216.
[0543] One or more settings may be stored in portion 1224. These settings 1224 may be the configurations of different functions within the vehicle 104 that are specified by or for that user. For example, the settings 1224 may be the position of a seat, the position of a steering wheel, the position of accelerator and/or brake pedals, positions of mirrors, a heating/cooling setting, a radio setting, a cruise control setting, or some other type of setting associated with the vehicle 104. Further, in vehicles adapted to have a configurable console or a configurable dash or heads-up display, the settings 1224 may also provide for how that heads-up display, dash, or console is configured for this particular user.
[0544] Each setting 1224 may be associated with a different area 508 or zone 512. Thus, there may be more settings 1224 for when the user is the driver and in zone A 512A of area 1 508A. However, there may be similar settings 1224 among the different zones 512 or areas 508 as shown in portion 1224. For example, the heating or radio settings for the user may be similar in every zone 512.
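By way of a non-limiting illustration, one possible in-memory shape for a user record of data file 1204 is sketched below in Python, with the portion numbers retained as comments. The field names and the (area, zone) keying are assumptions, not requirements of the data structure 1200.

    # Illustrative shape for the user records of data file 1204 (Fig. 12A).
    from dataclasses import dataclass, field

    @dataclass
    class UserRecord:
        user_id: str                         # portion 1208
        id_characteristics: dict             # portion 1212 (e.g., facial features)
        # Settings are keyed by (area, zone) so the same user can hold
        # different configurations per area 508 and zone 512.
        settings: dict = field(default_factory=dict)            # portion 1224
        health_data: list = field(default_factory=list)         # portion 1228
        gestures: dict = field(default_factory=dict)            # portion 1232
        safety_parameters: dict = field(default_factory=dict)   # portion 1236

        def settings_for(self, area, zone):
            """Return the settings for a zone, falling back to area-wide ones."""
            return self.settings.get((area, zone), self.settings.get((area, None), {}))

    user = UserRecord("user_1", {"face_signature": "..."})
    user.settings[("area_1", "zone_A")] = {"seat": "position_3", "radio": "101.5 FM"}
    user.settings[("area_1", None)] = {"climate": "72F"}
    print(user.settings_for("area_1", "zone_A"))   # zone-specific settings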
[0545] The sensors 242 within the vehicle 104 may be able to either obtain or track health data in portion 1228. Health data 1228 may include any type of physical characteristic associated with the user. For example, a heart rate, a blood pressure, a temperature, or other types of health data may be obtained and stored in portion 1228. The user may have this health data tracked over a period of time to allow for statistical analysis of the user's health while operating the vehicle 104. In this way, if some function of the user's health deviates from a norm (e.g., a baseline measurement, average measurements taken over time, and the like), the vehicle 104 may be able to determine there is a problem with the person and react to that data.
[0546] One or more gestures may be stored in portion 1232. Thus, the gestures used and described in conjunction with Figs. 11A through 11K may be configurable. These gestures may be determined or created by the user and stored in portion 1232. A user may have different gestures for each zone 512 or area 508 within the vehicle. The gestures that do certain things while driving may do other things while in a different area 508 of the vehicle 104. Thus, the user may use a first set of gestures while driving and a second set while a passenger. Further, one or more users may share gestures as shown in portion 1232. Each driver may have a common set of gestures that they use in zone A 512A. Each of these gestures may be determined or captured and then stored with their characteristics (e.g., vector, position of gesture, etc.) in portion 1232.
[0547] One or more sets of safety parameters may be stored in portion 1236. Safety parameters 1236 may be common operating characteristics for this driver/passenger or for all drivers/passengers that, if deviated from, may indicate there is a problem with the driver/passenger or the vehicle 104. For example, a certain route may be taken repeatedly and an average speed or mean speed may be determined. If the mean speed deviates by some number of standard deviations, a problem with the vehicle 104 or the user may be determined. In another example, the health characteristics or driving experience of the user may be determined. If the user drives in a certain position where their head occupies a certain portion of three-dimensional space within the vehicle 104, the vehicle control system 204 may determine that the safety parameter includes the user's face or head being within this certain portion of the vehicle interior space. If the user's head deviates from that interior space for some amount of time, the vehicle control system 204 can determine that something is wrong with the driver and change the function or operation of the vehicle 104 to assist the driver. This may happen, for example, when a user falls asleep at the wheel. If the user's head droops and no longer occupies a certain three-dimensional space, the vehicle control system 204 can determine that the driver has fallen asleep and may take control of the operation of the vehicle 104, and the automobile controller 8104 may steer the vehicle 104 to the side of the road. In other examples, if the user's reaction time is too slow or some other safety parameter is not nominal, the vehicle control system 204 may determine that the user is inebriated or having some other medical problem. The vehicle control system 204 may then assume control of the vehicle to ensure that the driver is safe.
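By way of a non-limiting illustration, the following Python sketch applies two such safety parameters: a mean-speed check measured in standard deviations, and a dwell-time check on the driver's head position. The thresholds and field layouts are hypothetical.

    # Illustrative deviation checks against stored safety parameters 1236.
    import statistics

    def speed_deviates(route_speeds, current_speed, max_sigma=3.0):
        """Flag a problem if the current speed departs from the route's norm."""
        mean = statistics.mean(route_speeds)
        sigma = statistics.stdev(route_speeds)
        return abs(current_speed - mean) > max_sigma * sigma

    def head_out_of_position(samples, allowed_box, max_seconds=2.0):
        """Flag a drowsy driver if the head leaves its usual region too long.

        `samples` is a list of (timestamp_s, (x, y, z)) head positions, and
        `allowed_box` is ((xmin, xmax), (ymin, ymax), (zmin, zmax)).
        """
        out_since = None
        for t, pos in samples:
            inside = all(lo <= c <= hi for c, (lo, hi) in zip(pos, allowed_box))
            if inside:
                out_since = None
            elif out_since is None:
                out_since = t
            elif t - out_since >= max_seconds:
                return True   # head outside the expected space too long
        return False

    print(speed_deviates([55, 57, 54, 56, 58], 20))   # True: far below the mean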
[0548] Information corresponding to a user and/or a user profile may be stored in the profile information portion 1238. For example, the profile information 1238 may include data relating to at least one of current data, historical data, a user preference, user habit, user routine, observation, location data (e.g., programmed and/or requested destinations, locations of parking, routes traveled, average driving time, etc.), social media connections, contacts, brand recognition (e.g., determined via one or more sensors associated with the vehicle 104, a device 212, 248, etc.), audible recording data, text data, email data, political affiliation, preferred retail locations/sites (e.g., physical locations, web-based locations, etc.), recent purchases, behavior associated with the aforementioned data, and the like. The data in the profile information portion 1238 may be stored in one or more of the data structures 1200 provided herein. As can be appreciated, these one or more data structures may be stored in one or more memory locations. Examples of various memory locations are described in conjunction with Fig. 2.
[0549] One or more additional data fields may be stored in the linked data portion 1242 as data and/or locations of data. The linked data 1242 may include at least one of pointers, addresses, location identification, data source information, and other information corresponding to additional data associated with the data structure 1200. Optionally, the linked data portion 1242 may refer to data stored outside of a particular data structure 1200. For example, the linked data portion 1242 may include a link/locator to the external data. Continuing this example, the link/locator may be resolved (e.g., via one or more of the methods and/or systems provided herein, etc.) to access the data stored outside of the data structure 1200. Additionally or alternatively, the linked data portion 1242 may include information configured to link the data objects 1204 to other data files or data objects 1250, 1270, 1280. For instance, the data object 1204 relating to a user may be linked to at least one of a device data object 1250, a vehicle system data object 1270, and a vehicle data object 1280, to name a few.
[0550] An embodiment of a data structure 1200 to store information associated with one or more devices is shown in Fig. 12B. The data file 1250 may include several portions 1216-1262 representing different types of data. Each of these types of data may be associated with a device, as shown in portion 1252.
[0551] There may be one or more device records 1250 and associated data stored within the data file 1250. As provided herein, the device may be any device that is associated with the vehicle 104. For example, a device may be associated with a vehicle 104 when that device is physically located within the interior space 108 of the vehicle 104. As another example, a device may be associated with a vehicle 104 when the device registers with the vehicle 104. Registration may include pairing the device with the vehicle 104 and/or one or more of the vehicle systems (e.g., as provided in Fig. 3). In some cases, the registration of a device with a vehicle 104 may be performed manually and/or automatically. An example of automatic registration may include detecting, via one or more of the vehicle systems, that a device is inside the vehicle 104. Upon detecting that the device is inside the vehicle 104, the vehicle system may identify the device and determine whether the device is or should be registered. Registration may be performed outside of a vehicle 104 via providing a unique code to the vehicle 104 and/or at least one of the vehicle systems.

[0552] The device may be identified in portion 1256. Among other things, the device identification may be based on the hardware associated with the device (e.g., Media Access Control (MAC) address, Burned-In Address (BIA), Ethernet Hardware Address (EHA), physical address, hardware address, and the like).
[0553] Optionally, a device may be associated with one or more users. For example, a tablet and/or graphical user interface (GUI) associated with the vehicle 104 may be used by multiple members of a family. For instance, the GUI may be located in a particular area 508 and/or zone 512 of the vehicle 104. Continuing this example, when a family member is located in the particular area 508 and/or zone 512, the device may include various settings, features, priorities, capabilities, and the like, based on an identification of the family member. The user may be identified in portion 1254. For the device, the user identification portion 1254 may include a set of one or more features that may identify a particular user. These features may be the physical characteristics of the person that may be identified by facial recognition, or some other type of system, associated with the device and/or the vehicle 104. Optionally, the user may provide a unique code to the device, or provide some other type of data, that allows the device to identify the user. The features or characteristics of the user are then stored in portion 1254.
[0554] Each device identified in the device identification portion 1256 may have a different set of settings for each area 508 and/or each zone 512, and/or each user of the device. Thus, each set of settings may also be associated with a predetermined zone 512, area 508, and/or user. The zone 512 is stored in portion 1220 and the area 508 is stored in portion 1216.
[0555] One or more settings may be stored in portion 1224. These settings 1224 may be similar and/or identical to those previously described. Further, the settings 1224 may also provide for how a device is configured for a particular user. Each setting 1224 may be associated with a different area 508 or zone 512. Thus, there may be more restrictive settings 1224 (e.g., restricted multimedia, texting, limited access to device functions, and the like) for the device when the user is the driver and in zone A 512A of area 1 508A. However, when the user is in another zone 512 or area 508, for example, where the user is not operating a vehicle 104, the settings 1224 may provide unrestricted access to one or more features of the device (e.g., allowing texting, multimedia, etc.).
[0556] Optionally, the capabilities of a device may be stored in portion 1258. Examples of device capabilities may include, but are not limited to, a communications ability (e.g., via wireless network, EDGE, 3G, 4G, LTE, wired, Bluetooth®, Near Field Communications (NFC), Infrared (IR), etc.), hardware associated with the device (e.g., cameras, gyroscopes, accelerometers, touch interface, processor, memory, display, etc.), software (e.g., installed, available, revision, release date, etc.), firmware (e.g., type, revision, etc.), operating system, system status, and the like. Optionally, the various capabilities associated with a device may be controlled by one or more of the vehicle systems provided herein. Among other things, this control allows the vehicle 104 to leverage the power and features of various devices to collect, transmit, and/or receive data.
[0557] One or more priorities may be stored in portion 1260. The priority may correspond to a value, or combination of values, configured to determine how a device interacts with the vehicle 104 and/or its various systems. The priority may be based on a location of the device (e.g., as stored in portions 1216, 1220). A default priority can be associated with each area 508 and/or zone 512 of a vehicle 104. For example, the default priority associated with a device found in zone 1 512A of area 1 508A (e.g., a vehicle operator position) may be set higher than an (or the highest of any) alternative zone 512 or area 508 of the vehicle 104. Continuing this example, the vehicle 104 may determine that, although other devices are found in the vehicle, the device, having the highest priority, controls features associated with the vehicle 104. These features may include vehicle control features, critical and/or non-critical systems, communications, and the like.
Additionally or alternatively, the priority may be based on a particular user associated with the device. Optionally, the priority may be used to determine which device will control a particular signal in the event of a conflict.
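By way of a non-limiting illustration, the following Python sketch resolves a signal conflict using default priorities keyed by area and zone, with the operator position carrying the highest default. The numeric priority values are hypothetical.

    # Illustrative resolution of conflicting device signals using the default
    # zone/area priorities of portion 1260. The disclosure states only that
    # the operator position gets the highest default.

    DEFAULT_PRIORITY = {
        ("area_1", "zone_1"): 100,   # vehicle operator position: highest default
        ("area_1", "zone_2"): 50,
        ("area_2", "zone_1"): 40,
    }

    def resolve_conflict(devices):
        """Pick the controlling device when several assert the same signal.

        Each device is a dict with 'name', 'area', and 'zone'; the device whose
        location carries the highest default priority wins the conflict.
        """
        return max(devices, key=lambda d: DEFAULT_PRIORITY.get((d["area"], d["zone"]), 0))

    devices = [
        {"name": "rear_tablet", "area": "area_2", "zone": "zone_1"},
        {"name": "driver_phone", "area": "area_1", "zone": "zone_1"},
    ]
    print(resolve_conflict(devices)["name"])   # driver_phone controls the signal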
[0558] Registration data may be stored in portion 1262. As described above, when a particular device registers with a vehicle 104, data related to the registration may be stored in the registration data portion 1262. Such data may include, but is not limited to, registration information, registration codes, initial registration time, expiration of registration, registration timers, and the like. Optionally, one or more systems of the vehicle 104 may refer to the registration data portion 1262 to determine whether a device has been previously registered with the vehicle 104. As shown in Fig. 12B, User 4 of Device 2 has not been registered. In this case, the registration data field 1262, for this user, may be empty, contain a null value, or other information/indication that there is no current registration information associated with the user.
[0559] Additionally or alternatively, the data structure 1200 may include a profile information portion 1238 and/or a linked data portion 1242. Although the profile information portion 1238 and/or the linked data portion 1242 may include different information from that described above, it should be appreciated that the portions 1238, 1242 may be similar, or identical, to those as previously disclosed.
[0560] An embodiment of a data structure 1200 to store information associated with one or more vehicle systems is shown in Fig. 12C. The data file 1270 may include several portions 1216-1279 representing different types of data. Each of these types of data may be associated with a vehicle system, as shown in portion 1272.
[0561] There may be one or more system records 1270 and associated data stored within the data file 1270. As provided herein, the vehicle systems may be any system and/or subsystem that is associated with the vehicle 104. Examples of various systems are described in conjunction with Fig. 3 and other related figures (e.g., systems 324-352, etc.). One example of a system associated with the vehicle 104 is the vehicle control system 204. Other systems may include communications subsystems 344, vehicle subsystems 328, and media subsystems 348, to name a few. It should be appreciated that the various systems may be associated with the interior space 108 and/or the exterior of the vehicle 104.
[0562] Each system may include one or more components. The components may be identified in portion 1274. Identification of the one or more components may be based on hardware associated with the component. This identification may include hardware addresses similar to those described in conjunction with the devices of Fig. 12B.
Additionally or alternatively, a component can be identified by one or more signals sent via the component. Such signals may include an Internet Protocol (IP), or similar, address as part of the signal. Optionally, the signal may identify the component sending the signal via one or more of a header, a footer, a payload, and/or an identifier associated with the signal (e.g., a packet of a signal, etc.).
[0563] Each system and/or component may include priority type information in portion 1276. Among other things, the priority type information stored in portion 1276 may be used by the various methods and systems provided herein to differentiate between critical and non-critical systems. Non-limiting examples of critical systems may correspond to those systems used to control the vehicle 104, such as, steering control, engine control, throttle control, braking control, and/or navigation informational control (e.g., speed measurement, fuel measurement, etc.). Non-critical systems may include other systems that are not directly related to the control of the vehicle 104. By way of example, non-critical systems may include media presentation, wireless communications, comfort settings systems (e.g., climate control, seat position, seat warmers, etc.), and the like. Although examples of critical and/or non-critical systems are provided above, it should be appreciated that the priority type of a system may change (e.g., from critical to non-critical, from non-critical to critical, etc.) depending on the scenario. For instance, although the interior climate control system may be classified as a non-critical system at a first point in time, it may be subsequently classified as a critical system when a temperature inside/outside of the vehicle 104 is measured at a dangerous level (e.g., subzero Fahrenheit, greater than 90-degrees Fahrenheit, etc.). As such, the priority type may be associated with temperature conditions, air quality, times of the day, condition of the vehicle 104, and the like.
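By way of a non-limiting illustration, the climate control example above may be expressed as a small reclassification function. The following Python sketch uses the temperature limits mentioned in the text; the function shape and names are hypothetical.

    # Illustrative reclassification of a system's priority type (portion 1276)
    # by scenario.

    def climate_priority_type(cabin_temp_f):
        """Treat climate control as critical only at dangerous temperatures."""
        if cabin_temp_f <= 0 or cabin_temp_f >= 90:
            return "critical"      # e.g., subzero or above 90 degrees Fahrenheit
        return "non-critical"

    print(climate_priority_type(72))   # non-critical
    print(climate_priority_type(95))   # critical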
[0564] Each system may be associated with a particular area 508 and/or zone 512 of a vehicle 104. Among other things, the location of a system may be used to assess a state of the system and/or provide how the system interacts with one or more users of the vehicle 104. As can be appreciated, each system may have a different set of settings for each area 508 and/or each zone 512, and/or each user of the system. Thus, each set of settings may also be associated with a predetermined zone 512, area 508, system, and/or user. The zone 512 is stored in portion 1220 and the area 508 is stored in portion 1216.
[0565] One or more settings may be stored in portion 1224. These settings 1224 may be similar and/or identical to those previously described. Further, the settings 1224 may also provide for how a system is configured for a particular user. Each setting 1224 may be associated with a different area 508 or zone 512. For instance, a climate control system may be associated with more than one area 508 and/or zone 512. As such, a first user seated in zone 1 512A of area 1 508A may store settings related to the climate control of that zone 512A that are different from other users and/or zones 512 of the vehicle 104. Optionally, the settings may not be dependent on a user. For instance, specific areas 508 and/or zones 512 of a vehicle 104 may include different, default, or the same settings based on the information stored in portion 1224.
[0566] The various systems and/or components may be able to obtain or track health status data of the systems and/or components in portion 1278. The health status 1278 may include any type of information related to a state of the systems. For instance, an operational condition, manufacturing date, update status, revision information, time in operation, fault status, state of damage detected, inaccurate data reporting, and other types of component/system health status data may be obtained and stored in portion 1278.
[0567] Each component and/or system may be configured to communicate with users, systems, servers, vehicles, third parties, and/or other endpoints via one or more communication types. At least one communication ability and/or type associated with a system may be stored in the communication type portion 1279. Optionally, the communication types contained in this portion 1279 may be ordered in a preferential order of communication types. For instance, a system may be configured to preferably communicate via a wired communication protocol over one or more wired communication channels (e.g., due to information transfer speeds, reliability, and the like). However, in this instance, if the one or more wired communication channels fail, the system may transfer information via an alternative communication protocol and channel (e.g., a wireless communication protocol and wireless communication channel, etc.). Among other things, the methods and systems provided herein may take advantage of the information stored in the communication type portion 1279 to open available communication channels in the event of a communication channel failure, listen on other ports for information transmitted from the systems, provide a reliability rating based on the number of redundant communication types for each component, and more. Optionally, a component or system may be restricted from communicating via a particular communication type (e.g., based on rules, traffic, critical/non-critical priority type, and the like). In this example, the component or system may be forced by the vehicle control system 204 to use an alternate communication type where available, cease communications, or store communications for later transfer.
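By way of a non-limiting illustration, the preferential ordering in portion 1279 may drive a simple fallback loop that tries each communication type in turn. In the following Python sketch, the channel names and send callables are hypothetical stand-ins.

    # Illustrative fallback across the preference-ordered communication types
    # of portion 1279.

    def transmit(message, channels):
        """Try each channel in preferential order until one succeeds.

        `channels` is an ordered list of (name, send) pairs, where send() raises
        on failure. Returns the name of the channel that carried the message, or
        None if every channel failed and the message should be stored for later.
        """
        for name, send in channels:
            try:
                send(message)
                return name
            except IOError:
                continue    # channel failed; fall back to the next type
        return None         # store for later transfer or cease communications

    def wired_send(message):
        raise IOError("wired channel down")   # simulate a wired failure

    def wireless_send(message):
        pass                                  # simulate a successful transfer

    print(transmit("health status", [("wired", wired_send), ("wireless", wireless_send)]))
    # -> "wireless"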
[0568] Additionally or alternatively, the data structure 1200 may include a profile information portion 1238 and/or a linked data portion 1242. Although the profile information portion 1238 and/or the linked data portion 1242 may include different information from that described above, it should be appreciated that the portions 1238, 1242 may be similar, or identical, to those as previously disclosed.
[0569] Referring now to Fig. 12D, an optional embodiment of a data structure 1200 is shown. The data file 1280 may include several portions 1216-1286 representing different types of data. Each of these types of data may be associated with a vehicle, as shown in portion 1282.
[0570] There may be one or more vehicle records 1280 and associated data stored within the data file 1280. As provided herein, the vehicle 104 can be any vehicle or conveyance 104 as provided herein. The vehicle 104 may be identified in portion 1282. Additionally or alternatively, the vehicle 104 may be identified by one or more systems and/or subsystems. The various systems of a vehicle 104 may be identified in portion 1284. For example, various features or characteristics of the vehicle 104 and/or its systems may be stored in portion 1284. Optionally, the vehicle 104 may be identified via a unique code or some other type of data that allows the vehicle 104 to be identified.
[0571] Each system may be associated with a particular area 508 and/or zone 512 of a vehicle 104. Among other things, the location of a system may be used to assess a state of the system and/or provide how the system interacts with one or more users of the vehicle 104. As can be appreciated, each system may have a different set of settings for each area 508 and/or each zone 512, and/or each user of the system. Thus, each set of settings may also be associated with a predetermined zone 512, area 508, system, and/or user. The zone 512 is stored in portion 1220 and the area 508 is stored in portion 1216.
[0572] One or more settings may be stored in portion 1224. These settings 1224 may be similar and/or identical to those previously described. Further, the settings 1224 may also provide for how a vehicle and/or its systems are configured for one or more users. Each setting 1224 may be associated with a different area 508 or zone 512. Optionally, the settings may not be dependent on a particular user. For instance, specific areas 508 and/or zones 512 of a vehicle 104 may include different, default, or the same settings based on the information stored in portion 1224.
[0573] The various systems and/or components may be able to obtain or track health status data of the systems and/or components in portion 1278. The health status 1278 may include any type of information related to a state of the systems. For instance, an operational condition, manufacturing date, update status, revision information, time in operation, fault status, state of damage detected, inaccurate data reporting, and other types of component/system health status data may be obtained and stored in portion 1278.
[0574] One or more warnings may be stored in portion 1286. The warnings data 1286 may include warnings generated by the vehicle 104, systems of the vehicle 104, the manufacturer of the vehicle, a federal agency, a third party, and/or a user associated with the vehicle. For example, several components of the vehicle may provide health status information (e.g., stored in portion 1278) that, when considered together, may suggest that the vehicle 104 has suffered some type of damage and/or failure. Recognition of this damage and/or failure may be stored in the warnings data portion 1286. The data in portion 1286 may be communicated to one or more parties (e.g., a manufacturer, maintenance facility, user, etc.). In another example, a manufacturer may issue a recall notification for a specific vehicle 104, system of a vehicle 104, and/or a component of a vehicle 104. It is anticipated that the recall notification may be stored in the warning data field 1286. Continuing this example, the recall notification may then be communicated to the user of the vehicle 104 notifying the user of the recall issued by the manufacturer.
[0575] Additionally or alternatively, the data structure 1200 may include a profile information portion 1238 and/or a linked data portion 1242. Although the profile information portion 1238 and/or the linked data portion 1242 may include different information from that described above, it should be appreciated that the portions 1238, 1242 may be similar, or identical, to those as previously disclosed.
[0576] An embodiment of a method 1300 for storing settings for a user 216 associated with vehicle 104 is shown in Fig. 13. While a general order for the steps of the method 1300 is shown in Fig. 13, the method 1300 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 13. Generally, the method 1300 starts with a start operation 1304 and ends with an end operation 1336. The method 1300 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1300 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-12.
[0577] A person may enter the vehicle space 108. One or more sensors 242 may then identify that a person is sitting within the vehicle 104, in step 1308. For example, sensors 242 in a seat may determine that some new amount of weight has been registered. The amount of weight may fall within predetermined parameters (e.g., over a threshold, in a specific range, etc.). This weight may then be determined to be a person by one or more optical or other sensors 242. The vehicle control system 204 may then determine that a person is in a certain zone 512 or area 508. For example, the sensors 242 may send signals to the vehicle control system 204 that an event has occurred. This information may be sent to the vehicle control system processor 304 to determine the zone 512 and area 508 where the event occurred. Further, the vehicle control system 204 may then identify the person, in step 1312.
[0578] The vehicle control system 204 can receive the information from the sensors 242 and use that information to search the database 1200 that may be stored within the system data 208. The sensor data may be compared to ID characteristics 1212 to determine if the person has already been identified. The vehicle control system 204 may also send the characteristic data from the sensors through the communication network 224 to a server 228 to compare the sensor data to stored data 232 that may be stored in a cloud system. The person's features can be compared to stored features 1212 to determine if the person in the vehicle 104 can be identified.
[0579] If the person has been identified previously and their characteristics stored in portion 1212, the method 1300 proceeds YES to step 1316 where that person may be identified. In identifying a person, the information associated with that person 1240 may be retrieved and provided to the vehicle control system 204 for further action. If a person cannot be identified by finding their sensor characteristics in portion 1212, the method 1300 proceeds NO to step 1320. In step 1320, the vehicle control system 204, using an application, may create a new record in table 1200 for the user. This new record may store a user identifier and their characteristics 1212. It may also store the area 508 and zone 512 in data portions 1216 and 1220. The new record may then be capable of receiving new settings data for this particular user. In this way, the vehicle 104 can automatically identify or characterize a person so that settings may be established for the person in the vehicle 104.
[0580] The input module 312 may then determine if settings are to be stored, in step 1324. Settings might be any configuration of the vehicle 104 that may be associated with the user. The determination may be made after receiving a user input from the user. For example, the user may make a selection on a touch sensitive display indicating that settings currently made are to be stored. In other situations, a period of time may elapse after the user has made a configuration. After determining that the user is finished making changes to the settings, based on the length of the period of time since the setting was established, the vehicle control system 204 can save the setting. Thus, the vehicle control system 204 can make settings automatically based on reaching a steady state of settings for the user.
[0581] The vehicle control system 204 may then store the settings for the person, in step 1328. The user interaction subsystem 332 can make a new entry for the user 1208 in data structure 1204. The new entry may be either for a new user or for new settings listed in portion 1224. The settings may be stored based on the area 508 and zone 512. As explained previously, the settings can be any kind of configuration of the vehicle 104 that may be associated with the user in that area 508 and the zone 512.
[0582] The settings may also be stored in cloud storage, in step 1332. Thus, the vehicle control system 204 can send the new settings to the server 228 to be stored in storage 232. In this way, these new settings may be ported to other vehicles for the user. Further, the settings in storage system 232 may be retrieved, if local storage does not include the settings in storage system 208.
[0583] Additionally or alternatively, the settings may be stored in profile data 252. As provided herein, the profile data 252 may be associated with one or more devices 212, 248, servers 228, vehicle control systems 204, and the like. Optionally, the settings in profile data 252 may be retrieved in response to conditions. For instance, the settings may be retrieved from at least one source having the profile data if local storage does not include the settings in storage system 208. As another example, a user 216 may wish to transfer settings stored in profile data 252 to the system data 208. In any event, the retrieval and transfer of settings may be performed automatically via one or more devices 204, 212, 248, associated with the vehicle 104.
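By way of a non-limiting illustration, the storage flow of method 1300 may be sketched as follows in Python, with dictionaries standing in for the local system data 208 and the cloud storage 232. All names are hypothetical.

    # Illustrative sketch of method 1300: identify or create a user record,
    # then store settings locally and mirror them to cloud storage.

    def store_settings(local_db, cloud_db, user_id, area, zone, settings):
        """Persist a user's zone/area settings locally and to the cloud."""
        record = local_db.setdefault(user_id, {"settings": {}})   # step 1320 if new
        record["settings"][(area, zone)] = settings               # step 1328
        # Step 1332: mirror to cloud storage so other vehicles can retrieve
        # the same settings for this user.
        cloud_db.setdefault(user_id, {}).update({(area, zone): settings})

    local_db, cloud_db = {}, {}
    store_settings(local_db, cloud_db, "user_1", "area_1", "zone_A",
                   {"seat": "position_3", "mirrors": "preset_2"})
    print(local_db["user_1"]["settings"][("area_1", "zone_A")])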
[0584] An embodiment of a method 1400 to configure the vehicle 104 based on stored settings is shown in Fig. 14. A general order for the steps of the method 1400 is shown in Fig. 14. Generally, the method 1400 starts with a start operation 1404 and ends with an end operation 1428. The method 1400 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 14. The method 1400 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1400 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-13.
[0585] The vehicle control system 204 can determine if a person is in a zone 512 or area 508, in step 1408. This determination may be made by receiving data from one or more sensors 242. The vehicle 104 can use facial recognition, weight sensors, heat sensors, or other sensors to determine whether a person is occupying a certain zone 512.
[0586] Using the information from the sensors 242, the vehicle control system 204 can identify the person, in step 1412. The vehicle control system 204 can obtain characteristics for the user currently occupying the zone 512 and compare those characteristics to the identifying features in portion 1212 of data structure 1204. Thus, the settings in portion 1224 may be retrieved by identifying the correct zone 512, area 508, and characteristics for the user.
[0587] The vehicle control system 204 can first determine if there are settings associated with the identified person for that zone 512 and/or area 508, in step 1416. After identifying the user by matching characteristics with the features in portion 1212, the vehicle control system 204 can determine if there are settings for the user for the area 1216 and zone 1220 the user currently occupies. If there are settings, then the vehicle control system 204 can make the determination that there are settings in portion 1224, and the vehicle control system 204 may then read and retrieve those settings, in step 1420. The settings may then be used to configure or react to the presence of the user, in step 1424. Thus, these settings may be obtained to change the configuration of the vehicle 104, for example, how the positions of the seats or mirrors are set, how the dash, console, or heads-up display is configured, how the heat or cooling is configured, how the radio is configured, or how other different configurations are made.
[0588] Embodiments of a method 1500 for storing settings in cloud storage are shown in Fig. 15. A general order for the steps of the method 1500 is shown in Fig. 15. Generally, the method 1500 starts with a start operation 1504 and ends with an end operation 1540. The method 1500 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 15. The method 1500 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1500 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-14.
[0589] The vehicle control system 204 can determine if a person is in a zone 512 or area 508, in step 1508. As explained previously, the vehicle control system 204 can receive vehicle sensor data from vehicle sensors 242 that show a person has occupied a zone 512 or an area 508 of the vehicle 104. Using the vehicle sensor data, the vehicle control system 204 can determine characteristics of the person, in step 1512. These characteristics are compared to the features in portion 1212 of the data structure 1204. From this comparison, the vehicle control system 204 can determine if the person is identified within the data structure 1204, in step 1516. If there is a comparison and the person can be identified, the method 1500 proceeds YES to step 1520. However, if the person cannot be identified, the method 1500 proceeds NO, to step 1524.
[0590] In step 1520, the person is identified in portion 1208 by the successful comparison of the characteristics and the features. It should be noted that there may be a degree of variability between the characteristics and the features in portion 1212. Thus, the comparison may not be an exact comparison but may use methods known in the art to make a statistically significant comparison between the characteristics received from the sensors 242 and the features stored in portion 1212. In step 1524, the characteristics received from sensors 242 are used to characterize the person. In this way, the received characteristics may be used as an ID, in portion 1212, for a new entry for a new user in portion 1208.
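By way of a non-limiting illustration, the non-exact comparison of steps 1516-1524 may be sketched as a nearest-neighbor match with a tolerance, as below in Python. The normalized-distance measure and threshold are hypothetical stand-ins for any statistically significant comparison known in the art.

    # Illustrative identification by non-exact feature comparison.
    import math

    def identify(sensed, known_users, max_distance=0.15):
        """Match sensed characteristics against stored features (portion 1212).

        `sensed` and each stored feature set are equal-length numeric vectors.
        Returns the matching user id, or None so a new record can be created.
        """
        best_id, best_dist = None, float("inf")
        for user_id, features in known_users.items():
            dist = math.dist(sensed, features) / math.sqrt(len(sensed))
            if dist < best_dist:
                best_id, best_dist = user_id, dist
        return best_id if best_dist <= max_distance else None

    known = {"user_1": [0.31, 0.77, 0.52], "user_2": [0.90, 0.10, 0.40]}
    print(identify([0.30, 0.75, 0.50], known))   # "user_1": within tolerance
    print(identify([0.10, 0.10, 0.10], known))   # None: characterize a new user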
[0591] The user may make one or more settings for the vehicle 104. The vehicle control system 204 may determine if the settings are to be stored, in step 1528. If the settings are to be stored, the method 1500 proceeds YES to step 1536. If the settings are not to be stored or if there are no settings to be stored, the method 1500 proceeds NO to step 1532. In step 1532, the vehicle control system 204 can retrieve the settings in the portion 1224 of the data structure 1204. Retrieval of the settings may be as described in conjunction with Fig. 14. If settings are to be stored, the vehicle control system 204 can send those settings to server 228 to be stored in data storage 232, in step 1536. Data storage 232 acts as cloud storage that can be used to retrieve information on the settings from other vehicles or from other sources. Thus, the cloud storage 232 allows for permanent and more robust storage of user preferences for the settings of the vehicle 104.
[0592] An embodiment of a method 1600 for storing gestures associated with the user is shown in Fig. 16. A general order for the steps of the method 1600 is shown in Fig. 16. Generally, the method 1600 starts with a start operation 1604 and ends with an end operation 1640. The method 1600 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 16. The method 1600 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1600 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-15.
[0593] Vehicle control system 204 may receive sensor data from sensors 242 to determine a person is occupying a zone 512 in an area 508 of the vehicle 104, in step 1608. The sensor data may provide characteristics for the person, in step 1612. The vehicle control system 204 may then use the characteristics to determine if the person can be identified, in step 1616. The vehicle control system 204 may compare the characteristics to the features in portion 1212 for the people having been recognized and having data associated therewith. If the characteristics match the features in portion 1212, the person can be identified, and the method 1600 proceeds YES to step 1620. If there is no match, the method 1600 may proceed NO to step 1624. In step 1620, the person may be identified by the vehicle control system 204. Thus, the person's features and associated data record 1240 may be determined and the user identified in portion 1208. If the person is not identified, the vehicle control system 204 can characterize the person in step 1624 by establishing a new record in data structure 1204 using the characteristics, received from the sensors 242, for the features in portion 1212.
[0594] Thereinafter, the vehicle control system 204 may determine if gestures are to be stored and associated with the user, in step 1628. The vehicle control system 204 may receive user input on a touch sensitive display or some other type of gesture capture region which acknowledges that the user wishes to store one or more gestures. Thus, the user may create their own gestures such as those described in conjunction with Figs. 11A-11K. These gestures may then be characterized and stored in data structure 1204. If there are gestures to be stored, the method 1600 proceeds YES to step 1636. If gestures are not to be stored, the method 1600 may proceed NO to step 1632.
[0595] In step 1632, the vehicle control system 204 can retrieve current gestures from portion 1232, which are associated with user 1240. These gestures may then be used to configure how the vehicle 104 will react if a gesture is received. If gestures are to be stored, the vehicle control system 204 may store characteristics, in step 1636, as received from the sensors 242 or from one or more user interface inputs. These characteristics may then be used to create the stored gestures 1232 in data structure 1204. The characteristics may include what the gesture looks like and also what effect the gesture should have. This information may then be used to change the configuration or operation of the vehicle 104 based on the gesture if it is received at a later time.
[0596] An embodiment of a method 1700 for receiving a gesture and configuring the vehicle 104 based on the gesture may be as provided in Fig. 17. A general order for the steps of the method 1700 is shown in Fig. 17. Generally, the method 1700 starts with a start operation 1704 and ends with an end operation 1728. The method 1700 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 17. The method 1700 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1700 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-16.
[0597] A vehicle control system 204 can receive sensor data from vehicle sensors 242. The vehicle sensor data can be used by the vehicle control system 204 to determine that a person is in a zone 512 or area 508, in step 1708. The vehicle sensor data may then be used to compare against feature characteristics 1212 to identify a person, in step 1712. The vehicle control system 204 thereinafter may receive a gesture, in step 1716. The gesture may be perceived by vehicle sensors 242 or received in a gesture capture region. The gesture may be as described in conjunction with Figs. 11A-11K. Upon receiving the gesture, the vehicle control system 204 can compare the gesture to gesture characteristics in portion 1232, in step 1720. The comparison may be made so that a statistically significant correlation between the sensor data or gesture data and the gesture characteristic 1232 is made. Upon identifying the gesture, the vehicle control system 204 can configure the vehicle 104 and/or react to the gesture, in step 1724. The configuration or reaction to the gesture may be as prescribed in the gesture characteristic 1232.
[0598] An embodiment of a method 1800 for storing health data may be as shown in Fig. 18. A general order for the steps of the method 1800 is shown in Fig. 18. Generally, the method 1800 starts with a start operation 1804 and ends with an end operation 1844. The method 1800 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 18. The method 1800 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1800 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-17.
[0599] Vehicle control system 204 can receive sensor data from sensors 242. The sensor data may be used to determine that a person is in a zone 512 or area 508, in step 1808. The sensor data may then be used to determine characteristics of the person, in step 1812. From the characteristics, the vehicle control system 204 can determine if a person may be identified in data structure 1204, in step 1816. If it is determined that the person can be identified in step 1816, the method 1800 proceeds YES to step 1820. If the person cannot be identified, the method 1800 proceeds NO to step 1824. A person may be identified by matching the characteristics of a person from the sensor data to the features shown in portion 1212. If these comparisons are statistically significant, the person may be identified in portion 1208, in step 1820. However, if the person is not identified in portion 1208, the vehicle control system 204 can characterize the person using the vehicle sensor data, in step 1824. In this way, the vehicle control system 204 can create a new record for a new user in data structure 1204.
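As one illustrative sketch of steps 1816-1824, observed characteristics can be compared against the stored features of portion 1212, with a new record created when no comparison is statistically significant. The similarity measure and the ten-percent agreement band below are assumptions introduced for illustration:

```python
def identify_person(observed, known_profiles, threshold=0.8):
    """Steps 1816-1824: match observed characteristics against the
    features in portion 1212; create a new record when no match is
    statistically significant."""
    def similarity(a, b):
        shared = set(a) & set(b)
        if not shared:
            return 0.0
        # Fraction of shared features agreeing within 10% -- a stand-in
        # for the patent's "statistically significant" comparison.
        agree = sum(1 for k in shared
                    if abs(a[k] - b[k]) <= 0.1 * max(abs(b[k]), 1e-9))
        return agree / len(shared)

    best_id, best_score = None, 0.0
    for person_id, features in known_profiles.items():
        score = similarity(observed, features)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= threshold:
        return best_id                       # identified in portion 1208 (step 1820)
    new_id = "user_%d" % (len(known_profiles) + 1)
    known_profiles[new_id] = dict(observed)  # new record (step 1824)
    return new_id
```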
[0600] Thereafter, the vehicle control system 204 may receive health and/or safety data from the vehicle sensors 242, in step 1828. The vehicle control system 204 can determine if the health or safety data is to be stored, in step 1832. The determination is made as to whether or not there is sufficient health data or safety parameters, in portions 1228 and 1236, to provide a reasonable baseline data pattern for the user 1240. If there is data to be received and stored, the vehicle control system 204 can store the data for the person in portions 1228 and 1236 of the data structure 1204, in step 1832.
[0601] The vehicle control system 204 may then wait a period of time, in step 1836. The period of time may be any amount of time from seconds to minutes to days.
Thereafter, the vehicle control system 204 can receive new data from vehicle sensors 242, in step 1828. Thus, the vehicle control system 204 can receive data periodically and update or continue to refine the health data and safety parameters in data structure 1204. Thereafter, the vehicle control system 204 may optionally store the health and safety data in cloud storage 232 by sending it through the communication network 224 to the server 228, in step 1840.
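The periodic refinement of steps 1828-1836 might, for example, fold each new reading into the stored baseline. The exponential moving average, the fixed period, and all names below are assumptions; the patent does not prescribe an update rule:

```python
import time

def refine_baseline(read_sensors, baseline, period_s=60.0,
                    alpha=0.1, cycles=10):
    """Steps 1828-1836: periodically sample health data and fold each
    reading into the stored baseline with an exponential moving average."""
    for _ in range(cycles):
        reading = read_sensors()           # new data from sensors 242 (step 1828)
        for key, value in reading.items():
            prior = baseline.get(key, value)
            baseline[key] = (1 - alpha) * prior + alpha * value
        time.sleep(period_s)               # step 1836: wait a period of time
    return baseline                        # step 1840 could then push to cloud storage
```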
[0602] An embodiment of a method 1900 for monitoring the health of a user may be as shown in Fig. 19. A general order for the steps of the method 1900 is shown in Fig. 19. Generally, the method 1900 starts with a start operation 1904 and ends with an end operation 1928. The method 1900 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 19. The method 1900 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1900 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-18.
[0603] The vehicle control system 204 can receive health data from sensors 242. The health data may be received in step 1908. The vehicle control system 204 may then compare the received health data to stored health parameters in portion 1228 or portion 1236, in step 1912. The comparison may check if there is statistically significant separation or disagreement between the received health data and the stored health data. Thus, the vehicle control system 204 can make a health comparison of the user based on a baseline of health data previously stored. A statistically significant comparison may include determining if there are any parameters more than three standard deviations from the average or norm, any parameter that is increasing or decreasing over a period of eight different measurements, a measurement that is more than two standard deviations from the norm for three or more measurements consecutively, or other types of statistical comparisons.
[0604] If the vehicle control system 204 determines that a measured health parameter does deviate from the norm, the vehicle control system 204 can determine whether the health data is within acceptable limits, in step 1916. If the health data is within acceptable limits, the method 1900 proceeds YES back to receiving new health data, in step 1908. In this way, the health data is periodically or continually monitored to ensure that the driver is in a healthy state and able to operate the vehicle. If the health data is not within acceptable parameters, the method 1900 may proceed NO to step 1924, where the vehicle control system 204 may react to the change in the health data. The reaction may include any measure to provide for the safety of the user, such as stopping the vehicle, beginning to drive the vehicle, driving the vehicle to a new location, such as a hospital, waking the driver with an alarm or other noise, or performing some other function that may help maintain the health or safety of the user.
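The three example tests of step 1912 can be sketched compactly as follows, assuming at least two stored baseline readings and treating the consecutive-measurement test as three or more recent readings beyond two standard deviations; the implementation choices are illustrative only:

```python
from statistics import mean, stdev

def health_deviates(history, recent):
    """Sketch of the comparisons in step 1912: flag a parameter that is
    (a) more than three standard deviations from the stored norm,
    (b) monotonically trending across eight measurements, or
    (c) beyond two standard deviations for three consecutive readings."""
    mu, sigma = mean(history), stdev(history)
    latest = recent[-1]
    if abs(latest - mu) > 3 * sigma:                        # check (a)
        return True
    last8 = recent[-8:]
    if len(last8) == 8 and (
        all(x < y for x, y in zip(last8, last8[1:])) or
        all(x > y for x, y in zip(last8, last8[1:]))):      # check (b)
        return True
    last3 = recent[-3:]
    if len(last3) == 3 and all(abs(x - mu) > 2 * sigma for x in last3):
        return True                                         # check (c)
    return False
```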
[0605] The health data received may be a reaction from the driver. For example, the driver may call for help or ask the vehicle for assistance. For example, the driver or passenger may say that they are having a medical emergency and ask the car to perform some function to help. The function to help may include driving the person to a hospital or stopping the car and calling for emergency assistance.
[0606] Vehicle Network Security
[0607] Vehicle network security is a major obstacle to be overcome in making vehicles Internet capable. Firewall 484 can provide security but, due to the seriousness of the consequences to public safety of a security breach, additional security measures are desirable.
[0608] Network security can be improved by running, on a common computer, first and second operating systems simultaneously for, respectively, critical and non-critical tasks, functions, and operations of each processing module of the vehicle control system 204, whereby, in the event of a security breach event, the second operating system is disabled or its execution discontinued.
[0609] Other vehicle network security algorithms will be discussed with reference to Figs. 23-25. An effective approach to protecting the critical tasks, functions, and operations from attack is to have one computer that handles vehicle tasks, functions, and operations and a different computer that handles infotainment tasks, functions, and operations. The first and second computers (and the distributed processing networks of which each of the first and second computers are a part) are disconnected from one another and unable to communicate with one another either wirelessly or non-wirelessly (e.g., by a signal transmission line such as a bus). The first computer (and/or any component in the distributed processing network of which it is a part) is not discoverable by the vehicle wireless network access points 456 and/or the second computer and does not appear on any network topology (or management information base or MIB) stored in the second computer. The converse may or may not be true with respect to the second computer being discoverable by the first computer; that is, a network topology (or MIB) of a component in the network controlled by the first computer, including the first computer itself, may describe or include information regarding one or more components of the network controlled by the second computer or the second computer itself. Stated another way, the infotainment network (or the distributed processing network containing any component (such as one or more of the IP router 420, IP module 1048, communications module 1052, device discovery daemon 1020, media controller 492, media processor 808, access point 456, match engine 812, video controller 840, audio controller 844, speech synthesis module 820, network transceiver 824, signal processing module 828, device interaction module 818, and combo controller 460) participating in, providing, or handling infotainment content) can be prohibited from communicating with a vehicle non-infotainment control network containing any component participating in, providing, or handling signaling for controlling non-infotainment vehicle control tasks, functions, or operations, such as critical tasks, functions or operations (such as one or more of the vehicle control module 826, location module 896, automobile controller 8104, vehicle systems transceiver 8108, traffic controller 8112, network traffic transceiver 8116, and environmental control module 830). The first computer can have no contactable address (such as a local IP address or global unicast address) on any wireless or non-wireless network, including the infotainment network, and is not addressable, contactable, or otherwise able to communicate with the second computer, a component in the network controlled by the second computer, or any portable computational device (such as the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312).
[0610] Critical tasks, functions or operations can include one or more of (depending on the particular vehicle) monitoring, controlling, and/or operating the ECU, TCU, door settings, window settings, and/or blind spot monitor, monitoring, controlling, and/or operating the safety equipment (e.g., airbag deployment control unit, collision sensor, nearby object sensing system, seat belt control unit, sensors for setting the seat belt, etc.), monitoring and/or controlling certain critical sensors such as the power source controller and energy output sensor, engine temperature, oil pressure sensing, hydraulic pressure sensors, sensors for headlight and other lights (e.g., emergency light, brake light, parking light, fog light, interior or passenger compartment light, and/or tail light state (on or off)), vehicle control system sensors, and/or steering/torque sensor, controlling the operation of the engine (e.g., ignition), head light control unit, power steering, display panel, switch state control unit, power control unit, and/or brake control unit, and/or issuing alerts to a user and/or remote monitoring entity of potential problems with a vehicle operation.
[0611] Tasks, functions, or operations that are less critical to vehicle operation (and unrelated to infotainment) can, depending on the application and the particular vehicle, further include one or more of monitoring, controlling, and/or operating a wireless network sensor (e.g., Wi-Fi and/or Bluetooth sensor), cellular data sensor, emissions control, seating system controller and sensor, monitoring certain non-critical sensors such as ambient (outdoor) weather readings (e.g., temperature, precipitation, wind speed, and the like), odometer reading sensor, trip mileage reading sensor, road condition sensors (e.g., wet, icy, etc.), radar transmitter/receiver output, brake wear sensor, oxygen sensor, ambient lighting sensor, vision system sensor, ranging sensor, parking sensor, heating, venting, and air conditioning (HVAC) system and sensor, water sensor, air-fuel ratio meter, hall effect sensor, microphone, radio frequency (RF) sensor, and/or infrared (IR) sensor.
[0612] Infotainment tasks, functions, or operations can include one or more of monitoring, controlling, and/or operating a wireless network sensor (e.g., Wi-Fi and/or Bluetooth sensor), cellular data sensor, emissions control, entertainment system, receiving, processing, and/or providing media and/or multimedia content. Infotainment tasks, functions, or operations are typically performed by one or more of the IP router 420, IP module 1048, communications module 1052, device discovery daemon 1020, media controller 492, media processor 808, access point 456, match engine 812, video controller 840, audio controller 844, speech synthesis module 820, network transceiver 824, signal processing module 828, device interaction module 818, desktop manager 1012, windows manager 1032, application manager 1036, panel launcher 1040, desktop plugin 1024, and combo controller 460.
[0613] Another approach is depicted in Fig. 23, which shows a common computer 2300 having a microprocessor 2320 executing first and second virtual machines 2308 and 2312. In common memory 2304, each of the first and second virtual machines is an isolated software container with its own operating system 1004 that executes one or more applications. By way of example and with reference to the first and second computers discussed above, the first virtual machine can simulate the first computer while the second virtual machine can simulate the second computer. Applications stored in each virtual machine would be those applications responsible for critical tasks, functions, and operations in the first virtual machine 2308 and non-critical tasks, functions, and operations in the second virtual machine 2312. Alternatively, the applications stored in the first virtual machine 2308 can be those controlling vehicle tasks, functions, and operations while those stored in the second virtual machine 2312 can be those controlling infotainment tasks, functions, and operations. For example, the first virtual machine 2308 can include one or more of vehicle control module 826, location module 896, automobile controller 8104, vehicle systems transceiver 8108, traffic controller 8112, network traffic transceiver 8116, and environmental control module 830 while the second virtual machine 2312 can include one or more of the IP module 1048, communications module 1052, device discovery daemon 1020, media controller 492, media processor 808, access point 456, match engine 812, video controller 840, audio controller 844, speech synthesis module 820, network transceiver 824, signal processing module 828, device interaction module 818, and combo controller 460. Because each of the first and second virtual machines is completely separate and independent, multiple virtual machines can run simultaneously on a common computer. A hypervisor module 2316 decouples the first and second virtual machines 2308 and 2312 from the host and dynamically allocates computing resources to each of the first and second virtual machines as needed. Computing resources include, for example, interrupt requests, input/output memory locations, direct memory access, memory capacity allocated for each operating system, scheduling of central processing units, and the like.
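A minimal sketch of the arrangement of Fig. 23 follows, with two virtual-machine descriptors and a crude stand-in for the hypervisor 2316 dividing resources as demand shifts; the proportional-share policy and all identifiers are assumptions introduced for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    """Hypothetical descriptor for the isolated containers of Fig. 23."""
    name: str
    modules: list = field(default_factory=list)
    cpu_share: float = 0.0
    memory_mb: int = 0

vm_critical = VirtualMachine("first_vm_2308",
    ["vehicle_control_module_826", "automobile_controller_8104"])
vm_infotainment = VirtualMachine("second_vm_2312",
    ["media_controller_492", "audio_controller_844"])

def allocate(vms, demand, total_memory_mb=4096):
    """Stand-in for the hypervisor 2316 dynamically dividing computing
    resources between the virtual machines in proportion to demand."""
    total = sum(demand.get(vm.name, 0.0) for vm in vms) or 1.0
    for vm in vms:
        share = demand.get(vm.name, 0.0) / total
        vm.cpu_share = share
        vm.memory_mb = int(share * total_memory_mb)

allocate([vm_critical, vm_infotainment],
         {"first_vm_2308": 0.3, "second_vm_2312": 0.7})
```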
[0614] Yet another approach is shown in Fig. 24. As illustrated in Fig. 24, a common computer 2400 includes first and second operating systems 1004a and 1004b stored in common memory 2412. Each of the first and second operating systems 1004a and 1004b, executed by the microprocessor 2320, can have one or more of a libraries layer 2408, application framework layer 2416, and application layer 2420. Applications are executed in the application layer supported by the application framework layer of the respective operating system. For example, in the case of infotainment, the application framework layer may include a window manager, activity manager, package manager, resource manager, telephony manager, gesture controller, and/or other managers and services. Each operating system can rely on a common kernel 1028 for process isolation, memory management, and threading support. The libraries layer includes user libraries that implement common functions such as input/output and string manipulation, graphics libraries, database libraries, communication libraries, and/or other libraries.
[0615] By way of example and with reference to the prior paragraph, the first operating system and environment can simulate the first computer while the second operating system and environment can simulate the second computer. Applications executing in each operating system and environment would be those applications responsible for critical and non-critical tasks, functions, and operations. More specifically, the applications stored in the application framework layer of the first operating system 1004a can be those controlling vehicle tasks, functions, and operations while those stored in the second operating system 1004b can be those controlling infotainment tasks, functions, and operations. For example, the first operating system 1004a can include one or more of vehicle control module 826, location module 896, automobile controller 8104, vehicle systems transceiver 8108, traffic controller 8112, network traffic transceiver 8116, and environmental control module 830 while the second operating system 1004b can include one or more of the IP module 1048, communications module 1052, device discovery daemon 1020, media controller 492, media processor 808, access point 456, match engine 812, video controller 840, audio controller 844, speech synthesis module 820, network transceiver 824, signal processing module 828, device interaction module 818, and combo controller 460.
[0616] The first operating system 1004a can run in a separate execution environment from the second operating system 1004b. For example, one operating system (typically the first operating system 1004a) can run in a root execution environment while the other operating system (typically the second operating system 1004b) runs in a secondary execution environment established under the root execution environment. Processes and applications running on the second operating system, for instance, can access user libraries, manager(s), and service(s) in the secondary execution environment. Generally, applications for one operating system do not run on the other operating system.
[0617] The kernel 1028 is shared such that the first and second operating systems run concurrently and independently on the shared kernel 1028. Specifically, both operating systems 1004a,b interface to the shared kernel 1028 through the same kernel interface, such as by making system calls to the shared kernel 1028. The shared kernel 1028 manages task scheduling for processes of both operating systems and resource
management, such as CPU scheduling, memory access, and input/output.
[0618] Other techniques exist for running multiple operating systems, including dual-boot (multiple operating systems loaded on a computing device one at a time), hosted virtual machines, and operating-system-level virtualization.
[0619] Fig. 25 depicts an operation of the hypervisor module 2316 or common kernel 1028 (collectively "computer control module").
[0620] In step 2500, the computer control module detects a stimulus, such as a potential security breach, attack, or other intrusion or attempted intrusion notification, a clock signal, an interrupt, and the like. Examples of breaches, attacks, and other intrusions include denial-of-service attacks, spoofing, man-in-the-middle attacks, ARP spoofing, smurf attacks, buffer overflows, heap overflows, format string attacks, SQL injection, and other cyber attacks.
[0621] In step 2504, the computer control module collects metrics regarding operations of the first and/or second operating system (e.g., of the first or second virtual machine) and/or of the computer or associated therewith (e.g., of the network controlled by the first or second operating system). Metrics include network metrics, computational device metrics, and the like. Examples of network metrics include link utilization, number of hops, speed of the path, packet loss, latency, path reliability, path bandwidth, network overhead, maximum transmission unit (MTU), throughput, and the like; examples of computer metrics include load, transaction times, database access times, available memory, central processing unit ("CPU") usage, memory usage, and the like.
[0622] In decision diamond 2508, the computer control module determines whether or not the collected metrics indicate a normal or abnormal operating state of the computer and/or of the first or second operating system and/or associated therewith. A normal operating state is deemed to occur when the collected metrics are within normal operational ranges. An abnormal operating state is deemed to occur when one or more of the collected metrics are outside of normal operational ranges. The ranges are
predetermined or preconfigured by the manufacturer. Where some metrics are normal and others abnormal, the metrics may be ranked or scored, with the cumulative score being the basis for the normal or abnormal operating state determination.
[0623] In one application, the computer control module acts differently depending on whether the abnormality impacts the first or second operating system. When the first operating system responsible for critical vehicle functions, tasks, or operations is behaving abnormally, the driver can be warned of the potential problem and asked to terminate operation of the vehicle. Execution of certain critical vehicle tasks, functions or operations can be discontinued in favor of manual control of the vehicle tasks, functions, or operations. Execution of certain critical vehicle tasks, functions, or operations can be discontinued while other more critical tasks, functions, or operations continue to be executed. For example, steering, brakes, headlights, wipers, and vehicle acceleration can continue to be executed while environmental or climate control operations are
discontinued. When the second operating system responsible for infotainment functions, tasks, or operations behaves abnormally, the driver can be warned of the potential problem and execution of the second operating system terminated automatically.
[0624] The determination as to whether the first or second operating system is behaving abnormally can be different. The first operating system can require more metrics to be abnormal than the second operating system or vice versa. Because the second operating system is more likely to be subject to an attack from the Internet or a portable device, a lesser deviation from normal operational ranges can be considered to be abnormal operation of the second operating system compared to the deviation required for the first operating system to be considered to be operating abnormally. Stated another way, a lower threshold is required to find abnormal operation of the second operating system when compared to the first operating system.
[0625] When the operating state is abnormal, the computer control module, in step 2512, shuts down or disables the second virtual machine, operating system, or computer so that the first virtual machine, operating system, or computer continues to perform critical and/or automotive tasks, functions or operations while shutting down, terminating, or discontinuing execution of infotainment tasks, functions, or operations.
[0626] When the operating state of a network in communication with the computer control module is abnormal as determined by comparison of a network metric with one or more threshold values, the second, but not the first, operating system can be shut down, particularly when the second operating system (but not the first operating system) is controlling and/or receiving input from the network.
[0627] After step 2512 or when the operating state is normal, the computer control module, in step 2516, records the metrics and findings and returns to step 2500.
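One pass through steps 2500-2516 might be sketched as below, applying the lower abnormality threshold for the second (infotainment) operating system discussed in paragraph [0624]; the example metrics, normal ranges, and threshold values are assumptions rather than disclosed values:

```python
# Preconfigured normal operational ranges (decision diamond 2508).
NORMAL_RANGES = {"cpu_usage": (0.0, 0.9),
                 "packet_loss": (0.0, 0.02),
                 "latency_ms": (0.0, 200.0)}

def abnormality_score(metrics):
    """Count collected metrics falling outside their normal ranges."""
    return sum(1 for name, value in metrics.items()
               if name in NORMAL_RANGES and not
               (NORMAL_RANGES[name][0] <= value <= NORMAL_RANGES[name][1]))

def control_cycle(first_os_metrics, second_os_metrics, log):
    """One pass through steps 2500-2516. The second (infotainment)
    operating system uses the lower threshold of paragraph [0624]."""
    action = "none"
    if abnormality_score(second_os_metrics) >= 1:    # lower threshold
        action = "shutdown_second_os"                # step 2512
    elif abnormality_score(first_os_metrics) >= 2:   # stricter threshold
        action = "warn_driver"
    log.append({"first": first_os_metrics, "second": second_os_metrics,
                "action": action})                   # step 2516: record findings
    return action
```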
[0628] The above discussion can be applied to a vehicle having separate computers running the first and second operating systems, respectively. In other words, the computer control module would monitor the operations of each of the first and second computers and a corresponding network providing input to each and perform the steps of Fig. 25 with respect to each of the first and second operating systems in each of the first and second computers and their respective networks.
[0629] Critical tasks, functions, and operations can be performed by an application executing in the first operating system and less critical tasks, functions, and operations (and infotainment tasks, functions, and operations) can be performed by an application executing in the second operating system. For instance, vehicle controls (e.g., steering, braking, indicating, etc.) can be considered mission critical and performed in the first operating system environment, whereas environmental controls and radio stations and entertainment presentations can be considered as non-critical and performed in the second operating system environment. Among other things, the critical tasks, functions, and operations can be performed not only by the first operating system but also alternatively by a redundant operating system. The redundant operating system may be configured to run in tandem with the first operating system performing critical tasks, functions, and operations and replace, or receive a "hot" handoff from, the first operating system in the event of a failure or malfunction of the first operating system or an application executing in the first operating system. In a "hot" handoff, the redundant operating system may replace the mission critical operating system without interruption of operation.
Additionally or alternatively, the first operating system may utilize one or more of error checking communication protocols, secure communication standards, encrypted communications, high speed communications protocols, and the like.
[0630] The Vehicle as an Artificially Intelligent Assistant
[0631] The vehicle control system 204 can act as an artificially intelligent assistant that, based on an awareness of a selected vehicle occupant's persona (e.g., his or her physical, psychological, mental and other characteristics, needs, desires, behavior, personality, goals, habits, biometric information, likes or interests, dislikes, preferences, and the like) and/or the persona of family and friends of the selected vehicle occupant, proactively (without a preceding request of the selected vehicle occupant) provides feedback, suggestions, reminders, recommendations, and/or other types of assistance to the selected vehicle occupant. For example, the vehicle control system 204 can characterize, define, and/or analyze a person's persona based on profile data 252 of the person, device data 220 associated with the person, stored data 232, system data 208, and information related to the person as collected by vehicle interior and exterior sensors. In other words, a vehicle occupant's persona can include the information in one or more of the profile data 252 of the person, device data 220 associated with the person, stored data 232, system data 208, and information related to the person as collected by vehicle interior and exterior sensors. Based on the persona of the selected occupant and on vehicle-related information (such as vehicle context, state, external surroundings, location, past, current, and/or intended path of travel, waypoints, and destination of the vehicle), the vehicle control system 204 can proactively provide suggestions, reminders, recommendations, and/or other types of assistance to the selected vehicle occupant.
[0632] The persona can include a user selected and/or configured avatar, which is provided to the user via any graphical user interface by which the vehicle control system provides suggestions, reminders, recommendations and/or other types of assistance to the selected user as a vehicle occupant. As will be appreciated, an "avatar" is the graphical and/or voice representation of an object that can be the user or the user's alter ego or character or of a person selected by the user. It may take either a three-dimensional form, as in games or virtual worlds, or a two-dimensional form as an icon in Internet forums and other online communities. It can have not only a visual image but also distinct voice, mannerisms, and/or other behavior. It is commonly an object selected and/or configured by the user. Avatars can be used as virtual embodiments of embodied agents, which are driven more or less by artificial intelligence rather than real people.
[0633] Each occupant of a vehicle can have a differently configured and/or appearing avatar at his or her display interacting with him or her. As set forth below, the avatar may accompany the user from vehicle-to-vehicle as part of the user's persona, which can be shared from vehicle to vehicle.
[0634] For example, the vehicle control system, based on an awareness of a selected person's persona and the vehicle-related information, can note vehicle-related alarms and bring to a selected person's attention instant messages, emails, voice mails, and missed phone calls from friends, current activities and/or locations of friends, and points of interest near the vehicle or its path of travel.
[0635] In a further illustration, the vehicle control system notes that a vehicle driver's anniversary is next week, that the driver's spouse has pinned a bracelet that she likes, that the bracelet is on sale at a specific store, and that the driver has an appointment near the store and, in response, alerts the driver to the sale and spousal interest in the bracelet and asks the driver, via any one of the user interfaces 212, 248, input/output module 312, and user interface/input interfaces 324, whether the vehicle control system can (i) add a stop to the driver's calendar and/or waypoint for the store to the driver's path of travel and/or (ii) make the purchase of the bracelet online for the driver.
[0636] In yet a further illustration, the vehicle control system notes that a vehicle warning has been triggered, such as an actual or potential engine or other vehicle malfunction (e.g., low battery charge voltage, low oil pressure, high engine pressure, low fuel level, low tire pressure, air bag initiator malfunction, etc.), identifies vehicle service facilities in proximity to the current vehicle position or path of travel that are capable of repairing or addressing the source of the warning (e.g., a service garage, fuel station, dealership, etc.), contacts the facility to provide details on the malfunction, determines service cost and possible appointment times, notifies the driver of the possible appointment times, service cost, and corresponding facility location, and, at the option of the driver, schedules an appointment at the facility and adds a stop to the driver's calendar and/or waypoint for the facility to the driver's path of travel.
[0637] In yet a further illustration, the vehicle control system notes that a friend or family member of a vehicle occupant is in spatial proximity to the current vehicle position or path of travel, notifies the vehicle occupant of the identity and current location of the friend or family member, and, at the option of the occupant, contacts the friend or family member to notify them of the vehicle occupant's current position and arrange a meeting at a specified location nearby the current vehicle location or path of travel and adds a stop to the driver's calendar and/or waypoint for the meeting location to the driver's path of travel. This is particularly useful where the friend or family member is mobile too, such as in a vehicle or on a bike.
[0638] In yet a further illustration, the vehicle control system notes that the vehicle is on a long distance trip (based on the driver's or other occupant's electronic calendar and/or current location relative to the driver's or other occupant's home location), identifies hotels or motels in proximity to the current vehicle position or path of travel, contacts the hotel or motel to determine possible room types (e.g., one or two beds, bed size, etc.), cost, and availabilities, notifies the driver of the accommodations available and cost, and, at the option of the driver, books an accommodation at a selected hotel or motel and adds a stop to the driver's calendar and/or waypoint for the hotel or motel to the driver's path of travel.
[0639] In yet a further illustration, the vehicle control system notes that the vehicle is on a long distance trip, identifies a point or location of interest based on a persona of a vehicle occupant in proximity to the current vehicle position or path of travel, collects and provides information about the point or location of interest to the vehicle occupant, optionally contacts the point or location of interest to determine business hours and costs to visit the point or location of interest, notifies the occupant of the business hours and cost, and, at the option of the driver, books an accommodation at the point or location of interest and adds a stop to the occupant's calendar and/or waypoint for the point or location of interest to the driver's path of travel. The point or location of interest can be a historical landmark, museum, library, church, store, restaurant, coffee shop, mall, healthcare facility, and the like.
[0640] In yet a further illustration, the vehicle control system notes, from the electronic calendar of the driver and/or historic driver behavior, that the destination of the vehicle is a meeting, place of employment, or other point of interest and the location thereof, determines that the path of travel of the vehicle will not arrive at the destination timely such as due to traffic, road construction, or other delays, determines a different route that will arrive timely at the destination, proposes the alternate route to the driver, and, at the driver's option, changes the on board navigation to redirect the driver along the alternate route. The vehicle control system can also send a message, such as an email or instant message, to one or more meeting participants or his or her place of employment to let them know that he or she will or may be late.
[0641] In a further illustration, an interior or exterior vehicle sensor 340 senses a condition or activity, such as of a vehicle passenger (e.g., a child passenger unbuckling a seat belt, children fighting in the back seat, an inappropriate gesture of a vehicle occupant, and the like), and notifies a selected vehicle occupant, such as the driver, of the condition or activity.
[0642] In a further illustration, a vehicle control system based on information received from a sensor, such as a camera sensor 760, can identify and/or distinguish an occupant's actions, clothing, held objects or objects in the possession of the user, and the like and perform an action. For instance, the vehicle control system can direct the occupant to a current sale of an identified object worn by or in the possession of the occupant that is in spatial proximity to the current vehicle location. Information regarding objects associated with the user can also be identified by radio frequency identification readers and tags, barcode readers, quick response code readers and the like.
[0643] In a further illustration, the vehicle control system determines that the occupant is driving the vehicle to a specific store and provides an alternate store site based on a busyness of one or more store locations. Parking information (e.g., how many spaces are occupied or free based on parking sensors, metered parking information, etc.) may be used to help determine an optimal store or destination location. A user, for instance, may be driving down a street and pull into a Starbucks™ parking lot where the user and five other vehicles cannot find a parking spot. In this case, the user may spend five minutes waiting for the five other vehicles to clear the lot so the user can travel to another location. However, the vehicle control system, knowing that the parking lot associated with that particular Starbucks™ was fully occupied, may cause the vehicle to alter the route (or suggest an alternate route) to go to another Starbucks™ location based on the traffic and/or lack of parking.
[0644] In a further illustration, the vehicle control system may use image sensors to translate signs (e.g., road signs, store signs, advertisements, billboards, etc.). The translation may be performed by the processing power of the mobile device, by processing in the cloud, combinations thereof, etc., and the translated signage message provided to the occupant.
[0645] In a further illustration, the vehicle control system may present translated availability information and/or assistance to guide a user in finding products traditionally purchased on trips. This guidance may be based upon the past behavior associated with the user persona. For instance, when a user travels to a new place, the user may always (or mostly) shop for toothpaste, water, and ibuprofen. As can be appreciated, some foreign countries may not offer all of the products at a single store in the manner of North American supermarkets. Continuing this example, the vehicle may provide a translation for the purchasing behavior by at least providing one or more shopping outlets where a user can procure toothpaste, water, and ibuprofen. In this example, the vehicle control system may state in the user's own language, "You can find toothpaste and ibuprofen at the Apotheke on Dipplestrasse in downtown Munich. For water you may wish to visit the Marktkauf on Maybach Gaenge two blocks south of Dipplestrasse."
[0646] In a further illustration, the user may query for translation and visual assistance, such as "What does Aspirin look like in Germany?" In response, the system may provide a picture or image and a translation (text, voice, etc.) via the mobile device and/or the vehicle.
[0647] In a further illustration, the vehicle control system and/or the mobile device, based on vehicle location and/or path of travel, may present travel and tourism options based on popular sites and/or destinations and, if selected, add the selected option as a waypoint or destination. For instance, the vehicle control system may provide the popular sites and/or destinations with a message, such as "75% of travelers visit the following destinations." The vehicle control system may develop a "perfect itinerary" based on the destinations desired and the timeframe. In some cases, the itinerary may be based on desired locations and/or locations to be avoided. For instance, a user may not wish to visit an industrial neighborhood, and as such the itinerary may provide routes that avoid industrialization.
[0648] In a further illustration, the vehicle control system can use a vehicle location-based trigger to activate and/or deactivate features and/or settings at a waypoint or destination, such as a home, a garage, and the like. For instance, the user may be returning home from work (this determination may be predicted based on prior trips made at the same or similar time of day by the same user); at a certain distance from the home, the vehicle may send a signal to a home automation system to set appropriate levels of lighting, thermostat, and/or initiate/deactivate other devices and/or open a garage door or entrance gate. A user may be traveling home from work and the vehicle control system may send a message, such as a text message, phone call, email, etc., to a third party to indicate the user is coming home. In some cases, the message may be identified as being sent from the vehicle (e.g., a vehicle icon with the message). The third party may respond to the message, which is then relayed, visually and/or audibly, to the user via the components of the vehicle (e.g., dashboard, head unit, speakers, combinations thereof, etc.). By way of example, a third party may state "Please pick up some eggs and fabric softener before you come home, thanks."
[0649] In a further illustration, the vehicle control system can learn which routes one or more users travels at any given time period (hours, days, weeks, months, etc.). This information may be used to adjust routes based on traffic, weather, mood, etc. For instance, a user may plan to travel (or is traveling) to work on a Monday morning where heavy traffic is detected ahead in the user's "usual" Monday morning route. Upon detecting the traffic along the usual route, the vehicle may present the user with alternate route selections, or automatically adjust a route presented to the user. Routes may be altered by the vehicle control system based on past purchases, patterns, preferences and the like associated with a user profile. For instance, if a user visits McDonalds every morning for a cup of coffee and an egg McMuffin, the route guidance may alter the route of the individual to pass along several McDonalds. The route alteration may be terminated once the user stops to purchase products at the McDonalds. As can be appreciated, businesses may pay for route alteration and suggestion as a form of advertising.
[0650] In a further illustration, the vehicle control system centers around interactions and learns from user preferences. For instance, the vehicle control system learns that the user tends to like it warmer in the morning than in the afternoon and adjusts temperature settings in the vehicle accordingly. Further, because the vehicle has cameras and facial recognition, it can know the user's preferences and apply them no matter what seat the user is in. In another instance, the user has been surfing the Internet looking for new bikes, and the vehicle control system informs the user that there is a bike shop around the corner and adds the bike shop as a waypoint or destination.
[0651] In a further illustration, the vehicle control system synchronizes with an electronic calendar of the user to create (i) smarter alarms and (ii) updates. Smart alarms mean that, instead of a standard 15-minute warning before a meeting, if it is an offsite meeting with an address entered, the system can determine how much time it will take based upon traffic, previous driving habits, the amount of time it generally takes to exit the office and get to the car, etc., and change the warning accordingly. The updates can be triggered based upon the time of arrival determination from the GPS or as calculated above and send SMS notices to other attendees or prompt to call the meeting leader. Similarly, if the car determines the user is stopping into Starbucks, it could remotely ask the other meeting attendees if they want anything.
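A smart alarm of this kind reduces to replacing the fixed 15-minute lead with a computed one; the sketch below, with assumed buffer values, illustrates the arithmetic:

```python
def smart_alarm_lead_minutes(drive_minutes,
                             exit_building_minutes=5.0,
                             habit_buffer_minutes=5.0):
    """Replace the fixed 15-minute warning with one derived from
    traffic-adjusted drive time plus observed habits ([0651])."""
    return drive_minutes + exit_building_minutes + habit_buffer_minutes

# e.g., 32 minutes of predicted driving yields a 42-minute-early warning
warning_lead = smart_alarm_lead_minutes(32.0)
```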
[0652] In a further illustration, the vehicle control system knows a child's curfew and also the distance from home. Given that information, it could inform the child driver that he or she must leave to make it home in time to avoid a curfew violation. Further, since it knows the occupants, it could potentially include time to deliver them home too.
[0653] In a further illustration, the user is training for a marathon. He has a wearable device attached to his clothing that monitors his biometric data (e.g., heart rate, body temperature, energy levels, sleeping habits, etc.) throughout the day. It also has a GPS that tracks location and terrain and an MP3 player. On his way to work, the wearable device has synchronized with the vehicle control system. The user verbally checks off his breakfast from his marathon widget on the home page displayed on the on board console and confirms his goals to be realized that day as well as his calendar appointments. On his way to a meeting later that day, the vehicle control system notes that it is near lunch time, reviews the restaurants in spatial proximity to the vehicle, determines which restaurants have menu items consistent with the user's dietary goals that day, and recommends a nearby restaurant. The user accepts the recommendation, and the vehicle control system books an advanced reservation and adds the restaurant as a waypoint. On his way back to the office, the vehicle control system notes that the user's heart rate is too low, notifies the user, and, upon the user's confirmation, adjusts the oxygen levels in the vehicle, seat and lumbar settings, and temperature levels to inhibit drowsiness.
[0654] In a further illustration, a user and his child enter the vehicle while the child is watching programming (such as a movie on Netflix™) on a tablet computer. The vehicle control system recognizes the user, determines his role in the car or seating position (driver), and applies the user's personal settings. The user's phone is automatically paired with the multimedia controller via the local or on board wireless access point. The vehicle control system determines that the child is in the back seat and recognizes his identity. The vehicle control system asks the user whether the programming should continue to be provided to the child on the child's rear seat display subsystem. After dropping the child off at school, the vehicle control system determines, based on the user's electronic calendar, that the user is behind schedule for a next appointment due to an accident on the current path of travel, recommends an alternate path of travel, and sends a message to the meeting participants that the user will be late, providing an estimated arrival time and reason for the late arrival. The message may be automatically generated or dictated by the user via the vehicle infotainment system. After work is over, the user drives home during a baseball game. The persona of the user informs the vehicle control system that the user is a baseball fan, and the system recommends to the user that he tune into the game on the radio. At the user's request, the infotainment system tunes automatically into the game broadcast on the radio.
[0655] The occupant persona is not limited to information obtained and stored directly by the vehicle. It can include information obtained by the vehicle from other sources, such as from a server 228 over a communication network 224, such as the Internet and IP router 420. An example would be from a social network profile or page of a selected vehicle occupant, a web or home page of the selected vehicle occupant, a telecommunications presence server associated with the selected vehicle occupant, and the like. Another example would be from another vehicle driven or occupied by the selected vehicle occupant. Another example would be from a home computer network of the selected vehicle occupant. In the latter examples, the information can be obtained not only via the Internet but also directly via synchronization when the vehicles are parked in a common garage of the occupant's home, even when the ignition of the vehicle(s) is/are turned off. Another example would be from a portable communication or computational device of an occupant, such as from a device or user interface 212, 248. As noted, a transceiver of the vehicle can provide a mobile hot spot functionality not only to any user device(s) therein but also to a computer of another vehicle located in proximity to the selected vehicle. Synchronization with another vehicle's computer or with a home computer can be done, for example, by the HTTP server capabilities of the vehicles.
[0656] The persona can be stored as part of a template locally, remotely, or
combinations thereof. The persona can be stored in the cloud, on a personal
communication device (such as a phone or tablet computer), in a memory of the vehicle, on a local storage memory, key fob, and the like. It may be encrypted and accessed and/or modified by the user via authorization through a verification system, which may rely on credentials and/or biometric information. The template may be universally applied by all vehicle manufacturers to enable the persona to be portable among vehicles so that the intelligent capability of the vehicle appears to follow the associated user from vehicle to vehicle even when manufactured by different companies. The template can even follow the user from seat position to seat position within a vehicle, automatically populating vehicle settings, such as seat and console settings and features associated with a new seating position, when the user is identified as having moved to the new seat.
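As a hedged sketch of a portable persona template, the payload might carry an integrity tag that a receiving vehicle verifies before applying settings; a production system would use genuine encryption and key management, for which the simple hash tag below merely stands in, and all names are assumptions:

```python
import json
import hashlib

def pack_persona(persona, passphrase):
    """Hypothetical portable template: persona payload plus an integrity
    tag a receiving vehicle checks before applying any settings."""
    payload = json.dumps(persona, sort_keys=True)
    tag = hashlib.sha256((passphrase + payload).encode()).hexdigest()
    return {"payload": payload, "tag": tag}

def unpack_persona(template, passphrase):
    """Verify the tag; refuse to apply settings if verification fails."""
    expected = hashlib.sha256(
        (passphrase + template["payload"]).encode()).hexdigest()
    if expected != template["tag"]:
        return None               # verification failed; do not apply settings
    return json.loads(template["payload"])

template = pack_persona(
    {"seat_position": 4, "climate_c": 21.5, "avatar": "fox"}, "secret")
assert unpack_persona(template, "secret") is not None
```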
[0657] A console may not support multimedia entertainment, particularly video.
Multimedia content is typically displayed on remote display subsystems dedicated to a specific seating position.
[0658] By way of illustration, user personas may be presented (or uploaded) to a vehicle in advance of a user reaching a vehicle (rental cars, etc.). For example, Hertz® car rental agency can maintain a persona for each customer and upload the persona onto the car rented to the customer at the time the leasing is consummated. While the user is walking from the rental desk to the car, the car is automatically adjusting all of the vehicle settings as set forth in the user persona. In this way, the user need not manually select any settings when he or she begins driving the car.
[0659] By way of further illustration, as part of a travel package, a rental car system may determine where a user is staying (e.g., hotel, motel, building, etc.), a mobile device number associated with the user, and the like. The system may send a text message, email, phone call, or other signal to the mobile device number to request access to the user's persona template. The template can be incorporated with the vehicle and settings associated with the vehicle may be updated prior to the user reaching the vehicle. Using this method, administrative assistants can reserve a vehicle on behalf of another individual and link the user template to the reservation based on the user's mobile device phone number. The mobile device can act as the key. For instance, a phone number may be embedded in the private key of the mobile device. The public key may be presented to the rental car. In some cases a combination of phone number and device ID (e.g., EIN, MAC address, etc.) may be used as part of the authentication.
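The phone-number-plus-device-ID authentication suggested above might be sketched with a keyed hash, as below; the HMAC construction and shared secret are assumptions introduced for illustration and are not the disclosed key scheme:

```python
import hmac
import hashlib

def device_token(phone_number, device_id, secret):
    """Stand-in for the phone-number-plus-device-ID authentication in
    [0659]; the construction is an assumption for illustration only."""
    message = ("%s:%s" % (phone_number, device_id)).encode()
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def vehicle_accepts(presented_token, phone_number, device_id, secret):
    """The rental vehicle recomputes the token and compares in
    constant time before unlocking and applying the persona template."""
    expected = device_token(phone_number, device_id, secret)
    return hmac.compare_digest(presented_token, expected)
```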
[0660] The vehicle control system can apply rule sets or templates to trigger
notifications and other actions of the vehicle control system and/or display artificial or computational intelligence based on observing the behavior of a selected person over time. In any of the embodiments disclosed herein, the vehicle control system and/or mobile device may determine patterns, such as, destinations, purchases, etc. to categorize data, advertising, and the like. The artificial or computational intelligence can, for example, have symbolic, sub-symbolic, or hybrid components and/or modules. Examples include neural networks (e.g., acyclic or feedforward neural networks and recurrent neural networks), fuzzy systems, and evolutionary computation.
[0661] An example of a home page 3000 displayed by a device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 is shown in Fig. 30. The layout of the home page 3000 is defined by the user profile. When the user is signed in as a guest user, a default layout is provided. The various objects in the layout can be configured as widgets or a lightweight version of an application or item of functionality.
[0662] The home page 3000 includes, in an upper area 3004 of the screen, user selectable icons for various applications (which if selected open or initiate or recall the application), such as navigation ("Nav" icon) 3016, media ("Media" icon) 3018, phone ("Phone" icon) 3020, rear view (to provide the driver with a video stream from a backup camera) ("Rear View" icon) 3022, other applications 3024, and "my dash" (to provide the user with a display having typical dash indicators, such as speedometer, tachometer, oil pressure, warning lights, battery charge indicator, fuel level, and the like) 3026; an upper information bar 3028 including a greeting to the identified driver 3030, satellite reception indicator 3032, WiFi connectivity indicator 3034, cellular connectivity indicator 3036, weather information 3038, and date 3040; in a central area 3008 of the screen, various types of information, such as navigation input fields 3042 (e.g., a "set destination" field (which is a drop target into which location-enabled objects can be dragged and dropped) to provide guidance from the user's current user inputted or satellite-based location to the inputted destination), a "set waypoints" field 3044 (which is a drop target field into which any location-enabled object can be dragged and dropped to set an ordering of waypoints along a selected or inputted route), suggested locations 3046 (based on upcoming events in the electronic calendar of the user and/or based on the user's preferences, likes, schedule, and observed habits and other persona information, and which therefore have location information, such as address information inputted into the calendar entry and/or satellite-based coordinates based on prior driving history) to be dragged and dropped into the "set destination" and/or "set waypoints" field, favorite locations 3048 (which is a list of user-defined favorite locations and which can also be dragged and dropped into the "set destination" and/or "set waypoints" field, and which therefore have corresponding location information, such as satellite-based geographic coordinates or addresses, entered in field(s) associated with the respective favorite locations (a home or work address is an example of a user selected favorite location)), recent locations 3050 (which refer to locations visited within a user-defined time interval of the present time and/or at least a predetermined number of times over a user-defined time interval and which can also be dragged and dropped into the "set destination" and/or "set waypoints" field, and which therefore have corresponding location information, such as satellite-based geographic coordinates or addresses, entered in field(s) associated with the respective recent locations), "people" 3052 (which can be the contacts of the user or other list of friends and which provide names, contact information, and addresses for people having a corresponding contact profile, which profile can also be dragged and dropped into either the "set destination" or "set waypoints" field), and points of interest 3054 (which are points of interest or potential interest to the operator, such as food serving locations, fuel stations, lodging facilities, parking, attractions, health care facilities, historical landmarks, and the like, and which can also be dragged and dropped into either the "set destination" or "set waypoints" field, and which therefore have corresponding location information, such as satellite-based geographic coordinates or addresses, entered in field(s) associated with the corresponding point of interest), a display of calendar appointments for a selected day 3056, weather information for a selected day (e.g., temperatures, precipitation, wind speed and direction, atmospheric condition (e.g., sunny, partially sunny, cloudy, partly cloudy, etc.)) 3058, and received and/or sent messages (e.g., emails, voice mails, and/or texts) 3068 from friends (which dropdown menu can include family, business contacts, and other user defined categories of contacts); and, in a lower area 3012 of the screen, a climate control icon ("Climate" icon) for a climate control application 3060, a controls icon ("Control" icon) to control vehicle tasks, functions or operations 3062, speaker volume settings (shown by the speaker and triangle images) 3064, and controls 3066 and 3070 to the left and right of the speaker image.
[0663] When a suggested, favorite, recent location, people, or points of interest object or object identifier is dragged and dropped by a sensed gesture or stylus or mouse into the "set destination" or "set waypoint" fields, the navigation subsystem 336, based on the current satellite-based location of the vehicle, automatically determines a route from the vehicle's current location to the stored location associated with the dragged and dropped object. As more objects are dragged and dropped into either the "set destination" or "set waypoint" fields, the route is altered to accommodate the various associated locations. A traceroute on a map and/or directions can be provided to the vehicle operator. The traceroute or directions can be updated to show the vehicle's current location. The traceroute or directions can be altered to reflect traffic and/or road conditions (when the user selects "quickest" route), surface street or highway preferences, roadside aesthetics (when the user selects "most scenic" route), and the like. This information can be displayed by the user selection of the "Nav" icon 3016.
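The drop-target behavior can be sketched as a small data structure in which each drop updates the destination or waypoint list and triggers a route recomputation; the placeholder recompute step below merely stands in for the navigation subsystem 336, and all names are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Route:
    """Sketch of the "set destination"/"set waypoints" drop targets."""
    current_location: tuple
    destination: tuple = None
    waypoints: list = field(default_factory=list)

    def drop(self, target, coords):
        # Dropping a location-enabled object updates the field and
        # triggers recomputation of the route (navigation subsystem 336).
        if target == "set_destination":
            self.destination = coords
        elif target == "set_waypoints":
            self.waypoints.append(coords)
        return self.recompute()

    def recompute(self):
        # Placeholder for the navigation engine: return the ordered stops.
        stops = [self.current_location] + list(self.waypoints)
        if self.destination is not None:
            stops.append(self.destination)
        return stops

route = Route(current_location=(39.74, -104.99))
route.drop("set_waypoints", (39.75, -105.00))    # e.g., a favorite location
route.drop("set_destination", (39.80, -105.10))  # e.g., a calendar event
```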
[0664] The screen can provide a map in the background to assist navigation configuration. As the user configures the "set destination" and "set waypoint" fields, the traceroute can be updated on the background map. In other words, the various location-enabled objects and the "set destination" and "set waypoint" fields are overlays of the map.
[0665] As noted, the screen configuration and content, such as the arrangement of the various icons and displayed information, can be rearranged or reconfigured by the user to reflect his or her needs or desires.
[0666] Because the screen content is complex and/or content-heavy and can distract the operator when driving, the screen may be altered when the vehicle is in a driving mode relative to a parked mode. In a driving mode, the wheels are in motion and the gear shift is out of the "parked" position. In the parked mode, the wheels are not in motion and the gear shift is in the "parked" position. The home screen of Fig. 30 therefore can be closed automatically on the driver's console display and disabled when the vehicle is in the driving mode and opened automatically and enabled when the vehicle is in the parked mode.
[0667] Certain applications can be launched automatically when the vehicle is in the driving mode. An example would be the navigation application so that the console display shows the display for the navigation application instead of the home screen of Fig. 30. One or more other applications can be selected by the user as the display in driving mode.
[0668] Alternatively or additionally, when the vehicle is placed in reverse gear with the home screen active, the rear- view application can be launched automatically to display, in the rear view application display, live video from one or more rear cameras on the back of the vehicle. When the vehicle is placed in forward gear, the rear view application can be automatically closed to terminate the rear view application display and the navigation application automatically opened to provide the navigation display.
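A minimal sketch of this gear-driven display switching follows; the Gear enum and Console class are illustrative assumptions only, not part of the disclosed system.

```python
# Hedged sketch of the mode-dependent display behavior described above.
from enum import Enum, auto


class Gear(Enum):
    PARK = auto()
    REVERSE = auto()
    FORWARD = auto()


class Console:
    def __init__(self):
        self.active_app = "home"

    def on_gear_change(self, gear: Gear) -> None:
        if gear is Gear.PARK:
            self.active_app = "home"        # home screen enabled when parked
        elif gear is Gear.REVERSE:
            self.active_app = "rear_view"   # live video from rear camera(s)
        else:
            self.active_app = "navigation"  # default driving-mode application
        print(f"driver console now shows: {self.active_app}")


console = Console()
console.on_gear_change(Gear.REVERSE)   # rear-view application launched
console.on_gear_change(Gear.FORWARD)   # rear view closed, navigation opened
```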
[0669] The operation of the profile identification module 848 will be discussed with reference to Fig. 36.
[0670] In step 3600, the profile identification module 848 detects a stimulus, such as the vehicle changing operational state, expiration of a selected time period, a door opening or closing, and the like.
[0671] In step 3604, the profile identification module 848 selects a current vehicle occupant, which can be the driver or a passenger.
[0672] In step 3608, the profile identification module 848 determines a seating position of the selected vehicle occupant; that is, the module 848 determines which of the driver's seat, front passenger seat, driver's side rear passenger seat, middle rear passenger seat, and passenger's side rear passenger seat the selected vehicle occupant is occupying. This can be done by the user identification module 822.
[0673] In step 3612, the profile identification module 848 determines the types of information to collect for the persona of the selected occupant. The type of information to be collected can vary by occupant identity, age of occupant, an association of the occupant with the vehicle, and/or occupant seating position. For example, the driving history during the recent vehicle use may be collected for a driver but not a passenger. The driving history during the recent vehicle use may be collected for an adult passenger but not for a child passenger. The driving history during the recent vehicle use may be collected for a vehicle owner but not for a guest driver or driver having a default profile.
[0674] In step 3616, the profile identification module 848 collects the selected types of information.
[0675] In step 3620, the profile identification module 848 determines whether there is a next occupant to be selected. If so, the profile identification module 848 returns to step 3604. If not, the profile identification module 848 returns to step 3600 and awaits the next stimulus instance.
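The loop of steps 3600-3620 can be rendered as the following sketch; the injected callables are hypothetical stand-ins for the internals of the profile identification module 848.

```python
# An illustrative rendering of the Fig. 36 loop (steps 3600-3620). The
# helper callables are assumptions, not identifiers from the patent.
def on_stimulus(occupants, seating_position_of, info_types_for, collect):
    """Run one pass of persona collection after a detected stimulus."""
    for occupant in occupants:                    # steps 3604 / 3620
        seat = seating_position_of(occupant)      # step 3608
        types = info_types_for(occupant, seat)    # step 3612
        collect(occupant, types)                  # step 3616


def demo_info_types(occupant, seat):
    # Collected types vary by identity, age, association, and seat position.
    return ["driving_history"] if seat == "driver" else ["media_preferences"]


on_stimulus(
    occupants=["alice", "bob"],
    seating_position_of=lambda o: "driver" if o == "alice" else "front_passenger",
    info_types_for=demo_info_types,
    collect=lambda o, t: print(f"collect {t} for {o}"),
)
```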
[0676] An operation of the vehicle control system 204 will be discussed with reference to Figs. 37A-B.
[0677] In step 3700, the vehicle control system 204 detects a stimulus, such as the vehicle changing operational state, expiration of a selected time period, a door opening or closing, and the like.
[0678] In step 3704, the vehicle control system 204 selects a current vehicle occupant.
[0679] In step 3708, the vehicle control system 204 determines a seating position of the selected occupant.
[0680] In step 3712, the profile identification module 848 accesses, at the request of the vehicle control system 204, the persona of the selected occupant.
[0681] In step 3716, the vehicle control system 204 determines vehicle related information. This information typically includes any information collected by an interior or exterior vehicle sensor 104, 242, including a satellite navigation system, such as GPS 488.
[0682] In step 3720, the vehicle control system 204 determines other relevant types of information, such as the information referenced above in the examples and illustrations. The other relevant types of information can include, for instance: a persona of a selected person not currently in the vehicle; an instant message, email, voice mail, or missed phone call from, or the current activity and/or current location of, a friend and/or family member of the selected vehicle occupant; the location, hours of operation, and/or descriptive information about a point and/or location of interest near the vehicle and/or its path of travel; the location, hours of operation, and/or descriptive information about a vehicle service facility near the vehicle and/or its path of travel; the location, hours of operation, and descriptive information about a hotel and/or motel near the vehicle and/or its path of travel; a current location of the friend or family member near the vehicle and/or its path of travel; a road condition, such as traffic, road construction, or other delay, near the vehicle and/or its path of travel; and the like. This information can be collected from the server 228 via the communication network 224.
[0683] In decision diamond 3724, the vehicle control system 204 determines whether or not to perform an action. This determination can be based on a rule-based or template- based analysis of the persona of the selected occupant, vehicle related information, and other relevant types of information. The particular rules selected to apply to the collected information to determine whether or not to perform an action and what action to perform can depend on the identification of the selected vehicle occupant and/or seating position of the selected vehicle occupant. Actions that can be performed include those referenced above, including making an appointment or reservation, purchasing an item on line, adding a waypoint to a path of travel on a navigation system, adding an entry into the selected occupant's calendar, changing a destination or path of travel on a navigation system, warning or notifying the occupant, and sending a message to a person regarding an arrival time at a waypoint or destination.
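One plausible realization of this rule-based analysis is a table of rule functions evaluated against the collected context, as in the sketch below; the context fields and example rules are assumptions for illustration, not the patent's API.

```python
# A rule-table sketch of decision diamond 3724.
from typing import Callable, Dict, List, Optional

Rule = Callable[[Dict], Optional[str]]   # returns an action name or None


def low_fuel_rule(ctx: Dict) -> Optional[str]:
    # Suggest adding a fuel-station waypoint, but only for the driver.
    if ctx["seat"] == "driver" and ctx["vehicle"]["fuel_level"] < 0.1:
        return "add_fuel_station_waypoint"
    return None


def late_arrival_rule(ctx: Dict) -> Optional[str]:
    # Offer to message a contact when the ETA slips past an appointment.
    if ctx["eta"] > ctx["persona"].get("next_appointment", float("inf")):
        return "send_arrival_delay_message"
    return None


def decide_actions(ctx: Dict, rules: List[Rule]) -> List[str]:
    """Apply occupant- and seat-specific rules to collected information."""
    return [action for rule in rules if (action := rule(ctx)) is not None]


ctx = {"seat": "driver", "vehicle": {"fuel_level": 0.05},
       "eta": 1800, "persona": {"next_appointment": 1500}}
print(decide_actions(ctx, [low_fuel_rule, late_arrival_rule]))
```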
[0684] When the vehicle control system 204 determines not to perform an action, the vehicle control system 204, in decision diamond 3728, determines if there is a next occupant to be selected in the vehicle. If not, the vehicle control system returns to step 3700 and, if so, to step 3704.
[0685] When an action is to be performed, the vehicle control system 204 in step 3732 optionally prompts the occupant whether or not the action is to be performed.
[0686] When instructed, the action is performed in step 3736.
[0687] In step 3740, the profile identification module 848 updates the persona of the selected vehicle occupant to reflect the action performed and the surrounding context.
This can be important for the vehicle control system 204 creating new rules describing the likely behavior of the selected vehicle occupant.
[0688] In decision diamond 3728, the vehicle control system 204 determines if there is a next occupant to be selected in the vehicle. If not, the vehicle control system returns to step 3700 and, if so, to step 3704.
[0689] Operation of the Device Discovery Daemon 1020
[0690] The device discovery daemon 1020 can discover automatically computational devices within the vehicle that connect or attempt to connect to network 356 or communication subsystem 1008 or that disconnect or attempt to disconnect from the network 356 or communication subsystem 1008. The device discovery daemon 1020 discovers automatically computational devices, such as portable user communication devices, located within the vehicle and connects, wirelessly, the portable user
communication device with the network 356 or communication subsystem 1008.
[0691] The device discovery daemon 1020 may intercept emitted signals from one or more devices in or about the vehicle to pair a device with the network 356 or
communication subsystem 1008. Rather than requiring an active pair handshake, the network 356 or communication subsystem 1008 may utilize certain receivers to "listen" for cell tower registration signals, sent messages, sent packets (packet sniffing), etc. From this information, the device discovery daemon 1020 may isolate a MAC address, or other identifier, associated with a device and register the device with the vehicle, a vehicle zone, a user, etc. In some cases, and upon detecting a device signal, the device discovery daemon 1020 may request permission from a user before pairing the device. In one example, the pairing may be initiated by the device discovery daemon 1020 (upon a first registration) to a user's device. Subsequent pairings may be initiated by a user's device to the network 356 or communication subsystem 1008. One or more of Bluetooth, Near Field Communications (NFC), and other protocols may be used to pair a device with the network 356 or communication subsystem 1008.
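The passive-listening pairing flow might be organized as follows; sniff_frames and the other callables are hypothetical hooks, and no real packet-capture library API is implied.

```python
# A sketch of the passive "listen" pairing described above. sniff_frames
# is a hypothetical source of intercepted frames (cell registrations,
# sniffed packets, etc.).
def discover_passively(sniff_frames, known_devices, ask_user_permission,
                       register):
    """Register new devices from intercepted signals, with user consent."""
    for frame in sniff_frames():
        mac = frame.get("src_mac")            # isolate a device identifier
        if mac is None or mac in known_devices:
            continue
        # A first registration may require explicit user permission;
        # subsequent pairings can be initiated by the user's device.
        if ask_user_permission(mac):
            register(mac, zone=frame.get("zone"), user=frame.get("user"))
            known_devices.add(mac)


frames = [{"src_mac": "a4:5e:60:01:02:03", "zone": "zone_2", "user": "bob"}]
seen = set()
discover_passively(lambda: iter(frames), seen,
                   ask_user_permission=lambda mac: True,
                   register=lambda mac, zone, user: print("paired", mac, zone))
print(seen)
```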
[0692] Fig. 20 depicts an operation of the device discovery daemon 1020.
[0693] In this operation, the device discovery daemon 1020 detects a computational device attempting to connect to or otherwise connectable with the vehicle network 356 or communication subsystem 1008 and, applying rules, determines when it is appropriate to connect with the computational device. The computational device may be a module installed on a backplane, a vehicle component and/or module that is attempting to be installed, a portable user communication device, and the like. The daemon can ping the local communication network each time the vehicle starts or each time a vehicle door opens or closes. The daemon can perform a Bluetooth™ and WiFi™ device discovery as well. For each communication device that responds to the ping, the daemon can use plural protocols, such as HOPOPT, ICMP, IGMP, GGP, IP, ST, TCP, EGP, CHAOS, and UDP, to populate a device's folder in the device data with device capabilities. Device capabilities are determined by the opened port(s) the device exposes.
[0694] The device discovery daemon 1020, in conjunction with the combo controller can determine a spatial location of the computational device as a prerequisite to permitting the computational device, even if properly authenticated, to connect to the vehicle network 356 or communication subsystem 1008. The spatial location can be required to be in an area 508 or set of areas 508 and/or zone or set of zones. Depending on the type of computational device, the computational device, for example, may be required to be in the area two 508B for connection to be accepted. This can be the case for a portable personal communication device, such as a wireless or cellular phone, tablet computer, personal digital assistant, laptop, and the like. For other types of computational devices, the computational device, for example, may be required to be in area one 508A (such as the engine space) and/or area N 508n (such as the trunk). Thus, different types of
computational devices can have different location requirements as a prerequisite for connectivity to be enabled or accepted. This can be the case for an engine control module, vehicle sensor, or other device that controls or senses a vehicle task, function or operation that, for instance, either connects wirelessly or by a wire-line connection, to the red zone 417, green zone 413, or I/O backplane on the network bus 408.
[0695] In step 2000, the device discovery daemon 1020 detects a computational device connectable with the vehicle network 356 or communication subsystem 1008.
"Connectable" can refer to the computational device being physically within the vehicle, the computational device having a threshold signal strength or connection quality with the vehicle network, the computational device being configured properly to connect with the vehicle network, and the like. "Connection" can be done either wirelessly or by wired connection.
[0696] Detection can be done by pinging the computational device and/or receiving a ping from the computational device. As an example, to connect to a Wi-Fi LAN a computational device must be equipped with a wireless network interface controller. The combination of a computer, such as the computer in the computational device, and interface controller is called a station. All stations share a single radio frequency communication channel. Transmissions on this channel are received by all stations within range. The hardware does not signal the user that the transmission was delivered and is therefore called a best-effort delivery mechanism. A carrier wave is used to transmit the data in packets, referred to as "Ethernet frames". Each station is constantly tuned in on the radio frequency communication channel to pick up available transmissions.
[0697] Device detection can be in response to or triggered by a sensed event other than receipt of a ping from the computational device. As noted, device detection can be triggered by receipt of information from one or more on board sensors indicating that a new occupant has entered the vehicle. Exemplary information includes a door opening or closing, a successful authentication of an occupant or computational device, a sensed load in a seat, detection of movement within the vehicle, and detection of initiation of a vehicle task, function or operation, such as a key inserted in an ignition, engine start up, and the like.
[0698] In step 2004 and as noted above, the device discovery daemon 1020 accesses configuration and other information and capabilities of the connected computational device. This information can include device type and/or class, manufacturer name, product and/or device identifier or name, firmware identification, services running, device operating system, network address(es), capabilities, user credentials, and the like. The device discovery daemon 1020 can determine presence of the computational device on any input channel, such as Ethernet, USB, WiFi, Bluetooth, and the like.
[0699] In decision diamond 2008, the device discovery daemon 1020 determines whether or not the computational device is recognized and/or authenticated properly. Recognition can be based on any of the configuration and other information and is deemed to occur when selected fields of the information map to and are the same as similar fields previously encountered and/or recorded in system data 208 by the device discovery daemon 1020. Authentication is deemed to occur when the credentials match stored credentials for the recognized computational device.
[0700] When the computational device is not recognized and/or authenticated successfully, the device discovery daemon 1020, in decision diamond 2012, determines whether the computational device is in a predetermined area or zone of the vehicle for the type of computational device corresponding to the computational device. As will be appreciated, some wireless protocols, particularly WiFi™, use an access point (or hotspot) having a defined range, which can be commensurate with the area of the vehicle passenger cabin. The passenger cabin can be surrounded by walls that substantially block radio waves to prevent computational devices external to the passenger cabin from connecting to the network. Multiple overlapping access points can be used to cover larger passenger cabins, such as buses, trains, planes and the like. This determination can be based on received signal strength from the computational device, a satellite-based position of the computational device compared to a satellite-based position of the vehicle or sub-area of the passenger cabin, triangulation based on the relative received signal strengths of multiple access points, camera feedback or image processing of camera or video feed of the passenger compartment, occupant information received from other sensors such as occupant location or presence information received from an infrared sensor 740, motion sensor 744, weight sensor 748, biometric sensor 756, camera sensor 760, audio sensor 764, and associated device sensor 720, user input and the like. User input can be received by a screen pop-up on the computational device requesting the user to designate whether or not and what vehicle the user is currently in and/or what seat position the user has in that vehicle.
[0701] A level of confidence can be assigned by the device discovery daemon 1020 to the computational device indicating a likelihood or probability that the computational device is located within the passenger compartment. This level of confidence can be based on consideration and analysis of several of the factors identified above. When multiple factors indicate that the user is within the passenger compartment, a higher likelihood is assigned to that determination; likewise, when multiple factors indicate that the user is outside of the passenger compartment, a lower likelihood is assigned to the determination. When no or an unsatisfactory response is received from the user in response to a request for user input and/or the level of confidence is below a selected threshold, the user is deemed to be located outside of the predetermined area and/or zone.
[0702] A further spatial determination can be made by the device discovery daemon 1020 based on how the computational device connected to the automotive network. If the computational device is hard wire connected, or plugged in, to the automotive network (which can be determined by known techniques), the spatial location of the computational device can be assumed to be within the vehicle regardless of other location mechanisms employed by the device discovery module.
[0703] A further spatial determination can be made by the device discovery daemon 1020 based on whether or not the computational device has moved within a defined time interval. Movement indicates that the computational device may not be within the vehicle, as a computational device within the vehicle is less likely to move than a computational device located outside of the vehicle, where the person carrying the computational device is often moving relative to the position of the vehicle.
[0704] A further spatial determination can be made by the device discovery daemon 1020 based on whether the signal strength or signal-to-noise ratio associated with a ping or other signal from the computational device varies significantly over a selected time interval. The signal-to-noise ratio associated with signaling from a computational device located within the vehicle would remain relatively constant while the signal-to-noise ratio associated with signaling from a computational device located outside of the vehicle would likely vary significantly as the person associated with the computational device moves relative to the vehicle and/or network 356 or communication subsystem 1008.
[0705] A further spatial determination can be made by the device discovery daemon 1020 based on an analysis of signal characteristics. Signal characteristics can include, but are not limited to: signal attenuation, where a signal with shrinking or increasing strength may be determined to be moving in relative proximity or position to the vehicle; any Doppler shift in the frequency, which may indicate movement in reference to the vehicle and/or network 356 or communication subsystem 1008; and any delay between receiving the same signal at the various transceivers 260 and/or sensors 242A-N. A difference in the time of receipt can be used to triangulate where the signal originated and whether that location is outside or inside the vehicle.
[0706] A further spatial determination can be made by the device discovery daemon 1020 based on an analysis of location information. Beyond the signal characteristics, the device discovery daemon 1020 may receive information from one or more sensors 242A-N to determine a location of the vehicle. If the location of the vehicle is in an area where there is not a likelihood of signal congestion, for example, in the driveway of someone's home, then all received signals may be determined to have originated inside the vehicle. Thus, as signals are received and if the location has changed, the device discovery daemon 1020 may determine whether the current location is an area where more of the received signals may originate outside the vehicle or whether the signals received have changed.
[0707] Analysis of the person sending the signal may also be used by the device discovery daemon 1020. Thus, the device discovery daemon 1020 may access historical signal data (which is a record of devices that have previously been granted access to the vehicle communications network) to determine if the signals have been received from this device or from this person before. Thus, the signal may identify a person documented in the signal data, and the device discovery daemon 1020 may determine if that person has used or connected with the device discovery daemon 1020 previously. Further, the device discovery daemon 1020 can determine if there is movement of the vehicle. If a signal remains within the car after the vehicle moves, then that signal can be determined to be inside the vehicle. For example, if the signal is received at the beginning of a route and then at some time thereafter the signal continues to be received, then it is determined that the signal may be inside the vehicle.
[0708] Further, sensor data may be analyzed by the device discovery daemon 1020. Sensor data may include such things as whether there are people within the car and, if so, how many. Thus, if there are three people in the vehicle and three signals are received, all three signals may be determined to be inside the vehicle. Further, it may be possible for the sensors 242 to determine if a device is currently being used inside the vehicle. For example, if an optical sensor can view a device within its field of vision and/or if an electromagnetic field sensor determines that there is EMF radiation emanating from a location in the vehicle, then the device discovery daemon 1020 can determine that the signal is originating inside the vehicle.
[0709] A further spatial determination can be made based on the type (or service) of computational device attempting to interact with the automotive network. If the computational device is not the type (or service) of device expected to be positioned within the passenger compartment, its attempt to connect to the automotive network can be disregarded completely. If the computational device is the type (or service) of device that can be connected to the vehicle network but is not the type (or service) of device expected to be associated with an occupant (such as a non-infotainment vehicle task, function, or operation control module, sensor module, a module plugged into a backplane (such as an XM radio module), or other type of device not permitted or expected to be associated with an occupant), the spatial determination may be deemed to be satisfied by the computational device being located somewhere within the vehicle even if outside of the passenger compartment. Conversely, if the computational device is the type (or service) of device that would normally be carried by a vehicle occupant, the spatial location query would be satisfied only when the computational device location were within the passenger compartment. This would prevent an unauthorized computational device, such as a cell phone or tablet computer, from connecting to the automotive network simply by being held above, beside, or beneath the vehicle. The mapping of type of computational device to permissible location (e.g., area and/or zone) within the vehicle can be effected in a look up table or other set of rules.
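The type-to-permissible-location mapping can be expressed as a simple lookup table, as in this sketch; the device types, area names, and placements are illustrative assumptions.

```python
# Lookup-table sketch of the type-to-permissible-location mapping above;
# area names follow the 508A/508B/508N discussion.
PERMITTED_AREAS = {
    "phone": {"passenger_cabin"},
    "tablet": {"passenger_cabin"},
    "engine_control_module": {"engine_space"},
    "xm_radio_module": {"passenger_cabin", "trunk"},  # assumed placement
}


def location_requirement_met(device_type: str, detected_area: str) -> bool:
    """Occupant devices must be inside the cabin; others merely in-vehicle."""
    allowed = PERMITTED_AREAS.get(device_type)
    if allowed is None:
        return False        # unexpected device types are disregarded
    return detected_area in allowed


print(location_requirement_met("phone", "passenger_cabin"))  # True
print(location_requirement_met("phone", "roof"))             # False
```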
[0710] The device discovery daemon 1020 may receive one or more of these analyses and resolve the information. Thus, device discovery daemon 1020 can cross-correlate information from different analyses to determine if the signal is within the vehicle.
Different weight may be given to different analyses to make a determination about where the signal originates. In this way, a more robust decision is made as to whether or not the signal originates in the vehicle and should be allowed to connect to the universal bus or the routing system of the vehicle.
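A weighted combination of the individual analyses might look like the following sketch; the analysis names, weights, and threshold are illustrative assumptions, not calibrated values.

```python
# Weighted cross-correlation sketch: each analysis yields a probability
# (0..1) that the device is inside the vehicle.
WEIGHTS = {
    "signal_strength": 0.25,
    "snr_stability": 0.20,
    "triangulation": 0.25,
    "occupant_sensors": 0.20,
    "historical_data": 0.10,
}


def inside_vehicle_confidence(scores: dict) -> float:
    """Combine per-analysis scores into one confidence level."""
    total = sum(WEIGHTS[k] * scores.get(k, 0.5) for k in WEIGHTS)
    return total / sum(WEIGHTS.values())


scores = {"signal_strength": 0.9, "snr_stability": 0.8,
          "triangulation": 0.7, "occupant_sensors": 1.0}
conf = inside_vehicle_confidence(scores)
print(f"confidence: {conf:.2f}", "-> connect" if conf > 0.6 else "-> deny")
```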
[0711] While this logic is described with respect to the location of the computational device, it is to be understood that this determination is optional. Whether or not a communication device is enabled to connect to the vehicle network 356 or communication subsystem 1008 can be based solely on successful authentication. In this way, the vehicle on board computer can connect automatically to the owner's home virtual private network to upload and/or download information, settings, and other information (such as user input into the vehicle computer, vehicle driving history (e.g., miles traveled, travel traceroutes, speeds traveled, and locations visited), vehicle service information (such as gas and fluid levels, engine problems, alarms or warnings activated, and the like), input received by on board applications from the user (such as scheduled appointments, notes, documents, and the like), applications downloaded, and the like). Synchronization of the on board vehicle and home computer can occur automatically whether the vehicle is turned on or off. This can be highly beneficial when the vehicle is parked in the garage.
[0712] When the computational device is recognized and/or authenticated and/or when the computational device is determined to be within a proper area and/or zone of the vehicle for the device type of the computational device, the device discovery daemon 1020, in optional step 2016, determines a spatial location and/or seat position of the computational device to determine appropriate rights and/or privileges and/or restrictions of the user of the computational device as defined by a user account or profile associated with the user, the device type of the computational device, the seat position of the user, or the area or zone in which the user and/or computational device is located.
[0713] In step 2020, the device discovery daemon 1020 permits or enables connection of the computational device with the vehicle network 356 or communication subsystem 1008 and optionally stipulates or defines what set of tasks, functions, and/or operations the user of the computational device can perform using the computational device, such as based on the location of the computational device within the vehicle and/or based on the
authentication credentials (e.g., the identity of the computational device user).
[0714] When the computational device is determined to be outside of a defined area and/or zone of the vehicle (e.g., outside the passenger compartment or cabin) and the device discovery daemon 1020 does not recognize and/or authenticate the computational device successfully, the device discovery daemon 1020, in step 2024, denies access of the computational device to connect to or access the vehicle network 356 or communication subsystem 1008.
[0715] An example of the device discovery daemon 1020 operation will now be discussed. Assume that a vehicle owner purchases an XM radio module off the shelf from a vendor. The user then plugs in the module to the vehicle network. In attempting to enable the functionality of the XM radio, the device discovery daemon 1020 first attempts to recognize and/or authenticate the module. During the process, the XM radio module can be permitted to communicate through the vehicle network or an antenna internal to the module with a remote website to confirm that the XM radio was purchased by the vehicle owner and/or perform licensing validation checks. The vehicle owner's XM radio account is likely already associated with the XM radio module when purchased.
[0716] When the XM radio module is plugged into a backplane (Fig. 4) in the vehicle network 356 or communication subsystem 1008, the trigger event occurs, and the device discovery daemon 1020 and combo controller commence operation. After device discovery is completed successfully by the device discovery daemon 1020, the combo controller downloads, from the user's XM account, the user's XM preferences (e.g., favorites) and causes the XM functionality in the vehicle to be configured accordingly. During installation, the combo controller can provide an installation-in-process dialog to the user and, upon successful completion, an installation-complete dialog. After
acknowledging that the installation is complete, a new application icon for the XM radio appears in the upper area of the screen of Fig. 30. The new application icon can have a different appearance than other previously installed application icons to inform the user that the new application is available. When the user launches the new application, the application screen is shown. The application screen of any application depends on the application and user preferences.
[0717] The environment 400 of Fig. 4 can interface with three different types of blade processors. A first type of blade processor is installed prior to sale by a manufacturer or vendor. These blade processors are in a manufacturer "crate" and are not replaceable or upgradeable by the customer but only by a certified service facility. A second type of blade processor can be installed post-sale by or on behalf of the customer. These blade processors are in a customer "crate". A third type of blade processor interfaces with the first and second types of blade processors. These blade processors are in an accessory "crate". The blade processors generally refer to executable instructions executing on a microprocessor. Each blade typically includes a corresponding set of functions (such as an 802.11 WiFi™ interface, Bluetooth™ radio, cellphone radio, storage unit, or satellite positioning system (such as GPS)), a flash memory, a microprocessor, and a network interface. Typically, the function is software or a combination of hardware and software. Examples include a media controller blade, video controller blade, audio controller blade, profile identification module blade, and the like. This configuration may also be used for a wide variety of other software, such as applications, drivers, and the like, and devices, such as an infotainment system, satellite receiving system, and the like, and combinations thereof. For example, an aftermarket infotainment system upgrade or component and/or module can be installed by the customer. The infotainment system upgrade or component and/or module searches for, locates, and connects to a suitable antenna installed prior to sale. The output of the blade processor is an IP message framed into an Ethernet packet. The blade processor can be compatible with the Controller Area Network ("CAN") protocol. The CAN control system, which was intended for sensor feedback and controller control signaling through a relatively slow, serial bus, can be integrated into the blade processor architecture by terminating CAN buses in one or more Ethernet bus controllers. The output is configured for a discrete Ethernet-based sensor or controller, with the translation to the CAN protocol being made at a CAN Ethernet controller subsystem.
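The CAN-to-Ethernet termination described above can be illustrated by a small framing sketch; the frame layout and packing are assumptions for illustration, not the actual format used by the CAN Ethernet controller subsystem.

```python
# Conceptual sketch of wrapping a CAN frame for IP/Ethernet transport.
import struct


def can_to_ip_payload(can_id: int, data: bytes) -> bytes:
    """Pack a CAN frame (id plus up to 8 data bytes) for IP transport."""
    assert len(data) <= 8
    header = struct.pack("!IB", can_id, len(data))  # id + length prefix
    return header + data


def ip_payload_to_can(payload: bytes):
    """Unpack the IP payload back into (can_id, data)."""
    can_id, length = struct.unpack("!IB", payload[:5])
    return can_id, payload[5:5 + length]


packet = can_to_ip_payload(0x2A0, b"\x01\xF4")   # e.g., a sensor reading
print(ip_payload_to_can(packet))                  # (672, b'\x01\xf4')
```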
[0718] A "crate" " commonly has a backplane connector and on-backplane Ethernet interfaces that enable blade processors to communicate with one another using Ethernet messages. The crate can be a USB hub have a series of ports to permit devices to connect to the USB hub; therefore, each "crate" generally has a corresponding USB hub and ports connected thereto.
[0719] A master blade processor can be provided. The master blade processor can be substantially the same as the other blade processors except for scheduling and control functionality. The master blade processor can include a service manager that inventories hardware and software capabilities of components and/or modules available (or connected) through hard wire connections within the vehicle and/or computational devices connected wirelessly to the vehicle local area network (e.g., through a wireless access point or local hot spot maintained by the vehicle control system). The output of the inventory step is a network topology comprising assigned network addresses, component and/or module descriptions and capabilities, operational status, and "next hop" addresses to indicate which components and/or modules are adjacent to one another. An application resource table can be maintained that maps application identity, functional description, and/or application type against computational resource requirements and/or capabilities. Requests to the master blade processor allow various applications to access distributed services or automatically disable features not supported by the vehicle's computer system.
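A toy rendering of the network topology and application resource table might look like this; all field names, addresses, and capability strings are illustrative assumptions.

```python
# Sketch of the service inventory and application resource table above.
network_topology = {
    "10.0.0.2": {"module": "media controller", "status": "up",
                 "capabilities": {"audio", "video"}, "next_hop": "10.0.0.1"},
    "10.0.0.3": {"module": "profile identification", "status": "up",
                 "capabilities": {"storage"}, "next_hop": "10.0.0.1"},
}

application_resource_table = {
    "infotainment": {"type": "multimedia", "needs": {"audio", "video"}},
    "navigation":   {"type": "guidance",   "needs": {"storage"}},
}


def supported(app: str) -> bool:
    """An application is enabled only if some live module meets its needs."""
    needs = application_resource_table[app]["needs"]
    return any(needs <= m["capabilities"] and m["status"] == "up"
               for m in network_topology.values())


print(supported("infotainment"))  # True: the media controller satisfies it
```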
[0720] The master blade processor can coordinate a health check for the various components and/or modules in or connected wirelessly to the vehicle's computer system and be responsible for sending keep alive messages (or status queries or interrogation messages sent at periodic intervals) to the various components and/or modules to evaluate health or identify malfunctions. If no response is received within a predetermined time period after keep alive message transmission, a malfunction is assumed to exist.
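The keep-alive cycle can be sketched as follows; send_query and wait_response are hypothetical transport hooks, not the patent's API.

```python
# Keep-alive sketch: flag modules that miss the response deadline.
def health_check(modules, send_query, wait_response, timeout=2.0):
    """Return the list of modules assumed to have malfunctioned."""
    failed = []
    for module in modules:
        send_query(module)                   # keep-alive / interrogation
        if not wait_response(module, timeout):
            failed.append(module)            # no reply: assume malfunction
    return failed


responsive = {"media_controller", "profile_module"}
print(health_check(
    ["media_controller", "profile_module", "video_controller"],
    send_query=lambda m: None,
    wait_response=lambda m, t: m in responsive,
))  # ['video_controller']
```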
[0721] For components and/or modules requiring high availability, the master blade processor can create a high availability set with a required number of blade processors. This table controls the priority of which blades are capable of becoming master blade processors in the event of a malfunction of the current master blade processor. Shadow copies of each high availability component (e.g., device) and/or module (e.g., application) are created and maintained for backup purposes in the event of a malfunction of the corresponding active blade processor. High availability applications write all variables to a file created at application initiation residing in vehicle memory. In the event of a malfunction of a high availability application, the master blade processor reassigns the high availability application to a new blade processor. The application can be reinitiated or refreshed or synchronized (using the shadow file) on the newly assigned blade processor. In this manner, the master blade processor is able to redistribute applications to blade processors and/or stop execution of applications to realize a selected level of performance for the entire vehicle computer system.
[0722] A table can be maintained mapping a network address (e.g., a MAC and/or IPv6 address) of each component and/or module against information about the corresponding component and/or module and/or a pointer to a variable file for the component and/or module in memory. Entries on the table may have a corresponding state of health or be added when the corresponding component and/or module passes a health check or removed when the corresponding component and/or module fails to pass a health check.
[0723] Upon start up, the master blade processor checks the application resource requirement entries in an application resource table and assigns each application to a blade processor having the most applicable and/or available resources. If an application has specific requirements for execution (e.g., a multimedia infotainment application), the application is assigned to and locked to a specific blade processor (such as a multimedia controller).
[0724] An operation of the master blade processor will be discussed with reference to Fig. 38.
[0725] In step 3800, the master blade processor detects a stimulus. The stimulus can be a system boot sequence.
[0726] In step 3804, the master blade processor identifies and inventories enabled on board computational resources (such as each blade processor, IP router, vehicle sensors, database management systems, and the like) and the capabilities thereof. On board components and/or modules are typically physically connected to a network bus.
[0727] In step 3808, the master blade processor identifies and inventories enabled portable computational resources and the capabilities thereof. Portable components, such as tablet computers and smart phones, are typically wirelessly connected, via a wireless access point, to a network bus.
[0728] In step 3812, the master blade processor assigns a network address to each enabled resource.
[0729] In step 3816, the master blade processor maintains and stores the resulting network topology in system data 208.
[0730] In one configuration, a blade processor is not identified as the "master" blade processor prior to system booting. An application executes on a designated blade processor (such as a blade processor handling multimedia system or media subsystem functions) at boot time to detect the presence of a blade processor having specific capabilities. When the blade processor having the specific capabilities is identified, a variable associated with the designated blade processor is changed from "master" to "slave" and the system reboots, thereby transferring system control to the identified master blade processor. A table can be created controlling a priority of which blade processors are capable of becoming master blade processors in the event of master blade processor malfunction. When a master blade processor malfunctions, a next in priority blade processor is selected as the new master blade processor.
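The boot-time election and priority-based failover might be sketched as follows; the blade records and the reboot hook are illustrative assumptions.

```python
# Sketch of master election against an ordered priority table.
def elect_master(blades, priority_table, reboot):
    """Pick the highest-priority healthy blade as master; demote the rest."""
    for blade_id in priority_table:            # ordered, highest priority first
        blade = blades.get(blade_id)
        if blade is not None and blade["healthy"]:
            for other in blades.values():
                other["role"] = "slave"
            blade["role"] = "master"
            reboot()                           # transfer system control
            return blade_id
    raise RuntimeError("no healthy blade available to act as master")


blades = {"media": {"healthy": True, "role": "slave"},
          "video": {"healthy": False, "role": "slave"}}
print(elect_master(blades, ["video", "media"], reboot=lambda: None))  # media
```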
[0731] An operation of the master blade processor will be discussed with reference to Fig. 39.
[0732] In step 3900, the master blade processor detects a stimulus. Exemplary stimuli include determining that a component and/or module has failed a health check, receipt of a request for blade processor resources to execute a component and/or module, and the like.
[0733] In step 3904, the master blade processor determines the requirements for execution of a selected component and/or module, such as set forth in the application resource requirement table. The selected component and/or module can, for example, be a blade processor, software, such as an application, a component, such as a sensor, a satellite receiving system, and the like, and/or a combination thereof.
[0734] In decision diamond 3908, the master blade processor, based on the resource inventory output in step 3808, determines whether a blade processor is available to execute the selected component and/or module. This determination can be done by any number of techniques. It can be based on one or more of a degree of compatibility of the selected component and/or module with each of the blade processors, a current availability of a blade processor to execute the selected component and/or module (e.g., whether the blade processor is currently executing another component and/or module), a relative importance or priority of the selected component and/or module relative to a relative importance of a currently executing component and/or module on a blade processor, whether the blade processor can satisfy the requirements of the selected component and/or module, a current available bandwidth of a blade processor which may be executing another component and/or module, and the like.
[0735] In step 3912, which is performed when a blade processor is available to execute the selected component and/or module, the master blade processor assigns the selected component and/or module to a blade processor for execution. The assignment is recorded in a resource allocation table which maps application identity and/or description against an identifier of the assigned blade processor. If an application has specific requirements for execution (e.g., a multimedia infotainment application or firewall application), the master blade processor may permanently assign the application (or lock the assignment) to a specific blade processor. The master blade processor can, for unlocked assignments, redistribute or reassign components or modules to blade processors or stop components or modules from executing to realize a maximum performance level for the system and/or ensure that critical vehicle, less critical vehicle, or infotainment tasks, functions, or operations are performed.
[0736] By way of example, the master blade processor can stop a lower importance or priority component and/or module from executing in favor of a higher importance or priority component and/or module requiring a blade processor to execute. For example, a critical task, function or operation has a higher level of importance or priority than a less critical vehicle task, function or operation. A less critical vehicle task, function or operation can have a higher level of importance or priority than an infotainment task, function or operation or vice versa.
[0737] The master blade processor, in next step 3916, updates the network topology.
[0738] In step 3920, which is performed when a blade processor is unavailable to execute the selected component and/or module, the master blade processor denies the selected component and/or module permission to execute.
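Steps 3904-3920 can be summarized in a small scheduler sketch; the record layout, capability sets, and priority values are assumptions for illustration.

```python
# Scheduler sketch: assign a component to a capable blade, preempting
# lower-priority work where permitted.
def assign(component, blades, allocation_table):
    """Return the id of the blade chosen to run `component`, or None."""
    for blade in blades:
        if not component["needs"] <= blade["capabilities"]:
            continue                    # blade cannot satisfy requirements
        running = blade["running"]
        if running is not None and running["priority"] >= component["priority"]:
            continue                    # occupied by equal/higher priority work
        if running is not None:
            # Analogous to stopping a lower-priority module (see [0736]).
            allocation_table.pop(running["name"], None)
        blade["running"] = component            # step 3912: assign and record
        allocation_table[component["name"]] = blade["id"]
        return blade["id"]
    return None                                 # step 3920: permission denied


blades = [{"id": "b1", "capabilities": {"audio"}, "running": None},
          {"id": "b2", "capabilities": {"audio", "video"}, "running": None}]
table = {}
infotainment = {"name": "infotainment", "needs": {"video"}, "priority": 1}
brake_monitor = {"name": "brake_monitor", "needs": {"video"}, "priority": 9}
assign(infotainment, blades, table)
print(assign(brake_monitor, blades, table), table)  # preempts infotainment
```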
[0739] When the master blade processor determines that a minimum set of blade processor resources and/or capabilities is not operational, it will send a system reset message to clear operator warning circuitry within an allocated timeframe.
[0740] Streaming of Media
[0741] An integrated circuit board configuration can be provided to enable media streaming to computational devices wirelessly connected to the vehicle network via an access point 456 of the vehicle. In this manner, different media can be provided simultaneously to different user interfaces (e.g., on board display subsystems) of different vehicle occupants.
[0742] Fig. 21 depicts a configuration of the media controller subsystem 348 for simultaneously streaming multiple channels of media (e.g., multimedia or infotainment content). Media content 2100, such as audio, video, picture, and/or text media, is received by the network transceiver 824 and provided to a common media processing board 2104. The processing board 2104 comprises a network control server 2108, such as a distributed network control ("DNC") server, a media server 2112, such as a digital living network alliance ("DLNA") server, and virtual network console ("VNC") software 2116. The network control server 2108 has an IP address and/or a routable, global unicast address while the media server has a contactable electronic address (such as a media access control or MAC address) on the vehicle network 356 or communication subsystem 1008.
[0743] As will be appreciated, each of the network control and media servers 2108 and 2112 comprises a microprocessor and memory to buffer media content, execute application programming, provide long term storage of program instructions and/or data, and the like. In one application, the media server 2112 comprises functionality of the media controller 804, media processor 808, and signal processing module 828.
[0744] A user portable (remote) or Bring Your Own Device (BYOD) computational device 2120 is in wireless or wired communication with the board 2104, such as via a wireless protocol (e.g., Bluetooth™, WiFi™, and the like). The VNC software 2116 effectively provides the remote computational device 2120 (e.g., portable communication device) with remote access to the board 2104, which is typically part of the vehicle on board computer (e.g., the console computer). As will be appreciated, the VNC software can operate in accordance with the RFB (Remote Frame Buffer) protocol on top of the TCP/IP suite of protocols. The result is that the remote computational device appears to the on board computer as if it were part of the on board vehicle computer or vehicle control system 204. This enables the remote computational device to provide commands to the media controller subsystem 348 for desired multimedia content and media presentation features and settings (such as volume, contrast, resolution, channel selection, and the like). In some applications, the remote computational device must download an application from the vehicle to enable multimedia requests from the remote computational device to pull information from the multimedia controller, where the media can be streamed to its final destination. The commands are provided by the VNC software 2116 indirectly to the media server 2112 for delivery to the network control server 2108 or directly to the network control server 2108, which retrieves the requested content from an external network (e.g., an Internet or radio network) accessible source. The media may then be provided to a display at the seat position of the user and/or to the BYOD computational device 2120 (via the wireless access point). When the requested media content is available locally (already stored in vehicle memory), the media content is provided from local storage in preference to accessing the media content via the Internet. When the requested media is not available locally, it is obtained via the Internet and is cached in local storage in parallel with streaming to the user's display and/or BYOD computational device 2120. Each display at each seat position can independently provide a corresponding user with a playlist of audio or video from independent sources.
[0745] With reference to Fig. 22, the operation of the media controller subsystem 348 will be discussed. In step 2200, the VNC software 2116 receives a command or request for content from the user of the BYOD computational device. The device has been previously connected successfully by the device discovery daemon or combo controller to the vehicle network 356 or communication subsystem 1008.
[0746] Typically, the combo controller has ports in the crate, implements certain device discovery functions, and packetizes information for transmission via IP over Ethernet. The combo controller commonly installs the correct driver for the discovered device. It then maps the device (e.g., USB) address to a local TCP/IP address for interaction with one of the blade processors.
[0747] In step 2204, the media server determines, from the system data 208, portable computational device information and capabilities, the identity of the user of the device, an identity of the portable computational device associated with the user, and/or the spatial location of the device (e.g., area and/or zone in which the device is located). This information has been previously determined by the device discovery daemon or combo controller but may be updated by the media server using the location techniques discussed above.
[0748] In step 2208, the media server applies appropriate filters and restrictions based on the identity of the user (e.g., user account) and/or spatial location of the device (e.g., user context). The filters or restrictions can include, for example, age-related content filters and/or restrictions, seating location filters and/or restrictions, and privacy filters and/or restrictions. The filters and/or restrictions can ban the type of content requested or the source of the content (e.g., a blacklist of web sites), redirect the request to a different source of the content, and/or limit the user to content from approved sources, such as permitted by a whitelist. Other user account restrictions can be applied by the media server before the content request is passed to the network control server 2108.
[0749] The owner of the vehicle can configure the multimedia controller menus.
Initially, the menu allows for all seat positions to access any media that is available through the multimedia controller. The owner can create profiles for different potential vehicle occupants. By creating the profiles, the owner is able to allow or disallow access to media from selected seat positions, including access to media menus (e.g., audio or video playlists, and software applications (such as Internet browsers, applications to control passenger compartment climate control functions, games, and the like)). The owner can also limit media and/or media menu access based on a MAC address of a computational device 2120 of an occupant. Only devices having a permitted MAC address can access media and/or menus while those not having a permitted MAC address cannot. This can ensure that control is not only position-based but also device-based.
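The position- and device-based permission check might be implemented along these lines; the profiles, menu names, and MAC value are illustrative assumptions.

```python
# Sketch of owner-configured, seat- and MAC-based media permissions.
PROFILES = {
    "rear_left": {"menus": {"audio", "games"}, "video": True},
    "driver":    {"menus": {"audio"},          "video": False},
}
PERMITTED_MACS = {"a4:5e:60:01:02:03"}        # owner-approved devices


def may_access(seat: str, mac: str, menu: str) -> bool:
    """Access requires both an allowed seat profile and a permitted MAC."""
    profile = PROFILES.get(seat)
    return (profile is not None
            and mac.lower() in PERMITTED_MACS
            and menu in profile["menus"])


print(may_access("rear_left", "A4:5E:60:01:02:03", "games"))  # True
print(may_access("driver", "A4:5E:60:01:02:03", "games"))     # False
```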
[0750] In step 2212 and if permitted by the media server, the network control server 2108 accesses the filtered content request (which may be altered to reflect user account restrictions) and retrieves the permissible content via the public network. The network control server 2108, for example, can access the media content 2100 from a media source accessed through an external network, such as a radio network or other type of untrusted public network (such as the Internet). The filtered content request may, for example, be for different content than that originally requested, differently formatted content than that originally requested, differently sourced content (e.g., from a different web server or site) than that originally requested, and the like.
[0751] In step 2216, the network control server 2108 receives the permissible content from a publicly networked source (e.g., web server or site) and passes the retrieved content to the media server 2112. Based on user account restrictions and/or user context, the media server applies appropriate filters and restrictions to remove or filter out
inappropriate content. The filters or restrictions can include, for example, age-related content filters and/or restrictions, seating location filters and/or restrictions, and privacy filters and/or restrictions.
[0752] In one application, the combo controller connects via IP over Ethernet and assigns a local IP address to the BYOD device connected to a port in the consumer crate. The USB layer of the combo controller packetizes information from the BYOD device and communicates with a USB access program that executes on a multimedia blade processor. The USB access program de-packetizes the information and connects to the USB I/O subsystem of the multimedia blade processor to transfer the information to either local storage or directly to one of the output devices associated with the appropriate seating position.
[0753] Other user account restrictions can be applied by the media server before the content is passed by the media server to the user. For example, when the user is driving the vehicle, video can be removed automatically from the content to avoid distracting the driver, so that only the audio channel accompanying the video channel is provided to the user. In another example, and in response to determining that the requesting user is driving the vehicle rather than being a (non-driving) passenger, screen magnification can be applied automatically to the visual content to assist the driver of the vehicle; that is, the user is the driver and larger font is easier to see than smaller font. In other examples, the visual content, in response to determining that the requesting user is driving the vehicle rather than being a (non-driving) passenger, is presented or rendered automatically in a larger font and/or icon size (e.g., without changing screen resolution) compared to the font and/or icon size for the content displayed to a (non-driving) passenger to reduce driver distraction. In yet another example, the media server 2112 determines that the user is driving the car rather than being a (non-driving) passenger and enables or disables automatically removal of all unnecessary animations from the content to be displayed. In yet another example, the media server 2112 determines that the user is driving the car rather than being a (non-driving) passenger and enables automatic removal of background images from, while leaving one or more displayed foreground images in, the content to be displayed. In yet another example, the media server 2112 determines that the user is driving the car rather than being a (non-driving) passenger and automatically enables or renders high contrast (e.g., between text and background colors) to make the displayed content more visible to the user. The feature and/or setting and/or filters could be applied to the displayed content only when the vehicle is in motion or in forward or reverse gear, and not when parked, even when the user is the driver.
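The driver-context filtering could be sketched as a single transformation over the content, as below; the content dictionary and flag names are assumptions for illustration.

```python
# Sketch of driver-context content filtering: when the requesting user
# is the driver and the vehicle is in gear, visual content is simplified.
def filter_for_context(content: dict, is_driver: bool, in_gear: bool) -> dict:
    if not (is_driver and in_gear):
        return content                      # passengers get content as-is
    filtered = dict(content)
    filtered.pop("video", None)             # keep only the audio channel
    filtered.pop("background_image", None)  # drop distracting backgrounds
    filtered["animations"] = False          # remove unnecessary animations
    filtered["font_scale"] = 1.5            # larger font for glanceability
    filtered["high_contrast"] = True        # text stands out from background
    return filtered


page = {"video": "...", "audio": "...", "background_image": "...",
        "animations": True}
print(filter_for_context(page, is_driver=True, in_gear=True))
```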
[0754] Rather than automatic detection, any of the above features and/or settings and/or filters can be a standard feature and/or setting and/or filter and can be applied to the on board vehicle display for the driver but not to the on board vehicle displays of other (non-driving) passengers. In other words, the feature or setting or filter would not be applied to the on board vehicle display of a (non-driving) passenger. Alternatively, the feature and/or setting and/or filter would be applied to the on board vehicle display for the driver only when the vehicle is in motion or in forward or reverse gear but not when the vehicle is parked.
[0755] In step 2220, the media server 2112 configures the filtered media for the capabilities of the user device and/or on board vehicle display and in accordance with user preferences. The media server 2112 directly provides, or streams, the media stream or indirectly provides, or streams, the media stream via the VNC software 2116 to the user computational device 2120 and/or on board vehicle display for presentation to the user. Stated another way, the network control server 2108 provides the media content 2100 while the media server 2112 provides the channel for the media content 2100 to be provided to the user computational device 2120.
[0756] The media controller subsystem 348 can handle multiple (e.g., eight or more) media and/or multimedia streams simultaneously by differing channels. The streams of information may be separated and served on different channels via one or more of WiFi, Bluetooth, NFC, and other communications protocols. Multiple BYOD computational devices can be served simultaneously and can mix and match or otherwise control what is streamed and where it is streamed.
[0757] Assisted Driver Display
[0758] The device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 (such as the on board display for the driver) can be reconfigured when the vehicle changes state from parked or substantially motionless to being in motion or in forward or reverse gear to facilitate the driver's ability to view visual content and avoid driver distractions. Other devices or user interfaces 212, 248, user interface(s)/input interface(s) 324 and/or I/O modules 312 of other vehicle occupants or passengers typically are not reconfigured even when the vehicle is in gear or motion. When the state of the vehicle changes to parked (or not in motion), the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 of the driver returns to a different configuration, which is typically the same configuration as the devices or user interfaces 212, 248, user interface(s)/input interface(s) 324 and/or I/O modules 312 of other vehicle occupants or passengers.
[0759] There are a number of reconfigurations that may be implemented for the unimpaired driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 in response to a change of state of the vehicle, particularly from parked or stationary to being in motion or in forward or reverse gear.
[0760] For example, the media controller subsystem 348 can remove video
automatically from the content to be displayed by the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 of the driver to avoid distracting the driver, so that only the audio channel accompanying the video channel is provided to the user. A still image derived from one or more frames of the video content can replace the video component in the displayed content.
[0761] In another example, the media controller subsystem 348 applies screen magnification automatically to the visual content to assist the driver of the vehicle; that is, the user is the driver and larger font is easier to see than smaller font. The screen magnifier is software that interfaces with a computer's graphical output to present enlarged screen content. The simplest form of magnification presents an enlarged portion of the original screen content, the focus, so that it covers some or all of the full screen. This enlarged portion should include the content of interest to the user and the pointer or cursor, also suitably enlarged. As the user moves the pointer or cursor the screen magnifier should track with it and show the new enlarged portion.
[0762] In another example, the media controller subsystem 348 presents or renders the content automatically in a large font and/or icon size (e.g., without changing screen resolution) to reduce driver distraction. The large font and/or icon size, unlike screen magnification, applies to all, and not simply a portion of, displayed content.
[0763] In yet another example, the media controller subsystem 348 initiates
automatically a screen reader (or other text-to-speech program) to audibly provide the visual content to the user. A screen reader is a software application that attempts to identify and interpret what is being displayed on the screen (or, more accurately, sent to standard output, whether a video monitor is present or not). This interpretation is then represented to the user with text-to-speech and/or sound icons. Screen readers can be used in combination with the screen magnifier to read the content within the enlarged portion of the screen content.
[0764] More specifically, screen narration in the vehicle can describe the layout of a current display, including a current position, page, functionality, etc., of displayed content and the position of a digit of the user and/or cursor relative to displayed objects. The names of the displayed objects in proximity to the current digit and/or cursor location can be audibly identified. In another example, a user can identify a displayed object to be located, and the media controller subsystem 348 can use a "hotter", "hot", "colder", and "cold" game to inform the user where his or her digit is, and how it is moving, relative to the identified object. For example, the media controller subsystem 348 can tell the user that "you're getting warmer" as the user's digit moves closer to the identified object.

[0765] In yet another example, the media controller subsystem 348 determines that the user is driving the car rather than being a (non-driving) passenger and automatically removes all unnecessary animation effects, such as fading effects (e.g., fade windows), from the displayed content.
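The "hotter/colder" narration above reduces to a simple distance comparison. The following is a minimal sketch under assumed names; the 20-pixel "hot" threshold and coordinate units are illustrative, not part of the disclosure.

```python
import math

def hotter_colder_cue(digit_xy, prev_digit_xy, target_xy):
    """Return a narration cue by comparing the digit's current and previous
    distances to the identified target object."""
    dist = math.dist(digit_xy, target_xy)
    prev_dist = math.dist(prev_digit_xy, target_xy)
    if dist < 20:                       # close enough to count as "hot"
        return "hot"
    if dist < prev_dist:                # moved toward the target
        return "you're getting warmer"
    if dist > prev_dist:                # moved away from the target
        return "colder"
    return "hold steady"

print(hotter_colder_cue((105, 98), (160, 150), (100, 100)))  # "hot"
```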
[0766] In yet another example, the media controller subsystem 348 automatically removes background images from the displayed content while leaving one or more displayed foreground images. The removed background image can be replaced with a uniform background, such as a commonly colored or white background.
[0767] In yet another example, the media controller subsystem 348 enables
automatically a longer (than normal) period for notification dialog boxes to stay open on the display.
[0768] In yet another example, the media controller subsystem 348 automatically disables automatic arrangement of windows when the mouse cursor is moved to the edge of the screen.
[0769] In yet another example, the media controller subsystem 348 automatically enables activating a window by hovering over it with the mouse cursor.
[0770] In yet another example, the media controller subsystem 348 automatically enables keyboard web page navigation.
[0771] In yet another example, the media controller subsystem 348 automatically enables or renders high contrast (e.g., between text and background colors) to make the displayed content more visible to the user.
[0772] In yet another example, the media controller subsystem 348 automatically changes the color and transparency of window borders (making the borders easier to see).
[0773] In yet another example, the media controller subsystem 348 automatically changes the thickness of the focus rectangle (around the currently selected item in a dialog box).
[0774] In yet another example, the media controller subsystem 348 automatically changes the color, size, and/or thickness of the on-screen mouse pointer.
[0775] In yet another example, the media controller subsystem 348 automatically changes the keyboard settings (e.g., how long a key must be pressed before the keyboard character starts repeating, the speed at which keyboard characters repeat, the rate at which the cursor blinks, etc.).

[0776] In yet another example, the media controller subsystem 348 automatically ignores colors, font styles and font sizes used on web pages, or formats web pages using a user specified or default style sheet.
[0777] In yet another example, the media controller subsystem 348 automatically enables or renders in the content an increased size of a mouse-selectable screen element or object to provide a larger target.
[0778] In yet another example, the media controller subsystem 348 automatically enables mouse keys to move the mouse pointer or cursor.
[0779] In yet another example, the media controller subsystem 348 automatically enables one or more of sticky keys (which allow the user to enter a combination of multiple keys without having to hold a first key down when he or she depresses a second key), toggle keys (which play an alert each time the user presses the Caps Lock, Num Lock, or Scroll Lock keys), and filter keys (which cause the computer to ignore keystrokes that occur in rapid succession or keystrokes held down for several seconds unintentionally).
[0780] In yet another example, the media controller subsystem 348 automatically causes the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to be configured to enable blind typing. In blind typing, the driver can write naturally with a finger on the interface, and the interface can determine, from the natural written script, a command or request of the driver. This configuration typically requires the interface to be configured as a touchpad. While other systems require the driver to focus away from the road, either by selecting letters from menus or on a displayed keyboard, blind typing permits the driver to input commands or requests to the vehicle control system without having to visually focus on any screen.
[0781] Although many of the above display changes or reconfigurations have been used to enable impaired users to access a computer, the driver is typically not disabled or impaired (e.g., not visually impaired or disabled). These changes or reconfigurations can, however, be highly beneficial in enabling a driver to interact with the computer in a manner that substantially avoids or inhibits driver distraction.
[0782] Rather than automatic detection, any of the above features and/or settings and/or filters can be a standard feature and/or setting and/or filter and can be applied to the on board vehicle display for the driver but not to the on board vehicle displays of other (non- driving) passengers. In other words, the feature or setting or filter would not be applied to the on board vehicle display of a (non-driving) passenger. Alternatively, the feature and/or setting and/or filter would be applied to the on board vehicle display for the driver only when the vehicle is in motion or in forward or reverse gear and not when the vehicle is parked.
[0783] A user interface (UI), such as the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312, may be provided that includes an area for navigation by a user. As a user navigates the UI area, the user may be provided with one or more of sounds, audio narration, haptic/tactile feedback, a visual indicator (e.g., not associated with the UI), and the like. For example, a location in the UI area may represent a button and/or other icon. When the user makes contact with the location having the button, a sound and/or tactile feedback may be presented to the user to indicate where the user's pointer (e.g., one or more fingers, a hand, etc.) is located. As one example, as a user reaches a location in the UI area associated with a "call" button, the sound presented may be a ring (e.g., a bell), and a vibration may be provided to the UI.
Alternatively, screen narration by a screen reader can be provided, which conversationally informs the user where his or her digit is located. For example, the narration can recite the name of the button. In another example, as a user moves a pointer about the UI area, a frequency of sound may change to indicate a position in the UI area (e.g., the lower left portion of the UI area may provide a low pitch and/or frequency sound, while a position in the upper right portion of the UI area may have a higher pitch and/or frequency sound, and portions in between can have different pitches and/or frequencies, etc., similar to the Theremin musical instrument).
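The Theremin-style cue above is, in essence, a mapping from pointer position to pitch. The following is a minimal sketch under assumed names; the 220-880 Hz range and the equal weighting of the two axes are illustrative choices, not part of the disclosure.

```python
def position_to_pitch(x, y, ui_w, ui_h, f_low=220.0, f_high=880.0):
    """Map an (x, y) pointer position in the UI area to a tone frequency in Hz.

    Screen convention: x grows rightward, y grows downward, so the lower-left
    corner yields the lowest pitch and the upper-right corner the highest.
    """
    t = (x / ui_w + (1 - y / ui_h)) / 2   # normalize to [0, 1]
    return f_low + t * (f_high - f_low)

print(position_to_pitch(0, 480, 800, 480))   # 220.0 Hz at the lower left
print(position_to_pitch(800, 0, 800, 480))   # 880.0 Hz at the upper right
```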
[0784] By way of example, a graphical UI or GUI may allow a user to interact with certain displayed buttons, features, and/or menus while a vehicle is stationary, but may restrict access and/or interaction when the vehicle is moving. One example of this restricted interaction involves greying out buttons associated with a phone feature when the vehicle moves. In the embodiments provided herein, the display can be completely turned off or blanked out when moving, and a user may still be able to interact with features. Because the user will not be distracted by the visual presentation of elements, the user is free to drive while interacting.
[0785] In some embodiments, a GUI may be used where the display of the GUI is turned off to provide the above-described functionality.
[0786] A display navigation mode can be selectively triggered by the user to implement any of the above configurations and/or features. When the user wishes to enter display navigation mode, the user can provide a command, such as by voice or gesture, to enter that mode. In that mode, any of the above assistive configurations or features can be implemented. For example, voice navigation is activated and the residence time of the user's touch on the touch screen required to select a contacted icon is lengthened to avoid incidental or accidental icon selection as the user moves his or her hand/digit around the touch screen. When the user arrives at a desired location, the icon will be activated after touch contact for the longer residence time. When the icon is selected, the touch screen can return to normal operating mode with a shorter residence time for icon activation.
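The lengthened residence time described above can be modeled as a per-mode dwell threshold. The following is a minimal sketch; the class name and the 1.5 s / 0.2 s thresholds are assumptions for illustration only.

```python
import time

class DwellActivator:
    """Activate an icon only after the touch has dwelled on it long enough;
    display navigation mode uses a longer threshold than normal mode."""

    def __init__(self, navigation_mode_s=1.5, normal_mode_s=0.2):
        self.thresholds = {"navigation": navigation_mode_s, "normal": normal_mode_s}
        self.mode = "normal"
        self._icon = None
        self._since = None

    def touch(self, icon, now=None):
        """Report the icon under the digit; return it once the dwell elapses."""
        now = time.monotonic() if now is None else now
        if icon != self._icon:            # digit moved to a new icon: restart timer
            self._icon, self._since = icon, now
            return None
        if now - self._since >= self.thresholds[self.mode]:
            self.mode = "normal"          # selection made: return to normal mode
            return icon                   # activate the icon
        return None

activator = DwellActivator()
activator.mode = "navigation"
activator.touch("Phone", now=0.0)
print(activator.touch("Phone", now=1.6))  # "Phone": activated after a 1.6 s dwell
```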
[0787] The user can alternatively use a roller or track ball and selector to effect narrated screen navigation.
[0788] In another embodiment, the user contacts an arbitrary and substantially planar or flat control surface, such as an arm rest, dash, steering wheel, glove box door, center or rear center console, window (e.g., windshield), passenger compartment roof, seat surface, door panel, or other electrically, magnetically, electromagnetically, and optically nonreactive surface to activate icons on the activated or deactivated vehicle computer center or rear center display. The control surface can be any arbitrary surface selected by the user. This can be done by optically or visually separating the surface into segments, each segment corresponding to a segment on the display. The movement of the user's hand or digit is tracked optically over the surface with reference to the segments. As the user moves his or her hand, the user can be audibly told where his or her hand is relative to the display. When the user's hand/digit is at a selected location, the user can leave his or her hand/digit at that location for a selected residence time to cause the icon in the corresponding segment of the display to be activated.
[0789] The activated or deactivated vehicle computer center or rear center display can be any of the driver's devices or user interfaces 212, 248, user interface(s)/input interface(s) 324 and/or I/O modules 312 that includes an area for navigation by a user.
[0790] An example is described with reference to Figs. 33-34. Fig. 34 depicts the display of Fig. 30 with defined segments overlaid on the display. This is done to show the process of mapping the control surface to the displayed objects. Fig. 33 shows the same defined segments positioned on the control surface 3300. Each segment of the control surface 3300 corresponds to an information-containing and/or activatable or selectable object of the current display. As the display changes in response to user input or otherwise, the type, identity, and/or arrangement of displayed information and objects changes, requiring the display to be remapped to the control surface 3300. Remapping typically requires the defined segments to be redefined and reconfigured to reflect the new display configuration.
[0791] As shown by Fig. 33, the dividing lines define various cells, each of which corresponds to displayed content on the display of Fig. 34. Specifically, cell 3304a corresponds to the display field comprising driver name 3028, cell 3304b to satellite reception indicator 3032, cell 3304c to WiFi connectivity indicator 3034, cell 3304d to cellular connectivity indicator 3036, cell 3304e to the display field comprising weather information 3038, cell 3304f to the display field comprising date 3040, cell 3304g to icon 3380, cell 3304h to navigation ("Nav" icon) 3016, cell 3304i to media ("Media" icon) 3018, cell 3304j to phone ("Phone" icon) 3020, cell 3304k to rear view ("Rear View" icon) 3022, cell 3304l to other applications icon 3024, cell 3304m to icon 3384, cell 3304n to "my dash" icon 3026, cell 3304o to navigation display field 3088, cell 3304zz to the suggested icon, cell 3304p to the set destination icon, cell 3304q to the display field containing the destination, cells 3304r and 3304s to display fields each containing a suggested waypoint, cell 3304t to the set destination icon, cell 3304u to the set destination box, cell 3304v to the favorites icon, cell 3304w to recent locations, cell 3304x to the people icon, cell 3304y to the points of interest icon, cell 3304yy to set waypoints, cell 3304z to the set waypoints box, cell 3304aa to the go icon, cell 3304bb to calendar icon 3092, cell 3304cc to weather icon 3096, cells 3304dd through 3304ii to display fields each containing an appointment, cells 3304jj through 3304oo to display fields each containing weather information, cell 3304pp to the friends icon, cells 3304qq through 3304vv to received and/or sent messages 3068, cell 3304ww to icon 3098, cell 3304xx to a climate control icon ("Climate" icon), cell 3304aaa to a controls icon ("Control" icon) to control vehicle tasks, functions or operations 3062, cell 3304bbb to speaker volume setting 3064, cells 3304ccc and 3304ddd to the controls to the left and right of the speaker image 3066 and 3070, respectively, and cell 3304eee to the speaker icon.

[0792] By way of example, the user's digit points at 3304h, which corresponds to the navigation icon 3016 of the home page 3000 of Fig. 30. By leaving the user's digit stationary in that location for a selected period of time, the media controller subsystem 348 determines that the user desires to activate the navigation icon 3016, which causes a different display to be rendered. The new display is remapped to the control surface so that the dividing lines on the control surface mirror the dividing lines overlain on the new display. By way of further example, the user's digit points at 3304f, which corresponds to the display field for date and time of the home page 3000 of Fig. 30.
A screen reader reads the date and time to the user. By way of further example, the user's digit points at 3304dd, which corresponds to the display field for appointment of the home page 3000 of Fig. 30. A screen reader reads the displayed appointment information to the user.
[0793] Optical tracking of the user's digit enables the location of the digit to be determined. Optical tracking typically determines in real time the position of the digit by tracking the positions of either active or passive infrared markers attached to the object. The position of the point of reflection is determined using the camera sensors 760, motion sensors 744, and/or infrared sensors 740. In one configuration, the user's digit is equipped with one or more markers. Markers can be light reflectors ("passive markers", e.g., retroreflectors) or light emitters ("active markers", e.g., LEDs). To also measure the orientation of a body, several (greater than or equal to three) of these markers have to be arranged in a known geometry. The camera sensors 760 scan a certain volume and detect the light that comes from the markers. These images are processed to identify and calculate potential marker positions (in image coordinates). The result of each measurement is a set of coordinates that describes the positions of the markers, and hence the position and orientation of the body carrying the markers.
[0794] Other techniques can be employed. The cell dividing lines can be projected onto the control surface by the camera sensors 760. The position of a selected digit of the user can be tracked using image processing of images of the control surface with the cell boundary or dividing lines in the images. Examples of other techniques include video tracking, such as blob tracking, kernel-based tracking, contour tracking, visual feature matching, match moving, motion capture, motion estimation, Swistrack, and single particle tracking.
[0795] The operation of the control surface will be discussed with reference to Fig. 35.
[0796] In step 3500, the media controller subsystem 348 detects a stimulus, such as ignition of the engine, motion of the vehicle, the vehicle being placed in gear, a gesture or other activation command from the user, and the like.

[0797] In response, the media controller subsystem 348, in step 3504, maps the current display of a selected one of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to a selected control surface. This can include proportionally sizing the control surface to the display and determining cell and cell boundary locations. The sizes of the cells are commonly proportional to the corresponding displayed content.
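The proportional mapping of step 3504 can be sketched as a uniform scaling of each displayed object's rectangle into control-surface coordinates. The function name, the data layout, and the example dimensions below are assumptions for illustration only.

```python
def map_display_to_surface(display_objects, display_wh, surface_wh):
    """Scale each object's display rectangle into control-surface coordinates.

    display_objects: {name: (x, y, w, h)} in display pixels.
    Returns {name: (x, y, w, h)} on the control surface, so that cell sizes
    remain proportional to the displayed content they represent.
    """
    sx = surface_wh[0] / display_wh[0]
    sy = surface_wh[1] / display_wh[1]
    return {name: (x * sx, y * sy, w * sx, h * sy)
            for name, (x, y, w, h) in display_objects.items()}

# Example: two home-page objects mapped from a 1280x480 display onto a
# 320x120 armrest surface; every cell shrinks by the same 4:1 ratio.
cells = map_display_to_surface(
    {"Nav icon 3016": (0, 0, 160, 120), "date field 3040": (1120, 0, 160, 60)},
    (1280, 480), (320, 120))
print(cells)
```

When the display changes, the same function is simply rerun on the new set of displayed objects, which is the remapping step noted above.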
[0798] In step 3508, the media controller subsystem 348 determines, by one or more of the motion sensors 744, infrared sensors 740, and camera sensors 760, whether a user's digit has moved relative to the stationary cell boundaries and cells on the mapped control surface.
[0799] When movement is detected, the media controller subsystem 348, in step 3512, detects or determines, by one or more of the motion sensors 744, infrared sensors 740, and camera sensors 760, the position of the digit on the control surface relative to the cells and cell boundaries.
[0800] In step 3516, the detected position of the digit is mapped to the currently displayed content. This is done by mapping the cell where the tip of the digit is positioned to the corresponding displayed content on the display.
[0801] In step 3520, the media controller subsystem 348 performs an action based on the detected digit position. The action depends on the type of displayed content corresponding to the digit position. When the displayed content is activatable or selectable, the media controller subsystem 348 activates the corresponding task, function or operation. When the displayed content is not activatable or selectable, a screen reader in the media controller subsystem 348 reads the displayed content to the user. Some operations are drag-and-drop operations in which content is moved by the user's digit from one location to another, such as a destination or waypoint being dragged from one location and dropped into a set destination or set waypoint box. This can be done by detecting the user's digit in a first location, determining that the displayed content can be dragged and dropped to another location, waiting for the digit to move to a second location, and, when in the second location, dropping the content into the displayed content corresponding to the second location when that displayed content permits this operation to be performed.
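The dispatch of step 3520 can be sketched as a cell lookup followed by a branch on the content type. The content model and the speak/activate callbacks below are assumptions for illustration only.

```python
def find_cell(cells, digit_xy):
    """Return the name of the cell containing the digit tip, if any."""
    dx, dy = digit_xy
    for name, (x, y, w, h) in cells.items():
        if x <= dx < x + w and y <= dy < y + h:
            return name
    return None

def perform_action(content, speak, activate):
    """content: dict with 'kind' in {'activatable', 'readable', 'draggable'}."""
    if content["kind"] == "activatable":
        activate(content["target"])      # e.g., launch the navigation application
    elif content["kind"] == "readable":
        speak(content["text"])           # screen reader narrates the display field
    elif content["kind"] == "draggable":
        return "await_drop"              # wait for the digit's second location
    return "done"

content_by_cell = {"3304h": {"kind": "activatable", "target": "Nav icon 3016"}}
cell = find_cell({"3304h": (0, 0, 80, 60)}, (30, 20))
print(perform_action(content_by_cell[cell], print, print))
```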
[0802] As noted, when the task, function or operation causes the display to change, the control surface is remapped to the display.

[0803] In another embodiment, the UI may not include any display at all. Because any surface can receive input, as provided herein, a display is not needed. This type of display-less UI can be implemented on any surface that has a substantially flat or planar area. In one embodiment, the UI area may include a capacitive and/or resistive touch interface region. In another embodiment, the UI area may be a surface that includes an area visible from at least one image sensor (e.g., camera, etc.). Other examples of a display-less UI may include a volume of space in which a user can move a pointer.
[0804] In the examples provided above, a user may navigate the non-displayed UI elements (e.g., icons, buttons, etc.) using sound, haptic feedback, etc.
[0805] An operation of the media controller subsystem 348 will now be described with reference to Fig. 26.
[0806] In step 2600, a stimulus is detected by the media controller subsystem 348 based on input from one or more vehicle sensors. An exemplary stimulus includes a change in vehicle operating state. Common changes in vehicle operating state include a change from "parked" to "forward gear" and vice versa, from "parked" to "reverse gear" and vice versa, from "forward gear" to "reverse gear" and vice versa, and from stationary to in motion and vice versa.
[0807] In decision diamond 2604, the media controller subsystem 348 selects a device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 and determines whether or not it is associated with the driver. This is typically done by determining whether or not the selected device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 is located in an area and/or zone occupied by the driver. The location of the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 can be determined using the techniques noted above.
[0808] When the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 is located in the area and/or zone of the driver, the media controller subsystem 348, in step 2608, reconfigures the user interface to enable or disable (depending on the initial and final states) one of the features, settings, or filters discussed above.
[0809] When the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 is not located in the area and/or zone of the driver, the media controller subsystem 348, in step 2612, does not reconfigure the user interface to enable or disable (depending on the initial and final states) one of the features, settings, or filters discussed above.
[0810] After performing either of steps 2608 or 2612, the media controller subsystem 348 selects a next device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 and returns to decision diamond 2604.
[0811] This process continues until all devices or user interfaces 212, 248, user interface(s)/input interface(s) 324 and/or I/O modules 312 have been considered.
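The Fig. 26 flow just described amounts to a loop over the interfaces with a zone test. The following is a minimal sketch; the interface and zone representations are assumptions for illustration only.

```python
def on_state_change(interfaces, driver_zone, reconfigure):
    """interfaces: list of dicts with a 'zone' key; reconfigure: callable.

    Mirrors steps 2604-2612: every interface is considered, but only the one
    in the driver's zone is reconfigured; the others are left untouched.
    """
    for iface in interfaces:                  # consider every interface in turn
        if iface["zone"] == driver_zone:      # decision diamond 2604
            reconfigure(iface)                # step 2608: apply filters/settings
        # step 2612: non-driver interfaces are intentionally not reconfigured

interfaces = [{"id": "console", "zone": "front-left"},
              {"id": "rear-display", "zone": "rear-right"}]
on_state_change(interfaces, "front-left",
                lambda i: print(f"reconfiguring {i['id']} for reduced distraction"))
```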
[0812] In other embodiments, the media controller subsystem 348 can audibly warn of or describe an object detected via an infrared sensor, motion sensor or camera sensor that is in the path of the vehicle. Known image processing and image recognition techniques can be used to identify the object image and compare the image to a projected path of the vehicle. An exemplary technique acquires the image, pre-processes the acquired image, performs feature extraction from the pre-processed image, performs detection and/or segmentation using the extracted features, and performs high level processing and decision making. The path of the vehicle, as determined by a steering wheel setting and a projected path of the center line of the vehicle for that setting, along with a width of the vehicle, can be compared against the position of the object. When the projected path causes any portion of the vehicle to contact the object or be within an unsafe distance of the object, appropriate instructions or warnings can be issued. Thus, not only can a warning containing the object description be provided to the driver, but directions can also be audibly provided to enable the driver to avoid the object. For example, the driver can be instructed by one or more of the following: "Watch Out!"; "Turn the wheel left, a little more, a little more, perfect, straighten out"; "Turn the wheel right, a little more, a little more, perfect, straighten out"; and the like. This can provide invaluable assistance to drivers who struggle, for example, with backing up the vehicle due to depth perception and/or orientation issues.
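The path comparison above can be sketched as a corridor test: an object is flagged when it lies within half the vehicle's width (plus a margin) of the projected centerline. A straight-line projection is an assumed simplification for illustration; a full system would project the curved path implied by the steering angle.

```python
import math

def object_in_path(object_xy, heading_rad, vehicle_width, safety_margin=0.5,
                   lookahead=30.0):
    """Return True if the object lies inside the projected travel corridor.

    Coordinates are in meters in the vehicle frame; heading_rad is the
    projected travel direction derived from the steering wheel setting.
    """
    ox, oy = object_xy
    # Distance along the heading and lateral offset from the centerline.
    along = ox * math.cos(heading_rad) + oy * math.sin(heading_rad)
    lateral = -ox * math.sin(heading_rad) + oy * math.cos(heading_rad)
    return 0 <= along <= lookahead and abs(lateral) <= vehicle_width / 2 + safety_margin

# Object 10 m ahead and 0.3 m off-center; a 1.8 m wide vehicle would hit it.
if object_in_path((10.0, 0.3), heading_rad=0.0, vehicle_width=1.8):
    print("Watch out! Turn the wheel left, a little more...")
```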
[0813] User Account Management and Occupant Control of Vehicle Tasks, Functions and Operations
[0814] The vehicle control system 204 can manage user accounts that stipulate, control, and define the rights and privileges of each user of the vehicle and vehicle network. Different users and/or devices may have different accounts. In some cases these accounts may include user- and/or device-specific privileges.
[0815] For example, a system user has exclusive rights and privileges over all communication device firewall rules and running software. The system user can control selected vehicle tasks, functions and operations and infotainment via predetermined or preconfigured gestures. A manufacturer or vendor user can have local or remote access to on board control system data to alter the parameters and settings in the vehicle control system and to perform diagnosis and repair. A dashboard user can have rights to launch a dashboard application, authenticate guest or default users, and change permissions of trusted friends or family members. Dashboard users can read on board diagnostic system data but cannot overwrite or change it. A user can be both a system and a dashboard user. An Internet source interacting with the vehicle has only HTTP rights to respond to HTTP requests received from the vehicle. The requests can target different user data, but these are filtered by default user accounts. Guest users have no rights. Family and friend users have rights to play media from the media controller and to stream their media to the controller. They may or may not have rights to control vehicle tasks, functions and operations, whether or not by gesture. The back seat displays are part of this group of users. The back seat displays can be smart, removable WiFi communication devices that can be removed from the vehicle while remaining within WiFi range of the communication network.
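The user classes above map naturally to a rights table checked at request time. The following is a minimal sketch; the right names are assumptions chosen to mirror the roles in the text, not the disclosure's actual identifiers.

```python
RIGHTS_BY_ROLE = {
    "system":    {"firewall_rules", "running_software", "vehicle_control", "infotainment"},
    "dashboard": {"launch_dashboard", "authenticate_guests", "read_obd"},
    "family":    {"play_media", "stream_media"},
    "internet":  {"http_response"},   # may only answer HTTP requests from the vehicle
    "guest":     set(),               # guests have no rights
}

def is_permitted(roles, requested_right):
    """A user can hold several roles at once (e.g., both system and dashboard)."""
    return any(requested_right in RIGHTS_BY_ROLE[role] for role in roles)

print(is_permitted({"dashboard"}, "read_obd"))    # True: dashboard users read OBD data
print(is_permitted({"dashboard"}, "write_obd"))   # False: but cannot overwrite it
print(is_permitted({"guest"}, "play_media"))      # False: guests have no rights
```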
[0816] In one example, a child user may wish to change the temperature in his zone of the vehicle. Continuing the example, the child may have an account that allows access to this feature (but not other more critical features). As such the child may change the temperature via the device or other control. The child, however, may not be entitled to change the feature for an area or zone other than that in which he or she is currently located. Additionally or alternatively, a driver may have a broader account and privileges. In this case, the driver may control critical features, and even tasks, operations, and functions, of a vehicle. Remote control of a vehicle or infotainment task, operation, or function through the on board vehicle computer and by a portable or handheld
computational device, such as a cellular phone or tablet computer, interconnected with the vehicle computer, can be permitted for some users but not others. For example, a privileged user can control an infotainment or climate control function on the dashboard from the back seat of the vehicle. The control of the vehicle or infotainment task, function, or operation may be made via a device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312. Accordingly, occupant control of a (typically non- or less critical) vehicle task, function, or operation (such as an infotainment task, function or operation) can be done using an on board (or vehicle mounted) computational device or a portable computational device discrete from and not mounted on the vehicle.

[0817] In another example, a user in the rear passenger seat of a vehicle may wish to take control of specific features that are allowed in that zone. Such features may include controlling the climate, changing a radio station or infotainment presentation, opening a window, setting a volume level, setting a screen contrast and/or resolution, and/or adjusting a seat position. The control of these features may be provided to the user in the rear passenger seat via a display, gesture control region, or other device associated with the zone in which the rear passenger is seated. Associated devices may include the user's smart-phone, tablet, computer, and/or other device.
[0818] In yet another example, a vehicle occupant can perform a first set of vehicle tasks, functions and operations when in a first area and/or zone and a different set of vehicle tasks, functions, and operations when in a different second area and/or zone. For instance, a vehicle occupant can drive the car or perform another critical vehicle task, function or operation when seated in the driver's seat but not when seated in a different seat. When the vehicle occupant is in the back seat, he or she is unable to drive the vehicle or perform another critical vehicle task, function or operation from that seat.
[0819] In yet another example, a vehicle occupant can perform a first set of vehicle tasks, functions and operations when the vehicle is in a first operating state and a different set of vehicle tasks, functions, and operations when the vehicle is in a second operating state. The vehicle occupant, for instance, can watch a video on the driver's center console or display subsystem when parked and/or the vehicle is stationary (not in motion) but cannot watch the video on the driver's center console or display subsystem when the vehicle is in gear and/or motion.
[0820] In yet another example, a different authentication procedure is used for the vehicle occupant when requesting a first set of vehicle tasks, operations, and functions than when requesting a second set of vehicle tasks, operations, and functions. When the vehicle occupant is entitled to drive the vehicle or perform another critical vehicle task, function or operation, a different and higher or more stringent set of authentication requirements can be required than when the vehicle occupant is not entitled to drive the vehicle. For instance, multiple authentication techniques may be used in the former case but only one in the latter case.
[0821] As noted, the vehicle control system 204 manages user accounts, which contain user credentials, such as user identifiers, passwords, user biometric data (e.g., fingerprints, retina images, facial characteristics, user weight, and other physical characteristics), and rights and privileges of the user in performing vehicle tasks, functions, and operations, both critical and non-critical. In addition to the rights and privileges defined in the user account, the vehicle control system 204 can consider additional gating or filtering factors, such as current user position (e.g., what area and/or zone the user is located or seated in), in determining what rights and privileges apply currently to the user.
[0822] The vehicle control system 204 can manage and enforce accounts for
authenticated and non-authenticated users. Each account specifies rights and privileges for the corresponding user and/or security/authentication requirements and/or personal settings of the user (such as seat and lumbar settings, climate control settings, lighting settings, configuration of instrument cluster, rear view mirror settings, driving modes (such as fuel economy, sport, city, and the like), media channel settings or presets, media delivery preferences, music genre preferences, scheduled programs, playlists,
synchronization with cloud-based data associated with the user (such as iCloud™ of Apple™, Outlook™, and the like with examples of cloud-based data being electronic calendar, email, contacts, media content, texts, voice mail messages, and the like), application-specific personalization and selections, display settings and configurations, and the like) and other information noted above with respect to the profile data 252 and device data 220. These will be henceforth encompassed by any reference to a user's account.
[0823] For example, an owner of the vehicle would have the right and privilege not only to access infotainment media but also to access and control critical and non- or less critical vehicle tasks, functions, and operations. Because the owner is entitled to the highest level of rights and privileges, his or her authentication requirements may be more demanding than those of another vehicle occupant not having such rights and privileges. A child of the vehicle owner, on the other hand, would have a very restricted or no right to access and control vehicle tasks, functions, and operations (except for climate control, seat, and/or window settings) and restricted rights to access infotainment media. As noted above, a child filter could be configured by the vehicle owner to prevent the child from accessing age-restricted media.
[0824] A guest of the vehicle owner having no corresponding account would be provided with a default account having default and restricted rights and privileges to access vehicle and infotainment tasks, functions, and operations, and settings and configurations, which could be expanded based on input from the owner. This could be done in response to a query by the vehicle control system 204 that the vehicle occupant is not recognized (or authenticated). The vehicle control system 204 could provide the owner with multiple default accounts, one for an adult with driving privileges, one for an adult without driving privileges, and one for a child, and request the owner select the appropriate default account for the occupant. An identifier of the occupant and optionally authentication credentials would be inputted by the owner, and the vehicle control system 204 could store image processing information of the occupant's facial or other image, biometric information, and/or occupant weight for purposes of authentication. The owner could define, in an account created or modified by the owner, the rights and privileges of the person associated with the account.
[0825] An operation of the vehicle control system 204 is shown in Fig. 27.
[0826] In step 2700, the vehicle control system 204 detects a stimulus, such as a driver prompt, presence of an occupant, and the like. As noted above, the presence of an occupant can be sensed by seat weight sensors, image processing analysis, motion detection, proximity of a key fob, and/or biometric information.
[0827] In decision diamond 2704, the vehicle control system 204 determines whether or not each of the vehicle occupants has been authenticated successfully. Authentication can be performed by any suitable technique and using any suitable input, such as by one or more of receiving and validating user credentials, sensing an occupant's weight in a seat (with the weight falling within a weight range predefined for a known occupant), image processing performed on the facial characteristics of the various vehicle occupants (and comparing the sensed facial characteristics against those for known occupants), biometrics (such as a fingerprint image of the occupant sensed by the occupant touching a touchscreen display, a retinal scan, or a heart rhythm as received by conductive elements on a vehicle component such as the steering wheel or gear shift) (which are compared against corresponding stored biometrics for known occupants), and the like. An unauthenticated occupant could be recognized by sensing an occupant's weight in a seat (with the weight being different from the weight ranges predefined for each known occupant), image processing performed on the facial characteristics of the various vehicle occupants (with the sensed facial characteristics failing to match stored facial characteristics for known occupants), unrecognized biometrics (which do not correspond to biometrics for known occupants), and the like.
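One of the inputs above, the seat weight check, reduces to a range lookup. The following is a minimal sketch; the profile structure and example ranges are assumptions for illustration only.

```python
KNOWN_OCCUPANTS = {
    "owner": (78.0, 86.0),   # predefined weight range in kg
    "child": (20.0, 28.0),
}

def match_by_weight(sensed_kg):
    """Return the known occupant whose predefined range contains the sensed
    weight, but only when the match is unambiguous."""
    matches = [name for name, (lo, hi) in KNOWN_OCCUPANTS.items()
               if lo <= sensed_kg <= hi]
    return matches[0] if len(matches) == 1 else None  # ambiguous or unknown

print(match_by_weight(82.5))  # "owner"
print(match_by_weight(55.0))  # None: treat as unauthenticated, offer a default account
```

In practice, weight alone would be combined with the other inputs listed above (credentials, facial characteristics, biometrics) before granting privileges.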
[0828] Authentication using image processing applied to video images of occupants in the vehicle can be performed by any suitable technique. In one technique, facial recognition occurs in two stages, namely face detection and recognition. The face detection stage attempts to obtain an acceptable image from the video stream of the vehicle occupants to use in the face recognition stage. The face recognition process uses the image obtained from the detection stage and compares it with a database of known faces. When the facial image obtained from the detection stage matches a known stored facial image, authentication is successful. When the facial image obtained from the detection stage fails to match a known stored facial image, authentication is unsuccessful. As noted, image processing can be augmented by other security mechanisms for higher level privileges, such as authentication by user credentials (e.g., pattern lock, PIN, or other password).
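The two-stage flow above can be sketched as follows, using OpenCV's stock Haar-cascade detector for the detection stage. The recognition stage is left as a hypothetical embed-and-compare step, since the disclosure does not name a specific matcher; the `embed` callable and the 0.6 threshold are placeholders.

```python
import cv2           # pip install opencv-python
import numpy as np

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(frame):
    """Stage 1: return the first acceptable face crop from a video frame, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None                      # no acceptable image: retry on next frame
    x, y, w, h = faces[0]
    return gray[y:y + h, x:x + w]

def recognize(face, known_faces, embed, threshold=0.6):
    """Stage 2: compare the detected face against the database of known faces.

    known_faces: {name: stored feature vector}; embed: placeholder function
    mapping a face crop to a comparable feature vector.
    """
    query = embed(face)
    for name, stored_vec in known_faces.items():
        if np.linalg.norm(query - stored_vec) < threshold:
            return name                  # match found: authentication successful
    return None                          # no match: authentication unsuccessful
```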
[0829] When the user is authenticated successfully, the vehicle control system 204, in step 2708, retrieves the corresponding account for the successfully authenticated user.
[0830] In step 2712, the vehicle control system 204 next determines the area and/or zone (e.g., seat position) occupied by the authenticated user, such as by using any of the techniques discussed herein, including without limitation image processing information of the various occupants, sensed seat weight, user input, location coordinates received from a portable computational device, such as a cell phone, tablet computer, or personal digital assistant associated with the user, and the like. The vehicle control system 204 may also determine an operating state of the vehicle, e.g., in motion, parked, in gear, in neutral, and the like.
[0831] In step 2716, based on the location (e.g., seat position) of the corresponding authenticated user and/or the operating state of the vehicle, the vehicle control system 204 enables eligible rights, privileges, settings and configurations set forth in the corresponding account and disables ineligible rights or privileges or alters settings and configurations.
[0832] When the user is not authenticated successfully, the vehicle control system 204, in step 2720, creates an account for the unauthenticated user as set forth above. As noted, the account can be a default account.
[0833] Another application of the vehicle control system 204 for authentication by facial recognition is shown in Figs. 28-29.
[0834] In step 2800, the vehicle control system 204 detects a stimulus or trigger event.
[0835] In step 2804, the vehicle control system 204 provides, via the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312, the user with a face detection screen, such as that shown in Fig. 29. As shown in Fig. 29, the face detection screen 2900 is a dashed outline 2904 of a human face having eye holes 2908a and 2908b to enable the user to align his or her face with the screen for a higher quality video image of the user's face. This screen 2900 prompts the user to align his or her face such that the facial recognition camera can obtain a good image. The screen can show the video feed from the facial recognition camera with the dashed outline as an overlay. The dashed outline is intended to assist the user in situating his or her image properly in the camera's field of view. An instruction line 2912 can appear at the top of the screen 2900.
[0836] When an acceptable facial image is not obtained so that facial recognition cannot be performed, the vehicle control system 204 proceeds to a manual sign-in step (discussed below) and, in step 2812, provides the user with a face recognition failure notification dialog over the screen 2900 of the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312. If a proper facial image is obtained, the vehicle control system 204, in step 2816, searches through a repository of stored facial images to find a match and, during the matching, provides the user with a recognition progress indicator modal on the screen 2900.
[0837] When recognition is not successful or no match is found, the vehicle control system 204 proceeds to step 2812 and the manual sign-in step discussed below.
[0838] When recognition is successful and a match is found, the vehicle control system 204 proceeds to step 2820 and applies the personal settings of the user, such as any or all of the settings referenced above. A settings application progress modal can be displayed over the displayed screen.
[0839] In step 2812, the vehicle control system 204 provides the user with a manual sign-in screen (not shown) requesting input of the user's credentials. The manual sign-in screen, for example, can include fields for user identifier and password and options to sign in or continue as a guest.
[0840] When the credentials are not received within a time out period or are incorrect, or at the request of the user in step 2812, the vehicle control system 204 can, in step 2828, create a guest or default account for the user as discussed above and apply default settings.
[0841] When the credentials are received and successfully authenticated, the vehicle control system 204 in step 2820 applies the personal settings of the user.
[0842] The vehicle control system 204, after steps 2820, 2824, or 2828, proceeds to step 2832 and provides a home screen page, such as home page 3000, configured in accordance with the applicable settings.
[0843] Configuration of Haptic Feedback and Visual Preferences in Vehicle User Interfaces

[0844] The profile data 252 can include, for each occupant in a corresponding user profile, physical impairment information, such as information regarding a disability. The physical impairments include vision impairments (e.g., low vision, blindness, night and color blindness, and light sensitivity), hearing impairments (such as hearing loss, hard-of-hearing, and deafness), dexterity and mobility impairments (such as caused by arthritis, cerebral palsy, multiple sclerosis, loss of limb or digit, spinal cord injury, and repetitive stress injury), and language and/or communication impairments (such as aphasia, delayed speech, dyslexia, and other conditions resulting in difficulties remembering, solving problems, and/or perceiving sensory information).
[0845] System data 208 can include rules by which the vehicle control system 204 implements an accessibility or assistive technology. This can include reconfiguring, for example, the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312. Once the user having the impairment is located in the vehicle or correlated with a particular seat position and/or device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312, the accessibility or assistive technology can be applied to the corresponding device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312. The type of accessibility or assistive technology employed can depend not only on the type and capabilities of the device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 but also on the seat position of the impaired user. Specifically, when the impaired user is a driver of the vehicle, he or she may receive different accessibility or assistive technology compared to when he or she is a non-driving vehicle occupant or passenger.
[0846] The type of assistance provided by the vehicle control system 204 depends on the particular impairment and/or disability involved. For example, if the vehicle control system 204 were to determine that the user has a particular impairment and/or disability, the vehicle control system 204 can recommend and/or implement automatically other feature or device settings and/or accessible technology that may provide improved access for the user. By way of illustration, information, commands, warnings, and requests provided by way of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to users with vision impairments can be one or more of the use of screen magnification, high contrast (e.g., between text and background colors, such as white text on a black background), large font size and/or icon size (e.g., without changing screen resolution), color changes on the screen, a screen reader (or other text-to-speech program), speech recognition software (such as to operate the computer and/or software), enablement of a read mode, keyboard web page navigation, and the like. Information, commands, warnings, and requests provided by way of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to users with a hearing impairment can be one or more of the use of text or visual alternatives for sounds, high volume levels, changed computer sounds, sign language interpretation or translation (e.g., by image processing and acquisition based on visual images captured by one or more camera sensors), a text phone application, and the like. Information, commands, warnings, and requests provided by way of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to users with a dexterity and/or mobility impairment can be one or more of the use of particular mouse settings (e.g., mouse button configuration and timing (such as double-click speed), to make the mouse pointer more visible, and to alter the scroll speed of the mouse wheel or how quickly the mouse pointer responds to movements of the mouse), an increased size of a mouse-selectable screen element to provide a larger target, mouse keys to move the mouse pointer, sticky keys, toggle keys, filter keys, keyboard shortcuts, access keys, keyboard settings, an on-screen keyboard, speech recognition software (such as to dictate into almost any application (e.g., the user can dictate documents and email, surf the web, and control selected vehicle tasks, functions, and operations by voice command)), disablement of the automatic arrangement of windows when the mouse cursor is moved to the edge of the screen, enablement of activating a window by hovering over it with the mouse cursor, keyboard web page navigation, a disability assistive keyboard (such as an alternative keyboard) displayed on the screen, a keyboard filter, and the like. Information, commands, warnings, and requests provided by way of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to users with a language and/or communication impairment can be one or more of the use of sticky keys, toggle keys, filter keys, enablement of background image removal, disablement of all unnecessary animations, a long period for notification dialog boxes to stay open, a keyboard filter, speech recognition software, enablement of a read mode, a screen reader, keyboard web page navigation, a speech synthesizer, and the like. Information, commands, warnings, and requests provided by way of the driver's device or user interface 212, 248, user interface(s)/input interface(s) 324 and/or I/O module 312 to users with a learning impairment can be one or more of the use of a word prediction program, a reading tool, a learning disability program, a speech synthesizer, and speech recognition programs.
[0847] For users with an impairment, an assistive technology can use haptic feedback to provide information to a user. Haptic technology, or haptics, is a tactile feedback technology which takes advantage of the sense of touch by applying forces, vibrations, or motions to the user. Haptic feedback commonly uses the sense of touch in a user interface to provide information to an end user. The user interface can be any device with which a body part of the user is in contact and through which haptic feedback can be transmitted. Examples include a steering wheel, tactile electronic display, seat, seat belt, foot pedal, gear shift, center or rear center console, arm rest, other contact surface, and the like. In one configuration, the assistive technology is a system where at least two actuators are positioned beneath a user interface to provide vibratory feedback when a user makes contact with the unit. One actuator can induce a feedback vibration while one or more other actuators create a second vibration to suppress the first from propagating to unwanted regions of the interface, thereby "localizing" the haptic experience. Tactile (or haptic or aural) feedback can be used to help a user, such as a driver or a visually impaired occupant, identify which button or icon he or she is touching or pressing without drawing visual attention away from the road. For the hearing impaired, haptic feedback can also be used to identify external sounds, such as sirens of an emergency vehicle and/or a vehicle horn, and/or information about a vehicle task, function or operation, such as an engine-related warning being activated, vehicle speed, vehicle gear currently engaged, and the like. Each type of information to be provided by haptic feedback to the user can have a correspondingly different haptic feedback response to enable the user to distinguish between types of information. For example, a police siren can have a different haptic feedback than an ambulance siren, both of which can have a different haptic feedback from an engine-related warning.
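The per-event haptic responses above can be sketched as a table of vibration patterns. The pattern values below are assumptions for illustration; the disclosure specifies only that each information type has a distinguishable response.

```python
import time

HAPTIC_PATTERNS = {
    # (pulse_ms, pause_ms, repeats): distinct feel per information type
    "police_siren":    (100, 100, 5),   # rapid short pulses
    "ambulance_siren": (300, 150, 3),   # slower, longer pulses
    "vehicle_horn":    (500, 0, 1),     # single long pulse
    "engine_warning":  (200, 400, 2),   # two widely spaced pulses
}

def play_haptic(event_type, actuate):
    """actuate(on_ms) drives the actuator beneath the contact surface."""
    pulse, pause, repeats = HAPTIC_PATTERNS[event_type]
    for i in range(repeats):
        actuate(pulse)
        if i < repeats - 1:
            time.sleep(pause / 1000)    # gap between pulses

play_haptic("police_siren", lambda ms: print(f"vibrate {ms} ms"))
```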
[0848] The external sound can be detected by one or more sound sensors positioned interiorly and/or exteriorly to the vehicle. Each sensor generates an analog or digital representation of the sound detected over a selected frequency range. The vehicle control system uses template matching to associate the profile of the detected sound with a corresponding matching profile for a type of sound. In other words, the stored sound profile is the template, and the sensed sound profile of, for example, a police siren is matched to that template sound profile.

[0849] Sensed sound template matching can be used to generate an alert or sound in a different frequency range for a user. All or part of the sensed sound profile can be frequency shifted to provide a corresponding output sound profile covering a different frequency range. The matching type of sound can have a unique corresponding output sound profile. The output sound profile can be played to the vehicle occupants over the sound system of the vehicle. The output sound profile may be suited to a user's particular frequency and/or hearing disability such that the output sound profile contains a set of frequencies that the user can hear. The vehicle control system may even adjust an audible signal presented to a user (e.g., statically, dynamically, and/or combinations thereof) to provide a differently shaped or phase-shifted sound profile. The sound profile over certain frequency ranges can be exaggerated in magnitude, and/or the sound profile over certain frequency ranges can be decreased in magnitude or eliminated altogether. The output sound profile is played over the vehicle's sound system in temporal proximity to the detection of the corresponding external sound profile to alert the occupant. In one configuration, the output sound profile is played over the spatially dispersed speakers in the passenger compartment with sound delays (or phase shifts) and/or magnitude levels at each speaker simulating the sound delays and/or magnitude levels of the various portions of the sound sensed by the various sound detectors positioned inside and/or outside of the vehicle. This can indicate to the user spatially where the origin or source of the sound is located relative to the vehicle.
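The template matching and frequency shifting above can be sketched as follows: normalized correlation selects the best-matching stored profile, and the FFT bins of the sensed signal are slid upward so the output lands in a range the occupant can hear. The template contents, sample rate, and shift amount are assumptions for illustration only.

```python
import numpy as np

def match_template(sensed_spectrum, templates):
    """Return the name of the stored template with the highest normalized
    correlation to the sensed spectrum (all arrays of equal length)."""
    def ncc(a, b):
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float(np.mean(a * b))
    return max(templates, key=lambda name: ncc(sensed_spectrum, templates[name]))

def frequency_shift(signal, shift_hz, sample_rate):
    """Shift all frequency content of a real signal upward by shift_hz."""
    spectrum = np.fft.rfft(signal)
    bins = int(round(shift_hz * len(signal) / sample_rate))
    shifted = np.zeros_like(spectrum)
    shifted[bins:] = spectrum[:len(spectrum) - bins]   # slide content up
    return np.fft.irfft(shifted, n=len(signal))

# Example: shift a 700 Hz siren-like tone up by 300 Hz for an occupant who
# hears higher frequencies better.
sr = 8000
t = np.linspace(0, 1, sr, endpoint=False)
siren = np.sin(2 * np.pi * 700 * t)
shifted = frequency_shift(siren, 300, sr)
print(np.argmax(np.abs(np.fft.rfft(shifted))))  # ~1000: tone moved to 1000 Hz
```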
[0850] Alternative to or in addition to playing the output sound profile, visual signals and/or alerts to a user (e.g., flashing lights, increasing intensity associated with a visual signal, animated icons, etc.) can be provided visually to the user. The color, brightness, duration, and/or intensity of the light emitted and/or appearance of the animated icon can vary by the matching type of sound profile such that each type of sound profile has a unique set of visual signals and/or alerts to the user.
[0851] The general interfaces of a vehicle, including but not limited to, at least one of the infotainment system, interfaces, graphical user interfaces, and the like, can be configured to display color-blind friendly colors so that the interface is meaningful and usable for a color-blind user. Colors, contrast, brightness, etc. may be configured to provide an optimum user experience based on the type of color-blindness.
[0852] The vehicle control system can adjust or modify settings on manual controls to account for a user impairment or medical condition, age, physical condition or characteristics, and/or driving history. This information can be recorded in a corresponding user profile, obtained over the Internet from a remote source (such as driving history from a law enforcement agency), and/or sensed by observing the user's behavior over time. For instance, the user may not respond quickly to a vehicle-sensed obstacle in the path of the vehicle, or may move, shake or jitter the steering wheel slightly during vehicle operation. The user's response time, pedal displacement, force applied to a pedal, steering wheel displacement during straight-line driving, and the like can be monitored by the vehicle control system, and the monitored behavior analyzed and stored in the corresponding user profile. Analysis can be performed by matching the observed behavior against various templates. The closest template is associated with a corresponding set of configuration rules regarding what settings to use for the manual controls of the vehicle. The abilities can be based on driving or responsiveness tests and/or driver profiles maintained by a law enforcement or driver regulating agency.
[0853] Based on impairment, medical condition, age, and/or abilities, for example, the steering wheel, pedals, and/or other systems can be adjusted for play and/or
responsiveness. For example, an individual who suffers from one or more of Parkinson's disease, old age, shaking, etc. may require more "play" in the steering wheel such that any shaking and/or small movements of the individual do not adversely affect the steering sensitivity of the vehicle. Upon detecting the user impairment, the vehicle control system may alter the characteristics of the steering wheel, for example, by reducing the sensitivity associated with movement. In one case, the sensitivity may be changed by modifying the electrical output in response to receiving a certain mechanical angular movement of the steering wheel. For instance, an angular movement of 10 degrees at the steering wheel may be the minimum default angular movement to cause the wheels on a vehicle to begin to turn. Upon detecting a user's disability, this minimum default angular movement may be increased to 30 degrees, as an example. Therefore, small movements (e.g., under 30 degrees) will not affect the steering. The changes to be implemented are typically based on rule sets in the system data 208 which map a set of user-related descriptors to
recommended or required changes. The descriptors typically describe one or more of user impairment, medical condition, age, and/or abilities.
[0854] As can be appreciated, the sensitivity associated with a steering wheel and/or other manual vehicle control systems may be increased based on a user's abilities. Using the example provided above, an angular movement of 10 degrees at the steering wheel may be the minimum default angular movement to cause the wheels on a vehicle to begin to turn. Upon detecting a user's ability (e.g., quick reactions, steady hands, past driving history, past vehicle history, etc.), this minimum default angular movement may be decreased to 5 degrees, as an example. Therefore, smaller movements (e.g., of 5 degrees applied at the wheel) will cause the vehicle to begin turning (similar to racing vehicles and/or high-end cars, etc.).
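The dead-band adjustment in the two preceding paragraphs can be sketched as follows. The descriptor names are assumptions; the 10/30/5-degree values mirror the examples given above.

```python
DEADBAND_RULES = {
    "tremor":        30.0,   # e.g., Parkinson's disease: ignore movements under 30 degrees
    "default":       10.0,   # factory minimum angular movement
    "high_ability":   5.0,   # quick reactions, steady hands, clean driving history
}

def steering_output(wheel_angle_deg, descriptor="default"):
    """Return the commanded road-wheel response for a steering-wheel angle,
    applying the dead band mapped to the user-related descriptor."""
    deadband = DEADBAND_RULES.get(descriptor, DEADBAND_RULES["default"])
    if abs(wheel_angle_deg) < deadband:
        return 0.0                       # small movements do not affect steering
    # Pass through the remaining angle beyond the dead band, preserving sign.
    sign = 1.0 if wheel_angle_deg > 0 else -1.0
    return sign * (abs(wheel_angle_deg) - deadband)

print(steering_output(20.0, "tremor"))        # 0.0: shaking is filtered out
print(steering_output(20.0, "high_ability"))  # 15.0: more responsive steering
```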
[0855] In other examples, the movement range of a pedal (brake, emergency brake, accelerator, and/or clutch) can be adjusted; that is, the vehicle control system can automatically adjust one or more of pedal stroke length, sensitivity, etc., based at least partially on user profile data and abilities.
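The adjustments of paragraphs [0853]-[0855] can be pictured as a configurable dead band on the steering output and a configurable stroke length on a pedal. The following sketch is illustrative only; the 540-degree mechanical limit and the function names are assumptions:

    def steering_output(angle_deg: float, deadband_deg: float) -> float:
        """Map a mechanical steering-wheel angle to an electrical turn command.
        Movements inside the dead band (e.g., 30 degrees for a user with tremor,
        5 degrees for a highly able driver) produce no output."""
        if abs(angle_deg) < deadband_deg:
            return 0.0
        sign = 1.0 if angle_deg > 0 else -1.0
        full_lock_deg = 540.0  # hypothetical mechanical limit
        # Scale the remaining travel so full lock still yields full output.
        return sign * (abs(angle_deg) - deadband_deg) / (full_lock_deg - deadband_deg)

    def pedal_output(displacement_mm: float, stroke_mm: float) -> float:
        """Analogous pedal adjustment: normalize displacement by the stroke
        length configured from the occupant's user profile."""
        return max(0.0, min(1.0, displacement_mm / stroke_mm))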
[0856] The adjustment based on user profile and abilities can be extended to adjust settings and configurations of automated vehicle response systems. A collision avoidance system is an automobile safety system designed to reduce the severity of an accident. Also known as a precrash system, forward collision warning system, or collision mitigating system, it uses radar and sometimes laser and camera sensors to detect an imminent crash. Once an imminent crash is detected, these systems either provide a warning to the driver or take action autonomously without any driver input (by braking or steering or both). In one approach, the collision avoidance system works in multiple phases. In a first phase, the collision avoidance system provides warning of an impending accident while the hazard warning lights are activated, the side windows and sunroof are closed, and the front seat belts are tensioned. In a second phase, the warning is followed by light braking, strong enough to gain the driver's attention. In a third phase, the collision avoidance system initiates autonomous partial braking. In a fourth phase, the collision avoidance system decelerates the car, followed by automatic deceleration at full braking power roughly half a second before projected impact. In another approach, the collision avoidance system provides a warning to the driver through a Heads Up Display that visually resembles brake lamps. If the driver does not react, the system pre-charges the brakes and increases the brake assist sensitivity to maximize driver braking performance. In another approach, the collision avoidance system comprises radar- and camera-based crash imminent braking, in which radar technology detects a possible crash threat and alerts the driver. If the driver does not appear to react quickly enough or does not react at all, the system intervenes to apply the brakes in an effort to avoid the crash. In another approach, the collision avoidance system has three warning stages. In the first warning stage, the collision avoidance system provides audible and visual warnings to brake. If the warnings are ignored, the collision avoidance system in the second stage tugs on the shoulder portion of the seat belt two to three times as an additional tactile warning to the driver to take action. In the third stage, the collision avoidance system predicts that a collision is unavoidable, takes up the full seat belt slack for more effective seat belt protection, and automatically applies the brakes to lessen the severity of the predicted crash. In another approach, the collision avoidance system highlights pedestrians in front of the vehicle at night by alerting the driver with an audible chime and visually displaying them via a Heads Up Display. In another approach, the collision avoidance system uses electronic stability control sensors to measure steering angle, vehicle yaw, and lateral acceleration of the vehicle, and brake assist (BAS) sensors to detect emergency braking. The collision avoidance system can tighten the seat belts, adjust seat positions including rear seats (if installed), raise folded rear headrests (if installed), and close the sunroof and windows if it detects a possible collision (including rollover). The collision avoidance system can use radar to monitor the traffic situation ahead and provide automatic partial braking if the driver does not react to the warnings and the system detects a severe danger of an accident.
The disclosure can be applied to other automated response systems besides collision avoidance systems.
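The multi-phase operation of the first approach above can be organized as a simple state machine keyed to estimated time to collision. The thresholds below are illustrative assumptions, not values from the disclosure:

    from enum import Enum, auto

    class Phase(Enum):
        MONITOR = auto()          # no hazard detected
        WARN = auto()             # warning; hazard lights, close windows, tension belts
        ATTENTION_BRAKE = auto()  # light braking to gain the driver's attention
        PARTIAL_BRAKE = auto()    # autonomous partial braking
        FULL_BRAKE = auto()       # full braking power shortly before projected impact

    def phase_for(time_to_collision_s: float) -> Phase:
        """Escalate through the phases as the projected impact approaches."""
        if time_to_collision_s > 3.0:
            return Phase.MONITOR
        if time_to_collision_s > 2.0:
            return Phase.WARN
        if time_to_collision_s > 1.2:
            return Phase.ATTENTION_BRAKE
        if time_to_collision_s > 0.5:
            return Phase.PARTIAL_BRAKE
        return Phase.FULL_BRAKE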
[0857] The vehicle can adjust the settings of the automated response systems, such as a collision avoidance system, to reflect a driver's unique abilities and/or impairments. The settings can be, for instance, the time at which a warning of a potential collision is provided to the driver, the duration of the driver response or reaction time interval between when a warning of a collision is provided to the driver and when automated braking is initiated, the braking force or deceleration of automated braking, and the time over which automated braking occurs. For example, a driver with particularly quick responses can have less fault-tolerant triggers than a driver with slow responses; that is, the collision avoidance system for the former driver will trigger automated braking later than the collision avoidance system for the latter driver.
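A sketch of how such per-driver settings might be derived from a user profile; the "response_rating" field and all numeric values are assumptions made for illustration:

    from dataclasses import dataclass

    @dataclass
    class CasSettings:
        warn_at_s: float           # time to collision at which the warning is issued
        reaction_window_s: float   # interval between warning and automated braking
        max_decel_g: float         # deceleration used by automated braking

    def cas_settings_for(profile: dict) -> CasSettings:
        """Quick responders get later, tighter triggers; slow responders get
        earlier warnings and a longer reaction window."""
        rating = profile.get("response_rating", "average")
        if rating == "quick":
            return CasSettings(warn_at_s=2.0, reaction_window_s=0.8, max_decel_g=0.9)
        if rating == "slow":
            return CasSettings(warn_at_s=3.5, reaction_window_s=1.6, max_decel_g=0.8)
        return CasSettings(warn_at_s=2.8, reaction_window_s=1.2, max_decel_g=0.85)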
[0858] An operation of the vehicle control system will be discussed with reference to Fig. 31.
[0859] In step 3100, the vehicle control system senses a stimulus, such as ignition of the vehicle, the vehicle being placed in gear, a door opening or closing to permit a person to enter or exit the vehicle, activation of a graphical user interface, and the like.
[0860] In step 3104, the vehicle control system determines an identity of each occupant and, based on the occupant's user profile, whether or not the occupant has an impairment. The occupant may be identified by any suitable technique, including authentication, electronic address information of a portable communication device associated with the user, image recognition, seat sensor feedback (e.g., weight of occupant), seat and lumbar settings selected by the occupant, and the like.
[0861] When the occupant has an impairment, the vehicle control system determines, in step 3108, occupant seating and/or interface location. The interface can be, for example, a graphical user interface, steering wheel, brake, clutch, or accelerator pedal, gear shift, or other communication interface by which the occupant is provided or provides information to the vehicle control system.
[0862] In step 3112, the vehicle control system configures the vehicle communication interface based on the type and/or severity of the occupant impairment(s), seating position of the occupant, and/or type of interface.
[0863] In step 3116, the vehicle control system selects a next occupant of the vehicle and returns to step 3104.
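The Fig. 31 flow can be summarized in pseudocode. The vehicle-object methods used here (wait_for_stimulus, identify_occupants, locate, configure) are hypothetical names standing in for the operations of steps 3100 through 3116:

    def configure_interfaces(vehicle):
        """Sketch of the Fig. 31 flow; the vehicle API is an assumption."""
        vehicle.wait_for_stimulus()                      # step 3100: ignition, door, GUI
        for occupant in vehicle.identify_occupants():    # step 3104, per occupant
            impairments = occupant.profile.get("impairments")
            if not impairments:
                continue                                 # step 3116: next occupant
            seat, interfaces = vehicle.locate(occupant)  # step 3108
            for iface in interfaces:                     # step 3112
                iface.configure(impairments=impairments, seat=seat,
                                iface_type=iface.kind)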
[0864] An operation of the vehicle control system will be discussed with reference to Fig. 32.
[0865] In step 3200, the vehicle control system senses a stimulus, such as ignition of the vehicle, the vehicle being placed in gear, a door opening or closing to permit a person to enter or exit the vehicle, activation of a graphical user interface, and the like.
[0866] In step 3204, the vehicle control system determines an identity of each occupant and retrieves the corresponding user profile and relevant system data. The occupant may be identified by any suitable technique, including authentication, electronic address information of a portable communication device associated with the user, image recognition, seat sensor feedback (e.g., weight of occupant), seat and lumbar settings selected by the occupant, and the like.
[0867] In decision diamond 3208, the vehicle control system, based on the occupant's user profile and/or relevant system data, determines whether or not the occupant has an ability that deviates sufficiently from a normal range for the subject vehicle
communication interface and/or automated vehicle response system. The normal range is defined for each type of activity and for each type of interface and/or automated control system. As will be appreciated, different types of activities, different types of interfaces, and/or different types of automated control systems have differing normal ranges.
[0868] When the occupant has an ability that deviates sufficiently from a normal range, the vehicle control system retrieves and applies rule sets, in step 3212, to determine which settings of the vehicle communication interface and/or automated vehicle response system require alteration.

[0869] In step 3216, the vehicle control system configures the vehicle communication interface and/or automated vehicle response system based on the rule set application.
[0870] The vehicle control system then returns to step 3200.
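The Fig. 32 flow might likewise be sketched as follows, again with hypothetical method and attribute names standing in for steps 3200 through 3216:

    def adapt_to_abilities(vehicle, rules):
        """Sketch of the Fig. 32 flow; names are assumptions, not the disclosure."""
        while True:
            vehicle.wait_for_stimulus()                      # step 3200
            for occupant in vehicle.identify_occupants():    # step 3204
                profile = occupant.profile
                for target in vehicle.interfaces + vehicle.response_systems:
                    ability = profile.ability_for(target.activity)
                    low, high = target.normal_range          # decision diamond 3208
                    if low <= ability <= high:
                        continue
                    changes = rules.evaluate(profile, target)  # step 3212
                    target.apply(changes)                      # step 3216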
[0871] The exemplary systems and methods of this disclosure have been described in relation to configurable vehicle center or rear center consoles and associated devices. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claims. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific details set forth herein.
[0872] Furthermore, while the exemplary aspects, embodiments, options, and/or configurations illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a Personal Computer (PC), laptop, netbook, smart phone, Personal Digital Assistant (PDA), tablet, etc., or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, gateway, in one or more communications devices, at one or more users' premises, or some combination thereof. Similarly, one or more functional portions of the system could be distributed between a telecommunications device(s) and an associated computing device.
[0873] Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links, for example, can be any suitable carrier for electrical signals, including coaxial cables, copper wire and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[0874] Also, while the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configuration, and aspects.
[0875] A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others.
[0876] It should be appreciated that the various processing modules (e.g., processors, vehicle systems, vehicle subsystems, modules, etc.), for example, can perform, monitor, and/or control critical and non-critical tasks, functions, and operations, such as interaction with and/or monitoring and/or control of critical and non-critical on board sensors and vehicle operations (e.g., engine, transmission, throttle, brake power assist/ brake lock-up, electronic suspension, traction and stability control, parallel parking assistance, occupant protection systems, power steering assistance, self-diagnostics, event data recorders, steer- by-wire and/or brake-by-wire operations, vehicle-to-vehicle interactions, vehicle-to- infrastructure interactions, partial and/or full automation, telematics, navigation/SPS, multimedia systems, audio systems, rear seat entertainment systems, game center or rear center consoles, tuners (SDR), heads-up display, night vision, lane departure warning, adaptive cruise control, adaptive headlights, collision warning, blind spot sensors, park reverse assistance, tire pressure monitoring, traffic signal recognition, vehicle tracking (e.g., LoJack™), dashboard/instrument cluster, lights, seats, climate control, voice recognition, remote keyless entry, security alarm systems, and wiper/window control). Processing modules can be enclosed in an advanced EMI-shielded enclosure containing multiple expansion modules. Processing modules can have a "black box" or flight data recorder technology, containing an event (or driving history) recorder
(containing operational information collected from vehicle on board sensors and provided by nearby or roadside signal transmitters), a crash survivable memory unit, an integrated controller and circuitry board, and network interfaces.
[0877] Critical system controller(s) can control, monitor, and/or operate critical systems. Critical systems may include one or more of (depending on the particular vehicle) monitoring, controlling, operating the ECU, TCU, door settings, window settings, blind spot monitor, monitoring, controlling, operating the safety equipment (e.g., airbag deployment control unit, collision sensor, nearby object sensing system, seat belt control unit, sensors for setting the seat belt, etc.), monitoring and/or controlling certain critical sensors such as the power source controller and energy output sensor, engine temperature, oil pressure sensing, hydraulic pressure sensors, sensors for headlight and other lights (e.g., emergency light, brake light, parking light, fog light, interior or passenger compartment light, and/or tail light state (on or off)), vehicle control system sensors, wireless network sensor (e.g., Wi-Fi and/or Bluetooth sensors, etc.), cellular data sensor, and/or steering/torque sensor, controlling the operation of the engine (e.g., ignition, etc.), head light control unit, power steering, display panel, switch state control unit, power control unit, and/or brake control unit, and/or issuing alerts to a user and/or remote monitoring entity of potential problems with a vehicle operation.
[0878] Non-critical system controller(s) can control, monitor, and/or operate non-critical systems. Non-critical systems may include one or more of (depending on the particular vehicle) monitoring, controlling, operating a non-critical system, emissions control, seating system controller and sensor, infotainment/entertainment system, monitoring certain non-critical sensors such as ambient (outdoor) weather readings (e.g., temperature, precipitation, wind speed, and the like), odometer reading sensor, trip mileage reading sensor, road condition sensors (e.g., wet, icy, etc.), radar transmitter/receiver output, brake wear sensor, oxygen sensor, ambient lighting sensor, vision system sensor, ranging sensor, parking sensor, heating, venting, and air conditioning (HVAC) system and sensor, water sensor, air-fuel ratio meter, hall effect sensor, microphone, radio frequency (RF) sensor, and/or infrared (IR) sensor.
[0879] It is an aspect of the present disclosure that one or more of the non-critical components and/or systems provided herein may become critical components and/or systems, and/or vice versa, depending on a context associated with the vehicle.
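As a hedged illustration of such context-dependent reclassification (the system name, context fields, and threshold below are assumptions):

    def criticality(system, context) -> str:
        """Example: promote the HVAC system to critical when the cabin is
        dangerously hot and a child seat is occupied; otherwise keep the
        system's default classification."""
        if (system.name == "hvac" and context.cabin_temp_c > 40
                and context.child_seat_occupied):
            return "critical"
        return system.default_criticality  # "critical" or "non-critical"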
[0880] Optionally, the systems and methods of this disclosure can be implemented in conjunction with a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device or gate array such as a PLD, PLA, FPGA, or PAL, a special purpose computer, any comparable means, or the like. In general, any device(s) or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the disclosed embodiments, configurations and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices, and output devices. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
[0881] In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or
microcomputer systems being utilized.
[0882] In yet another embodiment, the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer, such as an applet, JAVA® or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
[0883] Although the present disclosure describes components and functions implemented in the aspects, embodiments, and/or configurations with reference to particular standards and protocols, the aspects, embodiments, and/or configurations are not limited to such standards and protocols. Other similar standards and protocols not mentioned herein are in existence and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein, and other similar standards and protocols not mentioned herein, are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.

[0884] The present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, subcombinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure. The present disclosure, in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein or in various aspects, embodiments, and/or configurations hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease, and/or reducing cost of implementation.
[0885] The foregoing discussion has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
[0886] Moreover, though the description has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims

What is claimed is:
1. A method, comprising:
maintaining, by a microprocessor executable profile identification module, a persona of a vehicle occupant; and
based on the persona of the vehicle occupant and vehicle-related information, performing, by a microprocessor executable vehicle control system, an action assisting the vehicle occupant.
2. The method of claim 1, wherein the vehicle-related information comprises at least one of a current and/or future vehicle location and path of vehicle travel.
3. The method of claim 2, wherein the action performed depends on a seating position of the vehicle occupant.
4. The method of claim 1, wherein the persona comprises one or more of bioinformatics, medical information, driving history, personal information, private information, travel information, and Internet browsing history and/or browsed content, wherein vehicle related information comprises one or more of vehicle context, state, external surroundings, location, past, current, and/or intended path of travel, waypoint, and destination, and wherein the action is one or more of making an appointment, making a reservation, purchasing an item on line, adding a waypoint or destination to path of travel on a navigation system of the vehicle, adding an entry into the occupant's electronic calendar, changing a destination or path of travel on the navigation system, warning the occupant, notifying the occupant, and sending a message to a person at a waypoint or destination of the vehicle regarding an arrival time.
5. The method of claim 1, further comprising:
determining, by the vehicle control system, relevant information other than the persona of the vehicle occupant and the vehicle-related information, the relevant information comprising one or more of a persona of a selected person not currently in the vehicle, a message from a friend or family member of the vehicle occupant, a current activity of the friend and/or family member, a location, hours of operation, and/or descriptive information about a point and/or location of interest near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a vehicle service facility near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a hotel and/or motel near the vehicle and/or the vehicle's path of travel, a current location of the friend or family member near the vehicle and/or the vehicle's path of travel, and a road condition along a path of travel of the vehicle.
6. The method of claim 4, wherein the vehicle obtains the persona from one or more of a different vehicle driven by the occupant and a home computer and wherein the vehicle synchronizes with the different vehicle and/or home computer when the vehicle is parked in proximity thereto and an ignition of the vehicle is turned off.
7. The method of claim 1, wherein the maintaining of the persona comprises collecting information from one or more vehicle sensors and/or from a remote information source and wherein a type of information collected depends on a seating position of the occupant.
8. A vehicle, comprising:
a microprocessor executable profile identification module operable to maintain a persona of a vehicle occupant; and
a microprocessor executable vehicle control system operable, based on the persona of the vehicle occupant and vehicle-related information, to perform an action assisting the vehicle occupant.
9. The vehicle of claim 8, wherein the vehicle-related information comprises at least one of a current and/or future vehicle location and path of vehicle travel.
10. The vehicle of claim 9, wherein the action performed depends on a seating position of the vehicle occupant.
11. The vehicle of claim 8, wherein the persona comprises one or more of bioinformatics, medical information, driving history, personal information, private information, travel information, and Internet browsing history and/or browsed content, wherein vehicle related information comprises one or more of vehicle context, state, external surroundings, location, past, current, and/or intended path of travel, waypoint, and destination, and wherein the action is one or more of making an appointment, making a reservation, purchasing an item on line, adding a waypoint or destination to path of travel on a navigation system of the vehicle, adding an entry into the occupant's electronic calendar, changing a destination or path of travel on the navigation system, warning the occupant, notifying the occupant, and sending a message to a person at a waypoint or destination of the vehicle regarding an arrival time.
12. The vehicle of claim 8, wherein the vehicle control system is further operable to determine relevant information other than the persona of the vehicle occupant and the vehicle-related information, the relevant information comprising one or more of a persona of a selected person not currently in the vehicle, a message from a friend or family member of the vehicle occupant, a current activity of the friend and/or family member, a location, hours of operation, and/or descriptive information about a point and/or location of interest near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a vehicle service facility near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a hotel and/or motel near the vehicle and/or the vehicle's path of travel, a current location of the friend or family member near the vehicle and/or the vehicle's path of travel, and a road condition along a path of travel of the vehicle.
13. The vehicle of claim 11, wherein the vehicle obtains the persona from one or more of a different vehicle driven by the occupant and a home computer and wherein the vehicle synchronizes with the different vehicle and/or home computer when the vehicle is parked in proximity thereto and an ignition of the vehicle is turned off.
14. The vehicle of claim 8, wherein maintaining the persona comprises collecting information from one or more vehicle sensors and/or from a remote information source and wherein a type of information collected depends on a seating position of the occupant.
15. A tangible and non-transient computer readable medium comprising microprocessor executable instructions that, when executed, perform functions
comprising:
maintaining a persona of a vehicle occupant; and
based on the persona of the vehicle occupant and vehicle-related information, performing an action assisting the vehicle occupant.
16. The computer readable medium of claim 15, wherein the vehicle-related information comprises at least one of a current and/or future vehicle location and path of vehicle travel.
17. The computer readable medium of claim 16, wherein the action performed depends on a seating position of the vehicle occupant.
18. The computer readable medium of claim 15, wherein the persona comprises one or more of bioinformatics, medical information, driving history, personal information, private information, travel information, and Internet browsing history and/or browsed content, wherein vehicle related information comprises one or more of vehicle context, state, external surroundings, location, past, current, and/or intended path of travel, waypoint, and destination, and wherein the action is one or more of making an appointment, making a reservation, purchasing an item on line, adding a waypoint or destination to path of travel on a navigation system of the vehicle, adding an entry into the occupant's electronic calendar, changing a destination or path of travel on the navigation system, warning the occupant, notifying the occupant, and sending a message to a person at a waypoint or destination of the vehicle regarding an arrival time.
19. The computer readable medium of claim 15, further comprising the function of determining relevant information other than the persona of the vehicle occupant and the vehicle-related information, the relevant information comprising one or more of a persona of a selected person not currently in the vehicle, a message from a friend or family member of the vehicle occupant, a current activity of the friend and/or family member, a location, hours of operation, and/or descriptive information about a point and/or location of interest near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a vehicle service facility near the vehicle and/or the vehicle's path of travel, a location, hours of operation, and/or descriptive information about a hotel and/or motel near the vehicle and/or the vehicle's path of travel, a current location of the friend or family member near the vehicle and/or the vehicle's path of travel, and a road condition along a path of travel of the vehicle.
20. The computer readable medium of claim 18, wherein the vehicle obtains the persona from one or more of a different vehicle driven by the occupant and a home computer and wherein the vehicle synchronizes with the different vehicle and/or home computer when the vehicle is parked in proximity thereto and an ignition of the vehicle is turned off.
21. The computer readable medium of claim 15, wherein the maintaining of the persona comprises collecting information from one or more vehicle sensors and/or from a remote information source and wherein a type of information collected depends on a seating position of the occupant.
22. The computer readable medium of claim 21, wherein the maintaining of the persona comprises collecting information from one or more vehicle sensors and/or from a remote information source and wherein a type of information collected depends on one or more of an identity of the occupant, an age of the occupant, and an association of the occupant with the vehicle.
23. The computer readable medium of claim 21, wherein the persona is defined in a format that can be processed by vehicles of different manufacturers.
24. A vehicle, comprising:
a plurality of blade processors, each blade processor performing a function that is at least one of a vehicle task, function, or operation and an infotainment task, function, or operation;
wherein:
the plurality of blade processors have a backplane;
the backplane has a first communication zone defining a trusted network within the vehicle to connect with trusted computational devices and/or modules provided or certified by the vehicle manufacturer but not untrusted computational devices and/or modules provided by vehicle occupants;
the backplane has a second communication zone defining an untrusted network to connect with the untrusted computational devices and/or modules; and
the backplane has a third communication zone providing power and data transmission to the plurality of blade processors.
25. The vehicle of claim 24, wherein the first and second communication zones are connected logically on opposing sides of a firewall blade processor.
26. The vehicle of claim 24, wherein a master blade processor enables a connection to a computational device connected to the first communication zone upon verification that the connected computational device is certified by a vehicle manufacturer.
27. The vehicle of claim 24, wherein the first and second communication zones are configured as separate Ethernet switches and wherein the first and second
communication zones are not in signal communication with one another.
28. The vehicle of claim 24, wherein a firewall has a dedicated slot that bridges the first and second communication zones and uses the third communication zone for power connections.
29. The vehicle of claim 24, wherein an output of each of the plurality of blade processors is an IP message framed into an Ethernet packet, wherein a sensor and/or controller of the vehicle communicates by a Car Area Network ("CAN") protocol, and further comprising an Ethernet bus controller terminating a CAN bus to the sensor and/or controller and a CAN Ethernet controller subsystem to translate an Ethernet message from a blade processor to a CAN protocol-based message.
30. The vehicle of claim 24, wherein the plurality of blade processors are configured as a crate having a backplane connector and on-backplane Ethernet interfaces enabling the blade processors to communicate with one another using Ethernet messages.
31. The vehicle of claim 24, wherein the plurality of blade processors comprises a master blade processor operable to assign a component and/or module requiring a blade processor for execution to a selected blade processor.
32. A vehicle, comprising:
a plurality of blade processors, each blade processor performing a function that is at least one of a vehicle task, function, or operation and an infotainment task, function, or operation;
wherein the plurality of blade processors comprises a master blade processor operable to assign a component and/or module requiring a blade processor for execution to a selected blade processor.
33. The vehicle of claim 32, wherein the master blade processor inventories hardware and software capabilities of components and/or modules in communication with the master blade processor to provide an application resource table comprising component and/or module identity, functional description, and/or computational resource
requirements and/or capabilities needed for execution of the component and/or module.
34. The vehicle of claim 33, wherein a blade processor is assigned to execute the component and/or module when the blade processor satisfies the corresponding computational resource requirements and/or capabilities.
35. The vehicle of claim 33, wherein the master blade processor assigns a network address to each blade processor and communication device in communication with the master blade processor.
36. The vehicle of claim 32, wherein the master blade processor assigns a component and/or module to a selected blade processor when the selected blade processor is available or, though unavailable, is executing a component and/or module having a lower priority than the component and/or module to be assigned to the selected blade processor.
37. The vehicle of claim 32, wherein an output of each of the plurality of blade processors is an IP message framed into an Ethernet packet, wherein a sensor and/or controller of the vehicle communicates by a Car Area Network ("CAN") protocol, and further comprising an Ethernet bus controller terminating a CAN bus to the sensor and/or controller and a CAN Ethernet controller subsystem to translate an Ethernet message from a blade processor to a CAN protocol-based message.
38. The vehicle of claim 32, wherein the plurality of blade processors are configured as a crate having a backplane connector and on-backplane Ethernet interface enabling the blade processors to communicate with one another using Ethernet messages.
39. The vehicle of claim 32, wherein:
the plurality of blade processors have a backplane;
the backplane has a first communication zone defining a trusted network within the vehicle to connect with trusted computational devices and/or modules provided or certified by the vehicle manufacturer but not untrusted computational devices and/or modules provided by vehicle occupants;
the backplane has a second communication zone defining an untrusted network to connect with the untrusted computational devices and/or modules; and
the backplane has a third communication zone providing power and data transmission to the plurality of blade processors.
PCT/US2014/034194 2012-03-14 2014-04-15 Intelligent vehicle for assisting vehicle occupants and incorporating vehicle crate for blade processors WO2014172369A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/US2014/034194 WO2014172369A2 (en) 2013-04-15 2014-04-15 Intelligent vehicle for assisting vehicle occupants and incorporating vehicle crate for blade processors
US14/253,506 US9082239B2 (en) 2012-03-14 2014-04-15 Intelligent vehicle for assisting vehicle occupants

Applications Claiming Priority (15)

Application Number Priority Date Filing Date Title
US201361811981P 2013-04-15 2013-04-15
US61/811,981 2013-04-15
US201361865954P 2013-08-14 2013-08-14
US61/865,954 2013-08-14
US201361870698P 2013-08-27 2013-08-27
US61/870,698 2013-08-27
US201361891217P 2013-10-15 2013-10-15
US61/891,217 2013-10-15
US201361904205P 2013-11-14 2013-11-14
US61/904,205 2013-11-14
US201461924572P 2014-01-07 2014-01-07
US61/924,572 2014-01-07
US201461926749P 2014-01-13 2014-01-13
US61/926,749 2014-01-13
PCT/US2014/034194 WO2014172369A2 (en) 2013-04-15 2014-04-15 Intelligent vehicle for assisting vehicle occupants and incorporating vehicle crate for blade processors

Publications (2)

Publication Number Publication Date
WO2014172369A2 true WO2014172369A2 (en) 2014-10-23
WO2014172369A3 WO2014172369A3 (en) 2015-02-26

Family

ID=55070571

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/034194 WO2014172369A2 (en) 2012-03-14 2014-04-15 Intelligent vehicle for assisting vehicle occupants and incorporating vehicle crate for blade processors

Country Status (2)

Country Link
US (1) US9082239B2 (en)
WO (1) WO2014172369A2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016156448A1 (en) * 2015-04-02 2016-10-06 Continental Automotive Gmbh Computer system for a vehicle
WO2017200568A1 (en) * 2016-05-17 2017-11-23 Google Llc Application execution while operating vehicle
GB2555088A (en) * 2016-10-11 2018-04-25 Jaguar Land Rover Ltd Interface apparatus and method
EP3422265A1 (en) * 2017-06-30 2019-01-02 Seat, S.A. Method for generating at least one notification to a user in a shared vehicle
US11713600B1 (en) * 2019-09-30 2023-08-01 Ambarella International Lp Child safety lock
US11772603B2 (en) 2021-05-18 2023-10-03 Motional Ad Llc Passenger authentication and entry for autonomous vehicles

Families Citing this family (415)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10915296B2 (en) 2000-11-01 2021-02-09 Flexiworld Technologies, Inc. Information apparatus that includes a touch sensitive screen interface for managing or replying to e-mails
US11204729B2 (en) 2000-11-01 2021-12-21 Flexiworld Technologies, Inc. Internet based digital content services for pervasively providing protected digital content to smart devices based on having subscribed to the digital content service
US20020059415A1 (en) 2000-11-01 2002-05-16 Chang William Ho Manager for device-to-device pervasive digital output
US10860290B2 (en) 2000-11-01 2020-12-08 Flexiworld Technologies, Inc. Mobile information apparatuses that include a digital camera, a touch sensitive screen interface, support for voice activated commands, and a wireless communication chip or chipset supporting IEEE 802.11
WO2002041107A2 (en) 2000-11-20 2002-05-23 Flexiworld Technologies, Inc. Systems and methods for mobile and pervasive output
US20020097419A1 (en) 2001-01-19 2002-07-25 Chang William Ho Information apparatus for universal data output
US7873610B2 (en) 2006-05-26 2011-01-18 Andrew S Poulsen Meta-configuration of profiles
US9111538B2 (en) * 2009-09-30 2015-08-18 T-Mobile Usa, Inc. Genius button secondary commands
US10779761B2 (en) * 2010-06-07 2020-09-22 Affectiva, Inc. Sporadic collection of affect data within a vehicle
US8812021B2 (en) 2011-12-02 2014-08-19 Yellowpages.Com, Llc System and method for coordinating meetings between users of a mobile communication network
US9412273B2 (en) 2012-03-14 2016-08-09 Autoconnect Holdings Llc Radar sensing and emergency response vehicle detection
US9378601B2 (en) 2012-03-14 2016-06-28 Autoconnect Holdings Llc Providing home automation information via communication with a vehicle
US9384609B2 (en) 2012-03-14 2016-07-05 Autoconnect Holdings Llc Vehicle to vehicle safety and traffic communications
WO2014172327A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc Synchronization between vehicle and user device calendar
WO2014172380A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc Altered map routes based on user profile information
US10469506B2 (en) * 2012-07-19 2019-11-05 Tecore, Inc. Systems and methods for communication control in a restricted environment
US9424696B2 (en) * 2012-10-04 2016-08-23 Zonar Systems, Inc. Virtual trainer for in vehicle driver coaching and to collect metrics to improve driver performance
US9088613B2 (en) * 2012-11-13 2015-07-21 Gogo Llc Ground system for vehicle data distribution
JP6032176B2 (en) * 2012-12-19 2016-11-24 株式会社デンソー Display control device for vehicle
US9434236B2 (en) * 2013-03-13 2016-09-06 Deere & Company Work vehicle HVAC control based on operator seat direction
US9479601B2 (en) * 2013-03-15 2016-10-25 Ford Global Technologies, Llc Method and apparatus for seamless application portability over multiple environments
US20140309790A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Adjusting home automation comfort based on vehicle associated user profile data
EP2987153A4 (en) 2013-04-15 2016-09-21 Autoconnect Holdings Llc Central network for automated control of vehicular traffic
US9402174B2 (en) * 2013-05-06 2016-07-26 Verizon Telematics Inc. Vehicle driver identification using portable device
US9410818B2 (en) * 2013-05-22 2016-08-09 Mitsubishi Electric Corporation Navigation device
US9261881B1 (en) * 2013-08-01 2016-02-16 Google Inc. Filtering noisy/high-intensity regions in laser-based lane marker detection
US9264862B2 (en) * 2013-08-15 2016-02-16 Apple Inc. Determining exit from a vehicle
JP6171761B2 (en) * 2013-09-10 2017-08-02 株式会社デンソー Facility use management system, in-vehicle control device, and facility side device
US9807349B1 (en) * 2013-09-27 2017-10-31 Isaac S. Daniel Covert recording alarm apparatus for vehicles
US9807172B2 (en) * 2013-10-18 2017-10-31 At&T Intellectual Property I, L.P. Mobile device intermediary for vehicle adaptation
US9203843B2 (en) 2013-11-08 2015-12-01 At&T Mobility Ii Llc Mobile device enabled tiered data exchange via a vehicle
DE102013223439A1 (en) * 2013-11-18 2015-05-21 Robert Bosch Gmbh Method for issuing an emergency call in a vehicle
US10088844B2 (en) * 2013-11-22 2018-10-02 Ford Global Technologies, Llc Wearable computer in an autonomous vehicle
US9499139B2 (en) 2013-12-05 2016-11-22 Magna Electronics Inc. Vehicle monitoring system
US10585486B2 (en) 2014-01-03 2020-03-10 Harman International Industries, Incorporated Gesture interactive wearable spatial audio system
US9578307B2 (en) 2014-01-14 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10360907B2 (en) 2014-01-14 2019-07-23 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9915545B2 (en) 2014-01-14 2018-03-13 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9629774B2 (en) 2014-01-14 2017-04-25 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10024679B2 (en) 2014-01-14 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10248856B2 (en) 2014-01-14 2019-04-02 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US8843317B1 (en) * 2014-01-15 2014-09-23 Open Invention Network, Llc Transport communication pairing
US9584470B2 (en) * 2014-02-07 2017-02-28 General Motors Llc Dynamic DHCP for Wi-Fi connectivity in a vehicle
US9860708B2 (en) 2014-02-25 2018-01-02 International Business Machines Corporation Vehicle occupant position determination
CA2937972C (en) * 2014-03-06 2022-04-12 David Burton Mobile data management system
US9476730B2 (en) * 2014-03-18 2016-10-25 Sri International Real-time system for multi-modal 3D geospatial mapping, object recognition, scene annotation and analytics
US9884586B2 (en) 2014-03-28 2018-02-06 Pioneer Corporation Vehicle lighting device
WO2015166811A1 (en) * 2014-04-30 2015-11-05 みこらった株式会社 Automatic driving vehicle and program for automatic driving vehicle
CN106716308B (en) * 2014-06-17 2020-02-11 谷歌有限责任公司 Input method editor for inputting geographical location names
US10217134B2 (en) * 2014-06-24 2019-02-26 Google Llc Detour based content selections
US10024667B2 (en) 2014-08-01 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable earpiece for providing social and environmental awareness
US9729636B2 (en) * 2014-08-01 2017-08-08 Magna Electronics Inc. Smart road system for vehicles
US9473421B1 (en) * 2014-08-06 2016-10-18 Fourth Cloud, Inc. Modular computer system and method
US10152698B2 (en) * 2014-08-26 2018-12-11 Samsung Electronics Co., Ltd. System and method for mobile payments in a vehicle
US10024678B2 (en) 2014-09-17 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable clip for providing social and environmental awareness
US9922236B2 (en) 2014-09-17 2018-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable eyeglasses for providing social and environmental awareness
US10619874B2 (en) * 2014-10-23 2020-04-14 Trane International Inc. Apparatuses, methods and systems for configuring electronically programmable HVAC system
CN106161744B (en) * 2014-10-23 2019-06-14 Lg电子株式会社 Mobile terminal and its control method
JP6394554B2 (en) * 2014-10-31 2018-09-26 株式会社デンソー Driving assistance device
DE102014223004A1 (en) * 2014-11-11 2016-05-12 Robert Bosch Gmbh Method and device for determining whether a fault condition exists in a motor vehicle or not
WO2016075774A1 (en) * 2014-11-12 2016-05-19 三菱電機株式会社 Display control device and information display device
US10137830B2 (en) * 2014-12-02 2018-11-27 Lenovo (Singapore) Pte. Ltd. Self-adjusting lighting based on viewing location
WO2016090282A1 (en) * 2014-12-05 2016-06-09 Cowbyt Technologies Llc Autonomous navigation system
DE102014018460A1 (en) * 2014-12-11 2016-06-16 Audi Ag Method for controlling the operation of at least one functional component of a motor vehicle and motor vehicle
US9978145B2 (en) * 2014-12-16 2018-05-22 Koninklijke Philips N.V. Assessment of an attentional deficit
US9463805B2 (en) * 2014-12-17 2016-10-11 Honda Motor Co., Ltd. System and method for dynamic vehicle control affecting sleep states of vehicle occupants
US9573600B2 (en) * 2014-12-19 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Method and apparatus for generating and using driver specific vehicle controls
US10032369B2 (en) 2015-01-15 2018-07-24 Magna Electronics Inc. Vehicle vision system with traffic monitoring and alert
US9832241B1 (en) 2015-01-20 2017-11-28 State Farm Mutual Automobile Insurance Company Broadcasting telematics data to nearby mobile devices, vehicles, and infrastructure
US9576460B2 (en) 2015-01-21 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable smart device for hazard detection and warning based on image and audio data
JP6470059B2 (en) * 2015-01-30 2019-02-13 株式会社デンソーテン Information providing method, information providing apparatus, information providing system, and vehicle
US9568920B2 (en) * 2015-02-04 2017-02-14 GM Global Technology Operations LLC Automatic mitigation of vehicle wind buffeting
US10490102B2 (en) 2015-02-10 2019-11-26 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for braille assistance
US10104699B2 (en) * 2015-02-10 2018-10-16 Sphero, Inc. Signal strength representation and automatic connection and control upon a self-propelled device
US9613515B2 (en) * 2015-02-27 2017-04-04 Ford Global Technologies, Llc Method and apparatus for occupant customized wellness monitoring
US10394232B2 (en) * 2015-02-27 2019-08-27 Research Frontiers Incorporated Control system for SPD device and home automation
US9586318B2 (en) 2015-02-27 2017-03-07 Toyota Motor Engineering & Manufacturing North America, Inc. Modular robot with smart device
US10457113B2 (en) * 2015-02-27 2019-10-29 Braeburn Inc. Vehicle extreme temperature safety systems and methods
US9747740B2 (en) 2015-03-02 2017-08-29 Ford Global Technologies, Llc Simultaneous button press secure keypad code entry
US20160257198A1 (en) 2015-03-02 2016-09-08 Ford Global Technologies, Inc. In-vehicle component user interface
EP3269158A1 (en) * 2015-03-09 2018-01-17 AutoConnect Holdings LLC Vehicle and occupant application integration
US9677901B2 (en) * 2015-03-10 2017-06-13 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing navigation instructions at optimal times
US9811752B2 (en) 2015-03-10 2017-11-07 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable smart device and method for redundant object identification
DE102015204282A1 (en) * 2015-03-10 2016-09-15 Robert Bosch Gmbh Method for operating a motor vehicle, control device and computer program product
US9972216B2 (en) 2015-03-20 2018-05-15 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for storing and playback of information for blind users
US10395555B2 (en) 2015-03-30 2019-08-27 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing optimal braille output based on spoken and sign language
US9918346B2 (en) * 2015-04-17 2018-03-13 Barracuda Networks, Inc. System for connecting, securing and managing network devices with a dedicated private virtual network
JP6685008B2 (en) * 2015-04-21 2020-04-22 パナソニックIpマネジメント株式会社 Driving support method and driving support device, driving control device, vehicle, driving support program using the same
DE102015208053A1 (en) * 2015-04-30 2016-11-03 Robert Bosch Gmbh Method and device for reducing the risk to and / or from a vehicle located in a parking space
US9540015B2 (en) 2015-05-04 2017-01-10 At&T Intellectual Property I, L.P. Methods and apparatus to alter a vehicle operation
US20160350882A1 (en) * 2015-05-27 2016-12-01 Uber Technologies, Inc. Communicating information to devices based on a characteristic of a service provider
US20160353880A1 (en) * 2015-06-03 2016-12-08 Tome, Inc. System And Method For Sensing Obstructions Of Sensors Used With An Adjustable Height Work Desk
US9922549B2 (en) 2015-06-10 2018-03-20 Vivint, Inc. Vehicle integration with security and/or automation systems
JP6477281B2 (en) 2015-06-17 2019-03-06 株式会社オートネットワーク技術研究所 In-vehicle relay device, in-vehicle communication system, and relay program
JP6771272B2 (en) * 2015-07-01 2020-10-21 日立オートモティブシステムズ株式会社 In-vehicle electronic control device and stack usage
KR102400899B1 (en) * 2015-07-15 2022-05-23 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9869560B2 (en) 2015-07-31 2018-01-16 International Business Machines Corporation Self-driving vehicle's response to a proximate emergency vehicle
KR102326057B1 (en) * 2015-08-03 2021-11-12 현대모비스 주식회사 Parking assistance apparatus and control method for the same
US9898039B2 (en) 2015-08-03 2018-02-20 Toyota Motor Engineering & Manufacturing North America, Inc. Modular smart necklace
US9483948B1 (en) 2015-08-07 2016-11-01 International Business Machines Corporation Automated control of interactions between self-driving vehicles and pedestrians
US9785145B2 (en) 2015-08-07 2017-10-10 International Business Machines Corporation Controlling driving modes of self-driving vehicles
US9721397B2 (en) 2015-08-11 2017-08-01 International Business Machines Corporation Automatic toll booth interaction with self-driving vehicles
DE102015215546A1 (en) * 2015-08-14 2017-02-16 Robert Bosch Gmbh Method and device for determining whether a fault condition exists in a motor vehicle or not
US9718471B2 (en) 2015-08-18 2017-08-01 International Business Machines Corporation Automated spatial separation of self-driving vehicles from manually operated vehicles
US9481366B1 (en) 2015-08-19 2016-11-01 International Business Machines Corporation Automated control of interactions between self-driving vehicles and animals
EP3133545A1 (en) * 2015-08-20 2017-02-22 Samsung Electronics Co., Ltd. Apparatus and operating method for mobile payments in a vehicle
US9896100B2 (en) 2015-08-24 2018-02-20 International Business Machines Corporation Automated spatial separation of self-driving vehicles from other vehicles based on occupant preferences
US10272921B2 (en) * 2015-08-25 2019-04-30 International Business Machines Corporation Enriched connected car analysis services
US20170061812A1 (en) * 2015-09-01 2017-03-02 Karz Software Technologies Ltd. Driver monitoring and feedback system
US9914418B2 (en) 2015-09-01 2018-03-13 Ford Global Technologies, Llc In-vehicle control location
US9967717B2 (en) 2015-09-01 2018-05-08 Ford Global Technologies, Llc Efficient tracking of personal device locations
US9622159B2 (en) 2015-09-01 2017-04-11 Ford Global Technologies, Llc Plug-and-play interactive vehicle interior component architecture
US9731726B2 (en) * 2015-09-02 2017-08-15 International Business Machines Corporation Redirecting self-driving vehicles to a product provider based on physiological states of occupants of the self-driving vehicles
US9860710B2 (en) 2015-09-08 2018-01-02 Ford Global Technologies, Llc Symmetrical reference personal device location tracking
US9744852B2 (en) 2015-09-10 2017-08-29 Ford Global Technologies, Llc Integration of add-on interior modules into driver user interface
DE102015217275A1 (en) * 2015-09-10 2017-03-16 Robert Bosch Gmbh Method and apparatus for determining whether a motor vehicle is currently being manually or automatically guided
KR102374735B1 (en) * 2015-09-14 2022-03-15 주식회사 만도모빌리티솔루션즈 Driving assistant apparatus and driving assistant method
US9513632B1 (en) * 2015-09-16 2016-12-06 International Business Machines Corporation Driving mode alerts from self-driving vehicles
US9566986B1 (en) 2015-09-25 2017-02-14 International Business Machines Corporation Controlling driving modes of self-driving vehicles
US9604541B1 (en) * 2015-10-06 2017-03-28 Samsung Electronics Co., Ltd. System and method for customizing a vehicle operating environment
US11540088B2 (en) * 2015-10-08 2022-12-27 Voxx International Corporation System and method for locating a portable device in different zones relative to a vehicle and with device zone indicators
US9481367B1 (en) 2015-10-14 2016-11-01 International Business Machines Corporation Automated control of interactions between self-driving vehicles and animals
US9834224B2 (en) 2015-10-15 2017-12-05 International Business Machines Corporation Controlling driving modes of self-driving vehicles
US10144374B2 (en) * 2015-10-19 2018-12-04 Toyota Motor Engineering & Manufacturing North America, Inc. Automatic settings for vehicle components based on measured anthropometric features
US10057739B2 (en) * 2015-10-21 2018-08-21 International Business Machines Corporation Distributed and localized policy and charging control in cellular networks to enable route flexibility
US9944291B2 (en) 2015-10-27 2018-04-17 International Business Machines Corporation Controlling driving modes of self-driving vehicles
US9751532B2 (en) 2015-10-27 2017-09-05 International Business Machines Corporation Controlling spacing of self-driving vehicles based on social network relationships
US10607293B2 (en) 2015-10-30 2020-03-31 International Business Machines Corporation Automated insurance toggling for self-driving vehicles
US10633005B2 (en) * 2015-11-03 2020-04-28 Ford Global Technologies, Llc Wearable device configuration using vehicle and cloud event data
US9494940B1 (en) * 2015-11-04 2016-11-15 Zoox, Inc. Quadrant configuration of robotic vehicles
US9517767B1 (en) 2015-11-04 2016-12-13 Zoox, Inc. Internal safety systems for robotic vehicles
KR101724944B1 (en) * 2015-11-06 2017-04-10 Hyundai Motor Company Airbag system of vehicle
US10176525B2 (en) 2015-11-09 2019-01-08 International Business Machines Corporation Dynamically adjusting insurance policy parameters for a self-driving vehicle
US9791861B2 (en) 2015-11-12 2017-10-17 International Business Machines Corporation Autonomously servicing self-driving vehicles
US10692126B2 (en) 2015-11-17 2020-06-23 Nio Usa, Inc. Network-based system for selling and servicing cars
US10332418B2 (en) * 2015-11-23 2019-06-25 International Business Machines Corporation Personalized vitamin supplement
US10144419B2 (en) 2015-11-23 2018-12-04 Magna Electronics Inc. Vehicle dynamic control system for emergency handling
US20170153636A1 (en) * 2015-11-27 2017-06-01 Bragi GmbH Vehicle with wearable integration or communication
US9944295B2 (en) 2015-11-27 2018-04-17 Bragi GmbH Vehicle with wearable for identifying role of one or more users and adjustment of user settings
US10099636B2 (en) 2015-11-27 2018-10-16 Bragi GmbH System and method for determining a user role and user settings associated with a vehicle
US10104460B2 (en) 2015-11-27 2018-10-16 Bragi GmbH Vehicle with interaction between entertainment systems and wearable devices
US10061326B2 (en) 2015-12-09 2018-08-28 International Business Machines Corporation Mishap amelioration based on second-order sensing by a self-driving vehicle
US10046637B2 (en) 2015-12-11 2018-08-14 Ford Global Technologies, Llc In-vehicle component control user interface
US9884622B2 (en) 2015-12-29 2018-02-06 Thunder Power New Energy Vehicle Development Company Limited Vehicle condition detection and warning system
EP3187872B1 (en) * 2015-12-30 2018-11-28 Skidata Ag Method for determining the condition of access control devices and sales or payment machines of an access control system
US9826256B2 (en) 2015-12-31 2017-11-21 Wellen Sham Facilitating multimedia information delivery through a UAV network
US9955115B2 (en) 2015-12-31 2018-04-24 Wellen Sham Facilitating wide view video conferencing through a drone network
US10454576B2 (en) 2015-12-31 2019-10-22 Wellen Sham UAV network
US9786165B2 (en) 2015-12-31 2017-10-10 Wellen Sham Facilitating location positioning service through a UAV network
US20170193556A1 (en) * 2015-12-31 2017-07-06 Wellen Sham Facilitating targeted information delivery through a UAV network
CN106982345A (en) * 2015-12-31 2017-07-25 Wellen Sham Facilitating wide-angle video conferencing through a UAV network
US9800321B2 (en) 2015-12-31 2017-10-24 Wellen Sham Facilitating communication with a vehicle via a UAV
KR101858698B1 (en) * 2016-01-04 2018-05-16 LG Electronics Inc. Display apparatus for vehicle and vehicle
US10088328B1 (en) 2016-01-05 2018-10-02 Open Invention Network Llc Navigation application providing supplemental navigation information
GB2546250B (en) 2016-01-06 2020-06-17 Ge Aviation Systems Taleris Ltd Automated fusion and analysis of multiple sources of aircraft data
KR102584471B1 (en) * 2016-01-13 2023-10-05 HD Hyundai Infracore Co., Ltd. Mirroring control method and mirroring system for construction machine
US9607449B1 (en) * 2016-01-22 2017-03-28 General Motors Llc Control of wireless communication channels for a vehicle telematics unit
US9836973B2 (en) 2016-01-27 2017-12-05 International Business Machines Corporation Selectively controlling a self-driving vehicle's access to a roadway
US9749470B1 (en) * 2016-01-27 2017-08-29 State Farm Mutual Automobile Insurance Company Systems and methods for handling communications during user operation of a motor vehicle
US10144408B2 (en) * 2016-01-29 2018-12-04 Faraday & Future Inc. System and method for assisting navigation of vehicle through traffic
DE102016211352A1 (en) * 2016-02-02 2017-08-03 Volkswagen Aktiengesellschaft A method for configuring online mobile services
US9731755B1 (en) * 2016-02-16 2017-08-15 GM Global Technology Operations LLC Preview lateral control for automated driving
US10059287B2 (en) * 2016-02-17 2018-08-28 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for enhanced comfort prediction
US10469997B2 (en) 2016-02-26 2019-11-05 Microsoft Technology Licensing, Llc Detecting a wireless signal based on context
US9868391B1 (en) * 2016-02-26 2018-01-16 Waymo Llc Scenario based audible warnings for autonomous vehicles
US10475144B2 (en) * 2016-02-26 2019-11-12 Microsoft Technology Licensing, Llc Presenting context-based guidance using electronic signs
CN107134160A (en) 2016-02-29 2017-09-05 Faraday&Future Inc. Emergency signal detection and response
JP6788477B2 (en) * 2016-03-10 2020-11-25 Panasonic Intellectual Property Corporation of America Recognition result presentation device and recognition result presentation method
JP2017161479A (en) * 2016-03-11 2017-09-14 Toshiba Corporation Wireless device, communication method, and program
WO2017153332A1 (en) * 2016-03-11 2017-09-14 Limbic Life Ag Occupant support device and system for controlling objects
US10024680B2 (en) 2016-03-11 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Step based guidance system
US10082877B2 (en) 2016-03-15 2018-09-25 Ford Global Technologies, Llc Orientation-independent air gesture detection service for in-vehicle environments
CN105865481B (en) * 2016-03-31 2019-05-07 Baidu Online Network Technology (Beijing) Co., Ltd. Map-based navigation method and device
US9896096B2 (en) * 2016-04-11 2018-02-20 David E. Newman Systems and methods for hazard mitigation
US10304335B2 (en) * 2016-04-12 2019-05-28 Ford Global Technologies, Llc Detecting available parking spaces
US10459616B2 (en) * 2016-04-15 2019-10-29 Toyota Motor Engineering & Manufacturing North America, Inc. Automatic full screen display
US9914415B2 (en) 2016-04-25 2018-03-13 Ford Global Technologies, Llc Connectionless communication with interior vehicle components
US9716964B1 (en) * 2016-04-26 2017-07-25 Fmr Llc Modifying operation of computing devices to mitigate short-term impaired judgment
US10036333B2 (en) 2016-05-16 2018-07-31 Ford Global Technologies, Llc Cylinder deactivation control system
US10246073B2 (en) 2016-05-16 2019-04-02 Ford Global Technologies, Llc Control system for a hybrid-electric vehicle
US10196994B2 (en) 2016-05-16 2019-02-05 Ford Global Technologies, Llc Powertrain control system
US10685391B2 (en) 2016-05-24 2020-06-16 International Business Machines Corporation Directing movement of a self-driving vehicle based on sales activity
US9958275B2 (en) 2016-05-31 2018-05-01 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for wearable smart device communications
CN105892375A (en) * 2016-06-08 2016-08-24 BOE Technology Group Co., Ltd. Vehicle-mounted child safety seat, controller therefor, and vehicle-mounted child safety seat system
US9652748B1 (en) * 2016-06-13 2017-05-16 State Farm Mutual Automobile Insurance Company Technology for automatically identifying and scheduling provider appointments in response to accident events
CN106056936B (en) * 2016-06-17 2019-01-01 BOE Technology Group Co., Ltd. Driving lane adjustment method, apparatus, and system
US9862339B1 (en) * 2016-06-24 2018-01-09 Honda Motor Co., Ltd. Smart entry driver ID changing
US10474946B2 (en) 2016-06-24 2019-11-12 Microsoft Technology Licensing, Llc Situation aware personal assistant
US20180012196A1 (en) 2016-07-07 2018-01-11 NextEv USA, Inc. Vehicle maintenance manager
SG11201811836XA (en) * 2016-07-11 2019-02-27 Freakout Holdings Inc Information processing device and program
JP6801274B2 (en) * 2016-07-11 2020-12-16 Denso Corporation Driving support device and driving support method
US10759255B2 (en) 2016-07-20 2020-09-01 Ford Global Technologies, Llc Autonomous-vehicle climate-control system
US10561519B2 (en) 2016-07-20 2020-02-18 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device having a curved back to reduce pressure on vertebrae
US9928734B2 (en) 2016-08-02 2018-03-27 Nio Usa, Inc. Vehicle-to-pedestrian communication systems
US10571908B2 (en) * 2016-08-15 2020-02-25 Ford Global Technologies, Llc Autonomous vehicle failure mode management
US20180054570A1 (en) * 2016-08-18 2018-02-22 GM Global Technology Operations LLC Systems for effecting progressive driver-distraction-avoidance actions at a vehicle
US20180050659A1 (en) * 2016-08-22 2018-02-22 Faraday&Future Inc. Electric seatbelt notification systems and methods
US10471965B2 (en) * 2016-08-22 2019-11-12 Lenovo (Singapore) Pte. Ltd. Securing guest access to vehicle
CN107808091A (en) * 2016-09-09 2018-03-16 Bayerische Motoren Werke AG Apparatus and method for assisting a vehicle and a vehicle user
US10650621B1 (en) 2016-09-13 2020-05-12 Iocurrents, Inc. Interfacing with a vehicular controller area network
US10093322B2 (en) 2016-09-15 2018-10-09 International Business Machines Corporation Automatically providing explanations for actions taken by a self-driving vehicle
US10643256B2 (en) 2016-09-16 2020-05-05 International Business Machines Corporation Configuring a self-driving vehicle for charitable donations pickup and delivery
US9791291B1 (en) 2016-09-26 2017-10-17 Uber Technologies, Inc. Modifying map configurations based on established location points
US10228271B2 (en) * 2016-10-11 2019-03-12 International Business Machines Corporation Suppressing rerouting for intentional circumstances
CN109890654B (en) * 2016-10-21 2021-11-02 Robert Bosch GmbH Pre-impact seat actuator system and method
US10432851B2 (en) 2016-10-28 2019-10-01 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device for detecting photography
US10031523B2 (en) 2016-11-07 2018-07-24 Nio Usa, Inc. Method and system for behavioral sharing in autonomous vehicles
KR20180051842A (en) * 2016-11-09 2018-05-17 LG Electronics Inc. Display apparatus and method for controlling the same
US10694357B2 (en) 2016-11-11 2020-06-23 Nio Usa, Inc. Using vehicle sensor data to monitor pedestrian health
US10012505B2 (en) 2016-11-11 2018-07-03 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable system for providing walking directions
US10708547B2 (en) 2016-11-11 2020-07-07 Nio Usa, Inc. Using vehicle sensor data to monitor environmental and geologic conditions
US10410064B2 (en) 2016-11-11 2019-09-10 Nio Usa, Inc. System for tracking and identifying vehicles and pedestrians
US10521669B2 (en) 2016-11-14 2019-12-31 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing guidance or feedback to a user
JP6780461B2 (en) * 2016-11-14 2020-11-04 Isuzu Motors Limited Driving support system and driving support method
US10699305B2 (en) 2016-11-21 2020-06-30 Nio Usa, Inc. Smart refill assistant for electric vehicles
US9975475B1 (en) * 2016-11-23 2018-05-22 Ford Global Technologies, Llc Notification device for vehicle
WO2018095876A1 (en) * 2016-11-25 2018-05-31 Philips Lighting Holding B.V. Lighting control
JP6648304B2 (en) * 2016-11-28 2020-02-14 Honda Motor Co., Ltd. Driving support device, driving support system, program, and control method of driving support device
US10249104B2 (en) 2016-12-06 2019-04-02 Nio Usa, Inc. Lease observation and event recording
US10259452B2 (en) 2017-01-04 2019-04-16 International Business Machines Corporation Self-driving vehicle collision management system
US10712166B2 (en) * 2017-01-04 2020-07-14 FLIR Belgium BVBA Electronic navigation systems with user and simulation profiles
US10529147B2 (en) 2017-01-05 2020-01-07 International Business Machines Corporation Self-driving vehicle road safety flare deploying system
US10363893B2 (en) 2017-01-05 2019-07-30 International Business Machines Corporation Self-driving vehicle contextual lock control system
US10074223B2 (en) 2017-01-13 2018-09-11 Nio Usa, Inc. Secured vehicle for user use only
GB2558889B (en) * 2017-01-13 2020-07-29 Jaguar Land Rover Ltd Drive mode selector for a drivetrain, vehicle comprising such a drive mode selector, and a method of selecting a drive mode of a drivetrain
US10471829B2 (en) 2017-01-16 2019-11-12 Nio Usa, Inc. Self-destruct zone and autonomous vehicle navigation
US9984572B1 (en) 2017-01-16 2018-05-29 Nio Usa, Inc. Method and system for sharing parking space availability among autonomous vehicles
US10031521B1 (en) 2017-01-16 2018-07-24 Nio Usa, Inc. Method and system for using weather information in operation of autonomous vehicles
US10464530B2 (en) 2017-01-17 2019-11-05 Nio Usa, Inc. Voice biometric pre-purchase enrollment for autonomous vehicles
US10286915B2 (en) 2017-01-17 2019-05-14 Nio Usa, Inc. Machine learning for personalized driving
US10172760B2 (en) * 2017-01-19 2019-01-08 Jennifer Hendrix Responsive route guidance and identification system
KR20180087921A (en) * 2017-01-25 2018-08-03 Samsung Display Co., Ltd. Display device for vehicle and vehicle control system including the same
US10439895B2 (en) * 2017-01-31 2019-10-08 Salesforce.Com, Inc. Dynamic selection of channels for incoming communication
US10897469B2 (en) 2017-02-02 2021-01-19 Nio Usa, Inc. System and method for firewalls between vehicle networks
US9990846B1 (en) * 2017-02-07 2018-06-05 NoTraffic Ltd. Device, system and method for traffic management
US20180245943A1 (en) * 2017-02-14 2018-08-30 Panasonic Automatic Systems Company of America, Division of Panasonic Corporation of North America Navigation ux enhancement - automatic routing to nearest parking lot or garage
CN108536114A (en) * 2017-03-01 2018-09-14 Beijing Tusen Weilai Technology Co., Ltd. Vehicle control device
US10152060B2 (en) 2017-03-08 2018-12-11 International Business Machines Corporation Protecting contents of a smart vault being transported by a self-driving vehicle
JP6655036B2 (en) * 2017-03-15 2020-02-26 Subaru Corporation Vehicle display system and vehicle display system control method
US10252688B2 (en) * 2017-03-22 2019-04-09 Ford Global Technologies, Llc Monitoring a vehicle cabin
CN108688593B (en) * 2017-03-31 2023-05-30 Honda Motor Co., Ltd. System and method for identifying at least one passenger of a vehicle by movement pattern
CN110520877B (en) * 2017-03-31 2022-10-14 Honda Motor Co., Ltd. Maintenance notification system, control method thereof, and medium
US10571907B2 (en) * 2017-04-25 2020-02-25 Ford Global Technologies, Llc Method and apparatus for dynamic remote control reconfiguration based on proximity to a vehicle
US10255775B2 (en) * 2017-04-27 2019-04-09 Bitwave Pte Ltd. Intelligent motion detection
US10371544B2 (en) 2017-05-04 2019-08-06 Wearworks Vibrating haptic device for the blind
KR20180124381A (en) * 2017-05-11 2018-11-21 Hyundai Motor Company System for detecting impaired driving and method thereof
US10627479B2 (en) * 2017-05-17 2020-04-21 Zerokey Inc. Method for determining the position of an object and system employing same
CN108944799B (en) * 2017-05-18 2022-06-03 Tencent Technology (Shenzhen) Co., Ltd. Abnormal vehicle driving behavior processing method and device
US11803399B2 (en) * 2017-05-18 2023-10-31 Happy Money, Inc. Interactive virtual assistant system
US20180357580A1 (en) * 2017-06-09 2018-12-13 Ford Global Technologies, Llc Vehicle driver workload management
FR3067835B1 (en) * 2017-06-16 2020-05-08 Valeo Systemes Thermiques Apparatus for identifying a clothing family index
CN107465720B (en) * 2017-06-23 2020-04-24 Hangzhou Yunle Vehicle Technology Co., Ltd. Energy-saving method based on wireless vehicle-mounted network management system
US10234302B2 (en) 2017-06-27 2019-03-19 Nio Usa, Inc. Adaptive route and motion planning based on learned external and internal vehicle environment
US10627824B2 (en) * 2017-06-27 2020-04-21 Deere & Company Automatic ground speed control system for a work vehicle
KR102406511B1 (en) * 2017-07-04 2022-06-10 Hyundai Motor Company Vehicle system and control method of the same
WO2019006503A1 (en) * 2017-07-07 2019-01-10 T.O.M.S International Pty. Ltd. Passenger management
US10416671B2 (en) 2017-07-11 2019-09-17 Waymo Llc Methods and systems for vehicle occupancy confirmation
KR102367053B1 (en) * 2017-07-13 2022-02-24 Samsung Electronics Co., Ltd. Electronic apparatus for performing communication with an external electronic device
US11187550B2 (en) 2017-07-14 2021-11-30 Allstate Insurance Company Distributed data processing systems for processing remotely captured sensor data
US10369974B2 (en) 2017-07-14 2019-08-06 Nio Usa, Inc. Control and coordination of driverless fuel replenishment for autonomous vehicles
US10710633B2 (en) 2017-07-14 2020-07-14 Nio Usa, Inc. Control of complex parking maneuvers and autonomous fuel replenishment of driverless vehicles
WO2019017056A1 (en) * 2017-07-19 2019-01-24 Panasonic Intellectual Property Corporation of America Vehicle-mounted relay device, relay method and program
US10837790B2 (en) 2017-08-01 2020-11-17 Nio Usa, Inc. Productive and accident-free driving modes for a vehicle
US10783558B2 (en) * 2017-08-07 2020-09-22 Harman International Industries, Incorporated System and method for motion onset consumer focus suggestion
CN109425351B (en) * 2017-08-24 2022-02-15 Beijing Didi Infinity Technology and Development Co., Ltd. Travel processing method and device
CN107550128A (en) * 2017-08-28 2018-01-09 BOE Technology Group Co., Ltd. Intelligent seat and control method thereof
US10401858B2 (en) 2017-08-29 2019-09-03 Waymo Llc Arranging passenger pickups for autonomous vehicles
US11315415B2 (en) * 2017-09-03 2022-04-26 Innovart Design Inc. Information sharing system and information sharing method for vehicle
US20190096397A1 (en) * 2017-09-22 2019-03-28 GM Global Technology Operations LLC Method and apparatus for providing feedback
US10755495B1 (en) * 2017-09-25 2020-08-25 State Farm Mutual Automobile Insurance Company Technology for detecting onboard sensor tampering
US10551849B2 (en) * 2017-09-25 2020-02-04 Aptiv Technologies Limited Automated vehicle simulated lane-marking guidance system
JP2019067220A (en) * 2017-10-02 2019-04-25 Sharp Corporation Parking position display processor, parking position display method, and program
JP6776463B2 (en) * 2017-10-03 2020-10-28 Google LLC Multi-factor authentication and access control in a vehicle environment
US10479300B2 (en) * 2017-10-06 2019-11-19 Ford Global Technologies, Llc Monitoring of vehicle window vibrations for voice-command recognition
US10635109B2 (en) 2017-10-17 2020-04-28 Nio Usa, Inc. Vehicle path-planner monitor and controller
US10611381B2 (en) 2017-10-24 2020-04-07 Ford Global Technologies, Llc Decentralized minimum risk condition vehicle control
US20190122143A1 (en) * 2017-10-24 2019-04-25 International Business Machines Corporation Cognitive-based passenger selection
US10935978B2 (en) 2017-10-30 2021-03-02 Nio Usa, Inc. Vehicle self-localization using particle filters and visual odometry
US10606274B2 (en) 2017-10-30 2020-03-31 Nio Usa, Inc. Visual place recognition based self-localization for autonomous vehicles
DE102017219869A1 (en) * 2017-11-08 2019-05-09 Continental Teves Ag & Co. Ohg Control device for a motor vehicle and method for operating the control device
US10717412B2 (en) 2017-11-13 2020-07-21 Nio Usa, Inc. System and method for controlling a vehicle using secondary access methods
DE102017220464A1 (en) * 2017-11-16 2019-05-16 Audi Ag Modular computer architecture of a cockpit and infotainment system for a vehicle
US10410181B2 (en) * 2017-12-05 2019-09-10 Verizon Patent And Licensing Inc. Using a vehicle's on-board diagnostic (OBD) system for audio reminders
JP7076997B2 (en) * 2017-12-12 2022-05-30 Yazaki Corporation In-vehicle system and detector hub
US10576975B2 (en) * 2017-12-18 2020-03-03 Fca Us Llc Active rear sense area adjustment of collision avoidance system of a vehicle when vehicle is approaching a positive road grade change
US10648826B2 (en) * 2017-12-20 2020-05-12 Mastercard International Incorporated Providing stop recommendations based on a travel path and transaction data
US20190209100A1 (en) * 2018-01-05 2019-07-11 Byton Limited In-vehicle user health platform systems and methods
US11014533B2 (en) 2018-01-05 2021-05-25 Byton Limited Platform for automotive personalization
US10818190B2 (en) * 2018-01-09 2020-10-27 Ford Global Technologies, Llc System and method for vehicle travelling in caravan mode
JP6743072B2 (en) * 2018-01-12 2020-08-19 Honda Motor Co., Ltd. Control device, control device operating method, and program
US11262756B2 (en) * 2018-01-15 2022-03-01 Uatc, Llc Discrete decision architecture for motion planning system of an autonomous vehicle
CA3089542A1 (en) * 2018-01-26 2019-08-01 Sonitor Technologies As Location-based functionality using acoustic location determination techniques
US10726645B2 (en) 2018-02-16 2020-07-28 Ford Global Technologies, Llc Vehicle diagnostic operation
US11417107B2 (en) 2018-02-19 2022-08-16 Magna Electronics Inc. Stationary vision system at vehicle roadway
US10720156B2 (en) 2018-02-21 2020-07-21 Toyota Motor Engineering & Manufacturing North America, Inc. Co-pilot and conversational companion
EP3533656A1 (en) * 2018-03-02 2019-09-04 Panasonic Intellectual Property Corporation of America Vehicle travel information processing device and program for avoiding collisions
JP7110209B2 (en) * 2018-03-20 2022-08-01 Mobileye Vision Technologies Ltd. System and method
JP7025966B2 (en) * 2018-03-20 2022-02-25 Hitachi Astemo, Ltd. Vehicle control device
US10813169B2 (en) 2018-03-22 2020-10-20 GoTenna, Inc. Mesh network deployment kit
US11548517B2 (en) * 2018-03-28 2023-01-10 Denso International America, Inc. Activating vehicle functions based on vehicle occupant location
EP3561719A1 (en) * 2018-04-25 2019-10-30 Ningbo Geely Automobile Research & Development Co., Ltd. Vehicle occupant management system and method
US10688867B2 (en) 2018-05-22 2020-06-23 International Business Machines Corporation Vehicular medical assistant
US10921812B2 (en) * 2018-05-22 2021-02-16 International Business Machines Corporation System and method for rendezvous coordination of an autonomous automotive vehicle with an ambulance
US11001273B2 (en) * 2018-05-22 2021-05-11 International Business Machines Corporation Providing a notification based on a deviation from a determined driving behavior
US10953888B2 (en) 2018-05-22 2021-03-23 International Business Machines Corporation Autonomous vehicle monitoring
US10369966B1 (en) * 2018-05-23 2019-08-06 Nio Usa, Inc. Controlling access to a vehicle using wireless access devices
US11308458B2 (en) * 2018-06-07 2022-04-19 Jeffrey Derouen Method for directing, scheduling, and facilitating maintenance requirements for autonomous vehicle
DK201870683A1 (en) 2018-07-05 2020-05-25 Aptiv Technologies Limited Identifying and authenticating autonomous vehicles and passengers
EP3591589A1 (en) * 2018-07-05 2020-01-08 Aptiv Technologies Limited Identifying autonomous vehicles and passengers
WO2020018394A1 (en) * 2018-07-14 2020-01-23 Moove.Ai Vehicle-data analytics
JP2021531576A (en) 2018-07-17 2021-11-18 iT SpeeX LLC Methods, systems, and computer program products for role- and skill-based authority in Intelligent Industry Assistant
US11074297B2 (en) 2018-07-17 2021-07-27 iT SpeeX LLC Method, system, and computer program product for communication with an intelligent industrial assistant and industrial machine
JP2021530799A (en) 2018-07-17 2021-11-11 iT SpeeX LLC Methods, systems, and computer program products for intelligent industry assistants
FR3084634B1 (en) * 2018-08-01 2021-07-30 Renault Sas Manual or automatic selection of an information display system among a plurality of display modes classified according to their level of offset relative to the vehicle
JP2021536599A (en) 2018-08-14 2021-12-27 Mobileye Vision Technologies Ltd. Systems and methods for navigating at a safe distance
US10939296B2 (en) * 2018-08-20 2021-03-02 Ford Global Technologies, Llc Vehicle smart connection
US10974729B2 (en) 2018-08-21 2021-04-13 At&T Intellectual Property I, L.P. Application and portability of vehicle functionality profiles
KR20200039519A (en) * 2018-09-27 2020-04-16 Hyundai Motor Company Method for setting display of vehicle infotainment system and vehicle infotainment system to which the method is applied
US20200104966A1 (en) * 2018-09-30 2020-04-02 Strong Force Intellectual Capital, Llc Intelligent transportation systems
EP3856596A4 (en) 2018-09-30 2022-10-12 Strong Force Intellectual Capital, LLC Intelligent transportation systems
CN112889301B (en) * 2018-10-12 2024-02-06 Systems and Software Enterprises, LLC System and method for managing multiple short-range wireless connections within a vehicle
US10989547B2 (en) * 2018-10-22 2021-04-27 Ford Global Technologies, Llc Autonomous vehicle ride service systems and methods
US10909777B2 (en) * 2018-10-26 2021-02-02 Snap-On Incorporated Method and system for annotating graphs of vehicle data
WO2020092245A1 (en) * 2018-10-29 2020-05-07 Zorday IP, LLC Network-enabled electronic cigarette
US10829130B2 (en) 2018-10-30 2020-11-10 International Business Machines Corporation Automated driver assistance system
JP7121145B2 (en) 2018-11-02 2022-08-17 Google LLC Context-Aware Navigation Voice Assistant
JP7172464B2 (en) * 2018-11-07 2022-11-16 Toyota Motor Corporation Vehicles and vehicle operation methods
KR102642242B1 (en) * 2018-11-09 2024-03-04 Hyundai Motor Company Vehicle and control method of vehicle
KR20200055974A (en) 2018-11-14 2020-05-22 Hyundai Motor Company Vehicle and control method of the same
JP2020080503A (en) * 2018-11-14 2020-05-28 Honda Motor Co., Ltd. Agent device, agent presentation method, and program
US10820141B2 (en) 2018-11-16 2020-10-27 Here Global B.V. Method and apparatus for presenting privacy-respectful and personalized location-based comments based on passenger context and vehicle proximity to the location
KR20200057979A (en) * 2018-11-19 2020-05-27 Hyundai Motor Company Vehicle and control method of vehicle
US11014577B2 (en) * 2018-11-28 2021-05-25 Here Global B.V. Method and apparatus for presenting a feedforward cue in a user interface before an upcoming vehicle event occurs
CN111332251B (en) * 2018-11-30 2022-11-04 Baidu Online Network Technology (Beijing) Co., Ltd. Method and device for controlling a vehicle
US10990280B1 (en) * 2018-12-03 2021-04-27 Parallels International Gmbh Smart keyboard
KR102587094B1 (en) * 2018-12-04 2023-10-11 Hyundai Motor Company Apparatus and method for determining driving tendency of driver
CN109334852A (en) * 2018-12-05 2019-02-15 Yu Jianqing Intelligent control device for an electric vehicle
EP3893189A4 (en) * 2018-12-07 2021-12-29 Panasonic Intellectual Property Corporation of America Information processing method, information processing device, and program
US10882478B2 (en) * 2018-12-18 2021-01-05 Volkswagen Ag Movement-based comfort adjustment
TW202046159 (en) 2019-02-08 2020-12-16 iT SpeeX LLC Method, system, and computer program product for developing dialogue templates for an intelligent industrial assistant
US10817063B2 (en) 2019-02-11 2020-10-27 Volvo Car Corporation Facilitating interaction with a vehicle touchscreen using haptic feedback
US11432132B2 (en) * 2019-02-14 2022-08-30 Motorola Mobility Llc Dropping extraneous discovery messages
DE102019202025B4 (en) * 2019-02-15 2020-08-27 Zf Friedrichshafen Ag System and method for the safe operation of an automated vehicle
US11899448B2 (en) * 2019-02-21 2024-02-13 GM Global Technology Operations LLC Autonomous vehicle that is configured to identify a travel characteristic based upon a gesture
CN110930547A (en) * 2019-02-28 2020-03-27 Shanghai SenseTime Lingang Intelligent Technology Co., Ltd. Vehicle door unlocking method, device, and system, electronic device, and storage medium
KR20200108138A (en) * 2019-03-06 2020-09-17 Hyundai Motor Company Control system for seat of vehicle
US10913428B2 (en) * 2019-03-18 2021-02-09 Pony Ai Inc. Vehicle usage monitoring
US11199415B2 (en) * 2019-03-26 2021-12-14 Lyft, Inc. Systems and methods for estimating vehicle position based on contextual sensor information
CN110008893B (en) * 2019-03-29 2023-05-26 Wuhan University of Technology Automatic vehicle running deviation detection method based on a vehicle-mounted image sensor
US11144053B2 (en) * 2019-04-04 2021-10-12 Toyota Research Institute, Inc. Controlling driving condition components of an autonomous vehicle based on a current driving mode and current conditions
EP3722947A1 (en) * 2019-04-12 2020-10-14 Aptiv Technologies Limited Distributed system for displaying a content
JP2020177072A (en) * 2019-04-16 2020-10-29 Denso Corporation Device for vehicle and method for controlling device for vehicle
DE102019110040A1 (en) * 2019-04-16 2020-10-22 Bayerische Motoren Werke Aktiengesellschaft Control unit and method for the recognition, classification and prediction of a need for interaction of an automated driving vehicle
JP6541910B1 (en) * 2019-04-17 2019-07-10 Kyocera Corporation Electronic device, control method, and program
AU2019444479A1 (en) * 2019-05-09 2021-08-05 Google Llc Automated assistant suggestions for third-party vehicle computing devices with restricted architecture
US10796177B1 (en) * 2019-05-15 2020-10-06 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for controlling the playback of video in a vehicle using timers
US11368471B2 (en) * 2019-07-01 2022-06-21 Beijing Voyager Technology Co., Ltd. Security gateway for autonomous or connected vehicles
DE102019004781A1 (en) 2019-07-09 2021-01-14 Daimler Ag Method for operating an air conditioning device as a function of climatic information about the surroundings, as well as an air conditioning system
US11665244B2 (en) * 2019-07-11 2023-05-30 Kyndryl, Inc. Selecting user profiles on platforms based on optimal persona of a user in a given context
US11801848B2 (en) 2019-07-26 2023-10-31 Toyota Motor Engineering & Manufacturing North America, Inc. Electronic skin for vehicle components
US11904866B2 (en) * 2019-07-31 2024-02-20 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for brain-machine interfaces and EEG-based driver identification
CN110412983B (en) * 2019-08-01 2022-11-01 Beijing Baidu Netcom Science and Technology Co., Ltd. Vehicle collision avoidance detection method and device, and vehicle
US11479148B2 (en) * 2019-08-08 2022-10-25 GM Global Technology Operations LLC Personalization settings based on body measurements
US11115894B2 (en) 2019-08-14 2021-09-07 Motorola Mobility Llc Managing FTM frames of WLAN RTT bursts
JP2021033508A (en) * 2019-08-21 2021-03-01 Tokai Rika Co., Ltd. Information notification device, control device, and computer program
US11254238B2 (en) 2019-09-16 2022-02-22 Ford Global Technologies, Llc Occupant position detection with active electric field sensor
US11856480B2 (en) * 2019-09-27 2023-12-26 Ford Global Technologies, Llc Haptic guidance and navigation to mobile points of entry
US11430414B2 (en) 2019-10-17 2022-08-30 Microsoft Technology Licensing, Llc Eye gaze control of magnification user interface
US11507758B2 (en) * 2019-10-30 2022-11-22 Ford Global Technologies, Llc Vehicle-based sign language communication systems and methods
US11247699B2 (en) * 2019-11-04 2022-02-15 Volvo Car Corporation Driver assist interface in a vehicle
US11788852B2 (en) 2019-11-28 2023-10-17 Toyota Motor North America, Inc. Sharing of transport user profile
US11388582B2 (en) 2019-11-28 2022-07-12 Toyota Motor North America, Inc. Providing media based on profile sharing
JP7260464B2 (en) * 2019-12-12 2023-04-18 Toyota Motor Corporation Controller, system, program, and control method
US11272790B2 (en) * 2019-12-19 2022-03-15 Ford Global Technologies, Llc Vehicle seating assembly
CN113031630A (en) * 2019-12-24 2021-06-25 Guangzhou Xaircraft Technology Co., Ltd. Speed control curve acquisition method, speed control method, and related device
US11511756B2 (en) * 2020-01-13 2022-11-29 Ford Global Technologies, Llc Passenger authentication system for a vehicle
US11873000B2 (en) 2020-02-18 2024-01-16 Toyota Motor North America, Inc. Gesture detection for transport control
US20210256316A1 (en) * 2020-02-18 2021-08-19 Toyota Motor North America, Inc. Distinguishing gesture actions among transport occupants
CN111309017B (en) * 2020-02-27 2023-04-07 Guangdong Bozhilin Robot Co., Ltd. Equipment scheduling method and device, electronic device, and storage medium
DE102020106266A1 (en) * 2020-03-09 2021-09-09 Bayerische Motoren Werke Aktiengesellschaft Method, device, computer program and computer-readable storage medium for operating a vehicle
US11648893B2 (en) * 2020-03-13 2023-05-16 Toyota Motor Engineering And Manufacturing North America, Inc. System, method and apparatus for configurable multi-purpose input controls for an automobile
US11367356B1 (en) * 2020-03-16 2022-06-21 Wells Fargo Bank, N.A. Autonomous fleet service management
JP7419122B2 (en) * 2020-03-18 2024-01-22 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and vehicle control program
CN111462487B (en) * 2020-03-31 2021-01-22 Chang'an University Optimized edge computing node selection method and system in an Internet of Vehicles environment
US11315566B2 (en) * 2020-04-04 2022-04-26 Lenovo (Singapore) Pte. Ltd. Content sharing using different applications
US11231834B2 (en) 2020-06-03 2022-01-25 Micron Technology, Inc. Vehicle having an intelligent user interface
US11703335B2 (en) * 2020-06-19 2023-07-18 Toyota Research Institute, Inc. Coordinating and learning maps dynamically
US11653186B2 (en) 2020-06-26 2023-05-16 BlueOwl, LLC Systems and methods for determining application status
US11399261B1 (en) 2020-06-26 2022-07-26 BlueOwl, LLC Systems and methods for determining mobile device status
US11363426B1 (en) 2020-07-07 2022-06-14 BlueOwl, LLC Systems and methods for verifying reliability of sensor data received from mobile devices
US11479268B2 (en) 2020-07-30 2022-10-25 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous control of vehicle driving modes in certain traffic situations
JP7438892B2 (en) * 2020-08-20 2024-02-27 Honda Motor Co., Ltd. Information processing device, information processing method, and program
CN112026790B (en) * 2020-09-03 2022-04-15 Shanghai SenseTime Lingang Intelligent Technology Co., Ltd. Control method and device for vehicle-mounted robot, vehicle, electronic device, and medium
CN114257608A (en) * 2020-09-21 2022-03-29 Ford Global Technologies, Llc Information interaction method and information interaction system
US11537661B2 (en) * 2020-10-02 2022-12-27 PolyAI Limited Systems and methods for conversing with a user
CN112146675A (en) * 2020-10-09 2020-12-29 Shanghai Pateo Yuezhen Network Technology Service Co., Ltd. Service recommendation method, system, medium, and device based on voice conversation
JP7310779B2 (en) * 2020-10-26 2023-07-19 Toyota Motor Corporation Display system
US11897334B2 (en) 2020-11-19 2024-02-13 Ford Global Technologies, Llc Vehicle having pet bowl communication
CN112416280B (en) * 2020-11-20 2022-05-17 Ecarx (Hubei) Tech Co., Ltd. Multi-display-screen control method of vehicle-mounted terminal
US11520687B2 (en) 2020-11-20 2022-12-06 Bank Of America Corporation System and method for automated testing of an accessibility feature of a user device by emulating actions of users
US11745095B2 (en) * 2020-12-04 2023-09-05 Cerence Operating Company Automotive entertainment applications
US20220194228A1 (en) * 2020-12-17 2022-06-23 Ford Global Technologies, Llc Vehicle having pet monitoring and related controls
US11682057B1 (en) 2021-01-05 2023-06-20 Wells Fargo Bank, N.A. Management system to facilitate vehicle-to-everything (V2X) negotiation and payment
US11904794B2 (en) 2021-01-28 2024-02-20 Ford Global Technologies, Llc Pet restraint system for a vehicle
US11915202B2 (en) * 2021-02-02 2024-02-27 Toyota Motor Engineering & Manufacturing North America, Inc. Remote meeting and calendar support for the in-vehicle infotainment unit
JP2022121172A (en) * 2021-02-08 2022-08-19 Canon Inc. Information processing apparatus and method for controlling the same, and program
CN113147669B (en) * 2021-04-02 2022-08-02 Huainan Union University Gesture motion detection system based on millimeter-wave radar
US11932156B2 (en) 2021-05-17 2024-03-19 Ford Global Technologies, Llc Vehicle having sliding console
JP2023024153A (en) * 2021-08-06 2023-02-16 Toyota Motor Corporation Information input system
US11917395B2 (en) 2021-08-11 2024-02-27 Toyota Motor North America, Inc. Connected vehicle services privacy and protection
US11492005B1 (en) * 2021-08-24 2022-11-08 GM Global Technology Operations LLC Systems and methods for communicating with vision and hearing impaired vehicle occupants
US20230217167A1 (en) * 2022-01-05 2023-07-06 Ford Global Technologies, Llc Vehicle audio enhancement system
US11776377B2 (en) 2022-03-02 2023-10-03 Toyota Connected North America, Inc. Determination that a transport is running in an enclosed area
US20240005359A1 (en) * 2022-06-30 2024-01-04 Gm Cruise Holdings Llc Projected Advertisement Modification
US20240020651A1 (en) * 2022-07-15 2024-01-18 Hsin-Ya Hung Social networking method with global scheduling capabilities

Family Cites Families (242)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4754255A (en) 1984-03-12 1988-06-28 Sanders Rudy T User identifying vehicle control and security device
JPH03238599A (en) 1990-02-15 1991-10-24 Clarion Co Ltd On-vehicle navigation equipment
US5296840A (en) 1990-05-25 1994-03-22 Federal Signal Corporation Programmable emergency signalling system for a vehicle
US5529138A (en) * 1993-01-22 1996-06-25 Shaw; David C. H. Vehicle collision avoidance system
US8799461B2 (en) * 1994-11-29 2014-08-05 Apple Inc. System for collecting, analyzing, and transmitting information relevant to transportation networks
US5926113A (en) 1995-05-05 1999-07-20 L & H Company, Inc. Automatic determination of traffic signal preemption using differential GPS
US5572450A (en) 1995-06-06 1996-11-05 Worthy; David G. RF car counting system and method therefor
US7313467B2 (en) 2000-09-08 2007-12-25 Automotive Technologies International Inc. System and method for in-vehicle communications
US6557752B1 (en) 1996-06-12 2003-05-06 Q-International, Inc. Smart card for recording identification, and operational, service and maintenance transactions
US5825283A (en) * 1996-07-03 1998-10-20 Camhi; Elie System for the security and auditing of persons and property
US6148261A (en) 1997-06-20 2000-11-14 American Calcar, Inc. Personal communication system to send and receive voice data positioning information
US7979173B2 (en) 1997-10-22 2011-07-12 Intelligent Technologies International, Inc. Autonomous vehicle travel control systems and methods
US6437692B1 (en) 1998-06-22 2002-08-20 Statsignal Systems, Inc. System and method for monitoring and controlling remote devices
EP1123581A4 (en) 1998-10-21 2005-08-10 American Calcar Inc Positional camera and gps data interchange device
US6161071A (en) * 1999-03-12 2000-12-12 Navigation Technologies Corporation Method and system for an in-vehicle computing architecture
KR100759346B1 (en) 1999-10-19 2007-09-17 American Calcar Inc. Technique for effective navigation based on user preferences
DE19952854C1 (en) 1999-11-03 2001-08-09 Bosch Gmbh Robert Assistance device in a vehicle
US6611755B1 (en) 1999-12-19 2003-08-26 Trimble Navigation Ltd. Vehicle tracking, communication and fleet management system
US6839614B1 (en) 1999-12-29 2005-01-04 Bellsouth Intellectual Property Corporation Multi-mode in-vehicle control unit with network selectivity for transmitting vehicle data for fleet management
US7526539B1 (en) 2000-01-04 2009-04-28 Pni Corporation Method and apparatus for a distributed home-automation-control (HAC) window
US7167796B2 (en) * 2000-03-09 2007-01-23 Donnelly Corporation Vehicle navigation system for use with a telematics system
WO2001088827A1 (en) 2000-05-15 2001-11-22 Modular Mining Systems, Inc. Permission system for control of autonomous vehicles
AU7593601A (en) 2000-07-14 2002-01-30 Atabok Inc Controlling and managing digital assets
US6356838B1 (en) 2000-07-25 2002-03-12 Sunil Paul System and method for determining an efficient transportation route
US20020103622A1 (en) * 2000-07-17 2002-08-01 Burge John R. Decision-aid system based on wirelessly-transmitted vehicle crash sensor information
US20020009978A1 (en) 2000-07-18 2002-01-24 Semyon Dukach Units for displaying information on vehicles
US20020065046A1 (en) 2000-07-18 2002-05-30 Vert, Inc. Apparatuses, methods, and computer programs for showing information on a vehicle having multiple displays
AU2001278953A1 (en) * 2000-07-28 2002-02-13 American Calcar, Inc. Technique for effective organization and communication of information
US6587781B2 (en) 2000-08-28 2003-07-01 Estimotion, Inc. Method and system for modeling and processing vehicular traffic data and information and applying thereof
US6694234B2 (en) 2000-10-06 2004-02-17 Gmac Insurance Company Customer service automation systems and methods
US20040036622A1 (en) 2000-12-15 2004-02-26 Semyon Dukach Apparatuses, methods, and computer programs for displaying information on signs
ES2261527T3 (en) 2001-01-09 2006-11-16 Metabyte Networks, Inc. System, method, and software application for direct advertising through a group of behavior models, and programming preferences based on behavior model groups
US6671589B2 (en) 2001-02-13 2003-12-30 William Holst Method and apparatus to support remote and automatically initiated data loading and data acquisition of airborne computers using a wireless spread spectrum aircraft data services link
CA2372861A1 (en) 2001-02-20 2002-08-20 Matsushita Electric Industrial Co., Ltd. Travel direction device and travel warning direction device
US6563910B2 (en) * 2001-02-26 2003-05-13 Royal Thoughts, Llc Emergency response information distribution
US20020174021A1 (en) 2001-05-15 2002-11-21 International Business Machines Corporation Optimized shopping list process
US6629034B1 (en) 2001-06-06 2003-09-30 Navigation Technologies Corp. Driving profile method and system
US6617981B2 (en) 2001-06-06 2003-09-09 John Basinger Traffic control method for multiple intersections
US6662091B2 (en) 2001-06-29 2003-12-09 Battelle Memorial Institute Diagnostics/prognostics using wireless links
JP2003072488A (en) 2001-08-31 2003-03-12 Sony Corp Onboard device and method of processing vehicle and vehicle information
US6944536B2 (en) * 2002-02-01 2005-09-13 Medaire, Inc. Method and system for identifying medical facilities along a travel route
US10242255B2 (en) 2002-02-15 2019-03-26 Microsoft Technology Licensing, Llc Gesture recognition system using depth perceptive sensors
US7313547B2 (en) 2002-03-22 2007-12-25 Sun Microsystems, Inc. Manager level device/service arbitrator and methods
US7058898B2 (en) 2002-03-22 2006-06-06 Sun Microsystems, Inc. Abstract user interface manager with prioritization
US6810309B2 (en) 2002-04-25 2004-10-26 Visteon Global Technologies, Inc. Vehicle personalization via biometric identification
US7693720B2 (en) 2002-07-15 2010-04-06 Voicebox Technologies, Inc. Mobile systems and methods for responding to natural language speech utterance
JP4014963B2 (en) 2002-08-05 2007-11-28 Mitsubishi Electric Corporation Navigation device, route search method, and map information guidance method
US20040093155A1 (en) * 2002-11-12 2004-05-13 Simonds Craig John System and method for providing vehicle context information
ES2287448T3 (en) 2003-03-03 2007-12-16 Snap-On Technologies Inc. Method for providing a software module to a motor vehicle control unit, and computer program to execute the method
US6891467B2 (en) 2003-04-29 2005-05-10 Visteon Global Technologies, Inc. Multistage vehicle security system
US6931309B2 (en) 2003-05-06 2005-08-16 Innosurance, Inc. Motor vehicle operating data collection and analysis
CN1826604A (en) 2003-05-19 2006-08-30 精确交通系统公司 Method for incorporating individual vehicle data collection, detection and recording of traffic violations in a traffic signal controller
US20040260470A1 (en) 2003-06-14 2004-12-23 Rast Rodger H. Conveyance scheduling and logistics system
US7873471B2 (en) 2003-07-16 2011-01-18 Harman Becker Automotive Systems Gmbh Transmission of special routes to a navigation device
US6853910B1 (en) 2003-08-11 2005-02-08 General Motors Corporation Vehicle tracking telematics system
EP1709610B1 (en) 2003-10-14 2012-07-18 Siemens Industry, Inc. Method and system for collecting traffic data, monitoring traffic, and automated enforcement at a centralized station
JP2005121531A (en) 2003-10-17 2005-05-12 Navitime Japan Co Ltd Portable navigation device, control method, and control program thereof
US7913242B2 (en) 2003-11-04 2011-03-22 Gm Global Technology Operations, Inc. Low cost, open approach for vehicle software installation/updating and on-board diagnostics
US7233861B2 (en) 2003-12-08 2007-06-19 General Motors Corporation Prediction of vehicle operator destinations
GB2410849B (en) 2003-12-30 2007-03-14 Nokia Corp A GPS device
US20050159853A1 (en) 2004-01-15 2005-07-21 Nissan Motor Co., Ltd. Driving status detection device and related method
US7239960B2 (en) 2004-02-19 2007-07-03 Alpine Electronics, Inc. Navigation method and system for visiting multiple destinations by minimum number of stops
US7170400B2 (en) 2004-05-20 2007-01-30 Lear Corporation System for customizing settings and sounds for vehicle
US20050283284A1 (en) 2004-06-16 2005-12-22 Grenier Alain H Vehicle services manager
US7279806B2 (en) 2004-07-27 2007-10-09 Elaine E. Futrell Ignition system with driver identification
US20060058948A1 (en) 2004-07-28 2006-03-16 Ms. Melanie Blass Recordable location-based reminder system organizer
US7295904B2 (en) 2004-08-31 2007-11-13 International Business Machines Corporation Touch gesture based interface for motor vehicle
US7831384B2 (en) 2004-10-29 2010-11-09 Aol Inc. Determining a route to destination based on partially completed route
US9097195B2 (en) 2004-11-26 2015-08-04 Lysanda Limited Vehicular diagnostic system
NZ538796A (en) 2005-03-10 2007-05-31 Brunswick New Technologies Asi Vehicle location and navigation system
JP3987073B2 (en) 2005-04-20 2007-10-03 Navitime Japan Co., Ltd. Navigation system, route search server, route search method and program
KR100715181B1 (en) 2005-06-10 2007-05-07 Hyundai Autonet Co., Ltd. Navigation system that adapts to driver propensity
DK1891848T3 (en) 2005-06-13 2015-10-19 Intelligent Mechatronic Sys Vehicle immersive communication system
US20070061057A1 (en) 2005-08-26 2007-03-15 Palo Alto Research Center Incorporated Vehicle network advertising system
US7812712B2 (en) 2006-02-13 2010-10-12 All Protect, Llc Method and system for controlling a vehicle given to a third party
US8516087B2 (en) 2006-02-14 2013-08-20 At&T Intellectual Property I, L.P. Home automation system and method
US7522066B2 (en) 2006-02-23 2009-04-21 Rockwell Automation Technologies, Inc. Systems and methods that evaluate distance to potential hazards utilizing overlapping sensing zones
US7941753B2 (en) 2006-03-31 2011-05-10 Aol Inc. Communicating appointment and/or mapping information among a calendar application and a navigation application
US7873610B2 (en) 2006-05-26 2011-01-18 Andrew S Poulsen Meta-configuration of profiles
EP1909069B1 (en) 2006-10-02 2010-12-08 Harman Becker Automotive Systems GmbH Intelligent destination setting for navigation systems
US8108501B2 (en) 2006-11-01 2012-01-31 Yahoo! Inc. Searching and route mapping based on a social network, location, and time
KR101329289B1 (en) 2007-01-05 2013-11-14 Samsung Electronics Co., Ltd. Apparatus and method for providing schedule and path
US20080169940A1 (en) 2007-01-12 2008-07-17 Dae-Ryung Lee Intelligent traffic control system and associated methods
US20100321151A1 (en) 2007-04-04 2010-12-23 Control4 Corporation Home automation security system and method
JP4270301B2 (en) 2007-04-13 2009-05-27 Denso Corporation Center communicating with in-vehicle data collection device
US7861260B2 (en) 2007-04-17 2010-12-28 Almondnet, Inc. Targeted television advertisements based on online behavior
US7667618B2 (en) 2007-04-22 2010-02-23 International Business Machines Corporation System and method for tracking and billing vehicle users based on when and in which road lanes their vehicles have been driven
WO2008137855A2 (en) 2007-05-03 2008-11-13 Hti Ip, Llc Methods, systems, and apparatuses for telematics navigation
US20080300778A1 (en) 2007-05-30 2008-12-04 Tele Atlas North America, Inc. System and method for personalizing trip assistance on a digital map
US8825277B2 (en) * 2007-06-05 2014-09-02 Inthinc Technology Solutions, Inc. System and method for the collection, correlation and use of vehicle collision data
JP4207088B2 (en) 2007-06-20 2009-01-14 Toyota Motor Corporation Vehicle travel estimation device
US8078698B2 (en) 2007-06-26 2011-12-13 At&T Intellectual Property I, L.P. Methods, systems, and products for producing persona-based hosts
US8385946B2 (en) 2007-06-28 2013-02-26 Apple Inc. Disfavored route progressions or locations
US8285245B2 (en) 2007-08-30 2012-10-09 Wirelesswerx International, Inc. Messaging in a multi-dimensional space
US20090119657A1 (en) 2007-10-24 2009-05-07 Link Ii Charles M Methods and systems for software upgrades
CN101842818A (en) 2007-11-01 2010-09-22 伊戈尔·尤瑞维其·马修 Traffic monitoring system
US8090489B2 (en) 2007-11-09 2012-01-03 Bruno Delean Computerized driverless vehicles and traffic control system
EP2229576B1 (en) 2007-12-05 2016-04-13 Visteon Global Technologies, Inc. Vehicle user interface systems and methods
US20090157312A1 (en) 2007-12-14 2009-06-18 Microsoft Corporation Social network based routes
US8793065B2 (en) 2008-02-19 2014-07-29 Microsoft Corporation Route-based activity planner
JP5067217B2 (en) 2008-03-18 2012-11-07 Aisin AW Co., Ltd. Traffic information processing system, statistical processing device, traffic information processing method, and traffic information processing program
US8855899B2 (en) 2008-05-15 2014-10-07 Garmin Switzerland Gmbh Virtual traffic sensors
US7864029B2 (en) 2008-05-19 2011-01-04 Gm Global Technology Operations, Inc. Vehicle-setting-based driver identification system
US8140220B2 (en) 2008-05-30 2012-03-20 Fujitsu Ten Limited In-vehicle apparatus
JP4733165B2 (en) 2008-06-30 2011-07-27 Denso Corporation Vehicle navigation system
US9652023B2 (en) 2008-07-24 2017-05-16 Intelligent Mechatronic Systems Inc. Power management system
US9308879B2 (en) 2008-08-06 2016-04-12 Honeywell International Inc. Method, system, and apparatus of vehicle and fleet operator profile automation and deployment
US20100057337A1 (en) 2008-09-02 2010-03-04 Tele Atlas North America, Inc. System and method for providing digital map, routing, or navigation information with need-based routing
US8010285B1 (en) 2008-09-30 2011-08-30 Denise Jason A Electronic navigation related technology
US20100087987A1 (en) 2008-10-08 2010-04-08 Gm Global Technology Operations, Inc. Apparatus and Method for Vehicle Driver Recognition and Customization Using Onboard Vehicle System Settings
US8700255B2 (en) 2008-10-08 2014-04-15 Trimble Navigation Limited Devices, systems, and methods for monitoring driver and vehicle behavior
WO2010043658A1 (en) 2008-10-15 2010-04-22 Continental Teves Ag & Co. Ohg Improvement and validation of position determination
US20100106344A1 (en) * 2008-10-27 2010-04-29 Edwards Dean B Unmanned land vehicle having universal interfaces for attachments and autonomous operation capabilities and method of operation thereof
CA2749685A1 (en) * 2008-11-13 2010-05-20 Aser Rich Limited System and method for improved vehicle safety through enhanced situation awareness of a driver of a vehicle
US20100125387A1 (en) 2008-11-17 2010-05-20 Chung-Ang University Industry-Academy Cooperation Foundation System of integrated telematics service and method of controlling the system
KR101531193B1 (en) 2008-11-18 2015-06-25 LG Electronics Inc. Method for controlling map and mobile terminal using the same
US8584031B2 (en) 2008-11-19 2013-11-12 Apple Inc. Portable touch screen device, method, and graphical user interface for using emoji characters
US20100136944A1 (en) 2008-11-25 2010-06-03 Tom Taylor Method and system for performing a task upon detection of a vehicle trigger
US7969324B2 (en) 2008-12-01 2011-06-28 International Business Machines Corporation Optimization of vehicular traffic flow through a conflict zone
WO2010065915A1 (en) 2008-12-04 2010-06-10 Apisphere, Inc. System for and method of location-based process execution
US9305288B2 (en) 2008-12-30 2016-04-05 Ford Global Technologies, Llc System and method for provisioning electronic mail in a vehicle
US8245609B1 (en) 2009-02-05 2012-08-21 Pro-Cut Licensing Company, Llc Reporting system for on-vehicle brake lathe
US8180559B2 (en) 2009-02-13 2012-05-15 International Business Machines Corporation System and methods for geoboundary based traffic monitoring using quick tear down boundaries
EP2221581B1 (en) 2009-02-18 2017-07-19 Harman Becker Automotive Systems GmbH Method of estimating a propulsion-related operating parameter
US20100211304A1 (en) 2009-02-19 2010-08-19 Hwang Timothy H Personalized User Routing and Recommendations
US8332270B2 (en) 2009-02-19 2012-12-11 Qualcomm Incorporated System and method for utilizing a wireless communications device
JP5304882B2 (en) 2009-02-25 2013-10-02 Toyota Motor Corporation In-vehicle information processing apparatus and information processing method
US8825222B2 (en) 2009-02-27 2014-09-02 Toyota Motor Engineering & Manufacturing North America, Inc. Remote management of vehicle settings
US9003500B2 (en) * 2009-03-13 2015-04-07 Hti Ip, Llc Method and system for facilitating synchronizing media content between a vehicle device and a user device
US20100274410A1 (en) * 2009-04-24 2010-10-28 Gm Global Technology Operations, Inc. Methods and systems for customizing content for an occupant of a vehicle
US8350697B2 (en) 2009-05-18 2013-01-08 Alarm.Com Incorporated Remote device control and energy monitoring by analyzing data and applying rules
JP4957747B2 (en) 2009-05-18 2012-06-20 Toyota Motor Corporation Vehicle environment estimation device
DK2435149T3 (en) 2009-05-28 2015-09-21 Anki Inc Distributed system for autonomous control of toy cars
EP2436224A4 (en) 2009-05-28 2012-12-05 Intelligent Mechatronic Sys Communication system with personal information management and remote vehicle monitoring and control features
US8554831B2 (en) 2009-06-02 2013-10-08 Ford Global Technologies, Llc System and method for executing hands-free operation of an electronic calendar application within a vehicle
TWI393047B (en) 2009-06-30 2013-04-11 Accton Technology Corp An adapting infotainment device
US8350721B2 (en) 2009-07-21 2013-01-08 Verizon Patent And Licensing Inc. Geographically specific emergency notification
US8301320B2 (en) 2009-07-29 2012-10-30 The Invention Science Fund I, Llc Vehicle system for varied compliance benefits
US8233919B2 (en) 2009-08-09 2012-07-31 Hntb Holdings Ltd. Intelligently providing user-specific transportation-related information
US8634984B2 (en) 2009-09-29 2014-01-21 Clarion Co., Ltd. Navigation system
US20110078024A1 (en) 2009-09-29 2011-03-31 General Electric Company System and method for provisioning advertisements to a vehicle user
US20110077808A1 (en) 2009-09-30 2011-03-31 Searete LLC; a limited liability corporation of the State of Delaware Vehicle system for varied compliance benefits
US8451112B2 (en) 2009-10-19 2013-05-28 Qualcomm Incorporated Methods and apparatus for estimating departure time based on known calendar events
DE112009005344B4 (en) 2009-11-06 2016-04-28 Toyota Jidosha Kabushiki Kaisha Vehicle gateway system
US8509987B2 (en) 2009-11-11 2013-08-13 Benjamin Resner Methods and apparatus for automatic internet logging and social comparison of vehicular driving behavior
US8706349B2 (en) 2009-12-07 2014-04-22 At&T Mobility Ii Llc Devices, systems and methods for controlling permitted settings on a vehicle
US20110184642A1 (en) 2009-12-18 2011-07-28 Daimler Trucks North America Llc Fuel efficient routing system and method
KR20110092007A (en) 2010-02-08 2011-08-17 Mando Corporation Software downloading system and method for a vehicle using the same
US8730033B2 (en) 2010-02-17 2014-05-20 Hti Ip, L.L.C. Method and system for sending information from a user device to a car
US20110247013A1 (en) 2010-04-01 2011-10-06 Gm Global Technology Operations, Inc. Method for Communicating Between Applications on an External Device and Vehicle Systems
WO2011146141A1 (en) 2010-05-21 2011-11-24 Telecommunication Systems, Inc. Personal wireless navigation system
US8762041B2 (en) 2010-06-21 2014-06-24 Blackberry Limited Method, device and system for presenting navigational information
DE102010030309A1 (en) 2010-06-21 2011-12-22 Ford Global Technologies, Llc Method and device for determining an energy-consumption-optimized route
US9464905B2 (en) 2010-06-25 2016-10-11 Toyota Motor Engineering & Manufacturing North America, Inc. Over-the-air vehicle systems updating and associate security protocols
JP5212427B2 (en) * 2010-07-01 2013-06-19 Denso Corporation Driving assistance device
US8626439B2 (en) 2010-07-09 2014-01-07 Telenav, Inc. Navigation system with traffic estimation mechanism and method of operation thereof
SG187085A1 (en) 2010-07-16 2013-03-28 Univ Carnegie Mellon Methods and systems for coordinating vehicular traffic using in-vehicle virtual traffic control signals enabled by vehicle-to-vehicle communications
US20120030512A1 (en) 2010-07-27 2012-02-02 Ford Motor Company Provisioning of data to a vehicle infotainment computing system
US8682573B2 (en) 2010-08-20 2014-03-25 Deere & Company Method of initiating vehicle movement
US9511683B2 (en) 2010-08-25 2016-12-06 GM Global Technology Operations LLC Occupant recognition and verification system
EP2437163A1 (en) 2010-09-09 2012-04-04 Harman Becker Automotive Systems GmbH User interface for a vehicle system
US20130245882A1 (en) 2012-03-14 2013-09-19 Christopher P. Ricci Removable, configurable vehicle console
US8995982B2 (en) 2011-11-16 2015-03-31 Flextronics Ap, Llc In-car communication between devices
US20130293364A1 (en) 2012-05-02 2013-11-07 Flextronics Ap, Llc Configurable dash display
US20130293452A1 (en) 2012-05-02 2013-11-07 Flextronics Ap, Llc Configurable heads-up dash display
US20130151088A1 (en) 2011-11-16 2013-06-13 Flextronics Ap, Llc Method and system for vehicle data collection regarding traffic
US20130241720A1 (en) 2012-03-14 2013-09-19 Christopher P. Ricci Configurable vehicle console
US8979159B2 (en) 2011-11-16 2015-03-17 Flextronics Ap, Llc Configurable hardware unit for car systems
US8509982B2 (en) 2010-10-05 2013-08-13 Google Inc. Zone driving
WO2012050573A1 (en) 2010-10-13 2012-04-19 Hewlett-Packard Development Company, L.P. Dashboard display method and apparatus
US8817087B2 (en) 2010-11-01 2014-08-26 Robert Bosch Gmbh Robust video-based handwriting and gesture recognition for in-car applications
US8552886B2 (en) 2010-11-24 2013-10-08 Bcs Business Consulting Services Pte Ltd. Crash warning system for motor vehicles
US8863256B1 (en) 2011-01-14 2014-10-14 Cisco Technology, Inc. System and method for enabling secure transactions using flexible identity management in a vehicular environment
US8514717B2 (en) 2011-01-21 2013-08-20 T-Mobile Usa, Inc. Smart connection manager
US8788220B2 (en) 2011-01-21 2014-07-22 The United States Of America As Represented By The Secretary Of The Navy Vehicle damage detection system
JP5776962B2 (en) 2011-01-26 2015-09-09 NEC Corporation Navigation device
US8825362B2 (en) 2011-01-27 2014-09-02 Honda Motor Co., Ltd. Calendar sharing for the vehicle environment using a connected cell phone
US8825382B2 (en) * 2011-02-22 2014-09-02 Telenav, Inc. Navigation system with multimedia distribution mechanism and method of operation thereof
US20120254763A1 (en) 2011-04-01 2012-10-04 Ford Global Technologies, Llc Methods and systems for using and managing aggregated electronic calendars in a vehicle
US8613385B1 (en) 2011-06-02 2013-12-24 Digecor I.P. And Assets Pty. Ltd. Audio-visual entertainment system and docking systems associated therewith
US8560155B2 (en) 2011-06-15 2013-10-15 Chrysler Group Llc Adaptive powertrain control for plugin hybrid electric vehicles
US9030327B2 (en) 2011-06-23 2015-05-12 Honeywell International Inc. System and method for detecting radiation emissions
US9542241B2 (en) 2011-07-12 2017-01-10 Harman International Industries, Incorporated Navigation application interface
US9014888B2 (en) * 2011-07-21 2015-04-21 Saturna Green Systems Inc. Vehicle communication, analysis and operation system
BR112014001702B1 (en) 2011-07-26 2021-06-29 Toyota Jidosha Kabushiki Kaisha Vehicle identification system and vehicle identification device
US9557981B2 (en) 2011-07-26 2017-01-31 Ford Global Technologies, Llc Method and apparatus for automatic module upgrade
US20130030645A1 (en) 2011-07-28 2013-01-31 Panasonic Corporation Auto-control of vehicle infotainment system based on extracted characteristics of car occupants
US9919598B2 (en) 2011-08-22 2018-03-20 Lg Electronics Inc. Mobile terminal, image display apparatus mounted in vehicle and data processing method using the same
US20130079964A1 (en) 2011-09-27 2013-03-28 Saturna Green Systems Inc. Vehicle communication, analysis and operation system
US20130076653A1 (en) 2011-09-27 2013-03-28 Mohammed Selim Displaying of charging status on dual screen device
WO2013052678A2 (en) 2011-10-04 2013-04-11 Advanergy, Inc. Battery management system and method
US8872647B2 (en) * 2011-10-21 2014-10-28 Ford Global Technologies, Llc Method and apparatus for context adaptive multimedia management
US20130282946A1 (en) 2012-04-23 2013-10-24 Flextronics Ap, Llc Controller area network bus
US20130219039A1 (en) 2011-11-16 2013-08-22 Flextronics Ap, Llc Network selector in a vehicle infotainment system
US20130139140A1 (en) 2011-11-29 2013-05-30 Ford Global Technologies, Llc Method and Apparatus for Mobile Mesh Network Vehicular Software Updating
US8660735B2 (en) 2011-12-14 2014-02-25 General Motors Llc Method of providing information to a vehicle
US9420432B2 (en) 2011-12-23 2016-08-16 Microsoft Technology Licensing, Llc Mobile devices control
US8688290B2 (en) 2011-12-27 2014-04-01 Toyota Motor Engineering & Manufacturing North America, Inc. Predictive destination entry for a navigation system
US20130198031A1 (en) 2012-01-27 2013-08-01 Guy Mitchell Method and system for optimum routing
US8635018B2 (en) * 2012-02-03 2014-01-21 International Business Machines Corporation Managing a driver profile
US20130212065A1 (en) 2012-02-15 2013-08-15 Flybits, Inc. Zone Oriented Applications, Systems and Methods
US20130218445A1 (en) 2012-02-16 2013-08-22 Intelligent Mechatronic Systems Inc. Traffic portal enquiry and alert system
US9489644B2 (en) 2012-02-23 2016-11-08 Ford Global Technologies, Llc Vehicle drive matching system and method
US8442758B1 (en) 2012-02-27 2013-05-14 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for a vehicle smart calendar
US20130231784A1 (en) 2012-03-01 2013-09-05 Toyota Motor Engineering & Manufacturing North America, Inc. Systems, Methods and Vehicles For Controlling Home Automation
WO2014172380A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc Altered map routes based on user profile information
US20140309863A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Parental control over vehicle features and child alert system
US20140309813A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Guest vehicle user reporting
US20140309849A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Driver facts behavior information storage system
US9384609B2 (en) 2012-03-14 2016-07-05 Autoconnect Holdings Llc Vehicle to vehicle safety and traffic communications
US9378601B2 (en) 2012-03-14 2016-06-28 Autoconnect Holdings Llc Providing home automation information via communication with a vehicle
WO2014172327A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc Synchronization between vehicle and user device calendar
US20140309862A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc User profile exchange via vehicle supported communications protocol
US20140310031A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Transfer of user profile data via vehicle agency control
US8706416B2 (en) 2012-04-03 2014-04-22 Ford Global Technologies, Llc System and method for determining a vehicle route
EP2662847B1 (en) 2012-05-11 2014-08-06 ITCiCo Spain, S.L. In-vehicle information delivery system and method
US20130338914A1 (en) * 2012-06-14 2013-12-19 Wavemarket Inc. System and method for notifying vehicle driver of localized driving conditions
US8818626B2 (en) 2012-06-21 2014-08-26 Visteon Global Technologies, Inc. Mobile device wireless camera integration with a vehicle
US9678573B2 (en) 2012-07-30 2017-06-13 Microsoft Technology Licensing, Llc Interaction with devices based on user state
US20140058672A1 (en) 2012-08-21 2014-02-27 Google Inc. Calculating a travel route based on a user's navigational preferences and travel history
US20140067564A1 (en) 2012-08-30 2014-03-06 Ebay Inc. Shopping list creator and optimizer
US9809185B2 (en) 2012-09-04 2017-11-07 Ford Global Technologies, Llc Method and apparatus for subjective command control of vehicle systems
US9128798B2 (en) 2012-10-17 2015-09-08 Movimento Group Module updating device
US8813061B2 (en) 2012-10-17 2014-08-19 Movimento Group Module updating device
US8805017B2 (en) 2012-12-13 2014-08-12 Intel Corporation Gesture pre-processing of video stream to reduce platform power
US9049564B2 (en) * 2013-02-04 2015-06-02 Zf Friedrichshafen Ag Vehicle broadcasting system
US20140245278A1 (en) 2013-02-22 2014-08-28 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Automotive component self update via software version control
US9075686B2 (en) 2013-02-25 2015-07-07 GM Global Technology Operations LLC System and method to improve control module reflash time
US9499114B2 (en) 2013-02-27 2016-11-22 Gentex Corporation System and method for monitoring vehicle speed and driver notification
US9188451B2 (en) 2013-02-28 2015-11-17 Here Global B.V. Method and apparatus for minimizing power consumption in a navigation system
US11493347B2 (en) 2013-03-12 2022-11-08 Verizon Patent And Licensing Inc. Using historical location data to improve estimates of location
US20140282470A1 (en) 2013-03-13 2014-09-18 Arynga Inc. Remote transfer of electronic images to a vehicle
US9303997B2 (en) 2013-03-15 2016-04-05 Apple Inc. Prediction engine
WO2014172316A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc Building profiles associated with vehicle users
US20140309790A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Adjusting home automation comfort based on vehicle associated user profile data
WO2014172322A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc Vehicle intruder alert detection and indication
EP2817170A4 (en) 2013-04-15 2015-11-04 Access and portability of user profiles stored as templates
US20140310103A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Creating targeted advertising profiles based on user behavior
EP2987153A4 (en) 2013-04-15 2016-09-21 Autoconnect Holdings Llc Central network for automated control of vehicular traffic
WO2014172334A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc User gesture control of vehicle features
WO2014172320A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc Vehicle location-based home automation triggers
EP2817176A4 (en) 2013-04-15 2016-08-10 Autoconnect Holdings Llc User interface and virtual personality presentation based on user profile

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016156448A1 (en) * 2015-04-02 2016-10-06 Continental Automotive Gmbh Computer system for a vehicle
WO2017200568A1 (en) * 2016-05-17 2017-11-23 Google Llc Application execution while operating vehicle
US10501093B2 (en) 2016-05-17 2019-12-10 Google Llc Application execution while operating vehicle
GB2555088A (en) * 2016-10-11 2018-04-25 Jaguar Land Rover Ltd Interface apparatus and method
US11273846B2 (en) 2016-10-11 2022-03-15 Jaguar Land Rover Limited Interface apparatus and method
EP3422265A1 (en) * 2017-06-30 2019-01-02 Seat, S.A. Method for generating at least one notification to a user in a shared vehicle
US11713600B1 (en) * 2019-09-30 2023-08-01 Ambarella International Lp Child safety lock
US11772603B2 (en) 2021-05-18 2023-10-03 Motional Ad Llc Passenger authentication and entry for autonomous vehicles

Also Published As

Publication number Publication date
WO2014172369A3 (en) 2015-02-26
US20140309806A1 (en) 2014-10-16
US9082239B2 (en) 2015-07-14

Similar Documents

Publication Publication Date Title
US9290153B2 (en) Vehicle-based multimode discovery
US9082239B2 (en) Intelligent vehicle for assisting vehicle occupants
US9082238B2 (en) Synchronization between vehicle and user device calendar
US20170247000A1 (en) User interface and virtual personality presentation based on user profile
US20160247377A1 (en) Guest vehicle user reporting
US20140309868A1 (en) User interface and virtual personality presentation based on user profile
WO2014172316A1 (en) Building profiles associated with vehicle users
WO2014172323A1 (en) Driver facts behavior information storage system
WO2014172334A1 (en) User gesture control of vehicle features

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 14784681
Country of ref document: EP
Kind code of ref document: A2
122 EP: PCT application non-entry in European phase
Ref document number: 14784681
Country of ref document: EP
Kind code of ref document: A2