US20150081133A1 - Gesture-based system enabling children to control some vehicle functions in a vehicle - Google Patents
- Publication number
- US20150081133A1 (application Ser. No. 14/447,465)
- Authority
- US
- United States
- Prior art keywords
- child
- vehicle
- control
- subsystem
- recited
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/10—Interpretation of driver requests or demands
- B60K35/10
- B60K35/656
- B60K2360/146
Definitions
- the present disclosure relates to a vehicle and more particularly to systems and methods which enable young children to control certain vehicle functions.
- a system for a vehicle is configured to enable child control of vehicle functions.
- the system includes a detection subsystem and a control subsystem.
- the detection subsystem can be configured to detect the presence of a child in a vehicle passenger seat and to detect user commands such as hand gesture commands issued by a child.
- the control subsystem, which is in communication with the detection subsystem, is configured to enable a user to control at least one vehicle function.
- the system additionally includes a response subsystem, such as an audiovisual entertainment system or a temperature control system.
- a method for enabling child control of vehicle functions can include a step of equipping a control system, for installation in a vehicle, with a detection subsystem and a control subsystem.
- the detection subsystem can include at least one child detection sensor and at least one command sensor and the control subsystem can be configured to enable a child to control at least one vehicle function.
- the detection subsystem can be operable to transmit command data to the control subsystem and the control subsystem can be operable to receive and interpret the command data.
- a vehicle which possesses a system configured to enable a child to control vehicle functions can include a detection subsystem and a control subsystem.
- the detection subsystem can be configured to detect the presence of a child in a vehicle passenger seat and to detect user commands such as hand gesture commands issued by a child.
- the control subsystem, which is in communication with the detection subsystem, is configured to enable a user to control at least one vehicle function.
- the system additionally includes a response subsystem, such as an audiovisual entertainment system or a temperature control system.
- FIG. 1 is an overhead interior plan view of a vehicle having a system for child control of vehicle functions;
- FIG. 2 is a partial interior view of the vehicle with components of a detection subsystem which is a portion of the system for child control of vehicle functions;
- FIG. 3 is a schematic representation of the operation of the system for child control of vehicle functions; and
- FIG. 4 is a block diagram illustrating an example of a control algorithm useable by the system for child control of vehicle functions.
- the present disclosure describes a system and method to reduce driver distraction by enabling children as young as toddlers to control certain vehicle systems such as entertainment systems.
- by engaging children in the riding experience and freeing drivers of the need to choose and play videos, change music, adjust temperature, etc., the embodiments described herein create an interactive environment for children and allow drivers to focus on the road.
- the various embodiments described herein generally include a variety of sensors enabled to detect the presence of a child in a vehicle seat, and to detect gesture commands issued by the child. These sensors are in communication with a control module tasked with interpreting the gesture commands.
- the control module will typically have access to a command database, which it uses to match known commands to the child's detected gestures. The control module will then relay the commands to various execution systems, such as audio/visual systems or temperature control systems.
- a vehicle 100 includes a system 200 configured to facilitate the control of vehicle functions by a child.
- the term “child” as used here refers generally to any minor. But as will become apparent, the system 200 is particularly suited in certain of its operations to facilitate the control of vehicle functions by a young child who has difficulty operating conventional devices such as those controlled by buttons or knobs. In certain of its operations, the system 200 is particularly suited to facilitate control of vehicle functions by a child who is too young to speak.
- the system 200 includes a detection subsystem 210 , configured to detect the presence of a child and to detect user commands.
- the system additionally includes a control subsystem 220 , configured to interpret user commands and to control vehicle functions.
- the detection subsystem 210 and the control subsystem 220 are in communication with one another.
- while FIG. 1 illustrates the detection subsystem 210 as being generally located on the left side of the last row of a three-seat vehicle, elements of the detection subsystem 210 can be located anywhere in a vehicle 100 interior, such as in any seat, or anywhere within a headliner, floor liner or door panel, for example.
- similarly, while the control subsystem 220 is illustrated as being generally located in the vicinity of a vehicle 100 control panel or head unit, elements of the control subsystem 220 can be located anywhere throughout the vehicle.
- the detection subsystem 210 includes at least one child detection sensor 212 , configured to detect the presence of a child in a vehicle passenger seat.
- the phrase “vehicle passenger seat” refers to any appropriate seating area in the vehicle 100 other than the driver's seat, but particularly refers to a second or third row seat or any seat not in the driver's seat row.
- a child detection sensor 212 can include a seat pressure sensor, an imaging sensor such as a closed-circuit television camera detecting two-dimensional or three-dimensional video image data, an audio sensor, or any other device capable of detecting physical properties of a child useful to distinguish a child from an adult. In many instances, more than one child detection sensor 212 will be deployed throughout the vehicle.
- the detection subsystem 210 also includes at least one command detection sensor 214 , configured to detect commands issued by a user and to transmit data relating to said commands.
- the commands to be detected by any given command detection sensor 214 can include visible commands, such as facial expression commands or hand gesture commands.
- the commands to be detected by any given command detection sensor 214 can include audible commands, such as uttered words or other sounds.
- Suitable examples of a gesture detection sensor can include a two-dimensional or three-dimensional imaging sensor capable of detecting user issued commands, in particular gesture commands.
- the same device can function as both a child detection sensor 212 and a command detection sensor 214 .
- an imaging sensor could detect the presence of a child in a seat and detect gesture commands issued by the child.
- a child detection sensor 212 and a command detection sensor 214 can be different devices.
- any given seat in the vehicle 100 can have more than one child detection sensor 212 deployed to detect the presence of a child in that particular seat. In some instances of such a deployment, child detection can proceed through a first determination event indicating the possible presence of a child in the seat, followed by a second detection event confirming the presence of a child in the seat.
- a child detection sensor 212 such as a seat pressure sensor could indicate the possible presence of a child in a vehicle passenger seat, for example by detecting a weight between 30 and 100 pounds disposed on a rear passenger seat.
- a second child detection sensor 212 such as an imaging sensor properly positioned to have a viewing field encompassing the seating area, could be activated by this first determination. The activated second child detection sensor can then monitor the field of view for imaging data consistent with the presence of a child.
- FIG. 2 shows a somewhat stylized, partial interior view of a vehicle having an imaging sensor in the floor functioning as a command detection sensor 214 . As noted above, the imaging sensor of FIG. 2 can also be functioning as a child detection sensor 212 .
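The two-stage detection flow described above (a pressure reading indicating a possible child, followed by imaging confirmation) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names and the 30-100 pound bounds taken from the example above are used as assumed thresholds.

```python
CHILD_WEIGHT_MIN_LBS = 30   # lower bound suggesting a child may be seated
CHILD_WEIGHT_MAX_LBS = 100  # upper bound helping distinguish a child from an adult

def first_determination(seat_weight_lbs: float) -> bool:
    """Stage 1: a seat pressure reading consistent with the possible presence of a child."""
    return CHILD_WEIGHT_MIN_LBS <= seat_weight_lbs <= CHILD_WEIGHT_MAX_LBS

def second_detection(image_suggests_child: bool) -> bool:
    """Stage 2: an imaging sensor, activated by stage 1, confirms a child is present."""
    return image_suggests_child

def child_present(seat_weight_lbs: float, image_suggests_child: bool) -> bool:
    # The imaging sensor is only consulted once the pressure sensor has
    # indicated the possible presence of a child in the seat.
    return first_determination(seat_weight_lbs) and second_detection(image_suggests_child)
```

The short-circuiting `and` mirrors the described activation order: the second sensor is engaged only after the first determination event.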
- the control subsystem 220 can generally include a control module 222 with a processor 224 , a memory 226 , and an interface 228 .
- the processor 224 may be any type of microprocessor having desired performance characteristics.
- the memory 226 can include any type of computer readable medium which stores the data and control algorithms described herein or otherwise useful to system 200 .
- the functions of a control algorithm that can be included in memory 226 are illustrated in FIG. 4 in terms of a functional block diagram. It should be understood by those skilled in the art with the benefit of this disclosure that these functions may be enacted in either dedicated hardware circuitry or programmed software routines capable of execution in a microprocessor based electronics control embodiment.
- a control algorithm can include or access additional algorithms or libraries, such as a gesture command library and/or a gesture interpretation algorithm.
- the memory 226 can include a gesture command library containing all gesture commands interpretable by the system 200 .
- a control algorithm and/or a gesture interpretation algorithm would compare the received data to data stored in the gesture command library to determine whether an executable command had been detected.
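The library-lookup step described above can be sketched in a few lines. The gesture labels and command names here are illustrative assumptions; the patent does not specify the library's contents or encoding.

```python
from typing import Optional

# Hypothetical gesture command library: maps a recognized gesture label
# to an executable command, as the gesture interpretation step requires.
GESTURE_COMMAND_LIBRARY = {
    "clap": "SHOW_MENU",
    "thumbs_up": "SHOW_MENU",
    "point": "SELECT_ITEM",
    "swipe_lateral": "SCROLL",
}

def interpret_gesture(detected_gesture: str) -> Optional[str]:
    """Return the matching executable command, or None if no executable
    command has been detected."""
    return GESTURE_COMMAND_LIBRARY.get(detected_gesture)
```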
- the control module 222 may be a portion of a central vehicle control, a stand-alone unit, or another system such as a cloud-based system.
- Other operational software for the processor 224 may also be stored in the memory 226 .
- the interface 228 facilitates communication with other subsystems such as the detection subsystem 210 or a response subsystem 300 discussed below.
- the system 200 will further include a response subsystem 300 in communication with the control subsystem 220 .
- the response subsystem 300 is identifiable with the vehicle function that is subject to control by the system 200 .
- the response subsystem 300 could be an audiovisual system or a temperature control system.
- in the examples of FIGS. 1-3, the illustrated response subsystem 300 is an audiovisual system that can play videos, music, or other audiovisual entertainment for a child sitting in a rear passenger seat.
- the system 200 can store child profiles containing identification and/or permissions data relating to specific children, categories of children, or both.
- for example, in a vehicle 100 which routinely carries three specific children, the system 200 could store a child profile for each of those three children.
- Each child profile can contain, for example, weight data, skeleton joint relationship data, or facial recognition data useable by the system to specifically identify each child when present as a passenger in the vehicle 100 .
- Each child profile can additionally contain permissions data indicating what vehicle functions that child may control or the extent to which s/he may control them. For example, a younger child could have permissions to only control a video playback system directed to his/her seat, while an older child has permissions to control a video playback system as well as a localized temperature control system.
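A child profile combining identification data with per-child permissions, as described above, could be represented along these lines. The field names and permission labels are hypothetical; the patent names weight, skeleton joint relationship, and facial recognition data as possible identification fields.

```python
from dataclasses import dataclass, field

@dataclass
class ChildProfile:
    name: str
    weight_lbs: float                               # identification data
    permissions: set = field(default_factory=set)   # controllable vehicle functions

    def may_control(self, function: str) -> bool:
        """Check whether this child's permissions cover a vehicle function."""
        return function in self.permissions

# The younger child may only control video playback; the older child may
# also control localized temperature, per the example in the text.
younger = ChildProfile("toddler", 25.0, {"video_playback"})
older = ChildProfile("five_year_old", 50.0, {"video_playback", "temperature"})
```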
- a driver, parent or other vehicle user can input or edit child profiles either through a direct interaction with vehicle controls or remotely such as through a remote personal computer or mobile device application. For example, if a parent/driver discovers that a child passenger routinely misuses the system 200 , the parent/driver can edit that child's profile to restrict control permissions. In other variations, a parent/driver can reversibly deactivate the system 200 .
- a parent places two children, ages two and five, in the back seats of a family minivan. The parent gets in the driver's seat and begins driving.
- System 200 can be activated at various times, such as when the children are placed in their seats, when main vehicle electrical power is engaged, when the parent begins driving, or at another suitable time.
- pressure sensors in the two rear seats, operating as child detection sensors 212, detect twenty-five and fifty pounds of pressure, respectively, in the two seats. These data are sent to the control subsystem 220, which determines that the data are consistent with the presence of children in the two seats.
- the control subsystem 220 further accesses stored child profiles and determines the data are consistent with two specific children for whom profiles are stored.
- the control subsystem 220 then activates two imaging sensors positioned to have a field of view encompassing the two seats.
- the imaging sensors acquire image and/or motion data and communicate these data to the control subsystem 220 .
- the control subsystem 220 compares the newly received data to information stored in the child profiles or elsewhere pertaining to facial recognition, joint skeletal relationships, or the like and confirms on that basis the presence and identities of the two seated children.
- the control subsystem 220 directs two video screens, one each deployed in a convenient viewing area for each child, to display a welcome message, each customized to the respective child.
- the two video screens can be regarded as elements of a response subsystem 300 which can include speakers or other devices.
- the control subsystem 220 continues receiving imaging data from the two imaging sensors and separately compares the received data to a gesture command library.
- the system 200 determines, based on data stored in the child profiles or elsewhere, that a relatively small number of gesture commands can be considered executable when issued by the two-year-old, while a larger number of gesture commands can be considered executable when issued by the five-year-old.
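The per-child restriction described above, where only a subset of library gestures counts as executable for a given child, can be sketched as a simple filter. The gesture names and age-keyed subsets are assumptions chosen to match the scenario's two- and five-year-old.

```python
# Full set of gestures the system can recognize (illustrative labels).
FULL_GESTURE_SET = {"clap", "thumbs_up", "point", "swipe_lateral"}

# Hypothetical per-age executable subsets: the two-year-old gets a small
# command set, the five-year-old a larger one that includes scrolling.
ALLOWED_GESTURES_BY_AGE = {
    2: {"clap", "thumbs_up", "point"},
    5: FULL_GESTURE_SET,
}

def executable(gesture: str, child_age: int) -> bool:
    """A detected gesture is executable only if this child's subset allows it."""
    return gesture in ALLOWED_GESTURES_BY_AGE.get(child_age, set())
```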
- the two-year-old issues a first hand gesture, such as a clap or a thumbs-up, to bring up on the display four images relating to four videos the child may watch.
- the child points at one of the images and that video begins playing.
- the five-year-old issues a first hand gesture, such as a clap or a thumbs-up, to bring up on the display a scroll bar enabling scrolling through a variety of images relating to videos or music the child may select.
- the child conducts a series of lateral swipe gestures to scroll through the images and ultimately points at the image pertaining to the content he wishes to select.
- the control subsystem 220 directs the response subsystem to play the selected content.
- the control subsystem 220, upon receiving this information from the relevant imaging sensor of the detection subsystem, directs the vehicle's temperature control system to send warm air through vents located near that child's seat.
- the method includes a step of equipping a control system 200 , for installation in a vehicle 100 , with a detection subsystem 210 and a control subsystem 220 .
- the detection subsystem 210 can include at least one child detection sensor 212 and at least one command sensor 214 and the control subsystem 220 can be configured to enable a child to control at least one vehicle function.
- the method is performed such that the detection subsystem 210 is operable to detect a command such as a hand gesture command issued by a child.
- the detection subsystem 210 can also be configured to transmit command data to the control subsystem 220 .
- the control subsystem 220 is operable to receive and interpret the command data transmitted by the detection subsystem 210 .
- the control subsystem 220 can then issue execution instructions to a response subsystem 300.
- the system 200, detection subsystem 210, control subsystem 220, and response subsystem 300 as used with the method are as described above.
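The method's data flow, where the detection subsystem transmits command data, the control subsystem receives and interprets it, and the response subsystem executes, can be sketched end to end. All function names and the tiny library below are hypothetical stand-ins for the subsystems described.

```python
from typing import Optional

def detection_subsystem(raw_gesture: str) -> dict:
    """Detect a gesture command and package it as command data."""
    return {"gesture": raw_gesture}

def control_subsystem(command_data: dict, library: dict) -> Optional[str]:
    """Receive and interpret command data against a gesture command library."""
    return library.get(command_data["gesture"])

def response_subsystem(instruction: str) -> str:
    """Execute an instruction (here, simply report what would be executed)."""
    return f"executing {instruction}"

# A pointing gesture selects content, per the scenario above.
library = {"point": "PLAY_SELECTED_VIDEO"}
instruction = control_subsystem(detection_subsystem("point"), library)
result = response_subsystem(instruction) if instruction else "no-op"
```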
- also described is a vehicle 100 having a system 200 of the type described above.
- vehicle 100 can be a car, van, truck, or any motor vehicle which can ordinarily be used to transport children.
Abstract
A system for a vehicle is configured to enable young children to control certain vehicle functions such as audiovisual, entertainment or temperature control functions. In different variations, the system can detect the presence of a child in a rear vehicle seat, determine whether the child is a specific child known to the system, and then accordingly grant vehicle function control permissions. The system can detect, interpret, and execute gesture commands issued by a child. In many instances, useable gesture commands can be sufficiently simple that they are understandable to and reproducible by children even as young as toddlers. In general, the operation of the system can decrease driver distraction by freeing a driver of the need to operate vehicle functions on behalf of children.
Description
- This application claims priority to U.S. Provisional Application 61/878,898, filed Sep. 17, 2013 and is a continuation-in-part of U.S. patent application Ser. No. 14/180,563, filed Feb. 14, 2014, which claims priority to U.S. Provisional Application No. 61/878,898, filed Sep. 17, 2013, each of which is incorporated herein by reference.
- Young passengers riding in vehicles can at times become restless and noisy, causing driver distraction. Modern vehicles often contain entertainment systems, such as rear seat DVD displays or other audiovisual entertainment systems, which can decrease driver distraction by providing entertainment or other engagement for young passengers, thereby minimizing restless back seat behavior. Frequently, however, such systems are not amenable to direct control by young children. When the driver is required to control such systems on children's behalf, this can increase driver distraction. Even child-oriented systems with voice recognition or child-friendly controls such as touch screens may not be amenable to young children with not-yet-developed speaking ability or manual motor skills.
- Research related to young children and sign language indicates that in some cases children even as young as six months can learn and understand rudimentary sign language or gesture-based communication. Young children who have started to speak but have imperfect pronunciation or limited speaking vocabularies are capable of learning and understanding fairly extensive sign language or gesture-based communication.
- Various features will become apparent to those skilled in the art from the following detailed description of the disclosed non-limiting embodiment. The drawings that accompany the detailed description can be briefly described as follows:
-
FIG. 1 is an overhead interior plan view of a vehicle having a system for child control of vehicle functions; -
FIG. 2 is a partial interior view of the vehicle with components of a detection subsystem which is a portion of the system for child control of vehicle functions; -
FIG. 3 is a schematic representation of the operation of the system for child control of vehicle functions; and -
FIG. 4 is a block diagram illustrating an example of a control algorithm useable by the system for child control of vehicle functions. - The present disclosure describes a system and method to reduce driver distraction by enabling children as young as toddlers to control certain vehicle systems such as entertainment systems. By engaging children in the riding experience and freeing drivers of the need to choose and play videos, change music, adjust temperature, etc., the embodiments described herein create an interactive environment for children and allow drivers to focus on the road.
- The various embodiments described herein generally include a variety of sensors enabled to detect the presence of a child in a vehicle seat, and to detect gesture commands issued by the child. These sensors are in communication with a control module tasked with interpreting the gesture commands. The control module will typically have access to a command database, which it uses to match known commands to the child's detected gestures. The control module will then relay the commands to various execution systems, such as audio/visual systems or temperature control systems.
- Referring now to
FIG. 1 , avehicle 100 includes asystem 200 configured to facilitate the control of vehicle functions by a child. The term “child” as used here refers generally to any minor. But as will become apparent, thesystem 200 is particularly suited in certain of its operations to facilitate the control of vehicle functions by a young child who has difficulty operating conventional devices such as those controlled by buttons or knobs. In certain of its operations, thesystem 200 is particularly suited to facilitate control of vehicle functions by a child who is too young to speak. - With continuing reference to
FIG. 1 , thesystem 200 includes adetection subsystem 210, configured to detect the presence of a child and to detect user commands. The system additionally includes acontrol subsystem 220, configured to interpret user commands and to control vehicle functions. Thedetection subsystem 210 and thecontrol subsystem 220 are in communication with one another. - It should be understood that, while
FIG. 1 illustrates thedetection subsystem 210 as being generally located on the left side of the last row of a three seat vehicle, elements of thedetection subsystem 210 can be located anywhere in avehicle 100 interior, such as in any seat, or anywhere within a headliner, floor liner or door panel, for example. Similarly, while thecontrol subsystem 220 is illustrated as being generally located in the vicinity of avehicle 100 control panel or head unit, elements of thecontrol subsystem 220 can be located anywhere throughout the vehicle. - The
detection subsystem 210 includes at least onechild detection sensor 212, configured to detect the presence of a child in a vehicle passenger seat. As used herein, the phrase “vehicle passenger seat” refers to any appropriate seating area in thevehicle 100 other than the driver's seat, but particularly refers to a second or third row seat or any seat not in the driver's seat row. Achild detection sensor 212 can include a seat pressure sensor, an imaging sensor such as a closed-circuit television camera detecting two-dimensional or three-dimensional video image data, an audio sensor, or any other device capable of detecting physical properties of a child useful to distinguish a child from an adult. In many instances, more than onechild detection sensor 212 will be deployed throughout the vehicle. - The
detection subsystem 210 also includes at least onecommand detection sensor 214, configured to detect commands issued by a user and to transmit data relating to said commands. In some variations, the commands to be detected by any givencommand detection sensor 214 can include visible commands, such as facial expression commands or hand gesture commands. In the same or other variations, the commands to be detected by any givencommand detection sensor 214 can include audible commands, such as uttered words or other sounds. Suitable examples of a gesture detection sensor can include a two-dimensional or three-dimensional imaging sensor capable of detecting user issued commands, in particular gesture commands. - It should be understood that in some instances the same device can function as both a
child detection sensor 212 and acommand detection sensor 214. For example, an imaging sensor could detect the presence of a child in a seat and detect gesture commands issued by the child. In other instances, achild detection sensor 212 and acommand detection sensor 214 can be different devices. In various instances, any given seat in thevehicle 100 can have more than onechild detection sensor 212 deployed to detect the presence of a child in that particular seat. In some instances of such a deployment, child detection can proceed through a first determination event indicating the possible presence of a child in the seat, followed by a second detection event confirming the presence of a child in the seat. - As an example of the latter scenario, in a first determination a
child detection sensor 212, such as a seat pressure sensor could indicate the possible presence of a child in a vehicle passenger seat, for example by detecting a weight between 30 and 100 pounds disposed on a rear passenger seat. A secondchild detection sensor 212, such as an imaging sensor properly positioned to have a viewing field encompassing the seating area, could be activated by this first determination. The activated second child detection sensor can then monitor the field of view for imaging data consistent with the presence of a child.FIG. 2 shows a somewhat stylized, partial interior view of a vehicle having an imaging sensor in the floor functioning as acommand detection sensor 214. As noted above, the imaging sensor ofFIG. 2 can also be functioning as achild detection sensor 212. - With reference to
FIG. 3 , thecontrol subsystem 220 can generally include acontrol module 222 with aprocessor 224, amemory 226, and aninterface 228. Theprocessor 224 may be any type of microprocessor having desired performance characteristics. Thememory 226 can include any type of computer readable medium which stores the data and control algorithms described herein or otherwise useful tosystem 200. The functions of an control algorithm that can be included inmemory 226 are illustrated inFIG. 4 in terms of a functional block diagram. It should be understood by those skilled in the art with the benefit of this disclosure that these functions may be enacted in either dedicated hardware circuitry or programmed software routines capable of execution in a microprocessor based electronics control embodiment. - In some instances, control algorithm can include or access additional algorithms or libraries, such as a gesture command library and/or a gesture interpretation algorithm. For example, the
memory 226 can include a gesture command library containing all gesture commands interpretable by thesystem 200. Upon receipt of data from acommand detection sensor 214 of thedetection subsystem 210, such a control algorithm and/or a gesture interpretation algorithm would compare the received data to data stored in the gesture command library to determine whether an executable command had been detected. - With continued reference to
FIG. 3 , thecontrol module 222 may be a portion of a central vehicle control, a stand-alone unit, or other system such as a cloud-based system. Other operational software for theprocessor 224 may also be stored in thememory 226. Theinterface 228 facilitates communication with other subsystems such as thedetection subsystem 210 or aresponse subsystem 300 discussed below. - In many instances, the
system 200 will further include aresponse subsystem 300 in communication with thecontrol subsystem 220. Theresponse subsystem 300 is identifiable with the vehicle function that is subject to control by thesystem 200. For example, theresponse subsystem 300 could be an audiovisual system or a temperature control system. In the examples ofFIGS. 1-3 , the illustrative examples of theresponse subsystem 300 is an audiovisual system such as can play videos, music, or other audiovisual entertainment for a child sitting in a rear passenger seat. - In some variations, the
system 200 can store child profiles containing identification and/or permissions data relating to specific children, categories of children, or both. For example, in a vehicle 100 which routinely carries three specific children, the system 200 could store a child profile for each of those three children. Each child profile can contain, for example, weight data, skeleton joint relationship data, or facial recognition data usable by the system to specifically identify each child when present as a passenger in the vehicle 100. Each child profile can additionally contain permissions data indicating what vehicle functions that child may control or the extent to which s/he may control them. For example, a younger child could have permissions to only control a video playback system directed to his/her seat, while an older child has permissions to control a video playback system as well as a localized temperature control system. - Optionally, a driver, parent, or other vehicle user can input or edit child profiles either through direct interaction with vehicle controls or remotely, such as through a remote personal computer or mobile device application. For example, if a parent/driver discovers that a child passenger routinely misuses the
system 200, the parent/driver can edit that child's profile to restrict control permissions. In other variations, a parent/driver can reversibly deactivate the system 200. - Following is an exemplary scenario to further illustrate the use and some operational features of the
system 200. A parent places two children, ages two and five, in the back seats of a family minivan. The parent gets in the driver's seat and begins driving. The system 200 can be activated at various times, such as when the children are placed in their seats, when main vehicle electrical power is engaged, when the parent begins driving, or at another suitable time. Upon activation of the system 200, pressure sensors in the two rear seats, operating as child detection sensors 212, detect twenty-five and fifty pounds of pressure, respectively, in the two seats. These data are sent to the control subsystem 220, which determines that the data are consistent with the presence of children in the two seats. The control subsystem 220 further accesses stored child profiles and determines the data are consistent with two specific children for whom profiles are stored. - The
control subsystem 220 then activates two imaging sensors positioned to have a field of view encompassing the two seats. The imaging sensors acquire image and/or motion data and communicate these data to the control subsystem 220. The control subsystem 220 compares the newly received data to information stored in the child profiles or elsewhere pertaining to facial recognition, joint skeletal relationships, or the like, and confirms on that basis the presence and identities of the two seated children. - The
control subsystem 220 directs two video screens, one deployed in a convenient viewing area for each child, to display a welcome message, each customized to the respective child. The two video screens can be regarded as elements of a response subsystem 300, which can include speakers or other devices. The control subsystem 220 continues receiving imaging data from the two imaging sensors and separately compares the received data to a gesture command library. The system 200 determines, based on data stored in the child profiles or elsewhere, that a relatively small number of gesture commands can be considered executable when issued by the two-year-old, while a larger number of gesture commands can be considered executable when issued by the five-year-old. - The two-year-old issues a first hand gesture, such as a clap or a thumbs-up, to bring up on the display four images relating to four videos the child may watch. The child points at one of the images, and that video begins playing. The five-year-old issues a first hand gesture, such as a clap or a thumbs-up, to bring up on the display a scroll bar to enable scrolling through a variety of images relating to videos or music the child may select. The child conducts a series of lateral swipe gestures to scroll through the images and ultimately points at the image pertaining to the content he wishes to select. The
control subsystem 220 directs the response subsystem to play the selected content. - Subsequently, the five-year-old feels uncomfortably cold, wraps his arms around himself, and grimaces. The
control subsystem 220, upon receiving this information from the relevant imaging sensor of the detection subsystem, directs the vehicle's temperature control system to send warm air through vents located near that child's seat. - It should be understood that the scenario described above is exemplary only, and is not intended to describe all uses or operations of the
system 200. Nor is this scenario intended to suggest that all uses or operations described therein will be present in different embodiments. Further, the sequence of operations above could be different, and various operations could be separated from one another or merged. - Also disclosed is a method for enabling child control of vehicle functions. The method includes a step of equipping a
control system 200, for installation in a vehicle 100, with a detection subsystem 210 and a control subsystem 220. The detection subsystem 210 can include at least one child detection sensor 212 and at least one command sensor 214, and the control subsystem 220 can be configured to enable a child to control at least one vehicle function. Typically, the method is performed such that the detection subsystem 210 is operable to detect a command, such as a hand gesture command, issued by a child. The detection subsystem 210 can also be configured to transmit command data to the control subsystem 220. Typically, the control subsystem 220 is operable to receive and interpret the command data transmitted by the detection subsystem 210. Upon interpreting command data, the control subsystem 220 can then issue execution instructions to a response subsystem 300. In particular embodiments, the system 200, detection subsystem 210, control subsystem 220, and response subsystem 300 as used with the method are as described above. - Also considered to be specifically within the scope of the disclosure is a
vehicle 100 having a system 200 of the type described above. The vehicle 100 can be a car, van, truck, or any motor vehicle which can ordinarily be used to transport children. - The foregoing description is exemplary rather than defined by the limitations within. Various non-limiting embodiments are disclosed herein; however, one of ordinary skill in the art would recognize that various modifications and variations in light of the above teachings will fall within the scope of the appended claims. It is therefore to be appreciated that within the scope of the appended claims, the disclosure may be practiced other than as specifically described. For that reason the appended claims should be studied to determine true scope and content.
Claims (17)
1. A system for a vehicle, comprising:
a detection subsystem configured to detect the presence of a child in a vehicle passenger seat and to detect commands issued by the child; and
a control subsystem in communication with the detection subsystem and configured to enable a child to control at least one vehicle function.
2. The system as recited in claim 1, further comprising at least one response subsystem.
3. The system as recited in claim 2, wherein the at least one response subsystem comprises any of an audio function, a video function, and an audiovisual function.
4. The system as recited in claim 1, wherein the detection subsystem comprises:
at least one child detection sensor; and
at least one command sensor.
5. The system as recited in claim 4, wherein the at least one command sensor is configured to detect a gesture command.
6. The system as recited in claim 4, wherein the at least one command sensor comprises an imaging sensor.
7. The system as recited in claim 1, wherein the control subsystem has access to a gesture command library.
8. The system as recited in claim 1, further comprising at least one child profile which contains identification and permissions data relevant to a specific child.
9. A method for enabling child control of vehicle functions, the method comprising:
equipping a control system, for installation in a vehicle, with:
a detection subsystem comprising at least one child detection sensor and at least one command sensor; and
a control subsystem configured to enable a child to control at least one vehicle function;
wherein the detection subsystem is operable to transmit command data to the control subsystem and the control subsystem is operable to receive and interpret the command data.
10. A vehicle having a system configured to enable a child to control vehicle functions, the system comprising:
a detection subsystem configured to detect the presence of a child in a vehicle passenger seat and to detect user commands; and
a control subsystem in communication with the detection subsystem and configured to enable a user to control at least one vehicle function.
11. The vehicle as recited in claim 10, wherein the system further comprises at least one response subsystem.
12. The vehicle as recited in claim 11, wherein the at least one response subsystem comprises any of an audio function, a video function, and an audiovisual function.
13. The vehicle as recited in claim 10, wherein the detection subsystem comprises:
at least one child detection sensor; and
at least one command sensor.
14. The vehicle as recited in claim 13, wherein the at least one command sensor is configured to detect a gesture command.
15. The vehicle as recited in claim 13, wherein the at least one command sensor comprises an imaging sensor.
16. The vehicle as recited in claim 10, wherein the control subsystem has access to a gesture command library.
17. The vehicle as recited in claim 10, further comprising at least one child profile which contains identification and permissions data relevant to a specific child.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/447,465 US20150081133A1 (en) | 2013-09-17 | 2014-07-30 | Gesture-based system enabling children to control some vehicle functions in a vehicle |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361878898P | 2013-09-17 | 2013-09-17 | |
US14/180,563 US20150081167A1 (en) | 2013-09-17 | 2014-02-14 | Interactive vehicle window display system with vehicle function control |
US14/447,465 US20150081133A1 (en) | 2013-09-17 | 2014-07-30 | Gesture-based system enabling children to control some vehicle functions in a vehicle |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/180,563 Continuation-In-Part US20150081167A1 (en) | 2013-09-17 | 2014-02-14 | Interactive vehicle window display system with vehicle function control |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150081133A1 true US20150081133A1 (en) | 2015-03-19 |
Family
ID=52668692
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/447,465 Abandoned US20150081133A1 (en) | 2013-09-17 | 2014-07-30 | Gesture-based system enabling children to control some vehicle functions in a vehicle |
Country Status (1)
Country | Link |
---|---|
US (1) | US20150081133A1 (en) |
Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5774591A (en) * | 1995-12-15 | 1998-06-30 | Xerox Corporation | Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images |
US20020126876A1 (en) * | 1999-08-10 | 2002-09-12 | Paul George V. | Tracking and gesture recognition system particularly suited to vehicular control applications |
US20030190076A1 (en) * | 2002-04-05 | 2003-10-09 | Bruno Delean | Vision-based operating method and system |
US20040052418A1 (en) * | 2002-04-05 | 2004-03-18 | Bruno Delean | Method and apparatus for probabilistic image analysis |
US20050271279A1 (en) * | 2004-05-14 | 2005-12-08 | Honda Motor Co., Ltd. | Sign based human-machine interaction |
US20070298885A1 (en) * | 2006-06-12 | 2007-12-27 | Tran Bao Q | Mesh network game controller with voice transmission, search capability, motion detection, and/or position detection |
US20080051946A1 (en) * | 1999-12-15 | 2008-02-28 | Automotive Technologies International, Inc. | Vehicular Heads-Up Display System |
US20080048930A1 (en) * | 1999-12-15 | 2008-02-28 | Automotive Technologies International, Inc. | Vehicular Heads-Up Display System |
US20080167892A1 (en) * | 2007-01-10 | 2008-07-10 | Neil Clark | System for ride sharing and method therefor |
US20080195428A1 (en) * | 2007-02-12 | 2008-08-14 | O'sullivan Sean | Shared transport system and service network |
US7561966B2 (en) * | 2003-12-17 | 2009-07-14 | Denso Corporation | Vehicle information display system |
US20090278915A1 (en) * | 2006-02-08 | 2009-11-12 | Oblong Industries, Inc. | Gesture-Based Control System For Vehicle Interfaces |
US20110010056A1 (en) * | 2009-07-08 | 2011-01-13 | Aisin Seiki Kabushiki Kaisha | Seat load determining apparatus |
US20120232749A1 (en) * | 2007-12-14 | 2012-09-13 | Schoenberg Gregory B | Systems and Methods for Indicating the Presence of a Child in a Vehicle |
US20120265814A1 (en) * | 2011-04-14 | 2012-10-18 | Stilianos George Roussis | Software Application for Managing Personal Matters and Personal Interactions through a Personal Network |
US20120262403A1 (en) * | 2009-12-22 | 2012-10-18 | Dav | Control device for a motor vehicle |
US20130030645A1 (en) * | 2011-07-28 | 2013-01-31 | Panasonic Corporation | Auto-control of vehicle infotainment system based on extracted characteristics of car occupants |
US20130063336A1 (en) * | 2011-09-08 | 2013-03-14 | Honda Motor Co., Ltd. | Vehicle user interface system |
US20130066526A1 (en) * | 2011-09-09 | 2013-03-14 | Thales Avionics, Inc. | Controlling vehicle entertainment systems responsive to sensed passenger gestures |
US8523667B2 (en) * | 2010-03-29 | 2013-09-03 | Microsoft Corporation | Parental control settings based on body dimensions |
US20130261871A1 (en) * | 2012-04-02 | 2013-10-03 | Google Inc. | Gesture-Based Automotive Controls |
US20130300644A1 (en) * | 2012-05-11 | 2013-11-14 | Comcast Cable Communications, Llc | System and Methods for Controlling a User Experience |
US8942428B2 (en) * | 2009-05-01 | 2015-01-27 | Microsoft Corporation | Isolate extraneous motions |
US9083581B1 (en) * | 2011-01-14 | 2015-07-14 | Cisco Technology, Inc. | System and method for providing resource sharing, synchronizing, media coordination, transcoding, and traffic management in a vehicular environment |
US10369966B1 (en) | 2018-05-23 | 2019-08-06 | Nio Usa, Inc. | Controlling access to a vehicle using wireless access devices |
CN111762188A (en) * | 2019-03-27 | 2020-10-13 | 本田技研工业株式会社 | Vehicle equipment control device, vehicle equipment control method, and storage medium |
Similar Documents
Publication | Title
---|---|
US20150081133A1 (en) | Gesture-based system enabling children to control some vehicle functions in a vehicle
US9942522B2 (en) | In-vehicle camera system | |
US10884491B2 (en) | Gaze driven interaction for a vehicle | |
EP3237256B1 (en) | Controlling a vehicle | |
US20180232057A1 (en) | Information Processing Device | |
US10647237B2 (en) | Systems and methods for providing customized and adaptive massaging in vehicle seats | |
US20140316607A1 (en) | Occupant presence detection and identification | |
CN100443334C (en) | Informing device | |
US20180357040A1 (en) | In-vehicle infotainment with multi-modal interface | |
US9703472B2 (en) | Method and system for operating console with touch screen | |
US20160288708A1 (en) | Intelligent caring user interface | |
JP6386618B2 (en) | Intelligent tutorial for gestures | |
US10189434B1 (en) | Augmented safety restraint | |
US11167776B2 (en) | Seat haptic system and method of equalizing haptic output | |
JP2017090614A (en) | Voice recognition control system | |
CN113157080A (en) | Instruction input method for vehicle, storage medium, system and vehicle | |
CN113423597A (en) | Control method and control device for vehicle-mounted display device, electronic equipment and vehicle | |
JP2009059229A (en) | Operation support method and operation support system | |
US11931520B2 (en) | Determination of a tendency of a passenger to get motion sickness in a vehicle | |
KR101875626B1 (en) | Guidance apparatus for controlling of vehicle and method thereof | |
US20230211790A1 (en) | Multi-function input devices for vehicles | |
JP2018018201A (en) | Guidance device and guidance method | |
US11167693B2 (en) | Vehicle attention system and method | |
CN116931850A (en) | Vehicle-mounted display screen control method, device, medium and equipment | |
CN115534826A (en) | Control method and device for vehicle central control display screen, electronic equipment and vehicle
Legal Events
Code | Title | Description
---|---|---|
AS | Assignment | Owner name: TOYOTA MOTOR SALES, U.S.A., INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHULZ, JASON A.;REEL/FRAME:033480/0824 Effective date: 20140724
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION