US20100020170A1 - Vehicle Imaging System - Google Patents

Vehicle Imaging System

Info

Publication number
US20100020170A1
US20100020170A1 (application US12/508,840; also published as US 2010/0020170 A1)
Authority
US
United States
Prior art keywords
vehicle
vision system
driver
image
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/508,840
Inventor
Michael J. Higgins-Luthman
Yuesheng Lu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magna Electronics Inc
Original Assignee
Magna Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Magna Electronics Inc
Priority to US12/508,840
Assigned to MAGNA ELECTRONICS INC. (Assignors: HIGGINS-LUTHMAN, MICHAEL J.; LU, YUESHENG)
Publication of US20100020170A1
Priority to US13/866,376 (US9509957B2)
Priority to US15/361,748 (US11091105B2)
Priority to US17/445,100 (US20210370855A1)
Legal status: Abandoned

Classifications

    • B60R16/0236 Circuits relating to the driving or the functioning of the vehicle for economical driving
    • B60W50/0097 Predicting future conditions
    • B60Q1/1423 Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic
    • B60Q1/143 Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic combined with another condition, e.g. using vehicle recognition from camera images or activation of wipers
    • B60R1/062 Rear-view mirror arrangements mounted on vehicle exterior with remote control for adjusting position
    • B60R1/12 Mirror assemblies combined with other articles, e.g. clocks
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles, for viewing an area outside the vehicle with a predetermined field of view
    • B60R16/03 Electric circuits specially adapted for vehicles; arrangement of electric constitutive elements for supply of electrical power to vehicle subsystems
    • B60W30/16 Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • B60W40/06 Road conditions
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • H04N23/56 Cameras or camera modules comprising electronic image sensors, provided with illuminating means
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/65 Control of camera operation in relation to power supply
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B60Q2300/054 Variable non-standard intensity, i.e. emission of various beam intensities different from standard intensities, e.g. continuous or stepped transitions of intensity
    • B60R2001/1215 Mirror assemblies combined with other articles, e.g. clocks, with information displays
    • B60R2300/105 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of camera system used, using multiple cameras
    • B60R2300/307 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of image processing, virtually distinguishing relevant parts of a scene from the background of the scene
    • B60R2300/8093 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the intended use of the viewing arrangement, for obstacle warning
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/30 Road curve radius
    • B60W2552/40 Coefficient of friction
    • B60W2552/53 Road markings, e.g. lane marker or crosswalk

Definitions

  • the present invention relates generally to vehicle imaging systems.
  • Vehicle vision systems or imaging systems are known. Examples of such vision and/or imaging systems are described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454; and 6,824,281, which are all hereby incorporated herein by reference in their entireties.
  • a vehicle vision system for a vehicle includes an image sensor having a forward field of view for capturing image data of a road surface forward of the vehicle and an image processor processing the image data.
  • the vehicle vision system determines at least an estimate of a traction condition of at least a portion of the imaged road surface.
  • the vehicle vision system may use the estimated traction condition (such as a friction condition, such as a state of friction or state of traction or coefficient of friction or the like) of an upcoming road surface to estimate a targeted separation gap between the host vehicle and a leading vehicle, and optionally the targeted separation gap may be adjusted based on a current driving condition.
  • the vehicle vision system may adjust the targeted separation gap based on the driving capabilities of the driver of the host vehicle.
  • a vehicle vision system for a vehicle includes an image sensor having a field of view and capturing image data of a scene exterior of the vehicle, a monitoring device monitoring power consumption of the vehicle, at least one lighting system that draws electrical power from the vehicle when operated, and an image processor that processes the captured image data.
  • the electrical power drawn by the lighting system is varied at least in part responsive to processing of the image data by the image processor in order to adjust fuel consumption by the vehicle.
  • the system thus may detect situations in which the vehicle lighting system can be turned off or operated under reduced power consumption in order to enhance the efficiency of the vehicle and enhance or maximize the miles per gallon of the vehicle during operation of the vehicle.
  • the vehicle vision system may reduce the light generated by the vehicle lighting system in areas where it is determined that less light is desired or needed while maintaining or directing light at areas where it is determined that light is desired or needed.
  • the image sensor may have a forward field of view and may capture image data of a scene forward of the vehicle and in the direction of forward travel of the vehicle.
  • a vehicle vision system for a vehicle includes an image sensor and image processor.
  • the image sensor has a field of view exterior of the vehicle for capturing image data of a scene forward of the vehicle.
  • the image processor processes the image data and the vehicle vision system may detect and identify animals on or at or near the road and generally forward of the vehicle, and the system may distinguish the presence of a live animal from a dead animal within the field of view.
  • the system at least one of (a) generates an alert (such as responsive to detection and/or identification of a live or dead animal within the field of view), and (b) controls the vehicle to assist in avoiding a collision (such as with a detected and/or identified animal within the field of view).
  • the system may be adaptable to the driver's assumption of risk when operating to avoid a collision with the animal or to continue on the vehicle's path of travel.
  • the system may be adaptable to react differently depending on the type of animal that is detected and identified.
  • the system may be adaptable to react differently depending on whether the detected animal is distinguished as a live animal or a dead animal.
  • the vehicle vision system may comprise at least two image sensors having at least one of (a) a forward field of view and capturing image data of a scene forward of the vehicle, (b) a rearward field of view and capturing image data of a scene rearward of the vehicle and (c) a sideward field of view and capturing image data of a scene to the side of the vehicle.
  • a display may display the captured images as a merged image with image stitching of the component images to minimize artifacts of image stitching.
  • FIG. 1 is a schematic of a vehicle vision system, showing a windshield sun visor in accordance with the present invention;
  • FIGS. 2 and 3 are schematics of a vehicle vision system, showing an animal detection system in accordance with the present invention.
  • FIGS. 4-14 are images representative of a vehicle vision system that is operable to merge images from two or more cameras in accordance with the present invention.
  • an imaging system ( FIG. 1 ) is operable to provide a sun visor or light absorbing or light inhibiting element or device that is operable to block light from external to the vehicle, such as the sun, sun reflections, headlamps of oncoming vehicles and other glaring light sources, while allowing other light to pass through the vehicle windshield.
  • a sun visor may be embedded in or on or at or near a windshield that can block sun light, headlamp light and/or other glaring light sources, while leaving the rest of the scene unblocked.
  • a light blocking or light limiting device or system 10 of a vehicle 11 may comprise an addressable LCD type variable transmittance element or glass substrate or windshield portion 12 between the driver's eyes and the light source.
  • the windshield thus may comprise a “transition lens” type windshield coating that is selectively darkened by a scanning energy beam 14 (such as an ultraviolet or UV scanning energy beam or an infrared energy beam or the like) and returns to normal clearness when the energy beam is turned off.
  • the system may adjust or “darken” the windshield portion in response to a detection of a light source that is determined to be at a location where light from the light source may cause glare to the driver of the vehicle.
  • the number and size of the “darken” areas on the windshield are determined by the number and the size of the glaring objects (sun, headlamps or other glaring sources), which can be determined by a forward facing camera 16 of a forward facing camera system such as described below.
  • the size of the “darken” area on the windshield may also be determined by the driver's eye aperture size, which varies from a smaller area (such as about 2 mm or thereabouts) in brighter lighting conditions, to a larger area (such as about 8 mm or thereabouts) in darker lighting conditions.
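
As a rough sketch of how the darkened-area size could track the driver's eye aperture as described above, the snippet below interpolates a spot diameter between about 2 mm in bright conditions and about 8 mm in darkness. The lux breakpoints, the log-linear interpolation, and the function name are illustrative assumptions, not values from the patent.

```python
import math

def darkened_spot_diameter_mm(ambient_lux: float,
                              bright_lux: float = 10000.0,
                              dark_lux: float = 1.0) -> float:
    """Estimate the diameter of the darkened windshield spot.

    Follows the idea that the spot should roughly match the driver's pupil
    aperture: about 2 mm in bright light and about 8 mm in darkness.  The
    log-linear interpolation and the lux breakpoints are assumptions.
    """
    # Clamp the ambient level into the assumed [dark_lux, bright_lux] range.
    lux = min(max(ambient_lux, dark_lux), bright_lux)
    # 0.0 at full daylight, 1.0 in darkness (log scale approximates perception).
    t = (math.log10(bright_lux) - math.log10(lux)) / (
        math.log10(bright_lux) - math.log10(dark_lux))
    return 2.0 + t * (8.0 - 2.0)   # 2 mm when bright  ->  8 mm when dark

if __name__ == "__main__":
    for lux in (10000, 1000, 100, 10, 1):
        print(f"{lux:>6} lux -> spot ~{darkened_spot_diameter_mm(lux):.1f} mm")
```
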
  • the windshield normally attenuates, or can be designed to attenuate, most of the UV and IR wavelength bands.
  • the windshield coating is preferably applied to the inner surface of the windshield so that the windshield serves as a cut-off filter to avoid the exposure of designated UV or IR from solar radiation and other external light sources, which may cause unintended “darkening” of the windshield.
  • the windshield coating is only darkened by the energy beam emitted from the visor system and returns to its undarkened state when the energy beam is deactivated.
  • the energy beam system or device 15 may comprise a source device or devices, a scanner, or scanners, and optics that control the beam size and shape.
  • the source may be a laser or a light emitting diode (LED) or the like, which emits an energy beam 14 that darkens the windshield coating.
  • the location of device 15 may be any suitable location, and may be as shown in FIG. 1 or other convenient or suitable location.
  • the device may be located such that the energy beam does not reflect from the windshield to the vehicle occupants' eyes or skin.
  • the power density of the energy beam controls the darkness or the level of attenuation of the windshield coating, and the power output level may be electrically controlled at the source.
  • the scanner may comprise 2 galvanometer scanning mirrors that scan the energy beam in the X and Y directions on the windshield, or a lens mounted on a 2-D scanning device that can deflect and scan the energy beam in the X and Y directions on the windshield, or the scanner may comprise any other suitable beam scanning means.
  • a digital projector-like energy beam system may be used, where a planar energy beam source, an addressable device (such as a liquid crystal device or micro mirror array or the like) and optics are used and which deliver and control the energy beams onto the windshield to form addressable darkened spots.
  • An electric control module may be employed to control the address or coordinates and the darkness or attenuation level. This module may interface with or be a part of the central control module of the visor system.
  • the system may include an object detection device or system (such as a forward facing camera or image processor and associated processor or control circuitry) and a driver detection device or system, which is operable to determine the location of driver's head or eyes (such as via a stereo camera, a structured light and camera, and/or the like), whereby the system or systems may determine whether or not light from a detected light source may cause glare to the driver of the vehicle, and may determine the level of “darkening” or light attenuation needed, and the address or coordinates of the “darkened” areas.
  • the travel direction or heading of the vehicle and/or a global positioning system may be used to determine whether a detected light source is at a location that may cause glare to the driver.
  • the system may determine the location or angle or setting of the vehicle mirrors to indicate or approximate the location of the head of the driver of the vehicle to assist in determining whether light from a detected light source may cause glare to the driver of the vehicle.
  • a forward facing camera or imaging sensor may capture images of the scene occurring forward of the vehicle that may encompass the sun and headlamps of oncoming vehicles.
  • the system may distinguish detection of the sun as compared to detection of headlamps of oncoming vehicles because the sun is slow moving unless the vehicle is turning, while the motion of headlamps is faster when the headlamps are near to the host vehicle.
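
A minimal sketch of that sun-versus-headlamp distinction: threshold the apparent angular rate of each tracked bright spot after removing the motion explained by the host vehicle's own turning. The data structure, the sign convention (bearing measured relative to the vehicle heading, positive in the positive-yaw direction), and the threshold value are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class BrightSpot:
    azimuth_deg: float        # current bearing of the spot relative to vehicle heading
    prev_azimuth_deg: float   # bearing one frame earlier
    dt_s: float               # time between the two frames

def classify_bright_spot(spot: BrightSpot, yaw_rate_deg_s: float,
                         motion_threshold_deg_s: float = 2.0) -> str:
    """Label a tracked bright spot as 'sun-like' or 'headlamp-like'.

    A distant static source such as the sun drifts at roughly -yaw_rate in
    vehicle-relative bearing, so adding the yaw rate cancels the apparent
    motion caused by the host vehicle's own turning.  Nearby headlamps show
    additional, faster motion.  The 2 deg/s threshold is an assumption.
    """
    apparent_rate = (spot.azimuth_deg - spot.prev_azimuth_deg) / spot.dt_s
    scene_rate = apparent_rate + yaw_rate_deg_s   # motion not explained by turning
    return "headlamp-like" if abs(scene_rate) > motion_threshold_deg_s else "sun-like"
```
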
  • the image processor may process the captured images and may generate addresses or coordinates for the visor pixels and transmittance of light through the visor.
  • upon detecting a glaring light source (such as the sun or a headlamp of an oncoming vehicle), the system may determine an appropriate window area that is to be “darkened” or that is to have a reduced transmissivity of light therethrough (such as an area or region of the windshield between the detected light source and the driver's eyes), and may actuate the energy source to scan or raster the energy beam across the appropriate window area to effectively darken the windshield at that area while allowing light to pass through the rest of the windshield substantially unaffected by operation of the energy source.
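
A geometric sketch of how the windshield region to darken could be located: intersect the ray from the driver's eye toward the detected glare source with an assumed planar windshield. The planar-windshield model, the coordinate conventions, and the function name are simplifying assumptions, not details from the patent.

```python
import numpy as np

def darken_point_on_windshield(eye_pos, light_dir, plane_point, plane_normal):
    """Return the windshield point to darken, or None if the ray misses.

    eye_pos      : 3-D position of the driver's eye (vehicle coordinates).
    light_dir    : unit vector from the eye toward the detected glare source.
    plane_point  : any point on the (assumed planar) windshield.
    plane_normal : windshield plane normal.

    The darkened spot is placed where the eye-to-glare-source ray crosses the
    windshield, so the spot sits between the light source and the driver's eyes.
    """
    eye_pos, light_dir = np.asarray(eye_pos, float), np.asarray(light_dir, float)
    plane_point = np.asarray(plane_point, float)
    plane_normal = np.asarray(plane_normal, float)

    denom = np.dot(light_dir, plane_normal)
    if abs(denom) < 1e-9:                       # ray parallel to the windshield
        return None
    t = np.dot(plane_point - eye_pos, plane_normal) / denom
    if t <= 0:                                  # glare source is behind the driver
        return None
    return eye_pos + t * light_dir              # intersection point to darken
```
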
  • a window dimming device may comprise a window having at least a portion that is treated with a coating and an energy emitting device that is operable to emit energy toward a targeted region of the window.
  • the coated portion of the window is selectively darkened by energy emitted by the energy emitting device.
  • the energy emitting device may emit a scanning energy beam comprising one of an ultraviolet scanning energy beam and an infrared scanning energy beam.
  • the window may comprise a window of a vehicle or other transparent or substantially transparent window or glass or polymeric substrate.
  • the energy emitting device emits energy toward a selected portion of the window portion to darken the selected portion in response to a detection of a light source that is determined to be at a location where light from the light source may cause glare to a driver or occupant of a vehicle.
  • the window darkening system may be suitable for use in non-automotive or non-windshield applications as well.
  • the system may be utilized at other vehicle windows, such as side windows or a rear backlite or a sunroof or the like.
  • aspects of the darkening system may be suitable for use in or on eyeglasses (such as sunglasses or prescription glasses or the like).
  • the size of each blocking area may approximate the aperture size of the human eye, and may vary from a smaller area (such as about 2 mm or thereabouts) in brighter lighting conditions, to a larger area (such as about 8 mm or thereabouts) in darker lighting conditions.
  • the eyeglasses of the present invention may have an energy beam system similar in concept to the windshield system of FIG. 1 .
  • an eyeglass dimming device for eyeglasses may comprise an eyeglass lens or optic element (such as supported in an eyeglass frame for viewing through by a person wearing the eyeglasses) having at least a portion that is treated with a coating.
  • An energy emitting device (that may be disposed at the eyeglasses, such as at the frame of the eyeglasses or the like) is operable to emit energy toward a targeted region of the lens.
  • the coated portion of the lens is selectively darkened by energy emitted by the energy emitting device.
  • the energy emitting device emits energy toward a selected portion of the lens portion to darken the selected portion in response to a detection of a light source that is determined to be at a location where light from the light source may cause glare to a wearer of the eyeglasses.
  • the eyeglasses may have individually addressable elements, such as in liquid crystal displays similar to computer laptop displays, or such as in spatial light modulators. With such addressable elements in the eyeglasses, the energy beam system is not needed.
  • Such dimmable or selectively darkenable sunglasses may be suitable for driving glasses and/or for some sports glasses, such as, for example, golf glasses (where the glasses may be selectively dimmed to reduce glare from the sun when the golfer looks up to follow the flight of the ball, but the rest of the glasses are not dimmed or darkened to allow the golfer to follow the flight of the ball when it is not between the sun and the golfer's eyes) or the like.
  • an eyeglass dimming device for eyeglasses may comprise an eyeglass lens or optic element (such as supported in an eyeglass frame for viewing through by a person wearing the eyeglasses) with addressable elements.
  • the addressable elements are selectively darkened by the eyeglass dimming device (such as an electronic control or the like that may be disposed at the eyeglasses, such as at the frame of the eyeglasses or the like) in response to a detection of a light source that is determined to be at a location where light from the light source may cause glare to the eyeglass wearer.
  • an imaging system of the present invention may include multiple headlights, such as multiple forward facing light emitting diodes (LEDs), where the intensity of the LEDs can be controlled so that a machine vision system can see the light variations emitted by the LEDs, while a human may not discern such variations.
  • humans typically cannot perceive light variations above about 60-70 Hz, and for isolated flashes the human eye typically integrates or sums photons over intervals of up to about 100 ms.
  • the system may selectively energize or activate the LEDs so they blast or emit light at short intervals (faster than the threshold rate at which humans may detect the flashing of the lights) and humans may not see or discern the blast from the overall illumination integrated at slower time intervals.
  • the system can blast or emit light forwardly of the vehicle and may detect or see a substantial increase or rise in reflected light as captured by a forward facing camera or imaging system, and if done at a high enough rate or a short enough blast or interval, a human cannot see or discern the presence of the blast of light.
  • the system thus may provide an easy way to see if detected light sources or items (in images captured by the camera or imaging sensor) are reflective items or objects (such as signs or the like) or light sources (such as headlamps of oncoming vehicles or taillights of leading vehicles or the like).
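
A minimal sketch of that reflectors-versus-light-sources test: compare a frame captured during the short light burst with one captured without it, and treat pixels that brighten strongly as reflective objects rather than self-luminous sources. The threshold value and the array handling are illustrative assumptions.

```python
import numpy as np

def classify_reflectors(frame_led_on: np.ndarray,
                        frame_led_off: np.ndarray,
                        rise_threshold: float = 40.0) -> np.ndarray:
    """Return a boolean mask of pixels that respond to the LED burst.

    Self-luminous sources (headlamps, taillights) look nearly the same in both
    frames; reflective objects (signs, reflectors) brighten substantially when
    the short LED burst fires.  The threshold is an assumed 8-bit gray-level rise.
    """
    on = frame_led_on.astype(np.float32)
    off = frame_led_off.astype(np.float32)
    return (on - off) > rise_threshold           # True where reflection dominates
```

Candidate bright spots found by the headlamp-control image processing could then be checked against this mask: spots inside the mask behave like reflectors, spots outside it like genuine light sources.
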
  • Such a system may be suitable for use in intelligent headlamp control systems and/or other automotive vision tasks for machines.
  • the system may utilize super-fast lighting for the machine vision system to “learn” the environment without alienating the driving public (such as drivers of other vehicles on the road with the host vehicle).
  • the system may utilize different lights (such as different colored lights), and may use lights that, when energized together, sum perceptually to a white colored light, but that may flash different color components that would be discernible to machine vision while being imperceptible or not readily discernible to the human eyes.
  • the system may utilize multiple LED headlights, whereby the headlight orientation and intensity can be controlled quickly by the vision or imaging system.
  • This allows the vision system to provide enhanced illumination when desired and may reduce or increase lighting of respective regions in response to various inputs, such as inputs from an object detection system or the like.
  • the system thus may be operable to increase or reduce the intensity of the headlights as desired or appropriate, and the lights may be controlled to provide a tailored illumination of the area forward of and/or sideward of the vehicle.
  • the lights may be selectively activated or energized and/or aimed to illuminate the area forward of the vehicle, while substantially not illuminating or directing light toward areas where other vehicles are located (such as an oncoming vehicle or a leading vehicle on the road with the subject or host vehicle).
  • Such a tailorable lighting or vision system (which may adjust the direction of the lighting in response to a forward facing camera that detects objects or vehicles in front of the host vehicle or a gaze detection device that detects the gaze direction of the driver of the host vehicle and adjusts the headlights accordingly, such as by utilizing aspects of the systems described in U.S. patent application Ser. No. 12/171,436, filed Jul. 11, 2008 by Higgins-Luthman et al. for AUTOMATIC LIGHTING SYSTEM WITH ADAPTIVE ALIGNMENT FUNCTION, published Jan. 15, 2009 as U.S. Patent Publication No. US2009/0016073; and U.S. provisional application Ser. No. 60/949,352, filed Jul.
  • an output of the vision system may provide an input for vision systems of other vehicles.
  • the vision system may provide a reduced power consumption by the vehicle during operation of the headlights as compared to operation of conventional headlamps.
  • reducing or eliminating a 4-10 percent (or thereabouts) power consumption loss attributable to operating the lights may result in up to a 4-10 percent (or thereabouts) increase in fuel efficiency for the host vehicle while driving at night.
  • This may be a significant improvement to vehicle manufacturers or owners/operators of fleets of vehicles or truck companies or the like, and may assist the vehicle manufacturers in meeting increased Corporate Average Fuel Economy (CAFE) requirements.
  • the overall “light pollution” may be reduced.
  • the vehicle includes at least one lighting system that draws power from the vehicle when operated and the vision system may include a monitoring device that monitors the electrical power consumption of the lighting system and/or vehicle.
  • the image processor may process captured image data and may detect situations in which the vehicle lighting system can be turned off or operated under reduced power consumption in order to maximize the fuel efficiency or miles per gallon of the vehicle in a safe manner without reducing the light output in a way that may adversely affect the viewability of the scene by the driver.
  • the electrical power drawn by the at least one lighting system thus may be varied (such as reduced) at least in part responsive to the image processor in order to adjust (such as reduce) fuel consumption by the vehicle.
  • the vehicle vision system may reduce the light generated by the vehicle lighting system during driving conditions when less vehicle lighting is desired while directing light at areas where it is determined that light is desired.
  • the image sensor may have a forward field of view and may capture image data of a scene forward of the vehicle and in the direction of forward travel of the vehicle.
  • the system may control or reduce fuel consumption, such as gasoline consumption or electrical power consumption (such as for an electric vehicle or the like) or other types of fuels utilized for operation of the vehicle and/or lighting system.
  • the system may control or reduce or minimize vehicle emissions responsive at least in part to the image processor.
  • the vision system may detect ice or water in or on the road surface in front of the vehicle.
  • the vision system may utilize aspects of the systems described in U.S. patent application Ser. No. 11/948,086, filed Nov. 30, 2007, which is hereby incorporated herein by reference in its entirety, and may warn the driver of hard to see black ice.
  • such a system may measure water depth.
  • the system may be operable to identify the road surface (such as asphalt, concrete, metal, rain, snow, ice, water or the like) ahead of the vehicle and on which the vehicle is traveling and any associated road surface coatings, such as via processing image data captured by a forward facing imaging sensor or the like, and may determine (such as via a look up table or database) at least an estimate of a traction condition of or a friction condition of or the coefficient of friction for that road surface and/or coating or a portion of the road surface ahead of the vehicle. For example, if the system determines that the upcoming surface looks just like the current surface did, the system can determine that the traction condition or coefficient of friction will probably be the same as it was for the current road surface.
  • if the system determines that the upcoming road surface (or at least a portion thereof) looks different, the system can prepare for different traction on the upcoming surface as compared to the current traction.
  • the system may adjust a traction control system or cruise control system or may generate an alert to the driver of the vehicle responsive to a detection of a change in traction or traction condition on the road surface ahead of the vehicle.
  • the system may use estimates of the host tire contribution to the traction condition or coefficient of friction when calculating or estimating the traction condition or coefficient of friction between the vehicle tires and the road surface.
  • the vehicle may estimate the traction condition or coefficient of friction or change in the traction condition or coefficient of friction based on a detection of movement of other vehicles or objects on the road surface. For example, if the vehicle is traveling on a curve and a leading vehicle moves in a manner indicative to a skid or slide, then the system may determine that the traction condition or coefficient of friction may be reduced ahead of the host vehicle.
  • the system may calculate or determine the traction condition or coefficient of friction by using the relationship between water, ice, road surface, speed and coefficient of friction.
  • the stopping distance is related to the square of the vehicle speed and to the coefficient of friction. The stopping distance gets worse (larger or longer) with increased speed, and is inversely related to the coefficient of friction.
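
That relationship can be written out explicitly. As a sketch (the patent states the proportionalities but not this exact formula), for a braking deceleration limited by the tire-road coefficient of friction and a driver or system reaction time:

```latex
d_{\text{stop}} \;=\; v\,t_{\text{react}} \;+\; \frac{v^{2}}{2\,\mu\,g}
```

Here v is the vehicle speed, t_react the reaction time, mu the coefficient of friction, and g the gravitational acceleration; the braking term grows with the square of the speed and is inversely proportional to the coefficient of friction, consistent with the statement above.
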
  • the system may have a table and/or calculation database embedded in the processor to assist in determining the traction condition or coefficient of friction.
  • the vision system may be operable to identify the water, snow and/or ice up ahead as well as near the tires of the vehicle. Because antilock brakes can perform worse than standard brakes in deep snow or gravel (where locked-up wheels pile snow or gravel into a dam that helps stop the vehicle), it is beneficial that the vision system may be operable to identify such build up of snow or gravel in front of the vehicle.
  • Systems for estimating the coefficient of friction are generally for the tire road interface during actual braking. While this may be helpful (such as for antilock braking systems), it is still a reactive process. Knowing the depth of water and ice on an upcoming road surface would allow preparation of the braking system, and an equivalent risk of collision gap could be set for adaptive cruise control systems. For example, the stopping distance can be altered by a factor of 2 or 3 or more by knowing the conditions of the road ahead. Although better brake reactions are good, predictive knowledge is better.
  • the vision system may be operable in conjunction with an adaptive cruise control system.
  • the adaptive cruise control system may function to keep or maintain the gap between the host vehicle and the leading vehicle at a substantially constant time to collision standard or a separation distance standard.
  • the vision system may use the traction condition or coefficient of friction measures ahead of the host vehicle to change the separation gap based on a determined or calculated or estimated stopping distance (based on the speed of the vehicle and the traction condition or coefficient of friction of the road surface ahead of the vehicle).
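
As a rough sketch of how such a gap adjustment could be computed, the snippet below combines an assumed surface-class-to-friction lookup with the stopping-distance relation above. The table values, the reaction time, the safety margin, and all names are illustrative assumptions, not values from the patent.

```python
# Illustrative surface-class -> coefficient-of-friction lookup (assumed values).
FRICTION_TABLE = {
    "dry_asphalt": 0.8,
    "wet_asphalt": 0.5,
    "snow": 0.3,
    "ice": 0.1,
}

G = 9.81  # gravitational acceleration, m/s^2

def target_separation_gap_m(speed_mps: float,
                            surface_class: str,
                            reaction_time_s: float = 1.5,
                            margin: float = 1.2) -> float:
    """Targeted gap to the leading vehicle based on estimated stopping distance.

    Uses d = v*t_react + v^2 / (2*mu*g) with an extra safety margin, so the gap
    automatically widens when the upcoming surface is estimated to be wet,
    snowy or icy.
    """
    mu = FRICTION_TABLE.get(surface_class, 0.8)
    stopping_distance = speed_mps * reaction_time_s + speed_mps ** 2 / (2 * mu * G)
    return margin * stopping_distance

if __name__ == "__main__":
    v = 29.0  # roughly 65 mph in m/s
    for surface in FRICTION_TABLE:
        print(f"{surface:>12}: gap ~{target_separation_gap_m(v, surface):.0f} m")
```
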
  • the vision system may utilize measures of driver capability (in-vehicle), a template drive over golden routes, visibility, threat measures and/or the like to adjust or tune the stopping and steering distances for adaptive live measures rather than pre-set values.
  • the system may adjust the traffic gap or separation distance as a function of a predetermined standard traffic gap that provides a standard safety margin for a standard driver and vehicle, such as a young driver with clear 20/20 vision, normal color vision, and normal visual threshold, contrast sensitivity and reaction time for different contrast, color and brightness conditions, and the like.
  • the system may compare the host vehicle driver to the standard driver and adjust the separation gap or time to collision accordingly.
  • the stopping distance and/or separation gap may be calculated or determined or estimated as a function of the traction condition or coefficient of friction, the driver's reaction time, the visibility of the leading vehicle or object or obstacle in front of the host vehicle, the time of day, any indications of driver alertness, the separation gap between the host vehicle and the detected object or leading vehicle, the tire tread of the vehicle's tires, other cars in a cocoon around the host vehicle, a multi-axis accelerometer, and/or the like.
  • Such information may be gathered by and/or utilized with various vehicle systems, such as an adaptive cruise control system, an intelligent headlamp control system, a forward facing camera, forward collision warning (FCW) system a blind spot detection/lane change aide (BSD/LCA) system, a reverse facing camera, or a side mounted camera looking downward near parking areas (such as used in Japan), a global position system (GPS), a temperature sensor, a humidity sensor, a traction condition or coefficient of friction detection or determination and/or other information for upcoming vehicles, an electronic stability control, an internal driver view, a miner's light and/or the like.
  • the stopping distance could be fed into an intelligent transportation system (ITS) as a weighted sum over leading vehicles, and this closeness to the next vehicles could be fed back into the ITS and also into a center high mounted stop lamp (CHMSL) type light or brake-light steganography for other vision systems.
  • if the host vehicle has a vision system, then it should monitor the driver and the environment so that other vehicles are warned if unsafe actions are going to occur, or are probably or possibly going to occur.
  • if the host vehicle has a miner's light, then the light may be adjusted or directed to provide enhanced light on areas of concern.
  • the concept of vehicles for handicapped drivers may be extended to all drivers, because all drivers are at various times handicapped or challenged.
  • the vision system may detect characteristics of the driver that may be indicative of the driver being inattentive, drowsy, under the influence of substance use, bored, young/old, healthy, having less than 20/20 vision, color deficient, poor field of view, poor contrast sensitivity, poor clutter analysis, poor reaction time, poor car maintenance, or encountering a challenging environment, such as rain, snow, fog, traffic in cocoon, safe space around car, poor lighting, unsafe area-past accidents, icy conditions, curves, intersections and/or the like.
  • the driver assistance system may tend to make each driver have at least a minimum standard of equivalent safety margin, until such time as there exists totally automatic traffic.
  • the system may inform or alert other drivers of a probability that the driver of the host vehicle is potentially less than a standard driver in semi-objective ways, such as via communication of such information via a wireless communication or steganographic lighting communication or the like.
  • for example, a normal 60 meter gap corresponds to about a 2 second gap between vehicles traveling at 65 mph, but the slower reaction time of an older driver and the probability of ice may increase the appropriate gap for a forward collision warning to about 120 meters.
  • the forward collision warning system, if detecting a gap at less than a threshold level (based on the particular driver and driving conditions), such as less than about 0.7 of the calculated or determined gap (such as 120 meters for the given example), may provide a warning or alert to the driver of the host vehicle, or may provide a steganographic warning to the leading vehicle so that the leading vehicle may relay a warning back to the driver of the host vehicle, such as through the leading vehicle's CHMSL brake light or the like.
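
As a sketch of this warning logic, the snippet below adjusts a baseline gap by assumed driver and road-condition factors and warns when the measured gap drops below a fraction (about 0.7, per the example above) of the adjusted gap. The specific factor values are illustrative assumptions chosen to reproduce the 60 m to 120 m example.

```python
def forward_collision_warning(measured_gap_m: float,
                              baseline_gap_m: float = 60.0,
                              driver_factor: float = 1.0,
                              condition_factor: float = 1.0,
                              warn_fraction: float = 0.7) -> bool:
    """Return True when a forward collision warning should be issued.

    baseline_gap_m   : standard gap (about 60 m, i.e. a 2 second gap at 65 mph).
    driver_factor    : > 1.0 for slower-reacting drivers (e.g. an older driver).
    condition_factor : > 1.0 for low-traction conditions (e.g. probable ice).
    warn_fraction    : warn when the measured gap drops below this fraction
                       of the adjusted gap (about 0.7 in the example above).
    """
    adjusted_gap = baseline_gap_m * driver_factor * condition_factor
    return measured_gap_m < warn_fraction * adjusted_gap

# Assumed factors of 1.4 (older driver) and 1.43 (probable ice) raise the 60 m
# gap to roughly 120 m, so a warning fires below about 84 m.
print(forward_collision_warning(90.0, driver_factor=1.4, condition_factor=1.43))  # False
print(forward_collision_warning(80.0, driver_factor=1.4, condition_factor=1.43))  # True
```
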
  • the standard 60 meter gap is less meaningful since what is truly desired is that the particular driver of the vehicle keeps the gap to the leading vehicle in such a way that the driver can stop safely if the leading vehicle suddenly decelerates or brakes. This depends upon how good the driver is and how good the vehicle is at stopping and a blanket 60 meter distance obscures all these individual differences.
  • the purchaser of the vehicle may want to continue the “nanny-ness” over the teenage driver even when the parent exits the car, and may want to have the elderly parent continue semi-independent driving as long as safely possible.
  • the system may have an override feature, but that feature may be turned off or not available for some drivers of the vehicle.
  • beginning and senior drivers could use an adaptive record of how to drive standard routes and commutes. For example, GPS, start and end choices, and TSR allow overall route identification.
  • the driving record by average of trips or a “golden drive” by a good driver leads to a record of speed, acceleration and braking, lane deviation from center, typical IHC dimming distances, coefficient of friction interaction with precipitation (traction and rain sensor), and this may be extended to an ACC system to allow measurement of any deviation from a benchmark drive so that performance of a suboptimal driver can be identified so that the vehicle risk management behaviors can be tailored for the current driver.
  • Such records could be sent to parents and adult children for monitoring of driver performance.
  • inside monitoring of passenger count could bias risk management for teenage drivers (who typically drive worse with more passengers in the vehicle).
  • the system may alert the driver of such driving deficiencies or deviations from the expected or targeted performance.
  • the system may use optimally perceived warnings for teenage drivers (who may hear higher frequencies so that they alone will be warned, but nearby adults will be spared the sound).
  • the vision system may be operable to detect animals or objects in the road or path of travel of the host vehicle.
  • the system may detect animals, both live and road-kill, and may identify the animals based on expected behavior patterns and size/shape of animals (such as determined from an animal database).
  • the system may, for both large (deer) and small (pets) animals, provide optimum detection and evasive action, such as by detecting deer in air for pre-collision settings and by detecting static road-kill and moving live animals, and providing an analysis of hit versus drive-over animals and dodge probabilities versus driver risk preferences, as discussed below. For example, pet lovers and vegetarians may choose more risky maneuvers to avoid animal impacts, while hunters may simply want to maximize driver safety with less risk, and without as much concern for the animal.
  • the vision system may use laser line patterns and triangulation to detect animals and/or the like, such as by utilizing aspects of the machine vision of upcoming road surface for predictive suspension described in U.S. patent application Ser. No. 12/251,672, filed Oct. 15, 2008, and published Apr. 16, 2009 as U.S. Patent Publication No. US2009/0097038; and U.S. provisional application Ser. No. 60/980,265, filed Oct. 16, 2007, which is hereby incorporated herein by reference in its entirety.
  • Such a system may provide localized accurate sensors for ranges less than 6 meters.
  • the system may be operable to distinguish dead animals from live animals (live animals move while dead animals do not, and live animals are warm while dead animals typically are not; this may be detected by a heat sensing device or visible or near-infrared or thermal-infrared sensors or the like).
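
A minimal sketch of that live-versus-dead distinction, combining a motion cue with a thermal cue; the thresholds, the input representation, and the function name are illustrative assumptions.

```python
def classify_animal_state(motion_energy: float,
                          thermal_contrast_c: float,
                          motion_threshold: float = 0.05,
                          warm_threshold_c: float = 3.0) -> str:
    """Classify a detected animal as 'live' or 'dead'.

    motion_energy      : fraction of the detection region whose pixels changed
                         between frames (live animals move, dead ones do not).
    thermal_contrast_c : temperature of the region above the road background,
                         from a thermal or near-IR sensor (live animals are warm).
    """
    is_moving = motion_energy > motion_threshold
    is_warm = thermal_contrast_c > warm_threshold_c
    return "live" if (is_moving or is_warm) else "dead"
```
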
  • the vision system may detect and identify animals on the road or in the path of travel of the vehicle, and may provide an alert or may take evasive action to avoid the detected animal.
  • the vision system may detect and identify animals, such as dead animals by comparing image data of a detected object (indicative of the size, shape, height, color, profile, and/or the like of the object) to a data base of dead animal profiles (such as profiles of dead animals as viewed at high speed by drivers).
  • the system may determine the size of the object or animal by processing image data over time, and may match the height of the object (such as the height or location of the detected object in the images captured by the forward facing camera) and the tire location so that evasive action could be programmed into the active steering and braking for minimal interruption and risk to driver's path.
  • the location of the object may be at one height and position in the captured images, and as the vehicle approaches the detected object, the object in the images captured by the forward facing camera may lower toward the road surface and increase in size, and such position at the road surface may be compared to the tire position of the equipped vehicle to determine if evasive action is necessary or desired.
  • evasive action may be responsive to the detected location and/or size of the detected object and/or the steering angle or vehicle path of the equipped vehicle.
  • the area about five to eight meters immediately in front of the vehicle may be a blind zone or area where the driver may not readily view the road surface along which the vehicle is traveling. The system thus may determine the predicted path of the vehicle's tires to determine if the tire or tires may impact the detected object (that may not be visible to the driver as the vehicle further approaches the object).
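
As a sketch of the tire-path check described above, the snippet below projects a front tire's path from the current steering angle using a simple bicycle-model arc (an assumption not specified in the patent) and flags a likely impact when the detected object's ground position falls within the tire's swept corridor.

```python
import math

def tire_hits_object(object_x_m: float, object_y_m: float,
                     tire_offset_y_m: float,
                     steering_angle_rad: float,
                     wheelbase_m: float = 2.8,
                     tire_width_m: float = 0.25) -> bool:
    """Predict whether a front tire will run over a detected road object.

    object_x_m / object_y_m : object ground position ahead of / lateral to the
                              vehicle (vehicle coordinates: x forward, y left).
    tire_offset_y_m         : lateral offset of the tire from the centerline.
    A bicycle-model arc approximates the tire path; on a nearly straight path
    the tire's lateral position is treated as constant.
    """
    if abs(steering_angle_rad) < 1e-3:           # effectively straight ahead
        tire_y_at_object = tire_offset_y_m
    else:
        radius = wheelbase_m / math.tan(steering_angle_rad)   # signed path radius
        # Lateral displacement of the path after travelling object_x_m forward.
        lateral_shift = radius - math.copysign(
            math.sqrt(max(radius ** 2 - object_x_m ** 2, 0.0)), radius)
        tire_y_at_object = tire_offset_y_m + lateral_shift
    return abs(object_y_m - tire_y_at_object) < tire_width_m / 2
```
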
  • the vision system may detect and identify live animals by comparing image data of a detected object (indicative of the size, shape, height, color, profile, and/or the like of the object) to a data base of live animal profiles (such as profiles of live animals as viewed at high speed by drivers).
  • the database may include data pertaining to probable animal movements, locations of animals, probable animal reactive movements after animal gets closer to an approaching vehicle.
  • the system may apply its static-obstacle countermeasures in the dead animal scenario.
  • the vision system may be responsive to a user input, whereby the driver of the vehicle can input the utility function tailored for their preferences. For example, the driver could select a level of risk that is acceptable to the driver in order to miss a detected animal. For instance, some people may take no risk to avoid hitting a small animal, but may tolerate an increased accident risk level to avoid a collision if the animal in the path of the vehicle is identified as a dog or cat. Typically, a driver may take risks to avoid collisions with large animals, such as deer. If the system determines that a collision is imminent, then the system may trigger the vehicle's mitigating behaviors.
  • the vision system may differentiate or distinguish between animal detection and pedestrian detection.
  • the animal detection system may detect erect, moving and dead/injured beings, and a pedestrian detection system or subsystem may be targeted for detected erect pedestrians.
  • the vision system may take recommended actions in response to detection of an object or animal, and the actions may be different depending upon the vehicle speed, collision probability, size of the detected animal, driver preferences, legal preferences, kind of animal, and whether or not the animal may be a human.
  • the system may be adaptable for rules or regulations of the governmental or regulatory bodies of the region in which the vehicle is traveling, since governmental and/or regulatory bodies may mandate evasive actions and risky behaviors for different animals.
  • in India, the system may be tailored or adapted to avoid hitting cattle in order to protect cattle, while in Australia, the system may be tailored or adapted to avoid hitting koalas and/or kangaroos, while in the United States, the system may be tailored or adapted to avoid hitting common pets.
  • the vision system offers an objective way to accommodate regulations which vary from place to place.
  • a vehicle-based global positioning system (GPS) could activate different system actions related to protected animals in the region or regions in which the vehicle is traveling.
  • the vision system may include a rearward facing camera or image sensor and may be used in conjunction with a back up assist system or reverse aid system or rear vision system or the like.
  • the system may include a display device or display screen for viewing by the driver of the vehicle, and may provide a graphic overlay (such as by utilizing aspects of the systems described in U.S. Pat. Nos. 5,670,935; 5,949,331; 6,222,447; and 6,611,202, and/or PCT Application No. PCT/US08/76022, filed Sep. 11, 2008, and published Mar. 19, 2009 as International Publication No. WO2009/036176; and/or U.S. provisional application Ser. No. 60/971,397, filed Sep.
  • the system may detect the contrast of an image or a graphic overlay at the image and may adjust the contrast sensitivity (which depends on the surroundings) of the display device.
  • the system may utilize different look-up-tables (LUTs) to map input gray levels to output gray levels.
  • the system may be operable to provide tunable settings to help the driver better see, view, and discern the displayed image (especially for people with more severe vision deficiencies), so that the driver or person viewing the display may experience enhanced viewability and discernibility of the displayed image and/or may see and discern the graphic overlays better than with a standard contrast setting.
  • the system may provide a display with tunable colors (such as for a three color graphic overlay) so that the approximately 2-8 percent of the population that have color vision deficiencies do not see an overlay with brown and yellow only, but will be able to see and discern the three different colors of the graphic overlay.
  • the machine vision or detector/tracking system guides the movement of the apparent position and visibility of the added artificial pattern or light to the driver.
  • the vision system senses the presence and/or position and/or velocity of a possible collision-danger object near the vehicle.
  • the system could also sense dangerous movements of the host vehicle, such as crossing lane markers.
  • the vision system directs the light/pattern for the appropriate representation of meaningful information to the vehicle driver. In lower cost systems, this direction is guided by rules of thumb or other assumptions, while more advanced systems actually measure where the driver's eyes are in order to place the patterns with high accuracy in the driver's field of view.
  • the system may use human reaction times to signal light/pattern movements so that the driver can react to dangers appropriately.
  • the vision system may be overridden by the driver, such as in conditions where the environment would normally trigger inappropriate light/pattern movements.
  • the light/pattern can appear to the driver to track the objects ideally, or it could partially track the objects in a weighted fashion. Partial tracking could protect against extreme apparent motions of the pattern, and also encourage some ergonomically recommended driver head movements, preventing inattention.
  • the vision system thus would provide enhanced viewing and recognition of detected objects at the inside and/or outside mirrors, such as for multiple blind spots, for more objects than just vehicles, for areas themselves, for tracking pattern movements, for binary pattern movements, for system tuning, and for ergonomic features.
  • the binary pattern movements could move between two definite locations, each of which signals information to the driver, such as common versus potentially dangerous situations.
  • the mirrors would be “assisting” the driver, in that they show to the driver scenes that the total system “believes” that the driver “should” see, before he or she acts unwisely.
  • a vehicle vision system may comprise an image sensor having a forward and/or rearward and/or sideward field of view and capturing image data of a scene forward and/or rearward and/or to the sides of the vehicle, an image processor processing the image data and detecting objects of interest, and a display displaying information to the driver of the vehicle using the interior rear view mirror and/or the windshield as a display area without compromising the driver's field of view of the scene and keeping the driver's attention generally focused forward of the vehicle and above the dashboard and below the display.
  • the vehicle vision system may be operable to highlight a portion of the display at or near the image of a detected object of interest and may track the image of the detected object of interest as the image moves across the display.
  • the display may be disposed at a mirror reflective element of the vehicle and the vehicle vision system may highlight a portion of the display at or near the reflected image of a detected object of interest and may track the reflection of the detected object of interest as the reflected image moves across the mirror reflective element.
  • aspects of such a vision system may be implemented into navigational displays using camera videos and graphical overlays.
  • the use of the mirror itself (with the lights being at or behind the reflective element) provides all the dynamic range of a mirror, and all the resolution capability of a mirror. These ranges and resolutions are of optical quality, which may be orders of magnitude better than conventional navigational displays (such as CRT, LCD, and/or the like).
  • the vision system encourages the use of present technology (rearview mirrors), which has been ingrained into generations of drivers.
  • the light/patterns can be at the border of a mirror, just slightly displaced from the apparent location of the “dangerous” object/vehicle.
  • the light/patterns can also be presented to the driver in locations in the viewed mirror scene which are known to be background low-risk areas. These areas include the road surface just in front of the dangerous object/vehicle, and the sky area immediately above the object/vehicle.
  • the added artificial information, if projected, can be presented in such a way that the optical path of the artificial information will give a similar optical path distance to the eye, so that the overlay information appears to be close to the same depth plane as the actual object/vehicle.
  • the added artificial information can also be related to the actual object/vehicle so that, for example, sounds and flashing lights similar to a real police car could be overlaid upon the apparent visual scene in the mirror when a vehicle approaches at very high closing velocities.
  • the vision system may present information to the driver, without requiring the driver to look, or hear, or respond in any different way than he or she normally would.
  • the vision system may present the extra information in a manner similar to the driver's vast personal experience.
  • the vision system of the present invention thus may allow all the current richness of driver experience, and present extra information in ways that minimize the cognitive, sensory, and motor load of the extra information on the driver's physical and mental processing capability.
  • using the mirror positions, as set by the driver, allows a good estimation of the driver's eye positions. Knowing the eye positions and mirror positions, along with the camera positions of the machine vision system, together with trigonometry calculations, allows a good estimation of the position of the driver-viewed reflection of the candidate object/vehicle in a mirror. Knowing the position of the object reflection location and the eye location allows the appropriate position of the overlaid light pattern to be calculated.
  • a lower cost or inexpensive system may present the appropriate light/pattern in the mirror boundary close to the apparent location of the object/vehicle, while more advanced systems may present the additional light/pattern much closer, or actually surrounding, the detected object/vehicle's apparent location in the driver's field of view.
  • an even more advanced system may use a sensor (camera, radar, lidar, ultrasonic system or the like) to measure the real-time location of the driver's eyes (or driver's gaze direction).
  • the vision system could use an external driver-owned sensor system, such as the compressed video output from the driver's cell phone camera pointed at the driver, when placed in a docking cradle in the car.
  • the system may adjust the angle or tilt or setting of the mirror reflector (such as the mirror reflector of an exterior side mirror) in response to an object or vehicle detection, in order to enhance the driver's viewability of the object and awareness of the detected object.
  • An angularly adjustable mirror mounted inside or outside a vehicle is normally positioned to display to the driver reflections of the vehicle's surrounding area.
  • the mirror may be temporarily angularly adjusted by a motor to be out of its normal position and into a temporary position(s) which display to the driver an object, or vehicle, or area that is a potential collision danger, or source of such dangers, to the vehicle. These areas could include the vehicle's “blind spots”.
  • the motor or mirror actuator moves the mirror to potentially multiple positions and then back to the original position, responsive to signals from a detector/tracking system.
  • the detector/tracking system could be a machine vision system or the like.
  • the vision system senses the presence, and/or position, and/or velocity of a possible collision-danger object near the vehicle.
  • the system could also sense dangerous movements of the host vehicle, such as crossing lane markers.
  • the vision system signals the adjustable mirror, for the appropriate display of a candidate vehicle, or object, or dangerous areas, such as blind spots, to the vehicle driver.
  • the system may use human reaction times to signal the mirror movements in sufficient time so that the driver can react to dangers appropriately.
  • the driver, by viewing the mirror from one head position, would be able to “track” the other object or vehicle, because the mirror would move in such a way as to allow this “tracking” to occur. Also from one head position, the driver would be able to see dangerous areas, or not, as the mirror moves. After the system determines that the danger has passed, the mirror returns to its normal position. Optionally, the system may be overridden by the driver, such as in conditions where the environment would normally trigger inappropriate mirror movements.
  • the system can track the objects ideally, or it could partially track the objects. Partial tracking could protect against extreme mirror motions, and also encourage some ergonomically recommended driver head movements, preventing inattention.
  • the vision system thus would provide enhanced viewing and recognition of detected objects at the inside and/or outside mirrors, for multiple blind spots, for more objects than just vehicles, for areas themselves, for tracking mirror movements, for binary mirror movements, for system tuning, and for ergonomic features.
  • the mirrors would be adaptive or “intelligent”, or “assisting”, in that they show to the driver scenes that the total system “believes” that the driver “should” see, before acting unwisely.
  • an electrochromic (EC) mirror (which adapts to the light environment surrounding the vehicle), with angular adaptive capability, adapts to the environment of light and objects surrounding the vehicle, and may allow the driver to readily see the relevant environment with minimal head movements and/or minimal visual adaptation.
  • the concepts of a vision system for enhanced viewing and recognition of detected objects as discussed above, with display at or near or around or overlaid on a rear-view mirror can also be applied to display systems for heads-up or larger displays at or on the windshield.
  • These windshield displays can utilize forward facing imaging systems with machine vision of important or threat objects or objects of interest and may display icons or attention-getting patterns to the driver. Similar subsystems which monitor the driver's field of view can be utilized so that the windshield display enhances the driver's knowledge of the forward scene.
  • Energy beam or projector-like systems as mentioned above with respect to the simulated visor system can be used to highlight relevant objects in the forward windshield scenes as viewed by the driver.
  • the image stitching area should be adaptively placed, such as near a lane marker line in a side camera view.
  • the system may utilize adaptive merge planes or surfaces based on neighboring vehicles, and/or may use a merge surface that intersects the side object detection distances for the first vehicles in the left, center, and right lanes. Objects farther away and road surfaces closer to the host may appear doubled, missing, or wrongly sized, but this may only minimally affect the driver, because drivers typically track the nearby vehicles and mostly ignore the road in front of, and objects behind, the closest vehicle in the host's lane and adjacent lanes.
  • the stitching area may change, depending upon visible vehicles and/or other detected objects.
  • the image stitching process may limit distortion of dominant objects, vehicles and lane markers. From the side cameras, the adjacent lane car images do not undergo cross stitching; only closely following car sides may cross the default stitching area or lines. Each straight highway lane marker line may comprise an image taken from one of the cameras. The default stitching may be just outside the host lane marker lines for all cameras. In the illustrated embodiment, there are three lane-specific camera views (host lane, driver-side adjacent lane, and passenger-side adjacent lane), and the zipper stitching could follow the center vehicle and limit other-lane effects. The system may judge vehicle/lane priority and use that for setting the stitching and merge planes.
  • the system may merge surfaces to follow either the most dangerous vehicle (as determined based on rate of approach and/or location of the detected vehicle) or the closest vehicle.
  • the stitching may follow the lane curves as much as possible, and may use special line fitting algorithms or calculations (such as a cubic spline line fitting function or the like) so that lane markers have a substantially continuous slope if they must cross the stitching area (see the stitching-seam sketch following this list).
  • for mild curves, the system may alter the stitching a small amount, while with more severe curves, the system may need to default to a center or curve-side camera dominance for stitching and merge surfaces.
  • the merge surfaces may adapt to the presence or absence of vehicles in each of the three lanes.
  • the imaging device and control and image processor and illumination source may comprise any suitable components, and may utilize aspects of the vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454; and 6,824,281, which are all hereby incorporated herein by reference in their entireties.
  • the imaging device and/or control may be part of or share components or circuitry with other image or imaging or vision systems of the vehicle, such as headlamp control systems and/or rain sensing systems and/or cabin monitoring systems and/or the like.
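The lane-curve stitching bullet above can be illustrated with a small sketch. This is a hypothetical illustration only (the patent provides no code): it assumes two camera views already warped into a common top-down grid, uses a cubic polynomial fit in place of a full spline, and treats the fitted lane-marker curve as the seam between the two views. All names and synthetic values are assumptions.

```python
# Hypothetical sketch of a stitching seam that follows a detected lane-marker
# curve, so the seam between two overlapping camera views runs along the lane
# line rather than cutting through nearby vehicles.
import numpy as np

def lane_following_seam(marker_points, height):
    """Fit a smooth cubic curve col = f(row) through detected lane-marker points
    (row, col) and evaluate it for every image row to use as the seam."""
    rows = np.array([p[0] for p in marker_points], dtype=float)
    cols = np.array([p[1] for p in marker_points], dtype=float)
    coeffs = np.polyfit(rows, cols, 3)          # cubic fit keeps the slope continuous
    seam_cols = np.polyval(coeffs, np.arange(height))
    return np.clip(seam_cols, 0, None)

def stitch_along_seam(img_left, img_right, seam_cols):
    """Compose the merged view: pixels left of the seam come from the side camera,
    pixels right of it from the center camera."""
    height, width = img_left.shape[:2]
    out = img_right.copy()
    for r in range(height):
        cut = int(min(seam_cols[r], width))
        out[r, :cut] = img_left[r, :cut]
    return out

if __name__ == "__main__":
    h, w = 240, 320
    left = np.full((h, w), 50, dtype=np.uint8)    # synthetic side-camera view
    right = np.full((h, w), 200, dtype=np.uint8)  # synthetic center-camera view
    # Pretend lane-marker detections (row, col) sweeping gently to the right.
    markers = [(r, 120 + 0.0005 * r * r) for r in range(0, h, 20)]
    seam = lane_following_seam(markers, h)
    merged = stitch_along_seam(left, right, seam)
    print(merged.shape, merged[0, 100], merged[0, 200])
```

Because the seam is a smooth function of the image row, lane markers that cross it keep a substantially continuous slope, which is the property the lane-curve bullet above calls for.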

Abstract

A vehicle vision system includes an image sensor having a forward field of view and capturing image data of a road surface forward of the vehicle. An image processor processes the image data and the vehicle vision system determines at least an estimate of a traction condition of at least a portion of the imaged road surface. The vision system may detect objects forward of the vehicle and may distinguish between live and dead animals in the field of view of the image sensor, and may at least one of (a) generate an alert and (b) control the vehicle to assist in avoiding a collision. The system may detect situations in which the vehicle lighting system can be turned off or operated under reduced power consumption in order to enhance fuel efficiency of the vehicle.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • The present application claims the benefit of U.S. provisional application Ser. No. 61/083,222, filed Jul. 24, 2008, which is hereby incorporated herein by reference in its entirety.
  • FIELD OF THE INVENTION
  • The present invention relates generally to vehicle imaging systems.
  • BACKGROUND OF THE INVENTION
  • Vehicle vision systems or imaging systems are known. Examples of such vision and/or imaging systems are described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454; and 6,824,281, which are all hereby incorporated herein by reference in their entireties.
  • SUMMARY OF THE INVENTION
  • According to an aspect of the present invention, a vehicle vision system for a vehicle includes an image sensor having a forward field of view for capturing image data of a road surface forward of the vehicle and an image processor processing the image data. The vehicle vision system determines at least an estimate of a traction condition of at least a portion of the imaged road surface.
  • Optionally, the vehicle vision system may use the estimated traction condition (such as a friction condition, such as a state of friction or state of traction or coefficient of friction or the like) estimate of an upcoming road surface and may estimate a targeted separation gap between the host vehicle and a leading vehicle, and optionally the targeted separation gap may be adjusted and estimated based on a current driving condition. Optionally, the vehicle vision system may adjust the targeted separation gap based on the driving capabilities of the driver of the host vehicle.
  • According to another aspect of the present invention, a vehicle vision system for a vehicle includes an image sensor having a field of view and capturing image data of a scene exterior of the vehicle, a monitoring device monitoring power consumption of the vehicle, at least one lighting system that draws electrical power from the vehicle when operated, and an image processor that processes the captured image data. The electrical power drawn by the lighting system is varied at least in part responsive to processing of the image data by the image processor in order to adjust fuel consumption by the vehicle. The system thus may detect situations in which the vehicle lighting system can be turned off or operated under reduced power consumption in order to enhance the efficiency of the vehicle and enhance or maximize the miles per gallon of the vehicle during operation of the vehicle.
  • Optionally, the vehicle vision system may reduce the light generated by the vehicle lighting system in areas where it is determined that less light is desired or needed while maintaining or directing light at areas where it is determined that light is desired or needed. Optionally, the image sensor may have a forward field of view and may capture image data of a scene forward of the vehicle and in the direction of forward travel of the vehicle.
  • According to another aspect of the present invention, a vehicle vision system for a vehicle includes an image sensor and image processor. The image sensor has a field of view exterior of the vehicle for capturing image data of a scene forward of the vehicle. The image processor processes the image data and the vehicle vision system may detect and identify animals on or at or near the road and generally forward of the vehicle, and the system may distinguish the presence of a live animal from a dead animal within the field of view. The system at least one of (a) generates an alert (such as responsive to detection and/or identification of a live or dead animal within the field of view), and (b) controls the vehicle to assist in avoiding a collision (such as with a detected and/or identified animal within the field of view).
  • Optionally, the system may be adaptable to the driver's assumption of risk when operating to avoid a collision with the animal or to continue on the vehicle's path of travel. Optionally, the system may be adaptable to react differently depending on the type of animal that is detected and identified. Optionally, the system may be adaptable to react differently depending on whether the detected animal is distinguished as a live animal or a dead animal.
  • Optionally, the vehicle vision system may comprise at least two image sensors having at least one of (a) a forward field of view and capturing image data of a scene forward of the vehicle, (b) a rearward field of view and capturing image data of a scene rearward of the vehicle and (c) a sideward field of view and capturing image data of a scene to the side of the vehicle. A display may display the captured images as a merged image with image stitching of the component images to minimize artifacts of image stitching.
  • These and other objects, advantages and features of this invention will become apparent upon review of the following specification in conjunction with the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic of a vehicle vision system, showing a windshield sun visor in accordance with the present invention;
  • FIGS. 2 and 3 are schematics of a vehicle vision system, showing an animal detection system in accordance with the present invention; and
  • FIGS. 4-14 are images representative of a vehicle vision system that is operable to merge images from two or more cameras in accordance with the present invention.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Referring now to the drawings and the illustrative embodiments depicted therein, an imaging system (FIG. 1) is operable to provide a sun visor or light absorbing or light inhibiting element or device that is operable to block light from sources external to the vehicle, such as the sun, sun reflections, headlamps of oncoming vehicles and other glaring light sources, while allowing other light to pass through the vehicle windshield. For example, a sun visor may be embedded in or on or at or near a windshield that can block sun light, headlamp light and/or other glaring light sources, while leaving the rest of the scene unblocked.
  • In the illustrated embodiment, a light blocking or light limiting device or system 10 of a vehicle 11 (FIG. 1) may comprise an addressable LCD type variable transmittance element or glass substrate or windshield portion 12 between the driver's eyes and the light source. The windshield thus may comprise a “transition lens” type windshield coating that is selectively darkened by a scanning energy beam 14 (such as an ultraviolet or UV scanning energy beam or an infrared energy beam or the like) and returns to normal clearness when the energy beam is turned off. The system may adjust or “darken” the windshield portion in response to a detection of a light source that is determined to be at a location where light from the light source may cause glare to the driver of the vehicle. The number and size of the “darkened” areas on the windshield are determined by the number and the size of the glaring objects (sun, headlamps or other glaring sources), which can be determined by a forward facing camera 16 of a forward facing camera system such as described below. The size of the “darkened” area on the windshield may also be determined by the driver's eye aperture size, which varies from a smaller area (such as about 2 mm or thereabouts) in brighter lighting conditions, to a larger area (such as about 8 mm or thereabouts) in darker lighting conditions. The windshield normally attenuates, or can be designed to attenuate, most of the UV and IR wavelength bands. The windshield coating is preferably applied to the inner surface of the windshield so that the windshield serves as a cut-off filter to avoid the exposure of designated UV or IR from solar radiation and other external light sources, which may cause unintended “darkening” of the windshield. The windshield coating is only darkened by the energy beam emitted from the visor system and returns to its undarkened state when the energy beam is deactivated.
  • The energy beam system or device 15 may comprise a source device or devices, a scanner or scanners, and optics that control the beam size and shape. The source may be a laser or a light emitting diode (LED) or the like, which emits an energy beam 14 that darkens the windshield coating. The location of device 15 may be any suitable location, and may be as shown in FIG. 1 or another convenient or suitable location. Preferably, the device may be located such that the energy beam does not reflect from the windshield to the vehicle occupants' eyes or skin. The power density of the energy beam controls the darkness or the level of attenuation of the windshield coating. One may electrically control the power output level at the source. Optionally, one may control the duty cycle of pulse width modulation of the energy beam to control the power density at the windshield coating. The scanner may comprise two galvanometer scanning mirrors that scan the energy beam in the X and Y directions on the windshield, or a lens mounted on a 2-D scanning device that can deflect and scan the energy beam in the X and Y directions on the windshield, or the scanner may comprise any other suitable beam scanning means. Optionally, a digital projector-like energy beam system may be used, where a planar energy beam source, an addressable device (such as a liquid crystal device or micro mirror array or the like) and optics are used and which deliver and control the energy beams onto the windshield to form addressable darkened spots. An electric control module may be employed to control the address or coordinates and the darkness or attenuation level. This module may interface with or be a part of the central control module of the visor system.
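As one way to picture the pulse width modulation option mentioned above, the following sketch maps a requested darkness level to a PWM duty cycle so that the average power density at the coating scales with the request. The function name and all numeric values (source power, spot area, full-darkening power density) are assumptions for illustration, not parameters from the patent.

```python
# Minimal sketch (an assumption, not the patent's implementation) of mapping a
# requested darkness level for a windshield spot to a pulse-width-modulation duty
# cycle for the energy-beam source, so average power density scales linearly.
def duty_cycle_for_darkness(darkness, max_power_w, spot_area_cm2,
                            full_dark_density_w_per_cm2=0.5):
    """Return the PWM duty cycle (0..1) that delivers the average power density
    needed for the requested darkness level (0 = clear, 1 = fully darkened)."""
    if not 0.0 <= darkness <= 1.0:
        raise ValueError("darkness must be between 0 and 1")
    needed_density = darkness * full_dark_density_w_per_cm2      # W/cm^2
    peak_density = max_power_w / spot_area_cm2                   # W/cm^2 at 100% duty
    return min(1.0, needed_density / peak_density)

# Example: a 0.2 W source focused to a 0.05 cm^2 spot, asked for 60% darkness.
print(duty_cycle_for_darkness(0.6, max_power_w=0.2, spot_area_cm2=0.05))  # 0.075
```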
  • The system may include an object detection device or system (such as a forward facing camera or image processor and associated processor or control circuitry) and a driver detection device or system, which is operable to determine the location of the driver's head or eyes (such as via a stereo camera, a structured light and camera, and/or the like), whereby the system or systems may determine whether or not light from a detected light source may cause glare to the driver of the vehicle, and may determine the level of “darkening” or light attenuation needed, and the address or coordinates of the “darkened” areas. Spatial coordinates transformation and computations that involve the measured angular coordinates of the glaring objects, the driver's eye position coordinates, the position of the cameras, as well as the position and angle of the windshield, will result in the position of the “darkened” areas on the windshield. Optionally, the travel direction or heading of the vehicle and/or a global positioning system may be used to determine whether a detected light source is at a location that may cause glare to the driver. Optionally, the system may determine the location or angle or setting of the vehicle mirrors to indicate or approximate the location of the head of the driver of the vehicle to assist in determining whether light from a detected light source may cause glare to the driver of the vehicle. Optionally, a forward facing camera or imaging sensor may capture images of the scene occurring forward of the vehicle that may encompass the sun and headlamps of oncoming vehicles. The system may distinguish detection of the sun as compared to detection of headlamps of oncoming vehicles because the sun is slow moving unless the vehicle is turning, while the motion of headlamps is faster when the headlamps are near the host vehicle.
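The spatial coordinate transformation described above can be sketched as a simple ray-plane intersection, assuming a flat windshield plane and a common vehicle coordinate frame (x forward, y left, z up). The coordinates, plane model, and function name below are illustrative assumptions, not the patent's actual computation.

```python
# Geometric sketch: given the driver's eye position and the angular direction of
# a glare source, find where the eye-to-source ray crosses the windshield plane;
# that point is the centre of the area to darken. The windshield is approximated
# as a flat plane with a known point and normal.
import numpy as np

def darkened_spot_on_windshield(eye_pos, glare_azimuth_deg, glare_elevation_deg,
                                plane_point, plane_normal):
    """Intersect the ray from the eye toward the glare source with the windshield
    plane and return the intersection point (the spot to darken)."""
    az = np.radians(glare_azimuth_deg)
    el = np.radians(glare_elevation_deg)
    # Unit ray in the assumed vehicle frame: x forward, y left, z up.
    ray = np.array([np.cos(el) * np.cos(az), np.cos(el) * np.sin(az), np.sin(el)])
    n = np.asarray(plane_normal, dtype=float)
    denom = ray.dot(n)
    if abs(denom) < 1e-9:
        return None                      # ray parallel to the windshield plane
    t = (np.asarray(plane_point, dtype=float) - eye_pos).dot(n) / denom
    return eye_pos + t * ray

# Hypothetical numbers: eye 0.6 m behind a windshield plane tilted 30 degrees back.
eye = np.array([0.0, 0.0, 1.2])
spot = darkened_spot_on_windshield(eye, glare_azimuth_deg=5.0, glare_elevation_deg=10.0,
                                   plane_point=[0.6, 0.0, 1.2],
                                   plane_normal=[np.cos(np.radians(30)), 0.0,
                                                 np.sin(np.radians(30))])
print(spot)
```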
  • The image processor may process the captured images and may generate addresses or coordinates for the visor pixels and transmittance of light through the visor. Thus, upon detection of and identification of or recognition of a glaring light source (such as the sun or headlamp of an oncoming vehicle), such as in the forward path of the vehicle, the system may determine an appropriate window area that is to be “darkened” or that is to have a reduced transmissivity of light therethrough (such as an area or region of the windshield between the detected light source and the driver's eyes), and may actuate the energy source to scan or raster the energy beam across the appropriate window area to effectively darken the windshield at that area while allowing light to pass through the rest of the windshield substantially unaffected by operation of the energy source.
  • Optionally, for example, a window dimming device may comprise a window having at least a portion that is treated with a coating and an energy emitting device that is operable to emit energy toward a targeted region of the window. The coated portion of the window is selectively darkened by energy emitted by the energy emitting device. The energy emitting device may emit a scanning energy beam comprising one of an ultraviolet scanning energy beam and an infrared scanning energy beam. The window may comprise a window of a vehicle or other transparent or substantially transparent window or glass or polymeric substrate. The energy emitting device emits energy toward a selected portion of the window portion to darken the selected portion in response to a detection of a light source that is determined to be at a location where light from the light source may cause glare to a driver or occupant of a vehicle.
  • Optionally, the window darkening system may be suitable for use in non-automotive or non-windshield applications as well. For example, the system may be utilized at other vehicle windows (such as side windows or a rear backlite or a sun roof or the like). Optionally, it is envisioned that aspects of the darkening system may be suitable for use in or on eyeglasses (such as sunglasses or prescription glasses or the like). For example, the size of each blocking area may approximate the aperture size of the human eye, and may vary from a smaller area (such as about 2 mm or thereabouts) in brighter lighting conditions, to a larger area (such as about 8 mm or thereabouts) in darker lighting conditions. Thus, a smaller number of addressable areas are needed in applications on glasses, since glasses may only have about 25-30 units or pixels across their width and fewer units or pixels across their vertical dimension, whereby the maximum area (in pixels or units to be scanned or energized) may be less than about 500 area units for glasses. The eyeglasses of the present invention may have an energy beam system similar in concept to the windshield system of FIG. 1.
  • Optionally, for example, an eyeglass dimming device for eyeglasses may comprise an eyeglass lens or optic element (such as supported in an eyeglass frame for viewing through by a person wearing the eyeglasses) having at least a portion that is treated with a coating. An energy emitting device (that may be disposed at the eyeglasses, such as at the frame of the eyeglasses or the like) is operable to emit energy toward a targeted region of the lens. The coated portion of the lens is selectively darkened by energy emitted by the energy emitting device. The energy emitting device emits energy toward a selected portion of the lens portion to darken the selected portion in response to a detection of a light source that is determined to be at a location where light from the light source may cause glare to a wearer of the eyeglasses.
  • Optionally, the eyeglasses may have individually addressable elements, such as in liquid crystal displays similar to computer laptop displays, or such as in spatial light modulators. With such addressable elements in the eyeglasses, the energy beam system is not needed. Such dimmable or selectively darkenable sunglasses may be suitable for driving glasses and/or for some sports glasses, such as, for example, golf glasses (where the glasses may be selectively dimmed to reduce glare from the sun when the golfer looks up to follow the flight of the ball, but the rest of the glasses are not dimmed or darkened to allow the golfer to follow the flight of the ball when it is not between the sun and the golfer's eyes) or the like.
  • Optionally, for example, an eyeglass dimming device for eyeglasses may comprise an eyeglass lens or optic element (such as supported in an eyeglass frame for viewing through by a person wearing the eyeglasses) with addressable elements. The addressable elements are selectively darkened by the eyeglass dimming device (such as an electronic control or the like that may be disposed at the eyeglasses, such as at the frame of the eyeglasses or the like) in response to a detection of a light source that is determined to be at a location where light from the light source may cause glare to the eyeglass wearer.
  • Optionally, an imaging system of the present invention may include multiple headlights, such as multiple forward facing light emitting diodes (LEDs), where the intensity of the LEDs can be controlled so that a machine vision system can see the light variations emitted by the LEDs, while a human may not discern such variations. Typically, humans cannot see more than 60-70 Hz variations and for isolated flashes humans typically can sum photons up to 100 ms. Thus, the system may selectively energize or activate the LEDs so they blast or emit light at short intervals (faster than the threshold rate at which humans may detect the flashing of the lights) and humans may not see or discern the blast from the overall illumination integrated at slower time intervals. Thus, the system can blast or emit light forwardly of the vehicle and may detect or see a substantial increase or rise in reflected light as captured by a forward facing camera or imaging system, and if done at a high enough rate or a short enough blast or interval, a human cannot see or discern the presence of the blast of light. The system thus may provide an easy way to see if detected light sources or items (in images captured by the camera or imaging sensor) are reflective items or objects (such as signs or the like) or light sources (such as headlamps of oncoming vehicles or taillights of leading vehicles or the like). Such a system may be suitable for use in intelligent headlamp control systems and/or other automotive vision tasks for machines.
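One way to picture the blast-and-compare idea described above is the sketch below: it assumes the camera can capture one frame during a brief LED blast and one with the blast off, and it classifies bright regions as reflective or emissive from the brightness difference. Thresholds, array sizes, and function names are illustrative assumptions.

```python
# Illustrative sketch (assumed, simplified) of the reflective-versus-emissive test:
# regions whose brightness jumps during the brief blast are reflecting the vehicle's
# own light (signs, markers); regions that stay bright on their own are light
# sources such as oncoming headlamps or leading taillights.
import numpy as np

def classify_bright_regions(frame_blast_on, frame_blast_off,
                            bright_thresh=180, rise_thresh=40):
    """Return boolean masks (reflective, emissive) over the image grid."""
    on = frame_blast_on.astype(np.int16)
    off = frame_blast_off.astype(np.int16)
    bright = np.maximum(on, off) > bright_thresh
    rise = (on - off) > rise_thresh
    reflective = bright & rise            # lights up only when we illuminate it
    emissive = bright & ~rise             # bright regardless of our illumination
    return reflective, emissive

# Synthetic example: one patch that brightens with the blast, one that does not.
off = np.full((4, 4), 30, dtype=np.uint8)
on = off.copy()
on[0, 0] = 220          # retro-reflective sign: bright only with the blast
off[3, 3] = 210
on[3, 3] = 215          # oncoming headlamp: bright in both frames
refl, emis = classify_bright_regions(on, off)
print(refl[0, 0], emis[3, 3])   # True True
```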
  • Optionally, the system may utilize super-fast lighting for the machine vision system to “learn” the environment without alienating the driving public (such as drivers of other vehicles on the road with the host vehicle). Optionally, the system may utilize different lights (such as different colored lights), and may use lights that, when energized together, sum perceptually to a white colored light, but that may flash different color components that would be discernible to machine vision while being imperceptible or not readily discernible to the human eyes.
  • Optionally, and although some regulatory constraints exist, the system may utilize multiple LED headlights, whereby the headlight orientation and intensity can be controlled quickly by the vision or imaging system. This allows the vision system to provide enhanced illumination when desired and may reduce or increase lighting of respective regions in response to various inputs, such as inputs from an object detection system or the like. The system thus may be operable to increase or reduce the intensity of the headlights as desired or appropriate, and the lights may be controlled to provide a tailored illumination of the area forward of and/or sideward of the vehicle. For example, the lights may be selectively activated or energized and/or aimed to illuminate the area forward of the vehicle, while substantially not illuminating or directing light toward areas where other vehicles are located (such as an oncoming vehicle or a leading vehicle on the road with the subject or host vehicle).
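A minimal sketch of the selective illumination idea follows, under the assumption that each LED of a multi-LED headlamp covers one horizontal slice of the forward camera image; segments whose slice overlaps a detected vehicle are dimmed while the rest stay at full intensity. The segment count, image width, and intensity levels are hypothetical.

```python
# Sketch of tailoring a multi-LED headlamp from detections in the forward image.
def segment_intensities(detections, num_segments=8, image_width=640,
                        full=1.0, dimmed=0.2):
    """detections: list of (x_left, x_right) pixel spans of detected vehicles.
    Returns one intensity per LED segment, left to right."""
    seg_width = image_width / num_segments
    levels = [full] * num_segments
    for x_left, x_right in detections:
        first = max(0, int(x_left // seg_width))
        last = min(num_segments - 1, int(x_right // seg_width))
        for s in range(first, last + 1):
            levels[s] = dimmed            # keep glare away from this vehicle
    return levels

# Example: an oncoming vehicle detected between pixel columns 240 and 330.
print(segment_intensities([(240, 330)]))   # segments 3 and 4 dimmed
```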
  • Such a tailorable lighting or vision system (which may adjust the direction of the lighting in response to a forward facing camera that detects objects or vehicles in front of the host vehicle or a gaze detection device that detects the gaze direction of the driver of the host vehicle and adjusts the headlights accordingly, such as by utilizing aspects of the systems described in U.S. patent application Ser. No. 12/171,436, filed Jul. 11, 2008 by Higgins-Luthman et al. for AUTOMATIC LIGHTING SYSTEM WITH ADAPTIVE ALIGNMENT FUNCTION, published Jan. 15, 2009 as U.S. Patent Publication No. US2009/0016073; and U.S. provisional application Ser. No. 60/949,352, filed Jul. 12, 2007, which are hereby incorporated herein by reference in their entireties) may provide enhanced or targeted illumination and may provide enhanced safety for the driver and passenger(s) of the host vehicle, pedestrians, other vehicles, and/or animals, and may enhance the detection of objects or obstacles or road irregularities on the road surface on which the host vehicle is traveling. Optionally, an output of the vision system may provide an input for vision systems of other vehicles.
  • Optionally, it is envisioned that, by selectively activating and deactivating some of the light sources and selectively increasing and decreasing the intensity of light sources so as to not constantly brightly illuminate areas that are not of interest to the driver of the vehicle, the vision system may provide a reduced power consumption by the vehicle during operation of the headlights as compared to operation of conventional headlamps. For example, avoiding a 4-10 percent (or thereabouts) power consumption loss with the lights on may result in a corresponding 4-10 percent (or thereabouts) increase in fuel efficiency for the host vehicle while driving at night. This may be a significant improvement to vehicle manufacturers or owners/operators of fleets of vehicles or truck companies or the like, and may assist the vehicle manufacturers in meeting increased Corporate Average Fuel Economy (CAFE) requirements. Also, by reducing the light emitted by the vehicle headlights when it is not needed or even desired, the overall “light pollution” may be reduced.
  • Optionally, for example, the vehicle includes at least one lighting system that draws power from the vehicle when operated and the vision system may include a monitoring device that monitors the electrical power consumption of the lighting system and/or vehicle. The image processor may process captured image data and may detect situations in which the vehicle lighting system can be turned off or operated under reduced power consumption in order to maximize the fuel efficiency or miles per gallon of the vehicle in a safe manner without reducing the light output in a way that may adversely affect the viewability of the scene by the driver. The electrical power drawn by the at least one lighting system thus may be varied (such as reduced) at least in part responsive to the image processor in order to adjust (such as reduce) fuel consumption by the vehicle.
  • The vehicle vision system may reduce the light generated by the vehicle lighting system during driving conditions when less vehicle lighting is desired while directing light at areas where it is determined that light is desired. Optionally, the image sensor may have a forward field of view and may capture image data of a scene forward of the vehicle and in the direction of forward travel of the vehicle. The system may control or reduce fuel consumption, such as gasoline consumption or electrical power consumption (such as for an electric vehicle or the like) or other types of fuels utilized for operation of the vehicle and/or lighting system. Optionally, the system may control or reduce or minimize vehicle emissions responsive at least in part to the image processor.
  • Optionally, the vision system may detect ice or water in or on the road surface in front of the vehicle. For example, the vision system may utilize aspects of the systems described in U.S. patent application Ser. No. 11/948,086, filed Nov. 30, 2007, which is hereby incorporated herein by reference in its entirety, and may warn the driver of hard to see black ice. Optionally, such a system may measure water depth. Optionally, the system may be operable to identify the road surface (such as asphalt, concrete, metal, rain, snow, ice, water or the like) ahead of the vehicle and on which the vehicle is traveling and any associated road surface coatings, such as via processing image data captured by a forward facing imaging sensor or the like, and may determine (such as via a look up table or database) at least an estimate of a traction condition of or a friction condition of or the coefficient of friction for that road surface and/or coating or a portion of the road surface ahead of the vehicle. For example, if the system determines that the upcoming surface looks just like the current surface did, the system can determine that the traction condition or coefficient of friction will probably be the same as it was for the current road surface. If, on the other hand, the system determines that the upcoming road surface (or at least a portion thereof) looks different, the system can prepare for different traction on the upcoming surface as compared to the current traction. The system may adjust a traction control system or cruise control system or may generate an alert to the driver of the vehicle responsive to a detection of a change in traction or traction condition on the road surface ahead of the vehicle.
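The look-up-table approach described above might be sketched as follows; the surface classes and friction values are illustrative assumptions, not calibrated data from the patent.

```python
# Minimal sketch of a surface/coating-to-friction look-up table and a check for a
# meaningful traction change between the current and upcoming road surface.
FRICTION_TABLE = {            # illustrative nominal values only
    ("asphalt", "dry"): 0.8,
    ("asphalt", "wet"): 0.5,
    ("concrete", "dry"): 0.8,
    ("asphalt", "snow"): 0.3,
    ("asphalt", "ice"): 0.1,
}

def estimate_traction(surface, coating, default=0.5):
    return FRICTION_TABLE.get((surface, coating), default)

def traction_change_alert(current, upcoming, drop_thresh=0.15):
    """Alert if the upcoming surface estimate is meaningfully worse than the
    current one, e.g. dry asphalt ahead of a patch identified as ice."""
    mu_now = estimate_traction(*current)
    mu_ahead = estimate_traction(*upcoming)
    return (mu_now - mu_ahead) > drop_thresh, mu_ahead

print(traction_change_alert(("asphalt", "dry"), ("asphalt", "ice")))
# (True, 0.1) -> warn the driver / prepare traction and cruise control
```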
  • Optionally, the system may use estimates of the host tire contribution to the traction condition or coefficient of friction when calculating or estimating the traction condition or coefficient of friction between the vehicle tires and the road surface. Optionally, the vehicle may estimate the traction condition or coefficient of friction or a change in the traction condition or coefficient of friction based on a detection of movement of other vehicles or objects on the road surface. For example, if the vehicle is traveling on a curve and a leading vehicle moves in a manner indicative of a skid or slide, then the system may determine that the traction condition or coefficient of friction may be reduced ahead of the host vehicle.
  • Optionally, the system may calculate or determine the traction condition or coefficient of friction by using the relationship between water, ice, road surface, speed and coefficient of friction. For example, the stopping distance is related to the square of the vehicle speed and the coefficient of friction. The stopping distance gets worse (larger or longer) with increased speed, and the stopping distance is inversely related to the coefficient of friction. Optionally, the system may have a table and/or calculation database embedded in the processor to assist in determining the traction condition or coefficient of friction.
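The stated relationship (stopping distance grows with the square of speed and is inversely related to the coefficient of friction) corresponds to the standard braking approximation d = v^2 / (2 * mu * g). The short example below works through that formula with nominal friction values; the numbers are illustrative, not from the patent.

```python
# Worked example of the braking-distance relationship (reaction distance excluded).
G = 9.81  # m/s^2

def braking_distance(speed_mps, mu):
    """Braking distance grows with the square of speed and shrinks as the
    coefficient of friction mu increases."""
    return speed_mps ** 2 / (2.0 * mu * G)

v = 29.0  # about 65 mph in m/s
for mu in (0.8, 0.4, 0.1):          # dry asphalt, wet road, ice (nominal values)
    print(f"mu={mu}: {braking_distance(v, mu):.0f} m")
# mu=0.8: ~54 m, mu=0.4: ~107 m, mu=0.1: ~429 m (several times longer on poor surfaces)
```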
  • The vision system may be operable to identify the water, snow and/or ice up ahead as well as near the tires of the vehicle. Because antilock brakes can be worse than standard brakes if snow or gravel piles into a dam when brakes lock up, it is beneficial that the vision system may be operable to identify such build up of snow or gravel in front of the vehicle.
  • Systems for estimating the coefficient of friction are generally for the tire road interface during actual braking. While this may be helpful (such as for antilock braking systems), it is still a reactive process. Knowing the depth of water and ice on an upcoming road surface would allow preparation of the braking system, and an equivalent risk of collision gap could be set for adaptive cruise control systems. For example, the stopping distance can be altered by a factor of 2 or 3 or more by knowing the conditions of the road ahead. Although better brake reactions are good, predictive knowledge is better.
  • Optionally, the vision system may be operable in conjunction with an adaptive cruise control system. Optionally, the adaptive cruise control system may function to keep or maintain the gap between the host vehicle and the leading vehicle at a substantially constant time to collision standard or a separation distance standard. Optionally, the vision system may use the traction condition or coefficient of friction measures ahead of the host vehicle to change the separation gap based on a determined or calculated or estimated stopping distance (based on the speed of the vehicle and the traction condition or coefficient of friction of the road surface ahead of the vehicle).
  • Optionally, the vision system may utilize measures of driver capability (in-vehicle), a template drive over golden routes, visibility, threat measures and/or the like to adjust or tune the stopping and steering distances for adaptive live measures rather than pre-set values. For example, the system may adjust the traffic gap or separation distance as a function of a predetermined standard traffic gap as a standard safety margin by a standard driver and vehicle, such as a young driver with clear, 20/20 color vision, and normal visual threshold and contrast and reaction time for different contrast color brightness, and the like. For example, the system may compare the host vehicle driver to the standard driver and adjust the separation gap or time to collision accordingly.
  • The stopping distance and/or separation gap may be calculated or determined or estimated as a function of the traction condition or coefficient of friction, the driver's reaction time, the visibility of the leading vehicle or object or obstacle in front of the host vehicle, the time of day, any indications of driver alertness, the separation gap between the host vehicle and the detected object or leading vehicle, the tire tread of the vehicle's tires, other cars in a cocoon around the vehicle, a multi-axis accelerometer, and/or the like. Such information may be gathered by and/or utilized with various vehicle systems, such as an adaptive cruise control system, an intelligent headlamp control system, a forward facing camera, a forward collision warning (FCW) system, a blind spot detection/lane change aid (BSD/LCA) system, a reverse facing camera, or a side mounted camera looking downward near parking areas (such as used in Japan), a global positioning system (GPS), a temperature sensor, a humidity sensor, a traction condition or coefficient of friction detection or determination and/or other information for upcoming vehicles, an electronic stability control, an internal driver view, a miner's light and/or the like. Optionally, the stopping distance could be fed into an intelligent transportation system (ITS) as a weighted sum over leading vehicles, and this closeness to the next vehicles could be fed back into the ITS and also into a center high mounted stop lamp (CHMSL) type light or brake light steganography for other vision systems. If the host vehicle has a vision system, then it should monitor the driver and the environment so that other vehicles are warned if unsafe actions are going to occur, or are probably or possibly going to occur. Optionally, if the host vehicle has a miner's light, then the light may be adjusted or directed to provide enhanced light on areas of concern.
  • Optionally, the concept of vehicles adapted for handicapped drivers may be extended to all drivers, because all drivers at various times are handicapped or challenged. For example, the vision system may detect characteristics of the driver that may be indicative of the driver being inattentive, drowsy, under the influence of substance use, bored, young/old, healthy, having less than 20/20 vision, color deficiency, a poor field of view, poor contrast sensitivity, poor clutter analysis, poor reaction time, or poor car maintenance, or encountering a challenging environment, such as rain, snow, fog, traffic in the cocoon around the vehicle, a reduced safe space around the car, poor lighting, unsafe areas with past accidents, icy conditions, curves, intersections and/or the like. The driver assistance system may tend to make each driver have at least a minimum standard of equivalent safety margin, until such time as there exists totally automatic traffic. Optionally, the system may inform or alert other drivers of a probability that the driver of the host vehicle is potentially less capable than a standard driver in semi-objective ways, such as via communication of such information via a wireless communication or steganographic lighting communication or the like. For example, a normal 60 meter gap is a 2 second gap between vehicles traveling at 65 mph, but the slower reaction time of an older driver and the probability of ice may make the appropriate gap for a forward collision warning about 120 meters. The forward collision warning system, if detecting a gap of less than a threshold level (based on the particular driver and driving conditions), such as less than about 0.7 of the calculated or determined gap (such as 120 meters for the given example), may provide a warning or alert to the driver of the host vehicle, or may provide a steganographic warning to the leading vehicle so that the leading vehicle may relay a warning back to the driver of the host vehicle, such as through the leading vehicle's CHMSL brake light or the like. In such a situation, the standard 60 meter gap is less meaningful, since what is truly desired is that the particular driver of the vehicle keeps the gap to the leading vehicle in such a way that the driver can stop safely if the leading vehicle suddenly decelerates or brakes. This depends upon how good the driver is and how good the vehicle is at stopping, and a blanket 60 meter distance obscures all these individual differences.
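The 60 meter / 120 meter example above can be sketched as reaction distance plus braking distance, with a warning when the measured gap drops below roughly 0.7 of the requirement. The reaction times and friction values below are assumptions chosen only to land in the same ballpark as the example, not figures from the patent.

```python
# Sketch of the gap logic: required gap = reaction distance + braking distance,
# and the forward collision warning fires below a fraction of that requirement.
G = 9.81

def required_gap(speed_mps, reaction_time_s, mu):
    return speed_mps * reaction_time_s + speed_mps ** 2 / (2.0 * mu * G)

def should_warn(measured_gap_m, speed_mps, reaction_time_s, mu, fraction=0.7):
    return measured_gap_m < fraction * required_gap(speed_mps, reaction_time_s, mu)

v = 29.0  # ~65 mph
# A quick standard driver on dry asphalt versus a slower driver with possible ice.
print(round(required_gap(v, 1.0, 0.8)), round(required_gap(v, 1.5, 0.5)))  # ~83 vs ~129 m
print(should_warn(60.0, v, 1.5, 0.5))   # True: 60 m is not enough in the second case
```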
  • Thus, the camera in the host vehicle can measure different contrast color brightness, direction, the number of items in the field of view and/or the like. When the automatic control systems are disabled, an advanced vehicle system can measure or determine the reaction time for various driver activities. For example, high-low beam switching, average forward collision distance, blind spot detection, braking time, relative vehicle positioning within the host vehicle lane as measured by a lane departure warning system (LDW), steering wheel movements, and following capability may be determined by the vision system and compared to the “standard driver”. The determination of driver capability may be communicated ostensibly or steganographically to ITS systems and/or other systems or entities. Optionally, for example, such data can be relayed to driver licensing bodies and to other drivers and to the driver himself/herself.
  • Optionally, the system may adjust automatic controls or the like to match the driver if desired. For example, the intelligent headlamp control (IHC) detection distance can be shorter or longer based upon how the driver behaves, such as for applications where the control or behavior is a matter of preference and not safety. Optionally, some applications will not match the driver, but will compensate for driver deviations from the standard, and may make all drivers generally equally (or similarly) safe, even if the particular driver has defects or driving deficiencies.
  • Optionally, the system may assist beginning and senior drivers, such as by utilizing traffic sign recognition (TSR), IHC, LDW, adaptive cruise control (ACC) and/or GPS navigating to monitor the behavior of the driver, and score it along multiple risk dimensions so that the car can behave like a normal car, or more nanny-like (such as for training of the driver or easing the driving for the more handicapped or deficient drivers). For example, teenage drivers typically have good perception, poorer judgment, and quick reflexes, while seniors typically have poorer perception, slower reflexes and better driver learning but are sometimes forgetful and have worse workload performance. The purchaser of the vehicle (or of an aftermarket or add-on feature to a cell phone or on aftermarket vehicle system) may want to continue the “nanny-ness” over the teenage driver even when the parent exits the car, and may want to have the elderly parent continue semi-independent driving as long as safely possible. Thus, the system may have an override feature, but that feature may be turned off or not available for some drivers of the vehicle.
  • Optionally, beginning and senior drivers could use an adaptive record of how to drive standard routes and commutes. For example, GPS, start and end choices, and TSR allow overall route identification. The driving record by average of trips or a “golden drive” by a good driver leads to a record of speed, acceleration and braking, lane deviation from center, typical IHC dimming distances, coefficient of friction interaction with precipitation (traction and rain sensor), and this may be extended to an ACC system to allow measurement of any deviation from a benchmark drive so that performance of a suboptimal driver can be identified so that the vehicle risk management behaviors can be tailored for the current driver. Such records could be sent to parents and adult children for monitoring of driver performance. Optionally, inside monitoring of passenger count could bias risk management for teenage drivers (who typically drive worse with more passengers in the vehicle). Optionally, the system may alert the driver of such driving deficiencies or deviations from the expected or targeted performance. The system may use optimally perceived warnings for teenage drivers (who may hear higher frequencies so that they alone will be warned, but nearby adults will be spared the sound).
  • Optionally, the vision system may be operable to detect animals or objects in the road or path of travel of the host vehicle. For example, the system may detect animals, both live and road-kill, and may identify the animals based on expected behavior patterns and size/shape of animals (such as determined from an animal database). Thus, the system may, for both large (deer) and small (pets) animals, provide optimum detection and evasive action, such as by detecting deer in air for pre-collision settings and by detecting static road-kill and moving live animals, and providing an analysis of hit versus drive-over animals and dodge probabilities versus driver risk preferences, as discussed below. For example, pet lovers and vegetarians may choose more risky maneuvers to avoid animal impacts, while hunters may simply want to maximize driver safety with less risk, and without as much concern for the animal.
  • The vision system may use laser line patterns and triangulation to detect animals and/or the like, such as by utilizing aspects of the machine vision of upcoming road surface for predictive suspension described in U.S. patent application Ser. No. 12/251,672, filed Oct. 15, 2008, and published Apr. 16, 2009 as U.S. Patent Publication No. US2009/0097038; and U.S. provisional application Ser. No. 60/980,265, filed Oct. 16, 2007, which are hereby incorporated herein by reference in their entireties. Such a system may provide localized accurate sensors for ranges less than 6 meters. Optionally, and with reference to FIG. 2, the vision system of a vehicle 11 may detect the driver's gaze direction 20 and a forward facing camera 22 of the vehicle 11 may be aimed in that direction to capture images of the object or animal 24 that the driver is looking at, whereby the image data may be processed to detect and identify the object or animal and to control the vehicle or provide an alert accordingly. The vision system may provide the potential for detecting and identifying animals as a special case of a bump (some detected “bumps” may be dead or live animals) in the road, and may provide the potential for long range detection and identification of larger animals within the standard vision scene. Optionally, the system may be operable to distinguish dead animals from live animals (live animals move while dead animals do not, and live animals are warm while dead animals typically are not; and this may be detected by a heat sensing device or visible or near-infrared or thermal-infrared sensors or the like). Thus, the vision system may detect and identify animals on the road or in the path of travel of the vehicle, and may provide an alert or may take evasive action to avoid the detected animal.
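A simplified sketch of the live/dead distinction mentioned above follows, combining motion between frames with warmth relative to the road as might come from a thermal-infrared sensor. The thresholds, inputs, and function name are assumptions for illustration; a production system would use far richer cues.

```python
# Illustrative live/dead scoring for a detected animal-shaped object: either
# inter-frame motion or thermal contrast against the road pushes it toward "live".
import numpy as np

def classify_animal(patch_t0, patch_t1, thermal_patch, road_temp_c,
                    motion_thresh=8.0, warm_thresh_c=5.0):
    """patch_t0/patch_t1: grayscale crops of the detection in consecutive frames.
    thermal_patch: temperatures (deg C) over the detection from a thermal sensor."""
    motion = float(np.mean(np.abs(patch_t1.astype(np.int16) -
                                  patch_t0.astype(np.int16))))
    warmth = float(np.mean(thermal_patch) - road_temp_c)
    if motion > motion_thresh or warmth > warm_thresh_c:
        return "live"
    return "dead_or_static"

# Synthetic example: no motion between frames, object near road temperature.
a = np.full((8, 8), 90, dtype=np.uint8)
b = a.copy()
thermal = np.full((8, 8), 11.0)
print(classify_animal(a, b, thermal, road_temp_c=10.0))   # dead_or_static
```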
  • Optionally, the vision system may detect and identify animals, such as dead animals, by comparing image data of a detected object (indicative of the size, shape, height, color, profile, and/or the like of the object) to a database of dead animal profiles (such as profiles of dead animals as viewed at high speed by drivers). Optionally, and with reference to FIG. 3, the system may determine the size of the object or animal by processing image data over time, and may match the height of the object (such as the height or location of the detected object in the images captured by the forward facing camera) against the tire location so that evasive action could be programmed into the active steering and braking for minimal interruption and risk to the driver's path (a minimal sketch of such a tire-path check follows this description). For example, when a detected object on the road surface is about 50 meters in front of the equipped vehicle, the object may appear at one height and position in the captured images, and as the vehicle approaches the detected object, the object in the images captured by the forward facing camera may lower toward the road surface and increase in size, and such position at the road surface may be compared to the tire position of the equipped vehicle to determine if evasive action is necessary or desired. Such a determination of evasive action may be responsive to the detected location and/or size of the detected object and/or the steering angle or vehicle path of the equipped vehicle. As shown in FIG. 3, the area about five to eight meters immediately in front of the vehicle may be a blind zone or area where the driver may not readily view the road surface along which the vehicle is traveling. The system thus may determine the predicted path of the vehicle's tires to determine if the tire or tires may impact the detected object (which may not be visible to the driver as the vehicle further approaches the object).
  • Optionally, the vision system may detect and identify live animals by comparing image data of a detected object (indicative of the size, shape, height, color, profile, and/or the like of the object) to a database of live animal profiles (such as profiles of live animals as viewed at high speed by drivers). The database may include data pertaining to probable animal movements, locations of animals, and probable animal reactive movements as the animal gets closer to an approaching vehicle. The system may apply the static-obstacle vehicle countermeasures of the dead animal scenario.
  • Optionally, the vision system may be responsive to a user input, whereby the driver of the vehicle can input a utility function tailored to his or her preferences. For example, the driver could select a level of risk that is acceptable to the driver in order to miss a detected animal. For example, some people may take no risk to avoid hitting a small animal, but may tolerate an increased accident risk level to avoid a collision if the animal in the path of the vehicle is identified as a dog or cat. Typically, a driver may take risks to avoid collisions with large animals, such as deer. If the system determines that a collision is imminent, then the system may trigger the vehicle's mitigating behaviors.
  • Optionally, the vision system may differentiate or distinguish between animal detection and pedestrian detection. For example, the animal detection system may detect erect, moving and dead/injured beings, and a pedestrian detection system or subsystem may be targeted for detecting erect pedestrians. The vision system may take recommended actions in response to detection of an object or animal, and the actions may be different depending upon the vehicle speed, collision probability, size of the detected animal, driver preferences, legal preferences, kind of animal, and whether or not the animal may be a human. Optionally, the system may be adaptable to the rules or regulations of the governmental or regulatory bodies of the region in which the vehicle is traveling, since governmental and/or regulatory bodies may mandate evasive actions and risky behaviors for different animals. For example, in India, the system may be tailored or adapted to avoid hitting cattle in order to protect cattle; in Australia, the system may be tailored or adapted to avoid hitting koalas and/or kangaroos; and in the United States, the system may be tailored or adapted to avoid hitting common pets. Thus, the vision system offers an objective way to accommodate regulations which vary from place to place. Optionally, a vehicle-based global positioning system (GPS) could activate different system actions related to protected animals in the region or regions in which the vehicle is traveling (a minimal sketch combining driver risk preference and such region-specific rules follows this description).
  • Optionally, the vision system may include a rearward facing camera or image sensor and may be used in conjunction with a back up assist system or reverse aid system or rear vision system or the like. The system may include a display device or display screen for viewing by the driver of the vehicle, and may provide a graphic overlay (such as by utilizing aspects of the systems described in U.S. Pat. Nos. 5,670,935; 5,949,331; 6,222,447; and 6,611,202, and/or PCT Application No. PCT/US08/76022, filed Sep. 11, 2008, and published Mar. 19, 2009 as International Publication No. WO2009/036176; and/or U.S. provisional application Ser. No. 60/971,397, filed Sep. 11, 2007, which are hereby incorporated herein by reference in their entireties) at the displayed image to enhance the driver's viewing and understanding of the displayed image. Optionally, and desirably, if the vehicle is not in the reverse gear position, the graphic overlays are not presented. Optionally, the system may detect the contrast of an image or of a graphic overlay at the image and may adjust the contrast sensitivity (which depends on the surroundings) of the display device. For example, the system may utilize different look-up-tables (LUTs) to map input gray levels to output gray levels (a minimal sketch of such a LUT mapping follows this description). Because about 8 percent of the population has some defect or deficiency in color vision, the system may be operable to provide tunable settings to help the driver better see, view and discern the displayed image (especially for those people with more severe defects), so that the driver or person viewing the display may experience enhanced viewability and discernibility of the displayed image and/or may see and discern the graphic overlays better than with a standard contrast setting. Thus, the system may provide a display with tunable colors (such as for a three-color graphic overlay) so that the approximately 2-8 percent of the population with color vision deficiencies do not see an overlay with only brown and yellow, but will be able to see and discern the three different colors of the graphic overlay.
  • Optionally, the vision system may include or provide a detector/tracking system that detects an object rearward or to the side of the vehicle and that tracks or highlights the object at the rearview mirror (such as at the interior rearview mirror and/or the exterior rearview mirror or mirrors) so that the driver can readily see and discern the detected/highlighted object in the reflected image. Angularly adjustable mirrors mounted inside or outside a vehicle are normally positioned to display to the driver reflections of the vehicle's surrounding area. The vision system may provide overlaid patterns or lights that are made visible upon the mirror, or at the mirror boundary, at positions which indicate to the driver an object, vehicle, or area that is a potential collision danger, or a source of such dangers, to the vehicle. These areas could include the vehicle's “blind spots”. The lights or patterns may be provided at regions outside of the viewing region or may be provided within the viewing region, such as via a display-on-demand type display through a transflective mirror reflector, so that the lights are viewable through the reflective element when the lights are activated, but are substantially not viewable or discernible through the reflective element when the lights are deactivated.
  • The machine vision or detector/tracking system guides the movement of the apparent position and visibility of the added artificial pattern or light to the driver. The vision system senses the presence and/or position and/or velocity of a possible collision-danger object near the vehicle. The system could also sense dangerous movements of the host vehicle, such as crossing lane markers. The vision system directs the light/pattern for the appropriate representation of meaningful information to the vehicle driver. In lower cost systems, this direction is guided by rules of thumb or other assumptions, while more advanced systems actually measure where the driver's eyes are so as to place the patterns with high accuracy in the driver's field of view. The system may account for human reaction times when signaling light/pattern movements so that the driver can react to dangers appropriately.
  • The driver, by viewing the mirror from one head position, would see that the apparent location of the additional artificial light/pattern will “track” with the apparent location of the other real object, or vehicle, in the mirror. This is because the artificial light pattern will appear to move in synchrony with the mirror's reflection of the other object/vehicle's apparent movement in the mirror scene, as viewed by the driver. After the system determines that the danger has passed, the artificial light/pattern can become invisible or not discernible to the driver.
  • Optionally, the vision system may be overridden by the driver, such as in conditions where the environment would normally trigger inappropriate light/pattern movements. Optionally, the light/pattern can appear to the driver to track the objects ideally, or it could partially track the objects in a weighted fashion. Partial tracking could protect against extreme apparent motions of the pattern, and also encourage some ergonomically recommended driver head movements, preventing inattention.
  • The vision system thus would provide enhanced viewing and recognition of detected objects at the inside and/or outside mirrors, such as for multiple blind spots, for more objects than just vehicles, for areas themselves, for tracking pattern movements, for binary pattern movements, for system tuning, and for ergonomic features. The binary pattern movements could move between two definite locations, each of which signals information to the driver, such as common versus potentially dangerous situations. The mirrors would be “assisting” the driver, in that they show to the driver scenes that the total system “believes” that the driver “should” see, before he or she acts unwisely.
  • Optionally, for example, a vehicle vision system may comprise an image sensor having a forward and/or rearward and/or sideward field of view and capturing image data of a scene forward and/or rearward and/or to the sides of the vehicle, an image processor processing the image data and detecting objects of interest, and a display displaying information to the driver of the vehicle using the interior rear view mirror and/or the windshield as a display area, without compromising the driver's field of view of the scene and keeping the driver's attention generally focused forward of the vehicle, above the dashboard and below the display. The vehicle vision system may be operable to highlight a portion of the display at or near the image of a detected object of interest and may track the image of the detected object of interest as the image moves across the display. The display may be disposed at a mirror reflective element of the vehicle, and the vehicle vision system may highlight a portion of the display at or near the reflected image of a detected object of interest and may track the reflection of the detected object of interest as the reflected image moves across the mirror reflective element.
  • Optionally, aspects of such a vision system may be implemented into navigational displays using camera videos and graphical overlays. However, the use of the mirror itself (with the lights being at or behind the reflective element) provides all the dynamic range of a mirror, and all the resolution capability of a mirror. These ranges and resolutions are of optical quality, which may be orders of magnitude better than conventional navigational displays (such as CRT, LCD, and/or the like). In addition, the vision system encourages the use of present technology (rearview mirrors), which has been ingrained into generations of drivers.
  • The light/patterns can be at the border of a mirror, just slightly displaced from the apparent location of the “dangerous” object/vehicle. The light/patterns can also be presented to the driver at locations in the viewed mirror scene which are known to be background low-risk areas. These areas include the road surface just in front of the dangerous object/vehicle, and the sky area immediately above the object/vehicle. The added artificial information, if projected, can be presented in such a way that its optical path gives a similar optical path distance to the eye, so that the overlay information appears to lie close to the same depth plane as the actual object/vehicle. The added artificial information can also be related to the actual object/vehicle so that, for example, sounds and flashing lights similar to those of a real police car could be overlaid upon the apparent visual scene in the mirror when a vehicle approaches at very high closing velocities. The vision system may present information to the driver without requiring the driver to look, hear, or respond in any different way than he or she normally would. The vision system may present the extra information in a manner similar to the driver's vast personal experience. The vision system of the present invention thus may allow all the current richness of driver experience, and may present extra information in ways that minimize the cognitive, sensory, and motor load of the extra information on the driver's physical and mental processing capability.
  • Optionally, using the mirror positions as set by the driver allows a good estimation of the driver's eye positions. Knowing the eye positions and mirror positions, along with the camera positions of the machine vision system, together with trigonometric calculations, allows a good estimation of the position of the driver-viewed reflection of the candidate object/vehicle in a mirror. Knowing the location of the object's reflection and the eye location allows the appropriate position of the overlaid light pattern to be calculated (a minimal sketch of this reflection geometry follows this description).
  • Optionally, a lower cost or inexpensive system may present the appropriate light/pattern in the mirror boundary close to the apparent location of the object/vehicle, while more advanced systems may present the additional light/pattern much closer, or actually surrounding, the detected object/vehicle's apparent location in the driver's field of view. Optionally, an even more advanced system may use a sensor (camera, radar, lidar, ultrasonic system or the like) to measure the real-time location of the driver's eyes (or driver's gaze direction). Optionally, the vision system could use an external driver-owned sensor system, such as the compressed video output from the driver's cell phone camera pointed at the driver, when placed in a docking cradle in the car. When using real-time information about the driver's changing eye position, if the driver moves, the apparent location of added artificial information as seen in the mirror, can move in synchrony with the apparent location of the targeted object/vehicle in the mirror.
  • Optionally, the system may adjust the angle or tilt or setting of the mirror reflector (such as the mirror reflector of an exterior side mirror) in response to an object or vehicle detection, in order to enhance the driver's viewability of the object and awareness of the detected object. An angularly adjustable mirror mounted inside or outside a vehicle is normally positioned to display to the driver reflections of the vehicle's surrounding area. The mirror may be temporarily angularly adjusted by a motor to be out of its normal position and into a temporary position(s) which display to the driver an object, or vehicle, or area that is a potential collision danger, or source of such dangers, to the vehicle. These areas could include the vehicle's “blind spots”.
  • The motor or mirror actuator moves the mirror to potentially multiple positions and then back to the original position, responsive to signals from a detector/tracking system. The detector/tracking system could be a machine vision system or the like. The vision system senses the presence, and/or position, and/or velocity of a possible collision-danger object near the vehicle. The system could also sense dangerous movements of the host vehicle, such as crossing lane markers. The vision system signals the adjustable mirror, for the appropriate display of a candidate vehicle, or object, or dangerous areas, such as blind spots, to the vehicle driver. The system may use human reaction times to signal the mirror movements in sufficient time so that the driver can react to dangers appropriately.
  • The driver, by viewing the mirror from one head position, would be able to “track” the other object, or vehicle, because the mirror would move in such a way as to allow this “tracking” to occur. Also from one head position, the driver would be able to see dangerous areas, or not, as the mirror moves. After the system determines that the danger has passed, the mirror returns to its normal position. Optionally, the system may be overridden by the driver, such as in conditions where the environment would normally trigger inappropriate mirror movements. The system can track the objects ideally, or it could partially track the objects. Partial tracking could protect against extreme mirror motions, and also encourage some ergonomically recommended driver head movements, preventing inattention.
  • The vision system thus would provide enhanced viewing and recognition of detected objects at the inside and/or outside mirrors, for multiple blind spots, for more objects than just vehicles, for areas themselves, for tracking mirror movements, for binary mirror movements, for system tuning, and for ergonomic features. The mirrors would be adaptive or “intelligent”, or “assisting”, in that they show to the driver scenes that the total system “believes” that the driver “should” see, before acting unwisely. Optionally, an electrochromic (EC) mirror (that adapts to the light environment surrounding the vehicle), with angular adaptive capability, adapts to the environment of light, and objects, surrounding the vehicle, and may allow the driver to readily see the relevant environment, with minimal head movements, and/or minimal visual adaptation.
  • Optionally, the concepts of a vision system for enhanced viewing and recognition of detected objects as discussed above, with display at or near or around or overlaid on a rear-view mirror can also be applied to display systems for heads-up or larger displays at or on the windshield. These windshield displays can utilize forward facing imaging systems with machine vision of important or threat objects or objects of interest and may display icons or attention-getting patterns to the driver. Similar subsystems which monitor the driver's field of view can be utilized so that the windshield display enhances the driver's knowledge of the forward scene. Energy beam or projector-like systems as mentioned above with respect to the simulated visor system can be used to highlight relevant objects in the forward windshield scenes as viewed by the driver.
  • Optionally, the vision system may provide a panoramic display that combines or merges images from two or more cameras or image sensors (such as from a center, rearward viewing camera and two side, rearward viewing cameras) so that the driver of the vehicle may view a single display that shows the area rearward and to the sides of the host vehicle. Such a vision system may utilize aspects of the systems described in U.S. Pat. Nos. 5,670,935; 5,949,331; 6,222,447; and 6,611,202, which are hereby incorporated herein by reference in their entireties. With reference to FIGS. 4-14, the vision system may perform an image stitching function to merge or stitch images together along a desired stitch or merge line or lines to provide an enhanced, generally uniform merged image for displaying to the driver (a minimal sketch of such a seam merge follows this description). The image stitching area should be adaptively placed, such as near a lane marker line in a side camera view. The system may utilize adaptive merge planes or surfaces based on neighboring vehicles, and/or may use a merge surface intersecting the side object detection algorithm distances for the first vehicles in the left, center, and right lanes. Objects farther away and road surfaces closer to the host may appear doubled, missing or wrongly sized, but this may only minimally affect the driver, because drivers typically track the nearby vehicles and mostly ignore the road in front of, and objects behind, the closest vehicle in the host's lane and adjacent lanes. The stitching area may change depending upon visible vehicles and/or other detected objects.
  • The image stitching process may limit distortion of dominant objects, vehicles and lane markers. From the side cameras, the adjacent-lane car images do not undergo cross stitching; only closely following car sides may cross the default stitching area or lines. Each straight highway lane marker line may comprise an image taken from one of the cameras. The default stitching may be just outside the host lane marker lines for all cameras. In the illustrated embodiment, there are three lane-specific cameras (host, driver-adjacent, and passenger-adjacent), and the zipper stitching could follow the center vehicle and limit other-lane effects. The system may judge vehicle/lane priority and use that for setting the stitching and merge planes. For example, the system may merge surfaces to follow either the most dangerous vehicle (as determined based on rate of approach and/or location of the detected vehicle) or the closest vehicle. The stitching may follow the lane curves as much as possible, and may use special line fitting algorithms or calculations (such as a cubic spline line fitting function or the like) so that lane markers have a substantially continuous slope if they must cross the stitching area. For example, with gentle curves, the system may alter the stitching a small amount, while with more severe curves, the system may need to default to a center or curve-side camera dominance for stitching and merge surfaces. Optionally, the system may adapt the merge surfaces for the presence, or absence, of a vehicle in each of the three lanes.
  • Optionally, the fiducials used for calibration of the merging surfaces and merging borders in such a vision system may be indicators or LEDs placed along the host lane boundaries and back a distance from the host vehicle (such as for situations where there are no vehicles present behind the host vehicle) to enhance the depth perception of the displayed image. Optionally, the fiducials of such a vision system may be a series of vertical poles (such as poles that appear to be 5 meters high) along the host lane boundaries; in this way, large trucks may be rendered well, even up close in the host lane. There can be several fiducial sets of cones and LEDs. For example, there may be one set for no-curvature lanes and other sets for lane curvatures with radii of, say, 100, 30 and 10 meters. These sets of fiducials could be used to select among multiple calibrations for the merging surfaces when lanes of various radii of curvature have been measured, for image aligning and conversion from image space to 3D space.
  • The process of image joining can similarly include the front and side looking cameras, in a fashion similar to the rear-side system described above, to combine these images into one. The joining of images can produce a 360 degree image combining images from the front, rear and side facing cameras. The image stitching in such an application would follow the rules stated above. The stitching area itself, the pixels of the border, can be camouflaged by replacing it with non-important image areas (pixels) near the stitching area which have similar contrast and color.
  • The imaging device and control and image processor and illumination source may comprise any suitable components, and may utilize aspects of the vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454; and 6,824,281, which are all hereby incorporated herein by reference in their entireties. The imaging device and/or control may be part of or share components or circuitry with other image or imaging or vision systems of the vehicle, such as headlamp control systems and/or rain sensing systems and/or cabin monitoring systems and/or the like.
  • Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law including the doctrine of equivalents.
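The sketches below illustrate, in Python, several of the processing steps described in the foregoing paragraphs. They are minimal illustrations under stated assumptions, not the implementation of the described system, and every function name, field name, weight and threshold in them is hypothetical. This first sketch corresponds to the benchmark or “golden drive” comparison: a current trip is scored against a stored record for the same route, and a large deviation triggers tightened risk management and a notification.

```python
# Minimal sketch of comparing a current trip against a stored "golden drive"
# benchmark for the same route. Field names, weights, and the alert threshold
# are hypothetical illustrations, not values from this disclosure.

from dataclasses import dataclass


@dataclass
class DriveRecord:
    avg_speed_kph: float        # average speed over the route
    max_braking_g: float        # hardest braking event
    lane_offset_rms_m: float    # RMS deviation from lane center
    dimming_distance_m: float   # typical headlamp dimming distance


def deviation_score(current: DriveRecord, benchmark: DriveRecord,
                    weights=(1.0, 2.0, 3.0, 0.5)) -> float:
    """Weighted sum of relative deviations from the benchmark drive."""
    pairs = (
        (current.avg_speed_kph, benchmark.avg_speed_kph),
        (current.max_braking_g, benchmark.max_braking_g),
        (current.lane_offset_rms_m, benchmark.lane_offset_rms_m),
        (current.dimming_distance_m, benchmark.dimming_distance_m),
    )
    return sum(w * abs(c - b) / max(abs(b), 1e-6)
               for w, (c, b) in zip(weights, pairs))


golden = DriveRecord(52.0, 0.25, 0.15, 120.0)     # benchmark drive
teen_trip = DriveRecord(63.0, 0.45, 0.40, 60.0)   # current driver's trip

score = deviation_score(teen_trip, golden)
if score > 2.0:   # hypothetical alert threshold
    print(f"deviation score {score:.2f}: tighten risk management, notify guardian")
```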
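This sketch corresponds to the live/dead animal distinction described above: motion of the detected region between successive frames, or a warm thermal signature, suggests a live animal. The arrays stand in for registered image crops and a per-pixel temperature estimate, and both thresholds are invented for illustration.

```python
import numpy as np


def classify_animal_region(prev_frame: np.ndarray,
                           curr_frame: np.ndarray,
                           thermal_c: np.ndarray,
                           motion_thresh: float = 8.0,
                           warm_thresh_c: float = 25.0) -> str:
    """Label a detected animal region as 'live' or 'dead/static'.

    prev_frame, curr_frame: aligned grayscale crops of the detected region.
    thermal_c: per-pixel temperature estimate (deg C) for the same region.
    Thresholds are hypothetical tuning values.
    """
    motion = float(np.mean(np.abs(curr_frame.astype(float) -
                                  prev_frame.astype(float))))
    warm = float(np.mean(thermal_c)) > warm_thresh_c
    return "live" if (motion > motion_thresh or warm) else "dead/static"


# Toy usage with synthetic data: no frame-to-frame motion and a cold region.
rng = np.random.default_rng(0)
prev = rng.integers(0, 255, (32, 32)).astype(np.uint8)
curr = prev.copy()                  # no motion
thermal = np.full((32, 32), 12.0)   # cold region
print(classify_animal_region(prev, curr, thermal))   # -> dead/static
```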
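This sketch corresponds to the tire-path check discussed with reference to FIG. 3: the lateral position of a detected road object, as estimated from its position in the captured images, is compared against the predicted tracks of the front tires under a straight-path approximation. The track width, tire width and safety margin are hypothetical values.

```python
# Sketch: decide whether a detected road object lies in the predicted track of
# either front tire. A straight-path approximation is used; the track width,
# tire width, and safety margin are hypothetical values.

def tire_impact_predicted(object_lateral_m: float,
                          object_width_m: float,
                          track_width_m: float = 1.6,
                          tire_width_m: float = 0.25,
                          margin_m: float = 0.10) -> bool:
    """Return True if either tire track overlaps the object's lateral extent.

    object_lateral_m: lateral offset of the object center from the vehicle
                      centerline (positive = driver side).
    """
    half_obj = object_width_m / 2.0 + margin_m
    half_tire = tire_width_m / 2.0
    for tire_center in (+track_width_m / 2.0, -track_width_m / 2.0):
        # overlap test of two 1-D intervals on the lateral axis
        if abs(object_lateral_m - tire_center) < (half_obj + half_tire):
            return True
    return False


# Road-kill 0.4 m wide, centered 0.7 m toward the driver side of the centerline:
print(tire_impact_predicted(0.7, 0.4))   # True -> evasive action may be desired
```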
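This sketch corresponds to the driver risk preference and the region-dependent animal rules discussed above: a driver-selected risk tolerance is combined with a table of regionally protected animals to choose between an evasive maneuver and braking in lane. The animal lists, region codes, risk numbers and the fixed tolerance adjustment are illustrative only.

```python
# Sketch combining a driver-selected risk tolerance with region-specific
# protected-animal rules to decide between braking in-lane and an evasive
# maneuver. Animal lists, region codes, and risk numbers are hypothetical.

PROTECTED_BY_REGION = {
    "IN": {"cattle"},                # e.g. India: protect cattle
    "AU": {"koala", "kangaroo"},     # e.g. Australia
    "US": {"dog", "cat"},            # e.g. common pets
}


def choose_maneuver(animal: str, region: str,
                    maneuver_risk: float, driver_risk_tolerance: float) -> str:
    """Pick an action for an imminent animal encounter.

    maneuver_risk: estimated accident risk (0..1) of the evasive maneuver.
    driver_risk_tolerance: risk (0..1) the driver accepts to spare an animal.
    """
    protected = animal in PROTECTED_BY_REGION.get(region, set())
    large = animal in {"deer", "cattle", "kangaroo", "moose"}
    # Large or regionally protected animals justify more evasive effort.
    tolerance = driver_risk_tolerance + (0.2 if (protected or large) else 0.0)
    if maneuver_risk <= tolerance:
        return "evasive steering + braking"
    return "brake in lane / mitigate impact"


print(choose_maneuver("cat", "US", maneuver_risk=0.15,
                      driver_risk_tolerance=0.10))   # evasive steering + braking
print(choose_maneuver("squirrel", "US", maneuver_risk=0.15,
                      driver_risk_tolerance=0.05))   # brake in lane / mitigate impact
```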
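This sketch corresponds to the look-up-table (LUT) mapping of input gray levels to output gray levels discussed above for the display device. A simple gamma/contrast LUT stands in for the selectable display tunings; the parameter values are invented for the example.

```python
import numpy as np


def build_lut(gamma: float = 1.0, boost: float = 1.0) -> np.ndarray:
    """Build a 256-entry gray-level look-up table.

    gamma < 1 lifts dark detail; boost > 1 stretches contrast about mid-gray.
    Both parameters are illustrative stand-ins for selectable display tunings.
    """
    x = np.arange(256) / 255.0
    y = np.clip(((x ** gamma) - 0.5) * boost + 0.5, 0.0, 1.0)
    return (y * 255.0).astype(np.uint8)


def apply_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map every input gray level of an 8-bit image through the LUT."""
    return lut[image]


# A dark rear-view frame (synthetic) pushed through a "night" LUT:
frame = np.random.default_rng(1).integers(0, 80, (480, 640), dtype=np.uint8)
night_lut = build_lut(gamma=0.6, boost=1.3)
enhanced = apply_lut(frame, night_lut)
print(frame.mean(), enhanced.mean())   # mean brightness increases
```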
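This sketch corresponds to the trigonometric estimation discussed above of where the reflection of a detected object appears in a mirror, given estimated eye and mirror positions: the object is mirrored across the mirror plane, and the line from the eye to that virtual image is intersected with the plane. The coordinates are hypothetical values in an assumed vehicle frame (x forward, y left, z up, in meters).

```python
import numpy as np


def reflect_point(p: np.ndarray, plane_point: np.ndarray,
                  plane_normal: np.ndarray) -> np.ndarray:
    """Mirror image of point p across a plane given by a point and a normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return p - 2.0 * np.dot(p - plane_point, n) * n


def mirror_glint_point(eye: np.ndarray, obj: np.ndarray,
                       mirror_point: np.ndarray,
                       mirror_normal: np.ndarray) -> np.ndarray:
    """Point on the mirror plane where the driver sees the object's reflection.

    Found by intersecting the line from the eye to the object's virtual image
    with the mirror plane; an overlay light placed here appears to the driver
    to coincide with the reflected object.
    """
    n = mirror_normal / np.linalg.norm(mirror_normal)
    virtual_obj = reflect_point(obj, mirror_point, n)
    d = virtual_obj - eye
    t = np.dot(mirror_point - eye, n) / np.dot(d, n)
    return eye + t * d


# Hypothetical geometry in vehicle coordinates (meters): x forward, y left, z up
eye = np.array([0.0, 0.4, 1.2])       # estimated driver eye position
obj = np.array([-8.0, -3.0, 0.8])     # vehicle approaching in adjacent lane
mirror = np.array([1.0, -1.0, 1.0])   # point on the exterior mirror surface
normal = np.array([-1.0, 0.3, 0.0])   # mirror facing rearward/inboard
print(mirror_glint_point(eye, obj, mirror, normal))
```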
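This sketch corresponds to the image stitching along an adaptively placed merge line discussed above: two pre-warped camera views of equal size are merged along a vertical seam, which in the adaptive scheme would be placed near a detected lane marker, with a short linear feather to hide the border. The warping of the views onto a common merge surface and the spline fitting that keeps lane markers at a continuous slope across the seam are omitted here.

```python
import numpy as np


def stitch_side_and_center(side_img: np.ndarray, center_img: np.ndarray,
                           stitch_col: int, feather_px: int = 16) -> np.ndarray:
    """Merge two equally sized, pre-warped camera views along a vertical seam.

    side_img fills columns left of the seam, center_img fills columns right of
    it, and a linear feather over 2*feather_px columns hides the border. In the
    adaptive scheme described above, stitch_col would be chosen near a detected
    lane marker line rather than fixed.
    """
    assert side_img.shape == center_img.shape
    h, w = side_img.shape[:2]
    alpha = np.zeros(w)                     # per-column weight of center_img
    alpha[stitch_col + feather_px:] = 1.0
    alpha[stitch_col - feather_px:stitch_col + feather_px] = \
        np.linspace(0.0, 1.0, 2 * feather_px)
    alpha = alpha.reshape(1, w, *([1] * (side_img.ndim - 2)))
    merged = (1.0 - alpha) * side_img.astype(float) + alpha * center_img.astype(float)
    return merged.astype(side_img.dtype)


# Toy example with synthetic 100x200 grayscale views and a seam near column 120
rng = np.random.default_rng(2)
side = rng.integers(0, 255, (100, 200), dtype=np.uint8)
center = rng.integers(0, 255, (100, 200), dtype=np.uint8)
panorama = stitch_side_and_center(side, center, stitch_col=120)
print(panorama.shape)   # (100, 200)
```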

Claims (13)

1. A vehicle vision system for a vehicle, said vehicle vision system comprising:
an image sensor having a forward field of view for capturing image data of a road surface forward of the vehicle; and
an image processor processing said image data, said vehicle vision system determining at least an estimate of a traction condition of at least a portion of the imaged road surface.
2. The vehicle vision system of claim 1, wherein said vehicle vision system estimates a targeted separation gap between the host vehicle and a leading vehicle.
3. The vehicle vision system of claim 2, wherein said targeted separation gap is adjusted based on a current driving condition.
4. The vehicle vision system of claim 2, wherein said vehicle vision system adjusts said targeted separation gap based on the driving capabilities of the driver of the host vehicle.
5. A vehicle vision system for a vehicle, said vehicle vision system comprising:
an image sensor having a field of view and capturing image data of a scene exterior of the vehicle;
a monitor monitoring power consumption of the vehicle;
at least one lighting system that draws electrical power from the vehicle when operated;
an image processor processing said image data; and
wherein the electrical power drawn by said at least one lighting system is varied at least in part responsive to processing of said image data by said image processor in order to adjust fuel consumption by the vehicle.
6. The vehicle vision system of claim 5, wherein the electrical power drawn by said at least one lighting system is reduced at least in part responsive to processing of said image data by said image processor in order to reduce fuel consumption by the vehicle.
7. The vehicle vision system of claim 5, wherein said vehicle vision system reduces the light generated by said vehicle lighting system during driving conditions when less vehicle lighting is desired while directing light at areas where it is determined that light is desired.
8. The vehicle vision system of claim 5, wherein said image sensor has a forward field of view and captures image data of a scene forward of the vehicle and in the direction of forward travel of the vehicle.
9. A vehicle vision system for a vehicle, said vehicle vision system comprising:
an image sensor and image processor, said image sensor having a field of view exterior of the vehicle for capturing image data of a scene forward of the vehicle, said image processor processing said image data;
wherein said vehicle vision system distinguishes the presence of a live animal from a dead animal imaged within said field of view; and
wherein said system at least one of (a) generates an alert and (b) controls the vehicle to assist in avoiding a collision.
10. The vehicle vision system of claim 9, wherein said system is adaptable to the driver's assumption of risk when operating to avoid a collision with the animal or to continue on the vehicle's path of travel.
11. The vehicle vision system of claim 9, wherein said system is adaptable to react differently depending on the type of animal that is detected and identified.
12. The vehicle vision system of claim 9, wherein said system is adaptable to react differently depending on whether the detected animal is distinguished as a live animal or a dead animal.
13. The vehicle vision system of claim 9, comprising:
at least two image sensors having at least one of (a) a forward field of view and capturing image data of a scene forward of the vehicle, (b) a rearward field of view and capturing image data of a scene rearward of the vehicle and (c) a sideward field of view and capturing image data of a scene to the side of the vehicle; and
a display displaying the captured images as a merged image with image stitching of the component images to minimize artifacts of image stitching.
US12/508,840 2008-07-24 2009-07-24 Vehicle Imaging System Abandoned US20100020170A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US12/508,840 US20100020170A1 (en) 2008-07-24 2009-07-24 Vehicle Imaging System
US13/866,376 US9509957B2 (en) 2008-07-24 2013-04-19 Vehicle imaging system
US15/361,748 US11091105B2 (en) 2008-07-24 2016-11-28 Vehicle vision system
US17/445,100 US20210370855A1 (en) 2008-07-24 2021-08-16 Vehicular control system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US8322208P 2008-07-24 2008-07-24
US12/508,840 US20100020170A1 (en) 2008-07-24 2009-07-24 Vehicle Imaging System

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/866,376 Division US9509957B2 (en) 2008-07-24 2013-04-19 Vehicle imaging system

Publications (1)

Publication Number Publication Date
US20100020170A1 true US20100020170A1 (en) 2010-01-28

Family

ID=41568272

Family Applications (4)

Application Number Title Priority Date Filing Date
US12/508,840 Abandoned US20100020170A1 (en) 2008-07-24 2009-07-24 Vehicle Imaging System
US13/866,376 Active 2031-08-14 US9509957B2 (en) 2008-07-24 2013-04-19 Vehicle imaging system
US15/361,748 Active 2031-03-05 US11091105B2 (en) 2008-07-24 2016-11-28 Vehicle vision system
US17/445,100 Pending US20210370855A1 (en) 2008-07-24 2021-08-16 Vehicular control system

Family Applications After (3)

Application Number Title Priority Date Filing Date
US13/866,376 Active 2031-08-14 US9509957B2 (en) 2008-07-24 2013-04-19 Vehicle imaging system
US15/361,748 Active 2031-03-05 US11091105B2 (en) 2008-07-24 2016-11-28 Vehicle vision system
US17/445,100 Pending US20210370855A1 (en) 2008-07-24 2021-08-16 Vehicular control system

Country Status (1)

Country Link
US (4) US20100020170A1 (en)

Cited By (159)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090118909A1 (en) * 2007-10-31 2009-05-07 Valeo Vision Process for detecting a phenomenon limiting the visibility for a motor vehicle
US20090315723A1 (en) * 2008-06-24 2009-12-24 Visiocorp Patents S.A.R.L Optical system and method for detecting optical system obscuration in a vehicle
US20100060738A1 (en) * 2008-09-08 2010-03-11 Toyota Jidosha Kabushiki Kaisha Road surface division mark recognition apparatus, and lane departure prevention apparatus
US20110163866A1 (en) * 2011-03-14 2011-07-07 Ford Global Technologies, Llc Sun Protection System for Automotive Vehicle
US20110260886A1 (en) * 2010-04-21 2011-10-27 Denso Corporation Driver assistance device and method of controlling the same
US20120229028A1 (en) * 2009-10-24 2012-09-13 Daimler Ag Device for Controlling a Low Beam of a Vehicle
EP2544162A1 (en) * 2010-03-01 2013-01-09 Honda Motor Co., Ltd. Surrounding area monitoring device for vehicle
US8376595B2 (en) 2009-05-15 2013-02-19 Magna Electronics, Inc. Automatic headlamp control
US20130096773A1 (en) * 2010-04-07 2013-04-18 Tomoyuki Doi Vehicle driving-support apparatus
DE112011103834T5 (en) 2010-11-19 2013-09-05 Magna Electronics, Inc. Lane departure warning and lane centering
US20130229525A1 (en) * 2010-11-16 2013-09-05 Honda Motor Co., Ltd. A device for monitoring surroundings of a vehicle
US20130253815A1 (en) * 2012-03-23 2013-09-26 Institut Francais Des Sciences Et Technologies Des Transports, De L'amenagement System of determining information about a path or a road vehicle
WO2013187829A1 (en) * 2012-06-11 2013-12-19 Scania Cv Ab Warning system
WO2013187828A1 (en) * 2012-06-11 2013-12-19 Scania Cv Ab Warning system
US20130343071A1 (en) * 2012-06-26 2013-12-26 Honda Motor Co., Ltd. Light distribution controller
US8694224B2 (en) 2012-03-01 2014-04-08 Magna Electronics Inc. Vehicle yaw rate correction
US20140184789A1 (en) * 2013-01-02 2014-07-03 The Boeing Company Automated Water Drop Measurement and Ice Detection System
US20140211007A1 (en) * 2013-01-28 2014-07-31 Fujitsu Ten Limited Object detector
US20140277990A1 (en) * 2011-08-03 2014-09-18 Continental Teves Ag & Co. Ohg Method and system for adaptively controlling distance and speed and for stopping a motor vehicle, and a motor vehicle which works with same
US8874267B1 (en) * 2012-06-20 2014-10-28 Google Inc. Avoiding blind spots of other vehicles
US20150042799A1 (en) * 2013-08-07 2015-02-12 GM Global Technology Operations LLC Object highlighting and sensing in vehicle image display systems
US20150066345A1 (en) * 2013-08-28 2015-03-05 Elwha Llc Vehicle collision management system responsive to user-selected preferences
US20150085121A1 (en) * 2011-10-31 2015-03-26 Rosco, Inc. Mirror monitor using two levels of reflectivity
US9000950B2 (en) 2012-11-13 2015-04-07 International Business Machines Corporation Managing vehicle detection
US20150109444A1 (en) * 2013-10-22 2015-04-23 GM Global Technology Operations LLC Vision-based object sensing and highlighting in vehicle image display systems
US20150131086A1 (en) * 2012-05-18 2015-05-14 Denso Corporation Traveling environment detection device
US20150138361A1 (en) * 2013-11-15 2015-05-21 Denso Corporation Lane departure warning apparatus
US20150156383A1 (en) * 2013-12-04 2015-06-04 Magna Electronics Inc. Vehicle vision system with camera having liquid lens optic
US20150165975A1 (en) * 2013-12-16 2015-06-18 Honda Motor Co., Ltd. Fail-safe mirror for side camera failure
US9092986B2 (en) 2013-02-04 2015-07-28 Magna Electronics Inc. Vehicular vision system
US9090234B2 (en) 2012-11-19 2015-07-28 Magna Electronics Inc. Braking control system for vehicle
DE102015202846A1 (en) 2014-02-19 2015-08-20 Magna Electronics, Inc. Vehicle vision system with display
US9126525B2 (en) 2009-02-27 2015-09-08 Magna Electronics Inc. Alert system for vehicle
US9128354B2 (en) 2012-11-29 2015-09-08 Bendix Commercial Vehicle Systems Llc Driver view adapter for forward looking camera
US20150251600A1 (en) * 2014-03-06 2015-09-10 Panasonic Intellectual Property Management Co., Ltd. Display control device, display device, display control method, and non-transitory storage medium
US9174574B2 (en) 2011-10-19 2015-11-03 Magna Electronics Inc. Vehicle vision system for controlling a vehicle safety feature responsive to an imaging sensor having an exterior rearward field of view and before a collision occurs
US9177427B1 (en) 2011-08-24 2015-11-03 Allstate Insurance Company Vehicle driver feedback device
US9194943B2 (en) 2011-04-12 2015-11-24 Magna Electronics Inc. Step filter for estimating distance in a time-of-flight ranging system
US20150343947A1 (en) * 2014-05-30 2015-12-03 State Farm Mutual Automobile Insurance Company Systems and Methods for Determining a Vehicle is at an Elevated Risk for an Animal Collision
US9260095B2 (en) 2013-06-19 2016-02-16 Magna Electronics Inc. Vehicle vision system with collision mitigation
US9327693B2 (en) 2013-04-10 2016-05-03 Magna Electronics Inc. Rear collision avoidance system for vehicle
US9340227B2 (en) 2012-08-14 2016-05-17 Magna Electronics Inc. Vehicle lane keep assist system
US9376066B2 (en) 2011-04-18 2016-06-28 Magna Electronics, Inc. Vehicular camera with variable focus capability
US20160257307A1 (en) * 2014-04-30 2016-09-08 Toyota Motor Engineering & Manufacturing North America, Inc. Detailed map format for autonomous driving
US20160280133A1 (en) * 2015-03-23 2016-09-29 Magna Electronics Inc. Vehicle vision system with thermal sensor
US9481301B2 (en) 2012-12-05 2016-11-01 Magna Electronics Inc. Vehicle vision system utilizing camera synchronization
US20160332560A1 (en) * 2015-05-14 2016-11-17 Stanley Electric Co., Ltd. Headlight controller and vehicle headlight system
US9499139B2 (en) 2013-12-05 2016-11-22 Magna Electronics Inc. Vehicle monitoring system
US9509957B2 (en) 2008-07-24 2016-11-29 Magna Electronics Inc. Vehicle imaging system
US9547795B2 (en) 2011-04-25 2017-01-17 Magna Electronics Inc. Image processing method for detecting objects using relative motion
US20170069212A1 (en) * 2014-05-21 2017-03-09 Yazaki Corporation Safety Confirmation Assist Device
US9623878B2 (en) 2014-04-02 2017-04-18 Magna Electronics Inc. Personalized driver assistance system for vehicle
EP3156298A1 (en) * 2015-10-13 2017-04-19 Volvo Car Corporation Driving aid arrangement, a vehicle and a method of controlling a longitudinal velocity of a vehice
US20170113613A1 (en) * 2015-10-27 2017-04-27 Magna Electronics Inc. Vehicle vision system with enhanced night vision
US9650007B1 (en) 2015-04-13 2017-05-16 Allstate Insurance Company Automatic crash detection
US20170154241A1 (en) * 2015-12-01 2017-06-01 Mobileye Vision Technologies Ltd. Detecting visual information corresponding to an animal
US9674664B1 (en) * 2016-04-28 2017-06-06 T-Mobile Usa, Inc. Mobile device in-motion proximity guidance system
US20170158132A1 (en) * 2014-07-22 2017-06-08 Denso Corporation Vehicular display control apparatus
US9681062B2 (en) 2011-09-26 2017-06-13 Magna Electronics Inc. Vehicle camera image quality improvement in poor visibility conditions by contrast amplification
US20170168495A1 (en) * 2015-12-10 2017-06-15 Uber Technologies, Inc. Active light sensors for determining expected traction value of a road segment
US9688200B2 (en) 2013-03-04 2017-06-27 Magna Electronics Inc. Calibration system and method for multi-camera vision system
CN107054364A (en) * 2016-01-11 2017-08-18 Trw汽车股份有限公司 Irregular control system and method for determining road surface
US9743002B2 (en) 2012-11-19 2017-08-22 Magna Electronics Inc. Vehicle vision system with enhanced display functions
US20170242244A1 (en) * 2016-02-24 2017-08-24 L-3 Communications Corporation Transparent display with eye protection
US9761142B2 (en) 2012-09-04 2017-09-12 Magna Electronics Inc. Driver assistant system using influence mapping for conflict avoidance path determination
CN107531217A (en) * 2015-05-12 2018-01-02 深圳市大疆创新科技有限公司 Identification or the apparatus and method of detection barrier
WO2018004421A1 (en) * 2016-06-28 2018-01-04 Scania Cv Ab Method and control unit for a digital rear view mirror
DE102016112483A1 (en) * 2016-07-07 2018-01-11 Connaught Electronics Ltd. Method for reducing interference signals in a top view image showing a motor vehicle and a surrounding area of the motor vehicle, driver assistance system and motor vehicle
US9900490B2 (en) 2011-09-21 2018-02-20 Magna Electronics Inc. Vehicle vision system using image data transmission and power supply via a coaxial cable
US9921585B2 (en) 2014-04-30 2018-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Detailed map format for autonomous driving
US9925980B2 (en) 2014-09-17 2018-03-27 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US20180086346A1 (en) * 2015-04-03 2018-03-29 Denso Corporation Information presentation apparatus
FR3056806A1 (en) * 2016-09-28 2018-03-30 Valeo Schalter Und Sensoren Gmbh SYSTEM FOR MONITORING FREE SPACE AROUND A MOTOR VEHICLE
WO2018081610A1 (en) * 2016-10-27 2018-05-03 Nrg Systems, Inc. System and methods for detecting bats and avian carcasses
US20180137698A1 (en) * 2015-04-24 2018-05-17 Pai-R Co., Ltd. Drive recorder
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control
DE102016224905A1 (en) * 2016-12-14 2018-06-14 Conti Temic Microelectronic Gmbh Apparatus and method for fusing image data from a multi-camera system for a motor vehicle
DE102016225073A1 (en) * 2016-12-15 2018-06-21 Conti Temic Microelectronic Gmbh DEVICE FOR PROVIDING AN IMPROVED OBSTACLE IDENTIFICATION
US10027930B2 (en) 2013-03-29 2018-07-17 Magna Electronics Inc. Spectral filtering for vehicular driver assistance systems
US10025994B2 (en) 2012-12-04 2018-07-17 Magna Electronics Inc. Vehicle vision system utilizing corner detection
US20180204538A1 (en) * 2017-01-13 2018-07-19 Continental Automotive Systems, Inc. External light dimming system and method
US10055651B2 (en) 2016-03-08 2018-08-21 Magna Electronics Inc. Vehicle vision system with enhanced lane tracking
US10083551B1 (en) 2015-04-13 2018-09-25 Allstate Insurance Company Automatic crash detection
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US20180304813A1 (en) * 2017-04-20 2018-10-25 Subaru Corporation Image display device
US10116873B1 (en) * 2015-11-09 2018-10-30 Ambarella, Inc. System and method to adjust the field of view displayed on an electronic mirror using real-time, physical cues from the driver in a vehicle
US20180320876A1 (en) * 2017-05-03 2018-11-08 Fluence Bioengineering Systems and methods for coupling a metal core pcb to a heat sink
US10127463B2 (en) 2014-11-21 2018-11-13 Magna Electronics Inc. Vehicle vision system with multiple cameras
US10129221B1 (en) 2016-07-05 2018-11-13 Uber Technologies, Inc. Transport facilitation system implementing dual content encryption
US10144419B2 (en) 2015-11-23 2018-12-04 Magna Electronics Inc. Vehicle dynamic control system for emergency handling
US10192445B1 (en) 2014-05-30 2019-01-29 State Farm Mutual Automobile Insurance Company Systems and methods for determining a vehicle is at an elevated risk for an animal collision
US10204528B2 (en) 2015-08-05 2019-02-12 Uber Technologies, Inc. Augmenting transport services using driver profiling
CN109383241A (en) * 2017-08-11 2019-02-26 通用汽车环球科技运作有限责任公司 For sun-proof system and method
US10222224B2 (en) 2013-06-24 2019-03-05 Magna Electronics Inc. System for locating a parking space based on a previously parked space
US10232797B2 (en) 2013-04-29 2019-03-19 Magna Electronics Inc. Rear vision system for vehicle with dual purpose signal lines
US10262211B2 (en) * 2016-09-28 2019-04-16 Wipro Limited Windshield and a method for mitigating glare from a windshield of an automobile
US10286855B2 (en) 2015-03-23 2019-05-14 Magna Electronics Inc. Vehicle vision system with video compression
US10326969B2 (en) 2013-08-12 2019-06-18 Magna Electronics Inc. Vehicle vision system with reduction of temporal noise in images
US20190184985A1 (en) * 2017-12-18 2019-06-20 Stephen Tokish Active rear sense area adjustment of collision avoidance system of a vehicle when vehicle is approaching a positive road grade change
US10329827B2 (en) 2015-05-11 2019-06-25 Uber Technologies, Inc. Detecting objects within a vehicle in connection with a service
CN109952231A (en) * 2016-12-30 2019-06-28 金泰克斯公司 With the on-demand full display mirror for scouting view
DE102018201316A1 (en) * 2018-01-29 2019-08-01 Conti Temic Microelectronic Gmbh Surroundview system for one vehicle
US20190235056A1 (en) * 2018-01-31 2019-08-01 Honeywell International Inc. Dimmable glass for eye safety for lidar technology
US10371542B2 (en) 2017-02-17 2019-08-06 Uber Technologies, Inc. System and methods for performing multivariate optimizations based on location data
JP2019132847A (en) * 2019-02-28 2019-08-08 株式会社ニコン Imaging device
US10402771B1 (en) * 2017-03-27 2019-09-03 Uber Technologies, Inc. System and method for evaluating drivers using sensor data from mobile computing devices
US10417914B1 (en) 2014-05-30 2019-09-17 State Farm Mutual Automobile Insurance Company Systems and methods for determining a vehicle is at an elevated risk for an animal collision
US20190291642A1 (en) * 2016-07-11 2019-09-26 Lg Electronics Inc. Driver assistance apparatus and vehicle having the same
US20190306396A1 (en) * 2018-03-29 2019-10-03 Varroc Lighting Systems, s.r.o. Communication Device of a Motor Vehicle, a Motor Vehicle Lighting Device for the Communication Device of a Motor Vehicle and a Car2Car or Car2X Communication Method for a Motor Vehicle
US10444346B2 (en) * 2014-07-25 2019-10-15 Robert Bosch Gmbh Method for migrating radar sensor limitations with video camera input for active braking for pedestrians
US10445950B1 (en) 2017-03-27 2019-10-15 Uber Technologies, Inc. Vehicle monitoring system
US10449902B1 (en) 2013-09-24 2019-10-22 Rosco, Inc. Mirror monitor using two levels of reflectivity and transmissibility
US10459087B2 (en) 2016-04-26 2019-10-29 Uber Technologies, Inc. Road registration differential GPS
US10482684B2 (en) 2015-02-05 2019-11-19 Uber Technologies, Inc. Programmatically determining location information in connection with a transport service
US10489686B2 (en) 2016-05-06 2019-11-26 Uatc, Llc Object detection for an autonomous vehicle
US10504237B2 (en) * 2014-12-17 2019-12-10 Bayerische Motoren Werke Aktiengesellschaft Method for determining a viewing direction of a person
US10523904B2 (en) 2013-02-04 2019-12-31 Magna Electronics Inc. Vehicle data recording system
US10525883B2 (en) 2014-06-13 2020-01-07 Magna Electronics Inc. Vehicle vision system with panoramic view
US10567705B2 (en) 2013-06-10 2020-02-18 Magna Electronics Inc. Coaxial cable with bidirectional data transmission
WO2020047302A1 (en) * 2018-08-29 2020-03-05 Buffalo Automation Group Inc. Lane and object detection systems and methods
US10591311B2 (en) * 2014-09-18 2020-03-17 Bayerische Motoren Werke Aktiengesellschaft Method, device, system, and computer program product for displaying driving route section factors influencing a vehicle
US10609335B2 (en) 2012-03-23 2020-03-31 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
US10640040B2 (en) 2011-11-28 2020-05-05 Magna Electronics Inc. Vision system for vehicle
US10640046B1 (en) 2013-09-24 2020-05-05 Rosco, Inc. Convex rearview mirror and monitor with reversible back/socket mount
CN111103753A (en) * 2018-10-09 2020-05-05 先进光电科技股份有限公司 Panoramic image system and driving assistance system
US10672198B2 (en) 2016-06-14 2020-06-02 Uber Technologies, Inc. Trip termination determination for on-demand transport
US10678262B2 (en) 2016-07-01 2020-06-09 Uatc, Llc Autonomous vehicle localization using image analysis and manipulation
US10684361B2 (en) 2015-12-16 2020-06-16 Uatc, Llc Predictive sensor array configuration system for an autonomous vehicle
US10712742B2 (en) 2015-12-16 2020-07-14 Uatc, Llc Predictive sensor array configuration system for an autonomous vehicle
US10712160B2 (en) 2015-12-10 2020-07-14 Uatc, Llc Vehicle traction map for autonomous vehicles
US10717439B2 (en) * 2017-09-15 2020-07-21 Honda Motor Co., Ltd Traveling control system and vehicle control method
US10726280B2 (en) 2016-03-09 2020-07-28 Uatc, Llc Traffic signal analysis system
US10773725B1 (en) * 2017-08-25 2020-09-15 Apple Inc. Tire-road friction estimation and mapping
US10775504B2 (en) 2016-09-29 2020-09-15 Honeywell International Inc. Laser air data sensor mounting and operation for eye safety
US20200353922A1 (en) * 2019-05-07 2020-11-12 Hyundai Mobis Co., Ltd. Vehicle scc system based on complex information and method of controlling the same
US20210001850A1 (en) * 2018-03-01 2021-01-07 Jaguar Land Rover Limited Vehicle control method and apparatus
US10902525B2 (en) 2016-09-21 2021-01-26 Allstate Insurance Company Enhanced image capture and analysis of damaged tangible objects
EP3785994A1 (en) * 2019-08-29 2021-03-03 Ningbo Geely Automobile Research & Development Co. Ltd. A system and method for highlighting of an object to a vehicle occupant
US10946798B2 (en) 2013-06-21 2021-03-16 Magna Electronics Inc. Vehicle vision system
US20210157330A1 (en) * 2019-11-23 2021-05-27 Ha Q Tran Smart vehicle
EP3842307A1 (en) * 2019-12-27 2021-06-30 Volvo Car Corporation System and method for providing vehicle safety distance and speed alerts under slippery road conditions
WO2021141873A1 (en) * 2020-01-06 2021-07-15 Gentex Corporation Dynamic imaging system
WO2021144029A1 (en) * 2020-01-17 2021-07-22 Volvo Truck Corporation A cruise control system and a method for controlling a powertrain
US11124193B2 (en) 2018-05-03 2021-09-21 Volvo Car Corporation System and method for providing vehicle safety distance and speed alerts under slippery road conditions
US20210323473A1 (en) * 2020-04-17 2021-10-21 Magna Mirrors Of America, Inc. Interior rearview mirror assembly with driver monitoring system
US11170227B2 (en) 2014-04-08 2021-11-09 Bendix Commercial Vehicle Systems Llc Generating an image of the surroundings of an articulated vehicle
US20210382560A1 (en) * 2020-06-05 2021-12-09 Aptiv Technologies Limited Methods and System for Determining a Command of an Occupant of a Vehicle
US11205083B2 (en) 2019-04-02 2021-12-21 Magna Electronics Inc. Vehicular driver monitoring system
US11209658B2 (en) * 2017-07-06 2021-12-28 Boe Technology Group Co., Ltd. Virtual image display apparatus, vehicle comprising head-up display apparatus comprising virtual image display apparatus, and method for virtual image display
US11208053B2 (en) * 2019-01-07 2021-12-28 Ability Opto Electronics Technology Co., Ltd. Movable carrier auxiliary system
CN113924824A (en) * 2019-05-29 2022-01-11 法雷奥照明公司 Method for operating a vehicle lighting device and vehicle lighting device
US11361380B2 (en) 2016-09-21 2022-06-14 Allstate Insurance Company Enhanced image capture and analysis of damaged tangible objects
US11364915B2 (en) * 2017-12-07 2022-06-21 Nissan Motor Co., Ltd. Road condition determination method and road condition determination device
US20220309923A1 (en) * 2019-04-29 2022-09-29 Qualcomm Incorporated Method and apparatus for vehicle maneuver planning and messaging
US11465559B2 (en) * 2017-12-27 2022-10-11 Denso Corporation Display processing device and display control device
US11472338B2 (en) 2014-09-15 2022-10-18 Magna Electronics Inc. Method for displaying reduced distortion video images via a vehicular vision system
US11494517B2 (en) 2020-02-12 2022-11-08 Uber Technologies, Inc. Computer system and device for controlling use of secure media recordings
US11508156B2 (en) 2019-12-27 2022-11-22 Magna Electronics Inc. Vehicular vision system with enhanced range for pedestrian detection
US11560153B2 (en) 2019-03-07 2023-01-24 6 River Systems, Llc Systems and methods for collision avoidance by autonomous vehicles

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010020208A1 (en) * 2010-05-12 2011-11-17 Volkswagen Ag Method for parking or parking a vehicle and corresponding assistance system and vehicle
KR20150076532A (en) * 2013-12-27 2015-07-07 주식회사 만도 System for measuring distance and Method thereof
WO2016145175A1 (en) * 2015-03-12 2016-09-15 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Heads up display of vehicle stopping distance
FR3035523B1 (en) * 2015-04-23 2017-04-21 Parrot IMMERSION DRONE DRIVING SYSTEM
US10486599B2 (en) * 2015-07-17 2019-11-26 Magna Mirrors Of America, Inc. Rearview vision system for vehicle
US10324297B2 (en) 2015-11-30 2019-06-18 Magna Electronics Inc. Heads up display system for vehicle
US10401621B2 (en) 2016-04-19 2019-09-03 Magna Electronics Inc. Display unit for vehicle head-up display system
DE102016210632A1 (en) * 2016-06-15 2017-12-21 Bayerische Motoren Werke Aktiengesellschaft Method for checking a media loss of a motor vehicle and motor vehicle and system for carrying out such a method
US10275662B1 (en) * 2016-09-30 2019-04-30 Zoox, Inc. Estimating friction based on image data
EP3562708B1 (en) * 2016-12-27 2021-02-17 Gentex Corporation Rear vision system with eye-tracking
JP6624105B2 (en) * 2017-02-08 2019-12-25 トヨタ自動車株式会社 Image display device
US11142200B2 (en) 2017-02-23 2021-10-12 Magna Electronics Inc. Vehicular adaptive cruise control with enhanced vehicle control
JP6680254B2 (en) * 2017-03-22 2020-04-15 トヨタ自動車株式会社 Vehicle display control device
DE102017206923B4 (en) * 2017-04-25 2020-12-31 Robert Bosch Gmbh Controlling a directional light source
US11131857B2 (en) 2017-06-26 2021-09-28 Gentex Corporation Dynamic calibration of optical properties of a dimming element
US20190031084A1 (en) 2017-07-27 2019-01-31 International Business Machines Corporation Adaptive vehicle illumination utilizing visual pattern learning and cognitive enhancing
DE102017214505A1 (en) * 2017-08-21 2019-02-21 Bayerische Motoren Werke Aktiengesellschaft Method for operating an assistance system for a vehicle and assistance system
US10921142B2 (en) 2017-12-14 2021-02-16 Waymo Llc Methods and systems for sun-aware vehicle routing
US11206375B2 (en) 2018-03-28 2021-12-21 Gal Zuckerman Analyzing past events by utilizing imagery data captured by a plurality of on-road vehicles
US11138418B2 (en) 2018-08-06 2021-10-05 Gal Zuckerman Systems and methods for tracking persons by utilizing imagery data captured by on-road vehicles
US11574543B2 (en) 2020-03-23 2023-02-07 Toyota Motor North America, Inc. Transport dangerous location warning
US11718288B2 (en) 2020-03-23 2023-08-08 Toyota Motor North America, Inc. Consensus-based transport event severity
US11574468B2 (en) 2020-03-31 2023-02-07 Toyota Research Institute, Inc. Simulation-based learning of driver interactions through a vehicle window
JP2022155838A (en) * 2021-03-31 2022-10-14 本田技研工業株式会社 Vehicle control device, route generation device, vehicle control method, route generation method, and program
US20230177976A1 (en) * 2021-12-08 2023-06-08 J.J. Keller & Associates, Inc. Commercial Vehicle Back-Up Trainer and Method

Citations (93)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2632040A (en) * 1952-05-01 1953-03-17 Rabinow Jacob Automatic headlight dimmer
US2827594A (en) * 1954-09-02 1958-03-18 Rabinow Jacob Color discriminating headlight dimmer
US3665224A (en) * 1970-09-08 1972-05-23 Ruth Arthur P Ambient light level detector including transient suppression circuitry
US3708231A (en) * 1969-11-10 1973-01-02 G Walters Precision angle measuring device
US3807832A (en) * 1972-11-09 1974-04-30 American Cyanamid Co Electrochromic (ec) mirror which rapidly changes reflectivity
US3811046A (en) * 1973-01-22 1974-05-14 Le Van Electronics Inc Light sensitive security system
US3813540A (en) * 1973-03-26 1974-05-28 Ncr Circuit for measuring and evaluating optical radiation
US3862798A (en) * 1973-11-19 1975-01-28 Charles L Hopkins Automatic rear view mirror adjuster
US3947095A (en) * 1974-03-18 1976-03-30 Marie Saratore Rear view vision device
US4200361A (en) * 1977-01-25 1980-04-29 Fiat Societa Per Azioni Liquid crystal mirror for use as a rear-view mirror for vehicles
US4247870A (en) * 1977-05-30 1981-01-27 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Highways And Transportation Highway premarking guidance system
US4249160A (en) * 1975-11-21 1981-02-03 Chilvers Graham R Vehicle mounted light activated control system
US4266856A (en) * 1978-07-20 1981-05-12 Wainwright Basil E Rear view mirror
US4381888A (en) * 1980-06-11 1983-05-03 Canon Kabushiki Kaisha Retrofocus type large aperture wide angle objective
US4431896A (en) * 1979-04-26 1984-02-14 A.G. Fur Industrielle Elektronik Agie Method and apparatus for orienting the wire guidance heads on spark erosion cutting equipment for eroding with a great wire slope
US4443057A (en) * 1981-06-01 1984-04-17 Gentex Corporation Automatic rearview mirror for automotive vehicles
US4491390A (en) * 1982-05-06 1985-01-01 Tong Shen Hsieh Automatic liquid-crystal light shutter
US4512637A (en) * 1981-10-29 1985-04-23 Carl-Zeiss-Stiftung, Heidenheim/Brenz Method and means for stepwise charge control of electrochromic layers
US4571082A (en) * 1982-05-18 1986-02-18 Downs Michael J Apparatus and method for measuring refractive index
US4572619A (en) * 1983-01-27 1986-02-25 Daimler-Benz Aktiengesellschaft Electrically dimmable rear-view mirror for motor vehicles
US4580875A (en) * 1984-03-30 1986-04-08 Gentex Corporation Electronic control system for automatic rearview mirrors for automotive vehicles
US4647161A (en) * 1981-05-20 1987-03-03 Mueller Rolf Fish eye lens system
US4727290A (en) * 1987-05-29 1988-02-23 General Motors Corporation Automatic vehicle headlamp dimming control
US4731669A (en) * 1985-06-18 1988-03-15 Matsushita Electric Industrial Co., Ltd. Camera apparatus with movably supported lens barrel
US4741603A (en) * 1985-05-08 1988-05-03 Nissan Motor Co., Ltd. Electrochromic nonglaring mirror
US4817948A (en) * 1983-09-06 1989-04-04 Louise Simonelli Reduced-scale racing system
US4820933A (en) * 1986-12-31 1989-04-11 Samsung Electronics Co., Ltd. Control circuit for liquid crystal rear-vision mirror
US4825232A (en) * 1988-03-25 1989-04-25 Enserch Corporation Apparatus for mounting aerial survey camera under aircraft wings
US4891559A (en) * 1985-06-13 1990-01-02 Nippondenso Soken, Inc. Apparatus for controlling a headlight of a vehicle
US4892345A (en) * 1988-09-23 1990-01-09 Rachael Iii Stephen Armored vehicle
US4895790A (en) * 1987-09-21 1990-01-23 Massachusetts Institute Of Technology High-efficiency, multilevel, diffractive optical elements
US4896030A (en) * 1987-02-27 1990-01-23 Ichikoh Industries Limited Light-reflectivity controller for use with automotive rearview mirror using electrochromic element
US4910591A (en) * 1988-08-08 1990-03-20 Edward Petrossian Side and rear viewing apparatus for motor vehicles
US4916374A (en) * 1989-02-28 1990-04-10 Donnelly Corporation Continuously adaptive moisture sensor system for wiper control
US4917477A (en) * 1987-04-06 1990-04-17 Gentex Corporation Automatic rearview mirror system for automotive vehicles
US4987357A (en) * 1989-12-18 1991-01-22 General Motors Corporation Adaptive motor vehicle cruise control
US4991054A (en) * 1988-05-13 1991-02-05 Pacific Scientific Company Time-delay outdoor lighting control systems
US5001558A (en) * 1985-06-11 1991-03-19 General Motors Corporation Night vision system with color video camera
US5003288A (en) * 1988-10-25 1991-03-26 Nartron Corporation Ambient light sensing method and apparatus
US5012082A (en) * 1989-03-01 1991-04-30 Hamamatsu Photonics K.K. Two-dimensional incident position detector device for light or radiation
US5016977A (en) * 1989-02-06 1991-05-21 Essilor International-Compagnie Generale Optical lens for correcting astigmatism
US5086253A (en) * 1990-10-15 1992-02-04 Lawler Louis N Automatic headlight dimmer apparatus
US5096287A (en) * 1990-03-15 1992-03-17 Aisin Seiki K.K. Video camera for an automobile
US5182502A (en) * 1991-05-06 1993-01-26 Lectron Products, Inc. Automatic headlamp dimmer
US5184956A (en) * 1990-02-20 1993-02-09 Codes Rousseau Method and device for training in the driving of vehicles
US5193029A (en) * 1991-11-19 1993-03-09 Donnelly Corporation Single sensor adaptive drive circuit for rearview mirror system
US5204778A (en) * 1992-04-06 1993-04-20 Gentex Corporation Control system for automatic rearview mirrors
US5208701A (en) * 1991-12-24 1993-05-04 Xerox Corporation Wobble correction lens with binary diffractive optic surface and refractive cylindrical surface
US5276389A (en) * 1991-12-14 1994-01-04 Leopold Kostal Gmbh & Co. Kg Method of controlling a windshield wiper system
US5289321A (en) * 1993-02-12 1994-02-22 Secor James O Consolidated rear view camera and display system for motor vehicle
US5289182A (en) * 1991-10-16 1994-02-22 Ii Bc-Sys Electronic anti-collison device carried on board a vehicle
US5305012A (en) * 1992-04-15 1994-04-19 Reveo, Inc. Intelligent electro-optical system and method for automatic glare reduction
US5307136A (en) * 1991-10-22 1994-04-26 Fuji Jukogyo Kabushiki Kaisha Distance detection system for vehicles
US5313072A (en) * 1993-02-16 1994-05-17 Rockwell International Corporation Optical detector for windshield wiper control
US5386285A (en) * 1992-02-28 1995-01-31 Mitsubishi Denki Kabushiki Kaisha Obstacle detecting device for a vehicle
US5406395A (en) * 1993-11-01 1995-04-11 Hughes Aircraft Company Holographic parking assistance device
US5410346A (en) * 1992-03-23 1995-04-25 Fuji Jukogyo Kabushiki Kaisha System for monitoring condition outside vehicle using imaged picture by a plurality of television cameras
US5414257A (en) * 1991-04-23 1995-05-09 Introlab Pty Limited Moisture sensor for detecting moisture on a windshield
US5414461A (en) * 1991-11-15 1995-05-09 Nissan Motor Co., Ltd. Vehicle navigation apparatus providing simultaneous forward and rearward views
US5416318A (en) * 1991-10-03 1995-05-16 Hegyi; Dennis J. Combined headlamp and climate control sensor having a light diffuser and a light modulator
US5498866A (en) * 1993-06-01 1996-03-12 Leopold Kostal Gmbh & Co. Kg Optoelectronic sensor for detecting moisture on a windshield with means to compensate for a metallic layer in the windshield
US5510983A (en) * 1992-11-13 1996-04-23 Yazaki Corporation On-vehicle display
US5515448A (en) * 1992-07-28 1996-05-07 Yazaki Corporation Distance measuring apparatus of a target tracking type
US5614788A (en) * 1995-01-31 1997-03-25 Autosmart Light Switches, Inc. Automated ambient condition responsive daytime running light system
US5867591A (en) * 1995-04-21 1999-02-02 Matsushita Electric Industrial Co., Ltd. Method of matching stereo images and method of measuring disparity between these image
US5877897A (en) * 1993-02-26 1999-03-02 Donnelly Corporation Automatic rearview mirror, vehicle lighting control and vehicle interior monitoring system using a photosensor array
US5883739A (en) * 1993-10-04 1999-03-16 Honda Giken Kogyo Kabushiki Kaisha Information display device for vehicle
US5890021A (en) * 1996-12-05 1999-03-30 Canon Kabushiki Kaisha Distance detecting device, focus state detecting device and camera having same
US5892434A (en) * 1996-02-09 1999-04-06 David M. Carlson Automotive driving pattern monitor
US5896085A (en) * 1995-09-07 1999-04-20 Toyota Jidosha Kabushiki Kaisha Apparatus for controlling light distributions of head lamps
US6020704A (en) * 1997-12-02 2000-02-01 Valeo Electrical Systems, Inc. Windscreen sensing and wiper control system
US6049171A (en) * 1998-09-18 2000-04-11 Gentex Corporation Continuously variable headlamp control
US6066933A (en) * 1998-10-02 2000-05-23 Ponziana; Richard L. Rain sensing system and method having automatically registered and oriented rain sensor
US6172613B1 (en) * 1998-02-18 2001-01-09 Donnelly Corporation Rearview mirror assembly incorporating vehicle information display
US6201642B1 (en) * 1999-07-27 2001-03-13 Donnelly Corporation Vehicular vision system with a wide angle lens including a diffractive element
US6222447B1 (en) * 1993-02-26 2001-04-24 Donnelly Corporation Rearview vision system with indicia of backup travel
US20020015153A1 (en) * 2000-07-27 2002-02-07 Downs Michael John Jamin-type interferometers and components therefor
US6396397B1 (en) * 1993-02-26 2002-05-28 Donnelly Corporation Vehicle imaging system with stereo imaging
US6534884B2 (en) * 1998-12-16 2003-03-18 Donnelly Corporation Proximity sensing system for vehicles
US6553130B1 (en) * 1993-08-11 2003-04-22 Jerome H. Lemelson Motor vehicle warning and control system and method
US6559435B2 (en) * 1993-02-26 2003-05-06 Donnelly Corporation Vehicle headlight control using imaging sensor identifying objects by geometric configuration
US6672731B2 (en) * 2000-11-20 2004-01-06 Donnelly Corporation Vehicular rearview mirror with blind spot viewing system
US6717610B1 (en) * 1998-11-25 2004-04-06 Donnelly Corporation Wide angle image capture system for vehicle
US20040084569A1 (en) * 2002-11-04 2004-05-06 Bonutti Peter M. Active drag and thrust modulation system and method
US6891563B2 (en) * 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
US20060018512A1 (en) * 1997-04-02 2006-01-26 Stam Joseph S Vehicle automatic exterior light control
US20060018511A1 (en) * 2000-03-20 2006-01-26 Stam Joseph S System for controlling vehicle equipment
US20060091813A1 (en) * 1997-04-02 2006-05-04 Stam Joseph S Control system to automatically control vehicle headlamps
US20060293841A1 (en) * 2005-06-15 2006-12-28 Davor Hrovat Traction control system and method
US20070023613A1 (en) * 1993-02-26 2007-02-01 Donnelly Corporation Vehicle headlight control using imaging sensor
US20090016073A1 (en) * 2007-07-12 2009-01-15 Higgins-Luthman Michael J Automatic Lighting System with Adaptive Alignment Function
US20090045323A1 (en) * 2007-08-17 2009-02-19 Yuesheng Lu Automatic Headlamp Control System
US20090097038A1 (en) * 2007-10-16 2009-04-16 Higgins-Luthman Michael J Machine Vision for Predictive Suspension

Family Cites Families (247)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3141393A (en) 1961-03-14 1964-07-21 Photo File Surveys Inc Apparatus for obtaining roadway photographs
US3986022A (en) 1973-06-04 1976-10-12 Gilbert Peter Hyatt Illumination control system
US3680951A (en) 1970-04-01 1972-08-01 Baldwin Co D H Photoelectrically-controlled rear-view mirrow
US3689695A (en) 1970-04-10 1972-09-05 Harry C Rosenfield Vehicle viewing system
US3601614A (en) 1970-05-25 1971-08-24 Chrysler Corp Automatic anti-glare rearview mirror system
JPS50638Y1 (en) 1970-06-15 1975-01-09
US3612666A (en) 1970-07-13 1971-10-12 Libman Max L Electrically controlled rearview mirror employing self-contained power supply and motion-actuated power switch
US4672457A (en) 1970-12-28 1987-06-09 Hyatt Gilbert P Scanner system
US3746430A (en) 1971-09-15 1973-07-17 Baldwin Co D H Impulse-operated, day-night, rear view mirror
US4614415A (en) 1973-06-04 1986-09-30 Hyatt Gilbert P Illumination signal processing system
DE2460426A1 (en) 1974-12-20 1976-06-24 Daimler Benz Ag DEVICE FOR THE INDEPENDENT REGULATION OF THE LAMP RANGE OF VEHICLE HEADLIGHTS
US3962600A (en) 1975-02-14 1976-06-08 Arvin Hong Kong Ltd. Ambient light responsive illumination brightness control circuit
US4052712A (en) 1975-05-13 1977-10-04 Pacific Kogyo Kabushiki Kaisha Apparatus for photographing road ruts
US3985424A (en) 1975-06-18 1976-10-12 Lawrence Peska Associates, Inc. Panoramic rear viewing system
GB1535182A (en) 1976-03-31 1978-12-13 British Aircraft Corp Ltd Optical viewing apparatus
US4057911A (en) 1976-07-29 1977-11-15 Sack Thomas F Portable psycho-physical automobile driver testing device
US4093364A (en) 1977-02-04 1978-06-06 Miller Keith G Dual path photographic camera for use in motor vehicles
IT1117275B (en) 1977-02-25 1986-02-17 Remo Bedini AUTOMATIC METHOD AND DEVICE FOR THE ATTENTION OF THE DRIVING PHENOMENA IN REFLECTED LIGHT
US4111720A (en) 1977-03-31 1978-09-05 International Business Machines Corporation Method for forming a non-epitaxial bipolar integrated circuit
US4214266A (en) 1978-06-19 1980-07-22 Myers Charles H Rear viewing system for vehicles
GB2029343A (en) 1978-09-06 1980-03-19 Rickson C Mirrors with Control of Reflecting Power
EP0009414B1 (en) 1978-09-25 1984-04-25 Raymond James Noack Apparatus and method for controlling windscreen wiper and windscreen washer apparatus of a vehicle
US4277804A (en) 1978-11-01 1981-07-07 Elburn Robison System for viewing the area rearwardly of a vehicle
JPS5575264A (en) 1978-12-01 1980-06-06 Mitsubishi Electric Corp Charge transfer element
US4281898A (en) 1979-02-07 1981-08-04 Murakami Kaimeido Co., Ltd. Automatic antiglare rearview mirror
US4236099A (en) 1979-03-05 1980-11-25 Irving Rosenblum Automatic headlight system
FR2492748A2 (en) 1979-11-07 1982-04-30 Massoni Francois DEVICE FOR AUTOMATICALLY CONTROLLING IGNITION AND EXTINGUISHING LIGHTS IN A VEHICLE
US4288814A (en) 1980-02-04 1981-09-08 Talley & Sons, Inc. Closed circuit video guidance system for farming vehicles and method
GR78012B (en) 1980-09-19 1984-09-26 Waddingtons Ltd
US4388701A (en) 1980-09-30 1983-06-14 International Business Machines Corp. Recirculating loop memory array having a shift register buffer for parallel fetching and storing
JPS57173801U (en) 1981-04-27 1982-11-02
US5170374A (en) 1981-05-13 1992-12-08 Hitachi, Ltd. Semiconductor memory
JPS57208531A (en) 1981-06-19 1982-12-21 Ichikoh Ind Ltd Automatic dazzle-resistant mirror device
JPS57208530A (en) 1981-06-19 1982-12-21 Ichikoh Ind Ltd Antidazzling mirror device
FR2513198A1 (en) 1981-09-21 1983-03-25 Herrmann Henri Anti-dazzle rear view mirror - has liquid crystal material with variable transparency controlling intensity of reflected light using photodetector
DE3142909A1 (en) 1981-10-29 1983-05-11 Fa. Carl Zeiss, 7920 Heidenheim CONTINUOUS CHARGE CONTROL FOR ELECTROCHROME LAYERS
DE3142907A1 (en) 1981-10-29 1983-05-11 Fa. Carl Zeiss, 7920 Heidenheim OPTICAL CONTROL CIRCUIT FOR ELECTROCHROME LAYERS
US4460831A (en) 1981-11-30 1984-07-17 Thermo Electron Corporation Laser stimulated high current density photoelectron generator and method of manufacture
JPS58110334U (en) 1982-01-21 1983-07-27 相生精機株式会社 Workpiece changer for machine tools
JPS58173274A (en) 1982-04-02 1983-10-12 株式会社デンソー Control apparatus for vehicle
US4420238A (en) 1982-04-19 1983-12-13 Felix Larry L Apparatus for enabling concealing surveillance by use of a camera in a vehicle
ATA160682A (en) 1982-04-26 1984-10-15 Krippner & Kletzmaier Elektro DEVICE FOR DETERMINING THE DAZZLING EFFECT OF A LIGHT SOURCE
JPS58209635A (en) 1982-05-31 1983-12-06 Nissan Motor Co Ltd Rear view display device for car
US5845000A (en) 1992-05-05 1998-12-01 Automotive Technologies International, Inc. Optical identification and monitoring system using pattern recognition for use with vehicles
US6442465B2 (en) 1992-05-05 2002-08-27 Automotive Technologies International, Inc. Vehicular component control systems and methods
US4603946A (en) 1982-09-29 1986-08-05 Kabushiki Kaisha Tokai Rika Denki Seisakusho Reflection controllable view mirror device for motor vehicle or the like
LU84433A1 (en) 1982-10-22 1984-05-10 Mecan Arbed Dommeldange S A R DEVICE FOR PROVIDING CARBONATED AND SOLID MATERIALS TO A METAL BATH IN THE REFINING PROCESS
US4690508A (en) 1982-12-15 1987-09-01 C-D Marketing, Ltd. Liquid crystal closed-loop controlled mirror systems
JPS59115677A (en) 1982-12-22 1984-07-04 Hitachi Ltd Picture processor
DE3248511A1 (en) 1982-12-29 1984-07-05 Derek 2110 Buchholz Bissenden All-round viewing device for motor vehicles
JPS59114139U (en) 1983-01-19 1984-08-01 株式会社吉野工業所 Cup with gargle container
JPS59133336U (en) 1983-02-26 1984-09-06 日本電気ホームエレクトロニクス株式会社 vehicle rear vision device
DD228953A3 (en) 1983-03-29 1985-10-23 Zeiss Jena Veb Carl ARRANGEMENT FOR OPERATING BLADE GOGGLES
US4626850A (en) 1983-05-16 1986-12-02 David Chey Vehicle detection and collision avoidance apparatus
EP0146672B1 (en) 1983-11-14 1988-10-19 Nippondenso Co., Ltd. Drive apparatus for a liquid crystal dazzle free mirror arrangement
US4623222A (en) 1983-11-14 1986-11-18 Nippondenso Co., Ltd. Liquid crystal type dazzle-free transmissive-reflective mirror
JPS60117218A (en) 1983-11-29 1985-06-24 Nippon Denso Co Ltd Liquid crystal antidazzling type reflecting mirror
JPS60139545A (en) 1983-12-27 1985-07-24 Nippon Denso Co Ltd Driving device for dazzle-proof type reflection mirror of vehicle
US4692798A (en) 1984-01-09 1987-09-08 Nissan Motor Company, Limited Apparatus and process for improving visibility of object within visual field
JPS60148733A (en) 1984-01-12 1985-08-06 Nippon Denso Co Ltd Dazzle-proofing type reflection mirror driving device for vehicle
JPS60174342A (en) 1984-02-16 1985-09-07 Nippon Denso Co Ltd Dazzlement preventing reflection mirror driving unit for vehicle
JPS60156528U (en) 1984-03-28 1985-10-18 株式会社東海理化電機製作所 Anti-glare mirror
JPS60166651U (en) 1984-04-16 1985-11-05 市光工業株式会社 Anti-glare mirror for vehicles
JPS60261275A (en) 1984-06-08 1985-12-24 Yokogawa Hokushin Electric Corp Plant information display device
JPS6159301A (en) 1984-08-30 1986-03-26 Nippon Denso Co Ltd Nonglaring type reflecting mirror controller
US4713685A (en) 1984-10-05 1987-12-15 Matsushita Electric Industrial Co., Ltd. Video monitoring apparatus
US4701022A (en) 1984-11-28 1987-10-20 C-D Marketing, Ltd. Day/night mirror
US4629941A (en) 1985-01-07 1986-12-16 Ellis Edward H Differential illumination sensitive switching circuit
US4630109A (en) 1985-02-28 1986-12-16 Standard Telephones & Cables Public Limited Company Vehicle tracking system
NO850900L (en) 1985-03-06 1986-09-08 Hans Christian Flaaten DEVICE FOR AUTOMATIC, SELECTIVE LIGHT CONTROL FOR VEHICLES.
US4711544A (en) 1985-04-12 1987-12-08 Yazaki Corporation Display system for vehicle
DE3522204A1 (en) 1985-06-21 1987-01-02 Anthony Stewart REARVIEW MIRROR
US4620141A (en) 1985-07-03 1986-10-28 Vericom Corp. Rain-controlled windshield wipers
JPS6223210A (en) 1985-07-23 1987-01-31 Matsushita Electric Ind Co Ltd Local oscillation circuit
FR2585991A3 (en) 1985-08-07 1987-02-13 Andrieux Christian Electronic rear-view device for vehicles
JPS62122844A (en) 1985-11-25 1987-06-04 Pioneer Electronic Corp Rear monitoring apparatus for vehicle
DE3601388A1 (en) 1986-01-18 1987-07-23 Bosch Gmbh Robert HEADLIGHT SYSTEM FOR VEHICLES, ESPECIALLY FOR MOTOR VEHICLES
JPS62122487U (en) 1986-01-28 1987-08-04
JPS62131837U (en) 1986-02-12 1987-08-20
JPS62255262A (en) 1986-04-30 1987-11-07 Nissan Motor Co Ltd Wiper controller
US4793690A (en) 1986-07-18 1988-12-27 Donnelly Corporation Rearview mirror control circuit
US4867561A (en) 1986-08-22 1989-09-19 Nippondenso Co., Ltd. Apparatus for optically detecting an extraneous matter on a translucent shield
JPH0114700Y2 (en) 1986-11-14 1989-04-28
JPS6336027Y2 (en) 1986-12-26 1988-09-26
US4789904A (en) 1987-02-13 1988-12-06 Peterson Roger D Vehicle mounted surveillance and videotaping system
US4847772A (en) 1987-02-17 1989-07-11 Regents Of The University Of Minnesota Vehicle detection through image processing for traffic surveillance and control
IE59698B1 (en) 1987-04-08 1994-03-23 Donnelly Mirrors Ltd Rearview mirror control circuit
US5064274A (en) 1987-08-26 1991-11-12 Siegel-Robert, Inc. Automatic automobile rear view mirror assembly
US4961625A (en) 1987-09-18 1990-10-09 Flight Dynamics, Inc. Automobile head-up display system with reflective aspheric surface
US4872051A (en) 1987-10-01 1989-10-03 Environmental Research Institute Of Michigan Collision avoidance alarm system
JP2696516B2 (en) 1987-11-09 1998-01-14 三菱自動車工業株式会社 Vehicle safety monitoring device
US4862037A (en) 1987-12-24 1989-08-29 Ford Motor Company Automatic headlamp dimming system
US4871917A (en) 1988-04-19 1989-10-03 Donnelly Corporation Vehicular moisture sensor and mounting apparatus therefor
JPH01278848A (en) 1988-05-02 1989-11-09 Nissan Motor Co Ltd Headlight device for vehicle
DE3844364C2 (en) 1988-12-30 1996-07-25 Bosch Gmbh Robert Method for controlling the light emission of a headlight arrangement of a vehicle and headlight arrangement for carrying out the method
US4937796A (en) 1989-01-10 1990-06-26 Tendler Robert K Vehicle backing aid
US4956591A (en) 1989-02-28 1990-09-11 Donnelly Corporation Control for a moisture sensor
US4970653A (en) 1989-04-06 1990-11-13 General Motors Corporation Vision method of detecting lane boundaries and obstacles
JPH07105496B2 (en) 1989-04-28 1995-11-13 三菱電機株式会社 Insulated gate bipolar transistor
JPH02308575A (en) 1989-05-24 1990-12-21 Nissan Motor Co Ltd Photodetector cell
US5027001A (en) 1989-08-29 1991-06-25 Torbert William F Moisture sensitive automatic windshield wiper and headlight control device
DE3929842A1 (en) 1989-09-08 1991-03-14 Vdo Schindling DISPLAY DEVICE FOR VEHICLES
IE903904A1 (en) 1989-11-03 1991-05-08 Donnelly Corp Drive circuit for an electrochromic cell
US4974078A (en) 1989-11-13 1990-11-27 Eastman Kodak Company Digital compression method and system with improved coding efficiency
US5059877A (en) 1989-12-22 1991-10-22 Libbey-Owens-Ford Co. Rain responsive windshield wiper control
JP2843079B2 (en) 1989-12-22 1999-01-06 本田技研工業株式会社 Driving path determination method
JPH0399952U (en) 1990-01-26 1991-10-18
US5044706A (en) 1990-02-06 1991-09-03 Hughes Aircraft Company Optical element employing aspherical and binary grating optical surfaces
US5303205A (en) 1990-02-26 1994-04-12 Trend Tec Inc. Vehicular distance measuring system with integral mirror display
US5072154A (en) 1990-03-13 1991-12-10 Chen Min Hsiung Automatic luminosity control device for car and motor bicycle headlamps
JPH03125623U (en) 1990-04-03 1991-12-18
DE4111993B4 (en) 1990-04-23 2005-05-25 Volkswagen Ag Camera for an image processing system
GB2244187B (en) 1990-05-16 1994-05-18 Laurence David Scott Allen Vehicle mounted video camera
JPH0752706Y2 (en) 1990-06-20 1995-12-06 日本バスプラ工業株式会社 Ultra-small ultrasonic washing machine with drying function for small clothes
US5121200A (en) 1990-07-06 1992-06-09 Choi Seung Lyul Travelling monitoring system for motor vehicles
US5027200A (en) 1990-07-10 1991-06-25 Edward Petrossian Enhanced viewing at side and rear of motor vehicles
US5148014A (en) 1990-08-10 1992-09-15 Donnelly Corporation Mirror system with remotely actuated continuously variable reflectant mirrors
US5124549A (en) 1990-10-15 1992-06-23 Lectron Products, Inc. Automatic headlamp dimmer with optical baffle
SE467553B (en) 1990-11-30 1992-08-03 Sten Loefving OPTICAL METHOD TO DETECT AND CLASSIFY RETURNS BY DETECTING SCATTERED AND BACKGROUND LIGHT, RESPECTIVELY, FROM A BRIGHT
JPH04245886A (en) 1991-01-31 1992-09-02 Sony Corp Vehicle
FR2672857B1 (en) 1991-02-14 1995-04-21 Renault LATERAL ELECTRONIC REVIEW DEVICE FOR A MOTOR VEHICLE.
FR2673499B1 (en) 1991-03-01 1995-08-04 Renault CAMERA REVIEW DEVICE FOR A MOTOR VEHICLE.
DE4107965A1 (en) 1991-03-13 1991-09-26 Walter Loidl All-round viewing mirror system for motor vehicle - combines views normally obtd. from wing mirrors into panoramic display lenses and fibre-optic bundles
US5451822A (en) 1991-03-15 1995-09-19 Gentex Corporation Electronic control system
GB9108023D0 (en) 1991-04-16 1991-06-05 Ulph Nicholas N C Viewing apparatus
JPH05319174A (en) 1991-05-17 1993-12-03 Mutsurou Buntou Visibility display device for automobile
DE4118208A1 (en) 1991-06-04 1991-11-07 Veit Dr Thomas Fast viewed displays control under variable lighting - uses light sensors for intensity, direction and spectrum and has liquid-crystal viewing panels
US5245422A (en) 1991-06-28 1993-09-14 Zexel Corporation System and method for automatically steering a vehicle within a lane in a road
JP2782990B2 (en) 1991-07-11 1998-08-06 日産自動車株式会社 Vehicle approach determination device
DE4123641A1 (en) 1991-07-17 1993-01-21 Trebe Elektronik Inh Joannis T Vehicle windscreen wiper motor speed controller depending on rain intensity - detects modulation of infrared light reflected from screen by rain drops via optical sensor coupled to motor controller
JP3284413B2 (en) 1991-07-25 2002-05-20 東ソー株式会社 Method for producing hydrated zirconia sol and zirconia powder
US5469298A (en) 1991-08-14 1995-11-21 Prince Corporation Reflective display at infinity
JPH0554276A (en) 1991-08-23 1993-03-05 Matsushita Electric Ind Co Ltd Obstacle detection device
US5535314A (en) 1991-11-04 1996-07-09 Hughes Aircraft Company Video image processor and method for detecting vehicles
US5336980A (en) 1992-12-10 1994-08-09 Leopold Kostal Gmbh & Co. Apparatus and method for controlling a windshield wiping system
US5461357A (en) 1992-01-29 1995-10-24 Mazda Motor Corporation Obstacle detection device for vehicle
US5168378A (en) 1992-02-10 1992-12-01 Reliant Laser Corporation Mirror with dazzle light attenuation zone
DE59205359D1 (en) 1992-04-21 1996-03-28 Pietzsch Ibp Gmbh Device for driving vehicles
US5325386A (en) 1992-04-21 1994-06-28 Bandgap Technology Corporation Vertical-cavity surface emitting laser assay display system
US5253109A (en) 1992-04-27 1993-10-12 Donnelly Corporation Electro-optic device with constant light transmitting area
GB2267341B (en) 1992-05-27 1996-02-21 Koito Mfg Co Ltd Glare sensor for a vehicle
JPH0785280B2 (en) 1992-08-04 1995-09-13 タカタ株式会社 Collision prediction judgment system by neural network
US5351044A (en) 1992-08-12 1994-09-27 Rockwell International Corporation Vehicle lane position detection system
DE69325455T2 (en) 1992-08-14 2000-03-30 Vorad Safety Systems Inc INTELLIGENT BLIND SPOT DETECTING SENSOR
JP2783079B2 (en) 1992-08-28 1998-08-06 トヨタ自動車株式会社 Light distribution control device for headlamp
US5448319A (en) 1992-09-22 1995-09-05 Olympus Optical Co., Ltd. Optical system for monitor cameras to be mounted on vehicles
JP3418985B2 (en) 1992-12-14 2003-06-23 株式会社デンソー Image display device
JP3263699B2 (en) 1992-12-22 2002-03-04 三菱電機株式会社 Driving environment monitoring device
EP0605045B1 (en) 1992-12-29 1999-03-31 Laboratoires D'electronique Philips S.A.S. Image processing method and apparatus for generating one image from adjacent images
US5529138A (en) 1993-01-22 1996-06-25 Shaw; David C. H. Vehicle collision avoidance system
US5550677A (en) 1993-02-26 1996-08-27 Donnelly Corporation Automatic rearview mirror system using a photosensor array
US6498620B2 (en) 1993-02-26 2002-12-24 Donnelly Corporation Vision system for a vehicle including an image capture device and a display system having a long focal length
US6822563B2 (en) 1997-09-22 2004-11-23 Donnelly Corporation Vehicle imaging system with accessory control
JPH06267304A (en) 1993-03-17 1994-09-22 Toyota Motor Corp Head lamp device for vehicle
JPH06276524A (en) 1993-03-19 1994-09-30 Toyota Motor Corp Device for recognizing vehicle running in opposite direction
JP3468428B2 (en) 1993-03-24 2003-11-17 富士重工業株式会社 Vehicle distance detection device
JP2887039B2 (en) 1993-03-26 1999-04-26 三菱電機株式会社 Vehicle periphery monitoring device
DE4408745C2 (en) 1993-03-26 1997-02-27 Honda Motor Co Ltd Driving control device for vehicles
US6430303B1 (en) 1993-03-31 2002-08-06 Fujitsu Limited Image processing apparatus
WO1994022693A1 (en) 1993-03-31 1994-10-13 Automotive Technologies International, Inc. Vehicle occupant position and velocity sensor
JPH06295601A (en) 1993-04-08 1994-10-21 Toyota Motor Corp Headlight for vehicle
US6084519A (en) 1993-05-07 2000-07-04 Control Devices, Inc. Multi-function light sensor for vehicle
DE69432588T2 (en) 1993-05-07 2004-04-01 Hegyi, Dennis J., Ann Arbor MULTIFUNCTIONAL LIGHT SENSOR FOR A VEHICLE
GB9317983D0 (en) 1993-08-28 1993-10-13 Lucas Ind Plc A driver assistance system for a vehicle
US5457493A (en) 1993-09-15 1995-10-10 Texas Instruments Incorporated Digital micro-mirror based image simulation system
US5374852A (en) 1993-09-17 1994-12-20 Parkes; Walter B. Motor vehicle headlight activation apparatus for inclement weather conditions
US5440428A (en) 1993-09-30 1995-08-08 Hughes Aircraft Company Automotive instrument 3-D virtual image display
JP3522317B2 (en) 1993-12-27 2004-04-26 富士重工業株式会社 Travel guide device for vehicles
US5430431A (en) 1994-01-19 1995-07-04 Nelson; Louis J. Vehicle protection system and method
US5471515A (en) 1994-01-28 1995-11-28 California Institute Of Technology Active pixel sensor with intra-pixel charge transfer
US5402214A (en) 1994-02-23 1995-03-28 Xerox Corporation Toner concentration sensing system for an electrophotographic printer
US5461361A (en) 1994-03-11 1995-10-24 Chrysler Corporation Automotive instrument panel apparatus
JP3358099B2 (en) 1994-03-25 2002-12-16 オムロン株式会社 Optical sensor device
US5537003A (en) 1994-04-08 1996-07-16 Gentex Corporation Control system for automotive vehicle headlamps and other vehicle equipment
US5963247A (en) 1994-05-31 1999-10-05 Banitt; Shmuel Visual display systems and a system for producing recordings for visualization thereon and methods therefor
ES1028357Y (en) 1994-06-03 1995-06-16 Cortes Luis Leon Lamata RECEIVING DEVICE FOR REAR VIEW SCREEN.
US5574443A (en) 1994-06-22 1996-11-12 Hsieh; Chi-Sheng Vehicle monitoring apparatus with broadly and reliably rearward viewing
JP3287117B2 (en) 1994-07-05 2002-05-27 株式会社日立製作所 Environment recognition device for vehicles using imaging device
FR2726144B1 (en) 1994-10-24 1996-11-29 Valeo Vision METHOD AND DEVICE FOR IMPROVING NIGHT VISION, ESPECIALLY FOR MOTOR VEHICLES
US5793420A (en) 1994-10-28 1998-08-11 Schmidt; William P. Video recording system for vehicle
JP3503230B2 (en) 1994-12-15 2004-03-02 株式会社デンソー Nighttime vehicle recognition device
JPH08175263A (en) 1994-12-27 1996-07-09 Murakami Kaimeidou:Kk Interior mirror with built-in display device
KR960029148A (en) 1995-01-13 1996-08-17 방영수 Vehicle rear and rear monitoring system
US5528698A (en) 1995-03-27 1996-06-18 Rockwell International Corporation Automotive occupant sensing device
US5568027A (en) 1995-05-19 1996-10-22 Libbey-Owens-Ford Co. Smooth rain-responsive wiper control
US8538636B2 (en) * 1995-06-07 2013-09-17 American Vehicular Sciences, LLC System and method for controlling vehicle headlights
US20070154063A1 (en) 1995-06-07 2007-07-05 Automotive Technologies International, Inc. Image Processing Using Rear View Mirror-Mounted Imaging Device
US5850284A (en) 1995-06-13 1998-12-15 Robotic Vision Systems, Inc. Apparatus for detecting a polarization altering substance on a surface
WO1997020433A1 (en) 1995-12-01 1997-06-05 Southwest Research Institute Methods and apparatus for traffic incident detection
US5760826A (en) 1996-05-10 1998-06-02 The Trustees Of Columbia University Omnidirectional imaging apparatus
US5661303A (en) 1996-05-24 1997-08-26 Libbey-Owens-Ford Co. Compact moisture sensor with collimator lenses and prismatic coupler
JP3805832B2 (en) 1996-07-10 2006-08-09 富士重工業株式会社 Vehicle driving support device
US5798575A (en) 1996-07-11 1998-08-25 Donnelly Corporation Vehicle mirror digital network and dynamically interactive mirror system
JPH1059068A (en) 1996-08-23 1998-03-03 Yoshihisa Furuta Dead angle confirmation device for vehicle
WO1998014974A1 (en) 1996-09-30 1998-04-09 Zimmerman Consulting, L.L.C. Optical detection of water droplets on vehicle window
US5990469A (en) 1997-04-02 1999-11-23 Gentex Corporation Control circuit for image array sensors
US5923027A (en) 1997-09-16 1999-07-13 Gentex Corporation Moisture sensor and windshield fog detector using an image sensor
JP4477148B2 (en) 1997-06-18 2010-06-09 クラリティー リミテッド ライアビリティ カンパニー Blind signal separation method and apparatus
GB2327823A (en) 1997-07-30 1999-02-03 Nigel Geoffrey Ley Rear-view viewing system
US6087953A (en) 1998-02-18 2000-07-11 Donnelly Corporation Rearview mirror support incorporating vehicle information display
US6124886A (en) 1997-08-25 2000-09-26 Donnelly Corporation Modular rearview mirror assembly
US6313454B1 (en) 1999-07-02 2001-11-06 Donnelly Corporation Rain sensor
EP1025702B9 (en) 1997-10-30 2007-10-03 Donnelly Corporation Rain sensor with fog discrimination
US6243003B1 (en) 1999-08-25 2001-06-05 Donnelly Corporation Accessory module for vehicle
JP3982177B2 (en) * 1998-05-11 2007-09-26 株式会社日立製作所 Vehicle control device
GB9823468D0 (en) 1998-10-28 1998-12-23 Secr Defence Novel enzyme
US6144022A (en) 1999-03-15 2000-11-07 Valeo Electrical Systems, Inc. Rain sensor using statistical analysis
JP3937728B2 (en) * 1999-05-12 2007-06-27 株式会社日立製作所 Vehicle travel control device and vehicle
US7062300B1 (en) 2000-11-09 2006-06-13 Ki Il Kim Cellular phone holder with charger mounted to vehicle dashboard
US6424273B1 (en) 2001-03-30 2002-07-23 Koninklijke Philips Electronics N.V. System to aid a driver to determine whether to change lanes
US6497503B1 (en) 2001-06-21 2002-12-24 Ford Global Technologies, Inc. Headlamp system with selectable beam pattern
US6636258B2 (en) 2001-10-19 2003-10-21 Ford Global Technologies, Llc 360° vision system for a vehicle
CN1194782C (en) * 2001-12-14 2005-03-30 株式会社唯红 Recovery machine for game chip
US6975775B2 (en) 2002-03-06 2005-12-13 Radiant Imaging, Inc. Stray light correction method for imaging light and color measurement system
EP1504276B1 (en) * 2002-05-03 2012-08-08 Donnelly Corporation Object detection system for vehicle
US7541743B2 (en) 2002-12-13 2009-06-02 Ford Global Technologies, Llc Adaptive vehicle communication controlled lighting system
US7206697B2 (en) 2003-10-14 2007-04-17 Delphi Technologies, Inc. Driver adaptive collision warning system
JP4114587B2 (en) 2003-09-29 2008-07-09 株式会社デンソー Own vehicle travel position detection device and program
US7019463B2 (en) * 2003-10-21 2006-03-28 Raymond Kesterson Daytime running light module and system
US7269504B2 (en) 2004-05-12 2007-09-11 Motorola, Inc. System and method for assigning a level of urgency to navigation cues
JP2005353477A (en) * 2004-06-11 2005-12-22 Koito Mfg Co Ltd Lighting system for vehicles
US7720580B2 (en) 2004-12-23 2010-05-18 Donnelly Corporation Object detection system for vehicle
JP2006264416A (en) * 2005-03-22 2006-10-05 Takata Corp Object detection system, protection system, and vehicle
JP4781707B2 (en) * 2005-04-15 2011-09-28 富士重工業株式会社 Vehicle driving support device
US20070027349A1 (en) * 2005-07-28 2007-02-01 Stephan Brandstadter Halogenated Compositions
AU2006300775B2 (en) * 2005-10-07 2012-01-19 Bendix Commercial Vehicle Systems Llc Adaptive cruise control for heavy-duty vehicles
JP4304517B2 (en) * 2005-11-09 2009-07-29 トヨタ自動車株式会社 Object detection device
US7428449B2 (en) 2006-03-14 2008-09-23 Temic Automotive Of North America, Inc. System and method for determining a workload level of a driver
JP4173901B2 (en) * 2006-05-19 2008-10-29 本田技研工業株式会社 Vehicle periphery monitoring device
JP4193886B2 (en) * 2006-07-26 2008-12-10 トヨタ自動車株式会社 Image display device
WO2008024639A2 (en) * 2006-08-11 2008-02-28 Donnelly Corporation Automatic headlamp control system
US7554435B2 (en) 2006-09-07 2009-06-30 Nissan Technical Center North America, Inc. Vehicle on-board unit
JP4650401B2 (en) * 2006-11-22 2011-03-16 株式会社デンソー Power consumption recording device and program for power consumption recording device
US20080129541A1 (en) 2006-12-01 2008-06-05 Magna Electronics Black ice detection and warning system
US20080147277A1 (en) 2006-12-18 2008-06-19 Ford Global Technologies, Llc Active safety system
US8068968B2 (en) * 2007-02-06 2011-11-29 Denso Corporation Vehicle travel control system
EP2191457B1 (en) 2007-09-11 2014-12-10 Magna Electronics Imaging system for vehicle
US8446470B2 (en) 2007-10-04 2013-05-21 Magna Electronics, Inc. Combined RGB and IR imaging sensor
US8131442B2 (en) * 2007-11-28 2012-03-06 GM Global Technology Operations LLC Method for operating a cruise control system for a vehicle
US8014928B2 (en) 2008-06-17 2011-09-06 Ford Global Technologies, Llc Automotive slipstreaming support system
JP5213113B2 (en) 2008-07-02 2013-06-19 株式会社エイチアンドエフ Die cushion device and press machine having the same
US20100020170A1 (en) 2008-07-24 2010-01-28 Higgins-Luthman Michael J Vehicle Imaging System
US8358074B2 (en) * 2009-02-26 2013-01-22 GM Global Technology Operations LLC Daytime running lamp activation control methods and apparatus
PL2563742T3 (en) 2010-04-26 2021-05-04 Dow Global Technologies Llc Composition for extrusion-molded bodies
JP5951301B2 (en) 2012-03-21 2016-07-13 株式会社クレハ環境 Method for producing activated carbon slurry
JP6107035B2 (en) 2012-09-28 2017-04-05 株式会社三洋物産 Game machine
JP6414700B2 (en) 2015-10-28 2018-10-31 トヨタ自動車株式会社 Nonaqueous electrolyte secondary battery

Patent Citations (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2632040A (en) * 1952-05-01 1953-03-17 Rabinow Jacob Automatic headlight dimmer
US2827594A (en) * 1954-09-02 1958-03-18 Rabinow Jacob Color discriminating headlight dimmer
US3708231A (en) * 1969-11-10 1973-01-02 G Walters Precision angle measuring device
US3665224A (en) * 1970-09-08 1972-05-23 Ruth Arthur P Ambient light level detector including transient suppression circuitry
US3807832A (en) * 1972-11-09 1974-04-30 American Cyanamid Co Electrochromic (ec) mirror which rapidly changes reflectivity
US3811046A (en) * 1973-01-22 1974-05-14 Le Van Electronics Inc Light sensitive security system
US3813540A (en) * 1973-03-26 1974-05-28 Ncr Circuit for measuring and evaluating optical radiation
US3862798A (en) * 1973-11-19 1975-01-28 Charles L Hopkins Automatic rear view mirror adjuster
US3947095A (en) * 1974-03-18 1976-03-30 Marie Saratore Rear view vision device
US4249160A (en) * 1975-11-21 1981-02-03 Chilvers Graham R Vehicle mounted light activated control system
US4200361A (en) * 1977-01-25 1980-04-29 Fiat Societa Per Azioni Liquid crystal mirror for use as a rear-view mirror for vehicles
US4247870A (en) * 1977-05-30 1981-01-27 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Highways And Transportation Highway premarking guidance system
US4266856A (en) * 1978-07-20 1981-05-12 Wainwright Basil E Rear view mirror
US4431896A (en) * 1979-04-26 1984-02-14 A.G. Fur Industrielle Elektronik Agie Method and apparatus for orienting the wire guidance heads on spark erosion cutting equipment for eroding with a great wire slope
US4381888A (en) * 1980-06-11 1983-05-03 Canon Kabushiki Kaisha Retrofocus type large aperture wide angle objective
US4647161A (en) * 1981-05-20 1987-03-03 Mueller Rolf Fish eye lens system
US4443057A (en) * 1981-06-01 1984-04-17 Gentex Corporation Automatic rearview mirror for automotive vehicles
US4512637A (en) * 1981-10-29 1985-04-23 Carl-Zeiss-Stiftung, Heidenheim/Brenz Method and means for stepwise charge control of electrochromic layers
US4491390A (en) * 1982-05-06 1985-01-01 Tong Shen Hsieh Automatic liquid-crystal light shutter
US4571082A (en) * 1982-05-18 1986-02-18 Downs Michael J Apparatus and method for measuring refractive index
US4572619A (en) * 1983-01-27 1986-02-25 Daimler-Benz Aktiengesellschaft Electrically dimmable rear-view mirror for motor vehicles
US4817948A (en) * 1983-09-06 1989-04-04 Louise Simonelli Reduced-scale racing system
US4580875A (en) * 1984-03-30 1986-04-08 Gentex Corporation Electronic control system for automatic rearview mirrors for automotive vehicles
US4741603A (en) * 1985-05-08 1988-05-03 Nissan Motor Co., Ltd. Electrochromic nonglaring mirror
US5001558A (en) * 1985-06-11 1991-03-19 General Motors Corporation Night vision system with color video camera
US4891559A (en) * 1985-06-13 1990-01-02 Nippondenso Soken, Inc. Apparatus for controlling a headlight of a vehicle
US4731669A (en) * 1985-06-18 1988-03-15 Matsushita Electric Industrial Co., Ltd. Camera apparatus with movably supported lens barrel
US4820933A (en) * 1986-12-31 1989-04-11 Samsung Electronics Co., Ltd. Control circuit for liquid crystal rear-vision mirror
US4896030A (en) * 1987-02-27 1990-01-23 Ichikoh Industries Limited Light-reflectivity controller for use with automotive rearview mirror using electrochromic element
US4917477A (en) * 1987-04-06 1990-04-17 Gentex Corporation Automatic rearview mirror system for automotive vehicles
US4727290A (en) * 1987-05-29 1988-02-23 General Motors Corporation Automatic vehicle headlamp dimming control
US4895790A (en) * 1987-09-21 1990-01-23 Massachusetts Institute Of Technology High-efficiency, multilevel, diffractive optical elements
US4825232A (en) * 1988-03-25 1989-04-25 Enserch Corporation Apparatus for mounting aerial survey camera under aircraft wings
US4991054A (en) * 1988-05-13 1991-02-05 Pacific Scientific Company Time-delay outdoor lighting control systems
US4910591A (en) * 1988-08-08 1990-03-20 Edward Petrossian Side and rear viewing apparatus for motor vehicles
US4892345A (en) * 1988-09-23 1990-01-09 Rachael Iii Stephen Armored vehicle
US5003288A (en) * 1988-10-25 1991-03-26 Nartron Corporation Ambient light sensing method and apparatus
US5016977A (en) * 1989-02-06 1991-05-21 Essilor International-Compagnie Generale Optical lens for correcting astigmatism
US4916374A (en) * 1989-02-28 1990-04-10 Donnelly Corporation Continuously adaptive moisture sensor system for wiper control
US5012082A (en) * 1989-03-01 1991-04-30 Hamamatsu Photonics K.K. Two-dimensional incident position detector device for light or radiation
US4987357A (en) * 1989-12-18 1991-01-22 General Motors Corporation Adaptive motor vehicle cruise control
US5184956A (en) * 1990-02-20 1993-02-09 Codes Rousseau Method and device for training in the driving of vehicles
US5096287A (en) * 1990-03-15 1992-03-17 Aisin Seiki K.K. Video camera for an automobile
US5086253A (en) * 1990-10-15 1992-02-04 Lawler Louis N Automatic headlight dimmer apparatus
US5414257A (en) * 1991-04-23 1995-05-09 Introlab Pty Limited Moisture sensor for detecting moisture on a windshield
US5182502A (en) * 1991-05-06 1993-01-26 Lectron Products, Inc. Automatic headlamp dimmer
US5416318A (en) * 1991-10-03 1995-05-16 Hegyi; Dennis J. Combined headlamp and climate control sensor having a light diffuser and a light modulator
US5289182A (en) * 1991-10-16 1994-02-22 Ii Bc-Sys Electronic anti-collison device carried on board a vehicle
US5307136A (en) * 1991-10-22 1994-04-26 Fuji Jukogyo Kabushiki Kaisha Distance detection system for vehicles
US5414461A (en) * 1991-11-15 1995-05-09 Nissan Motor Co., Ltd. Vehicle navigation apparatus providing simultaneous forward and rearward views
US5193029A (en) * 1991-11-19 1993-03-09 Donnelly Corporation Single sensor adaptive drive circuit for rearview mirror system
US5276389A (en) * 1991-12-14 1994-01-04 Leopold Kostal Gmbh & Co. Kg Method of controlling a windshield wiper system
US5208701A (en) * 1991-12-24 1993-05-04 Xerox Corporation Wobble correction lens with binary diffractive optic surface and refractive cylindrical surface
US5386285A (en) * 1992-02-28 1995-01-31 Mitsubishi Denki Kabushiki Kaisha Obstacle detecting device for a vehicle
US5410346A (en) * 1992-03-23 1995-04-25 Fuji Jukogyo Kabushiki Kaisha System for monitoring condition outside vehicle using imaged picture by a plurality of television cameras
US5204778A (en) * 1992-04-06 1993-04-20 Gentex Corporation Control system for automatic rearview mirrors
US5305012A (en) * 1992-04-15 1994-04-19 Reveo, Inc. Intelligent electro-optical system and method for automatic glare reduction
US5515448A (en) * 1992-07-28 1996-05-07 Yazaki Corporation Distance measuring apparatus of a target tracking type
US5510983A (en) * 1992-11-13 1996-04-23 Yazaki Corporation On-vehicle display
US5289321A (en) * 1993-02-12 1994-02-22 Secor James O Consolidated rear view camera and display system for motor vehicle
US5313072A (en) * 1993-02-16 1994-05-17 Rockwell International Corporation Optical detector for windshield wiper control
US20070109653A1 (en) * 1993-02-26 2007-05-17 Kenneth Schofield Image sensing system for a vehicle
US20070109652A1 (en) * 1993-02-26 2007-05-17 Donnelly Corporation, A Corporation Of The State Of Michigan Image sensing system for a vehicle
US20040051634A1 (en) * 1993-02-26 2004-03-18 Kenneth Schofield Vision system for a vehicle including image processor
US6559435B2 (en) * 1993-02-26 2003-05-06 Donnelly Corporation Vehicle headlight control using imaging sensor identifying objects by geometric configuration
US5877897A (en) * 1993-02-26 1999-03-02 Donnelly Corporation Automatic rearview mirror, vehicle lighting control and vehicle interior monitoring system using a photosensor array
US20060028731A1 (en) * 1993-02-26 2006-02-09 Kenneth Schofield Vehicular vision system
US6222447B1 (en) * 1993-02-26 2001-04-24 Donnelly Corporation Rearview vision system with indicia of backup travel
US20070023613A1 (en) * 1993-02-26 2007-02-01 Donnelly Corporation Vehicle headlight control using imaging sensor
US6523964B2 (en) * 1993-02-26 2003-02-25 Donnelly Corporation Vehicle control system and method
US20070109406A1 (en) * 1993-02-26 2007-05-17 Donnelly Corporation, A Corporation Of The State Of Michigan Image sensing system for a vehicle
US20070109654A1 (en) * 1993-02-26 2007-05-17 Donnelly Corporation, A Corporation Of The State Of Michigan Image sensing system for a vehicle
US6396397B1 (en) * 1993-02-26 2002-05-28 Donnelly Corporation Vehicle imaging system with stereo imaging
US5498866A (en) * 1993-06-01 1996-03-12 Leopold Kostal Gmbh & Co. Kg Optoelectronic sensor for detecting moisture on a windshield with means to compensate for a metallic layer in the windshield
US6553130B1 (en) * 1993-08-11 2003-04-22 Jerome H. Lemelson Motor vehicle warning and control system and method
US5883739A (en) * 1993-10-04 1999-03-16 Honda Giken Kogyo Kabushiki Kaisha Information display device for vehicle
US5406395A (en) * 1993-11-01 1995-04-11 Hughes Aircraft Company Holographic parking assistance device
US5614788A (en) * 1995-01-31 1997-03-25 Autosmart Light Switches, Inc. Automated ambient condition responsive daytime running light system
US5867591A (en) * 1995-04-21 1999-02-02 Matsushita Electric Industrial Co., Ltd. Method of matching stereo images and method of measuring disparity between these image
US5896085A (en) * 1995-09-07 1999-04-20 Toyota Jidosha Kabushiki Kaisha Apparatus for controlling light distributions of head lamps
US5892434A (en) * 1996-02-09 1999-04-06 David M. Carlson Automotive driving pattern monitor
US6891563B2 (en) * 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
US5890021A (en) * 1996-12-05 1999-03-30 Canon Kabushiki Kaisha Distance detecting device, focus state detecting device and camera having same
US20060018512A1 (en) * 1997-04-02 2006-01-26 Stam Joseph S Vehicle automatic exterior light control
US20060091813A1 (en) * 1997-04-02 2006-05-04 Stam Joseph S Control system to automatically control vehicle headlamps
US6020704A (en) * 1997-12-02 2000-02-01 Valeo Electrical Systems, Inc. Windscreen sensing and wiper control system
US6172613B1 (en) * 1998-02-18 2001-01-09 Donnelly Corporation Rearview mirror assembly incorporating vehicle information display
US6049171A (en) * 1998-09-18 2000-04-11 Gentex Corporation Continuously variable headlamp control
US6066933A (en) * 1998-10-02 2000-05-23 Ponziana; Richard L. Rain sensing system and method having automatically registered and oriented rain sensor
US6717610B1 (en) * 1998-11-25 2004-04-06 Donnelly Corporation Wide angle image capture system for vehicle
US6534884B2 (en) * 1998-12-16 2003-03-18 Donnelly Corporation Proximity sensing system for vehicles
US6201642B1 (en) * 1999-07-27 2001-03-13 Donnelly Corporation Vehicular vision system with a wide angle lens including a diffractive element
US20060018511A1 (en) * 2000-03-20 2006-01-26 Stam Joseph S System for controlling vehicle equipment
US20020015153A1 (en) * 2000-07-27 2002-02-07 Downs Michael John Jamin-type interferometers and components therefor
US6672731B2 (en) * 2000-11-20 2004-01-06 Donnelly Corporation Vehicular rearview mirror with blind spot viewing system
US20040084569A1 (en) * 2002-11-04 2004-05-06 Bonutti Peter M. Active drag and thrust modulation system and method
US20060293841A1 (en) * 2005-06-15 2006-12-28 Davor Hrovat Traction control system and method
US20090016073A1 (en) * 2007-07-12 2009-01-15 Higgins-Luthman Michael J Automatic Lighting System with Adaptive Alignment Function
US20090045323A1 (en) * 2007-08-17 2009-02-19 Yuesheng Lu Automatic Headlamp Control System
US20090097038A1 (en) * 2007-10-16 2009-04-16 Higgins-Luthman Michael J Machine Vision for Predictive Suspension

Cited By (358)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090118909A1 (en) * 2007-10-31 2009-05-07 Valeo Vision Process for detecting a phenomenon limiting the visibility for a motor vehicle
US8315766B2 (en) * 2007-10-31 2012-11-20 Valeo Vision Process for detecting a phenomenon limiting the visibility for a motor vehicle
US20090315723A1 (en) * 2008-06-24 2009-12-24 Visiocorp Patents S.A.R.L Optical system and method for detecting optical system obscuration in a vehicle
US8395514B2 (en) * 2008-06-24 2013-03-12 Smr Patents S.A.R.L. Optical system and method for detecting optical system obscuration in a vehicle
US9509957B2 (en) 2008-07-24 2016-11-29 Magna Electronics Inc. Vehicle imaging system
US11091105B2 (en) 2008-07-24 2021-08-17 Magna Electronics Inc. Vehicle vision system
US8265872B2 (en) * 2008-09-08 2012-09-11 Toyota Jidosha Kabushiki Kaisha Road surface division mark recognition apparatus, and lane departure prevention apparatus
US20120293657A1 (en) * 2008-09-08 2012-11-22 Toyota Jidosha Kabushiki Kaisha Road Surface Division Mark Recognition Apparatus, And Lane Departure Prevention Apparatus
US8392115B2 (en) * 2008-09-08 2013-03-05 Toyota Jidosha Kabushiki Kaisha Road surface division mark recognition apparatus, and lane departure prevention apparatus
US20100060738A1 (en) * 2008-09-08 2010-03-11 Toyota Jidosha Kabushiki Kaisha Road surface division mark recognition apparatus, and lane departure prevention apparatus
US9911050B2 (en) 2009-02-27 2018-03-06 Magna Electronics Inc. Driver active safety control system for vehicle
US11288888B2 (en) 2009-02-27 2022-03-29 Magna Electronics Inc. Vehicular control system
US10839233B2 (en) 2009-02-27 2020-11-17 Magna Electronics Inc. Vehicular control system
US9126525B2 (en) 2009-02-27 2015-09-08 Magna Electronics Inc. Alert system for vehicle
US11763573B2 (en) 2009-02-27 2023-09-19 Magna Electronics Inc. Vehicular control system
US8376595B2 (en) 2009-05-15 2013-02-19 Magna Electronics, Inc. Automatic headlamp control
US11511668B2 (en) 2009-05-15 2022-11-29 Magna Electronics Inc. Vehicular driver assistance system with construction zone recognition
US9187028B2 (en) 2009-05-15 2015-11-17 Magna Electronics Inc. Driver assistance system for vehicle
US10744940B2 (en) 2009-05-15 2020-08-18 Magna Electronics Inc. Vehicular control system with temperature input
US10005394B2 (en) 2009-05-15 2018-06-26 Magna Electronics Inc. Driver assistance system for vehicle
US20120229028A1 (en) * 2009-10-24 2012-09-13 Daimler Ag Device for Controlling a Low Beam of a Vehicle
US9523473B2 (en) * 2009-10-24 2016-12-20 Daimler Ag Device for controlling a low beam of a vehicle
EP2544162A4 (en) * 2010-03-01 2014-01-22 Honda Motor Co Ltd Surrounding area monitoring device for vehicle
EP2544162A1 (en) * 2010-03-01 2013-01-09 Honda Motor Co., Ltd. Surrounding area monitoring device for vehicle
US9321399B2 (en) 2010-03-01 2016-04-26 Honda Motor Co., Ltd. Surrounding area monitoring device for vehicle
US9145137B2 (en) * 2010-04-07 2015-09-29 Toyota Jidosha Kabushiki Kaisha Vehicle driving-support apparatus
US20130096773A1 (en) * 2010-04-07 2013-04-18 Tomoyuki Doi Vehicle driving-support apparatus
US8878693B2 (en) * 2010-04-21 2014-11-04 Denso Corporation Driver assistance device and method of controlling the same
US20110260886A1 (en) * 2010-04-21 2011-10-27 Denso Corporation Driver assistance device and method of controlling the same
US9160986B2 (en) * 2010-11-16 2015-10-13 Honda Motor Co., Ltd. Device for monitoring surroundings of a vehicle
US20130229525A1 (en) * 2010-11-16 2013-09-05 Honda Motor Co., Ltd. A device for monitoring surroundings of a vehicle
US11198434B2 (en) 2010-11-19 2021-12-14 Magna Electronics Inc. Vehicular lane centering system
US10427679B2 (en) 2010-11-19 2019-10-01 Magna Electronics Inc. Lane keeping system and lane centering system
US11753007B2 (en) 2010-11-19 2023-09-12 Magna Electronics Inc. Vehicular lane centering system
US9758163B2 (en) 2010-11-19 2017-09-12 Magna Electronics Inc. Lane keeping system and lane centering system
DE112011103834T5 (en) 2010-11-19 2013-09-05 Magna Electronics, Inc. Lane departure warning and lane centering
US9180908B2 (en) 2010-11-19 2015-11-10 Magna Electronics Inc. Lane keeping system and lane centering system
US8044784B2 (en) 2011-03-14 2011-10-25 Ford Global Technologies, Llc Sun protection system for automotive vehicle
CN102673350A (en) * 2011-03-14 2012-09-19 福特全球技术公司 Sun protection system for automotive vehicle
US20110163866A1 (en) * 2011-03-14 2011-07-07 Ford Global Technologies, Llc Sun Protection System for Automotive Vehicle
US9194943B2 (en) 2011-04-12 2015-11-24 Magna Electronics Inc. Step filter for estimating distance in a time-of-flight ranging system
US10288724B2 (en) 2011-04-12 2019-05-14 Magna Electronics Inc. System and method for estimating distance between a mobile unit and a vehicle using a TOF system
US9376066B2 (en) 2011-04-18 2016-06-28 Magna Electronics, Inc. Vehicular camera with variable focus capability
US9547795B2 (en) 2011-04-25 2017-01-17 Magna Electronics Inc. Image processing method for detecting objects using relative motion
US10043082B2 (en) 2011-04-25 2018-08-07 Magna Electronics Inc. Image processing method for detecting objects using relative motion
US10452931B2 (en) 2011-04-25 2019-10-22 Magna Electronics Inc. Processing method for distinguishing a three dimensional object from a two dimensional object using a vehicular system
US20140277990A1 (en) * 2011-08-03 2014-09-18 Continental Teves Ag & Co. Ohg Method and system for adaptively controlling distance and speed and for stopping a motor vehicle, and a motor vehicle which works with same
US9358962B2 (en) * 2011-08-03 2016-06-07 Continental Teves Ag & Co. Ohg Method and system for adaptively controlling distance and speed and for stopping a motor vehicle, and a motor vehicle which works with same
US10604013B1 (en) 2011-08-24 2020-03-31 Allstate Insurance Company Vehicle driver feedback device
US10730388B1 (en) 2011-08-24 2020-08-04 Allstate Insurance Company In vehicle driver feedback device
US9588735B1 (en) 2011-08-24 2017-03-07 Allstate Insurance Company In vehicle feedback device
US9177427B1 (en) 2011-08-24 2015-11-03 Allstate Insurance Company Vehicle driver feedback device
US11548390B1 (en) 2011-08-24 2023-01-10 Allstate Insurance Company Vehicle driver feedback device
US10065505B1 (en) 2011-08-24 2018-09-04 Allstate Insurance Company Vehicle driver feedback device
US11820229B2 (en) 2011-08-24 2023-11-21 Allstate Insurance Company In vehicle driver feedback device
US9900490B2 (en) 2011-09-21 2018-02-20 Magna Electronics Inc. Vehicle vision system using image data transmission and power supply via a coaxial cable
US10827108B2 (en) 2011-09-21 2020-11-03 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US11877054B2 (en) 2011-09-21 2024-01-16 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US11201994B2 (en) 2011-09-21 2021-12-14 Magna Electronics Inc. Vehicular multi-camera surround view system using image data transmission and power supply via coaxial cables
US10284764B2 (en) 2011-09-21 2019-05-07 Magna Electronics Inc. Vehicle vision using image data transmission and power supply via a coaxial cable
US11638070B2 (en) 2011-09-21 2023-04-25 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US10567633B2 (en) 2011-09-21 2020-02-18 Magna Electronics Inc. Vehicle vision system using image data transmission and power supply via a coaxial cable
US10257432B2 (en) 2011-09-26 2019-04-09 Magna Electronics Inc. Method for enhancing vehicle camera image quality
US9681062B2 (en) 2011-09-26 2017-06-13 Magna Electronics Inc. Vehicle camera image quality improvement in poor visibility conditions by contrast amplification
US9774790B1 (en) 2011-09-26 2017-09-26 Magna Electronics Inc. Method for enhancing vehicle camera image quality
US11738705B2 (en) 2011-10-19 2023-08-29 Magna Electronics Inc. Vehicular safety system for controlling a safety feature
US11548461B2 (en) 2011-10-19 2023-01-10 Magna Electronics Inc. Vehicular safety system for controlling a safety feature
US10974674B2 (en) 2011-10-19 2021-04-13 Magna Electronics Inc. Vehicular safety system for controlling a safety feature
US9908495B2 (en) 2011-10-19 2018-03-06 Magna Electronics Inc. Vehicle safety system for controlling a safety feature
US9174574B2 (en) 2011-10-19 2015-11-03 Magna Electronics Inc. Vehicle vision system for controlling a vehicle safety feature responsive to an imaging sensor having an exterior rearward field of view and before a collision occurs
US10239479B2 (en) 2011-10-19 2019-03-26 Magna Electronics Inc. Vehicle safety system for controlling a safety feature
US10632950B2 (en) 2011-10-19 2020-04-28 Magna Electronics Inc. Vehicular safety system for controlling a safety feature
US10549695B1 (en) 2011-10-31 2020-02-04 Rosco, Inc. Mirror monitor using two levels of reflectivity and transmissibility
US9215429B2 (en) * 2011-10-31 2015-12-15 Rosco, Inc. Mirror monitor using two levels of reflectivity
US10814789B2 (en) 2011-10-31 2020-10-27 Rosco, Inc. Mirror monitor using two levels of reflectivity and transmissibility
US11390217B2 (en) 2011-10-31 2022-07-19 Rosco, Inc. Mirror monitor using two levels of reflectivity and transmissibility
US20150085121A1 (en) * 2011-10-31 2015-03-26 Rosco, Inc. Mirror monitor using two levels of reflectivity
US11142123B2 (en) 2011-11-28 2021-10-12 Magna Electronics Inc. Multi-camera vehicular vision system
US10640040B2 (en) 2011-11-28 2020-05-05 Magna Electronics Inc. Vision system for vehicle
US11634073B2 (en) 2011-11-28 2023-04-25 Magna Electronics Inc. Multi-camera vehicular vision system
US9715769B2 (en) 2012-03-01 2017-07-25 Magna Electronics Inc. Process for determining state of a vehicle
US9916699B2 (en) 2012-03-01 2018-03-13 Magna Electronics Inc. Process for determining state of a vehicle
US9346468B2 (en) 2012-03-01 2016-05-24 Magna Electronics Inc. Vehicle vision system with yaw rate determination
US8849495B2 (en) 2012-03-01 2014-09-30 Magna Electronics Inc. Vehicle vision system with yaw rate determination
US10127738B2 (en) 2012-03-01 2018-11-13 Magna Electronics Inc. Method for vehicular control
US8694224B2 (en) 2012-03-01 2014-04-08 Magna Electronics Inc. Vehicle yaw rate correction
US11184585B2 (en) 2012-03-23 2021-11-23 Magna Electronics Inc. Vehicular vision system with accelerated determination of an object of interest
US11627286B2 (en) 2012-03-23 2023-04-11 Magna Electronics Inc. Vehicular vision system with accelerated determination of another vehicle
US10911721B2 (en) 2012-03-23 2021-02-02 Magna Electronics Inc. Vehicle vision system with accelerated determination of an object of interest
US20130253815A1 (en) * 2012-03-23 2013-09-26 Institut Francais Des Sciences Et Technologies Des Transports, De L'amenagement System of determining information about a path or a road vehicle
US10609335B2 (en) 2012-03-23 2020-03-31 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
US11769335B2 (en) 2012-05-18 2023-09-26 Magna Electronics Inc. Vehicular rear backup system
US20150131086A1 (en) * 2012-05-18 2015-05-14 Denso Corporation Traveling environment detection device
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US9372112B2 (en) * 2012-05-18 2016-06-21 Denso Corporation Traveling environment detection device
US11308718B2 (en) 2012-05-18 2022-04-19 Magna Electronics Inc. Vehicular vision system
US10515279B2 (en) 2012-05-18 2019-12-24 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US10922563B2 (en) 2012-05-18 2021-02-16 Magna Electronics Inc. Vehicular control system
US11508160B2 (en) 2012-05-18 2022-11-22 Magna Electronics Inc. Vehicular vision system
WO2013187829A1 (en) * 2012-06-11 2013-12-19 Scania Cv Ab Warning system
WO2013187828A1 (en) * 2012-06-11 2013-12-19 Scania Cv Ab Warning system
US9690296B1 (en) 2012-06-20 2017-06-27 Waymo Llc Avoiding blind spots of other vehicles
US10591919B1 (en) 2012-06-20 2020-03-17 Waymo Llc Avoiding blind spots of other vehicles
US11868133B1 (en) 2012-06-20 2024-01-09 Waymo Llc Avoiding blind spots of other vehicles
US8874267B1 (en) * 2012-06-20 2014-10-28 Google Inc. Avoiding blind spots of other vehicles
US9180882B1 (en) * 2012-06-20 2015-11-10 Google Inc. Avoiding blind spots of other vehicles
US9090205B2 (en) * 2012-06-26 2015-07-28 Honda Motor Co., Ltd. Light distribution controller
US20130343071A1 (en) * 2012-06-26 2013-12-26 Honda Motor Co., Ltd. Light distribution controller
US9340227B2 (en) 2012-08-14 2016-05-17 Magna Electronics Inc. Vehicle lane keep assist system
US9761142B2 (en) 2012-09-04 2017-09-12 Magna Electronics Inc. Driver assistant system using influence mapping for conflict avoidance path determination
US11663917B2 (en) 2012-09-04 2023-05-30 Magna Electronics Inc. Vehicular control system using influence mapping for conflict avoidance path determination
US10115310B2 (en) 2012-09-04 2018-10-30 Magna Electronics Inc. Driver assistant system using influence mapping for conflict avoidance path determination
US10733892B2 (en) 2012-09-04 2020-08-04 Magna Electronics Inc. Driver assistant system using influence mapping for conflict avoidance path determination
US9000950B2 (en) 2012-11-13 2015-04-07 International Business Machines Corporation Managing vehicle detection
US9481344B2 (en) 2012-11-19 2016-11-01 Magna Electronics Inc. Braking control system for vehicle
US10104298B2 (en) 2012-11-19 2018-10-16 Magna Electronics Inc. Vehicle vision system with enhanced display functions
US10023161B2 (en) 2012-11-19 2018-07-17 Magna Electronics Inc. Braking control system for vehicle
US9743002B2 (en) 2012-11-19 2017-08-22 Magna Electronics Inc. Vehicle vision system with enhanced display functions
US9090234B2 (en) 2012-11-19 2015-07-28 Magna Electronics Inc. Braking control system for vehicle
US10321064B2 (en) 2012-11-19 2019-06-11 Magna Electronics Inc. Vehicular vision system with enhanced display functions
US9128354B2 (en) 2012-11-29 2015-09-08 Bendix Commercial Vehicle Systems Llc Driver view adapter for forward looking camera
US10025994B2 (en) 2012-12-04 2018-07-17 Magna Electronics Inc. Vehicle vision system utilizing corner detection
US9481301B2 (en) 2012-12-05 2016-11-01 Magna Electronics Inc. Vehicle vision system utilizing camera synchronization
US9912841B2 (en) 2012-12-05 2018-03-06 Magna Electronics Inc. Vehicle vision system utilizing camera synchronization
US10873682B2 (en) 2012-12-05 2020-12-22 Magna Electronics Inc. Method of synchronizing multiple vehicular cameras with an ECU
US10560610B2 (en) 2012-12-05 2020-02-11 Magna Electronics Inc. Method of synchronizing multiple vehicular cameras with an ECU
US10171709B2 (en) 2012-12-05 2019-01-01 Magna Electronics Inc. Vehicle vision system utilizing multiple cameras and ethernet links
CN103910062A (en) * 2013-01-02 2014-07-09 波音公司 Automated Water Drop Measurement and Ice Detection System
US9227733B2 (en) * 2013-01-02 2016-01-05 The Boeing Company Automated water drop measurement and ice detection system
JP2014131906A (en) * 2013-01-02 2014-07-17 Boeing Co Automated water drop measurement and ice detection system
US20140184789A1 (en) * 2013-01-02 2014-07-03 The Boeing Company Automated Water Drop Measurement and Ice Detection System
US20140211007A1 (en) * 2013-01-28 2014-07-31 Fujitsu Ten Limited Object detector
US9811741B2 (en) * 2013-01-28 2017-11-07 Fujitsu Ten Limited Object detector
US10803744B2 (en) 2013-02-04 2020-10-13 Magna Electronics Inc. Vehicular collision mitigation system
US10523904B2 (en) 2013-02-04 2019-12-31 Magna Electronics Inc. Vehicle data recording system
US9824285B2 (en) 2013-02-04 2017-11-21 Magna Electronics Inc. Vehicular control system
US9318020B2 (en) 2013-02-04 2016-04-19 Magna Electronics Inc. Vehicular collision mitigation system
US9563809B2 (en) 2013-02-04 2017-02-07 Magna Electronics Inc. Vehicular vision system
US11798419B2 (en) 2013-02-04 2023-10-24 Magna Electronics Inc. Vehicular collision mitigation system
US11012668B2 (en) 2013-02-04 2021-05-18 Magna Electronics Inc. Vehicular security system that limits vehicle access responsive to signal jamming detection
US9092986B2 (en) 2013-02-04 2015-07-28 Magna Electronics Inc. Vehicular vision system
US10497262B2 (en) 2013-02-04 2019-12-03 Magna Electronics Inc. Vehicular collision mitigation system
US9688200B2 (en) 2013-03-04 2017-06-27 Magna Electronics Inc. Calibration system and method for multi-camera vision system
US10027930B2 (en) 2013-03-29 2018-07-17 Magna Electronics Inc. Spectral filtering for vehicular driver assistance systems
US9545921B2 (en) 2013-04-10 2017-01-17 Magna Electronics Inc. Collision avoidance system for vehicle
US11718291B2 (en) 2013-04-10 2023-08-08 Magna Electronics Inc. Vehicular collision avoidance system
US10875527B2 (en) 2013-04-10 2020-12-29 Magna Electronics Inc. Collision avoidance system for vehicle
US11485358B2 (en) 2013-04-10 2022-11-01 Magna Electronics Inc. Vehicular collision avoidance system
US9802609B2 (en) 2013-04-10 2017-10-31 Magna Electronics Inc. Collision avoidance system for vehicle
US9327693B2 (en) 2013-04-10 2016-05-03 Magna Electronics Inc. Rear collision avoidance system for vehicle
US10207705B2 (en) 2013-04-10 2019-02-19 Magna Electronics Inc. Collision avoidance system for vehicle
US10232797B2 (en) 2013-04-29 2019-03-19 Magna Electronics Inc. Rear vision system for vehicle with dual purpose signal lines
US11025859B2 (en) 2013-06-10 2021-06-01 Magna Electronics Inc. Vehicular multi-camera vision system using coaxial cables with bidirectional data transmission
US11792360B2 (en) 2013-06-10 2023-10-17 Magna Electronics Inc. Vehicular vision system using cable with bidirectional data transmission
US11533452B2 (en) 2013-06-10 2022-12-20 Magna Electronics Inc. Vehicular multi-camera vision system using coaxial cables with bidirectional data transmission
US11290679B2 (en) 2013-06-10 2022-03-29 Magna Electronics Inc. Vehicular multi-camera vision system using coaxial cables with bidirectional data transmission
US10567705B2 (en) 2013-06-10 2020-02-18 Magna Electronics Inc. Coaxial cable with bidirectional data transmission
US20160163199A1 (en) * 2013-06-19 2016-06-09 Magna Electronics Inc. Vehicle vision system with collision mitigation
US10692380B2 (en) 2013-06-19 2020-06-23 Magna Electronics Inc. Vehicle vision system with collision mitigation
US9824587B2 (en) * 2013-06-19 2017-11-21 Magna Electronics Inc. Vehicle vision system with collision mitigation
US9260095B2 (en) 2013-06-19 2016-02-16 Magna Electronics Inc. Vehicle vision system with collision mitigation
US11572017B2 (en) 2013-06-21 2023-02-07 Magna Electronics Inc. Vehicular vision system
US11247609B2 (en) 2013-06-21 2022-02-15 Magna Electronics Inc. Vehicular vision system
US10946798B2 (en) 2013-06-21 2021-03-16 Magna Electronics Inc. Vehicle vision system
US10718624B2 (en) 2013-06-24 2020-07-21 Magna Electronics Inc. Vehicular parking assist system that determines a parking space based in part on previously parked spaces
US10222224B2 (en) 2013-06-24 2019-03-05 Magna Electronics Inc. System for locating a parking space based on a previously parked space
US20150042799A1 (en) * 2013-08-07 2015-02-12 GM Global Technology Operations LLC Object highlighting and sensing in vehicle image display systems
US10326969B2 (en) 2013-08-12 2019-06-18 Magna Electronics Inc. Vehicle vision system with reduction of temporal noise in images
US20150066345A1 (en) * 2013-08-28 2015-03-05 Elwha Llc Vehicle collision management system responsive to user-selected preferences
US11084426B2 (en) 2013-09-24 2021-08-10 Rosco, Inc. Mirror monitor using two levels of reflectivity and transmissibility
US11518308B2 (en) 2013-09-24 2022-12-06 Rosco, Inc. Convex rearview mirror and monitor with reversible back/socket mount
US10640046B1 (en) 2013-09-24 2020-05-05 Rosco, Inc. Convex rearview mirror and monitor with reversible back/socket mount
US10870395B2 (en) 2013-09-24 2020-12-22 Rosco, Inc. Convex rearview mirror and monitor with reversible back/socket mount
US10449902B1 (en) 2013-09-24 2019-10-22 Rosco, Inc. Mirror monitor using two levels of reflectivity and transmissibility
US11712998B2 (en) 2013-09-24 2023-08-01 Rosco, Inc. Mirror monitor using two levels of reflectivity and transmissibility
US11919450B2 (en) 2013-09-24 2024-03-05 Rosco, Inc. Convex rearview mirror and monitor with reversible back/socket mount
US20150109444A1 (en) * 2013-10-22 2015-04-23 GM Global Technology Operations LLC Vision-based object sensing and highlighting in vehicle image display systems
US20150138361A1 (en) * 2013-11-15 2015-05-21 Denso Corporation Lane departure warning apparatus
US20150156383A1 (en) * 2013-12-04 2015-06-04 Magna Electronics Inc. Vehicle vision system with camera having liquid lens optic
US10137892B2 (en) 2013-12-05 2018-11-27 Magna Electronics Inc. Vehicle monitoring system
US10870427B2 (en) 2013-12-05 2020-12-22 Magna Electronics Inc. Vehicular control system with remote processor
US9499139B2 (en) 2013-12-05 2016-11-22 Magna Electronics Inc. Vehicle monitoring system
US11618441B2 (en) 2013-12-05 2023-04-04 Magna Electronics Inc. Vehicular control system with remote processor
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control
US10688993B2 (en) 2013-12-12 2020-06-23 Magna Electronics Inc. Vehicle control system with traffic driving control
US9216689B2 (en) * 2013-12-16 2015-12-22 Honda Motor Co., Ltd. Fail-safe mirror for side camera failure
US20150165975A1 (en) * 2013-12-16 2015-06-18 Honda Motor Co., Ltd. Fail-safe mirror for side camera failure
DE102015202846A1 (en) 2014-02-19 2015-08-20 Magna Electronics, Inc. Vehicle vision system with display
US9776567B2 (en) * 2014-03-06 2017-10-03 Panasonic Intellectual Property Management Co., Ltd. Display control device, display device, display control method, and non-transitory storage medium
US20150251600A1 (en) * 2014-03-06 2015-09-10 Panasonic Intellectual Property Management Co., Ltd. Display control device, display device, display control method, and non-transitory storage medium
US9950707B2 (en) 2014-04-02 2018-04-24 Magna Electronics Inc. Method for controlling a vehicle in accordance with parameters preferred by an identified driver
US11565690B2 (en) 2014-04-02 2023-01-31 Magna Electronics Inc. Vehicular driving assistance system that controls a vehicle in accordance with parameters preferred by an identified driver
US9623878B2 (en) 2014-04-02 2017-04-18 Magna Electronics Inc. Personalized driver assistance system for vehicle
US11130487B2 (en) 2014-04-02 2021-09-28 Magna Electronics Inc. Method for controlling a vehicle in accordance with parameters preferred by an identified driver
US11170227B2 (en) 2014-04-08 2021-11-09 Bendix Commercial Vehicle Systems Llc Generating an image of the surroundings of an articulated vehicle
US9921585B2 (en) 2014-04-30 2018-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Detailed map format for autonomous driving
US10118614B2 (en) * 2014-04-30 2018-11-06 Toyota Motor Engineering & Manufacturing North America, Inc. Detailed map format for autonomous driving
US20160257307A1 (en) * 2014-04-30 2016-09-08 Toyota Motor Engineering & Manufacturing North America, Inc. Detailed map format for autonomous driving
US20170069212A1 (en) * 2014-05-21 2017-03-09 Yazaki Corporation Safety Confirmation Assist Device
US10339809B2 (en) * 2014-05-21 2019-07-02 Yazaki Corporation Safety confirmation assist device
US20150343947A1 (en) * 2014-05-30 2015-12-03 State Farm Mutual Automobile Insurance Company Systems and Methods for Determining a Vehicle is at an Elevated Risk for an Animal Collision
US10166916B2 (en) * 2014-05-30 2019-01-01 State Farm Mutual Automobile Insurance Company Systems and methods for determining a vehicle is at an elevated risk for an animal collision
US10192445B1 (en) 2014-05-30 2019-01-29 State Farm Mutual Automobile Insurance Company Systems and methods for determining a vehicle is at an elevated risk for an animal collision
US10417914B1 (en) 2014-05-30 2019-09-17 State Farm Mutual Automobile Insurance Company Systems and methods for determining a vehicle is at an elevated risk for an animal collision
US10525883B2 (en) 2014-06-13 2020-01-07 Magna Electronics Inc. Vehicle vision system with panoramic view
US10899277B2 (en) 2014-06-13 2021-01-26 Magna Electronics Inc. Vehicular vision system with reduced distortion display
US20170158132A1 (en) * 2014-07-22 2017-06-08 Denso Corporation Vehicular display control apparatus
US10040395B2 (en) * 2014-07-22 2018-08-07 Denso Corporation Vehicular display control apparatus
US10444346B2 (en) * 2014-07-25 2019-10-15 Robert Bosch Gmbh Method for migrating radar sensor limitations with video camera input for active braking for pedestrians
US11472338B2 (en) 2014-09-15 2022-10-18 Magna Electronics Inc. Method for displaying reduced distortion video images via a vehicular vision system
US11572065B2 (en) 2014-09-17 2023-02-07 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US11787402B2 (en) 2014-09-17 2023-10-17 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US9925980B2 (en) 2014-09-17 2018-03-27 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US11198432B2 (en) 2014-09-17 2021-12-14 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US10591311B2 (en) * 2014-09-18 2020-03-17 Bayerische Motoren Werke Aktiengesellschaft Method, device, system, and computer program product for displaying driving route section factors influencing a vehicle
US10354155B2 (en) 2014-11-21 2019-07-16 Magna Electronics Inc. Vehicle vision system with multiple cameras
US10127463B2 (en) 2014-11-21 2018-11-13 Magna Electronics Inc. Vehicle vision system with multiple cameras
US10504237B2 (en) * 2014-12-17 2019-12-10 Bayerische Motoren Werke Aktiengesellschaft Method for determining a viewing direction of a person
US10482684B2 (en) 2015-02-05 2019-11-19 Uber Technologies, Inc. Programmatically determining location information in connection with a transport service
US11080944B2 (en) 2015-02-05 2021-08-03 Uber Technologies, Inc. Programmatically determining location information in connection with a transport service
US11605246B2 (en) 2015-02-05 2023-03-14 Uber Technologies, Inc. Programmatically determining location information in connection with a transport service
US10286855B2 (en) 2015-03-23 2019-05-14 Magna Electronics Inc. Vehicle vision system with video compression
US10023118B2 (en) * 2015-03-23 2018-07-17 Magna Electronics Inc. Vehicle vision system with thermal sensor
US20160280133A1 (en) * 2015-03-23 2016-09-29 Magna Electronics Inc. Vehicle vision system with thermal sensor
US20180086346A1 (en) * 2015-04-03 2018-03-29 Denso Corporation Information presentation apparatus
US10723264B2 (en) * 2015-04-03 2020-07-28 Denso Corporation Information presentation apparatus
US10650617B2 (en) 2015-04-13 2020-05-12 Arity International Limited Automatic crash detection
US11107303B2 (en) 2015-04-13 2021-08-31 Arity International Limited Automatic crash detection
US9650007B1 (en) 2015-04-13 2017-05-16 Allstate Insurance Company Automatic crash detection
US10083550B1 (en) 2015-04-13 2018-09-25 Allstate Insurance Company Automatic crash detection
US9767625B1 (en) 2015-04-13 2017-09-19 Allstate Insurance Company Automatic crash detection
US11074767B2 (en) 2015-04-13 2021-07-27 Allstate Insurance Company Automatic crash detection
US10083551B1 (en) 2015-04-13 2018-09-25 Allstate Insurance Company Automatic crash detection
US10223843B1 (en) 2015-04-13 2019-03-05 Allstate Insurance Company Automatic crash detection
US9916698B1 (en) 2015-04-13 2018-03-13 Allstate Insurance Company Automatic crash detection
US20180137698A1 (en) * 2015-04-24 2018-05-17 Pai-R Co., Ltd. Drive recorder
US10755498B2 (en) * 2015-04-24 2020-08-25 Pai-R Co., Ltd. Drive recorder
US10329827B2 (en) 2015-05-11 2019-06-25 Uber Technologies, Inc. Detecting objects within a vehicle in connection with a service
US11505984B2 (en) 2015-05-11 2022-11-22 Uber Technologies, Inc. Detecting objects within a vehicle in connection with a service
US10662696B2 (en) 2015-05-11 2020-05-26 Uatc, Llc Detecting objects within a vehicle in connection with a service
US20210001838A1 (en) * 2015-05-12 2021-01-07 SZ DJI Technology Co., Ltd. Apparatus and methods for obstacle detection
CN107531217A (en) * 2015-05-12 2018-01-02 深圳市大疆创新科技有限公司 Apparatus and method for identifying or detecting obstacles
US11697411B2 (en) * 2015-05-12 2023-07-11 SZ DJI Technology Co., Ltd. Apparatus and methods for obstacle detection
US10683006B2 (en) * 2015-05-12 2020-06-16 SZ DJI Technology Co., Ltd. Apparatus and methods for obstacle detection
US9751455B2 (en) * 2015-05-14 2017-09-05 Stanley Electric Co., Ltd. Headlight controller and vehicle headlight system
US20160332560A1 (en) * 2015-05-14 2016-11-17 Stanley Electric Co., Ltd. Headlight controller and vehicle headlight system
US10204528B2 (en) 2015-08-05 2019-02-12 Uber Technologies, Inc. Augmenting transport services using driver profiling
CN107031634A (en) * 2015-10-13 2017-08-11 沃尔沃汽车公司 Driving aid arrangement, vehicle and method of controlling a vehicle longitudinal velocity
EP3156298A1 (en) * 2015-10-13 2017-04-19 Volvo Car Corporation Driving aid arrangement, a vehicle and a method of controlling a longitudinal velocity of a vehicle
US10106160B2 (en) * 2015-10-13 2018-10-23 Volvo Car Corporation Driving aid arrangement, a vehicle and a method of controlling a longitudinal velocity of a vehicle
US20170113613A1 (en) * 2015-10-27 2017-04-27 Magna Electronics Inc. Vehicle vision system with enhanced night vision
US10875403B2 (en) * 2015-10-27 2020-12-29 Magna Electronics Inc. Vehicle vision system with enhanced night vision
US10116873B1 (en) * 2015-11-09 2018-10-30 Ambarella, Inc. System and method to adjust the field of view displayed on an electronic mirror using real-time, physical cues from the driver in a vehicle
US10144419B2 (en) 2015-11-23 2018-12-04 Magna Electronics Inc. Vehicle dynamic control system for emergency handling
US10889293B2 (en) 2015-11-23 2021-01-12 Magna Electronics Inc. Vehicular control system for emergency handling
US11618442B2 (en) 2015-11-23 2023-04-04 Magna Electronics Inc. Vehicle control system for emergency handling
US10152649B2 (en) * 2015-12-01 2018-12-11 Mobileye Vision Technologies Ltd. Detecting visual information corresponding to an animal
US20170154241A1 (en) * 2015-12-01 2017-06-01 Mobileye Vision Technologies Ltd. Detecting visual information corresponding to an animal
US20170168495A1 (en) * 2015-12-10 2017-06-15 Uber Technologies, Inc. Active light sensors for determining expected traction value of a road segment
US10712160B2 (en) 2015-12-10 2020-07-14 Uatc, Llc Vehicle traction map for autonomous vehicles
US10684361B2 (en) 2015-12-16 2020-06-16 Uatc, Llc Predictive sensor array configuration system for an autonomous vehicle
US10712742B2 (en) 2015-12-16 2020-07-14 Uatc, Llc Predictive sensor array configuration system for an autonomous vehicle
CN107054364A (en) * 2016-01-11 2017-08-18 Trw汽车股份有限公司 Control system and method for determining a road surface irregularity
US10106167B2 (en) * 2016-01-11 2018-10-23 Trw Automotive Gmbh Control system and method for determining an irregularity of a road surface
US20170242244A1 (en) * 2016-02-24 2017-08-24 L-3 Communications Corporation Transparent display with eye protection
US10955664B2 (en) * 2016-02-24 2021-03-23 L3 Technologies, Inc. Transparent display with eye protection
US10055651B2 (en) 2016-03-08 2018-08-21 Magna Electronics Inc. Vehicle vision system with enhanced lane tracking
US11756316B2 (en) 2016-03-08 2023-09-12 Magna Electronics Inc. Vehicular lane keeping system
US11288890B2 (en) 2016-03-08 2022-03-29 Magna Electronics Inc. Vehicular driving assist system
US10685243B2 (en) 2016-03-08 2020-06-16 Magna Electronics Inc. Vehicular driver assist system
US11462022B2 (en) 2016-03-09 2022-10-04 Uatc, Llc Traffic signal analysis system
US10726280B2 (en) 2016-03-09 2020-07-28 Uatc, Llc Traffic signal analysis system
US11487020B2 (en) 2016-04-26 2022-11-01 Uatc, Llc Satellite signal calibration system
US10459087B2 (en) 2016-04-26 2019-10-29 Uber Technologies, Inc. Road registration differential GPS
US9674664B1 (en) * 2016-04-28 2017-06-06 T-Mobile Usa, Inc. Mobile device in-motion proximity guidance system
US10136257B2 (en) 2016-04-28 2018-11-20 T-Mobile Usa, Inc. Mobile device in-motion proximity guidance system
US10489686B2 (en) 2016-05-06 2019-11-26 Uatc, Llc Object detection for an autonomous vehicle
US10672198B2 (en) 2016-06-14 2020-06-02 Uber Technologies, Inc. Trip termination determination for on-demand transport
WO2018004421A1 (en) * 2016-06-28 2018-01-04 Scania Cv Ab Method and control unit for a digital rear view mirror
US11050949B2 (en) 2016-06-28 2021-06-29 Scania Cv Ab Method and control unit for a digital rear view mirror
CN109415018A (en) * 2016-06-28 2019-03-01 斯堪尼亚商用车有限公司 Method and control unit for digital rearview mirror
US10719083B2 (en) 2016-07-01 2020-07-21 Uatc, Llc Perception system for autonomous vehicle
US10739786B2 (en) 2016-07-01 2020-08-11 Uatc, Llc System and method for managing submaps for controlling autonomous vehicles
US10678262B2 (en) 2016-07-01 2020-06-09 Uatc, Llc Autonomous vehicle localization using image analysis and manipulation
US10852744B2 (en) 2016-07-01 2020-12-01 Uatc, Llc Detecting deviations in driving behavior for autonomous vehicles
US10871782B2 (en) 2016-07-01 2020-12-22 Uatc, Llc Autonomous vehicle control using submaps
US10129221B1 (en) 2016-07-05 2018-11-13 Uber Technologies, Inc. Transport facilitation system implementing dual content encryption
US10491571B2 (en) 2016-07-05 2019-11-26 Uber Technologies, Inc. Computing system implementing dual content encryption for a transport service
DE102016112483A1 (en) * 2016-07-07 2018-01-11 Connaught Electronics Ltd. Method for reducing interference signals in a top view image showing a motor vehicle and a surrounding area of the motor vehicle, driver assistance system and motor vehicle
US10807533B2 (en) * 2016-07-11 2020-10-20 Lg Electronics Inc. Driver assistance apparatus and vehicle having the same
US20190291642A1 (en) * 2016-07-11 2019-09-26 Lg Electronics Inc. Driver assistance apparatus and vehicle having the same
US10902525B2 (en) 2016-09-21 2021-01-26 Allstate Insurance Company Enhanced image capture and analysis of damaged tangible objects
US11361380B2 (en) 2016-09-21 2022-06-14 Allstate Insurance Company Enhanced image capture and analysis of damaged tangible objects
US10262211B2 (en) * 2016-09-28 2019-04-16 Wipro Limited Windshield and a method for mitigating glare from a windshield of an automobile
FR3056806A1 (en) * 2016-09-28 2018-03-30 Valeo Schalter Und Sensoren Gmbh System for monitoring free space around a motor vehicle
US10775504B2 (en) 2016-09-29 2020-09-15 Honeywell International Inc. Laser air data sensor mounting and operation for eye safety
WO2018081610A1 (en) * 2016-10-27 2018-05-03 Nrg Systems, Inc. System and methods for detecting bats and avian carcasses
DE102016224905A1 (en) * 2016-12-14 2018-06-14 Conti Temic Microelectronic Gmbh Apparatus and method for fusing image data from a multi-camera system for a motor vehicle
US10824884B2 (en) 2016-12-15 2020-11-03 Conti Temic Microelectronic Gmbh Device for providing improved obstacle identification
DE102016225073A1 (en) * 2016-12-15 2018-06-21 Conti Temic Microelectronic Gmbh Device for providing an improved obstacle identification
CN109952231A (en) * 2016-12-30 2019-06-28 金泰克斯公司 Full display mirror with on-demand scout view
US20180204538A1 (en) * 2017-01-13 2018-07-19 Continental Automotive Systems, Inc. External light dimming system and method
US10371542B2 (en) 2017-02-17 2019-08-06 Uber Technologies, Inc. System and methods for performing multivariate optimizations based on location data
US11371858B2 (en) 2017-02-17 2022-06-28 Uber Technologies, Inc. System and method for performing multivariate optimizations based on location data
US10445950B1 (en) 2017-03-27 2019-10-15 Uber Technologies, Inc. Vehicle monitoring system
US10402771B1 (en) * 2017-03-27 2019-09-03 Uber Technologies, Inc. System and method for evaluating drivers using sensor data from mobile computing devices
US20180304813A1 (en) * 2017-04-20 2018-10-25 Subaru Corporation Image display device
US10919450B2 (en) * 2017-04-20 2021-02-16 Subaru Corporation Image display device
US20180320876A1 (en) * 2017-05-03 2018-11-08 Fluence Bioengineering Systems and methods for coupling a metal core PCB to a heat sink
US10208940B2 (en) * 2017-05-03 2019-02-19 Fluence Bioengineering, Inc. Systems and methods for coupling a metal core PCB to a heat sink
US11209658B2 (en) * 2017-07-06 2021-12-28 Boe Technology Group Co., Ltd. Virtual image display apparatus, vehicle comprising head-up display apparatus comprising virtual image display apparatus, and method for virtual image display
US10315496B2 (en) * 2017-08-11 2019-06-11 GM Global Technology Operations LLC Systems and methods for sun protection
CN109383241A (en) * 2017-08-11 2019-02-26 通用汽车环球科技运作有限责任公司 Systems and methods for sun protection
US10773725B1 (en) * 2017-08-25 2020-09-15 Apple Inc. Tire-road friction estimation and mapping
US10717439B2 (en) * 2017-09-15 2020-07-21 Honda Motor Co., Ltd Traveling control system and vehicle control method
US11364915B2 (en) * 2017-12-07 2022-06-21 Nissan Motor Co., Ltd. Road condition determination method and road condition determination device
US20190184985A1 (en) * 2017-12-18 2019-06-20 Stephen Tokish Active rear sense area adjustment of collision avoidance system of a vehicle when vehicle is approaching a positive road grade change
US10576975B2 (en) * 2017-12-18 2020-03-03 Fca Us Llc Active rear sense area adjustment of collision avoidance system of a vehicle when vehicle is approaching a positive road grade change
US11465559B2 (en) * 2017-12-27 2022-10-11 Denso Corporation Display processing device and display control device
DE102018201316A1 (en) * 2018-01-29 2019-08-01 Conti Temic Microelectronic Gmbh Surround view system for a vehicle
US10578717B2 (en) * 2018-01-31 2020-03-03 Honeywell International Inc. Dimmable glass for eye safety for LiDAR technology
US20190235056A1 (en) * 2018-01-31 2019-08-01 Honeywell International Inc. Dimmable glass for eye safety for lidar technology
US20210001850A1 (en) * 2018-03-01 2021-01-07 Jaguar Land Rover Limited Vehicle control method and apparatus
US20190306396A1 (en) * 2018-03-29 2019-10-03 Varroc Lighting Systems, s.r.o. Communication Device of a Motor Vehicle, a Motor Vehicle Lighting Device for the Communication Device of a Motor Vehicle and a Car2Car or Car2X Communication Method for a Motor Vehicle
US11245829B2 (en) * 2018-03-29 2022-02-08 Varroc Lighting Systems, s.r.o. Communication device of a motor vehicle, a motor vehicle lighting device for the communication device of a motor vehicle and a Car2Car or Car2X communication method for a motor vehicle
US11628844B2 (en) 2018-05-03 2023-04-18 Volvo Car Corporation System and method for providing vehicle safety distance and speed alerts under slippery road conditions
US11124193B2 (en) 2018-05-03 2021-09-21 Volvo Car Corporation System and method for providing vehicle safety distance and speed alerts under slippery road conditions
US11884279B2 (en) 2018-05-03 2024-01-30 Volvo Car Corporation System and method for providing vehicle safety distance and speed alerts under slippery road conditions
WO2020047302A1 (en) * 2018-08-29 2020-03-05 Buffalo Automation Group Inc. Lane and object detection systems and methods
CN111103753A (en) * 2018-10-09 2020-05-05 先进光电科技股份有限公司 Panoramic image system and driving assistance system
US11089215B2 (en) * 2018-10-09 2021-08-10 Ability Opto-Electronics Technology Co., Ltd. Panoramic image system and driver assistance system
TWI776025B (en) * 2018-10-09 2022-09-01 先進光電科技股份有限公司 Panorama image system and driver assistance system
US11208053B2 (en) * 2019-01-07 2021-12-28 Ability Opto Electronics Technology Co., Ltd. Movable carrier auxiliary system
JP2019132847A (en) * 2019-02-28 2019-08-08 株式会社ニコン Imaging device
US11560153B2 (en) 2019-03-07 2023-01-24 6 River Systems, Llc Systems and methods for collision avoidance by autonomous vehicles
US11205083B2 (en) 2019-04-02 2021-12-21 Magna Electronics Inc. Vehicular driver monitoring system
US11645856B2 (en) 2019-04-02 2023-05-09 Magna Electronics Inc. Vehicular driver monitoring system
US20220309923A1 (en) * 2019-04-29 2022-09-29 Qualcomm Incorporated Method and apparatus for vehicle maneuver planning and messaging
US11908327B2 (en) * 2019-04-29 2024-02-20 Qualcomm Incorporated Method and apparatus for vehicle maneuver planning and messaging
US20200353922A1 (en) * 2019-05-07 2020-11-12 Hyundai Mobis Co., Ltd. Vehicle SCC system based on complex information and method of controlling the same
US11505189B2 (en) * 2019-05-07 2022-11-22 Hyundai Mobis Co., Ltd. Vehicle SCC system based on complex information and method of controlling the same
CN113924824A (en) * 2019-05-29 2022-01-11 法雷奥照明公司 Method for operating a vehicle lighting device and vehicle lighting device
EP3785994A1 (en) * 2019-08-29 2021-03-03 Ningbo Geely Automobile Research & Development Co. Ltd. A system and method for highlighting of an object to a vehicle occupant
US20210157330A1 (en) * 2019-11-23 2021-05-27 Ha Q Tran Smart vehicle
EP3842307A1 (en) * 2019-12-27 2021-06-30 Volvo Car Corporation System and method for providing vehicle safety distance and speed alerts under slippery road conditions
US11508156B2 (en) 2019-12-27 2022-11-22 Magna Electronics Inc. Vehicular vision system with enhanced range for pedestrian detection
CN113119960A (en) * 2019-12-27 2021-07-16 沃尔沃汽车公司 System and method for providing vehicle safety distance and speed warning under slippery conditions
US11708024B2 (en) 2020-01-06 2023-07-25 Gentex Corporation Dynamic imaging system
WO2021141873A1 (en) * 2020-01-06 2021-07-15 Gentex Corporation Dynamic imaging system
WO2021144029A1 (en) * 2020-01-17 2021-07-22 Volvo Truck Corporation A cruise control system and a method for controlling a powertrain
US11824855B1 (en) 2020-02-12 2023-11-21 Uber Technologies, Inc. Computer system and device for controlling use of secure media recordings
US11868508B2 (en) 2020-02-12 2024-01-09 Uber Technologies, Inc. Computer system and device for controlling use of secure media recordings
US11494517B2 (en) 2020-02-12 2022-11-08 Uber Technologies, Inc. Computer system and device for controlling use of secure media recordings
US20230017715A1 (en) * 2020-04-17 2023-01-19 Magna Mirrors Of America, Inc. Vehicular driver monitoring system with DMS camera at interior rearview mirror assembly
US11780370B2 (en) * 2020-04-17 2023-10-10 Magna Mirrors Of America, Inc. Vehicular driver monitoring system with DMS camera at interior rearview mirror assembly
US20210323473A1 (en) * 2020-04-17 2021-10-21 Magna Mirrors Of America, Inc. Interior rearview mirror assembly with driver monitoring system
US11465561B2 (en) * 2020-04-17 2022-10-11 Magna Mirrors Of America, Inc. Interior rearview mirror assembly with driver monitoring system
US20210382560A1 (en) * 2020-06-05 2021-12-09 Aptiv Technologies Limited Methods and System for Determining a Command of an Occupant of a Vehicle

Also Published As

Publication number Publication date
US11091105B2 (en) 2021-08-17
US9509957B2 (en) 2016-11-29
US20210370855A1 (en) 2021-12-02
US20130229523A1 (en) 2013-09-05
US20170072880A1 (en) 2017-03-16

Similar Documents

Publication Publication Date Title
US20210370855A1 (en) Vehicular control system
US10807515B2 (en) Vehicular adaptive headlighting system
US10479269B2 (en) Lighting apparatus for vehicle and vehicle having the same
JP5341402B2 (en) In-vehicle display system
US10195980B2 (en) System for imaging
JP6568603B2 (en) Vehicle image display system and vehicle equipped with the image display system
JP3864406B2 (en) Vehicle display device
US8098171B1 (en) Traffic visibility in poor viewing conditions on full windshield head-up display
KR101768500B1 (en) Drive assistance apparatus and method for controlling the same
US20060151223A1 (en) Device and method for improving visibility in a motor vehicle
US10906399B2 (en) Method and system for alerting a truck driver
US20060164219A1 (en) Method and device for warning the driver of a motor vehicle
KR102372566B1 (en) Lighting apparatus for Vehicle and Vehicle
CN111196217A (en) Vehicle assistance system
JPH06255399A (en) Display device for vehicle
US20200020235A1 (en) Method, System, and Device for Forward Vehicular Vision
US20230322253A1 (en) Pedestrian Alert System
CN112334361A (en) Information display device and information display method
CN113448096A (en) Display device for vehicle
JP3363935B2 (en) Vehicle display device
JP2016515969A (en) Visual positioning bearing navigation system with direction
KR20170070566A (en) Vehicle And Control Method Thereof
JPH06255397A (en) Display device for vehicle
JP6354805B2 (en) Visibility control device
US11220209B2 (en) Bus with a safety lighting system for road users

Legal Events

Date Code Title Description
AS Assignment

Owner name: MAGNA ELECTRONICS INC., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HIGGINS-LUTHMAN, MICHAEL J.;LU, YUESHENG;REEL/FRAME:023007/0852

Effective date: 20090318

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION