US20140218529A1 - Vehicle data recording system - Google Patents

Vehicle data recording system

Info

Publication number
US20140218529A1
Authority
US
United States
Prior art keywords
vehicle
image data
cameras
camera
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/169,329
Inventor
Hossam Mahmoud
Christian Traub
Arno Krapf
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magna Electronics Inc
Original Assignee
Magna Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magna Electronics Inc filed Critical Magna Electronics Inc
Priority to US14/169,329
Publication of US20140218529A1
Priority to US16/380,438 (US10523904B2)
Priority to US16/729,800 (US11012668B2)
Legal status: Abandoned


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/65Control of camera operation in relation to power supply
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/65Control of camera operation in relation to power supply
    • H04N23/651Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing

Definitions

  • the present invention relates to imaging systems or vision systems for vehicles.
  • the present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture images exterior of the vehicle, and provides the communication/data signals, including camera data or image data that may be displayed or processed to provide the desired display images and/or processing and control, depending on the particular application of the camera and vision or imaging system.
  • the present invention provides a vehicle data recording system that is operable to record data captured by one or more cameras or image-based sensors and/or one or more other sensors or non-image based sensors of the vehicle.
  • the system of the present invention provides a triggering means to trigger or initiate data capture for a parked vehicle in a manner that captures data responsive to a triggering event indicative of a change in the scene at or around the vehicle.
  • a vision system for a vehicle includes at least one camera disposed at a vehicle equipped with the vehicle vision system.
  • the at least one camera has a field of view exterior the equipped vehicle and is operable to capture image data.
  • An image processor is operable to process image data captured by the at least one camera and a data recording device is operable to record image data captured by the at least one camera.
  • a control controls operation of the at least one camera and, responsive to a determination that the equipped vehicle is in a parked state, the control controls the at least one camera to capture frames of image data at a first capture rate. Responsive to image processing of captured image data, the control compares a frame of captured image data to at least one previous frame of captured image data.
  • responsive to that comparison indicating a threshold change, (i) the control increases the capture rate to a second capture rate, (ii) the at least one camera captures frames of image data at the second capture rate and (iii) the control activates the data recording device to record the images captured at the second capture rate.
  • the present invention periodically or episodically captures frames of image data and processes the captured image data to determine when a threshold degree of change occurs in the scene being monitored by the camera (such as when a person walks by or up to the parked vehicle).
  • the system increases the rate of capture (such as from capturing one frame every second or every five seconds or the like to capturing at least 20 frames per second or at least 30 frames per second or the like) and records the captured image data (such as captured video image data) on the data recording device.
  • the system reduces the power consumption of the parked vehicle by episodically capturing image data at a slower rate and only captures video images and activates the data recording device when the system determines that there is a threshold or significant change in the frames of image data captured by the camera or cameras.
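The two-rate capture behavior described above can be sketched as follows. This is a minimal illustration only: the class and function names, the thresholds, the rates and the flat-list frame representation are all illustrative assumptions, not from the patent.

```python
def frame_difference(frame_a, frame_b):
    """Sum of absolute per-pixel differences between two frames
    (frames are flat lists of grayscale values in this sketch)."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b))

class ParkedRecorder:
    # Illustrative values: one frame every 5 seconds while idle,
    # video rate once triggered.
    SLOW_PERIOD_S = 5.0
    FAST_FPS = 30

    def __init__(self, diff_threshold):
        self.diff_threshold = diff_threshold
        self.prev_frame = None
        self.recording = False

    def on_frame(self, frame):
        """Compare the new frame against the previous one; on a
        threshold change, switch to the fast capture rate and
        activate the data recording device."""
        if self.prev_frame is not None:
            if frame_difference(frame, self.prev_frame) >= self.diff_threshold:
                self.recording = True
        self.prev_frame = frame
        return self.recording
```

A first frame serves only as the reference; small frame-to-frame differences keep the system in the slow, power-saving mode, while a large difference latches the recording state.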
  • FIG. 1 is a plan view of a vehicle with a vision system and imaging sensors or cameras that provide exterior fields of view in accordance with the present invention
  • FIG. 2 is a chart showing the processing steps that the data recording system of the present invention undergoes at an activating time t_a;
  • FIG. 3 is a chart showing a time pattern for the cyclical wake up and sleep pattern for the data recording system of the present invention
  • FIG. 4 is an image showing the initial environmental scene at the time the vehicle is parked
  • FIG. 5 is an image showing the environmental scene at a later time after the vehicle is parked
  • FIG. 6 shows the scene or image of FIG. 4 when mosaiced in accordance with the present invention
  • FIG. 7 shows the scene or image of FIG. 5 when mosaiced in accordance with the present invention.
  • FIG. 8 shows the result when calculating the difference between the images of FIGS. 7 and 6 (the image in FIG. 8 was inverted for clarity) in accordance with the present invention
  • FIG. 9A shows the image of FIG. 8, with the brightness of some of the different pixels/areas marked by increasing numbers in accordance with the present invention;
  • FIG. 9B is a sectional cut out of the image of FIG. 9A , with the different pixels/areas enlarged in accordance with the present invention.
  • FIG. 10A is the image of FIG. 9A after a contrast enhancement
  • FIG. 10B is a cut out of the same region shown in FIG. 9B as cut out of FIG. 10A ;
  • FIGS. 11A and 11B show the same images of FIGS. 9B and 10B , respectively, with the brightness value labels adapted to the new light settings;
  • FIG. 12 is a flow chart of a decision process of the data recording system of the present invention.
  • FIG. 13 is an exemplary branch in the field of view of a sensor of the data recording system of the present invention.
  • FIG. 14 is a mosaiced difference image showing the differences that may be detected by movement of the branch in FIG. 13 ;
  • FIG. 15 is a counter table for pixel locations
  • FIGS. 16A-17B show counter tables and triggerings that may occur due to a waving branch such as that shown in FIGS. 13 and 14;
  • FIG. 18 is a cropped image of the image of FIG. 4 ;
  • FIG. 19 is a mosaiced image of the image of FIG. 18 ;
  • FIG. 20 is a chart showing sensitivity modes for the data recording system of the present invention as implemented on an electrical vehicle or the like;
  • FIG. 21 is a schematic of a jammer attack scenario, where the vehicle is equipped with a jammer detector of the present invention.
  • FIG. 22 is a schematic showing a scene where a crowd of pedestrians 90 is about to enter a road with a vehicle 22 approaching, where the pedestrians may be sensed by other vehicles and such information may be communicated to the approaching vehicle in accordance with the present invention.
  • a driver assist system and/or vision system and/or object detection system and/or alert system may operate to capture images exterior of the vehicle and process the captured image data to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction.
  • the object detection may utilize detection and analysis of moving vectors representative of objects detected in the field of view of the vehicle camera, in order to determine which detected objects are objects of interest to the driver of the vehicle, such as when the driver of the vehicle undertakes a reversing maneuver.
  • a vehicle 10 includes an imaging system or vision system 12 that includes one or more imaging sensors or cameras (such as a rearward facing imaging sensor or camera 14a and/or a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and/or sidewardly/rearwardly facing cameras 14c, 14d at the sides of the vehicle), which capture images exterior of the vehicle, with the cameras having a lens for focusing images at or onto an imaging array or imaging plane of the camera (FIG. 1).
  • the vision system 12 is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle.
  • the vision system may process image data to detect objects, such as objects to the rear of the subject or equipped vehicle during a reversing maneuver, or such as approaching or following vehicles or vehicles at a side lane adjacent to the subject or equipped vehicle or the like.
  • Vehicles are often equipped with cameras and other environmental sensors. Such cameras and/or sensors are typically deactivated or off or in a sleep mode when the vehicle is parked and turned off in order to save energy and reduce electrical power consumption. Some systems of a vehicle may stay awake when the vehicle is parked or may stay partially/intermittently/periodically awake when the vehicle is parked or may be awakable or episodically awakable, such as responsive to an external trigger or the like, such as a vehicle door lock system or vehicle alarm system or the like.
  • the vehicle data or damage recording system of the present invention uses the on board cameras and environmental sensors of an equipped vehicle to record the scene at or around the equipped vehicle at a time when (or just before) the vehicle is hit by another vehicle, animal or person.
  • the system of the present invention thus uses the camera or cameras already present on the vehicle.
  • the camera or cameras used by the recording system may be part of a multi-camera vision system or surround view system or rear backup aid system or forward facing camera system of the vehicle (and may utilize aspects of the systems described in U.S. Pat. No. 7,855,755, which is hereby incorporated herein by reference in its entirety).
  • Such use of cameras already present on the vehicle for other purposes reduces the cost of the recording system, since no dedicated cameras are needed for the recording system when the recording system is added to the vehicle.
  • the present invention thus may provide identification of the opponent or collider or the opponent's license plate or the like, such as in “hit-and-run” situations or the like.
  • the system of the vehicle may start to record the environmental scene.
  • the system may record image data captured by some or all of the vehicle installed cameras (and/or remote cameras within a communication range of the equipped vehicle (such as within an X2Car communication range), such as traffic monitoring cameras at intersections or security cameras at buildings, parking lots and/or the like).
  • outputs of other environmental sensors of the vehicle such as ultrasound sensors, capacitive proximity sensors, touch sensors, heartbeat sensors (such as sensors utilizing aspects of the sensors and systems described in U.S. Pat. No.
  • RADAR sensors, LADAR sensors, LIDAR sensors, time of flight (TOF) sensors, structured light sensors and/or infrared sensors or the like, may be engaged and may be recorded as well.
  • the trigger that the vehicle was (or imminently may be) hit or damaged may come from a vehicle alarm system, which may include or may be responsive to sensors for vibration, level changing, glass breaking, unauthorized door or ignition lock actuation or compartment ultrasound disturbances and/or the like.
  • the data recording system of the present invention may include or utilize a sensing system that uses capacitive sensors, which may be installed in or at the rear bumper of the vehicle and/or the door handles of the vehicle (and typically operable to detect a person's hand at the door handle to open the vehicle door), but are not exclusively for triggering the recording system or the vehicle alarm system.
  • the trigger may be achieved by the image processing devices of the vehicle that are operable to process image data captured by the vehicle camera or cameras.
  • image processing devices may take reference images from the environmental scene at the time the vehicle is parked, such as shown in FIG. 4 .
  • the object detection may determine non-moving or steady objects that are within detection range or field of view of the camera or cameras.
  • the image data representative of the scene and/or the images of the scene may be stored as an initial reference (such as, for example, the image shown in FIG. 4 ).
  • Other sensor data such as the distance or level (to anything in the vehicle's environment) detected by different ultrasound sensors may be stored as well.
  • the system may operate to wake up in a cyclical pattern.
  • FIG. 3 shows a typical time pattern for the cyclical wake up and sleep pattern for the system of the present invention.
  • the inactive phases are meant to be much longer than the active phases.
  • the system may reduce or shut off as much power consumption as possible during the inactive time phases.
  • the life time of some components may be reduced due to excessive restarting or rebooting.
  • the initialization time of some image processing functional components may be comparatively long, so a shorter shut off time period may not be suitable.
  • the imager, controllers and static memory components may suffer the most.
  • the present invention preferably provides a good or acceptable balance by keeping the imager operating and avoiding the need of the processing unit except to run cyclically for the recording system, mostly on a field-programmable gate array (FPGA).
  • the data recording system may comprise one or more excitation status flags or state machines, the states of which equate to elapsed time, the environmental input and battery charge status.
  • the system may enter an active or awake time phase t a , and may activate the vehicle cameras, the environmental sensors and the image processing device such as in the example shown in FIG. 2 .
  • the processing steps may be:
  • the data captured may be stored in a FIFO memory, in which the oldest part of the lapse video and optionally other sensor data may be overwritten by the newer data in cases where no vehicle alarm occurs, or alternatively the captured data may be stored in a memory device (in all cases), and the system may periodically back up the stored data and/or may transfer the stored data from a local memory device (such as a vision system or vehicle inherent memory device, such as a flash memory or solid state drive, which may be exchangeable by the vehicle owner) to a remote or external memory device (such as via a telematics system or other communication or data transfer system).
  • the captured images or video and optionally other sensor data may be stored/transferred in a compressed data format or as RAW or may be stored in RAW locally and transferred compressed or may be stored compressed locally and transferred in RAW.
  • the system may store images in the area or areas of moving objects and/or regions of interest in a high definition and/or uncompressed format, and the system may store images in other areas or parts of less interest at or surrounding the vehicle in a low definition or compressed format for shrinking the data size for storing or transmission.
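The FIFO pre-event buffering described above (oldest data overwritten until an alarm freezes the history) can be sketched with a bounded deque; the class name, capacity and frame representation are illustrative assumptions.

```python
from collections import deque

class PreEventBuffer:
    """Minimal sketch of the FIFO memory: new frames overwrite the
    oldest ones until an alarm occurs, at which point the buffered
    history is frozen for backup or transfer."""

    def __init__(self, capacity):
        # deque with maxlen drops the oldest entry on overflow,
        # giving the FIFO overwrite behavior.
        self.frames = deque(maxlen=capacity)

    def push(self, frame):
        self.frames.append(frame)

    def freeze(self):
        """Return the buffered history (oldest first), e.g. for
        transfer to a remote memory device via telematics."""
        return list(self.frames)
```

With a capacity of three, pushing five frames leaves only the three most recent in the buffer.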
  • the backup and storage modes or means may be customizable by the vision system manufacturer, the OEM, the dealership, third party services or the owner of the vehicle, and the system may communicate a message or alert (optionally a text message or the like or a video message or still photo or image message or the like) to the driver or owner of the vehicle (such as to the driver's cell phone or PDA or the like) in response to a triggering event occurring.
  • viewing programs or apps may be provided by the vehicle or vision system manufacturer or vendors or by aftermarket distributors. These may provide additional services, such as a function to automatically guide the police to the vehicle at the driver's request (such as via his or her cell phone or PDA program or app).
  • An object detection (OD) algorithm or feature extractor may also be employed for comparing whether objects have entered the scene (such as shown in FIG. 5, where two people are walking by the vehicle) that were not present earlier (see FIG. 4) or which have moved since the last wake up phase.
  • there are various classifiers in use, such as 'Viola Jones', 'Fastest Pedestrian Detector in the West' or HOG (Histogram of Oriented Gradients) or the like. This is the 'positive' or changed case, which leads to a sensor wake up mode according to state (9) in FIG.
  • the control increases the frequency or frame capture rate of the camera (reduces the time period between frame captures), such as to greater than or equal to about ten fps or about twenty fps or about thirty fps or any other selected rate or frequency or capture rate, and activates or controls the recording device to record the captured image data and store the captured image data.
  • a simple and practical embodiment of an OD may be to use an image difference procedure.
  • the images may be additionally processed by a mosaicing filter, a Gaussian filter or a box filter or the like (see step 5 in FIG. 2 ) before difference calculating (see step 7 in FIG. 2 ).
  • FIG. 6 shows the scene or image of FIG. 4 when mosaiced
  • FIG. 7 shows the scene or image of FIG. 5 when mosaiced.
  • FIG. 8 shows the result when calculating the difference between the images of FIGS. 7 and 6 (for better readability herein, the image in FIG. 8 was inverted; the feature information within the images is not changed by that).
  • In FIG. 9A, the brightness of some of the interesting pixels/areas is marked by increasing numbers, with the greater numbers applied at the pixels or areas that have a greater difference between the initial image (FIGS. 4 and 6) and the later image (FIGS. 5 and 7).
  • FIG. 9B is a sectional cut out of that same scene, showing the difference area enlarged.
  • FIG. 10A is the image of FIG. 9A after a contrast enhancement (alternatively a histogram or cut filtering may be used), and FIG. 10B is a cut out of the same region shown in FIG. 9B as cut out of FIG. 10A , showing the difference area enlarged.
  • the contrast enhancement, histogram or cut filtering may include the difference calculating step (see step 7 in FIG. 2).
  • FIGS. 11A and 11B show the same scene of FIGS. 9A, 9B and 10A, 10B, respectively, with the brightness value labels adapted to the new light settings.
  • the system may assume that something has changed in the scene from the image in FIG. 4 to the image in FIG. 5 .
  • the system thus may assume a motion has happened or light conditions have abruptly changed.
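The mosaicing and difference calculation described above (steps 5 and 7 in FIG. 2) can be sketched roughly as follows. The block size, the pure-Python list-of-rows image representation and the function names are illustrative assumptions, not from the patent; a real system would likely use a box or Gaussian filter on the imager or FPGA.

```python
def mosaic(image, block):
    """Reduce a 2-D grayscale image (list of rows) to block averages:
    each block x block tile becomes a single value."""
    h, w = len(image), len(image[0])
    out = []
    for by in range(0, h, block):
        row = []
        for bx in range(0, w, block):
            tile = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            row.append(sum(tile) // len(tile))  # integer block average
        out.append(row)
    return out

def difference(mosaic_a, mosaic_b):
    """Absolute per-tile difference between two mosaics (the inverted
    rendering in FIG. 8 is purely cosmetic and omitted here)."""
    return [[abs(a - b) for a, b in zip(ra, rb)]
            for ra, rb in zip(mosaic_a, mosaic_b)]
```

Comparing the mosaic of the reference scene to the mosaic of a later frame yields a difference grid in which large values mark the tiles where something changed.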
  • the system may enter a power saving sleep state (again), and this equates to the path following ‘no’ from the decision block of the flow chart in FIG. 12 .
  • the data recording system may record a fast motion flic or video responsive to a trigger or indication that a threshold difference was determined during the comparison of a frame of captured image data to the previous frame or initial frame of captured image data. This equates to the path following ‘yes’ from the decision block of the flow chart in FIG. 12 .
  • the decision block of FIG. 12 requires a minimum number of differing mosaics d_x,y to limit or substantially preclude triggering responsive to image noise. There may be environmental scenes in which the system becomes disturbed and/or awakened quite often.
  • the data recording system may engage the peripheral sensors, the cameras and the data buses very often, which may substantially drain the vehicle's battery, with the consequence that the system's or vehicle's battery power management would have to turn off the vehicle data recording system entirely, and possibly very soon, such as within or after about two hours (of being awake very often or all the time).
  • the data recording system of the present invention may be able to distinguish or sort out or suppress the wake up (input-) events or sources.
  • the branch or branches may be detected as movement of an object within two (or more) consecutive cyclically captured image data sets by the difference comparison.
  • the branch shown in FIG. 13 may appear as mosaiced difference image like the image shown in FIG. 14 .
  • the system may be operable to identify that detection area as being the source for false triggering (wake up) by statistical means.
  • the system may have an array of memory in the same size ratio as the pixels on the imager, or the same size ratio as the captured mosaic image (see the example shown in FIG. 15, having an array of 0x0F by 0x0F equating to the mosaic areas), called 'd_x,y', with the size X by Y, in the flow chart of FIG. 12.
  • for each pixel or mosaic there may be a value (such as, for example, four bits or the like) storing the disturbance counter 'd' event history of being a pixel or mosaic 'p' differing at the image data difference comparison p_x,y(t_n) − p_x,y(t_n-1), when comparing from p_0,0 (upper left corner of the array) to p_x,y (lower right corner of the array).
  • two consecutive triggerings that may occur due to a waving branch, such as that shown in FIGS. 13 and 14, may influence the disturbance counters d_x,y as illustrated in FIGS. 16A and 17A, with the resulting disturbance counter table entries such as those shown in FIGS. 16B and 17B accordingly.
  • the data recording system may include a certain event history counter level (which may be fixed or adjusted by an algorithm), such as, for example, 6 (or more or less), which leads to the consequence that the corresponding pixel will be ignored as a wake up trigger source until falling under the counter level borderline again.
  • the system may include an algorithm that may decrease the event history counter level (of one, several or all pixels). This may happen by time, by trigger events or by other difference pixel statistical means.
  • there may be a threshold 'tr' of the disturbance counter 'd' (equal for all 'p') with the exemplary value 6.
  • the pixel or mosaic may be ignored by setting its masking bit m_x,y to zero.
  • the specific areas become ignored in determining the triggering.
  • the area (x,y positions of d_x,y) in the image covered by the waving branch would soon have several positions which exceed the value 5, which would result in the system ignoring that area by masking its pixels or mosaics.
  • detection of the moving branch would not trigger activation of the video image data capture.
  • another procedure may be to decrease the disturbance event history counter. This is for re-enabling areas which have been masked earlier when their disturbing ratio is diminishing.
  • the waving branch may stop waving due to the wind lessening.
  • the disturbance counter d_x,y may be diminished by one tenth on every disturbance event in which a specific pixel or mosaic is not participating (while remaining greater than or equal to 0).
  • Another divisor 'r' counts the events and how often a d_x,y has changed from above the threshold 'tr' to below the threshold. The default and minimum of 'r' is 1.
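The per-tile disturbance counters and masking described above can be sketched as follows. This is a simplified illustration: the class name is an assumption, and the decay policy here is a plain decrement per quiet event rather than the one-tenth diminishing and divisor 'r' bookkeeping described in the text.

```python
class DisturbanceFilter:
    """Sketch of the disturbance counters d[y][x] with threshold 'tr'
    (6 in the patent's example): a tile that differs too often is
    masked out as a false-trigger source (like a waving branch), and
    its counter decays when it stops differing, re-enabling it."""

    def __init__(self, width, height, tr=6):
        self.tr = tr
        self.d = [[0] * width for _ in range(height)]

    def update(self, diff_tiles):
        """diff_tiles: 2-D boolean grid, True where the mosaic
        difference exceeded its per-tile threshold. Returns the set of
        (x, y) tiles still allowed to act as wake-up trigger sources."""
        triggers = set()
        for y, row in enumerate(diff_tiles):
            for x, differing in enumerate(row):
                if differing:
                    self.d[y][x] += 1
                    if self.d[y][x] < self.tr:   # masked once d reaches tr
                        triggers.add((x, y))
                elif self.d[y][x] > 0:
                    self.d[y][x] -= 1            # decay re-enables the tile
        return triggers
```

A tile that keeps differing is reported as a trigger only until its counter reaches the threshold; quiet cycles then wind the counter back down.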
  • the supervised scenery around the equipped vehicle may be limited by cropping or masking the captured images by a static mask.
  • FIG. 18 shows a cropped image of FIG. 4 , with its mosaiced image shown in FIG. 19 .
  • the ultrasound sensors of the vehicle may detect entering and leaving objects (pedestrians) very often (leading to wake up events triggering the cameras to capture the scene), and the system may decide to have a higher robustness to disturbances due to such an (input-) event source. This may be done by changing the inputs' priority level for wake up or by employing a counter or sensitivity state machine or algorithm.
  • the sensitivity may be counted up (less sensitive) when a trigger occurs by a specific input source (or input sensor type class or such). Substantial triggering will lead to a high level of insensitivity.
  • the counter may be counted down over time (such as, for example, one increment every hour for a four bit counter).
  • one or all counters may be reset by a vehicle damage alert event directly followed by an according wake up trigger event.
  • the system may employ a learning algorithm in which the initial counter level may lead to an enhanced or optimal balance of system awareness/inactivity and battery life. The system thus may increase the threshold degree of differences that need to be determined before triggering the system, such that a greater difference or change over time is determined before the system will be awakened or triggered.
  • the system may be switched off, whereby the cameras may be activated when the vehicle alert or security system engages, such as when triggered by an actually detected hit.
  • when the vehicle is plugged in, the data recording system may run or operate in more sensitive (more power consuming) modes as compared to when the vehicle is unplugged. Both are shown in the example of FIG. 20.
  • the activity percentage relates to the cycle times, with less activity equating to longer times of t_a + t_i and higher activity equating to shorter times, adjusted via t_a, t_i. 100% may mean that the system has a very short cycle time or even never turns into saving mode but records the scene all the time.
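The activity percentage relation above reduces to a simple ratio of active time to total cycle time; the function name and the example durations below are illustrative assumptions.

```python
def activity_percent(t_active_s, t_inactive_s):
    """Fraction of each wake/sleep cycle spent active, in percent:
    activity = t_a / (t_a + t_i). With t_i = 0 the system never
    enters the saving mode (100% activity)."""
    return 100.0 * t_active_s / (t_active_s + t_inactive_s)
```

For example, one second awake followed by four seconds asleep gives 20% activity, and a zero inactive phase gives the always-recording 100% case.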
  • the system may include features to increase the use cases, convenience and security of the system.
  • the system may be operable to, at the time a data recording alert is detected, send a text message over a mobile channel or a captured movie via UMTS, LTE or the like (to alert the owner of the vehicle of a triggering event of the data recording system), such as by utilizing aspects of the display and data communication system described in International Publication No. WO 2013/081985, which is hereby incorporated herein by reference in its entirety.
  • the image, images or movie or data set captured by the system responsive to a triggering event may be stored at a remote server for accessibility by the vehicle's owner, or may be stored at a removable storage device such as an SD card or the like, which may be removed from the vehicle and taken away by the vehicle owner.
  • the vehicle owner may be able to enter settings for backup paths, cycles, modes, trigger sensitivity modes and/or power management modes and/or the like for the vehicle/car damage recording system.
  • the system may provide specified set up modes adapting the system to the local laws and legal modalities. For example, the system may blur the faces in its records automatically if required by the local laws (or if elected by user set up).
  • the actual vehicle location may come from a GPS receiver via a CAN bus of the vehicle or the like.
  • the system may trigger other systems or services, such as an alert system that notifies or calls the police, a parking lot guard, security service or the like, responsive to the vehicle being damaged or hit (or responsive to a triggering event).
  • the system may detect (via image capture by a camera of the system or vehicle), store and transmit the image of the license plate (or information derived from image processing of image data captured by the camera) of a hit and run vehicle, and may additionally transmit such information or image data to the called parking lot guard, security service or police (officer or station).
  • the vehicle data recording system may be part of a vehicle alarm system, a crash black box, a vehicle system integrity device (or system), a vehicle vision system or remote vehicle security surveillance (service) system and/or the like.
  • the vehicle alert system may employ a known (commodity) jammer detector (such as shown, for example, at http://www.shop-alarm.de/GPS_-_GSM_-_WIFI_-_DECT_-_Bluetooth_Stoersender_und_Jammer_Detector.html), such as for detecting jamming of any data transmission band the vehicle is using (GPRS, GSM, WLAN, DECT, WEP, UMTS, LTE and near field communication).
  • Jammers may be misused by vehicle thieves to overcome vehicle security systems or to lengthen the time they remain undiscovered, especially by suppressing the outbound transmission of vehicle alerts, images or video clips to the owner or the police as described above.
  • Jammer detectors are able to detect the presence of jammers 40 in range of the vehicle 10 ( FIG. 21 ).
  • the behavior of the vehicle alert system of the present invention responsive to the detected presence of a jamming device 40 in range may be to disable all usual vehicle access functionalities. That mode may be called a ‘safe one time access mode.’
  • ‘Keyless Entry/Go’ may become disabled, the vehicle key remote control opening may become disabled, and vehicle entry by turning the vehicle key in the door key hole may be disabled as well.
  • the door dead bolts, when present, may be turned to lock if not already locked.
  • FIG. 21 illustrates such a scenario. As shown in FIG. 21, the vehicle may be equipped with such a jammer detector, whereby, if a GSM signal between a GSM tower 50 and the vehicle 10 is jammed (such as by any means or device for jamming or blocking or interfering with such signals), such as by a jammer device 40 at or near the vehicle, the vehicle owner's smart phone (or tablet) or other personal remote communication device may generate the message ‘vehicle signal lost’ to the user or vehicle owner (such as responsive to a signal indicative of the lost or interrupted signal).
  • the vehicle, responsive to a determination of signal jamming, may fall into or switch to a ‘safe one time access mode’ to limit or substantially preclude a break-in of the vehicle while the signal is being jammed.
  • a smart phone app may be used for displaying the vehicle's safety status, and the app may be set up in a way that in case a cyclic feedback from the vehicle is interrupted or not provided (which may be caused by the presence of a jammer within the vehicle's radio range), the app will signal that the connection has been lost, which may indicate that the vehicle may be in an ‘unsafe’ or ‘endangered’ situation.
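The cyclic-feedback check performed by such a smart phone app can be sketched as a simple heartbeat watchdog; the interval and tolerance values below are illustrative assumptions, not values from this disclosure.

```python
class HeartbeatMonitor:
    """Flags a possible jamming/'endangered' condition when the vehicle's
    cyclic status feedback stops arriving at the owner's app.
    Interval and missed-beat tolerance are illustrative choices."""
    def __init__(self, interval_s=30.0, missed_limit=2):
        self.interval_s = interval_s      # expected period of status messages
        self.missed_limit = missed_limit  # how many periods may pass silently
        self.last_seen = None

    def on_status(self, timestamp):
        # called whenever a status message from the vehicle arrives
        self.last_seen = timestamp

    def state(self, now):
        if self.last_seen is None:
            return "no contact"
        missed = (now - self.last_seen) / self.interval_s
        return "vehicle signal lost" if missed > self.missed_limit else "ok"
```

Note that the app can only observe the absence of feedback; it cannot distinguish a jammer from, for example, an underground garage, which is why the message signals a possibly unsafe situation rather than a confirmed attack.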
  • the status messages of a plurality of vehicles equipped with such a system may be accessible by parking guards, security or police services.
  • such a determination or event may be a trigger for elevated scrutiny or security at the area where the reported vehicles are located.
  • even when the presence of the jammer is no longer detectable, the vehicle may remain in the ‘safe one time access mode’, and may not be openable by key or remote device.
  • the driver may have to enter a PUC (personal unlock code).
  • the PUC may be one the driver entered earlier; preferably, a one-time-usable PUC may be generated and transmitted by a remote service server (provided by the vision system manufacturer or vendor, OEM, or a third party service) to the vehicle or to the driver's mobile device (with a unique device ID as unique identification) upon request.
  • the server may serve the PUC only to individuals exclusively designated earlier.
  • the server may enter a dialog to make sure no unauthorized user is attempting to acquire a PUC. That may be done in a known manner by requesting key data that only the designated person can know, which may have been entered earlier.
  • the key data may be, for example, the date of birth, town of birth, maiden name, favorite pet's name, best friend in college or the like.
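The one-time PUC issuance with a key-data dialog can be sketched as follows; the storage format, challenge handling and code length are illustrative assumptions, not details of this disclosure.

```python
import hashlib
import hmac
import secrets

class PucServer:
    """Issues a one-time personal unlock code (PUC) only after the
    requester answers pre-registered key data (date of birth, pet's
    name, etc.). Hashing scheme and code length are illustrative."""
    def __init__(self, key_data):
        # key_data: {question: answer}, registered by the owner earlier;
        # only answer hashes are retained server-side
        self._key_data = {q: hashlib.sha256(a.lower().encode()).hexdigest()
                          for q, a in key_data.items()}
        self._active_puc = None

    def request_puc(self, question, answer):
        stored = self._key_data.get(question)
        given = hashlib.sha256(answer.lower().encode()).hexdigest()
        if stored is None or not hmac.compare_digest(stored, given):
            return None                      # dialog failed: no PUC served
        self._active_puc = secrets.token_hex(4)   # 8-character one-time code
        return self._active_puc

    def redeem(self, puc):
        ok = (self._active_puc is not None
              and hmac.compare_digest(self._active_puc, puc))
        self._active_puc = None              # one-time: void after any attempt
        return ok
```

Constant-time comparison (`hmac.compare_digest`) and voiding the code after a single redemption attempt are the two properties that make the code usable only once, even if an attacker observes the transmission.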
  • the ‘safe one time access mode’ may have additional or alternative safety hurdles to overcome before unlocking besides the PUC entry, such as identifying the designated person by face detection, retina scan, fingerprint or body shape classification, such as by utilizing aspects of the systems described in U.S. provisional application Ser. No. 61/842,644, filed Jul. 3, 2013, which is hereby incorporated herein by reference in its entirety.
  • this function may be implemented in conjunction with or incorporated in or may be part of or used in combination with a keyless entry/go access admission system with visual driver identification, such as described in U.S. provisional application Ser. No. 61/845,061, filed Jul. 11, 2013 (Attorney Docket MAG04 P-2145), which is hereby incorporated herein by reference in its entirety.
  • the above vehicle park surveillance system (and optionally the above access admission system) may find use in conjunction with a power lift gate triggering and object collision prevention system, such as described in U.S. patent Ser. No. 14/159,772, filed Jan. 21, 2014 (Attorney Docket MAG04 P-2215), which is hereby incorporated herein by reference in its entirety.
  • one or more vehicles 20 a - f may be parked at a local area, and may be equipped with the image processing system according to the invention, and thus may be operable to detect or determine objects, including pedestrians ( 90 in FIG. 22 ), via vehicle inherent sensors such as cameras 14 c (with a field of view 80 ) in the manner described above.
  • the parked vehicle 20 a may occlude the pedestrians from the field of view of the camera or sensor 14 b of the approaching vehicle 22 .
  • the vehicles 20 a - f and 22 may be equipped with the devices according to the invention, including a ZigBee transmitter 70 .
  • the pedestrians' position within the field of view 80 of the sensor or camera of the vehicle 20 a next to them would be determined by the vision system of the vehicle 20 a , which may have been woken up or activated upon the pedestrians entering the scene.
  • the ZigBee transmitters of vehicles 20 b - f (which may be parked or moving) in the neighborhood may continuously transmit the pedestrian's position to the receiving ZigBee node 70 of the approaching vehicle 22 .
  • the detection information may not just be transmitted or sent to the specific parked individual vehicle's owner cell phone or the like, but may also be sent to the driving vehicles so that local area driver assistant systems are informed of the location of the detected pedestrian or pedestrians, and such communication may be made via any suitable type or kind of data channel (such as, for example, GPRS, GSM, WLAN, DECT, WEP, SMS, MMS, UMTS or LTE, and/or optically transmitted or communicated or connected inductively or by wire (especially the charging cable data line of an e-car power outlet connected to a common data grid or the like)).
  • the signal may travel directly from vehicle to vehicle.
  • the signal may be transmitted via more than one vehicle and/or infrastructure node, which may act as peers in a (possibly temporary) local vehicle and/or infrastructure grid or mesh network or peer-to-peer network, such as a ZigBee network or the like.
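The relaying of a detection through such a temporary peer network can be sketched as flooding with a hop limit and duplicate suppression; the message fields, hop limit and node API are illustrative assumptions, not part of this disclosure or of the ZigBee standard.

```python
class MeshNode:
    """Sketch of a vehicle node in a temporary local mesh (ZigBee-like):
    detection messages are flooded peer to peer with a hop limit and
    duplicate suppression so loops in the mesh terminate."""
    def __init__(self, node_id):
        self.node_id = node_id
        self.peers = []          # directly reachable MeshNodes
        self.seen = set()        # message ids already handled (dedup)
        self.delivered = []      # payloads handed to this node's driver assist

    def broadcast(self, msg_id, payload, ttl=3):
        if msg_id in self.seen or ttl <= 0:
            return               # already forwarded, or hop limit exhausted
        self.seen.add(msg_id)
        self.delivered.append(payload)
        for peer in self.peers:
            peer.broadcast(msg_id, payload, ttl - 1)
```

In the FIG. 22 scenario, the detecting parked vehicle 20 a would originate the broadcast and intermediate parked vehicles would relay it until it reaches the approaching vehicle 22, whose driver assist system receives the occluded pedestrian's position.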
  • the ZigBee device 70 may be attached to or connected to or in communication with the vision system and/or camera (such as camera 14 a of vision system 12 ).
  • the driver assistant systems of the driving vehicle(s) 22 (driving in the direction of the white arrow 24 in FIG. 22 ) will use the received data (gray arrows 26 in FIG. 22 ) of detected pedestrians 90 occluded by the parked vehicles (and thus not in the area of view 100 of the driving vehicle 22 ), and especially their detected position and walking direction (black arrow 28 in FIG. 22 ), to determine potential collision hazards to the pedestrians that may be occluded from the driving vehicle's view 100 and optionally from other vehicle sensors.
  • the system of the approaching vehicle may generate an alert to the driver to warn the driver of the (as yet) not viewable pedestrians and their approach to the path of travel of the driven vehicle.
  • the camera or sensor may comprise any suitable camera or sensor.
  • the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
  • the system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras.
  • the image processor may comprise an EyeQ2 or EyeQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580; and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects.
  • the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
  • the vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like.
  • the imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640 ⁇ 480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array.
  • the photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns.
  • the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels.
  • the imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like.
  • the logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
  • the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,
  • the system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos.
  • the imaging device and control and image processor and any associated illumination source may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454; and 6,824,281, and/or International Publication Nos.
  • WO 2010/099416; WO 2011/028686; and/or WO 2013/016409
  • U.S. patent application Ser. No. 12/508,840 filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170
  • U.S. patent application Ser. No. 13/534,657 filed Jun. 27, 2012 (Attorney Docket MAG04 P-1892), which are all hereby incorporated herein by reference in their entireties.
  • the camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct.
  • the imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos.
  • the camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149; and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos.
  • a vehicle vision system such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos.
  • a reverse or sideward imaging system such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No.
  • the circuit board or chip may include circuitry for the imaging array sensor and or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009 (Attorney Docket DON01 P-1564), which are hereby incorporated herein by reference in their entireties.
  • the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle.
  • the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 (Attorney Docket DON01 P-1797), which are hereby incorporated herein by reference in their entireties.
  • the video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos.
  • the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).
  • the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249; and/or WO 2013/109869, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 (Attorney Docket DON01 P-1797), which are hereby incorporated herein by reference in their entireties.
  • a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun.
  • the display is viewable through the reflective element when the display is activated to display information.
  • the display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like.
  • the mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties.
  • the thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036; and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.
  • the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742; and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


Abstract

A vision system for a vehicle includes at least one camera having a field of view exterior the vehicle. An image processor is operable to process image data captured by the camera. When the vehicle is parked, a control controls the camera to capture frames of image data at a first capture rate. The control compares a frame of captured image data to at least one previous frame of captured image data. Responsive to the comparison determining a change in the frames of captured image data beyond a threshold degree of change, (i) the control increases the capture rate to a second capture rate, (ii) the at least one camera captures frames of image data at the second capture rate and (iii) the control activates a recording device to record images captured by the camera at the second capture rate.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application claims the filing benefits of U.S. provisional applications, Ser. No. 61/893,489, filed Oct. 21, 2013, and Ser. No. 61/760,364, filed Feb. 4, 2013, which are hereby incorporated herein by reference in their entireties.
  • FIELD OF THE INVENTION
  • The present invention relates to imaging systems or vision systems for vehicles.
  • BACKGROUND OF THE INVENTION
  • Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935; and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
  • SUMMARY OF THE INVENTION
  • The present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture images exterior of the vehicle, and provides the communication/data signals, including camera data or image data that may be displayed or processed to provide the desired display images and/or processing and control, depending on the particular application of the camera and vision or imaging system. The present invention provides a vehicle data recording system that is operable to record data captured by one or more cameras or image-based sensors and/or one or more other sensors or non-image based sensors of the vehicle. The system of the present invention provides a triggering means to trigger or initiate data capture for a parked vehicle in a manner that captures data responsive to a triggering event indicative of a change in the scene at or around the vehicle.
  • According to an aspect of the present invention, a vision system for a vehicle includes at least one camera disposed at a vehicle equipped with the vehicle vision system. The at least one camera has a field of view exterior the equipped vehicle and is operable to capture image data. An image processor is operable to process image data captured by the at least one camera and a data recording device is operable to record image data captured by the at least one camera. A control controls operation of the at least one camera and, responsive to a determination that the equipped vehicle is in a parked state, the control controls the at least one camera to capture frames of image data at a first capture rate. Responsive to image processing of captured image data, the control compares a frame of captured image data to at least one previous frame of captured image data. Responsive to the comparison determining a change in the frames of captured image data beyond a threshold degree of change, (i) the control increases the capture rate to a second capture rate, (ii) the at least one camera captures frames of image data at the second capture rate and (iii) the control activates the data recording device to record the images captured at the second capture rate.
  • Thus, the present invention periodically or episodically captures frames of image data and processes the captured image data to determine when a threshold degree of change occurs in the scene being monitored by the camera (such as when a person walks by or up to the parked vehicle). When such a threshold degree of change occurs, the system increases the rate of capture (such as from capturing one frame every second or every five seconds or the like to capturing at least 20 frames per second or at least 30 frames per second or the like) and records the captured image data (such as captured video image data) on the data recording device. Thus, the system reduces the power consumption of the parked vehicle by episodically capturing image data at a slower rate and only captures video images and activates the data recording device when the system determines that there is a threshold or significant change in the frames of image data captured by the camera or cameras.
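One monitoring cycle of the scheme described above can be sketched as follows; the flat-list frame representation, capture rates and difference threshold are illustrative assumptions, not values from this disclosure.

```python
def monitor_step(prev_frame, frame, state, diff_threshold=12.0):
    """One monitoring cycle of the parked-vehicle recorder: compare the
    new frame against the previous one and, when the mean absolute pixel
    difference exceeds a threshold, escalate to the second (video)
    capture rate and activate the data recording device.
    Frames are flat lists of grayscale pixel values."""
    diff = sum(abs(a - b) for a, b in zip(frame, prev_frame)) / len(frame)
    if diff > diff_threshold:
        state["capture_hz"] = 30.0      # second capture rate (video)
        state["recording"] = True       # activate the data recording device
    return state

# first capture rate while parked: e.g. one frame every 5 seconds
initial_state = {"capture_hz": 0.2, "recording": False}
```

The power saving follows directly from the asymmetry of the two rates: at 0.2 Hz the imager and comparison logic run only episodically, and the recording device is powered only after a threshold change is actually observed.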
  • These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a plan view of a vehicle with a vision system and imaging sensors or cameras that provide exterior fields of view in accordance with the present invention;
  • FIG. 2 is a chart showing the processing steps that the data recording system of the present invention undergoes at an activating time ta;
  • FIG. 3 is a chart showing a time pattern for the cyclical wake up and sleep pattern for the data recording system of the present invention;
  • FIG. 4 is an image showing the initial environmental scene at the time the vehicle is parked;
  • FIG. 5 is an image showing the environmental scene at a later time after the vehicle is parked;
  • FIG. 6 shows the scene or image of FIG. 4 when mosaiced in accordance with the present invention;
  • FIG. 7 shows the scene or image of FIG. 5 when mosaiced in accordance with the present invention;
  • FIG. 8 shows the result when calculating the difference between the images of FIGS. 7 and 6 (the image in FIG. 8 was inverted for clarity) in accordance with the present invention;
  • FIG. 9A shows the image of FIG. 8, with the brightness of some of the different pixels/areas marked by inclining numbers in accordance with the present invention;
  • FIG. 9B is a sectional cut out of the image of FIG. 9A, with the different pixels/areas enlarged in accordance with the present invention;
  • FIG. 10A is the image of FIG. 9A after a contrast enhancement;
  • FIG. 10B is a cut out of the same region shown in FIG. 9B as cut out of FIG. 10A;
  • FIGS. 11A and 11B show the same images of FIGS. 9B and 10B, respectively, with the brightness value labels adapted to the new light settings;
  • FIG. 12 is a flow chart of a decision process of the data recording system of the present invention;
  • FIG. 13 is an exemplary branch in the field of view of a sensor of the data recording system of the present invention;
  • FIG. 14 is a mosaiced difference image showing the differences that may be detected by movement of the branch in FIG. 13;
  • FIG. 15 is a counter table for pixel locations;
  • FIGS. 16A-17B show counter tables and triggerings that may occur due to a waving branch such as that shown in FIGS. 13 and 14;
  • FIG. 18 is a cropped image of the image of FIG. 4;
  • FIG. 19 is a mosaiced image of the image of FIG. 18;
  • FIG. 20 is a chart showing sensitivity modes for the data recording system of the present invention as implemented on an electrical vehicle or the like;
  • FIG. 21 is a schematic of a jammer attack scenario, where the vehicle is equipped with a jammer detector of the present invention; and
  • FIG. 22 is a schematic showing a scene where a crowd of pedestrians 90 is about to enter a road with a vehicle 22 approaching, where the pedestrians may be sensed by other vehicles and such information may be communicated to the approaching vehicle in accordance with the present invention.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A driver assist system and/or vision system and/or object detection system and/or alert system may operate to capture images exterior of the vehicle and process the captured image data to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The object detection may utilize detection and analysis of moving vectors representative of objects detected in the field of view of the vehicle camera, in order to determine which detected objects are objects of interest to the driver of the vehicle, such as when the driver of the vehicle undertakes a reversing maneuver.
  • Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes one or more imaging sensors or cameras (such as a rearward facing imaging sensor or camera 14 a and/or a forwardly facing camera 14 b at the front (or at the windshield) of the vehicle, and/or a sidewardly/rearwardly facing camera 14 c, 14 d at the sides of the vehicle), which capture images exterior of the vehicle, with the cameras having a lens for focusing images at or onto an imaging array or imaging plane of the camera (FIG. 1). The vision system 12 is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle. Optionally, the vision system may process image data to detect objects, such as objects to the rear of the subject or equipped vehicle during a reversing maneuver, or such as approaching or following vehicles or vehicles at a side lane adjacent to the subject or equipped vehicle or the like.
  • Vehicles are often equipped with cameras and other environmental sensors. Such cameras and/or sensors are typically deactivated or off or in a sleep mode when the vehicle is parked and turned off in order to save energy and reduce electrical power consumption. Some systems of a vehicle may stay awake when the vehicle is parked or may stay partially/intermittently/periodically awake when the vehicle is parked or may be awakable or episodically awakable, such as responsive to an external trigger or the like, such as a vehicle door lock system or vehicle alarm system or the like.
  • The vehicle data or damage recording system of the present invention uses the on board cameras and environmental sensors of an equipped vehicle to record the scene at or around the equipped vehicle at a time when (or just before) the vehicle is hit by another vehicle, animal or person. The system of the present invention thus uses the camera or cameras already present on the vehicle. For example, the camera or cameras used by the recording system may be part of a multi-camera vision system or surround view system or rear backup aid system or forward facing camera system of the vehicle (and may utilize aspects of the systems described in U.S. Pat. No. 7,855,755, which is hereby incorporated herein by reference in its entirety). Such use of cameras already present on the vehicle for other purposes reduces the cost of the recording system, since no dedicated cameras are needed for the recording system when the recording system is added to the vehicle.
  • Employment of an advanced wake up algorithm allows the system to record the environmental scene before the vehicle is hit. The present invention thus may provide identification of the opponent or collider or the opponent's license plate or the like, such as in “hit-and-run” situations or the like.
  • At the time the equipped vehicle identifies that it may likely be hit or was hit, the system of the vehicle may start to record the environmental scene. The system may record image data captured by some or all of the vehicle installed cameras (and/or remote cameras within a communication range of the equipped vehicle (such as with a X2Car communication range), such as traffic monitoring cameras at intersections or security cameras at buildings, parking lots and/or the like). Optionally, outputs of other environmental sensors of the vehicle, such as ultrasound sensors, capacitive proximity sensors, touch sensors, heartbeat sensors (such as sensors utilizing aspects of the sensors and systems described in U.S. Pat. No. 8,258,932, which is hereby incorporated herein by reference in its entirety), RADAR sensors, LADAR sensors, LIDAR sensors, time of flight (TOF) sensors, structured light sensors and/or infrared sensors or the like, may be engaged and may be recorded as well.
  • The trigger that the vehicle was (or imminently may be) hit or damaged may come from a vehicle alarm system, which may include or may be responsive to sensors for vibration, level changing, glass breaking, unauthorized door or ignition lock actuation or compartment ultrasound disturbances and/or the like.
  • Optionally, the data recording system of the present invention may include or utilize a sensing system that uses capacitive sensors, which may be installed in or at the rear bumper of the vehicle and/or the door handles of the vehicle (and typically operable to detect a person's hand at the door handle to open the vehicle door), and which may additionally (non-exclusively) be used for triggering the recording system or the vehicle alarm system.
  • Alternatively, or in addition to such capacitive sensors, the trigger may be achieved by the image processing devices of the vehicle that are operable to process image data captured by the vehicle camera or cameras. Such image processing devices may take reference images from the environmental scene at the time the vehicle is parked, such as shown in FIG. 4. The object detection may determine non-moving or steady objects that are within detection range or field of view of the camera or cameras. The image data representative of the scene and/or the images of the scene may be stored as an initial reference (such as, for example, the image shown in FIG. 4). Other sensor data, such as the distance or level (to anything in the vehicle's environment) detected by different ultrasound sensors, may be stored as well. In order to keep the electrical power consumption low, the system may operate to wake up in a cyclical pattern.
  • FIG. 3 shows a typical time pattern for the cyclical wake up and sleep pattern for the system of the present invention. As shown in FIG. 3, there are active or awake time phases ta and inactive or asleep time phases ti. To save electrical energy of the vehicle's battery, the inactive phases are meant to be much longer than the active phases. Optionally, and desirably, the system may reduce or shut off as much power consumption as possible during the inactive time phases. However, the life time of some components may be reduced due to excessive restarting or rebooting. Also, the initialization time of some image processing functional components may be comparably long, so a short shut off time period may not be suitable. The imager, controllers and static memory components may suffer the most. Thus, the present invention preferably provides a good or acceptable balance by keeping the imager operating and running the processing unit only cyclically for the recording system, mostly on a field-programmable gate array (FPGA).
  • The data recording system may comprise one or more excitation status flags or state machines, the states of which depend on elapsed time, the environmental input and the battery charge status. At cyclical wake ups, the system may enter an active or awake time phase ta, and may activate the vehicle cameras, the environmental sensors and the image processing device, such as in the example shown in FIG. 2. In such an example, the processing steps may be:
      • (1) wake up;
      • (2) initialize camera(s);
      • (3) capture image(s);
      • (4) transfer image (to processing unit if not processed locally in camera);
      • (5) filter image (Gauss filter/box filter/mosaicing);
      • (6) load compare image (if not temporarily stored earlier);
      • (7) calculate difference image (or other suitable object detection);
      • (8) load ignore mask table;
      • (9) decide whether to initiate recording mode (video image capturing) and, if so, jump to the recording procedure (and exit the cyclical wake up mode);
      • (10) update ignore mask table;
      • (11) store image; and
      • (12) enter sleep phase (ti).
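The steps above can be sketched as one pass of a wake-up cycle. This is a minimal, hypothetical sketch: the callable interfaces for camera capture, filtering, storage and the trigger decision are assumptions for illustration, not part of the patent's description.

```python
def cyclical_wake_up(capture, filter_image, load_reference, difference,
                     load_ignore_mask, should_record, update_ignore_mask,
                     store_image):
    """Run steps (2)-(11) of one wake-up cycle. Returns 'record' to jump
    to the recording procedure (exiting the cyclical wake up mode) or
    'sleep' to re-enter the inactive phase ti (step 12)."""
    image = capture()                        # (2)-(4) initialize/capture/transfer
    filtered = filter_image(image)           # (5) Gauss/box/mosaicing filter
    reference = load_reference()             # (6) load compare image
    diff = difference(filtered, reference)   # (7) difference image
    mask = load_ignore_mask()                # (8) load ignore mask table
    if should_record(diff, mask):            # (9) trigger decision
        return "record"
    update_ignore_mask(diff)                 # (10) update ignore mask table
    store_image(filtered)                    # (11) store image as new reference
    return "sleep"
```

Keeping each step behind a callable mirrors the idea that the heavy components (processing unit, buses, storage) are touched only when the corresponding step actually runs.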
      • The captured data may initially or provisionally be stored locally at the vehicle. The storage media and the vehicle communication bus (such as, for example, a vehicle CAN bus or a vehicle LIN bus or the like) may stay asleep if not awakened or activated by entering a higher excitation state, which may be triggered by the object detection algorithm or when the local storage memory is nearly full (and not able to accommodate another image capture data set). When awake (time phases ta), the data recording system may capture several images or video images or a movie, preferably capturing at least one image from one or more cameras (including remote cameras), and/or optionally the system may fetch or capture or determine a status of one or more external sensors and may compare these to earlier captured data sets. By cyclically taking just one image at a time (such as by capturing frames of image data with a selected or determined period of time between captures, such as, for example, less than or equal to about ten frames per second (fps) or about five fps or about one fps or about 0.2 fps (one frame every five seconds) or about 0.1 fps (one frame every ten seconds) or one frame per minute or one frame per five minutes or any selected capture rate or time interval or period), a time lapse video develops of the scene encompassed by that camera's field of view. That time lapse video records the good or no-change case, when no disturbances happened near the parked vehicle. The time lapse rate may vary with the chosen wake up time gaps or time periods between image captures.
  • The data captured may be stored in a FIFO memory at which the oldest part of the lapse video and optionally other sensor data may be overwritten by the newer ones in cases where no vehicle alarm occurs, or alternatively the captured data may be stored in a memory device (in all cases), and the system may periodically back up the stored data and/or may transfer the stored data from a local memory device (such as a vision system or vehicle inherent memory device such as like a flash memory or solid state drive, which may be exchangeable by the vehicle owner) to a remote or external memory device (such as via a telematics system or other communication or data transfer system). The captured images or video and optionally other sensor data may be stored/transferred in a compressed data format or as RAW or may be stored in RAW locally and transferred compressed or may be stored compressed locally and transferred in RAW. In cases where the system employs an object detection algorithm, the system may store images in the area or areas of moving objects and/or regions of interest in a high definition and/or uncompressed format, and the system may store images in other areas or parts of less interest at or surrounding the vehicle in a low definition or compressed format for shrinking the data size for storing or transmission.
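The FIFO overwrite behavior described for the no-alarm case can be sketched with a bounded deque. The class name, capacity handling and `nearly_full` margin are illustrative assumptions, not details from the patent.

```python
from collections import deque

class TimeLapseFifo:
    """Bounded store for time-lapse frames: once capacity is reached,
    each new frame overwrites the oldest one (the no-alarm case)."""

    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)

    def store(self, frame):
        self.frames.append(frame)  # deque drops the oldest automatically

    def nearly_full(self, margin=1):
        """True when the local memory cannot accommodate 'margin' more
        capture data sets, which may trigger a higher excitation state
        (backup or transfer to a remote memory device)."""
        return len(self.frames) > self.frames.maxlen - margin
```

A real implementation would store compressed or RAW frames plus sensor data sets, but the overwrite-oldest policy is the same.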
  • The backup and storage modes or means may be customizable by the vision system manufacturer, the OEM, the dealership, third party services or the owner of the vehicle, and the system may communicate a message or alert (optionally a text message or the like or a video message or still photo or image message or the like) to the driver or owner of the vehicle (such as to the driver's cell phone or PDA or the like) in response to a triggering event occurring. Accordingly, viewing programs or apps may be provided by the vehicle or vision system manufacturer or vendors or by aftermarket distributors. These may provide additional services, such as a function to automatically guide the police to the vehicle at the driver's request (such as via his or her cell phone or PDA program or app).
  • An object detection (OD) algorithm or feature extractor (such as one based on optical flow, difference images or edge detection, such as Lucas Kanade, Sobel, Laplace or FAST or the like) may also be employed for comparing whether objects have entered the scene (such as shown in FIG. 5, where two people are walking by the vehicle) that weren't present earlier (see FIG. 4) or which have moved since the last wake up phase. To detect humans within an image, classifiers may be in use, such as ‘Viola Jones’, ‘Fastest Pedestrian Detector in the West’ or HOG (Histogram of Oriented Gradients) or the like. This is the ‘positive’ or changed case, which leads to a sensor wake up mode according to state (9) in FIG. 2. When a signal is generated indicative of a determined threshold change in the frames of captured image data, the control increases the frequency or frame capture rate of the camera (reduces the time period between frame captures, such as to greater than or equal to about ten fps or about twenty fps or about thirty fps or any other selected rate or frequency or capture rate) and activates or controls the recording device to record the captured image data and store the captured image data.
  • A simple and practical embodiment of an OD may be to use an image difference procedure. To shrink the to-be-compared image properties, the images may be additionally processed by a mosaicing filter, a Gaussian filter or a box filter or the like (see step 5 in FIG. 2) before the difference calculation (see step 7 in FIG. 2). FIG. 6 shows the scene or image of FIG. 4 when mosaiced and FIG. 7 shows the scene or image of FIG. 5 when mosaiced. FIG. 8 shows the result when calculating the difference between the images of FIGS. 7 and 6 (for better readability herein, the image in FIG. 8 was inverted; the feature information in the image is not changed by that).
  • In FIG. 9A, the brightness differences of some of the pixels/areas of interest are marked by numbers, with the greater numbers applied at the pixels or areas that have a greater difference between the initial image (FIGS. 4 and 6) and the later image (FIGS. 5 and 7). FIG. 9B is a sectional cut out of that same scene, showing the difference area enlarged. FIG. 10A is the image of FIG. 9A after a contrast enhancement (alternatively, a histogram or cut filtering may be used), and FIG. 10B is a cut out of the same region shown in FIG. 9B as cut out of FIG. 10A, showing the difference area enlarged. The contrast enhancement, histogram or cut filtering may be combined with the difference calculating step (see step 7 in FIG. 2) by choosing the according mapping parameters initially. As can be seen with reference to these figures, the lighter areas represent areas where small averaged pixel differences have been eliminated. Areas which have massively or substantially changed (darker areas in this inverted image) are the areas where people or objects have appeared (real change, meant for triggering), while the rest of the scene has stayed mostly unchanged (meant for not triggering), except for some minor differences at strong contrast thresholds caused by slightly inaccurate image superposition (such as image noise natural for electronic cameras, which may be largely eliminated by the earlier filtering (see step 5 in FIG. 2)).
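Steps (5) and (7), the mosaicing filter and the difference image, can be sketched in pure Python, with grayscale images as nested lists of brightness values. The block size and the noise threshold delta are illustrative values, not figures from the patent.

```python
def mosaic(img, block):
    """Mosaicing filter (step 5): replace each block x block tile of a
    grayscale image by its mean brightness, shrinking the data to compare."""
    h, w = len(img), len(img[0])
    out = []
    for by in range(0, h - h % block, block):
        row = []
        for bx in range(0, w - w % block, block):
            tile = [img[y][x]
                    for y in range(by, by + block)
                    for x in range(bx, bx + block)]
            row.append(sum(tile) / len(tile))
        out.append(row)
    return out

def difference(a, b):
    """Difference image (step 7): per-mosaic absolute brightness change."""
    return [[abs(pa - pb) for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def differing_mosaics(diff, delta):
    """Count mosaics whose change exceeds delta; requiring a minimum count
    limits triggering on image noise (decision block of FIG. 12)."""
    return sum(1 for row in diff for d in row if d > delta)
```

Averaging over tiles both shrinks the data to store and compare and suppresses per-pixel camera noise, which is why a single brightly changed tile stands out so clearly in the difference image.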
  • FIGS. 11A, 11B show the same scene of FIGS. 9A, 9B and 10A, 10B with the brightness value labels adapted to the new light settings.
  • In cases where mosaics remain whose difference values are nonzero (and have not been eliminated), the system may assume that something has changed in the scene from the image in FIG. 4 to the image in FIG. 5. The system thus may assume a motion has happened or light conditions have abruptly changed. In cases where nothing has changed, the system may enter a power saving sleep state (again), and this equates to the path following ‘no’ from the decision block of the flow chart in FIG. 12.
  • The data recording system may record a fast motion flic or video responsive to a trigger or indication that a threshold difference was determined during the comparison of a frame of captured image data to the previous frame or initial frame of captured image data. This equates to the path following ‘yes’ from the decision block of the flow chart in FIG. 12. The decision block of FIG. 12 requires a minimum amount of differing mosaics dx,y to limit or substantially preclude triggering responsive to image noise. There may be environmental scenes in which the system becomes disturbed and/or awakened quite often. In such a situation, the data recording system may engage the peripheral sensors, the cameras and the data buses very often, which may substantially drain the vehicle's battery, with the consequence that the system's or vehicle's battery power management would have to turn off the vehicle data recording system entirely, possibly very soon, such as within or after about two hours (of being awake very often or all the time).
  • To achieve a less substantial draining of the vehicle battery and to provide a longer stand by time of the recording system's functionality, the data recording system of the present invention may be able to distinguish or sort out or suppress the wake up (input-) events or sources. For example, in a situation where there is a bush or branch within the camera's capturing range or field of view, with the bush or tree or the like having a branch or branches waving in the wind, the branch or branches may be detected as movement of an object within two (or more) consecutive cyclically captured image data sets by the difference comparison. For example, the branch shown in FIG. 13 may appear in a mosaiced difference image like the image shown in FIG. 14. The system may be operable to identify that detection area as being the source of false triggering (wake up) by statistical means. The system may have a memory array in the same size ratio as the pixels on the imager or as the captured mosaic image (see the example shown in FIG. 15, having an array of 0x0F by 0x0F mosaic areas), called ‘dx,y’ with the size X by Y in the flow chart of FIG. 12. For each pixel or mosaic there may be a value (such as, for example, four bits or the like) storing the disturbance counter ‘d’ event history of that pixel or mosaic ‘p’ differing in the image data difference comparison px,y(tn)−px,y(tn-1), evaluated from p0,0 (upper left corner of the array) to pX,Y (lower right corner of the array). As an example, two consecutive triggerings that may occur due to a waving branch such as that shown in FIGS. 13 and 14 may influence the disturbance counters dx,y illustrated in FIGS. 16A and 17A, with the resulting disturbance counter table entries such as those shown in FIGS. 16B and 17B accordingly.
  • The data recording system may include a certain event history counter level (which may be fixed or adjusted by an algorithm), such as, for example, 6 (or more or less), above which the according pixel will not be considered (will be ignored) as a wake up trigger source until falling under the counter level borderline again. Optionally, the system may include an algorithm that may decrease the event history counter level (of one, several or all pixels). This may happen over time, by trigger events or by other difference pixel statistical means. In the flow chart of FIG. 12, for example, there is a threshold ‘tr’ of the disturbance counter ‘d’ (equal for all ‘p’) with the exemplary value 6. That means that if one pixel or mosaic is involved in triggering the resulting image difference events so often over a short duration that its disturbance counter exceeds the threshold value, the pixel or mosaic may be ignored by setting its masking bit mx,y to zero. By masking the difference images before making the trigger decision, the specific areas become ignored in determining the triggering. In the example, over time the area (x,y positions of dx,y) in the image covered by the waving branch would soon have several positions which exceed the value 5, which would result in the system ignoring that area by masking its pixels or mosaics. Thus, detection of the moving branch would not trigger activation of the video image data capture.
  • Optionally, another procedure may be to decrease the disturbance event history counter. This is for re-enabling areas which have been masked earlier when their disturbing ratio is diminishing. In the example above, the waving branch may stop waving due to the wind lessening. In the exemplary flow chart of FIG. 12, the disturbance counter dx,y may be diminished by one tenth on every disturbance event in which the specific pixel or mosaic isn't participating (not differing). Another divisor ‘r’ counts how often a dx,y has changed from above the threshold ‘tr’ to below the threshold. The default and minimum of ‘r’ is 1. As soon as ‘r’ increases, the disturbance counter decreases correspondingly slower according to the term dx,y = dx,y − 0.1/rx,y. When ‘r’ is 2, ‘d’ diminishes by one every 20th disturbance event, and when ‘r’ is 3, ‘d’ diminishes by one every 30th disturbance event, and so on. This is provided to reduce or eliminate (by masking) disturbance sources which are cyclically present from time to time but not steadily present, such as a waving branch in intermittent wind conditions or a flickering street light which is only on at night, or the like.
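The disturbance counters dx,y, the masking bits mx,y, the threshold ‘tr’ and the slowing divisor ‘r’ described above can be sketched as a small class. This is a simplified sketch: the four-bit saturation cap and the exact update order are assumptions.

```python
class DisturbanceMask:
    """Per-mosaic disturbance counters d with masking threshold tr.

    A mosaic that keeps differing (e.g. a waving branch) accumulates
    counts and is masked (m = 0); on events in which it does not differ,
    its counter decays by 0.1/r, and r grows by one each time the counter
    falls back below tr, so re-enabling gets slower for repeat offenders.
    """

    def __init__(self, width, height, tr=6):
        self.tr = tr
        self.d = [[0.0] * width for _ in range(height)]
        self.r = [[1] * width for _ in range(height)]

    def update(self, differing):
        """differing[y][x] is True where this event's difference image
        exceeded the noise threshold."""
        for y, row in enumerate(differing):
            for x, hit in enumerate(row):
                if hit:
                    self.d[y][x] = min(self.d[y][x] + 1, 15)  # 4-bit cap
                else:
                    above = self.d[y][x] > self.tr
                    self.d[y][x] = max(self.d[y][x] - 0.1 / self.r[y][x], 0.0)
                    if above and self.d[y][x] <= self.tr:
                        self.r[y][x] += 1  # decay slower from now on

    def mask(self):
        """Masking bits: 0 = ignore this mosaic when deciding to trigger."""
        return [[0 if v > self.tr else 1 for v in row] for row in self.d]
```

Applying `mask()` to the difference image before the trigger decision implements the "ignore the waving branch" behavior while still letting the area re-arm once it calms down.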
  • To decrease the false case rate and the calculation resource demand, the supervised scenery around the equipped vehicle may be limited by cropping or masking the captured images by a static mask. FIG. 18 shows a cropped image of FIG. 4, with its mosaiced image shown in FIG. 19.
  • At times when the vehicle is parked close to a crowded pedestrian path, such as a sidewalk at the right of the vehicle, the ultrasound sensors of the vehicle may detect entering and leaving objects (pedestrians) very often (leading to wake up events triggering the cameras to capture the scene), and the system may decide to have a higher robustness to disturbances due to such an (input-) event source. This may be done by changing the input's priority level for wake up or by employing a counter or a sensitivity state machine or algorithm. The sensitivity may be counted up (less sensitive) when a trigger occurs from a specific input source (or input sensor type class or the like). Substantial triggering will lead to a high level of insensitivity. The counter may be counted down over time (such as, for example, a four bit counter may be counted down one increment every hour). One or all counters may be reset by a vehicle damage alert event directly followed by an according wake up trigger event. Optionally, the system may employ a learning algorithm in which the initial counter level may lead to an enhanced or optimal balance of system awareness/inactivity and battery life. The system thus may increase the threshold degree of differences that need to be determined before triggering the system, such that a greater difference or change over time is determined before the system will be awakened or triggered.
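The per-source insensitivity counter described above can be sketched as follows. The four-bit saturation and the priority comparison in `accepts` are assumptions drawn from the example values, not a specified interface.

```python
class SourceSensitivity:
    """Insensitivity counter for one wake-up input source (e.g. the
    ultrasound sensors facing a crowded sidewalk)."""

    def __init__(self):
        self.level = 0          # 0 = fully sensitive

    def on_trigger(self):
        """Each trigger from this source makes the system less sensitive."""
        self.level = min(self.level + 1, 15)   # four bit counter saturates

    def on_hour_elapsed(self):
        """Counted down one increment every hour."""
        self.level = max(self.level - 1, 0)

    def on_damage_alert(self):
        """A damage alert directly following a wake up trigger resets the
        counter (the source proved relevant)."""
        self.level = 0

    def accepts(self, priority=1):
        """A wake-up event is accepted only while its priority exceeds
        the accumulated insensitivity level."""
        return priority > self.level
```

One such counter per sensor type class lets constant sidewalk traffic silence only the ultrasound channel while camera-based triggers stay fully sensitive.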
  • When the vehicle data recording system is engaged over several days and the vehicle is not driven to charge its battery, the system may raise the insensitivity level of all sensors more and more, and the period of the system's cyclical wake up events may increase (such as, for example, from ta+ti=1 s to 2 s, 4 s, 8 s, 16 s, 32 s and so on). After a longer time, such as over one week or the like, the system may be switched off, whereby the cameras may be activated when the vehicle alert or security system engages, such as when triggered by an actually detected hit.
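The lengthening of the cyclical wake up period can be sketched as a doubling schedule. The one-doubling-per-day pacing and the one-week cutoff follow the example values above, but the exact schedule is an assumption.

```python
def cycle_period_s(days_parked, base_s=1.0, cutoff_days=7):
    """Wake-up cycle time ta+ti doubles as the parked time grows:
    1 s, 2 s, 4 s, 8 s, 16 s, 32 s and so on (one doubling per day is an
    illustrative assumption). After cutoff_days the cyclic system is
    switched off entirely (returns None); the cameras are then only
    activated by the vehicle alert or security system on an actually
    detected hit."""
    if days_parked > cutoff_days:
        return None
    return base_s * (2 ** days_parked)
```

An exponential schedule keeps the system responsive for the first day or two of parking while bounding the long-term battery drain of a vehicle left unattended.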
  • For applications in an electrical vehicle or the like, when the vehicle is plugged in (charging), the data recording system may run or operate in more sensitive (more power consuming) modes as compared to when the vehicle is unplugged. Both are shown in the example of FIG. 20. The activity percentage relates to the cycle times: less activity equates to longer times of ta+ti, and higher activity equates to shorter times, adjusted by ta, ti. 100% may mean that the system has a very short cycle time or even never enters the saving mode but records the scene all the time.
  • Optionally, the system may include features to increase the use cases, convenience and security of the system. For example, the system may be operable to, at the time a data recording alert is detected, send a text message over a mobile channel or a captured movie via UMTS, LTE or the like (to alert the owner of the vehicle of a triggering event of the data recording system), such as by utilizing aspects of the display and data communication system described in International Publication No. WO 2013/081985, which is hereby incorporated herein by reference in its entirety. The image, images or movie or data set captured by the system responsive to a triggering event (and optionally the current vehicle location as determined by the vehicle GPS) may be stored at a remote server for accessibility by the vehicle's owner or may be stored at a removable storage device, such as an SD card or the like, which may be removed from the vehicle and taken away by the vehicle owner.
  • Optionally, the vehicle owner may be able to enter settings for backup paths, cycles, modes, trigger sensitivity modes and/or power management modes and/or the like, according to the vehicle/car damage recording system. Optionally, the system may provide specified set up modes adapting the system to the local laws and legal modalities. For example, the system may blur the faces in its records automatically if required by the local laws (or if elected by user set up). Optionally, the actual vehicle location may come from a GPS receiver via a CAN bus of the vehicle or the like. Optionally, the system may trigger other systems or services, such as an alert system that notifies or calls the police, a parking lot guard, security service or the like, responsive to the vehicle being damaged or hit (or responsive to a triggering event). Optionally, the system may detect (via image capture by a camera of the system or vehicle), store and transmit the image of the license plate (or information derived from image processing of image data captured by the camera) of a hit and run vehicle and may additionally transmit such information or image data to the according (called) parking lot guard, security service or police (-man or -station). The vehicle data recording system may be part of a vehicle alarm system, a crash black box, a vehicle system integrity device (or system), a vehicle vision system or remote vehicle security surveillance (service) system and/or the like.
  • As another aspect of the present invention, the vehicle alert system may employ a known (commodity) jammer detector (such as shown, for example, at http://www.shop-alarm.de/GPS_-_GSM_-_WIFI_-_DECT_-_Bluetooth_Stoersender_und_Jammer_Detector.html), such as for detecting any kind of jamming of a data transmission band the vehicle is using (GPRS, GSM, WLAN, DECT, WEP, UMTS, LTE and near field communication). Jammers may be misused by vehicle thieves to overcome the vehicle security systems or to lengthen the time they remain undiscovered, especially by suppressing the outbound transmissions of vehicle alerts, images or flics to the owner or the police as described above. Jammer detectors are able to detect the presence of jammers 40 in range of the vehicle 10 (FIG. 21). The behavior of the vehicle alert system of the present invention responsive to the detected presence of a jamming device 40 in range may be to disable all usual vehicle access functionalities. That mode may be called a ‘safe one time access mode.’ In that mode, ‘Keyless Entry/Go’ may become disabled, the vehicle key remote control opening may become disabled, and vehicle entry access by turning the vehicle key in the door key hole may be disabled as well. The door dead bolts, when present, may be turned to lock if not done already. FIG. 21 illustrates such a scenario. As shown in FIG. 21, the vehicle may be equipped with such a jammer detector, whereby, if a GSM signal between a GSM tower 50 and the vehicle 10 is jammed (such as by any means or device for jamming or blocking or interfering with such signals), such as by a jammer device 40 at or near the vehicle, the vehicle owner's smart phone (here a tablet) or other personal remote communication device may generate the message ‘vehicle signal lost’ to the user or vehicle owner (such as responsive to a signal indicative of the lost or interrupted signal).
The vehicle, responsive to a determination of signal jamming, may fall into or switch to a ‘safe one time access mode’ to limit or substantially preclude a break in of the vehicle when the signal is being jammed.
  • As another aspect of the present invention, a smart phone app may be used for displaying the vehicle's safety status, and the app may be set up in a way that in case a cyclic feedback from the vehicle is interrupted or not provided (which may be caused by the presence of a jammer within the vehicle's radio range), the app will signal that the connection has been lost, which may indicate that the vehicle may be in an ‘unsafe’ or ‘endangered’ situation. Optionally, the status messages of a plurality of vehicles equipped with such a system may be accessible by parking guards, security or police services. Thus, at times when multiple vehicles report ‘connection lost’, such a determination or event may be a trigger for elevated scrutiny or security at or to the area where the reported vehicles are located.
  • As another aspect of the present invention, when the presence of the jammer may no longer be detectable, the vehicle may remain in the ‘safe one time access mode’, and may not be openable by key or remote device. To unlock the ‘safe one time access mode’, the driver may have to enter a PUC (personal unlock code). The PUC may be one the driver entered earlier; preferably, a one time usable PUC may be generated and transmitted by a remote service server (provided by the vision system manufacturer or vendors, OEM, or a third party service) to the vehicle or to the driver's mobile device (with its unique device ID as unique identification) upon request. The server may only serve the PUC to earlier exclusively dedicated individuals. At the time someone requests a one time PUC, the server may enter a dialog for making sure no unauthorized user is attempting to acquire a PUC. That may be done in a known manner by requesting key data only a dedicated person can know, which may have been entered earlier. The key data may be, for example, the date of birth, town of birth, maiden name, favorite pet's name, best friend in college or the like.
  • As another aspect of the present invention, the ‘safe one time access mode’ may have additional or alternative safety hurdles to overcome before unlocking besides the PUC entry, such as identifying a dedicated person by face detection, retina scan, fingerprint or body shape classification, such as by utilizing aspects of the systems described in U.S. provisional application Ser. No. 61/842,644, filed Jul. 3, 2013, which is hereby incorporated herein by reference in its entirety.
  • As another aspect of the present invention, this function may be implemented in conjunction with or incorporated in or may be part of or used in combination with a keyless entry/go access admission system with visual driver identification, such as described in U.S. provisional application Ser. No. 61/845,061, filed Jul. 11, 2013 (Attorney Docket MAG04 P-2145), which is hereby incorporated herein by reference in its entirety.
  • As another aspect of the present invention, the above vehicle park surveillance system (and optionally the above access admission system) may find use in conjunction with a power lift gate triggering and object collision prevention system, such as described in U.S. patent Ser. No. 14/159,772, filed Jan. 21, 2014 (Attorney Docket MAG04 P-2215), which is hereby incorporated herein by reference in its entirety.
  • The Technische Universität München (TUM) published a solution for protecting pedestrians (as well as animals, cyclists and cars, hereinafter referred to as pedestrians) from hazards caused by being hidden behind objects: the pedestrian carries a transponder (which may be incorporated in a cell phone or the like) that permanently provides its actual position (and by that the position of the carrying pedestrian) for reception by vehicle driver assistant systems, which may warn the driver or actively prevent collisions by braking and, presumably, invasive steering (see http://www.tum.de/die-tum/aktuelles/pressemitteilungen/kurz/article/31294/, which is hereby incorporated herein by reference in its entirety). However, the requirement that the pedestrians must always carry a transponder in order to be detected by the driver assistant systems of vehicles is a suboptimal solution.
  • As another aspect of the present invention, one or more vehicles 20 a-f may be parked in a local area and may be equipped with the image processing system according to the invention, and thus may be operable to detect or determine objects, including pedestrians (90 in FIG. 22), via vehicle inherent sensors, such as cameras 14 c (with a field of view 80), in the manner described above. In the illustrated scenario of FIG. 22, the parked vehicle 20 a may occlude the pedestrians from the area of view of the camera or sensor 14 b of the approaching vehicle 22. The vehicles 20 a-f and 22 may be equipped with the devices according to the invention, including a ZigBee transmitter 70. The pedestrians' position in the area of sensor or camera view 80 of the vehicle 20 a next to them would be determined by the vision system of the vehicle 20 a, which may have been woken up or activated upon the pedestrians entering the scene. The ZigBee transmitters of vehicles 20 b-f (which may be parked or moving) in the neighborhood may continuously relay the pedestrians' position to the receiving ZigBee node 70 of the approaching vehicle 22.
  • To give support to the driver assistant systems of driving vehicles in that local area, the detection information may not just be transmitted or sent to the specific parked individual vehicle's owner cell phone or the like, but may be sent to the driving vehicles so that local area driver assistant systems are also informed of the location of the detected pedestrian or pedestrians, and such communication may be made via any suitable type or kind of data channel (such as, for example, GPRS, GSM, WLAN, DECT, WEP, SMS, MMS, UMTS or LTE, and/or optically transmitted or communicated or connected inductively or by wire (especially the charging cable data line of an e-car power outlet connected to a common data grid or the like)). There may be a cloud server, a mobile app, or a kind of vehicle inherent app or algorithm for both receiving (from other driving vehicles) and transmitting (to the parked vehicle or vehicles) the data, such as shown in the above referenced and incorporated International Publication No. WO 2013/081985. The signal may run directly from vehicle to vehicle. The signal may be transmitted via more than one vehicle and/or infrastructure element, which may act as peers in a (possibly temporary) local vehicle and/or infrastructure grid or mesh network or peer to peer network, such as a ZigBee network or the like. In the example of FIG. 22, the ZigBee device 70 may be attached to or connected to or in communication with the vision system and/or camera (such as camera 14 a of vision system 12). The driver assistant systems of the driving vehicle(s) 22 (driving in the direction of the white arrow 24 in FIG. 22) will use the received data (gray arrows 26 in FIG. 22) of detected pedestrians 90 occluded by the parked vehicles (and thus not in the area of view 100 of the driving vehicle 22), and especially their detected position and walking direction (black arrow 28 in FIG. 22), to determine potential collision hazards to the pedestrians which may be occluded from the driving vehicle's view 100 and optional other vehicle sensors. Responsive to a determination that the approaching or driving vehicle 22 may collide with the pedestrian(s) as the pedestrian(s) move out from behind the parked vehicle and into the path of travel of the approaching vehicle, the system of the approaching vehicle may generate an alert to the driver to warn the driver of the (as yet) not viewable pedestrians and their approach to the path of travel of the driven vehicle.
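The relaying of an occluded pedestrian's position from a parked vehicle to an approaching one can be sketched as a simple message plus a hazard check. The message fields, the JSON encoding and the heading-based crossing test are purely illustrative assumptions; the patent does not specify a message format or hazard criterion.

```python
import json

def pedestrian_report(sender_id, position, walking_heading_deg):
    """Detection message a parked vehicle (e.g. 20a) might broadcast over
    the local ZigBee/V2X mesh after its wake-up detects a pedestrian."""
    return json.dumps({"sender": sender_id,
                       "object": "pedestrian",
                       "position": position,             # e.g. (lat, lon)
                       "heading": walking_heading_deg})  # black arrow 28

def crossing_hazard(message, own_heading_deg, tolerance_deg=60):
    """Crude check by the approaching vehicle (22): warn when the reported
    walking direction is roughly perpendicular to its own travel direction,
    i.e. the pedestrian may step into its path from behind vehicle 20a."""
    report = json.loads(message)
    angle = abs((report["heading"] - own_heading_deg + 180) % 360 - 180)
    return report["object"] == "pedestrian" and abs(angle - 90) <= tolerance_deg
```

A production system would fuse the reported position with the approaching vehicle's own path prediction; the heading comparison here only illustrates that the parked vehicle's detection, not the approaching vehicle's camera, supplies the data.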
  • The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
  • The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EyeQ2 or EyeQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580; and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
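  • The detect-then-alert flow described above can be sketched as follows; the detection record and the overlay dictionary format here are illustrative assumptions, not the interface of any particular image processing chip.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    label: str                      # e.g. "vehicle" or "pedestrian"
    box: Tuple[int, int, int, int]  # (x, y, width, height) in display pixels

def build_alert_and_overlays(detections: List[Detection],
                             alert_labels=("vehicle", "pedestrian")):
    """Map detections from the image processor to (a) a driver-alert flag and
    (b) highlight rectangles to be drawn over the displayed video."""
    overlays = [{"rect": d.box, "color": "red", "label": d.label}
                for d in detections]
    alert = any(d.label in alert_labels for d in detections)
    return alert, overlays
```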
  • The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
  • For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145313; WO 2012/0145501; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/0116043; WO 2012/0145501; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2013/019795; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869; WO 2013/123161; WO 2013/126715; WO 2013/043661 and/or WO 2013/158592 and/or U.S. patent applications, Ser. No. 14/163,325, filed Jan. 24, 2014 (Attorney Docket No. MAG04 P-2216); Ser. No. 14/159,772, filed Jan. 21, 2014 (Attorney Docket MAG04 P2215); Ser. No. 14/107,624, filed Dec. 16, 2013 (Attorney Docket MAG04 P-2206); Ser. No. 14/102,981, filed Dec. 11, 2013 (Attorney Docket MAG04 P-2196); Ser. No. 14/102,980, filed Dec. 11, 2013 (Attorney Docket MAG04 P-2195); Ser. No. 14/098,817, filed Dec. 6, 2013 (Attorney Docket MAG04 P-2193); Ser. No. 14/097,581, filed Dec. 5, 2013 (Attorney Docket MAG04 P-2192); Ser. No. 14/093,981, filed Dec. 2, 2013 (Attorney Docket MAG04 P-2197); Ser. No. 14/093,980, filed Dec. 2, 2013 (Attorney Docket MAG04 P-2191); Ser. No. 14/082,573, filed Nov. 18, 2013 (Attorney Docket MAG04 P-2183); Ser. No. 14/082,574, filed Nov. 18, 2013 (Attorney Docket MAG04 P-2184); Ser. No. 
14/082,575, filed Nov. 18, 2013 (Attorney Docket MAG04 P-2185); Ser. No. 14/082,577, filed Nov. 18, 2013 (Attorney Docket MAG04 P-2203); Ser. No. 14/071,086, filed Nov. 4, 2013 (Attorney Docket MAG04 P2208); Ser. No. 14/076,524, filed Nov. 11, 2013 (Attorney Docket MAG04 P-2209); Ser. No. 14/052,945, filed Oct. 14, 2013 (Attorney Docket MAG04 P-2165); Ser. No. 14/046,174, filed Oct. 4, 2013 (Attorney Docket MAG04 P-2158); Ser. No. 14/016,790, filed Oct. 3, 2013 (Attorney Docket MAG04 P-2139); Ser. No. 14/036,723, filed Sep. 25, 2013 (Attorney Docket MAG04 P-2148); Ser. No. 14/016,790, filed Sep. 3, 2013 (Attorney Docket MAG04 P-2139); Ser. No. 14/001,272, filed Aug. 23, 2013 (Attorney Docket MAG04 P-1824); Ser. No. 13/970,868, filed Aug. 20, 2013 (Attorney Docket MAG04 P-2131); Ser. No. 13/964,134, filed Aug. 12, 2013 (Attorney Docket MAG04 P-2123); Ser. No. 13/942,758, filed Jul. 16, 2013 (Attorney Docket MAG04 P-2127); Ser. No. 13/942,753, filed Jul. 16, 2013 (Attorney Docket MAG04 P-2112); Ser. No. 13/927,680, filed Jun. 26, 2013 (Attorney Docket MAG04 P-2091); Ser. No. 13/916,051, filed Jun. 12, 2013 (Attorney Docket MAG04 P-2081); Ser. No. 13/894,870, filed May 15, 2013 (Attorney Docket MAG04 P-2062); Ser. No. 13/887,724, filed May 6, 2013 (Attorney Docket MAG04 P-2072); Ser. No. 13/852,190, filed Mar. 28, 2013 (Attorney Docket MAG04 P2046); Ser. No. 13/851,378, filed Mar. 27, 2013 (Attorney Docket MAG04 P-2036); Ser. No. 13/848,796, filed Mar. 22, 2012 (Attorney Docket MAG04 P-2034); Ser. No. 13/847,815, filed Mar. 20, 2013 (Attorney Docket MAG04 P-2030); Ser. No. 13/800,697, filed Mar. 13, 2013 (Attorney Docket MAG04 P-2060); Ser. No. 13/785,099, filed Mar. 5, 2013 (Attorney Docket MAG04 P-2017); Ser. No. 13/779,881, filed Feb. 28, 2013 (Attorney Docket MAG04 P-2028); Ser. No. 13/774,317, filed Feb. 22, 2013 (Attorney Docket MAG04 P-2015); Ser. No. 13/774,315, filed Feb. 22, 2013 (Attorney Docket MAG04 P-2013); Ser. No. 13/681,963, filed Nov. 
20, 2012 (Attorney Docket MAG04 P-1983); Ser. No. 13/660,306, filed Oct. 25, 2012 (Attorney Docket MAG04 P-1950); Ser. No. 13/653,577, filed Oct. 17, 2012 (Attorney Docket MAG04 P-1948); and/or Ser. No. 13/534,657, filed Jun. 27, 2012 (Attorney Docket MAG04 P-1892), and/or U.S. provisional applications, Ser. No. 61/919,129, filed Dec. 20, 2013; Ser. No. 61/919,130, filed Dec. 20, 2013; Ser. No. 61/919,131, filed Dec. 20, 2013; Ser. No. 61/919,147, filed Dec. 20, 2013; Ser. No. 61/919,138, filed Dec. 20, 2013; Ser. No. 61/919,133, filed Dec. 20, 2013; Ser. No. 61/918,290, filed Dec. 19, 2013; Ser. No. 61/915,218, filed Dec. 12, 2013; Ser. No. 61/912,146, filed Dec. 5, 2013; Ser. No. 61/911,666, filed Dec. 4, 2013; Ser. No. 61/911,665, filed Dec. 4, 2013; Ser. No. 61/905,461, filed Nov. 18, 2013; Ser. No. 61/905,462, filed Nov. 18, 2013; Ser. No. 61/901,127, filed Nov. 7, 2013; Ser. No. 61/895,610, filed Oct. 25, 2013; Ser. No. 61/895,609, filed Oct. 25, 2013; Ser. No. 61/886,883, filed Oct. 4, 2013; Ser. No. 61/879,837, filed Sep. 19, 2013; Ser. No. 61/879,835, filed Sep. 19, 2013; Ser. No. 61/878,877, filed Sep. 17, 2013; Ser. No. 61/875,351, filed Sep. 9, 2013; Ser. No. 61/869,195, filed Aug. 23, 2013; Ser. No. 61/864,835, filed Aug. 12, 2013; Ser. No. 61/864,836, filed Aug. 12, 2013; Ser. No. 61/864,837, filed Aug. 12, 2013; Ser. No. 61/864,838, filed Aug. 12, 2013; Ser. No. 61/856,843, filed Jul. 22, 2013; Ser. No. 61/845,061, filed Jul. 11, 2013; Ser. No. 61/844,630, filed Jul. 10, 2013; Ser. No. 61/844,173, filed Jul. 9, 2013; Ser. No. 61/844,171, filed Jul. 9, 2013; Ser. No. 61/842,644, filed Jul. 3, 2013; Ser. No. 61/840,542, filed Jun. 28, 2013; Ser. No. 61/838,619, filed Jun. 24, 2013; Ser. No. 61/838,621, filed Jun. 24, 2013; Ser. No. 61/837,955, filed Jun. 21, 2013; Ser. No. 61/836,900, filed Jun. 19, 2013; Ser. No. 61/836,380, filed Jun. 18, 2013; Ser. No. 61/834,129, filed Jun. 12, 2013; Ser. No. 61/833,080, filed Jun. 10, 2013; Ser. No. 
61/830,375, filed Jun. 3, 2013; Ser. No. 61/830,377, filed Jun. 3, 2013; Ser. No. 61/825,752, filed May 21, 2013; Ser. No. 61/825,753, filed May 21, 2013; Ser. No. 61/823,648, filed May 15, 2013; Ser. No. 61/823,644, filed May 15, 2013; Ser. No. 61/821,922, filed May 10, 2013; Ser. No. 61/819,835, filed May 6, 2013; Ser. No. 61/819,033, filed May 3, 2013; Ser. No. 61/816,956, filed Apr. 29, 2013; Ser. No. 61/815,044, filed Apr. 23, 2013; Ser. No. 61/814,533, filed Apr. 22, 2013; Ser. No. 61/813,361, filed Apr. 18, 2013; Ser. No. 61/810,407, filed Apr. 10, 2013; Ser. No. 61/808,930, filed Apr. 5, 2013; Ser. No. 61/806,674, filed Mar. 29, 2013; Ser. No. 61/793,592, filed Mar. 15, 2013; Ser. No. 61/772,015, filed Mar. 4, 2013; Ser. No. 61/772,014, filed Mar. 4, 2013; Ser. No. 61/770,051, filed Feb. 27, 2013; Ser. No. 61/766,883, filed Feb. 20, 2013; and/or Ser. No. 61/760,366, filed Feb. 4, 2013, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011 (Attorney Docket MAG04 P-1595), which are hereby incorporated herein by reference in their entireties.
  • The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454; and 6,824,281, and/or International Publication Nos. WO 2010/099416; WO 2011/028686; and/or WO 2013/016409, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012 (Attorney Docket MAG04 P-1892), which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361; and/or Ser. No. 13/260,400, filed Sep. 26, 2011 (Attorney Docket MAG04 P-1757), and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; 7,720,580; and/or 7,965,336, and/or International Publication Nos. 
WO/2009/036176 and/or WO/2009/046268, which are all hereby incorporated herein by reference in their entireties.
  • The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149; and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176; and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 
23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268; and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.
  • Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009 (Attorney Docket DON01 P-1564), which are hereby incorporated herein by reference in their entireties.
  • Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 (Attorney Docket DON01 P-1797), which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252; and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. 
Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).
  • Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/145822; WO 2013/081985; WO 2013/086249; and/or WO 2013/109869, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 (Attorney Docket DON01 P-1797), which are hereby incorporated herein by reference in their entireties.
  • Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 
5,910,854; 6,420,036; and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.
  • Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742; and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.
  • Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.

Claims (20)

1. A vision system for a vehicle, said vehicle vision system comprising:
at least one camera disposed at a vehicle equipped with said vehicle vision system, wherein said at least one camera has a field of view exterior the equipped vehicle and is operable to capture image data;
an image processor operable to process image data captured by said at least one camera;
a recording device operable to record images captured by said at least one camera;
a control controlling operation of said at least one camera;
wherein said control, responsive to a determination that the equipped vehicle is in a parked state, controls said at least one camera to capture frames of image data at a first capture rate;
wherein, responsive to image processing of captured image data, said control compares a frame of captured image data to at least one previous frame of captured image data; and
wherein, responsive to said comparison determining a change in the frames of captured image data beyond a threshold degree of change, (i) said control increases the capture rate to a second capture rate, (ii) said at least one camera captures frames of image data at said second capture rate and (iii) said control activates said recording device to record images captured at said second capture rate.
2. The vehicle vision system of claim 1, wherein, responsive to determination of said threshold degree of change, said control activates at least one other camera or sensor to capture data and said recording device records information captured by said at least one other camera or sensor.
3. The vehicle vision system of claim 1, wherein said first capture rate is less than or equal to about five frames per second and wherein said second capture rate is greater than or equal to about 10 frames per second.
4. The vehicle vision system of claim 1, wherein said control is operable to determine a repeating movement in a region encompassed by the field of view of said at least one camera, and wherein, responsive to said determination of a repeating movement, said control ignores the portion of captured image data that corresponds to that region.
5. The vehicle vision system of claim 1, wherein said control, responsive to determination of a threshold level of repeating activity at at least a portion of captured image data, increases the threshold degree of change.
6. The vehicle vision system of claim 1, wherein said control, responsive to a determination of a pedestrian in the field of view of said at least one camera, generates a signal to other vehicles indicative of the determined pedestrian.
7. The vehicle vision system of claim 1, wherein said control, responsive to a signal received from another vehicle that is indicative of a determination of a pedestrian at or near the other vehicle who is moving towards the path of travel of the equipped vehicle, generates an alert to the driver of the equipped vehicle.
8. The vehicle vision system of claim 1, wherein said control, responsive to determination of an interruption in a signal communicated to said control from a remote transmitter, at least temporarily disables an access system of the equipped vehicle.
9. The vehicle vision system of claim 8, wherein, responsive to determination of the interruption of the signal communicated to said control, a person situated remote from the equipped vehicle is alerted.
10. The vehicle vision system of claim 9, wherein said person is alerted via a hand held device.
11. The vehicle vision system of claim 1, wherein said at least one camera comprises a plurality of cameras having respective exterior fields of view, said cameras comprising at least a part of a surround view system of the equipped vehicle.
12. The vehicle vision system of claim 1, wherein said at least one camera comprises a forward viewing camera of the equipped vehicle, and wherein said image processor processes captured image data when the vehicle is operating for at least one of (i) a headlamp control system of the equipped vehicle, (ii) a lane change assistance system of the equipped vehicle and (iii) a lane departure warning system of the equipped vehicle.
13. The vehicle vision system of claim 1, wherein said at least one camera comprises a rearward viewing camera of the equipped vehicle, and wherein said image processor processes captured image data when the vehicle is operating for at least one of (i) a rear backup assist system of the equipped vehicle and (ii) a surround view system of the equipped vehicle.
14. A vision system for a vehicle, said vehicle vision system comprising:
a plurality of cameras disposed at a vehicle equipped with said vehicle vision system, wherein said cameras have respective fields of view exterior the equipped vehicle and are operable to capture image data;
an image processor operable to process image data captured by said cameras;
a recording device operable to record images captured by said cameras;
a control controlling operation of said cameras;
wherein said control, responsive to a determination that the equipped vehicle is in a parked state, controls said cameras to capture respective frames of image data at a first capture rate;
wherein, responsive to image processing of image data captured by said cameras, said control compares a frame of image data captured by an individual camera to at least one previous frame of image data captured by that individual camera; and
wherein, responsive to said comparison determining a change in the frames of image data captured by an individual camera beyond a threshold degree of change, (i) said control increases the capture rate to a second capture rate for at least that individual camera, (ii) at least that individual camera captures frames of image data at said second capture rate and (iii) said control activates said recording device to record images captured at said second capture rate by at least that individual camera.
15. The vehicle vision system of claim 14, wherein, responsive to determination of said threshold degree of change in frames of image data captured by an individual camera, (i) said control increases the capture rate of the others of said cameras to said second capture rate, (ii) the others of said cameras capture frames of image data at said second capture rate and (iii) said recording device records images captured at said second capture rate by the others of said cameras.
16. The vehicle vision system of claim 14, wherein said control, responsive to determination of a threshold level of repeating activity at at least a portion of image data captured by at least one of said cameras, increases the threshold degree of change.
17. The vehicle vision system of claim 14, wherein said control, responsive to a determination of a pedestrian in the field of view of at least one of said cameras, generates a signal to other vehicles indicative of the determined pedestrian.
18. The vehicle vision system of claim 14, wherein said plurality of cameras and said image processor comprise at least a part of a surround view system of the equipped vehicle.
19. A vision system for a vehicle, said vehicle vision system comprising:
a plurality of cameras disposed at a vehicle equipped with said vehicle vision system, wherein said cameras have respective fields of view exterior the equipped vehicle and are operable to capture image data;
an image processor operable to process image data captured by said cameras;
wherein said plurality of cameras and said image processor comprise at least a part of a surround view system of the equipped vehicle;
a recording device operable to record images captured by said cameras;
a control controlling operation of said cameras;
wherein said control, responsive to a determination that the equipped vehicle is in a parked state, controls said cameras to capture respective frames of image data at a first capture rate;
wherein, responsive to image processing of image data captured by said cameras, said control compares a frame of image data captured by an individual camera to at least one previous frame of image data captured by that individual camera; and
wherein, responsive to said comparison determining a change in the frames of image data captured by an individual camera beyond a threshold degree of change, (i) said control increases the capture rate to a second capture rate for at least that individual camera, (ii) at least that individual camera captures frames of image data at said second capture rate and (iii) said control activates said recording device to record images captured at said second capture rate by at least that individual camera;
wherein said control, responsive to determination of a threshold level of repeating activity at at least a portion of captured image data of at least one of said cameras, increases the threshold degree of change.
20. The vehicle vision system of claim 19, wherein, responsive to determination of said threshold degree of change in frames of image data captured by an individual camera, (i) said control increases the capture rate of the others of said cameras to said second capture rate, (ii) the others of said cameras capture frames of image data at said second capture rate and (iii) said recording device records images captured at said second capture rate by the others of said cameras.
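The parked-mode capture logic recited in claims 1, 3 and 5 (a first capture rate while parked, frame-to-frame comparison against a threshold, escalation to a second capture rate with recording, and a threshold that rises under repeating activity) can be sketched as follows. The class name, difference metric, thresholds and history length are illustrative assumptions, not the claimed implementation.

```python
LOW_RATE_FPS = 5    # first capture rate while parked (claim 3: about 5 fps or less)
HIGH_RATE_FPS = 10  # second capture rate once a change is detected (claim 3: about 10 fps or more)

class ParkedModeMonitor:
    """Illustrative control: compares each frame of captured image data to
    the previous frame and escalates the capture rate / starts recording
    when the change exceeds a threshold."""

    def __init__(self, base_threshold=12.0, history=20):
        self.base_threshold = base_threshold
        self.threshold = base_threshold
        self.history = history
        self.prev = None
        self.recent = []  # recent frame-difference scores
        self.capture_rate = LOW_RATE_FPS
        self.recording = False

    def _difference(self, a, b):
        # mean absolute per-pixel difference; frames are flat lists of 0-255 values
        return sum(abs(p - q) for p, q in zip(a, b)) / len(a)

    def process_frame(self, frame):
        if self.prev is not None:
            change = self._difference(frame, self.prev)
            self.recent = (self.recent + [change])[-self.history:]
            # Repeating activity (e.g. foliage moving in the wind) keeps the
            # difference score high frame after frame; raise the threshold so
            # it no longer triggers recording (claims 5 and 16).
            if len(self.recent) == self.history and min(self.recent) > self.base_threshold:
                self.threshold = 2.0 * max(self.recent)
            elif change > self.threshold:
                self.capture_rate = HIGH_RATE_FPS
                self.recording = True
        self.prev = frame
```

In a multi-camera arrangement (claims 14 and 15), one such monitor could run per camera, with a triggered camera's escalation propagated to the others.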

Priority Applications (3)

Application Number Priority Date Filing Date Title
US14/169,329 US20140218529A1 (en) 2013-02-04 2014-01-31 Vehicle data recording system
US16/380,438 US10523904B2 (en) 2013-02-04 2019-04-10 Vehicle data recording system
US16/729,800 US11012668B2 (en) 2013-02-04 2019-12-30 Vehicular security system that limits vehicle access responsive to signal jamming detection

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361760364P 2013-02-04 2013-02-04
US201361893489P 2013-10-21 2013-10-21
US14/169,329 US20140218529A1 (en) 2013-02-04 2014-01-31 Vehicle data recording system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/380,438 Continuation US10523904B2 (en) 2013-02-04 2019-04-10 Vehicle data recording system

Publications (1)

Publication Number Publication Date
US20140218529A1 true US20140218529A1 (en) 2014-08-07

Family

ID=51258924

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/169,329 Abandoned US20140218529A1 (en) 2013-02-04 2014-01-31 Vehicle data recording system
US16/380,438 Active US10523904B2 (en) 2013-02-04 2019-04-10 Vehicle data recording system
US16/729,800 Active US11012668B2 (en) 2013-02-04 2019-12-30 Vehicular security system that limits vehicle access responsive to signal jamming detection

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/380,438 Active US10523904B2 (en) 2013-02-04 2019-04-10 Vehicle data recording system
US16/729,800 Active US11012668B2 (en) 2013-02-04 2019-12-30 Vehicular security system that limits vehicle access responsive to signal jamming detection

Country Status (1)

Country Link
US (3) US20140218529A1 (en)

Cited By (115)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140016815A1 (en) * 2012-07-12 2014-01-16 Koji Kita Recording medium storing image processing program and image processing apparatus
US20140211007A1 (en) * 2013-01-28 2014-07-31 Fujitsu Ten Limited Object detector
US20150066237A1 (en) * 2013-09-05 2015-03-05 Hyundai Mobis Co., Ltd. Control method for around view stop mode
US9068390B2 (en) 2013-01-21 2015-06-30 Magna Electronics Inc. Vehicle hatch control system
US20150371527A1 (en) * 2013-02-13 2015-12-24 Volkswagen Ag Method and Device for Displaying Information of a System
US9269263B2 (en) 2012-02-24 2016-02-23 Magna Electronics Inc. Vehicle top clearance alert system
US9405120B2 (en) 2014-11-19 2016-08-02 Magna Electronics Solutions Gmbh Head-up display and vehicle using the same
US9499139B2 (en) 2013-12-05 2016-11-22 Magna Electronics Inc. Vehicle monitoring system
US9688199B2 (en) 2014-03-04 2017-06-27 Magna Electronics Inc. Vehicle alert system utilizing communication system
US9689191B1 (en) * 2015-10-08 2017-06-27 Hyundai Motor Company Power tailgate control device and method
US9718405B1 (en) * 2015-03-23 2017-08-01 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US9729636B2 (en) 2014-08-01 2017-08-08 Magna Electronics Inc. Smart road system for vehicles
US9740945B2 (en) 2015-01-14 2017-08-22 Magna Electronics Inc. Driver assistance system for vehicle
US9881501B2 (en) 2015-11-02 2018-01-30 Magna Electronics Inc. Driver assistance system with traffic light alert
US9881220B2 (en) 2013-10-25 2018-01-30 Magna Electronics Inc. Vehicle vision system utilizing communication system
US20180072270A1 (en) * 2016-09-09 2018-03-15 Magna Electronics Inc. Vehicle surround security system
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control
JP2018101402A (en) * 2016-12-21 2018-06-28 トヨタ自動車株式会社 Vehicle data recording device
US10019857B1 (en) * 2017-05-18 2018-07-10 Ford Global Technologies, Llc Hit-and-run detection
US10017114B2 (en) 2014-02-19 2018-07-10 Magna Electronics Inc. Vehicle vision system with display
WO2018130596A1 (en) * 2017-01-12 2018-07-19 Connaught Electronics Ltd. Method for operating a camera in dependency on a current state of an environmental region of the camera, camera, and motor vehicle
US10032369B2 (en) 2015-01-15 2018-07-24 Magna Electronics Inc. Vehicle vision system with traffic monitoring and alert
US10043091B2 (en) 2014-12-05 2018-08-07 Magna Electronics Inc. Vehicle vision system with retroreflector pattern recognition
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US10089869B1 (en) * 2017-05-25 2018-10-02 Ford Global Technologies, Llc Tracking hit and run perpetrators using V2X communication
US10115314B2 (en) 2015-07-08 2018-10-30 Magna Electronics Inc. Lane change system for platoon of vehicles
US10132971B2 (en) 2016-03-04 2018-11-20 Magna Electronics Inc. Vehicle camera with multiple spectral filters
US10147246B2 (en) 2016-06-09 2018-12-04 Magna Electronics Inc. Wheel bolt torque monitoring system for vehicle
US10190560B2 (en) 2016-06-03 2019-01-29 Magna Electronics Inc. Camera based vehicle start-stop feature
JP2019022063A (en) * 2017-07-14 2019-02-07 株式会社デンソーテン Image recording apparatus and image recording method
US10222224B2 (en) 2013-06-24 2019-03-05 Magna Electronics Inc. System for locating a parking space based on a previously parked space
US10240384B2 (en) * 2015-12-10 2019-03-26 Hyundai Motor Company Apparatus and method of controlling tailgate using rear-view camera in vehicle
US10250737B2 (en) * 2017-04-18 2019-04-02 Beijing Mobike Technology Co., Ltd. Terminal function setting method and device for vehicle unlocking, and mobile terminal
US20190118814A1 (en) * 2017-10-23 2019-04-25 Uber Technologies, Inc. Cargo trailer sensor assembly
US10324297B2 (en) 2015-11-30 2019-06-18 Magna Electronics Inc. Heads up display system for vehicle
US10331956B2 (en) 2015-09-23 2019-06-25 Magna Electronics Inc. Vehicle vision system with detection enhancement using light control
US10347129B2 (en) 2016-12-07 2019-07-09 Magna Electronics Inc. Vehicle system with truck turn alert
US10358115B2 (en) * 2017-10-23 2019-07-23 Hyundai Motor Company Vehicle, vehicle security system and vehicle security method
US10389016B2 (en) 2014-05-12 2019-08-20 Magna Electronics Inc. Vehicle communication system with heated antenna
US10401621B2 (en) 2016-04-19 2019-09-03 Magna Electronics Inc. Display unit for vehicle head-up display system
US10419723B2 (en) 2015-06-25 2019-09-17 Magna Electronics Inc. Vehicle communication system with forward viewing camera and integrated antenna
US10414339B2 (en) * 2014-11-07 2019-09-17 Connaught Electronics Ltd. Method for operating a camera system, camera system and motor vehicle
US10430674B2 (en) 2015-12-14 2019-10-01 Magna Electronics Inc. Vehicle vision system using reflective vehicle tags
US10462354B2 (en) 2016-12-09 2019-10-29 Magna Electronics Inc. Vehicle control system utilizing multi-camera module
US10486742B2 (en) 2016-08-01 2019-11-26 Magna Electronics Inc. Parking assist system using light projections
US10496090B2 (en) 2016-09-29 2019-12-03 Magna Electronics Inc. Handover procedure for driver of autonomous vehicle
CN110557560A (en) * 2018-05-31 2019-12-10 佳能株式会社 image pickup apparatus, control method thereof, and storage medium
US10523904B2 (en) 2013-02-04 2019-12-31 Magna Electronics Inc. Vehicle data recording system
US10562624B2 (en) 2016-11-18 2020-02-18 Magna Mirrors Of America, Inc. Vehicle vision system using aerial camera
EP3611595A1 (en) * 2018-08-16 2020-02-19 Veoneer Sweden AB A vision system for a motor vehicle and a method of training
US10574305B2 (en) 2016-05-11 2020-02-25 Magna Electronics Inc. Vehicle secured communication system
US10609335B2 (en) 2012-03-23 2020-03-31 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
US10640040B2 (en) 2011-11-28 2020-05-05 Magna Electronics Inc. Vision system for vehicle
US10650304B2 (en) 2016-05-11 2020-05-12 Magna Electronics Inc. Vehicle driving assist system with enhanced data processing
US10677894B2 (en) 2016-09-06 2020-06-09 Magna Electronics Inc. Vehicle sensing system for classification of vehicle model
US10703274B2 (en) * 2004-09-14 2020-07-07 Magna Electronics Inc. Vehicular multi-camera vision system including rear backup camera
US10708227B2 (en) 2016-07-19 2020-07-07 Magna Electronics Inc. Scalable secure gateway for vehicle
US10703341B2 (en) 2017-02-03 2020-07-07 Magna Electronics Inc. Vehicle sensor housing with theft protection
US10703204B2 (en) 2016-03-23 2020-07-07 Magna Electronics Inc. Vehicle driver monitoring system
US20200243113A1 (en) * 2019-01-30 2020-07-30 Alice Wilson Roberson Vehicle Camera and Record System
US10768298B2 (en) * 2016-06-14 2020-09-08 Magna Electronics Inc. Vehicle sensing system with 360 degree near range sensing
CN111656411A (en) * 2018-12-19 2020-09-11 Jvc建伍株式会社 Recording control device, recording control system, recording control method, and recording control program
US10782388B2 (en) 2017-02-16 2020-09-22 Magna Electronics Inc. Vehicle radar system with copper PCB
US10819943B2 (en) 2015-05-07 2020-10-27 Magna Electronics Inc. Vehicle vision system with incident recording function
US10852418B2 (en) 2016-08-24 2020-12-01 Magna Electronics Inc. Vehicle sensor with integrated radar and image sensors
CN112071083A (en) * 2020-09-15 2020-12-11 台州市远行客网络技术有限公司 Motor vehicle license plate relay identification system and license plate relay identification method
US10870426B2 (en) 2017-06-22 2020-12-22 Magna Electronics Inc. Driving assistance system with rear collision mitigation
US10870449B2 (en) * 2015-08-18 2020-12-22 Magna Electronics Inc. Vehicular trailering system
US10875498B2 (en) 2018-09-26 2020-12-29 Magna Electronics Inc. Vehicular alert system for door lock function
US10875403B2 (en) 2015-10-27 2020-12-29 Magna Electronics Inc. Vehicle vision system with enhanced night vision
US10883846B2 (en) 2017-08-23 2021-01-05 Magna Electronics Inc. Vehicle communication system with area classifier
US10936884B2 (en) * 2017-01-23 2021-03-02 Magna Electronics Inc. Vehicle vision system with object detection failsafe
US11006068B1 (en) * 2019-11-11 2021-05-11 Bendix Commercial Vehicle Systems Llc Video recording based on image variance
US11021107B2 (en) 2008-10-16 2021-06-01 Magna Mirrors Of America, Inc. Vehicular interior rearview mirror system with display
US11027654B2 (en) 2015-12-04 2021-06-08 Magna Electronics Inc. Vehicle vision system with compressed video transfer via DSRC link
US11034299B2 (en) 2015-05-06 2021-06-15 Magna Mirrors Of America, Inc. Vehicular vision system with episodic display of video images showing approaching other vehicle
US11047977B2 (en) 2018-02-20 2021-06-29 Magna Electronics Inc. Vehicle radar system with solution for ADC saturation
US11064165B2 (en) 2019-02-05 2021-07-13 Magna Electronics Inc. Wireless trailer camera system with tracking feature
US11068918B2 (en) 2016-09-22 2021-07-20 Magna Electronics Inc. Vehicle communication system
US11093766B1 (en) 2020-04-03 2021-08-17 Micron Technology, Inc. Detect and alert of forgotten items left in a vehicle
US11124113B2 (en) 2017-04-18 2021-09-21 Magna Electronics Inc. Vehicle hatch clearance determining system
US20210309183A1 (en) * 2020-04-03 2021-10-07 Micron Technology, Inc. Intelligent Detection and Alerting of Potential Intruders
US11146759B1 (en) * 2018-11-13 2021-10-12 JMJ Designs, LLC Vehicle camera system
US11244564B2 (en) 2017-01-26 2022-02-08 Magna Electronics Inc. Vehicle acoustic-based emergency vehicle detection
US11267393B2 (en) 2019-05-16 2022-03-08 Magna Electronics Inc. Vehicular alert system for alerting drivers of other vehicles responsive to a change in driving conditions
US11299113B2 (en) * 2018-12-07 2022-04-12 Hyundai Motor Company Device for assisting safe exit from vehicle, system having the same, and method thereof
US11333739B2 (en) 2019-02-26 2022-05-17 Magna Electronics Inc. Vehicular radar system with automatic sensor alignment
US11332124B2 (en) 2019-01-10 2022-05-17 Magna Electronics Inc. Vehicular control system
US11341671B2 (en) 2018-11-01 2022-05-24 Magna Electronics Inc. Vehicular driver monitoring system
US11417107B2 (en) 2018-02-19 2022-08-16 Magna Electronics Inc. Stationary vision system at vehicle roadway
US11454720B2 (en) 2018-11-28 2022-09-27 Magna Electronics Inc. Vehicle radar system with enhanced wave guide antenna system
US11454719B2 (en) 2016-07-08 2022-09-27 Magna Electronics Inc. 2D MIMO radar system for vehicle
US11462107B1 (en) 2019-07-23 2022-10-04 BlueOwl, LLC Light emitting diodes and diode arrays for smart ring visual output
US11479258B1 (en) 2019-07-23 2022-10-25 BlueOwl, LLC Smart ring system for monitoring UVB exposure levels and using machine learning technique to predict high risk driving behavior
US11486968B2 (en) 2017-11-15 2022-11-01 Magna Electronics Inc. Vehicle Lidar sensing system with sensor module
US11488399B2 (en) 2018-12-19 2022-11-01 Magna Electronics Inc. Vehicle driver monitoring system for determining driver workload
US11488397B2 (en) 2015-11-02 2022-11-01 Magna Electronics Inc. Vehicle customization system
US11503251B2 (en) 2012-01-20 2022-11-15 Magna Electronics Inc. Vehicular vision system with split display
US11537203B2 (en) 2019-07-23 2022-12-27 BlueOwl, LLC Projection system for smart ring visual output
US11536829B2 (en) 2017-02-16 2022-12-27 Magna Electronics Inc. Vehicle radar system with radar embedded into radome
US11551644B1 (en) * 2019-07-23 2023-01-10 BlueOwl, LLC Electronic ink display for smart ring
US11560143B2 (en) 2020-05-26 2023-01-24 Magna Electronics Inc. Vehicular autonomous parking system using short range communication protocols
US11594128B2 (en) 2019-07-23 2023-02-28 BlueOwl, LLC Non-visual outputs for a smart ring
US11637511B2 (en) 2019-07-23 2023-04-25 BlueOwl, LLC Harvesting energy for a smart ring via piezoelectric charging
US11653090B1 (en) * 2017-07-04 2023-05-16 Ramin Farjadrad Intelligent distributed systems and methods for live traffic monitoring and optimization
US20230150447A1 (en) * 2017-11-06 2023-05-18 Magna Electronics Inc. Vehicular vision system with underbody camera
US11749105B2 (en) 2020-10-01 2023-09-05 Magna Electronics Inc. Vehicular communication system with turn signal identification
US11808876B2 (en) 2018-10-25 2023-11-07 Magna Electronics Inc. Vehicular radar system with vehicle to infrastructure communication
US11823395B2 (en) 2020-07-02 2023-11-21 Magna Electronics Inc. Vehicular vision system with road contour detection feature
US11853030B2 (en) 2019-07-23 2023-12-26 BlueOwl, LLC Soft smart ring and method of manufacture
US11866063B2 (en) 2020-01-10 2024-01-09 Magna Electronics Inc. Communication system and method
US11877054B2 (en) 2011-09-21 2024-01-16 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US11894704B2 (en) 2019-07-23 2024-02-06 BlueOwl, LLC Environment-integrated smart ring charger
US11949673B1 (en) 2019-07-23 2024-04-02 BlueOwl, LLC Gesture authentication using a smart ring
US11971475B2 (en) 2022-09-26 2024-04-30 Magna Electronics Inc. 2D MIMO vehicular radar sensing system

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
DE102017119394A1 (en) * 2017-08-28 2019-02-28 HELLA GmbH & Co. KGaA Method for controlling at least one light module of a lighting unit of a vehicle, lighting unit, computer program product and computer-readable medium
US10577852B2 (en) * 2018-02-13 2020-03-03 GM Global Technology Operations LLC Method and apparatus for preventing tailgate collision with hitch accessory
US11353872B2 (en) 2018-07-30 2022-06-07 Pony Ai Inc. Systems and methods for selectively capturing and filtering sensor data of an autonomous vehicle
US11745816B2 (en) * 2020-05-28 2023-09-05 Acton, Inc. Electric vehicle with one or more cameras
US11676397B2 (en) 2020-10-15 2023-06-13 Denso International America, Inc. System and method for detecting an object collision

Citations (9)

Publication number Priority date Publication date Assignee Title
US20040016870A1 (en) * 2002-05-03 2004-01-29 Pawlicki John A. Object detection system for vehicle
US20040212678A1 (en) * 2003-04-25 2004-10-28 Cooper Peter David Low power motion detection system
US20100198513A1 (en) * 2009-02-03 2010-08-05 Gm Global Technology Operations, Inc. Combined Vehicle-to-Vehicle Communication and Object Detection Sensing
US20100265344A1 (en) * 2009-04-15 2010-10-21 Qualcomm Incorporated Auto-triggered fast frame rate digital video recording
US20110148609A1 (en) * 2009-12-21 2011-06-23 Harsha Dabholkar Apparatus And Method For Reducing False Alarms In Stolen Vehicle Tracking
US20120105635A1 (en) * 2010-10-27 2012-05-03 Erhardt Herbert J Automotive imaging system for recording exception events
US20120268601A1 (en) * 2011-04-25 2012-10-25 Mitac International Corp. Method of recording traffic images and a drive recorder system
US20130141597A1 (en) * 2011-12-06 2013-06-06 Kyungsuk David Lee Controlling power consumption in object tracking pipeline
US20140192206A1 (en) * 2013-01-07 2014-07-10 Leap Motion, Inc. Power consumption in motion-capture systems

Family Cites Families (249)

Publication number Priority date Publication date Assignee Title
US5170374A (en) 1981-05-13 1992-12-08 Hitachi, Ltd. Semiconductor memory
US6442465B2 (en) 1992-05-05 2002-08-27 Automotive Technologies International, Inc. Vehicular component control systems and methods
US6735506B2 (en) 1992-05-05 2004-05-11 Automotive Technologies International, Inc. Telematics system
US5845000A (en) 1992-05-05 1998-12-01 Automotive Technologies International, Inc. Optical identification and monitoring system using pattern recognition for use with vehicles
US5001558A (en) 1985-06-11 1991-03-19 General Motors Corporation Night vision system with color video camera
JPH01173825A (en) 1987-12-28 1989-07-10 Aisin Aw Co Ltd Navigation device for vehicle
US5003288A (en) 1988-10-25 1991-03-26 Nartron Corporation Ambient light sensing method and apparatus
US5614885A (en) 1988-12-05 1997-03-25 Prince Corporation Electrical control system for vehicle options
FR2642855B1 (en) 1989-02-06 1991-05-17 Essilor Int OPTICAL LENS FOR THE CORRECTION OF ASTIGMATISM
JPH0749925B2 (en) 1989-03-01 1995-05-31 浜松ホトニクス株式会社 Two-dimensional incident position detector
US5097362A (en) 1989-07-19 1992-03-17 Lynas Robert M Rearview mirror targeting and repositioning system
US5027001A (en) 1989-08-29 1991-06-25 Torbert William F Moisture sensitive automatic windshield wiper and headlight control device
US4987357A (en) 1989-12-18 1991-01-22 General Motors Corporation Adaptive motor vehicle cruise control
US5059877A (en) 1989-12-22 1991-10-22 Libbey-Owens-Ford Co. Rain responsive windshield wiper control
JP2843079B2 (en) 1989-12-22 1999-01-06 本田技研工業株式会社 Driving path determination method
FR2658642B1 (en) 1990-02-20 1994-06-10 Rousseau Codes METHOD AND DEVICE FOR DRIVER TRAINING FOR LAND VEHICLES.
US5303205A (en) 1990-02-26 1994-04-12 Trend Tec Inc. Vehicular distance measuring system with integral mirror display
JP2920653B2 (en) 1990-03-15 1999-07-19 アイシン精機株式会社 In-vehicle imaging device
DE4111993B4 (en) 1990-04-23 2005-05-25 Volkswagen Ag Camera for an image processing system
US5121200A (en) 1990-07-06 1992-06-09 Choi Seung Lyul Travelling monitoring system for motor vehicles
US5027200A (en) 1990-07-10 1991-06-25 Edward Petrossian Enhanced viewing at side and rear of motor vehicles
US5177685A (en) 1990-08-09 1993-01-05 Massachusetts Institute Of Technology Automobile navigation system using real time spoken driving instructions
US5086253A (en) 1990-10-15 1992-02-04 Lawler Louis N Automatic headlight dimmer apparatus
US5309137A (en) 1991-02-26 1994-05-03 Mitsubishi Denki Kabushiki Kaisha Motor car traveling control device
US5451822A (en) 1991-03-15 1995-09-19 Gentex Corporation Electronic control system
KR930001987Y1 (en) 1991-03-28 1993-04-19 홍선택 Rear-view mirror adjusting device
WO1992018848A1 (en) 1991-04-23 1992-10-29 Introlab Pty. Limited A moisture sensor
US5182502A (en) 1991-05-06 1993-01-26 Lectron Products, Inc. Automatic headlamp dimmer
US5245422A (en) 1991-06-28 1993-09-14 Zexel Corporation System and method for automatically steering a vehicle within a lane in a road
JP2782990B2 (en) 1991-07-11 1998-08-06 日産自動車株式会社 Vehicle approach determination device
US5469298A (en) 1991-08-14 1995-11-21 Prince Corporation Reflective display at infinity
JPH0554276A (en) 1991-08-23 1993-03-05 Matsushita Electric Ind Co Ltd Obstacle detection device
US5193000A (en) 1991-08-28 1993-03-09 Stereographics Corporation Multiplexing technique for stereoscopic video system
US5416318A (en) 1991-10-03 1995-05-16 Hegyi; Dennis J. Combined headlamp and climate control sensor having a light diffuser and a light modulator
FR2682792B1 (en) 1991-10-16 1995-10-20 Ii Bc Sys DEVICE FOR AVOIDING CHAIN-REACTION COLLISIONS.
JP3167752B2 (en) 1991-10-22 2001-05-21 富士重工業株式会社 Vehicle distance detection device
US5535314A (en) 1991-11-04 1996-07-09 Hughes Aircraft Company Video image processor and method for detecting vehicles
JP3031013B2 (en) 1991-11-15 2000-04-10 日産自動車株式会社 Visual information providing device
US5276389A (en) 1991-12-14 1994-01-04 Leopold Kostal Gmbh & Co. Kg Method of controlling a windshield wiper system
US5336980A (en) 1992-12-10 1994-08-09 Leopold Kostal Gmbh & Co. Apparatus and method for controlling a windshield wiping system
US5394333A (en) 1991-12-23 1995-02-28 Zexel Usa Corp. Correcting GPS position in a hybrid navigation system
US5208701A (en) 1991-12-24 1993-05-04 Xerox Corporation Wobble correction lens with binary diffractive optic surface and refractive cylindrical surface
US5461357A (en) 1992-01-29 1995-10-24 Mazda Motor Corporation Obstacle detection device for vehicle
JP2800531B2 (en) 1992-02-28 1998-09-21 三菱電機株式会社 Obstacle detection device for vehicles
JP2973695B2 (en) 1992-03-12 1999-11-08 船井電機株式会社 In-vehicle navigation system
JPH05265547A (en) 1992-03-23 1993-10-15 Fuji Heavy Ind Ltd On-vehicle outside monitoring device
US5204778A (en) 1992-04-06 1993-04-20 Gentex Corporation Control system for automatic rearview mirrors
US5325386A (en) 1992-04-21 1994-06-28 Bandgap Technology Corporation Vertical-cavity surface emitting laser assay display system
EP0567660B2 (en) 1992-04-21 2000-09-06 IBP Pietzsch GmbH Device for the guiding of vehicles
GB2267341B (en) 1992-05-27 1996-02-21 Koito Mfg Co Ltd Glare sensor for a vehicle
US5515448A (en) 1992-07-28 1996-05-07 Yazaki Corporation Distance measuring apparatus of a target tracking type
JPH0785280B2 (en) 1992-08-04 1995-09-13 タカタ株式会社 Collision prediction judgment system by neural network
US5351044A (en) 1992-08-12 1994-09-27 Rockwell International Corporation Vehicle lane position detection system
BR9306901A (en) 1992-08-14 1998-12-08 Vorad Safety Systems Inc Recording of operational events in an automotive vehicle
EP0655142B1 (en) 1992-08-14 1999-06-23 Vorad Safety Systems, Inc. Smart blind spot sensor
JP2783079B2 (en) 1992-08-28 1998-08-06 トヨタ自動車株式会社 Light distribution control device for headlamp
US5448319A (en) 1992-09-22 1995-09-05 Olympus Optical Co., Ltd. Optical system for monitor cameras to be mounted on vehicles
DE4332612C2 (en) 1992-09-25 1996-02-22 Yazaki Corp Exterior view monitoring method for motor vehicles
JP3462227B2 (en) 1992-11-13 2003-11-05 矢崎総業株式会社 Display device for vehicles
EP0631167B1 (en) 1992-12-14 2005-02-16 Denso Corporation Image display
US5285060A (en) 1992-12-15 1994-02-08 Donnelly Corporation Display for automatic rearview mirror
JP3263699B2 (en) 1992-12-22 2002-03-04 三菱電機株式会社 Driving environment monitoring device
DE69324224T2 (en) 1992-12-29 1999-10-28 Koninkl Philips Electronics Nv Image processing method and apparatus for generating an image from a plurality of adjacent images
JPH06213660A (en) 1993-01-19 1994-08-05 Aisin Seiki Co Ltd Detecting method for approximate straight line of image
US5529138A (en) 1993-01-22 1996-06-25 Shaw; David C. H. Vehicle collision avoidance system
US5289321A (en) 1993-02-12 1994-02-22 Secor James O Consolidated rear view camera and display system for motor vehicle
US5313072A (en) 1993-02-16 1994-05-17 Rockwell International Corporation Optical detector for windshield wiper control
US6498620B2 (en) 1993-02-26 2002-12-24 Donnelly Corporation Vision system for a vehicle including an image capture device and a display system having a long focal length
US5550677A (en) 1993-02-26 1996-08-27 Donnelly Corporation Automatic rearview mirror system using a photosensor array
US5877897A (en) 1993-02-26 1999-03-02 Donnelly Corporation Automatic rearview mirror, vehicle lighting control and vehicle interior monitoring system using a photosensor array
US5796094A (en) 1993-02-26 1998-08-18 Donnelly Corporation Vehicle headlight control using imaging sensor
US6396397B1 (en) 1993-02-26 2002-05-28 Donnelly Corporation Vehicle imaging system with stereo imaging
US5670935A (en) 1993-02-26 1997-09-23 Donnelly Corporation Rearview vision system for vehicle including panoramic view
JP3468428B2 (en) 1993-03-24 2003-11-17 富士重工業株式会社 Vehicle distance detection device
DE4408745C2 (en) 1993-03-26 1997-02-27 Honda Motor Co Ltd Driving control device for vehicles
JP2887039B2 (en) 1993-03-26 1999-04-26 三菱電機株式会社 Vehicle periphery monitoring device
US6430303B1 (en) 1993-03-31 2002-08-06 Fujitsu Limited Image processing apparatus
JPH09501120A (en) 1993-03-31 1997-02-04 オートモーティブ・テクノロジーズ・インターナショナル・インク Position / speed sensor for passengers in the vehicle
US6084519A (en) 1993-05-07 2000-07-04 Control Devices, Inc. Multi-function light sensor for vehicle
DE4318114C2 (en) 1993-06-01 1998-07-16 Kostal Leopold Gmbh & Co Kg Sensor device
US6553130B1 (en) 1993-08-11 2003-04-22 Jerome H. Lemelson Motor vehicle warning and control system and method
US5434407A (en) 1993-08-23 1995-07-18 Gentex Corporation Automatic rearview mirror incorporating light pipe
GB9317983D0 (en) 1993-08-28 1993-10-13 Lucas Ind Plc A driver assistance system for a vehicle
US5586063A (en) 1993-09-01 1996-12-17 Hardin; Larry C. Optical range and speed detection system
US5638116A (en) 1993-09-08 1997-06-10 Sumitomo Electric Industries, Ltd. Object recognition apparatus and method
US5374852A (en) 1993-09-17 1994-12-20 Parkes; Walter B. Motor vehicle headlight activation apparatus for inclement weather conditions
US5440428A (en) 1993-09-30 1995-08-08 Hughes Aircraft Company Automotive instrument 3-D virtual image display
US5883739A (en) 1993-10-04 1999-03-16 Honda Giken Kogyo Kabushiki Kaisha Information display device for vehicle
US5406395A (en) 1993-11-01 1995-04-11 Hughes Aircraft Company Holographic parking assistance device
JP3522317B2 (en) 1993-12-27 2004-04-26 富士重工業株式会社 Travel guide device for vehicles
US5430431A (en) 1994-01-19 1995-07-04 Nelson; Louis J. Vehicle protection system and method
US5471515A (en) 1994-01-28 1995-11-28 California Institute Of Technology Active pixel sensor with intra-pixel charge transfer
JP3358099B2 (en) 1994-03-25 2002-12-16 オムロン株式会社 Optical sensor device
US5666028A (en) 1994-04-06 1997-09-09 Gentex Corporation Automobile headlamp and running light control system
US5537003A (en) 1994-04-08 1996-07-16 Gentex Corporation Control system for automotive vehicle headlamps and other vehicle equipment
FR2718874B1 (en) 1994-04-15 1996-05-15 Thomson Csf Traffic monitoring method for automatic detection of vehicle incidents.
US5963247A (en) 1994-05-31 1999-10-05 Banitt; Shmuel Visual display systems and a system for producing recordings for visualization thereon and methods therefor
ES1028357Y (en) 1994-06-03 1995-06-16 Cortes Luis Leon Lamata RECEIVING DEVICE FOR REAR VIEW SCREEN.
US5574443A (en) 1994-06-22 1996-11-12 Hsieh; Chi-Sheng Vehicle monitoring apparatus with broadly and reliably rearward viewing
JP3287117B2 (en) 1994-07-05 2002-05-27 株式会社日立製作所 Environment recognition device for vehicles using imaging device
JP3357749B2 (en) 1994-07-12 2002-12-16 本田技研工業株式会社 Vehicle road image processing device
US5793420A (en) 1994-10-28 1998-08-11 Schmidt; William P. Video recording system for vehicle
US5732379A (en) 1994-11-25 1998-03-24 Itt Automotive Europe Gmbh Brake system for a motor vehicle with yaw moment control
US5677851A (en) 1994-12-15 1997-10-14 Novell, Inc. Method and apparatus to secure digital directory object changes
JPH08175263A (en) 1994-12-27 1996-07-09 Murakami Kaimeidou:Kk Interior mirror with built-in display device
US5614788A (en) 1995-01-31 1997-03-25 Autosmart Light Switches, Inc. Automated ambient condition responsive daytime running light system
US5528698A (en) 1995-03-27 1996-06-18 Rockwell International Corporation Automotive occupant sensing device
JP2885125B2 (en) 1995-03-30 1999-04-19 トヨタ自動車株式会社 Estimation method of motion state quantity changing with turning of vehicle
JP3539788B2 (en) 1995-04-21 2004-07-07 パナソニック モバイルコミュニケーションズ株式会社 Image matching method
US5500766A (en) 1995-05-04 1996-03-19 Stonecypher; Bob Blind spot side mirror
US5568027A (en) 1995-05-19 1996-10-22 Libbey-Owens-Ford Co. Smooth rain-responsive wiper control
US5737226A (en) 1995-06-05 1998-04-07 Prince Corporation Vehicle compass system with automatic calibration
US7202776B2 (en) 1997-10-22 2007-04-10 Intelligent Technologies International, Inc. Method and system for detecting objects external to a vehicle
US7085637B2 (en) 1997-10-22 2006-08-01 Intelligent Technologies International, Inc. Method and system for controlling a vehicle
US5915800A (en) 1995-06-19 1999-06-29 Fuji Jukogyo Kabushiki Kaisha System for controlling braking of an automotive vehicle
JP3546600B2 (en) 1995-09-07 2004-07-28 トヨタ自動車株式会社 Light distribution control device for headlamp
US5724316A (en) 1995-09-26 1998-03-03 Delco Electronics Corporation GPS based time determining system and method
US5878370A (en) 1995-12-01 1999-03-02 Prince Corporation Vehicle compass system with variable resolution
US6266082B1 (en) 1995-12-19 2001-07-24 Canon Kabushiki Kaisha Communication apparatus image processing apparatus communication method and image processing method
US5790973A (en) 1995-12-19 1998-08-04 Prince Corporation Last exit warning system
US5761094A (en) 1996-01-18 1998-06-02 Prince Corporation Vehicle compass system
US5786772A (en) 1996-03-22 1998-07-28 Donnelly Corporation Vehicle blind spot detection display system
US5661303A (en) 1996-05-24 1997-08-26 Libbey-Owens-Ford Co. Compact moisture sensor with collimator lenses and prismatic coupler
US6550949B1 (en) 1996-06-13 2003-04-22 Gentex Corporation Systems and components for enhancing rear vision from a vehicle
DE19624046A1 (en) 1996-06-17 1997-12-18 Bayerische Motoren Werke Ag Method and device for indicating the braking strength or deceleration in a vehicle
JP3805832B2 (en) 1996-07-10 2006-08-09 富士重工業株式会社 Vehicle driving support device
JPH1059068A (en) 1996-08-23 1998-03-03 Yoshihisa Furuta Dead angle confirmation device for vehicle
US5878357A (en) 1996-09-03 1999-03-02 Ford Global Technologies, Inc. Method and apparatus for vehicle yaw rate estimation
US5924212A (en) 1996-10-09 1999-07-20 Donnelly Corporation Electronic compass
JPH10161013A (en) 1996-12-05 1998-06-19 Canon Inc Environment recognition device and camera provided therewith
JP4162717B2 (en) 1996-12-10 2008-10-08 タッチ センサー テクノロジーズ,エルエルシー Differential touch sensor and control circuit thereof
US5877707A (en) 1997-01-17 1999-03-02 Kowalick; Thomas M. GPS based seat belt monitoring system & method for using same
US5844505A (en) 1997-04-01 1998-12-01 Sony Corporation Automobile navigation system
US5837994C1 (en) 1997-04-02 2001-10-16 Gentex Corp Control system to automatically dim vehicle head lamps
US6631316B2 (en) 2001-03-05 2003-10-07 Gentex Corporation Image processing system to control vehicle headlamps or other vehicle equipment
US6049171A (en) 1998-09-18 2000-04-11 Gentex Corporation Continuously variable headlamp control
US5923027A (en) 1997-09-16 1999-07-13 Gentex Corporation Moisture sensor and windshield fog detector using an image sensor
US6611610B1 (en) 1997-04-02 2003-08-26 Gentex Corporation Vehicle lamp control
US5990469A (en) 1997-04-02 1999-11-23 Gentex Corporation Control circuit for image array sensors
US6587573B1 (en) 2000-03-20 2003-07-01 Gentex Corporation System for controlling exterior vehicle lights
US5883443A (en) * 1997-06-27 1999-03-16 Ut Automotive Dearborn, Inc. Countermeasure method and system for securing a remote keyless entry system
JP3508909B2 (en) 1997-07-01 2004-03-22 株式会社村上開明堂 Rearview mirror quick deflection controller
US6313454B1 (en) 1999-07-02 2001-11-06 Donnelly Corporation Rain sensor
US6353392B1 (en) 1997-10-30 2002-03-05 Donnelly Corporation Rain sensor with fog discrimination
US6020704A (en) 1997-12-02 2000-02-01 Valeo Electrical Systems, Inc. Windscreen sensing and wiper control system
US6294989B1 (en) 1998-12-16 2001-09-25 Donnelly Corporation Tire inflation assistance monitoring system
US6124647A (en) 1998-12-16 2000-09-26 Donnelly Corporation Information display in a rearview mirror
DE19812237C1 (en) 1998-03-20 1999-09-23 Daimler Chrysler Ag Method for driving dynamics control on a road vehicle
US5899956A (en) 1998-03-31 1999-05-04 Advanced Future Technologies, Inc. Vehicle mounted navigation device
US6477464B2 (en) 2000-03-09 2002-11-05 Donnelly Corporation Complete mirror-based global-positioning system (GPS) navigation solution
US6175300B1 (en) 1998-09-03 2001-01-16 Byron K. Kendrick Blind spot viewing system
US6066933A (en) 1998-10-02 2000-05-23 Ponziana; Richard L. Rain sensing system and method having automatically registered and oriented rain sensor
US6266442B1 (en) 1998-10-23 2001-07-24 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US6201642B1 (en) 1999-07-27 2001-03-13 Donnelly Corporation Vehicular vision system with a wide angle lens including a diffractive element
US6320282B1 (en) 1999-01-19 2001-11-20 Touchsensor Technologies, Llc Touch switch with integral control circuit
DE19902081A1 (en) 1999-01-20 2000-07-27 Zeiss Carl Fa Stabilized camera
US6166698A (en) 1999-02-16 2000-12-26 Gentex Corporation Rearview mirror with integrated microwave receiver
US6144022A (en) 1999-03-15 2000-11-07 Valeo Electrical Systems, Inc. Rain sensor using statistical analysis
US6333759B1 (en) 1999-03-16 2001-12-25 Joseph J. Mazzilli 360° automobile video camera system
US6392315B1 (en) 1999-04-05 2002-05-21 Delphi Technologies, Inc. Compensation circuit for an automotive ignition sensing system
CA2369648A1 (en) 1999-04-16 2000-10-26 Matsushita Electric Industrial Co., Limited Image processing device and monitoring system
US6795221B1 (en) 1999-08-05 2004-09-21 Microvision, Inc. Scanned display with switched feeds and distortion correction
US6411204B1 (en) 1999-11-15 2002-06-25 Donnelly Corporation Deceleration based anti-collision safety light control for vehicle
US6704621B1 (en) 1999-11-26 2004-03-09 Gideon P. Stein System and method for estimating ego-motion of a moving vehicle using successive images recorded along the vehicle's path of motion
SE520360C2 (en) 1999-12-15 2003-07-01 Goeran Sjoenell Warning device for vehicles
US6526335B1 (en) 2000-01-24 2003-02-25 G. Victor Treyz Automobile personal computer systems
JP2001213254A (en) 2000-01-31 2001-08-07 Yazaki Corp Side monitoring device for vehicle
WO2007053710A2 (en) 2005-11-01 2007-05-10 Donnelly Corporation Interior rearview mirror with display
US7167796B2 (en) 2000-03-09 2007-01-23 Donnelly Corporation Vehicle navigation system for use with a telematics system
AU2001243285A1 (en) 2000-03-02 2001-09-12 Donnelly Corporation Video mirror systems incorporating an accessory module
KR100373002B1 (en) 2000-04-03 2003-02-25 현대자동차주식회사 Method for judgment out of lane of vehicle
US7365769B1 (en) 2000-07-06 2008-04-29 Donald Mager Activating a vehicle's own brake lights and/or brakes when brake lights are sensed in front of the vehicle, including responsively to the proximity of, and/or rate of closure with, a forward vehicle
GB2365142B (en) 2000-07-27 2002-06-19 Michael John Downs Jamin-type interferometers and components therefor
JP3521860B2 (en) 2000-10-02 2004-04-26 日産自動車株式会社 Vehicle travel path recognition device
US7062300B1 (en) 2000-11-09 2006-06-13 Ki Il Kim Cellular phone holder with charger mounted to vehicle dashboard
US6672731B2 (en) 2000-11-20 2004-01-06 Donnelly Corporation Vehicular rearview mirror with blind spot viewing system
ES2287266T3 (en) 2001-01-23 2007-12-16 Donnelly Corporation IMPROVED VEHICLE LIGHTING SYSTEM.
US20020113873A1 (en) 2001-02-20 2002-08-22 Williams Michael R. Rear vision system for large vehicles
US6424273B1 (en) 2001-03-30 2002-07-23 Koninklijke Philips Electronics N.V. System to aid a driver to determine whether to change lanes
DE10118265A1 (en) 2001-04-12 2002-10-17 Bosch Gmbh Robert Detecting vehicle lane change, involves forming track change indicating signal by comparing measured angular rate of preceding vehicle(s) with vehicle's own yaw rate
DE20106977U1 (en) 2001-04-23 2002-08-29 Mekra Lang Gmbh & Co Kg Warning device in motor vehicles
US6539306B2 (en) 2001-06-15 2003-03-25 Gentex Corporation Automotive mirror with integrated Loran components
US6497503B1 (en) 2001-06-21 2002-12-24 Ford Global Technologies, Inc. Headlamp system with selectable beam pattern
WO2003029046A1 (en) 2001-10-03 2003-04-10 Maryann Winter Apparatus and method for sensing the occupancy status of parking spaces in a parking lot
US6636258B2 (en) 2001-10-19 2003-10-21 Ford Global Technologies, Llc 360° vision system for a vehicle
US6909753B2 (en) 2001-12-05 2005-06-21 Koninklijke Philips Electronics, N.V. Combined MPEG-4 FGS and modulation algorithm for wireless video transmission
US20030137586A1 (en) 2002-01-22 2003-07-24 Infinite Innovations, Inc. Vehicle video switching system and method
US6824281B2 (en) 2002-01-31 2004-11-30 Donnelly Corporation Vehicle accessory module
EP1332923B1 (en) 2002-02-05 2007-07-11 Donnelly Hohe GmbH & Co. KG Manoeuvring and/or parking aid device for a vehicle
US6975775B2 (en) 2002-03-06 2005-12-13 Radiant Imaging, Inc. Stray light correction method for imaging light and color measurement system
US20030222982A1 (en) 2002-03-28 2003-12-04 Hamdan Majil M. Integrated video/data information system and method for application to commercial vehicles to enhance driver awareness
US7145519B2 (en) 2002-04-18 2006-12-05 Nissan Motor Co., Ltd. Image display apparatus, method, and program for automotive vehicle
US7004606B2 (en) 2002-04-23 2006-02-28 Donnelly Corporation Automatic headlamp control
US6946978B2 (en) 2002-04-25 2005-09-20 Donnelly Corporation Imaging system for vehicle
US7123168B2 (en) 2002-04-25 2006-10-17 Donnelly Corporation Driving separation distance indicator
US10562492B2 (en) * 2002-05-01 2020-02-18 Gtj Ventures, Llc Control, monitoring and/or security apparatus and method
DE20214892U1 (en) 2002-09-25 2002-11-21 Hohe Gmbh & Co Kg Monitoring device for a motor vehicle
US7136753B2 (en) 2002-12-05 2006-11-14 Denso Corporation Object recognition apparatus for vehicle, inter-vehicle control apparatus, and distance measurement apparatus
US7541743B2 (en) 2002-12-13 2009-06-02 Ford Global Technologies, Llc Adaptive vehicle communication controlled lighting system
DE10346508B4 (en) 2003-10-02 2007-10-11 Daimlerchrysler Ag Device for improving the visibility in a motor vehicle
WO2005042321A1 (en) 2003-10-28 2005-05-12 Continental Teves Ag & Co.Ohg Method and system for improving the handling characteristics of a vehicle
US7526103B2 (en) 2004-04-15 2009-04-28 Donnelly Corporation Imaging system for vehicle
US7227611B2 (en) 2004-08-23 2007-06-05 The Boeing Company Adaptive and interactive scene illumination
US7881496B2 (en) 2004-09-30 2011-02-01 Donnelly Corporation Vision system for vehicle
US20060103727A1 (en) 2004-11-17 2006-05-18 Huan-Chin Tseng Vehicle back up camera
US8258932B2 (en) 2004-11-22 2012-09-04 Donnelly Corporation Occupant detection system for vehicle
US7720580B2 (en) 2004-12-23 2010-05-18 Donnelly Corporation Object detection system for vehicle
US20060164221A1 (en) 2005-01-18 2006-07-27 Jensen John M Sensor-activated controlled safety or warning light mounted on or facing toward rear of vehicle
DE502006007631D1 (en) 2005-02-22 2010-09-23 Adc Automotive Dist Control METHOD FOR DETECTING THE ACTIVATION OF BRAKE LIGHTS OF PREVAILING VEHICLES
JP4479573B2 (en) * 2005-04-13 2010-06-09 トヨタ自動車株式会社 Vehicle anti-theft device
US20060250501A1 (en) 2005-05-06 2006-11-09 Widmann Glenn R Vehicle security monitor system and method
JP2006341641A (en) 2005-06-07 2006-12-21 Nissan Motor Co Ltd Image display apparatus and image display method
JP4580288B2 (en) 2005-06-28 2010-11-10 本田技研工業株式会社 Driving assistance car
US7460951B2 (en) 2005-09-26 2008-12-02 Gm Global Technology Operations, Inc. System and method of target tracking using sensor fusion
CN101816008A (en) 2005-10-28 2010-08-25 马格纳电子系统公司 Camera module for vehicle vision system
JP2007129525A (en) 2005-11-04 2007-05-24 Konica Minolta Photo Imaging Inc Camera system and controller
DE602006019156D1 (en) 2005-12-27 2011-02-03 Honda Motor Co Ltd Vehicle and steering control device for a vehicle
GB2438009A (en) * 2006-02-24 2007-11-14 Location Company Ltd Vehicle security system
JP4462231B2 (en) 2006-05-09 2010-05-12 株式会社デンソー Auto light device for vehicle
US7724962B2 (en) 2006-07-07 2010-05-25 Siemens Corporation Context adaptive approach in vehicle detection under various visibility conditions
EP3624086A1 (en) 2007-01-25 2020-03-18 Magna Electronics Inc. Radar sensing system for vehicle
JP4497231B2 (en) 2007-10-09 2010-07-07 株式会社デンソー Vehicle speed control device
TWI372564B (en) 2007-10-30 2012-09-11 Av Tech Corp Video system, image emission apparatus, video receiver apparatus and control method
US8027029B2 (en) 2007-11-07 2011-09-27 Magna Electronics Inc. Object detection and tracking system
DE102008003194A1 (en) 2008-01-04 2009-07-09 Wabco Gmbh Driver assistance system
US8154418B2 (en) 2008-03-31 2012-04-10 Magna Mirrors Of America, Inc. Interior rearview mirror system
US20090265069A1 (en) 2008-04-17 2009-10-22 Herman Desbrunes Land vehicle braking system
US20100020170A1 (en) 2008-07-24 2010-01-28 Higgins-Luthman Michael J Vehicle Imaging System
US9126525B2 (en) 2009-02-27 2015-09-08 Magna Electronics Inc. Alert system for vehicle
US9036026B2 (en) 2009-06-12 2015-05-19 Magna Electronics Inc. Scalable integrated electronic control unit for vehicle
EP2423063B1 (en) 2010-08-23 2013-03-06 Harman Becker Automotive Systems GmbH Method of detecting the braking of a vehicle
US9194943B2 (en) 2011-04-12 2015-11-24 Magna Electronics Inc. Step filter for estimating distance in a time-of-flight ranging system
US9043048B2 (en) * 2011-10-13 2015-05-26 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America RF biometric ignition control system
DE102011118149A1 (en) 2011-11-10 2013-05-16 Gm Global Technology Operations, Llc Method for operating a safety system of a motor vehicle and safety system for a motor vehicle
DE102011118157A1 (en) 2011-11-10 2013-05-16 GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) Method for operating an information and entertainment system of a motor vehicle and information and entertainment system
JP5499011B2 (en) 2011-11-17 2014-05-21 富士重工業株式会社 Outside environment recognition device and outside environment recognition method
WO2013081985A1 (en) 2011-11-28 2013-06-06 Magna Electronics, Inc. Vision system for vehicle
WO2013090282A1 (en) * 2011-12-12 2013-06-20 Clay Skelton Systems, devices and methods for vehicles
US9681125B2 (en) * 2011-12-29 2017-06-13 Pelco, Inc. Method and system for video coding with noise filtering
US8694224B2 (en) 2012-03-01 2014-04-08 Magna Electronics Inc. Vehicle yaw rate correction
DE102013217430A1 (en) 2012-09-04 2014-03-06 Magna Electronics, Inc. Driver assistance system for a motor vehicle
US9090234B2 (en) 2012-11-19 2015-07-28 Magna Electronics Inc. Braking control system for vehicle
US9068390B2 (en) 2013-01-21 2015-06-30 Magna Electronics Inc. Vehicle hatch control system
US20140218529A1 (en) 2013-02-04 2014-08-07 Magna Electronics Inc. Vehicle data recording system
US9092986B2 (en) 2013-02-04 2015-07-28 Magna Electronics Inc. Vehicular vision system
US9260095B2 (en) 2013-06-19 2016-02-16 Magna Electronics Inc. Vehicle vision system with collision mitigation
US20140375476A1 (en) 2013-06-24 2014-12-25 Magna Electronics Inc. Vehicle alert system
US20150009010A1 (en) 2013-07-03 2015-01-08 Magna Electronics Inc. Vehicle vision system with driver detection
US9166730B2 (en) * 2013-09-26 2015-10-20 Ford Global Technologies, Llc RF jamming detection and mitigation system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040016870A1 (en) * 2002-05-03 2004-01-29 Pawlicki John A. Object detection system for vehicle
US20040212678A1 (en) * 2003-04-25 2004-10-28 Cooper Peter David Low power motion detection system
US20100198513A1 (en) * 2009-02-03 2010-08-05 Gm Global Technology Operations, Inc. Combined Vehicle-to-Vehicle Communication and Object Detection Sensing
US20100265344A1 (en) * 2009-04-15 2010-10-21 Qualcomm Incorporated Auto-triggered fast frame rate digital video recording
US20110148609A1 (en) * 2009-12-21 2011-06-23 Harsha Dabholkar Apparatus And Method For Reducing False Alarms In Stolen Vehicle Tracking
US20120105635A1 (en) * 2010-10-27 2012-05-03 Erhardt Herbert J Automotive imaging system for recording exception events
US20120268601A1 (en) * 2011-04-25 2012-10-25 Mitac International Corp. Method of recording traffic images and a drive recorder system
US20130141597A1 (en) * 2011-12-06 2013-06-06 Kyungsuk David Lee Controlling power consumption in object tracking pipeline
US20140192206A1 (en) * 2013-01-07 2014-07-10 Leap Motion, Inc. Power consumption in motion-capture systems

Cited By (223)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220032844A1 (en) * 2004-09-14 2022-02-03 Magna Electronics Inc. Vehicular trailer hitching assist system
US11813987B2 (en) * 2004-09-14 2023-11-14 Magna Electronics Inc. Vehicular trailer hitching assist system
US10703274B2 (en) * 2004-09-14 2020-07-07 Magna Electronics Inc. Vehicular multi-camera vision system including rear backup camera
US10800331B1 (en) * 2004-09-14 2020-10-13 Magna Electronics Inc. Vehicular vision system including rear backup camera
US11577646B2 (en) * 2004-09-14 2023-02-14 Magna Electronics Inc. Vehicular trailer hitching assist system
US11155210B2 (en) * 2004-09-14 2021-10-26 Magna Electronics Inc. Vehicular driving assist system including multiple cameras and radar sensor
US11577652B2 (en) 2008-10-16 2023-02-14 Magna Mirrors Of America, Inc. Vehicular video camera display system
US11021107B2 (en) 2008-10-16 2021-06-01 Magna Mirrors Of America, Inc. Vehicular interior rearview mirror system with display
US11807164B2 (en) 2008-10-16 2023-11-07 Magna Mirrors Of America, Inc. Vehicular video camera display system
US11877054B2 (en) 2011-09-21 2024-01-16 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US10640040B2 (en) 2011-11-28 2020-05-05 Magna Electronics Inc. Vision system for vehicle
US11142123B2 (en) 2011-11-28 2021-10-12 Magna Electronics Inc. Multi-camera vehicular vision system
US11634073B2 (en) 2011-11-28 2023-04-25 Magna Electronics Inc. Multi-camera vehicular vision system
US11503251B2 (en) 2012-01-20 2022-11-15 Magna Electronics Inc. Vehicular vision system with split display
US9269263B2 (en) 2012-02-24 2016-02-23 Magna Electronics Inc. Vehicle top clearance alert system
US10147323B2 (en) 2012-02-24 2018-12-04 Magna Electronics Inc. Driver assistance system with path clearance determination
US10911721B2 (en) 2012-03-23 2021-02-02 Magna Electronics Inc. Vehicle vision system with accelerated determination of an object of interest
US11627286B2 (en) 2012-03-23 2023-04-11 Magna Electronics Inc. Vehicular vision system with accelerated determination of another vehicle
US11184585B2 (en) 2012-03-23 2021-11-23 Magna Electronics Inc. Vehicular vision system with accelerated determination of an object of interest
US10609335B2 (en) 2012-03-23 2020-03-31 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
US10922563B2 (en) 2012-05-18 2021-02-16 Magna Electronics Inc. Vehicular control system
US11308718B2 (en) 2012-05-18 2022-04-19 Magna Electronics Inc. Vehicular vision system
US10515279B2 (en) 2012-05-18 2019-12-24 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US11769335B2 (en) 2012-05-18 2023-09-26 Magna Electronics Inc. Vehicular rear backup system
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US11508160B2 (en) 2012-05-18 2022-11-22 Magna Electronics Inc. Vehicular vision system
US9436996B2 (en) * 2012-07-12 2016-09-06 Noritsu Precision Co., Ltd. Recording medium storing image processing program and image processing apparatus
US20140016815A1 (en) * 2012-07-12 2014-01-16 Koji Kita Recording medium storing image processing program and image processing apparatus
US9470034B2 (en) 2013-01-21 2016-10-18 Magna Electronics Inc. Vehicle hatch control system
US10072453B2 (en) 2013-01-21 2018-09-11 Magna Electronics Inc. Vehicle door control system
US9068390B2 (en) 2013-01-21 2015-06-30 Magna Electronics Inc. Vehicle hatch control system
US9811741B2 (en) * 2013-01-28 2017-11-07 Fujitsu Ten Limited Object detector
US20140211007A1 (en) * 2013-01-28 2014-07-31 Fujitsu Ten Limited Object detector
US11012668B2 (en) 2013-02-04 2021-05-18 Magna Electronics Inc. Vehicular security system that limits vehicle access responsive to signal jamming detection
US10523904B2 (en) 2013-02-04 2019-12-31 Magna Electronics Inc. Vehicle data recording system
US9881482B2 (en) * 2013-02-13 2018-01-30 Volkswagen Ag Method and device for displaying information of a system
US20150371527A1 (en) * 2013-02-13 2015-12-24 Volkswagen Ag Method and Device for Displaying Information of a System
US10222224B2 (en) 2013-06-24 2019-03-05 Magna Electronics Inc. System for locating a parking space based on a previously parked space
US10718624B2 (en) 2013-06-24 2020-07-21 Magna Electronics Inc. Vehicular parking assist system that determines a parking space based in part on previously parked spaces
US9242603B2 (en) * 2013-09-05 2016-01-26 Hyundai Mobis Co., Ltd. Control method for around view stop mode
US20150066237A1 (en) * 2013-09-05 2015-03-05 Hyundai Mobis Co., Ltd. Control method for around view stop mode
US10235581B2 (en) 2013-10-25 2019-03-19 Magna Electronics Inc. Vehicle vision system with traffic light status determination
US9881220B2 (en) 2013-10-25 2018-01-30 Magna Electronics Inc. Vehicle vision system utilizing communication system
US10137892B2 (en) 2013-12-05 2018-11-27 Magna Electronics Inc. Vehicle monitoring system
US10870427B2 (en) 2013-12-05 2020-12-22 Magna Electronics Inc. Vehicular control system with remote processor
US11618441B2 (en) 2013-12-05 2023-04-04 Magna Electronics Inc. Vehicular control system with remote processor
US9499139B2 (en) 2013-12-05 2016-11-22 Magna Electronics Inc. Vehicle monitoring system
US10688993B2 (en) 2013-12-12 2020-06-23 Magna Electronics Inc. Vehicle control system with traffic driving control
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control
US10315573B2 (en) 2014-02-19 2019-06-11 Magna Electronics Inc. Method for displaying information to vehicle driver
US10017114B2 (en) 2014-02-19 2018-07-10 Magna Electronics Inc. Vehicle vision system with display
US9688199B2 (en) 2014-03-04 2017-06-27 Magna Electronics Inc. Vehicle alert system utilizing communication system
US10316571B2 (en) 2014-03-04 2019-06-11 Magna Electronics Inc. Vehicle alert system utilizing communication system
US10753138B2 (en) 2014-03-04 2020-08-25 Magna Electronics Inc. Vehicular collision avoidance system
US10389016B2 (en) 2014-05-12 2019-08-20 Magna Electronics Inc. Vehicle communication system with heated antenna
US10554757B2 (en) 2014-08-01 2020-02-04 Magna Electronics Inc. Smart road system for vehicles
US9729636B2 (en) 2014-08-01 2017-08-08 Magna Electronics Inc. Smart road system for vehicles
US10051061B2 (en) 2014-08-01 2018-08-14 Magna Electronics Inc. Smart road system for vehicles
US10414339B2 (en) * 2014-11-07 2019-09-17 Connaught Electronics Ltd. Method for operating a camera system, camera system and motor vehicle
US9405120B2 (en) 2014-11-19 2016-08-02 Magna Electronics Solutions Gmbh Head-up display and vehicle using the same
US10043091B2 (en) 2014-12-05 2018-08-07 Magna Electronics Inc. Vehicle vision system with retroreflector pattern recognition
US10445600B2 (en) 2015-01-14 2019-10-15 Magna Electronics Inc. Vehicular control system
US11676400B2 (en) 2015-01-14 2023-06-13 Magna Electronics Inc. Vehicular control system
US11436840B2 (en) 2015-01-14 2022-09-06 Magna Electronics Inc. Vehicular control system
US10157322B1 (en) 2015-01-14 2018-12-18 Magna Electronics Inc. Control system for vehicle
US9740945B2 (en) 2015-01-14 2017-08-22 Magna Electronics Inc. Driver assistance system for vehicle
US10049285B2 (en) 2015-01-14 2018-08-14 Magna Electronics Inc. Control system for vehicle
US10803329B2 (en) 2015-01-14 2020-10-13 Magna Electronics Inc. Vehicular control system
US10032369B2 (en) 2015-01-15 2018-07-24 Magna Electronics Inc. Vehicle vision system with traffic monitoring and alert
US10755559B2 (en) 2015-01-15 2020-08-25 Magna Electronics Inc. Vehicular vision and alert system
US10482762B2 (en) 2015-01-15 2019-11-19 Magna Electronics Inc. Vehicular vision and alert system
US10549690B1 (en) 2015-03-23 2020-02-04 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US10744938B1 (en) 2015-03-23 2020-08-18 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US9908470B1 (en) 2015-03-23 2018-03-06 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US10239450B1 (en) 2015-03-23 2019-03-26 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US11084422B1 (en) 2015-03-23 2021-08-10 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US9718405B1 (en) * 2015-03-23 2017-08-01 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US11697371B1 (en) 2015-03-23 2023-07-11 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US11034299B2 (en) 2015-05-06 2021-06-15 Magna Mirrors Of America, Inc. Vehicular vision system with episodic display of video images showing approaching other vehicle
US10819943B2 (en) 2015-05-07 2020-10-27 Magna Electronics Inc. Vehicle vision system with incident recording function
US11483514B2 (en) 2015-05-07 2022-10-25 Magna Electronics Inc. Vehicular vision system with incident recording function
US10855953B2 (en) 2015-06-25 2020-12-01 Magna Electronics Inc. Vehicular control system with forward viewing camera and beam emitting antenna array
US10419723B2 (en) 2015-06-25 2019-09-17 Magna Electronics Inc. Vehicle communication system with forward viewing camera and integrated antenna
US11533454B2 (en) 2015-06-25 2022-12-20 Magna Electronics Inc. Vehicular control system with forward viewing camera and forward sensing sensor
US11134220B2 (en) 2015-06-25 2021-09-28 Magna Electronics Inc. Vehicular control system with forward viewing camera and forward and rearward sensing sensors
US11805228B2 (en) 2015-06-25 2023-10-31 Magna Electronics Inc. Vehicular control system with forward viewing camera and forward sensing sensor
US10115314B2 (en) 2015-07-08 2018-10-30 Magna Electronics Inc. Lane change system for platoon of vehicles
US10614722B2 (en) 2015-07-08 2020-04-07 Magna Electronics Inc. Lane change system for platoon of vehicles
US11222544B2 (en) 2015-07-08 2022-01-11 Magna Electronics Inc. Lane change system for platoon of vehicles
US11673605B2 (en) 2015-08-18 2023-06-13 Magna Electronics Inc. Vehicular driving assist system
US10870449B2 (en) * 2015-08-18 2020-12-22 Magna Electronics Inc. Vehicular trailering system
US10331956B2 (en) 2015-09-23 2019-06-25 Magna Electronics Inc. Vehicle vision system with detection enhancement using light control
US10929693B2 (en) 2015-09-23 2021-02-23 Magna Electronics Inc. Vehicular vision system with auxiliary light source
US9689191B1 (en) * 2015-10-08 2017-06-27 Hyundai Motor Company Power tailgate control device and method
US10875403B2 (en) 2015-10-27 2020-12-29 Magna Electronics Inc. Vehicle vision system with enhanced night vision
US10152886B2 (en) 2015-11-02 2018-12-11 Magna Electronics Inc. Driver assistance system with traffic light alert
US9881501B2 (en) 2015-11-02 2018-01-30 Magna Electronics Inc. Driver assistance system with traffic light alert
US11488397B2 (en) 2015-11-02 2022-11-01 Magna Electronics Inc. Vehicle customization system
US11676404B2 (en) 2015-11-02 2023-06-13 Magna Electronics Inc. Vehicular driver monitoring system with customized outputs
US10324297B2 (en) 2015-11-30 2019-06-18 Magna Electronics Inc. Heads up display system for vehicle
US11027654B2 (en) 2015-12-04 2021-06-08 Magna Electronics Inc. Vehicle vision system with compressed video transfer via DSRC link
US10240384B2 (en) * 2015-12-10 2019-03-26 Hyundai Motor Company Apparatus and method of controlling tailgate using rear-view camera in vehicle
US10430674B2 (en) 2015-12-14 2019-10-01 Magna Electronics Inc. Vehicle vision system using reflective vehicle tags
US10132971B2 (en) 2016-03-04 2018-11-20 Magna Electronics Inc. Vehicle camera with multiple spectral filters
US11872884B2 (en) 2016-03-23 2024-01-16 Magna Electronics Inc. Vehicular driver monitoring system
US10703204B2 (en) 2016-03-23 2020-07-07 Magna Electronics Inc. Vehicle driver monitoring system
US10401621B2 (en) 2016-04-19 2019-09-03 Magna Electronics Inc. Display unit for vehicle head-up display system
US10650304B2 (en) 2016-05-11 2020-05-12 Magna Electronics Inc. Vehicle driving assist system with enhanced data processing
US11288569B2 (en) 2016-05-11 2022-03-29 Magna Electronics Inc. Vehicle driving assist system with enhanced data processing
US10574305B2 (en) 2016-05-11 2020-02-25 Magna Electronics Inc. Vehicle secured communication system
US11043990B2 (en) 2016-05-11 2021-06-22 Magna Electronics Inc. Vehicular secured communication system
US10190560B2 (en) 2016-06-03 2019-01-29 Magna Electronics Inc. Camera based vehicle start-stop feature
US11125198B2 (en) 2016-06-03 2021-09-21 Magna Electronics Inc. Camera based vehicle start-stop feature
US10731618B2 (en) 2016-06-03 2020-08-04 Magna Electronics Inc. Camera based vehicle start-stop feature
US10147246B2 (en) 2016-06-09 2018-12-04 Magna Electronics Inc. Wheel bolt torque monitoring system for vehicle
US11275175B2 (en) 2016-06-14 2022-03-15 Magna Electronics Inc. Method for detecting objects via a vehicular sensing system
US10768298B2 (en) * 2016-06-14 2020-09-08 Magna Electronics Inc. Vehicle sensing system with 360 degree near range sensing
US11454719B2 (en) 2016-07-08 2022-09-27 Magna Electronics Inc. 2D MIMO radar system for vehicle
US10708227B2 (en) 2016-07-19 2020-07-07 Magna Electronics Inc. Scalable secure gateway for vehicle
US11463408B2 (en) 2016-07-19 2022-10-04 Magna Electronics Inc. Vehicular secure gateway system
US11613308B2 (en) 2016-08-01 2023-03-28 Magna Electronics Inc. Vehicular lighting system using light projections
US10486742B2 (en) 2016-08-01 2019-11-26 Magna Electronics Inc. Parking assist system using light projections
US11719808B2 (en) 2016-08-24 2023-08-08 Magna Electronics Inc. Vehicle sensor with integrated radar and image sensors
US10852418B2 (en) 2016-08-24 2020-12-01 Magna Electronics Inc. Vehicle sensor with integrated radar and image sensors
US10677894B2 (en) 2016-09-06 2020-06-09 Magna Electronics Inc. Vehicle sensing system for classification of vehicle model
US11604253B2 (en) 2016-09-06 2023-03-14 Magna Electronics Inc. Vehicular sensing system for classification of detected objects
US10866310B2 (en) 2016-09-06 2020-12-15 Magna Electronics Inc. Vehicle sensing system for classification of vehicle model
US20180072270A1 (en) * 2016-09-09 2018-03-15 Magna Electronics Inc. Vehicle surround security system
US11068918B2 (en) 2016-09-22 2021-07-20 Magna Electronics Inc. Vehicle communication system
US10496090B2 (en) 2016-09-29 2019-12-03 Magna Electronics Inc. Handover procedure for driver of autonomous vehicle
US11927954B2 (en) 2016-09-29 2024-03-12 Magna Electronics Inc. Vehicular control system with handover procedure for driver of controlled vehicle
US11550319B2 (en) 2016-09-29 2023-01-10 Magna Electronics Inc. Vehicular control system with handover procedure for driver of controlled vehicle
US11137760B2 (en) 2016-09-29 2021-10-05 Magna Electronics Inc. Handover procedure for driver of controlled vehicle
US10967971B2 (en) 2016-11-18 2021-04-06 Magna Mirrors Of America, Inc. Vehicle vision system using aerial camera
US10562624B2 (en) 2016-11-18 2020-02-18 Magna Mirrors Of America, Inc. Vehicle vision system using aerial camera
US11845546B2 (en) 2016-11-18 2023-12-19 Magna Mirrors Of America, Inc. Vehicle vision system using aerial camera
US11727807B2 (en) 2016-12-07 2023-08-15 Magna Electronics Inc. Vehicular control system with collision avoidance
US11138883B2 (en) 2016-12-07 2021-10-05 Magna Electronics Inc. Vehicular communication system with collision alert
US10347129B2 (en) 2016-12-07 2019-07-09 Magna Electronics Inc. Vehicle system with truck turn alert
US10462354B2 (en) 2016-12-09 2019-10-29 Magna Electronics Inc. Vehicle control system utilizing multi-camera module
JP2018101402A (en) * 2016-12-21 2018-06-28 トヨタ自動車株式会社 Vehicle data recording device
WO2018130596A1 (en) * 2017-01-12 2018-07-19 Connaught Electronics Ltd. Method for operating a camera in dependency on a current state of an environmental region of the camera, camera, and motor vehicle
US20210241008A1 (en) * 2017-01-23 2021-08-05 Magna Electronics Inc. Vehicle vision system with object detection failsafe
US11657620B2 (en) * 2017-01-23 2023-05-23 Magna Electronics Inc. Vehicle vision system with object detection failsafe
US10936884B2 (en) * 2017-01-23 2021-03-02 Magna Electronics Inc. Vehicle vision system with object detection failsafe
US11244564B2 (en) 2017-01-26 2022-02-08 Magna Electronics Inc. Vehicle acoustic-based emergency vehicle detection
US10703341B2 (en) 2017-02-03 2020-07-07 Magna Electronics Inc. Vehicle sensor housing with theft protection
US10782388B2 (en) 2017-02-16 2020-09-22 Magna Electronics Inc. Vehicle radar system with copper PCB
US11422228B2 (en) 2017-02-16 2022-08-23 Magna Electronics Inc. Method for constructing vehicular radar sensor with copper PCB
US11536829B2 (en) 2017-02-16 2022-12-27 Magna Electronics Inc. Vehicle radar system with radar embedded into radome
US11124113B2 (en) 2017-04-18 2021-09-21 Magna Electronics Inc. Vehicle hatch clearance determining system
US10250737B2 (en) * 2017-04-18 2019-04-02 Beijing Mobike Technology Co., Ltd. Terminal function setting method and device for vehicle unlocking, and mobile terminal
US11603041B2 (en) 2017-04-18 2023-03-14 Magna Electronics Inc. Vehicle hatch clearance determining system
US10019857B1 (en) * 2017-05-18 2018-07-10 Ford Global Technologies, Llc Hit-and-run detection
CN108966145A (en) * 2017-05-25 2018-12-07 福特全球技术公司 Hit-and-run criminal is tracked using V2X communication
US10089869B1 (en) * 2017-05-25 2018-10-02 Ford Global Technologies, Llc Tracking hit and run perpetrators using V2X communication
US11713038B2 (en) 2017-06-22 2023-08-01 Magna Electronics Inc. Vehicular control system with rear collision mitigation
US10870426B2 (en) 2017-06-22 2020-12-22 Magna Electronics Inc. Driving assistance system with rear collision mitigation
US11472403B2 (en) 2017-06-22 2022-10-18 Magna Electronics Inc. Vehicular control system with rear collision mitigation
US11653090B1 (en) * 2017-07-04 2023-05-16 Ramin Farjadrad Intelligent distributed systems and methods for live traffic monitoring and optimization
JP2019022063A (en) * 2017-07-14 2019-02-07 株式会社デンソーテン Image recording apparatus and image recording method
US10883846B2 (en) 2017-08-23 2021-01-05 Magna Electronics Inc. Vehicle communication system with area classifier
US10358115B2 (en) * 2017-10-23 2019-07-23 Hyundai Motor Company Vehicle, vehicle security system and vehicle security method
US11702076B2 (en) 2017-10-23 2023-07-18 Uatc, Llc Cargo trailer sensor assembly
US20190118814A1 (en) * 2017-10-23 2019-04-25 Uber Technologies, Inc. Cargo trailer sensor assembly
US11052913B2 (en) * 2017-10-23 2021-07-06 Uatc, Llc Cargo trailer sensor assembly
US20230150447A1 (en) * 2017-11-06 2023-05-18 Magna Electronics Inc. Vehicular vision system with underbody camera
US11794680B2 (en) * 2017-11-06 2023-10-24 Magna Electronics Inc. Vehicular vision system with underbody camera
US11486968B2 (en) 2017-11-15 2022-11-01 Magna Electronics Inc. Vehicle Lidar sensing system with sensor module
US11417107B2 (en) 2018-02-19 2022-08-16 Magna Electronics Inc. Stationary vision system at vehicle roadway
US11047977B2 (en) 2018-02-20 2021-06-29 Magna Electronics Inc. Vehicle radar system with solution for ADC saturation
US11733376B2 (en) 2018-02-20 2023-08-22 Magna Electronics Inc. Vehicle radar system with solution for ADC saturation
US10979632B2 (en) * 2018-05-31 2021-04-13 Canon Kabushiki Kaisha Imaging apparatus, method for controlling same, and storage medium
CN110557560A (en) * 2018-05-31 2019-12-10 Canon Kabushiki Kaisha Image pickup apparatus, control method thereof, and storage medium
WO2020035491A1 (en) * 2018-08-16 2020-02-20 Veoneer Sweden Ab A vision system for a motor vehicle and a method of training
US20210163031A1 (en) * 2018-08-16 2021-06-03 Veoneer Sweden Ab A vision system for a motor vehicle and a method of training
EP3611595A1 (en) * 2018-08-16 2020-02-19 Veoneer Sweden AB A vision system for a motor vehicle and a method of training
EP4163762A1 (en) * 2018-08-16 2023-04-12 Arriver Software AB A vision system for a motor vehicle and a method of training
US10875498B2 (en) 2018-09-26 2020-12-29 Magna Electronics Inc. Vehicular alert system for door lock function
US11808876B2 (en) 2018-10-25 2023-11-07 Magna Electronics Inc. Vehicular radar system with vehicle to infrastructure communication
US11341671B2 (en) 2018-11-01 2022-05-24 Magna Electronics Inc. Vehicular driver monitoring system
US11146759B1 (en) * 2018-11-13 2021-10-12 JMJ Designs, LLC Vehicle camera system
US11454720B2 (en) 2018-11-28 2022-09-27 Magna Electronics Inc. Vehicle radar system with enhanced wave guide antenna system
US11852720B2 (en) 2018-11-28 2023-12-26 Magna Electronics Inc. Vehicle radar system with enhanced wave guide antenna system
US11299113B2 (en) * 2018-12-07 2022-04-12 Hyundai Motor Company Device for assisting safe exit from vehicle, system having the same, and method thereof
US11265508B2 (en) * 2018-12-19 2022-03-01 Jvckenwood Corporation Recording control device, recording control system, recording control method, and recording control program
US11854276B2 (en) 2018-12-19 2023-12-26 Magna Electronics Inc. Vehicle driver monitoring system for determining driver workload
CN111656411A (en) * 2018-12-19 2020-09-11 Jvc建伍株式会社 Recording control device, recording control system, recording control method, and recording control program
US11488399B2 (en) 2018-12-19 2022-11-01 Magna Electronics Inc. Vehicle driver monitoring system for determining driver workload
US11332124B2 (en) 2019-01-10 2022-05-17 Magna Electronics Inc. Vehicular control system
US11753002B2 (en) 2019-01-10 2023-09-12 Magna Electronics Inc. Vehicular control system
US20200243113A1 (en) * 2019-01-30 2020-07-30 Alice Wilson Roberson Vehicle Camera and Record System
US11064165B2 (en) 2019-02-05 2021-07-13 Magna Electronics Inc. Wireless trailer camera system with tracking feature
US11333739B2 (en) 2019-02-26 2022-05-17 Magna Electronics Inc. Vehicular radar system with automatic sensor alignment
US11267393B2 (en) 2019-05-16 2022-03-08 Magna Electronics Inc. Vehicular alert system for alerting drivers of other vehicles responsive to a change in driving conditions
US11894704B2 (en) 2019-07-23 2024-02-06 BlueOwl, LLC Environment-integrated smart ring charger
US11462107B1 (en) 2019-07-23 2022-10-04 BlueOwl, LLC Light emitting diodes and diode arrays for smart ring visual output
US11958488B2 (en) 2019-07-23 2024-04-16 BlueOwl, LLC Smart ring system for monitoring UVB exposure levels and using machine learning technique to predict high risk driving behavior
US11949673B1 (en) 2019-07-23 2024-04-02 BlueOwl, LLC Gesture authentication using a smart ring
US11594128B2 (en) 2019-07-23 2023-02-28 BlueOwl, LLC Non-visual outputs for a smart ring
US11775065B2 (en) 2019-07-23 2023-10-03 BlueOwl, LLC Projection system for smart ring visual output
US11537203B2 (en) 2019-07-23 2022-12-27 BlueOwl, LLC Projection system for smart ring visual output
US11551644B1 (en) * 2019-07-23 2023-01-10 BlueOwl, LLC Electronic ink display for smart ring
US11479258B1 (en) 2019-07-23 2022-10-25 BlueOwl, LLC Smart ring system for monitoring UVB exposure levels and using machine learning technique to predict high risk driving behavior
US11637511B2 (en) 2019-07-23 2023-04-25 BlueOwl, LLC Harvesting energy for a smart ring via piezoelectric charging
US11922809B2 (en) 2019-07-23 2024-03-05 BlueOwl, LLC Non-visual outputs for a smart ring
US11923791B2 (en) 2019-07-23 2024-03-05 BlueOwl, LLC Harvesting energy for a smart ring via piezoelectric charging
US11909238B1 (en) 2019-07-23 2024-02-20 BlueOwl, LLC Environment-integrated smart ring charger
US11537917B1 (en) 2019-07-23 2022-12-27 BlueOwl, LLC Smart ring system for measuring driver impairment levels and using machine learning techniques to predict high risk driving behavior
US11853030B2 (en) 2019-07-23 2023-12-26 BlueOwl, LLC Soft smart ring and method of manufacture
US11006068B1 (en) * 2019-11-11 2021-05-11 Bendix Commercial Vehicle Systems Llc Video recording based on image variance
US11866063B2 (en) 2020-01-10 2024-01-09 Magna Electronics Inc. Communication system and method
US11433855B2 (en) * 2020-04-03 2022-09-06 Micron Technology, Inc. Intelligent detection and alerting of potential intruders
US20210309183A1 (en) * 2020-04-03 2021-10-07 Micron Technology, Inc. Intelligent Detection and Alerting of Potential Intruders
CN113496204A (en) * 2020-04-03 2021-10-12 美光科技公司 Intelligent detection and warning of potential intruders
US11093766B1 (en) 2020-04-03 2021-08-17 Micron Technology, Inc. Detect and alert of forgotten items left in a vehicle
US11702001B2 (en) 2020-04-03 2023-07-18 Micron Technology, Inc. Detect and alert of forgotten items left in a vehicle
US11560143B2 (en) 2020-05-26 2023-01-24 Magna Electronics Inc. Vehicular autonomous parking system using short range communication protocols
US11823395B2 (en) 2020-07-02 2023-11-21 Magna Electronics Inc. Vehicular vision system with road contour detection feature
CN112071083A (en) * 2020-09-15 2020-12-11 台州市远行客网络技术有限公司 Motor vehicle license plate relay identification system and license plate relay identification method
US11749105B2 (en) 2020-10-01 2023-09-05 Magna Electronics Inc. Vehicular communication system with turn signal identification
US11971475B2 (en) 2022-09-26 2024-04-30 Magna Electronics Inc. 2D MIMO vehicular radar sensing system
US11972615B2 (en) 2023-06-12 2024-04-30 Magna Electronics Inc. Vehicular control system

Also Published As

Publication number Publication date
US20190238799A1 (en) 2019-08-01
US20200137355A1 (en) 2020-04-30
US11012668B2 (en) 2021-05-18
US10523904B2 (en) 2019-12-31

Similar Documents

Publication Publication Date Title
US11012668B2 (en) Vehicular security system that limits vehicle access responsive to signal jamming detection
US10718624B2 (en) Vehicular parking assist system that determines a parking space based in part on previously parked spaces
US11184585B2 (en) Vehicular vision system with accelerated determination of an object of interest
US11393217B2 (en) Vehicular vision system with detection and tracking of objects at the side of a vehicle
US10692380B2 (en) Vehicle vision system with collision mitigation
US10072453B2 (en) Vehicle door control system
US20170313247A1 (en) Vehicle safety system
US20180072270A1 (en) Vehicle surround security system
US20150294169A1 (en) Vehicle vision system with driver monitoring
EP3754618B1 (en) Recording control device, recording control system, recording control method, and recording control program
US10671868B2 (en) Vehicular vision system using smart eye glasses
JP2007168570A (en) Controller of vehicle-mounted camera
US20210065554A1 (en) Method for monitoring an environment of a parked motor vehicle comprising an asynchronous camera
US20220348164A1 (en) Automatic lift gate opener using vehicular rear camera
JP2013041488A (en) On-vehicle camera control unit, on-vehicle camera control system and on-vehicle camera system
KR20220126963A (en) Periphery monitoring system using event based sensor of vehicle

Legal Events

Date Code Title Description
STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION