WO2017180394A1 - Method and system for online performance monitoring of the perception system of road vehicles - Google Patents

Method and system for online performance monitoring of the perception system of road vehicles

Info

Publication number
WO2017180394A1
WO2017180394A1 (PCT Application No. PCT/US2017/026183)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
vehicles
perception
nearby
observed
Application number
PCT/US2017/026183
Other languages
French (fr)
Inventor
Pertti PEUSSA
Matti Kutila
Mikko Tarkiainen
Ari Virtanen
Original Assignee
Pcms Holdings, Inc.
Application filed by Pcms Holdings, Inc. filed Critical Pcms Holdings, Inc.
Publication of WO2017180394A1 publication Critical patent/WO2017180394A1/en


Classifications

    • G01S 7/497: Means for monitoring or calibrating (details of lidar systems)
    • G01S 13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S 15/931: Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 7/4039: Means for monitoring or calibrating of parts of a radar system, e.g. of sensor or antenna obstruction such as dirt- or ice-coating
    • G01S 7/52004: Means for monitoring or calibrating (details of sonar systems)
    • G08G 1/015: Detecting movement of traffic to be counted or controlled, with provision for distinguishing between two or more types of vehicles, e.g. between motor-cars and cycles
    • G08G 1/163: Anti-collision systems; decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • G08G 1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G01S 2007/4975: Means for monitoring or calibrating of sensor obstruction by, e.g., dirt- or ice-coating, e.g. by reflection measurement on front-screen
    • G01S 2007/52009: Means for monitoring or calibrating of sensor obstruction, e.g. dirt- or ice-coating
    • G01S 2013/9316: Anti-collision radar for land vehicles combined with communication equipment with other vehicles or with base stations
    • G01S 2013/9322: Anti-collision radar for land vehicles using additional data, e.g. driver condition, road state or weather data
    • G01S 2013/9323: Anti-collision radar for land vehicles, alternative operation using light waves
    • G01S 2013/9324: Anti-collision radar for land vehicles, alternative operation using ultrasonic waves
    • G01S 2013/93271: Sensor installation details in the front of the vehicles
    • G07C 5/0808: Registering or indicating the working of vehicles; diagnosing performance data

Definitions

  • the present disclosure generally relates to automotive perception sensors.
  • a radar can tell whether its internal power-on checks were successful, or a laser scanner can detect whether its front cover is totally blocked.
  • a method comprising: identifying a nearby set of vehicles to a first vehicle based on received wireless messages from a plurality of vehicles; identifying an observed set of vehicles based on sensors of the first vehicle; and responsive to a determination that at least one vehicle in the nearby set is not in the observed set, performing at least one perception-deficit-response action.
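  • As a minimal sketch of this comparison (not the patent's implementation), the following Python snippet checks whether any vehicle announced over V2V within the assumed perception range is missing from the sensor-observed set; the VehicleReport type and find_perception_deficits helper are hypothetical names introduced only for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VehicleReport:
    vehicle_id: str     # identifier taken from the V2V message (hypothetical field)
    distance_m: float   # range from the egovehicle
    angle_deg: float    # bearing in the egovehicle coordinate system

def find_perception_deficits(nearby: list[VehicleReport],
                             observed_ids: set[str],
                             assumed_range_m: float) -> list[VehicleReport]:
    """Return V2V-announced vehicles that should be visible but were not observed."""
    return [v for v in nearby
            if v.distance_m <= assumed_range_m and v.vehicle_id not in observed_ids]

# Example: vehicle "B" announced itself at 120 m but the sensors only see "A".
nearby = [VehicleReport("A", 80.0, 10.0), VehicleReport("B", 120.0, -5.0)]
missing = find_perception_deficits(nearby, observed_ids={"A"}, assumed_range_m=200.0)
if missing:
    print("perception deficit:", [v.vehicle_id for v in missing])  # trigger a response action
```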
  • a system comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions such as, but not limited to, those set forth above.
  • FIG. 1 depicts an exemplary embodiment of the detection range of a vehicle having 360 degree perception capability.
  • FIG. 2 depicts an exemplary embodiment of the sensor detection range of a vehicle in ideal conditions and in adverse conditions.
  • FIG. 3 depicts an exemplary embodiment of the sensor detection range of a vehicle with a damaged right front sensor.
  • FIG. 4A depicts an exemplary embodiment of a first vehicle detecting a second vehicle, and the associated perceived information.
  • FIG. 4B depicts an exemplary embodiment of the second vehicle detecting the first vehicle, and the associated perceived information.
  • FIG. 5 depicts an exemplary embodiment of a typical traffic situation on a freeway, from the perspective of an egovehicle at a given moment in time.
  • FIG. 6 is a flow diagram of an exemplary embodiment of a method of sensor performance monitoring operation.
  • FIG. 7 depicts an exemplary embodiment of an architecture of data structures and information as used in the present disclosure.
  • FIG. 8 illustrates an embodiment of the communication process of sharing detections between vehicles.
  • FIG. 9 illustrates an exemplary embodiment of a perception capabilities database (forward looking perception system).
  • FIG. 10 illustrates an exemplary wireless transmit/receive unit (WTRU) that may be employed in some embodiments.
  • FIG. 11 illustrates an exemplary network entity that may be employed in some embodiments.
  • Various elements described herein may be implemented as modules that carry out (i.e., perform, execute, and the like) the various functions that are described herein in connection with the respective modules.
  • a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), and the like).
  • Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as those commonly referred to as RAM, ROM, etc.
  • FIG. 1 illustrates a simplified depiction of the detection (or perception) range 1007 of a vehicle 1005, which has 360° perception capability.
  • the vehicle 1005 is heading to the right.
  • the forward looking distance may be about 200 meters.
  • Confirmed objects 1010 may each have a distance 1015 and angle 1020 from the vehicle 1005.
  • FIG. 2 illustrates a simplified depiction of the sensor detection range of a vehicle 205 in ideal conditions (range 210), and in adverse conditions (range 215), such as dirt on the sensors. Since adverse weather conditions (e.g., heavy snow) or driving situations (e.g., oncoming car is raising dust in the air) can change rapidly, it is important to frequently monitor the performance of the perception system during driving.
  • FIG. 3 illustrates a simplified depiction of the sensor detection range 310 of a vehicle 305 with a damaged right front sensor (or otherwise impaired sensor). While some objects 315 are detected and have their distance and angle relative to vehicle 305 confirmed, along a portion 320 the vehicle 305 is unable to detect anything. In an alternative example, the sensor view could be blocked, such as by a retrofitted fog light, or the like.
  • the sensor may be so dirty that it is practically non-functioning.
  • the vehicle 305 is partly blind (in non-detection area 320) and it should be inspected and repaired soon, because its functions, potentially including self-driving, are severely compromised.
  • V2V communication equipped vehicles may be repeatedly broadcasting information (such as their location, heading, velocity, etc.) to other vehicles in the area. In typical conditions, this message may carry 300 meters or more. This means that a particular vehicle can often be aware of other vehicles beyond its sensor detection range.
  • Other vehicles make good test targets for the perception system: test targets can be available even in very feature-poor environments; the vehicle perception system is tuned to find and track these targets; they are well defined and detectable despite adverse conditions, such as heavy snowfall; many of them communicate, providing an additional channel to confirm detection; and they can appear 360° around the vehicle, throughout the detection range and at varying velocities, meaning that they can be used to test all the perception sensors of the vehicle. Additionally, velocity is useful in sensor performance assessment, especially for radars. Moving vehicles may often have lights, which provides further information for sensor performance analysis, such as assessing the performance of a camera system.
  • two oncoming vehicles - both having perception capability - measure how far away and at what angle the other vehicle is, and share the information. This allows both vehicles to obtain an independent estimate of their respective perception ranges and angular correctness.
  • FIGS. 4A-4B illustrate simplified depictions of the interaction of two oncoming vehicles.
  • a first vehicle 405 (referred to hereafter as the "egovehicle") detects a second vehicle 410 at a distance 422a and angle 424a (in the egovehicle's coordinate system, relative to direction of travel 420a) at a certain time.
  • When the egovehicle 405 receives the second vehicle's detections (e.g., distance 422b and angle 424b in vehicle 410's coordinate system, relative to its direction of travel 420b), the egovehicle 405 can deduce how far its sensors were able to 'see' the second vehicle 410 at this point. Since the detections will likely take place at different times, the information exchange may also include a measurement time stamp, vehicle speed and heading (e.g., 407 and 412), and/or the like, to allow the calculations to match the times of the measurements.
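  • As an illustration of why the time stamps, speeds, and headings are exchanged, the sketch below checks whether two reciprocal range measurements agree once the time offset between them is compensated with the closing speed; the function and its tolerance value are illustrative assumptions, not taken from the patent.

```python
def reciprocal_range_consistency(d_ego_m, t_ego_s, d_other_m, t_other_s,
                                 closing_speed_mps, tolerance_m=5.0):
    """Check whether two reciprocal range measurements agree once the time
    difference between them is compensated with the closing speed.

    For oncoming vehicles the separation shrinks by roughly
    closing_speed * dt between the two measurement instants."""
    dt = t_other_s - t_ego_s
    expected_other = d_ego_m - closing_speed_mps * dt
    return abs(expected_other - d_other_m) <= tolerance_m

# Ego saw the other car at 180 m; 0.2 s later the other car reported seeing ego at 172 m.
# With a 40 m/s closing speed the readings are consistent within 5 m.
print(reciprocal_range_consistency(180.0, 0.0, 172.0, 0.2, closing_speed_mps=40.0))
```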
  • the egovehicle broadcasts the request to share detected objects with all vehicles in the communication range. When this is done frequently enough - initiated by the egovehicle or some other vehicle in the area - the vehicles can maintain an up-to-date measure of what their actual detection range is.
  • the disclosed approach may be applied only on freeways or inter-urban roads, as in dense cities the vehicles may often be so close to each other that it is not always clear which of the detected objects is which vehicle, especially in traffic jams. Further, in many instances it is more important to monitor the true range of perception in situations involving highways, freeways, and inter-urban roads, for example due to the higher speeds used.
  • An exemplary embodiment of the present disclosure is set forth with two vehicles exchanging their perceived information, as in FIGS. 4A-4B and 5.
  • Such an embodiment may involve the egovehicle, one or more other V2V equipped vehicles (including both self-driven and manually driven vehicles), and in some instances one or more non-V2V equipped vehicles.
  • the egovehicle may be designated Vehicle 0, which may be self-driven or manually driven.
  • the egovehicle may further comprise: a time management module satisfying the requirements given in DSRC or C-ITS specifications; one or more environment perception sensors; a perception system, which monitors the perception sensors and keeps track of the detected objects; a V2V communication system including BSM (DSRC) or CAM (C-ITS) message reception; a module for location tracking of other vehicles based on received BSM/CAM messages; processing capability to match and quantify received perceptions quickly; and an updatable database for storing information about sensor perception capability of the egovehicle. Additional features may be included, or the recited features may be modified as known to one of ordinary skill in the art.
  • V2V equipped self-driven or manually driven vehicles may have similar functionalities to the egovehicle as described above.
  • those other vehicles may be provided with only: a time management module satisfying the requirements given in DSRC or C-ITS specifications; one or more environment perception sensors; a perception system, which monitors the perception sensors and keeps track of the detected objects; and a V2V communication system including BSM (DSRC) or CAM (C-ITS) message reception.
  • Non-V2V equipped vehicles need no specific features, as they are only used as potential common landmarks possibly tracked by the egovehicle and/or other V2V vehicles.
  • the detection range of the egovehicle is tested when other vehicles are about to enter or leave its perception range. This enables sensor performance testing near its limits. Since 250 meters is the typical upper limit of most perception sensors, an interurban road or highway with a 250-meter line-of-sight is ideal for this. However, the present disclosure is applicable with shorter line-of-sight as well, but may not test the upper limit of the sensor system.
  • FIG. 5 illustrates a simplified depiction of a typical traffic situation on a freeway (as illustrated in FIG. 5, there is a four-lane freeway scenario; however, alternative scenarios such as urban two-lane roads are also envisioned herein).
  • An egovehicle 515 knows or receives the locations, heading, speed, and vehicle type of other V2V equipped vehicles within its communication range, and tracks some vehicles within its perception range. In some instances, there may also be vehicles capable of Visible Light Communication. Such communication is included among the forms of V2V communication in this disclosure.
  • egovehicle 515 has a currently assumed perception range 505, and a current awareness range via V2V communications 510.
  • the egovehicle 515 may identify a nearby set of vehicles based on received wireless messages from a plurality of vehicles (e.g., vehicles within the awareness range via V2V 510). The egovehicle 515 may then identify an observed set of vehicles based on its sensors (such as radar, LiDAR, optical sensors, camera, etc.) within a currently assumed perception range 505. In the embodiment of FIG. 5, egovehicle 515 may observe a set of vehicles including vehicles 536 and 542. Among the nearby set of vehicles is also vehicle 532, which has sent a wireless message including its location to egovehicle 515.
  • egovehicle 515 only evaluates vehicles in the nearby set which are within a previously calculated perception range of the egovehicle 505.
  • egovehicle 515 may determine that it has a perception deficit. Given this perception deficit of a vehicle in the nearby set not being observed by egovehicle 515, the egovehicle may perform at least one perception-deficit response action.
  • the perception-deficit response action may comprise one or more of (but is not limited to): presenting an alert to an occupant of the vehicle; disengaging an autonomous functionality of the vehicle; adjusting a driving parameter of the vehicle, such as due to a reduced perception capability; alerting a human occupant to take over control of the vehicle; generating an alert that a sensor of the vehicle may be malfunctioning; and/or the like.
  • Types of driving parameters may include speed of the vehicle, distance from adjacent vehicles, driving route of the vehicle, whether an autonomous or semi-autonomous vehicle may operate in an autonomous mode, and/or the like.
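  • One possible (hypothetical) way to organize such perception-deficit response actions is a simple escalation policy, sketched below; the thresholds and action strings are assumptions for illustration only and are not specified by the patent.

```python
def perception_deficit_response(deficit_count: int, autonomous_mode: bool) -> list[str]:
    """Hypothetical policy: escalate the response with the number of
    V2V-announced vehicles that the sensors failed to observe."""
    actions = ["present alert to occupant"]
    if deficit_count >= 1:
        actions.append("reduce speed and increase following distance")  # adjust driving parameters
    if deficit_count >= 3 and autonomous_mode:
        actions.append("request human takeover and disengage autonomous functionality")
        actions.append("flag possible sensor malfunction for diagnostics")
    return actions

print(perception_deficit_response(deficit_count=3, autonomous_mode=True))
```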
  • the disclosed process may be as follows, and is also depicted as a general flow diagram in FIG. 6.
  • the egovehicle 515 may broadcast a Detection Data Request via its V2V communication device (615), and then wait some period of time for responses (620).
  • the egovehicle 515 may broadcast the request to vehicles 522, 524, 528, 530, 532, 538, 542, 544, 546, 548, and 550, which are each V2V equipped vehicles within the V2V communications range 510 of the egovehicle 515.
  • Each of these other vehicles may have similar functionality to the egovehicle.
  • the message structure of a Detection Data Request may comprise, but is not limited to: a Command, such as "Return your detection and egomotion data"; (Angle, Distance, RelativeVelocity) triples of all objects detected and tracked by the egovehicle; a Last DetectionTime0, and the corresponding Velocity0, Heading0, Coordinates0 of the egovehicle; and a VechCategory0 (e.g., "moped", "motorcycle", "passengerCar", "bus", "lightTruck", "heavyTruck", "trailer", "specialVehicle", or "tram") of the egovehicle.
  • Each of those V2V equipped vehicles who can receive the broadcast may respond with a Detection Data Response broadcast.
  • the egovehicle may broadcast the request to vehicles 522, 524, 528, 530, 532, 538, 542, 544, 546, 548, and 550.
  • the number of responding broadcasts may depend on the number of V2V vehicles in the vicinity of the egovehicle (e.g., V2V communications range 510 of egovehicle 515).
  • the egovehicle would optimally receive a response from each of vehicles 522, 524, 528, 530, 532, 538, 542, 544, 546, 548, and 550.
  • the message structure of a Detection Data Response may comprise, but is not limited to: a Command, such as "Returning my detection and egomotion data"; (Angle, Distance, RelativeVelocity) triples of all objects detected and tracked by the responding vehicle; a Last DetectionTimeX, and the corresponding VelocityX, HeadingX, CoordinatesX of the responding vehicle; and a VechCategoryX (e.g., "moped," "motorcycle," "passengerCar," "bus," "lightTruck," "heavyTruck," "trailer," "specialVehicle," or "tram") of the responding vehicle.
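  • A rough sketch of how the Detection Data Request and Detection Data Response described above could be represented in code is shown below; the field names (last_detection_time_s, vehicle_category, etc.) are illustrative stand-ins for the DetectionTime0/X, VechCategory0/X, and related elements, not a normative message format.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    angle_deg: float          # bearing in the sender's coordinate system
    distance_m: float
    relative_velocity_mps: float

@dataclass
class DetectionDataMessage:
    """Shared layout sketched for both the Detection Data Request and the
    Detection Data Response; only the command string differs."""
    command: str                        # "Return your detection and egomotion data" / "Returning my ..."
    detections: list[Detection] = field(default_factory=list)
    last_detection_time_s: float = 0.0  # LastDetectionTime0 / LastDetectionTimeX
    velocity_mps: float = 0.0
    heading_deg: float = 0.0
    coordinates: tuple[float, float] = (0.0, 0.0)  # e.g. WGS84 latitude, longitude
    vehicle_category: str = "passengerCar"         # "moped", "motorcycle", "bus", ...

request = DetectionDataMessage(
    command="Return your detection and egomotion data",
    detections=[Detection(12.0, 85.0, -3.5)],
    last_detection_time_s=1712.40, velocity_mps=27.0, heading_deg=92.0,
    coordinates=(60.1699, 24.9384), vehicle_category="passengerCar")
```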
  • the egovehicle 515 may then calculate the locations of all the received objects at the time of its last detection (625). For this, it uses the received detections together with the coordinates, heading, speed, and detection time found in each broadcast message.
  • the egovehicle 515 may match the objects detected by the egovehicle's sensors with the refined object locations calculated above (630). Based on a mathematical analysis, each egovehicle-detected object may be assigned a detection certainty value (635), which may be stored in a perception capability database (640) together with Angle, Distance, and relative velocity. For example, the certainty value may be at a maximum value if two or more other vehicles have detected the object (within an error tolerance) at the same location as the egovehicle. If only one other vehicle has detected the object at the same location as the egovehicle, the certainty may be halfway between the maximum value and a minimum value. If no other vehicle has responded indicating detection of the object, the certainty may be the minimum value.
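  • The time alignment in step 625 can be illustrated with a simple dead-reckoning helper that shifts a responder's reported position to the egovehicle's last detection time using the reported heading and speed; the flat-earth approximation below is an assumption adequate only for the sub-second offsets involved and is not the patent's prescribed method.

```python
import math

def project_position(lat_deg, lon_deg, heading_deg, speed_mps, dt_s):
    """Dead-reckon a reported position forward (or backward, dt_s < 0) by dt_s
    seconds along the reported heading, using a flat-earth approximation."""
    d = speed_mps * dt_s                            # metres travelled during dt_s
    dn = d * math.cos(math.radians(heading_deg))    # north component
    de = d * math.sin(math.radians(heading_deg))    # east component
    lat = lat_deg + dn / 111_320.0                  # metres per degree of latitude
    lon = lon_deg + de / (111_320.0 * math.cos(math.radians(lat_deg)))
    return lat, lon

# Shift a responder's reported fix to the egovehicle's last detection time,
# 0.35 s earlier than the responder's own time stamp.
print(project_position(60.1700, 24.9400, heading_deg=90.0, speed_mps=25.0, dt_s=-0.35))
```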
  • the egovehicle may generate a best estimate of the current perception range (645).
  • the estimate of the perception range may include utilization of several detection certainties generated over a period of time.
  • the estimated perception range may modify the manufacturer's given sensor coverage area so that the range is shape-scaled such that the farthest object detected by the egovehicle and also detected by two other vehicles (e.g., at maximum confidence) just fits within the shape-scaled perception range.
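  • One way (an assumption, not the patent's specified algorithm) to realize this shape scaling is to represent the manufacturer's coverage as (angle, range) samples and scale them so the farthest maximum-confidence detection just fits, as sketched below.

```python
def scaled_perception_range(nominal_boundary, farthest_confirmed):
    """Scale the manufacturer's nominal coverage boundary (a list of
    (angle_deg, range_m) samples) so that the farthest detection confirmed at
    maximum confidence just fits inside the scaled shape."""
    angle_deg, distance_m = farthest_confirmed
    # nominal range in the direction of the confirmed detection (nearest sample)
    nominal_at_angle = min(nominal_boundary, key=lambda p: abs(p[0] - angle_deg))[1]
    scale = min(1.0, distance_m / nominal_at_angle)   # design choice: never grow beyond the data sheet
    return [(a, r * scale) for a, r in nominal_boundary]

nominal = [(-45.0, 120.0), (0.0, 200.0), (45.0, 120.0)]      # data-sheet coverage samples
print(scaled_perception_range(nominal, farthest_confirmed=(0.0, 150.0)))
```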
  • the estimated perception range may be evaluated for specific sets of objects, such as groupings by vehicle category, and/or the like.
  • the egovehicle may generate a point cloud using the egovehicle's detections and time-calibrated detections received from other vehicles.
  • the egovehicle may calculate a best fit point to best represent each cloud, and determine how many detection points fall within an error tolerance of the best fit point for each cloud. For example, if two or more received points and the egovehicle detection are within the error tolerance, the detection distance may receive the maximum certainty value. If only one received point and the egovehicle detection are within the range, the detection distance may receive a middle value certainty. If only the egovehicle detection is within the range, the detection distance may receive a minimum certainty value.
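  • A minimal sketch of this point-cloud fusion is given below: the centroid serves as the best-fit point, and the certainty level follows the count of received detections within the error tolerance (maximum for two or more, a middle value for one, minimum for none); the concrete certainty values and tolerance are illustrative assumptions.

```python
def cluster_certainty(ego_point, received_points, tolerance_m=2.0,
                      certainty_levels=(0.0, 0.5, 1.0)):
    """Fuse one egovehicle detection with detections received from other
    vehicles: take the centroid of the cloud as the best-fit point and grade
    the certainty by how many received points fall within the tolerance."""
    cloud = [ego_point] + list(received_points)
    cx = sum(p[0] for p in cloud) / len(cloud)
    cy = sum(p[1] for p in cloud) / len(cloud)
    def within(p):
        return ((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5 <= tolerance_m
    if not within(ego_point):
        return (cx, cy), certainty_levels[0]
    inliers = sum(1 for p in received_points if within(p))
    level = certainty_levels[2] if inliers >= 2 else certainty_levels[min(inliers, 1)]
    return (cx, cy), level

best_fit, certainty = cluster_certainty((100.0, 3.0), [(100.8, 3.2), (99.5, 2.9)])
print(best_fit, certainty)   # two agreeing received points -> maximum certainty
```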
  • the received detections may be used to diagnose the sensor system. For example, a large number of objects detected by other vehicles but never by the egovehicle within a certain detection sector may indicate limitations of the egovehicle's sensors (e.g., a blocked sensor, malfunctioning sensor, range-limited sensor, etc.), such as shown in FIG. 3. As such, the egovehicle continuously tracks its perception capability, which can be utilized in various safety features. Safety features may include, but are not limited to, requesting the driver to take over control of the vehicle due to substandard perception capability, indicating a sensor re-calibration need, or indicating a serious malfunction in the sensor system.
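  • Such a sector-based diagnosis could, for example, bucket the bearings of objects that other vehicles report but the egovehicle never detects, as in the hypothetical sketch below; the sector width and miss threshold are assumptions.

```python
from collections import Counter

def blind_sector_report(missed_angles_deg, sector_width_deg=30, threshold=10):
    """Bucket the bearings of objects reported by other vehicles but never seen
    by the egovehicle; a sector that accumulates many misses hints at a blocked,
    malfunctioning, or range-limited sensor covering that direction."""
    counts = Counter(int(a // sector_width_deg) * sector_width_deg
                     for a in missed_angles_deg)
    return [sector for sector, n in counts.items() if n >= threshold]

# Twelve misses clustered around +30..60 degrees suggest trouble on the right front.
misses = [31.0, 35.5, 40.2, 44.0, 47.3, 50.1, 52.8, 55.0, 57.6, 58.9, 59.3, 59.9, 120.0]
print(blind_sector_report(misses))   # -> [30]
```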
  • temporary inability of the egovehicle to track particular objects may not indicate limited sensors if a threshold time lapse since the egovehicle's last detection of an object has not been surpassed (e.g., to allow time for a passing vehicle, passing a cluster of trees, etc.).
  • FIG. 7 depicts an exemplary embodiment of an architecture of data structures and information as used in some embodiments.
  • the system 702 of the egovehicle may maintain a list of tracked objects 705, maintain structured data 710, record information related to the egomotion of the egovehicle 715 (e.g., heading, velocity, coordinates, vehicle category, etc.), utilize a communication module for V2V and/or other communications 720, utilize a clock/timing module 725, maintain a database of current perception capabilities 730, and iteratively refine those perception capabilities through perception refinement 735 based on its own sensor readings and data responses received from other vehicles through communications 740, and/or the like.
  • Other vehicles may be expected to have systems (702b, 702n) having comparable features to the egovehicle, such as lists of tracked objects (705b, 705n), structured data (710b, 710n), record information (715b, 715n), communications (720b, 720n), clocks (725b, 725n), and/or the like.
  • the egovehicle may then make this updated perception range (or any other current perception capabilities) available to all self-driving and/or safety applications present in the egovehicle (650).
  • If this updated perception range is below a predefined threshold permitted for the present speed of the self-driven egovehicle, a system safety function may determine that the driver must take over control of the vehicle from a self-driving functionality.
  • the egovehicle may delete from the perception database any material which is older than a predetermined value (655) (e.g., any object that has not been detected for 2 seconds; or keep the 25 newest high-certainty detections and delete the oldest after a new one has been received; etc.).
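  • A hedged sketch of this pruning step, using the example retention values above (a 2-second age limit and the 25 newest detections), might look as follows; the entry layout is an assumption.

```python
def prune_perception_db(entries, now_s, max_age_s=2.0, keep_newest=25):
    """Drop entries that have not been re-detected within max_age_s and cap the
    database at the keep_newest most recent detections. Entries are
    (timestamp_s, detection) pairs; the retention values are configurable."""
    fresh = [(t, d) for t, d in entries if now_s - t <= max_age_s]
    fresh.sort(key=lambda e: e[0], reverse=True)     # newest first
    return fresh[:keep_newest]

db = [(10.0, "obj-A"), (11.5, "obj-B"), (8.0, "obj-C")]
print(prune_perception_db(db, now_s=12.0))           # obj-C is older than 2 s and is dropped
```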
  • FIG. 8 illustrates an embodiment of the communication process of sharing detections between vehicles.
  • all V2V equipped vehicles will receive the current perceptions of V2V equipped vehicles in the vicinity. This data enables maximum situational awareness among all V2V equipped vehicles, as potentially hidden objects are likely to be identified thanks to overlapping perception ranges from various angles among the plurality of vehicles.
  • egovehicle 805 may broadcast a perception data request message 810 (including its own detection data) to other vehicles 807.
  • Egovehicle 805 may then enter a waiting period 815 in which it may receive detection data responses 820a, 820b, 820n from any or all other vehicles 807.
  • egovehicle 805 may perform various operations 830 (such as those discussed above in relation to FIG. 6), including but not limited to matching detections and updating the perception capability database.
  • the present disclosure may also be adapted to additional situations, including but not limited to the following.
  • a new broadcast message may be defined, which triggers perceptions simultaneously in all the V2V equipped vehicles within a geographical area defined in the triggering message (e.g., within 250 meters of the egovehicle).
  • the time stamps and heading information could be omitted from messages, since all Distance, Angle and Velocity relate to the 'snapshot' time, and there is no need for the egovehicle to correct or refine the received response data for time differences between sensor readings for various vehicles.
  • If the 'Current perception capabilities' parameters are logged with time and vehicle coordinates, the sensor performance as a function of weather conditions may be estimated for that vehicle model and make. This information could be used instead of an updated perception capabilities database in case the egovehicle is driving for long periods without any other vehicles around, but with weather information available.
  • the changes in the detection range may also be transmitted to other vehicles as digital data that complements weather info of the area. Since different car makes and models will have different sensors installed, the 'weather info' should be in a normative form (e.g., what is the performance degradation in various wavelengths and frequencies). Therefore, the consequences of weather on sensor performance of each vehicle may be estimated at the sensor level, which requires dealing with the functions below the perception system level.
  • a vehicle may estimate the sensor system's performance without any Detection Data Request by comparing the distance and angle in the list of tracked objects with the location of the matching vehicle in the dynamic data storage. The result will have lower confidence levels, since in this method the algorithm cannot utilize the detections of other vehicles to confirm the object.
  • Some causes of sensor performance degradation create a constant degradation, which can be repaired only by redirecting the sensor(s), removing the hindering obstacle(s), and possibly calibrating the sensors after the fix. Examples include mechanical contact with a sensor area; installation of accessory equipment on a front bumper or at other sensor areas at the sides or rear; re-installing or replacing the bumper, grille, or other parts at sensor areas; a rough carwash; or the like.
  • Some cases may cause performance to degrade steadily. For example, the sensors gradually becoming covered with snow, ice, dirt, etc. This may be especially true if degradation is caused by gradually accumulating dirt. Snow and ice can thaw away, and therefore the sensor performance may recover at least partially (partially because in road conditions snow and/or ice is generally accompanied with dirt) in warmer conditions. If the sensor is behind the windshield in an area cleaned by wipers, it is more protected from blockage by snow, ice, dirt, or the like, unless the windshield wipers fail. The same applies if the sensor is behind a similarly cleaned area, such as perhaps within the headlights (in the case of headlight washers and/or wipers).
  • FIG. 9 illustrates an exemplary embodiment of a perception capabilities database (forward looking perception system).
  • an egovehicle database may record the real-time or actual perception capabilities of the egovehicle 905.
  • the egovehicle sensors, as monitored by the methods disclosed herein, may have their actual detection range determined, including detection ranges based on the class of the object detected.
  • the methods disclosed herein may result in the egovehicle 905 determining functional detection ranges for different classes of vehicles, such as motorcycles (functional range 910), cars (functional range 920), and trucks (functional range 930). In some embodiments, such ranges may be determined by the maximum ranges at which different classes of vehicles have been detected.
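  • As a simple illustration (not the patent's database schema), the per-class functional ranges could be derived as the maximum confirmed detection distance per vehicle class, as sketched below; the class labels follow the categories listed earlier.

```python
def functional_ranges(confirmed_detections):
    """Derive a per-class functional detection range as the maximum distance at
    which each vehicle class has actually been confirmed (a simple reading of
    the kind of database shown in FIG. 9)."""
    ranges: dict[str, float] = {}
    for vehicle_class, distance_m in confirmed_detections:
        ranges[vehicle_class] = max(ranges.get(vehicle_class, 0.0), distance_m)
    return ranges

detections = [("motorcycle", 95.0), ("passengerCar", 160.0),
              ("passengerCar", 142.0), ("heavyTruck", 210.0)]
print(functional_ranges(detections))
# {'motorcycle': 95.0, 'passengerCar': 160.0, 'heavyTruck': 210.0}
```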
  • a method of adjusting vehicle operation responsive to a detected deficit in perception sensing comprising: receiving, at a first vehicle, from one or more vehicles of a plurality of vehicles within a predefined range of the first vehicle, information regarding the location of the one or more vehicles; comparing the information regarding the location of each of the one or more vehicles with information derived from sensors of the first vehicle; responsive to a determination that at least one vehicle of the plurality of vehicles for which the received information regarding location of the vehicle corresponds to a location within a predetermined sensing region around the first vehicle is not indicated to be at that location by the information derived from sensors of the first vehicle: adjusting the function of at least one driving function of the first vehicle.
  • the method may include wherein adjusting the function comprises at least one of: disengaging the function, slowing the vehicle, or alerting a human occupant to take over control of the vehicle.
  • the method may include wherein the information received at the first vehicle is received by a V2V communication system.
  • the method may include wherein the V2V communication system comprises a WTRU.
  • the method may include wherein the first vehicle and at least one of the one or more vehicles have V2V functionality.
  • a vehicle comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: receiving, at a first vehicle, from one or more vehicles of a plurality of vehicles within a predefined range of the first vehicle, information regarding the location of the one or more vehicles; comparing the information regarding the location of each of the one or more vehicles with information derived from sensors of the first vehicle; responsive to a determination that at least one vehicle of the plurality of vehicles for which the received information regarding location of the vehicle corresponds to a location within a predetermined sensing region around the first vehicle is not indicated to be at that location by the information derived from sensors of the first vehicle: adjusting the function of at least one driving function of the first vehicle.
  • a method comprising: receiving, at an autonomous vehicle, a message indicating a first location estimate of a nearby vehicle; operating at least one sensor of the autonomous vehicle to generate a second location estimate of the nearby vehicle; comparing the first location estimate with the second location estimate to determine whether a difference between the first and second location estimates exceeds a threshold; in response to a determination that a difference between the first and second location estimates exceeds the threshold, initiating a transition from an autonomous driving mode to a manual driving mode.
  • the method may also include wherein the threshold is an absolute distance threshold.
  • the threshold is a percentage distance threshold.
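  • The threshold comparison might be realized as in the sketch below, which supports either an absolute distance threshold or a percentage threshold relative to the reported range; the function name and parameters are illustrative assumptions.

```python
import math

def exceeds_threshold(v2v_xy, sensed_xy, abs_threshold_m=None, pct_threshold=None):
    """Compare the V2V-reported and the sensor-derived location estimates.
    Either an absolute distance threshold or a percentage-of-range threshold
    (relative to the V2V-reported range) may be supplied."""
    error = math.dist(v2v_xy, sensed_xy)
    if abs_threshold_m is not None:
        return error > abs_threshold_m
    if pct_threshold is not None:
        return error > pct_threshold * math.dist((0.0, 0.0), v2v_xy)
    raise ValueError("a threshold must be provided")

# 3.6 m of disagreement at ~100 m range: below a 5 m absolute threshold,
# but above a 2 % percentage threshold, which would trigger the mode transition.
print(exceeds_threshold((100.0, 5.0), (97.0, 7.0), abs_threshold_m=5.0))   # False
print(exceeds_threshold((100.0, 5.0), (97.0, 7.0), pct_threshold=0.02))    # True
```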
  • the method may also include wherein the message indicating the first location estimate of the nearby vehicle is received from the nearby vehicle.
  • the method may also include wherein the message indicating the first location estimate of the nearby vehicle is received from a vehicle other than the nearby vehicle.
  • an autonomous vehicle comprising a processor and a non- transitory storage medium, the storage medium storing instructions operative to perform functions comprising: receiving, at an autonomous vehicle, a message indicating a first location estimate of a nearby vehicle; operating at least one sensor of the autonomous vehicle to generate a second location estimate of the nearby vehicle; comparing the first location estimate with the second location estimate to determine whether a difference between the first and second location estimates exceeds a threshold; in response to a determination that a difference between the first and second location estimates exceeds the threshold, initiating a transition from a self-driving mode to a manual driving mode.
  • a method comprising: sending a detection data request from a first vehicle; receiving at least a first response to the detection data request from at least a second vehicle; determining the location of a plurality of vehicles within a detection range of the first vehicle based on sensor data from at least a first sensor measured by the first vehicle and the at least first response to the detection data request; determining a certainty value for each of the plurality of vehicles based on a comparison of the sensor data gathered by the first vehicle and the at least first response to the detection data request; and responsive to a determination that a range for one or more vehicles detected through the at least first sensor and a range determined from the at least first response is different: determining that a functional range of the at least first sensor is restricted.
  • the method may further comprise providing an alert that an autonomous driving mode of the first vehicle cannot be used.
  • the method may further comprise disabling an autonomous mode of the first vehicle.
  • the method may further comprise, after sending the detection data request, waiting for a defined waiting period for responses from the at least second vehicle.
  • the method may include wherein the detection data request is sent by a V2V communication system.
  • the method may include wherein the V2V communication system comprises a WTRU.
  • the method may include wherein the first vehicle and the at least second vehicle have V2V functionality.
  • the method may include wherein determining the certainty value for each of the plurality of nearby vehicles further comprises: determining a number of vehicles which have detected one of the plurality of nearby vehicles; and assigning a certainty value to detection data related to said one of the plurality of nearby vehicles, wherein the certainty value is based on the number of vehicles which detected said one of the plurality of nearby vehicles.
  • assigning the certainty value further comprises assigning a maximum certainty value when the number of vehicles which have detected one of the plurality of vehicles is greater than or equal to 2.
  • assigning the certainty value further comprises assigning a middle value when the number of vehicles which have detected one of the plurality of vehicles is equal to 1.
  • the method may include wherein assigning the certainty value further comprises assigning a minimum value when the number of vehicles which have detected one of the plurality of vehicles is equal to 0.
  • the method may include wherein determining whether vehicles have detected the same vehicle includes an error tolerance factor.
  • the method may further comprise comparing the certainty value with the sensor data of the first vehicle to generate an estimate of the first vehicle's current perception range.
  • the method may include wherein if the current perception range is below a predefined threshold for a current speed of the first vehicle, requiring a driver of the first vehicle to resume control from a self-driving functionality of the first vehicle.
  • a method of sensor performance monitoring for a self-driving vehicle comprising: broadcasting a detection data request; waiting for at least a first response from at least one nearby vehicle; calculating locations of all received objects at the time of a last self-driving vehicle perception time; matching objects with the calculated locations; generating a certainty value for each object; storing the objects in a perception capability database of the self-driving vehicle; generating an estimate of the self-driving vehicle's current perception range; and making the estimated current perception range available to all self-driving and safety applications onboard the self-driving vehicle.
  • the method may further comprise deleting over-age readings from the perception capability database.
  • the method may further comprise, prior to broadcasting, determining whether a most recent detected data response is older than a defined update time window.
  • the method may further comprise, prior to broadcasting, determining whether a vehicle is entering or leaving the perception range of the self-driving vehicle.
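  • Purely to make the ordering of the steps in FIG. 6 concrete, the sketch below strings the steps together around a hypothetical `ego` object whose methods (broadcasting, waiting, matching, storing, pruning) are placeholders and not an API defined by the patent.

```python
import time

def monitoring_cycle(ego, update_window_s=1.0):
    """One pass of the monitoring loop described above, composed of hypothetical
    ego-vehicle methods; shown only to make the ordering of the steps concrete."""
    if time.time() - ego.last_response_time < update_window_s:
        return                                    # most recent data is still fresh enough
    ego.broadcast_detection_data_request()        # step 615
    responses = ego.wait_for_responses()          # step 620
    aligned = ego.align_to_last_detection_time(responses)   # step 625
    matches = ego.match_with_own_detections(aligned)        # step 630
    ego.store_with_certainty(matches)             # steps 635-640
    ego.update_perception_range_estimate()        # step 645
    ego.publish_to_driving_applications()         # step 650
    ego.prune_old_entries()                       # step 655
```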
  • a method for estimating and broadcasting range of perception in a first vehicle comprising: sending a detection data request; determining the location of all vehicles within a communication range based on the response messages; determining a certainty value for each vehicle based on a comparison of the first vehicle's measurements and received measurements; and responsive to a determination that an object's range detected through perception sensors and said object's range determined using received V2V messages is different: determining that the range of the perception sensor is restricted; and alerting the user that the autonomous mode cannot be used.
  • a system comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: sending a detection data request from a first vehicle; receiving at least a first response to the detection data request from at least a second vehicle; determining the location of a plurality of vehicles within a detection range of the first vehicle based on sensor data from at least a first sensor measured by the first vehicle and the at least first response to the detection data request; determining a certainty value for each of the plurality of vehicles based on a comparison of the sensor data gathered by the first vehicle and the at least first response to the detection data request; and responsive to a determination that a range for one or more vehicles detected through the at least first sensor and a range determined from the at least first response is different: determining that a functional range of the at least first sensor is restricted.
  • a self-driving vehicle comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: broadcasting a detection data request; waiting for at least a first response from at least one nearby vehicle; calculating locations of all received objects at the time of a last self-driving vehicle perception time; matching objects with the calculated locations; generating a certainty value for each object; storing the objects in a perception capability database of the self-driving vehicle; generating an estimate of the self-driving vehicle's current perception range; and making the estimated current perception range available to all self-driving and safety applications onboard the self-driving vehicle.
  • a vehicle comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: sending a detection data request; determining the location of all vehicles within a communication range based on the response messages; determining a certainty value for each vehicle based on a comparison of the first vehicle's measurements and received measurements; and responsive to a determination that an object's range detected through perception sensors and said object's range determined using received V2V messages is different: determining that the range of the perception sensor is restricted; and alerting the user that the autonomous mode cannot be used.
  • Exemplary embodiments disclosed herein are implemented using one or more wired and/or wireless network nodes, such as a wireless transmit/receive unit (WTRU) or other network entity.
  • FIG. 10 is a system diagram of an exemplary WTRU 102, which may be employed in embodiments described herein.
  • the WTRU 102 may include a processor 118, a communication interface 119 including a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, a non-removable memory 130, a removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and sensors 138.
  • the processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuit (ASIC) circuits, Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like.
  • the processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment.
  • the processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 10 depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
  • the transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station over the air interface 116.
  • the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples.
  • the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
  • the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 116.
  • the transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122. As noted above, the WTRU 102 may have multi-mode capabilities. Thus, the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
  • the processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128.
  • the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.
  • the non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
  • the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
  • the processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102.
  • the power source 134 may be any suitable device for powering the WTRU 102.
  • the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel- zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li -ion), and the like), solar cells, fuel cells, and the like.
  • the processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102.
  • the WTRU 102 may receive location information over the air interface 116 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
  • the processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 138 may include sensors such as an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
  • FIG. 11 depicts an exemplary network entity 190 that may be used in embodiments of the present disclosure.
  • network entity 190 includes a communication interface 192, a processor 194, and non-transitory data storage 196, all of which are communicatively linked by a bus, network, or other communication path 198.
  • Communication interface 192 may include one or more wired communication interfaces and/or one or more wireless-communication interfaces. With respect to wired communication, communication interface 192 may include one or more interfaces such as Ethernet interfaces, as an example. With respect to wireless communication, communication interface 192 may include components such as one or more antennae, one or more transceivers/chipsets designed and configured for one or more types of wireless (e.g., LTE) communication, and/or any other components deemed suitable by those of skill in the relevant art. And further with respect to wireless communication, communication interface 192 may be equipped at a scale and with a configuration appropriate for acting on the network side— as opposed to the client side— of wireless communications (e.g., LTE communications, Wi-Fi communications, and the like). Thus, communication interface 192 may include the appropriate equipment and circuitry (perhaps including multiple transceivers) for serving multiple mobile stations, UEs, or other access terminals in a coverage area.
  • wireless communication interface 192 may include the appropriate equipment and circuitry (perhaps including multiple transceivers)
  • Processor 194 may include one or more processors of any type deemed suitable by those of skill in the relevant art, some examples including a general-purpose microprocessor and a dedicated DSP.
  • Data storage 196 may take the form of any non-transitory computer-readable medium or combination of such media, some examples including flash memory, read-only memory (ROM), and random-access memory (RAM) to name but a few, as any one or more types of non- transitory data storage deemed suitable by those of skill in the relevant art could be used.
  • data storage 196 contains program instructions 197 executable by processor 194 for carrying out various combinations of the various network-entity functions described herein.
  • ROM read only memory
  • RAM random access memory
  • register cache memory
  • semiconductor memory devices magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD- ROM disks, and digital versatile disks (DVDs).
  • a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Abstract

Systems and methods are disclosed for monitoring the performance of the environment perception sensor system while driving on freeways or inter-urban roads, with the help of other vehicles and their location and perception information and the like, received via V2V broadcasts. This enables each vehicle to be continuously aware of its current detection range, which is safety-critical information for all self-driving functions. This monitoring also reveals continuous sub-performance, in which case the vehicle owner needs to have the sensor system checked.

Description

METHOD AND SYSTEM FOR ONLINE PERFORMANCE MONITORING
OF THE PERCEPTION SYSTEM OF ROAD VEHICLES
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application is a non-provisional filing of, and claims benefit under 35 U.S.C. § 119(e) from, U.S. Provisional Patent Application Serial No. 62/321,457, filed April 12, 2016, entitled "METHOD AND SYSTEM FOR ONLINE PERFORMANCE MONITORING OF THE PERCEPTION SYSTEM OF ROAD VEHICLES," which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
[0002] The present disclosure generally relates to automotive perception sensors.
BACKGROUND
[0003] It is becoming increasingly important for the performance of the perception sensors to be regularly monitored and inspected, since more and more vehicle actions are based on automatic decisions, which are very much dependent on sensor readings. Sensor performance may degrade, for example, in the following cases: a head-on collision, or other mechanical contact with a sensor area; installation of accessory equipment on the front bumper, or on other sensor areas at the sides or rear; re-installing or replacing the bumper, grille, or other parts in the vicinity of a sensor; and/or the like.
[0004] Sensor performance degradation is expected to be an even more serious problem in the future, when more and more self-driving cars are expected to work 24/7 in various weather conditions. Continuous monitoring of sensor performance is needed during driving, and especially in situations such as when sensors become covered with snow, ice, or dirt; or when driving in heavy rain, snowstorms, hailstorms, dense fog, or dust.
[0005] As sensors become covered with snow, ice, or dirt, the received signal strength is reduced, and as a consequence the sensing distance or the fineness of detail is reduced. Some sensors may detect this condition in their self-diagnostic phase, while others do not.
[0006] When driving in heavy rain, snowstorms, hailstorms, dense fog, or dust, an increased amount of the sensing signal is scattered in the air, and the received signal is weaker. As a consequence, the reliable sensing distance is reduced. Since scattering is stronger at optical wavelengths, some sensors such as LIDAR, cameras, and the like are more sensitive to adverse weather conditions. The self-diagnostics of sensors may have severe difficulties in picking up these conditions.
[0007] Many sensor manufacturers have self-diagnostics functionality built into each sensor. However, such functionality generally can only detect very obvious errors. For example, a radar can tell whether its internal power-on checks were successful, or a laser scanner can detect whether its front cover is totally blocked.
[0008] Sensor monitoring and calibration is described in, for example, US2015/0066412A1, US7162339B2, US2010/0235129A1, DE102011017593A1, and DE102011112243A1.
SUMMARY
[0009] It is becoming increasingly important that the performance of the perception sensors is regularly monitored, since more and more vehicle actions are based on automatic decisions, which are very much dependent on situational awareness given mainly by the sensor set of the vehicle.
[0010] Currently there are not many methods for online sensor performance checks. Using other vehicles as test targets as disclosed herein may have benefits over stationary objects at the roadside, including but not limited to: test targets can be available even in very feature-poor environments; the vehicle perception system is tuned to find and track these targets; they are well defined and detectable despite adverse conditions, such as heavy snow; many of them communicate, allowing an additional channel to assure detection; and they can appear 360° around the vehicle, throughout the detection range and at varying velocities, meaning that they can be used to test several of the perception sensors of the vehicle. Additionally, velocity is useful in sensor performance assessment, especially for radars. Moving vehicles may often have lights, which provides further information for sensor performance analysis, such as assessing the performance of a camera system.
[0011] In one embodiment, disclosed herein is a method comprising: identifying a nearby set of vehicles to a first vehicle based on received wireless messages from a plurality of vehicles; identifying an observed set of vehicles based on sensors of the first vehicle; and responsive to a determination that at least one vehicle in the nearby set is not in the observed set, performing at least one perception-deficit-response action.
[0012] In one embodiment, there is a system comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions such as, but not limited to, those set forth above.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] A more detailed understanding may be had from the following description, presented by way of example in conjunction with the accompanying drawings, wherein:
[0014] FIG. 1 depicts an exemplary embodiment of the detection range of a vehicle having 360 degree perception capability.
[0015] FIG. 2 depicts an exemplary embodiment of the sensor detection range of a vehicle in ideal conditions and in adverse conditions.
[0016] FIG. 3 depicts an exemplary embodiment of the sensor detection range of a vehicle with a damaged right front sensor.
[0017] FIG. 4A depicts an exemplary embodiment of a first vehicle detecting a second vehicle, and the associated perceived information.
[0018] FIG. 4B depicts an exemplary embodiment of the second vehicle detecting the first vehicle, and the associated perceived information.
[0019] FIG. 5 depicts an exemplary embodiment of a typical traffic situation on a freeway, from the perspective of an egovehicle at a given moment in time.
[0020] FIG. 6 is a flow diagram of an exemplary embodiment of a method of sensor performance monitoring operation.
[0021] FIG. 7 depicts an exemplary embodiment of an architecture of data structures and information as used in the present disclosure.
[0022] FIG. 8 illustrates an embodiment of the communication process of sharing detections between vehicles.
[0023] FIG. 9 illustrates an exemplary embodiment of a perception capabilities database (forward looking perception system).
[0024] FIG. 10 illustrates an exemplary wireless transmit/receive unit (WTRU) that may be employed in some embodiments.
[0025] FIG. 11 illustrates an exemplary network entity that may be employed in some embodiments.
DETAILED DESCRIPTION
[0026] A detailed description of illustrative embodiments will now be provided with reference to the various Figures. Although this description provides detailed examples of possible implementations, it should be noted that the provided details are intended to be by way of example and in no way limit the scope of the application.
[0027] Note that various hardware elements of one or more of the described embodiments are referred to as "modules" that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits
(ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as media commonly referred to as RAM, ROM, etc.
[0028] Modern vehicles have radars, cameras, laser scanners and ultrasonic perception sensors. They mainly provide the distance and angle to the nearest object(s), typically to surrounding vehicle(s). Most ADAS and V2V equipped cars have at least forward looking sensors. New vehicles are often equipped with a sensor system capable of 360° perception around the vehicle. FIG. 1 illustrates a simplified depiction of the detection (or perception) range 1007 of a vehicle 1005, which has 360° perception capability. In FIG. 1, the vehicle 1005 is heading to the right. In an exemplary scenario, if the vehicle is equipped with an adaptive cruise control (ACC) system, the forward looking distance may be about 200 meters. Confirmed objects 1010 may each have a distance 1015 and angle 1020 from the vehicle 1005.
[0029] It is becoming increasingly important that the performance of the perception sensors is constantly monitored, since more and more vehicle actions are based on automatic decisions, which are highly dependent on the vehicle's situational awareness, which in turn is highly dependent on sensor readings.
[0030] Especially when driving in adverse weather conditions, the sensor performance may vary with the weather. Therefore, sensor performance monitoring should be carried out frequently and automatically during normal driving.
[0031] FIG. 2 illustrates a simplified depiction of the sensor detection range of a vehicle 205 in ideal conditions (range 210), and in adverse conditions (range 215), such as dirt on the sensors. Since adverse weather conditions (e.g., heavy snow) or driving situations (e.g., oncoming car is raising dust in the air) can change rapidly, it is important to frequently monitor the performance of the perception system during driving.
[0032] The self-diagnostics of a perception sensor generally cannot alone detect any sensor heading errors. Also, conditions such as rain, snow, fog, or the like are not easily detectable with the sensor alone. Larger sensor heading errors (such as resulting from a possible bumper contact) or malfunction can only be fixed by mechanical adjustment or repairs.
[0033] FIG. 3 illustrates a simplified depiction of the sensor detection range 310 of a vehicle 305 with a damaged right front sensor (or otherwise impaired sensor). While some objects 315 are detected and have their distance and angle relative to vehicle 305 confirmed, along a portion 320 the vehicle 305 is unable to detect anything. In an alternative example, the sensor view could be blocked, such as by a retrofitted fog light, or the like. In other examples, the sensor may be so dirty that it is practically nonfunctioning. In any case, the vehicle 305 is partly blind (in non-detection area 320) and should be inspected and repaired soon, because its functions, potentially including self-driving, are severely compromised.
[0034] V2V communication equipped vehicles may be repeatedly broadcasting information (such as their location, heading, velocity, etc.) to other vehicles in the area. In typical conditions, this message may carry 300 meters or more. This means that a particular vehicle can often be aware of other vehicles beyond its sensor detection range.
[0035] It is becoming increasingly important that the performance of the perception sensors is regularly monitored, since more and more vehicle actions are based on automatic decisions, which are very much dependent on situational awareness given mainly by the sensor set of the vehicle.
[0036] Currently there are not many methods for online sensor performance checks. Using other vehicles as test targets has a number of benefits over stationary objects at the roadside, including but not limited to: test targets can be available even in very feature-poor environments; the vehicle perception system is tuned to find and track these targets; they are well defined and detectable despite adverse conditions, such as heavy snow; many of them communicate, allowing an additional channel to assure detection; and they can appear 360° around the vehicle, throughout the detection range and at varying velocities, meaning that they can be used to test all the perception sensors of the vehicle. Additionally, velocity is useful in sensor performance assessment, especially for radars. Moving vehicles may often have lights, which provides further information for sensor performance analysis, such as assessing the performance of a camera system.
[0037] In an exemplary embodiment, two oncoming vehicles - both having perception capability - measure how far away and at what angle the other vehicle is, and share that information. This allows both vehicles to get an independent estimation of their respective perception ranges and angular correctness.
[0038] FIGS. 4A-4B illustrate simplified depictions of the interaction of two oncoming vehicles. A first vehicle 405 (referred to hereafter as the "egovehicle") detects a second vehicle 410 at a distance 422a and angle 424a (in the egovehicle's coordinate system, relative to direction of travel 420a) at a certain time. When the egovehicle 405 receives the second vehicle's detections (e.g., distance 422b and angle 424b in vehicle 410's coordinate system, relative to its direction of travel 420b), the egovehicle 405 can deduce how far its sensors were able to 'see' the second vehicle 410 at this point. Since the detections will likely take place at different times, the information exchange may also include a measurement time stamp, and vehicle speed and heading (e.g., 407 and 412), and/or the like, to allow calculations to match the times of the measurements.
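By way of illustration only, the following Python sketch shows one possible way to use the exchanged time stamp, speed, and heading to shift the second vehicle's reported position to the egovehicle's measurement time; the function name, planar coordinate assumption, and units are illustrative assumptions, not features of the disclosure.

```python
import math

def align_position(xy_m, speed_mps, heading_rad, t_meas_s, t_target_s):
    """Dead-reckon a reported (x, y) position [m] from its measurement time to
    the egovehicle's own detection time, assuming constant speed and heading."""
    dt = t_target_s - t_meas_s                     # time offset between the two measurements
    dx = speed_mps * dt * math.cos(heading_rad)    # displacement along the reported heading
    dy = speed_mps * dt * math.sin(heading_rad)
    x, y = xy_m
    return (x + dx, y + dy)

# Example: vehicle 410 reported itself at (100.0, 5.0) m at t = 10.0 s while driving
# 20 m/s along heading 0 rad; estimate its position at the egovehicle's detection time 10.4 s.
vehicle_410_at_ego_time = align_position((100.0, 5.0), 20.0, 0.0, 10.0, 10.4)
```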
[0039] In a more general case (see FIG. 5), the egovehicle broadcasts the request to share detected objects with all vehicles in the communication range. When this is done frequently enough - initiated by the egovehicle or some other vehicle in the area - the vehicles can maintain an up-to-date measure of what their actual detection range is.
[0040] In some embodiments of the present disclosure, the disclosed approach is applied only on freeways or inter-urban roads, as in dense cities the vehicles may often be so close to each other that it is not always clear which of the detected objects is which vehicle - especially in traffic jams. Further, in many instances it is more important to monitor the true range of the perception in situations involving highways, freeways, and inter-urban roads, for example due to the higher speeds used.
[0041] An exemplary embodiment of the present disclosure is set forth with two vehicles exchanging their perceived information, as in FIGS. 4A-4B and 5. Generally, several entities may be involved: the egovehicle, one or more other V2V equipped vehicles (including both self-driven and manually driven vehicles), and in some instances one or more non-V2V equipped vehicles.
[0042] In one embodiment, there is a Vehicle 0 (egovehicle), which may be self-driven or manually driven. In some embodiments, the egovehicle may further comprise: a time management module satisfying the requirements given in DSRC or C-ITS specifications; one or more environment perception sensors; a perception system, which monitors the perception sensors and keeps track of the detected objects; a V2V communication system including BSM (DSRC) or CAM (C-ITS) message reception; a module for location tracking of other vehicles based on received BSM/CAM messages; processing capability to match and quantify received perceptions quickly; and an updatable database for storing information about sensor perception capability of the egovehicle. Additional features may be included, or the recited features may be modified as known to one of ordinary skill in the art.
[0043] Other V2V equipped self-driven or manually driven vehicles may have similar functionalities to the egovehicle as described above. In some embodiments, those other vehicles may be provided with only: a time management module satisfying the requirements given in DSRC or C-ITS specifications; one or more environment perception sensors; a perception system, which monitors the perception sensors and keeps track of the detected objects; and a V2V communication system including BSM (DSRC) or CAM (C-ITS) message reception.
[0044] Non-V2V equipped vehicles need no specific features, as they are only used as potential common landmarks possibly tracked by the egovehicle and/or other V2V vehicles.
[0045] Generally, it is preferred that the detection range of the egovehicle is tested when other vehicles are about to enter or leave its perception range. This enables sensor performance testing near its limits. Since 250 meters is the typical upper limit of most perception sensors, an interurban road or highway with a 250-meter line-of-sight is ideal for this. However, the present disclosure is applicable with shorter line-of-sight as well, but may not test the upper limit of the sensor system.
[0046] FIG. 5 illustrates a simplified depiction of a typical traffic situation on a freeway (as illustrated in FIG. 5, there is a four-lane freeway scenario, however alternative scenarios such as urban 2 lane roads are also envisioned herein). An egovehicle 515 knows or receives the locations, heading, speed, and vehicle type of other V2V equipped vehicles within its communication range, and tracks some vehicles within its perception range. In some instances, there may also be vehicles capable of Visible Light Communication. Such communication is included among the forms of V2V communication in this disclosure.
[0047] In FIG. 5, there are a plurality of vehicles, where vehicles 520, 526, 534, 536, 540, 552, and 554 do not have V2V functionality, while vehicles 522, 524, 528, 530, 532, 538, 542, 544, 546, 548, and 550 do have V2V functionality. Also, egovehicle 515 has a currently assumed perception range 505, and a current awareness range via V2V communications 510.
[0048] In the moment depicted in FIG. 5, the egovehicle 515 cannot see vehicle 538 due to the bushes, nor vehicle 540 behind vehicle 536.
[0049] In one embodiment, the egovehicle 515 may identify a nearby set of vehicles based on received wireless messages from a plurality of vehicles (e.g., vehicles within the awareness range via V2V 510). The egovehicle 515 may then identify an observed set of vehicles based on its sensors (such as radar, LiDAR, optical sensors, camera, etc.) within a currently assumed perception range 505. In the embodiment of FIG. 5, egovehicle 515 may observe a set of vehicles including vehicles 536 and 542. Among the nearby set of vehicles is also vehicle 532, which has sent a wireless message including its location to egovehicle 515. In an embodiment, egovehicle 515 only evaluates vehicles in the nearby set which are within a previously calculated perception range of the egovehicle 505. Here, because the egovehicle 515 "knows" that vehicle 532 is in the nearby set, and is within its perception range 505 but is not observed by the egovehicle's sensors, egovehicle 515 may determine that it has a perception deficit. Given this perception deficit of a vehicle in the nearby set not being observed by egovehicle 515, the egovehicle may perform at least one perception-deficit response action.
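A minimal Python sketch of this nearby-versus-observed comparison is given below; the data layouts, the placement of the egovehicle at the origin, and the matching tolerance are illustrative assumptions rather than features of the disclosure.

```python
def find_perception_deficits(nearby, observed, perception_range_m, tol_m=5.0):
    """Return identifiers of nearby vehicles (known from V2V messages) that lie
    inside the currently assumed perception range but match no sensor observation.

    nearby   -- dict {vehicle_id: (x, y)} in the egovehicle frame, ego at (0, 0)
    observed -- iterable of (x, y) positions produced by the egovehicle's sensors
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    deficits = []
    for vid, pos in nearby.items():
        in_range = dist(pos, (0.0, 0.0)) <= perception_range_m
        matched = any(dist(pos, obs) <= tol_m for obs in observed)
        if in_range and not matched:
            deficits.append(vid)          # e.g., vehicle 532 in the FIG. 5 scenario
    return deficits
```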
[0050] In some embodiments, the perception-deficit response action may comprise one or more of (but is not limited to): presenting an alert to an occupant of the vehicle; disengaging an autonomous functionality of the vehicle; adjusting a driving parameter of the vehicle, such as due to a reduced perception capability; alerting a human occupant to take over control of the vehicle; generating an alert that a sensor of the vehicle may be malfunctioning; and/or the like. Types of driving parameters may include speed of the vehicle, distance from adjacent vehicles, driving route of the vehicle, whether an autonomous or semi-autonomous vehicle may operate in an autonomous mode, and/or the like.
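Purely as an illustration of dispatching such responses, the sketch below maps an assumed severity level to a list of response actions; the severity names, action names, and callable interface are hypothetical and not part of the disclosure.

```python
# Hypothetical mapping from perception-deficit severity to response actions.
RESPONSE_ACTIONS = {
    "minor":    ["present_alert"],
    "moderate": ["present_alert", "reduce_speed", "increase_headway"],
    "severe":   ["alert_takeover", "disengage_autonomy", "flag_sensor_fault"],
}

def respond_to_deficit(severity, actuators):
    """Invoke each configured perception-deficit response action.

    actuators -- dict mapping action names to callables supplied by the vehicle platform
    """
    for action in RESPONSE_ACTIONS.get(severity, []):
        actuators[action]()
```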
[0051] In another exemplary embodiment related to FIG. 5, the disclosed process may be as follows, and is also depicted as a general flow diagram in FIG. 6.
[0052] If the time from the last received Detection Data Response is longer than a predetermined value (605), and the egovehicle 515 has noticed a vehicle entering or leaving its perception range (610) (e.g., vehicle 544 in FIG. 5), the egovehicle 515 may broadcast a Detection Data Request via its V2V communication device (615), and then wait some period of time for responses (620). In the case of FIG. 5, the egovehicle 515 may broadcast the request to vehicles 522, 524, 528, 530, 532, 538, 542, 544, 546, 548, and 550, which are each V2V equipped vehicles within the V2V communications range 510 of the egovehicle 515. Each of these other vehicles may have similar functionality to the egovehicle.
[0053] In an exemplary embodiment, the message structure of a Detection Data Request may comprise, but is not limited to: a Command, such as "Return your detection and egomotion data"; (Angle, Distance, RelativeVelocity) triples of all objects detected and tracked by the egovehicle; a Last DetectionTime0, and the corresponding Velocity0, Heading0, Coordinates0 of the egovehicle; VechCategory0 (e.g., "moped", "motorcycle", "passengerCar", "bus", "lightTruck", "heavyTruck", "trailer", "specialVehicle", or "tram") of the egovehicle.
[0054] Each of those V2V equipped vehicles that can receive the broadcast may respond with a Detection Data Response broadcast. For example, in FIG. 5, the egovehicle may broadcast the request to vehicles 522, 524, 528, 530, 532, 538, 542, 544, 546, 548, and 550. The number of responding broadcasts may depend on the number of V2V vehicles in the vicinity of the egovehicle (e.g., V2V communications range 510 of egovehicle 515). For example, in the scenario of FIG. 5, the egovehicle would optimally receive a response from each of vehicles 522, 524, 528, 530, 532, 538, 542, 544, 546, 548, and 550.
[0055] In an exemplary embodiment, the message structure of a Detection Data Response may comprise, but is not limited to: a Command, such as "Returning my detection and egomotion data"; (Angle, Distance, RelativeVelocity) triples of all objects detected and tracked by the responding vehicle; a Last DetectionTimeX, and the corresponding VelocityX, HeadingX, CoordinatesX of the responding vehicle; VechCategoryX (e.g., "moped," "motorcycle," "passengerCar," "bus," "lightTruck," "heavyTruck," "trailer," "specialVehicle," or "tram") of the responding vehicle.
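To make the two message layouts concrete, the Python dataclasses below sketch one possible in-memory representation of the Detection Data Request and Response; the field names follow the paragraphs above, while the container types, units, and defaults are assumptions made for the sketch.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# One (Angle [deg], Distance [m], RelativeVelocity [m/s]) triple per tracked object.
Detection = Tuple[float, float, float]

@dataclass
class DetectionDataRequest:
    command: str = "Return your detection and egomotion data"
    detections: List[Detection] = field(default_factory=list)   # egovehicle's tracked objects
    last_detection_time: float = 0.0                             # DetectionTime0 [s]
    velocity: float = 0.0                                        # Velocity0 [m/s]
    heading: float = 0.0                                         # Heading0 [deg]
    coordinates: Tuple[float, float] = (0.0, 0.0)                # Coordinates0
    veh_category: str = "passengerCar"                           # VechCategory0

@dataclass
class DetectionDataResponse:
    command: str = "Returning my detection and egomotion data"
    detections: List[Detection] = field(default_factory=list)   # responder's tracked objects
    last_detection_time: float = 0.0                             # DetectionTimeX [s]
    velocity: float = 0.0                                        # VelocityX [m/s]
    heading: float = 0.0                                         # HeadingX [deg]
    coordinates: Tuple[float, float] = (0.0, 0.0)                # CoordinatesX
    veh_category: str = "passengerCar"                           # VechCategoryX
```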
[0056] The egovehicle 515 may then calculate what the location of all the received objects was at the time of its last detection (625). For this, it uses the received detections together with the coordinates, heading, speed, and detection time found in each broadcasted message.
[0057] The egovehicle 515 may match the objects detected by the egovehicle's sensors with the refined object locations calculated above (630). Based on a mathematical analysis, each egovehicle-detected object may be assigned a detection certainty value (635), which may be stored in a perception capability database (640) together with Angle, Distance, and relative velocity. For example, the certainty value may be at a maximum value if two or more other vehicles have detected the object (within an error tolerance) at the same location as the egovehicle. If only one other vehicle has detected the object at the same location as the egovehicle, the certainty may be halfway between the maximum value and a minimum value. If no other vehicle has responded indicating detection of the object, the certainty may be the minimum value. Based on these factors, the egovehicle may generate a best estimate of the current perception range (645). In some embodiments, the estimate of the perception range may include utilization of several detection certainties generated over a period of time. For example, in one embodiment, the estimated perception range may be obtained by shape-scaling the manufacturer's specified sensor coverage area so that the farthest object detected by the egovehicle and also detected by two other vehicles (e.g., at maximum confidence) just fits within the shape-scaled perception range. In other embodiments, for example, the estimated perception range may be evaluated for specific sets of objects, such as groupings by vehicle category, and/or the like.
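One hedged reading of the shape-scaling rule is sketched below for a single scalar range; in practice the same scale factor could be applied to the full manufacturer coverage shape. The confidence encoding and the choice never to scale beyond the nominal range are assumptions made for the sketch.

```python
def estimate_perception_range(nominal_range_m, detections):
    """Scale the manufacturer-specified range so that the farthest
    maximum-confidence detection just fits inside the estimated range.

    detections -- iterable of (distance_m, certainty) pairs, certainty in [0.0, 1.0]
    """
    confirmed = [d for d, c in detections if c >= 1.0]   # detections corroborated by >= 2 vehicles
    if not confirmed:
        return nominal_range_m                           # nothing to refine against yet
    scale = max(confirmed) / nominal_range_m
    return nominal_range_m * min(scale, 1.0)             # do not extend past the nominal range
```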
[0058] In one embodiment, the egovehicle may generate a point cloud using the egovehicle's detections and time-calibrated detections received from other vehicles. In one embodiment, the egovehicle may calculate a best-fit point to represent each cloud, and determine how many detection points fall within an error tolerance of the best-fit point for each cloud. For example, if two or more received points and the egovehicle detection are within the error tolerance, the detection distance may receive the maximum certainty value. If only one received point and the egovehicle detection are within the range, the detection distance may receive a middle certainty value. If only the egovehicle detection is within the range, the detection distance may receive a minimum certainty value.
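A rough Python illustration of this point-cloud scoring rule follows; using the centroid as the best-fit point, a 2-meter tolerance, and the specific certainty values are all assumptions made for the sketch.

```python
def certainty_for_detection(ego_point, received_points, tol_m=2.0,
                            max_c=1.0, mid_c=0.5, min_c=0.0):
    """Score one egovehicle detection by counting how many time-calibrated
    points received from other vehicles fall near the cloud's best-fit point."""
    cloud = [ego_point] + list(received_points)
    cx = sum(p[0] for p in cloud) / len(cloud)           # best-fit point taken as the centroid
    cy = sum(p[1] for p in cloud) / len(cloud)

    def within(p):
        return ((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5 <= tol_m

    if not within(ego_point):
        return min_c                                     # ego detection itself is an outlier
    corroborating = sum(1 for p in received_points if within(p))
    if corroborating >= 2:
        return max_c                                     # two or more other vehicles agree
    if corroborating == 1:
        return mid_c                                     # exactly one other vehicle agrees
    return min_c                                         # only the egovehicle detection remains
```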
[0059] In some embodiments, the received detections may be used to diagnose the sensor system. For example, a large number of objects detected by other vehicles but never by the egovehicle within a certain detection sector may indicate limitations on the egovehicle's sensors (e.g., blocked sensor, malfunctioning sensor, range-limited sensor, etc.), such as shown in FIG. 3. As such, the egovehicle continuously tracks its perception capability, which can be utilized in various safety features. Safety features may include, but are not limited to, features such as requesting the driver to take over control of the vehicle due to substandard perception capability, indicating a sensor re-calibration need, or indicating a serious malfunction in the sensor system.
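The following sketch shows one way such per-sector miss statistics could be accumulated to flag a blocked or malfunctioning sensor; the sector width, miss-ratio threshold, and minimum sample count are illustrative assumptions.

```python
from collections import defaultdict

class SectorDiagnostics:
    """Count, per angular sector around the egovehicle, objects reported by other
    vehicles that the egovehicle never detected itself; a persistently high miss
    ratio in one sector hints at a blocked, misaligned, or failed sensor there."""

    def __init__(self, sector_deg=30, fault_ratio=0.8, min_samples=20):
        self.sector_deg = sector_deg
        self.fault_ratio = fault_ratio
        self.min_samples = min_samples
        self.misses = defaultdict(int)
        self.totals = defaultdict(int)

    def record(self, angle_deg, detected_by_ego):
        sector = int(angle_deg % 360) // self.sector_deg
        self.totals[sector] += 1
        if not detected_by_ego:
            self.misses[sector] += 1

    def suspect_sectors(self):
        return [s for s, n in self.totals.items()
                if n >= self.min_samples and self.misses[s] / n >= self.fault_ratio]
```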
[0060] In some embodiments, if the current perception range is an iteratively updated estimation, then temporary inability of the egovehicle to track particular objects (e.g., vehicles 538 and 540 in FIG. 5) may not indicate limited sensors, provided that a threshold time since the egovehicle's last detection of the object has not yet elapsed (e.g., to allow time for a passing vehicle, passing a cluster of trees, etc.).
[0061] FIG. 7 depicts an exemplary embodiment of an architecture of data structures and information as used in some embodiments. As shown, the system 702 of the egovehicle (Vehicle 0) may maintain a list of tracked objects 705, maintain structured data 710, record information related to the egomotion of the egovehicle 715 (e.g., heading, velocity, coordinates, vehicle category, etc.), utilize a communication module for V2V and/or other communications 720, utilize a clock/timing module 725, maintain a database of current perception capabilities 730, and iteratively refine, through perception refinement 735, those perception capabilities based on its own sensor readings and data responses received from other vehicles through communications 740, and/or the like.
[0062] Other vehicles (such as vehicles 1, 2, 3, N) may be expected to have systems (702b, 702n) having comparable features to the egovehicle, such as lists of tracked objects (705b, 705n), structured data (710b, 710n), record information (715b, 715n), communications (720b, 720n), clocks (725b, 725n), and/or the like.
[0063] The egovehicle may then make this updated perception range (or any other current perception capabilities) available to all self-driving and/or safety applications present in the egovehicle (650). In some embodiments, such as for a self-driven egovehicle, if the updated perception range is below a predefined threshold permitted for the present speed of the self-driven egovehicle, a system safety function may determine that the driver must take over the control of the vehicle from a self-driving functionality.
[0064] In some embodiments, the egovehicle may delete from the perception database any material which is older than a predetermined value (655) (e.g., delete any object that has not been detected for 2 seconds; keep the 25 newest high-certainty detections; delete the oldest after a new one has been received, etc.).
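A minimal pruning routine consistent with the example policies in the preceding paragraph might look as follows; the entry layout (a dict with a 'time' key) is an assumption made for the sketch.

```python
def prune_perception_db(entries, now_s, max_age_s=2.0, keep_newest=25):
    """Drop entries not refreshed within max_age_s seconds and keep at most
    keep_newest of the remaining detections, newest first."""
    fresh = [e for e in entries if now_s - e["time"] <= max_age_s]
    fresh.sort(key=lambda e: e["time"], reverse=True)
    return fresh[:keep_newest]
```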
[0065] FIG. 8 illustrates an embodiment of the communication process of sharing detections between vehicles. In embodiments using broadcasted messages, all V2V equipped vehicles will receive the current perceptions of V2V equipped vehicles in the vicinity. This data enables maximum situational awareness among all V2V equipped vehicles, as potentially hidden objects are likely to be identified thanks to overlapping perception ranges from various angles among the plurality of vehicles. For example, egovehicle 805 may broadcast a perception data request message 810 (including its own detection data) to other vehicles 807. Egovehicle 805 may then enter a waiting period 815 in which it may receive detection data responses 820a, 820b, 820n from any or all other vehicles 807. The detection data responses 820a, 820b, ... , 820n each include detection data from the transmitting other vehicle. After the end of the waiting period 815, egovehicle 805 may perform various operations 830 (such as those discussed above in relation to FIG. 6), including but not limited to matching detections and updating the perception capability database.
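Schematically, the request/wait/collect exchange of FIG. 8 could be driven by a loop like the one below; the v2v object with broadcast() and poll() methods is a hypothetical interface sketched for illustration, not an API defined by the disclosure.

```python
import time

def run_detection_cycle(v2v, own_detections, wait_s=0.5):
    """One cycle of the detection-sharing protocol: broadcast a request (810),
    collect responses during a fixed waiting period (815), and return them
    for matching and perception-database updates (830)."""
    v2v.broadcast({"command": "Return your detection and egomotion data",
                   "detections": own_detections})
    responses = []
    deadline = time.monotonic() + wait_s
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        msg = v2v.poll(timeout=remaining)     # Detection Data Responses 820a..820n
        if msg is not None:
            responses.append(msg)
    return responses
```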
[0066] The present disclosure may also be adapted to additional situations, including but not limited to the following.
[0067] In one embodiment, instead of vehicles measuring Distance and Angle readings asynchronously, a new broadcast message may be defined, which triggers perceptions simultaneously in all the V2V equipped vehicles within a geographical area defined in the triggering message (e.g., within 250 meters of the egovehicle). In such embodiments, the time stamps and heading information could be omitted from messages, since all Distance, Angle and Velocity relate to the 'snapshot' time, and there is no need for the egovehicle to correct or refine the received response data for time differences between sensor readings for various vehicles.
[0068] In one embodiment, if the 'Current perception capabilities' parameters are logged with time and vehicle coordinates, the sensor performance as a function of weather conditions may be estimated for that vehicle make and model. This information could be used instead of an updated perception capabilities database, in case the egovehicle is driving for long periods without any other vehicles around, but with weather information available.
[0069] In some embodiments, the changes in the detection range may also be transmitted to other vehicles as digital data that complements weather info of the area. Since different car makes and models will have different sensors installed, the 'weather info' should be in a normative form (e.g., what is the performance degradation in various wavelengths and frequencies). Therefore, the consequences of weather on sensor performance of each vehicle may be estimated at the sensor level, which requires dealing with the functions below the perception system level.
[0070] In some embodiments, a vehicle estimates the sensor system's performance without any Detection Data Request by comparing the distance and angle in the list of tracked objects with the location of the matching vehicle in the dynamic data storage. The result will have lower confidence levels, since in this method the algorithm cannot utilize the detections of other vehicles to confirm the object.
[0071] Real World Examples. Some causes of sensor performance degradation may create constant performance degradation, which can be repaired only by redirecting the sensor(s), removing the hindering obstacle(s), and possibly calibrating the sensors after the fix. Examples include mechanical contact with a sensor area; installation of accessory equipment on a front bumper or on other sensor areas at the sides or rear; re-installing or replacing the bumper, grille, or other parts at sensor areas; a rough carwash; or the like.
[0072] Some cases may cause performance to degrade steadily. For example, the sensors may gradually become covered with snow, ice, dirt, etc. This may be especially true if degradation is caused by gradually accumulating dirt. Snow and ice can thaw away, and therefore the sensor performance may recover at least partially (partially because in road conditions snow and/or ice is generally accompanied by dirt) in warmer conditions. If the sensor is behind the windshield in an area cleaned by wipers, it is more protected from blockage by snow, ice, dirt, or the like, unless the windshield wipers fail. The same applies if the sensor is behind a similarly cleaned area, such as perhaps within the headlights (in the case of headlight washers and/or wipers).
[0073] In the case of driving in heavy rain, snowstorms, hailstorms, dense fog, dust, or the like, certain densities of rain, snow, hail, fog, or dust may cause multiple incorrect readings, or shorten the reliable sensing distance, especially with optical sensors. Typically, a higher density weather condition causes greater sensor performance degradation. The weather conditions may often vary, which further means that the sensor performance changes often. Therefore, the vehicle should be able to carry out sensor performance checks frequently, in order to understand how much it can rely on its sensors at any given time.
[0074] FIG. 9 illustrates an exemplary embodiment of a perception capabilities database (forward looking perception system). As can be seen in FIG. 9, an egovehicle database may record the real-time or actual perception capabilities of the egovehicle 905. For example, despite having a maximum detection range given by the manufacturer 940, the egovehicle sensors as monitored by the methods disclosed may determine their actual detection range, including detection ranges based on the class of the object detected. As shown in FIG. 9, the methods disclosed herein may result in the egovehicle 905 determining functional detection ranges for different classes of vehicles, such as motorcycles (functional range 910), cars (functional range 920), and trucks (functional range 930). In some embodiments, such ranges may be determined by the maximum ranges at which different classes of vehicles have been detected.
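One simple realization of such a per-class functional range table is sketched below; the update rule (retain the farthest confirmed detection distance per class, bounded by the manufacturer maximum) is an assumption consistent with FIG. 9, and the class names follow the vehicle categories used earlier.

```python
class PerceptionCapabilityDB:
    """Track the farthest confirmed detection distance per object class,
    bounded by the manufacturer-specified maximum detection range (940)."""

    def __init__(self, max_range_m=250.0):
        self.max_range_m = max_range_m
        self.functional_range = {}   # e.g. {"motorcycle": 120.0, "passengerCar": 180.0, "heavyTruck": 230.0}

    def update(self, veh_class, distance_m, certainty, min_certainty=1.0):
        """Extend the functional range for veh_class only on confirmed detections."""
        if certainty < min_certainty:
            return
        d = min(distance_m, self.max_range_m)
        if d > self.functional_range.get(veh_class, 0.0):
            self.functional_range[veh_class] = d
```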
[0075] Additional Embodiments. In one embodiment, there is a method of adjusting vehicle operation responsive to a detected deficit in perception sensing, comprising: receiving, at a first vehicle, from one or more vehicles of a plurality of vehicles within a predefined range of the first vehicle, information regarding the location of the one or more vehicles; comparing the information regarding the location of each of the one or more vehicles with information derived from sensors of the first vehicle; responsive to a determination that at least one vehicle of the plurality of vehicles for which the received information regarding location of the vehicle corresponds to a location within a predetermined sensing region around the first vehicle is not indicated to be at that location by the information derived from sensors of the first vehicle: adjusting the function of at least one driving function of the first vehicle. The method may include wherein adjusting the function comprises at least one of: disengaging the function, slowing the vehicle, or alerting a human occupant to take over control of the vehicle. The method may include wherein the information received at the first vehicle is received by a V2V communication system. The method may include wherein the V2V communication system comprises a WTRU. The method may include wherein the first vehicle and at least one of the one or more vehicles have V2V functionality.
[0076] In an embodiment, there is a vehicle comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: receiving, at a first vehicle, from one or more vehicles of a plurality of vehicles within a predefined range of the first vehicle, information regarding the location of the one or more vehicles; comparing the information regarding the location of each of the one or more vehicles with information derived from sensors of the first vehicle; responsive to a determination that at least one vehicle of the plurality of vehicles for which the received information regarding location of the vehicle corresponds to a location within a predetermined sensing region around the first vehicle is not indicated to be at that location by the information derived from sensors of the first vehicle: adjusting the function of at least one driving function of the first vehicle.
[0077] In an embodiment, there is a method comprising: receiving, at an autonomous vehicle, a message indicating a first location estimate of a nearby vehicle; operating at least one sensor of the autonomous vehicle to generate a second location estimate of the nearby vehicle; comparing the first location estimate with the second location estimate to determine whether a difference between the first and second location estimates exceeds a threshold; in response to a determination that a difference between the first and second location estimates exceeds the threshold, initiating a transition from an autonomous driving mode to a manual driving mode. The method may also include wherein the threshold is an absolute distance threshold. The method may also include wherein the threshold is a percentage distance threshold. The method may also include wherein the message indicating the first location estimate of the nearby vehicle is received from the nearby vehicle. The method may also include wherein the message indicating the first location estimate of the nearby vehicle is received from a vehicle other than the nearby vehicle.
[0078] In an embodiment, there is an autonomous vehicle comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: receiving, at an autonomous vehicle, a message indicating a first location estimate of a nearby vehicle; operating at least one sensor of the autonomous vehicle to generate a second location estimate of the nearby vehicle; comparing the first location estimate with the second location estimate to determine whether a difference between the first and second location estimates exceeds a threshold; in response to a determination that a difference between the first and second location estimates exceeds the threshold, initiating a transition from a self-driving mode to a manual driving mode.
[0079] In an embodiment, there is a method comprising: sending a detection data request from a first vehicle; receiving at least a first response to the detection data request from at least a second vehicle; determining the location of a plurality of vehicles within a detection range of the first vehicle based on sensor data from at least a first sensor measured by the first vehicle and the at least first response to the detection data request; determining a certainty value for each of the plurality of vehicles based on a comparison of the sensor data gathered by the first vehicle and the at least first response to the detection data request; and responsive to a determination that a range for one or more vehicles detected through the at least first sensor and a range determined from the at least first response is different: determining that a functional range of the at least first sensor is restricted. The method may further comprise providing an alert that an autonomous driving mode of the first vehicle cannot be used. The method may further comprise disabling an autonomous mode of the first vehicle. The method may further comprise, after sending the detection data request, waiting for a defined waiting period for responses from the at least second vehicle. The method may include wherein the detection data request is sent by a V2V communication system. The method may include wherein the V2V communication system comprises a WTRU. The method may include wherein the first vehicle and the at least second vehicle have V2V functionality. The method may include wherein determining the certainty value for each of the plurality of nearby vehicles further comprises: determining a number of vehicles which have detected one of the plurality of nearby vehicles; and assigning a certainty value to detection data related to said one of the plurality of nearby vehicles, wherein the certainty value is based on the number of vehicles which detected said one of the plurality of nearby vehicles. The method may include wherein assigning the certainty value further comprises assigning a maximum certainty value when the number of vehicles which have detected one of the plurality of vehicles is greater than or equal to 2. The method may include wherein assigning the certainty value further comprises assigning a middle value when the number of vehicles which have detected one of the plurality of vehicles is equal to 1. The method may include wherein assigning the certainty value further comprises assigning a minimum value when the number of vehicles which have detected one of the plurality of vehicles is equal to 0. The method may include wherein determining whether vehicles have detected the same vehicle includes an error tolerance factor. The method may further comprise comparing the certainty value with the sensor data of the first vehicle to generate an estimate of the first vehicle's current perception range. The method may include wherein if the current perception range is below a predefined threshold for a current speed of the first vehicle, requiring a driver of the first vehicle to resume control from a self-driving functionality of the first vehicle.
[0080] In an embodiment, there is a method of sensor performance monitoring for a self-driving vehicle, comprising: broadcasting a detection data request; waiting for at least a first response from at least one nearby vehicle; calculating locations of all received objects at the time of a last self-driving vehicle perception time; matching objects with the calculated locations; generating a certainty value for each object; storing the objects in a perception capability database of the self-driving vehicle; generating an estimate of the self-driving vehicle's current perception range; and making the estimated current perception range available to all self-driving and safety applications onboard the self-driving vehicle. The method may further comprise deleting over-age readings from the perception capability database. The method may further comprise, prior to broadcasting, determining whether a most recent detected data response is older than a defined update time window. The method may further comprise, prior to broadcasting, determining whether a vehicle is entering or leaving the perception range of the self-driving vehicle.
[0081] In an embodiment, there is a method for estimating and broadcasting range of perception in a first vehicle, comprising: sending a detection data request; determining the location of all vehicles within a communication range based on the response messages; determining a certainty value for each vehicle based on a comparison of the first vehicle's measurements and received measurements; and responsive to determination that an object's range detected through perception sensors and said object's range determined using receiving V2V messages is different: determining that the range of the perception sensor is restricted; and alerting the user that the autonomous mode cannot be used.
[0082] In an embodiment, there is a system comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: sending a detection data request from a first vehicle; receiving at least a first response to the detection data request from at least a second vehicle; determining the location of a plurality of vehicles within a detection range of the first vehicle based on sensor data from at least a first sensor measured by the first vehicle and the at least first response to the detection data request; determining a certainty value for each of the plurality of vehicles based on a comparison of the sensor data gathered by the first vehicle and the at least first response to the detection data request; and responsive to a determination that a range for one or more vehicles detected through the at least first sensor and a range determined from the at least first response is different: determining that a functional range of the at least first sensor is restricted.
[0083] In an embodiment, there is a self-driving vehicle comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: broadcasting a detection data request; waiting for at least a first response from at least one nearby vehicle; calculating locations of all received objects at the time of a last self-driving vehicle perception time; matching objects with the calculated locations; generating a certainty value for each object; storing the objects in a perception capability database of the self-driving vehicle; generating an estimate of the self-driving vehicle's current perception range; and making the estimated current perception range available to all self-driving and safety applications onboard the self-driving vehicle.
[0084] In an embodiment, there is a vehicle comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising: sending a detection data request; determining the location of all vehicles within a communication range based on the response messages; determining a certainty value for each vehicle based on a comparison of the first vehicle's measurements and received measurements; and responsive to a determination that an object's range detected through perception sensors and said object's range determined using received V2V messages is different: determining that the range of the perception sensor is restricted; and alerting the user that the autonomous mode cannot be used.
[0085] Exemplary embodiments disclosed herein are implemented using one or more wired and/or wireless network nodes, such as a wireless transmit/receive unit (WTRU) or other network entity.
[0086] FIG. 10 is a system diagram of an exemplary WTRU 102, which may be employed in embodiments described herein. As shown in FIG. 10, the WTRU 102 may include a processor 118, a communication interface 119 including a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, a non-removable memory 130, a removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and sensors 138. It will be appreciated that the WTRU 102 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
[0087] The processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment. The processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 10 depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
[0088] The transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station over the air interface 116. For example, in one embodiment, the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples. In yet another embodiment, the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
[0089] In addition, although the transmit/receive element 122 is depicted in FIG. 10 as a single element, the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 116.
[0090] The transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122. As noted above, the WTRU 102 may have multi-mode capabilities. Thus, the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
[0091] The processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128. In addition, the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132. The non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
[0092] The processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102. The power source 134 may be any suitable device for powering the WTRU 102. As examples, the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, and the like.
[0093] The processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102. In addition to, or in lieu of, the information from the GPS chipset 136, the WTRU 102 may receive location information over the air interface 116 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
[0094] The processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 138 may include sensors such as an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
[0095] FIG. 11 depicts an exemplary network entity 190 that may be used in embodiments of the present disclosure. As depicted in FIG. 11, network entity 190 includes a communication interface 192, a processor 194, and non-transitory data storage 196, all of which are communicatively linked by a bus, network, or other communication path 198.
[0096] Communication interface 192 may include one or more wired communication interfaces and/or one or more wireless-communication interfaces. With respect to wired communication, communication interface 192 may include one or more interfaces such as Ethernet interfaces, as an example. With respect to wireless communication, communication interface 192 may include components such as one or more antennae, one or more transceivers/chipsets designed and configured for one or more types of wireless (e.g., LTE) communication, and/or any other components deemed suitable by those of skill in the relevant art. And further with respect to wireless communication, communication interface 192 may be equipped at a scale and with a configuration appropriate for acting on the network side— as opposed to the client side— of wireless communications (e.g., LTE communications, Wi-Fi communications, and the like). Thus, communication interface 192 may include the appropriate equipment and circuitry (perhaps including multiple transceivers) for serving multiple mobile stations, UEs, or other access terminals in a coverage area.
[0097] Processor 194 may include one or more processors of any type deemed suitable by those of skill in the relevant art, some examples including a general-purpose microprocessor and a dedicated DSP.
[0098] Data storage 196 may take the form of any non-transitory computer-readable medium or combination of such media, some examples including flash memory, read-only memory (ROM), and random-access memory (RAM), to name but a few, as any one or more types of non-transitory data storage deemed suitable by those of skill in the relevant art could be used. As depicted in FIG. 11, data storage 196 contains program instructions 197 executable by processor 194 for carrying out various combinations of the various network-entity functions described herein.
[0099] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable storage media include, but are not limited to, a read-only memory (ROM), a random-access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Claims

CLAIMS
We claim:
1. A method comprising:
identifying a nearby set of vehicles to a first vehicle based on received wireless messages from a plurality of vehicles;
identifying an observed set of vehicles based on sensors of the first vehicle; and
responsive to a determination that at least one vehicle in the nearby set is not in the observed set, performing at least one perception-deficit response action.
2. The method of claim 1, wherein the at least one perception-deficit response action comprises presenting an alert to an occupant of the first vehicle.
3. The method of claim 1, wherein the at least one perception-deficit response action comprises disengaging an autonomous function of the first vehicle.
4. The method of claim 1, wherein the at least one perception-deficit response action comprises adjusting a driving parameter of the first vehicle due to reduced perception capability.
5. The method of claim 1, wherein the at least one perception-deficit response action comprises alerting a human occupant to take over control of the first vehicle.
6. The method of claim 1, wherein the wireless messages received by the first vehicle comprise timestamped velocity, heading, and coordinates of a given nearby vehicle.
7. The method of claim 1, wherein the nearby set comprises vehicles from which wireless messages were received by the first vehicle and vehicles observed by those vehicles.
8. The method of claim 1, wherein the wireless messages received by the first vehicle comprise location information of a given nearby vehicle and location information for each vehicle observed by said nearby vehicle.
9. The method of claim 8, further comprising determining a detection certainty value for each vehicle of the nearby set based on a number of vehicles which have observed said vehicle.
10. The method of claim 9, wherein determining the certainty value further comprises assigning a maximum certainty value when the number of vehicles which have observed a nearby vehicle is greater than or equal to 2.
11. The method of claim 9, wherein determining the certainty value further comprises assigning a middle certainty value when the number of vehicles which have observed a nearby vehicle is equal to 1.
12. The method of claim 9, wherein determining the certainty value further comprises assigning a minimum certainty value when the number of vehicles which have observed a nearby vehicle is equal to 0.
13. The method of claim 9, further comprising comparing the determined certainty values with sensor data of the first vehicle to generate an estimate of the first vehicle's current perception range.
14. The method of claim 1, wherein, for determining that at least one vehicle in the nearby set is not in the observed set, only nearby vehicles within a previously calculated perception range of the first vehicle are compared to the observed set.
15. The method of claim 1, further comprising broadcasting a request for detection data from the first vehicle via a V2V communication system of the first vehicle.
16. The method of claim 15, wherein the request is broadcast responsive to a determination by the first vehicle that a vehicle has entered a perception range of the first vehicle.
17. The method of claim 15, wherein the request is broadcast responsive to a determination by the first vehicle that a vehicle has left a perception range of the first vehicle.
18. A vehicle comprising a processor and a non-transitory storage medium, the storage medium storing instructions operative to perform functions comprising:
identifying a nearby set of vehicles to the vehicle based on received wireless messages from a plurality of vehicles;
identifying an observed set of vehicles based on sensors of the vehicle; and
responsive to a determination that at least one vehicle in the nearby set is not in the observed set, performing at least one perception-deficit response action.
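
The following is a minimal, illustrative Python sketch of the nearby-set versus observed-set comparison recited in claims 1 and 18. It is not part of the claimed subject matter; all names (V2xMessage, nearby_set, observed_set, perception_deficit_response, check_perception) and the simplifying assumption that V2X messages and sensor tracks share a common vehicle identifier are hypothetical additions for illustration only.

import math
from dataclasses import dataclass

@dataclass
class V2xMessage:
    # Hypothetical container for a received wireless message (cf. claim 6):
    # timestamped velocity, heading, and coordinates of a nearby vehicle.
    vehicle_id: str
    x: float
    y: float
    speed: float
    heading: float
    timestamp: float

def nearby_set(messages, own_xy, perception_range_m):
    # Vehicles announced over V2X that lie within the previously calculated
    # perception range of the first vehicle (cf. claim 14).
    ox, oy = own_xy
    return {m.vehicle_id for m in messages
            if math.hypot(m.x - ox, m.y - oy) <= perception_range_m}

def observed_set(sensor_track_ids):
    # Identifiers of vehicles currently tracked by the first vehicle's own sensors.
    return set(sensor_track_ids)

def perception_deficit_response(missing_ids):
    # Placeholder for the response actions of claims 2-5: present an alert,
    # disengage an autonomous function, adjust a driving parameter, or
    # request that a human occupant take over control.
    print("Perception deficit: nearby but not observed:", sorted(missing_ids))

def check_perception(messages, sensor_track_ids, own_xy, perception_range_m):
    missing = nearby_set(messages, own_xy, perception_range_m) - observed_set(sensor_track_ids)
    if missing:
        perception_deficit_response(missing)
    return missing

For example, check_perception([V2xMessage("B", 30.0, 0.0, 14.0, 90.0, 12.3)], [], (0.0, 0.0), 80.0) would report vehicle "B" as nearby but unobserved.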
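
In the same spirit, the certainty-value assignment of claims 9-12 and the perception-range estimate of claim 13 can be sketched as follows; the particular numeric values (0.0, 0.5, 1.0), the certainty_threshold, and the farthest-detected-vehicle heuristic are assumptions made for illustration and are not taken from the specification.

import math

def detection_certainty(observer_count, c_min=0.0, c_mid=0.5, c_max=1.0):
    # Claims 10-12: 0 observers -> minimum value, 1 observer -> middle value,
    # 2 or more observers -> maximum value.
    if observer_count >= 2:
        return c_max
    if observer_count == 1:
        return c_mid
    return c_min

def estimate_perception_range(nearby_reports, own_detection_ids, own_xy, certainty_threshold=0.5):
    # Claim 13 (sketch): compare the certainty values with the first vehicle's own
    # sensor data and take the farthest reliably reported vehicle that the host
    # sensors still detect as a rough estimate of the current perception range.
    ox, oy = own_xy
    ranges = [math.hypot(r["x"] - ox, r["y"] - oy)
              for r in nearby_reports
              if detection_certainty(r["observer_count"]) >= certainty_threshold
              and r["vehicle_id"] in own_detection_ids]
    return max(ranges) if ranges else 0.0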
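
Finally, the detection-data request of claims 15-17 can be sketched as a simple trigger on perception-range entry and exit events; the broadcast callable stands in for the first vehicle's V2V communication system, and the message fields are illustrative assumptions only.

def maybe_request_detection_data(previous_nearby_ids, current_nearby_ids, broadcast):
    # Claims 16-17: broadcast a request for detection data whenever a vehicle
    # has entered or left the perception range of the first vehicle.
    entered = set(current_nearby_ids) - set(previous_nearby_ids)
    left = set(previous_nearby_ids) - set(current_nearby_ids)
    if entered or left:
        broadcast({"type": "detection_data_request",
                   "entered": sorted(entered),
                   "left": sorted(left)})
    return entered, left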
PCT/US2017/026183 2016-04-12 2017-04-05 Method and system for online performance monitoring of the perception system of road vehicles WO2017180394A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662321457P 2016-04-12 2016-04-12
US62/321,457 2016-04-12

Publications (1)

Publication Number Publication Date
WO2017180394A1 true WO2017180394A1 (en) 2017-10-19

Family

ID=58692554

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/026183 WO2017180394A1 (en) 2016-04-12 2017-04-05 Method and system for online performance monitoring of the perception system of road vehicles

Country Status (1)

Country Link
WO (1) WO2017180394A1 (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7162339B2 (en) 2004-08-31 2007-01-09 General Motors Corporation automated vehicle calibration and testing system via telematics
US20100198513A1 (en) * 2009-02-03 2010-08-05 Gm Global Technology Operations, Inc. Combined Vehicle-to-Vehicle Communication and Object Detection Sensing
US20100235129A1 (en) 2009-03-10 2010-09-16 Honeywell International Inc. Calibration of multi-sensor system
US20120101704A1 (en) * 2010-10-21 2012-04-26 GM Global Technology Operations LLC Method for operating at least one sensor of a vehicle and vehicle having at least one sensor
DE102011017593A1 (en) 2011-04-27 2012-10-31 Robert Bosch Gmbh Device for detecting errors of environment sensor of vehicle, has evaluation unit for comparing two informations such that fault is detected, when former information deviates over predetermined amount of latter information
DE102011112243A1 (en) 2011-09-01 2012-05-24 Daimler Ag Method for calibration of sensor device used for detecting surroundings of vehicle, involves comparing generated reference data with sensor data of sensor device, to calibrate sensor device
US20150066412A1 (en) 2011-10-11 2015-03-05 Stefan Nordbruch Method and device for calibrating a surroundings sensor
DE102012024959A1 (en) * 2012-12-20 2014-06-26 GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) Method for operating vehicle e.g. passenger car, involves calculating position of object, and determining instantaneous detection area of sensor based on determined position of object when object is not detected by sensor
US9079587B1 (en) * 2014-02-14 2015-07-14 Ford Global Technologies, Llc Autonomous control in a dense vehicle environment

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110228431A (en) * 2018-03-06 2019-09-13 罗伯特·博世有限公司 Method and apparatus for calibrating the sensor of vehicle
WO2019232144A1 (en) * 2018-06-01 2019-12-05 Qualcomm Incorporated Techniques for sharing of sensor information
US11244175B2 (en) 2018-06-01 2022-02-08 Qualcomm Incorporated Techniques for sharing of sensor information
CN111918244A (en) * 2019-05-10 2020-11-10 大众汽车有限公司 Concept for addressing road users in wireless communication
CN111918244B (en) * 2019-05-10 2024-02-02 大众汽车有限公司 Concept for addressing road users in wireless communication
JP6671568B1 (en) * 2019-07-30 2020-03-25 三菱電機株式会社 Sensor diagnostic device and sensor diagnostic program
WO2021019665A1 (en) * 2019-07-30 2021-02-04 三菱電機株式会社 Sensor diagnosis device and sensor diagnosis program
US11623707B2 (en) * 2019-12-06 2023-04-11 GEKOT Inc. Collision alert systems and methods for micromobility vehicles
EP3832349A1 (en) * 2019-12-06 2021-06-09 Veoneer Sweden AB Associating radar detections with received data transmissions
CN113022759A (en) * 2019-12-06 2021-06-25 格科特有限公司 Collision warning system and method for micro-mobile vehicles
US11878761B2 (en) 2019-12-06 2024-01-23 Gekot, Inc. Collision alert systems and methods for micromobility vehicles
CN113359094A (en) * 2020-03-06 2021-09-07 本田技研工业株式会社 Vehicle information processing device
US20210354708A1 (en) * 2020-05-15 2021-11-18 Zenuity Ab Online perception performance evaluation for autonomous and semi-autonomous vehicles
WO2022161772A1 (en) * 2021-02-01 2022-08-04 Bayerische Motoren Werke Aktiengesellschaft Method and device for determining a detection range of a sensor of a motor vehicle
EP3916421A3 (en) * 2021-02-19 2022-03-30 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Testing method and apparatus for vehicle perception system, device, and storage medium
JP2022033927A (en) * 2021-02-19 2022-03-02 アポロ インテリジェント コネクティビティ (ベイジン) テクノロジー カンパニー リミテッド Testing method and apparatus for vehicle perception system, device, and electronic apparatus
CN115273473A (en) * 2022-07-29 2022-11-01 阿波罗智联(北京)科技有限公司 Method and device for processing perception information of road side equipment and automatic driving vehicle
CN116824869A (en) * 2023-08-31 2023-09-29 国汽(北京)智能网联汽车研究院有限公司 Vehicle-road cloud integrated traffic fusion perception testing method, device, system and medium
CN116824869B (en) * 2023-08-31 2023-11-24 国汽(北京)智能网联汽车研究院有限公司 Vehicle-road cloud integrated traffic fusion perception testing method, device, system and medium

Similar Documents

Publication Publication Date Title
WO2017180394A1 (en) Method and system for online performance monitoring of the perception system of road vehicles
US11194057B2 (en) ASIL-classification by cooperative positioning
US11100731B2 (en) Vehicle sensor health monitoring
US11796654B2 (en) Distributed sensor calibration and sensor sharing using cellular vehicle-to-everything (CV2X) communication
US20190143967A1 (en) Method and system for collaborative sensing for updating dynamic map layers
US11055933B2 (en) Method for operating a communication network comprising a plurality of motor vehicles, and motor vehicle
US11341844B2 (en) Method and system for determining driving assisting data
CN111559383A (en) Method and system for determining Autonomous Vehicle (AV) motion based on vehicle and edge sensor data
JP6973351B2 (en) Sensor calibration method and sensor calibration device
JP6311211B2 (en) Obstacle detection device
JP6903598B2 (en) Information processing equipment, information processing methods, information processing programs, and mobiles
JP2023165850A (en) Electronic equipment, control method of electronic equipment, and control program of electronic equipment
US20210323577A1 (en) Methods and systems for managing an automated driving system of a vehicle
US11643082B2 (en) Systems and methods for determining real-time lane level snow accumulation
KR20150055278A (en) System and method for collecting traffic information using vehicle radar
JP7187784B2 (en) Vehicle information processing system, management device, vehicle information processing method, and vehicle information processing program
EP3782142A1 (en) Methods and control arrangements for diagnosing short-range wireless transmission functionality of vehicles
JP7123117B2 (en) Vehicle Position Reliability Calculation Device, Vehicle Position Reliability Calculation Method, Vehicle Control Device, and Vehicle Control Method
Elsagheer Mohamed et al. Autonomous real-time speed-limit violation detection and reporting systems based on the internet of vehicles (IoV)
US20230166761A1 (en) Method and system for estimation of an operational design domain boundary
US11796345B2 (en) Method and system for optimized notification of detected event on vehicles
US20230213664A1 (en) Systems and methods for radio frequency (rf) ranging-aided localization and map generation
WO2016072082A1 (en) Driving assistance system and center
JP2024049539A (en) Information Providing Device
JP2009054011A (en) Information processing apparatus and program

Legal Events

Date Code Title Description
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17722531

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17722531

Country of ref document: EP

Kind code of ref document: A1