US20080304705A1 - System and method for side vision detection of obstacles for vehicles - Google Patents
- Publication number
- US20080304705A1 (application US11/954,513)
- Authority
- US
- United States
- Legal status
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/12—Mirror assemblies combined with other articles, e.g. clocks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/26—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/31—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/12—Mirror assemblies combined with other articles, e.g. clocks
- B60R2001/1253—Mirror assemblies combined with other articles, e.g. clocks with cameras, video cameras or video screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R2011/0001—Arrangements for holding or mounting articles, not otherwise provided for characterised by position
- B60R2011/004—Arrangements for holding or mounting articles, not otherwise provided for characterised by position outside the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/107—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using stereoscopic cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/804—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for lane monitoring
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8066—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring rearward traffic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
Definitions
- This invention relates to machine vision systems, and more particularly to vehicle-borne methods and apparatuses for detecting obstacles and potential colliding objects.
- Trucks and tractor trailers are responsible for transporting a substantial portion of the goods used in commerce in the US and other countries. They are present on virtually every road or highway. Unlike their smaller counterparts, automobiles and motorcycles, the size and weight of a truck make it more difficult to maneuver and stop, and its driver's ability to view obstacles and vehicles (particularly those located behind and to the side) may be much more limited. Moreover, when a truck collides with a smaller vehicle or a pedestrian, the occupants of the smaller vehicle or the pedestrian may not survive. Thus, safety systems that warn of the potential for such collisions are highly desirable.
- Some trucks are equipped with localized radar units, or other types of sensors, in an attempt to provide a warning prior to a lane change or turn in an effort to avoid a catastrophic collision. These systems tend to be short range and, in part due to their expense, are mounted only on the cab. Thus, objects and vehicles moving along the side of the truck's rear-mounted trailer may not be sensed in time, if at all. Also, stationary objects are not effectively sensed by radar.
- Camera-based vision systems offer a potential solution.
- Most camera systems, however, view a wide field, and a single camera may have trouble discriminating the range and movement of an object, in much the same way that closing one eye reduces a person's depth perception.
- Front-facing vehicle guidance systems have employed so-called “stereo vision,” which emulates an animal's natural horizontal placement of eyes so as to provide range discrimination.
- This approach is not as effective for dealing with objects that tend to have more vertical than horizontal features, such as pedestrians, sign posts and motorcycles. Such objects tend to get lost in the horizontal spread of a conventional stereo vision arrangement.
- What is needed is a safety system employing a camera-based vision system that allows the entire unit to be mounted on the cab, so that trailers and cargo structures are removable from the truck without the need to change over any vision system components.
- The system should allow for the use of conventional camera components with high reliability and durability, should be relatively unobtrusive, and should provide good discrimination of vertically featured objects along the rearward sides of the truck, behind the cab.
- This invention overcomes disadvantages of the prior art by providing a system and method for object detection and collision avoidance for objects and vehicles located behind the cab or front section of an elongated, and possibly tandem, vehicle.
- The system and method can employ relatively inexpensive cameras, in a stereo relationship, on a low-profile mounting, to perform reliable detection with good range discrimination.
- The field of detection extends sufficiently behind and to the side of the rear area to assure an adequate safety zone in most instances.
- This system and method allows all equipment to be maintained on the cab of a tandem vehicle, rather than on the interchangeable and more damage-prone cargo section and/or trailer.
- One or more cameras can be mounted on, or within, the mirror on each side, on aerodynamic fairings or other exposed locations of the vehicle. Image signals received from each camera can be conditioned before they are matched and compared for disparities viewed above the ground surface, and according to predetermined disparity criteria.
- The pair of stereo cameras can be mounted on at least one side of a vehicle, looking at a downward angle and back towards the rear of the vehicle.
- Another pair of stereo cameras can be mounted on the other side of the vehicle to provide coverage of both sides of the vehicle simultaneously.
- The invention is calibrated to provide heights above the ground plane for any point in a field of view, such as the road next to and extending behind the vehicle. Therefore, when any object enters the field of view, it generates interest points called “features,” the heights of which are measured relative to the ground plane. These points are then clustered in image and 3D space to form “objects.”
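The feature-clustering step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the height threshold, and the greedy nearest-cluster grouping are all assumptions.

```python
import numpy as np

def cluster_features(points, min_height=0.3, max_gap=0.75):
    """Group 3D feature points into candidate 'objects'.

    points: (N, 3) array of (x, y, z) positions in meters, where z is the
    calibrated height above the ground plane. Points near the ground are
    discarded; the rest are greedily merged by ground-plane proximity.
    Thresholds are illustrative assumptions.
    """
    pts = points[points[:, 2] > min_height]   # drop near-ground points
    clusters = []
    for p in pts:
        for c in clusters:
            # assign to the first cluster whose centroid is close enough
            if np.linalg.norm(np.mean(c, axis=0)[:2] - p[:2]) < max_gap:
                c.append(p)
                break
        else:
            clusters.append([p])
    return [np.array(c) for c in clusters]

# Example: feature points from two nearby objects plus road clutter
feats = np.array([[5.0, 1.8, 1.2], [5.2, 1.9, 1.4],    # object A
                  [20.0, 2.1, 0.9], [20.3, 2.0, 1.1],  # object B
                  [12.0, 1.5, 0.05]])                  # ground noise
objs = cluster_features(feats)
print(len(objs))  # 2
```

A real system would cluster in both image and 3D space, as the text notes; this sketch uses only ground-plane distance for brevity.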
- The output can take the form of a “virtual mirror,” which directly makes drivers more aware of obstacles in the lanes alongside of, and extending behind, their vehicle, or of a warning system, which alerts the driver to a lane change that would cause an accident.
- In an illustrative embodiment, a system and method for detecting objects and vehicles with respect to a host vehicle includes providing a pair of cameras mounted on a host vehicle and arranged with respect to each other in a narrow-baseline orientation.
- A processor is provided, which detects an object or vehicle and derives an approximate range based upon an input image from each of the pair of cameras.
- An output device is provided, which reports detection of the object or vehicle within a predetermined range of the host vehicle.
- Each of the pair of cameras is mounted so as to orient the narrow baseline in a substantially vertical direction, normal to a ground surface beneath the host vehicle, and each of the cameras can include a field of view having an increased resolution of pixels oriented along the vertical direction.
- Each of the pair of cameras can be directed rearwardly from a front section of the host vehicle so as to view an area along a side of the host vehicle and beyond a back end of the host vehicle.
- At least one of the pair of cameras can be mounted on a rear view mirror on a side of the host vehicle.
- Pairs of cameras can be mounted on both sides of the vehicle in accordance with the teachings of this invention to provide full coverage.
- Each of the pair of cameras can, furthermore, be mounted in a stereo head housing on the rear view mirror, and/or at least one of the pair of cameras can optionally be mounted within a housing of the rear view mirror as an integrated mirror unit.
- In various embodiments, another one of the pair of cameras can be mounted at a location on the host vehicle remote from the rear view mirror, thereby providing further vertical and/or horizontal separation between the cameras.
- At least one of the pair of cameras can be mounted on an aerodynamic fairing of the host vehicle.
- Such an aerodynamic fairing can include a cowling that houses the at least one of the pair of cameras so as to allow the at least one of the pair of cameras to view the area along a side of the host vehicle and beyond a back end of the host vehicle.
- The processor can include an image rectification process, an image smoothing process, and a max-min filter process constructed and arranged to condition image data from each of the pair of cameras, together with a matching process that matches the conditioned image data to derive a disparity image, from which objects and vehicles above a ground surface beneath the host vehicle are detected in accordance with parameters of a disparity criteria image.
- The processor can further comprise a spurious edge filtering process and an edge aggregation process that act upon detected images of objects and vehicles so as to generate a report of detected objects that is provided to the output device.
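The conditioning and matching stages described above can be sketched as follows. This is an illustrative approximation, not the patent's actual algorithm: rectification is assumed to have been done already, SAD block matching stands in for the unspecified matching process, and the window sizes are arbitrary assumptions.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter, uniform_filter

def condition(img, smooth=3, win=5):
    """Image smoothing followed by a max-min filter, which normalizes
    local contrast before matching (illustrative window sizes)."""
    img = uniform_filter(img.astype(float), smooth)
    local_range = maximum_filter(img, win) - minimum_filter(img, win)
    return img / np.maximum(local_range, 1e-6)

def vertical_disparity(top, bot, max_d=16, block=7):
    """SAD block matching along image columns, i.e. along the vertical
    epipolar direction of a vertically stacked camera pair."""
    h, w = top.shape
    best = np.zeros((h, w), dtype=int)
    cost = np.full((h, w), np.inf)
    for d in range(max_d):
        # compare the top image against the bottom image shifted d rows
        sad = uniform_filter(np.abs(top[: h - d] - bot[d:]), block)
        better = sad < cost[: h - d]
        best[: h - d][better] = d
        cost[: h - d][better] = sad[better]
    return best

# Synthetic check: the bottom view is the top view shifted 4 rows down,
# so the recovered disparity should be 4. (The conditioned images would
# feed the matcher in practice; the raw pair keeps the check exact.)
rng = np.random.default_rng(0)
top = rng.random((64, 32))
bot = np.vstack([np.zeros((4, 32)), top[:-4]])
cond_top, cond_bot = condition(top), condition(bot)
disp = vertical_disparity(top, bot)
print(int(np.median(disp[8:48])))  # 4
```

The true disparity-criteria test would then keep only pixels whose disparity exceeds what the calibrated ground plane predicts at that image row, leaving obstacles that stand above the road.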
- The host vehicle can be an elongated and/or tandem vehicle, such as a tractor trailer, where the front section carrying the cameras comprises its cab.
- FIG. 1 is a side perspective view of an illustrative embodiment of a tractor-trailer-cab-mounted obstacle and vehicle detection system and a typical type of detected vehicle;
- FIG. 2 is a partial perspective view of the exemplary tractor trailer of FIG. 1 showing the obstacle detection layout according to an illustrative embodiment of the invention in which two stereo cameras are mounted together on the mirrors of the cab;
- FIG. 3 is a schematic plan view of a narrow-baseline stereo camera head for use in an illustrative embodiment of the invention;
- FIG. 4 is a diagram of a screen display depicting the relative views of a vehicle derived from the camera head arrangement of FIG. 3 ;
- FIG. 5 is a partial perspective view of an exemplary tractor trailer showing the obstacle detection layout according to another illustrative embodiment of the invention in which individual cameras are mounted on each mirror and corresponding individual cameras are mounted on fender mirror posts so as to provide a stereo viewing layout;
- FIG. 6 is a top view of one field-of-view that is imaged with respect to an exemplary tractor trailer according to an illustrative embodiment of the invention;
- FIG. 7 is a schematic layout of a different field of view that is imaged with respect to an exemplary automobile, areas of which field may be covered by prior art systems and methods;
- FIGS. 8 and 9 are each schematic block diagrams of two alternate configurations of system components according to illustrative embodiments of the present invention.
- FIG. 10 is a flow diagram of the obstacle-detection process employed in conjunction with the illustrative embodiments of the present invention.
- FIG. 11 is a schematic block diagram of an obstacle detection apparatus according to an illustrative embodiment of the present invention.
- FIG. 12 is a frontal view of a vehicle mirror face, used for example, on a tractor trailer showing an integrated stereo camera head according to an illustrative embodiment of the invention.
- FIG. 13 is a partial perspective view of an exemplary tractor trailer including an aerodynamic-fairing-mounted stereo camera assembly according to an alternate embodiment of the present invention.
- FIG. 1 shows an obstacle detection and collision-avoidance system mounted on an exemplary tractor trailer 100 .
- The tractor trailer (also part of a class of large cargo vehicles termed generally herein as a “truck”) 100 includes a box trailer 102 and a cab 110.
- The cab 110 houses the driver and provides the motive power for the rig.
- The driver sits relatively high in the cab 110 above the road surface 120, which consists of a plurality of highway lanes 122 separated by dividing lines 124.
- The cab windshield 130 affords the driver good visibility of the road ahead at long, medium and somewhat close distances.
- The side windows 132 afford the driver a good view of obstacles and vehicles that are substantially beside the cab.
- A rear-facing stereo vision system is mounted on the cab with corresponding rear-directed cameras 210, 212 beneath each respective mirror assembly.
- The cameras provide two discrete image views (dashed lines 160 and 162 in FIG. 1) of the depicted vehicle 150.
- The illustrative embodiment includes two pairs of stereo cameras 210, 212, mounted on either side of the cab 110.
- The mirrors 140 and 142 in this example are of a conventional type, with lower support brackets 220, 222, respectively, that depend from each door 144, 146.
- Their mirror glass is oriented to face rearwardly toward the back of the tractor-trailer truck 100, thereby affording the driver a view of the rear sides of the rig.
- Each stereo camera pair 210, 212 is mounted on the respective mirror bracket 220, 222 and faces rearwardly at an appropriate angle to image the region adjacent to, and behind, the cab, thereby acting, in essence, as an automated mirror within a predetermined viewing band, as described further below.
- A front/plan view of one of the stereo camera pair assemblies 210 is shown in FIG. 3.
- The depicted stereo pair assembly 210 is fixtured within a stereo housing or “head” 310, which can take any form and can include an aerodynamic fairing along its front-facing side.
- The housing encloses two cameras 320, 322 (shown schematically in phantom), with lenses 324, 326 having respective optical axes 330, 332 spaced apart by a narrow distance 340.
- A camera “stereo pair” herein is defined as two discrete cameras that can be positioned at various spacings and/or distances from one another, either included within a head or fixtured separately, and oriented at any angle to each other.
- The dual-camera configuration shown in FIG. 3 is commonly referred to as a narrow-baseline stereo head.
- The narrow-baseline stereo head offers the advantage that it can be attached to the truck cab 110 at one place, and optionally in a vertical arrangement as shown in FIG. 3.
- Alternatively, the individual cameras 320, 322 can be fixtured without the head housing (310) in the same vertical arrangement as depicted. This would offer the same small-profile advantage as the stereo head 310, although two separate cameras, as opposed to two cameras within the stereo head 310, would need to be fixtured to the subject vehicle.
- In an illustrative embodiment, the baseline distance 340 between the optical centers of the cameras is approximately 12 cm; a baseline of less than approximately 20 cm is typical.
- The lenses 324, 326 have a focal length of approximately 5.8 mm (approximately 50-degree Horizontal Field of View (“HFOV”)).
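The illustrative geometry above (12 cm baseline, 5.8 mm focal length) fixes the pinhole-stereo range relation. The sketch below applies it; the 6 µm pixel pitch is an assumed sensor value, not stated in the document.

```python
# Pinhole stereo range model for the illustrative narrow-baseline head:
# Z = f * B / (d * p), where d is disparity in pixels and p is pixel pitch.
F = 5.8e-3      # focal length, m (from the text)
B = 0.12        # baseline, m (from the text)
PITCH = 6e-6    # pixel pitch, m (assumed sensor value)

def range_from_disparity(d_px):
    """Range in meters for a measured disparity of d_px pixels."""
    return F * B / (d_px * PITCH)

def depth_step(z):
    """Range change per one-pixel disparity change at range z,
    i.e. the coarseness of range discrimination at that distance."""
    return z * z * PITCH / (F * B)

print(round(range_from_disparity(10), 1))  # 11.6 (m at 10 px disparity)
print(round(depth_step(24.0), 1))          # 5.0 (m per pixel at 24 m)
```

Under these assumptions, range steps of several meters per pixel at the far end of the 24 m monitored zone are consistent with the "approximate range" the processor is described as deriving.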
- An exemplary display of a pair of images derived from the stereo pair 210 is shown in FIG. 4.
- The display 410 includes a pair of sub-windows 420, 422, arranged vertically in the manner of the two narrow-baseline cameras 320, 322.
- Each depiction of the imaged automobile 430, 432 exhibits a differing pose between the respective image fields, which allows for object/vehicle range discrimination as described further below.
- FIG. 5 depicts another illustrative embodiment in conjunction with the above-described exemplary truck 100 and cab 110.
- This illustrative embodiment provides a layout of a wide-baseline stereo pair camera system having two separate cameras 510 and 520 for the stereo pair on the left (driver) side, and two separate cameras 512 and 522 for the stereo pair on the right (passenger) side.
- The cameras 510 and 512 are mounted beneath the respective door-mounted mirrors 140, 142, as described generally above.
- The other cameras 520, 522 in each stereo pair are mounted on respective posts 530, 532 that project from the cab's front fenders 540, 542, or from another convenient outboard location (e.g. the front bumper).
- The upright posts 530, 532 can be conventional or customized units, which may also support marker lights, flags and/or additional forward-placed, rear-view side mirrors 550, 552, respectively.
- The cameras (320, 322 or 510, 520, etc.) of each stereo pair are positioned and aligned substantially vertically with respect to the ground surface plane 120, so that there is defined a top camera and a bottom camera.
- The illustrative vertical arrangement offers advantages when used to detect some of the common obstacles presented to vehicles. More specifically, the invention recognizes that using “narrow-baseline” stereo in the vertical arrangement may be desirable when compared with the typical horizontal arrangement for detecting other vehicles and other obstacles typically encountered by moving vehicles.
- A vertically arranged stereo system can usually detect horizontal features better than a horizontally arranged stereo system can.
- This better detection of horizontal features results because, among other reasons, such features are perpendicular to the epipolar line, where an epipolar line is a term known to those skilled in the art and described in further detail below.
- A vertically arranged stereo pair is also generally easier to mount on the side of a commercial or other vehicle, and it presents a better profile when mounted on the side of the vehicle.
- A third advantage of the above-described vertical arrangement applies only when the orientation of the cameras with respect to the ground 120 is vertical, so that horizontal lines in the world become vertical lines in the image.
- Most camera systems have better resolution in the horizontal than in the vertical direction, especially if only one field of an interlaced CCD camera system is used. By orienting the cameras vertically, the system can orient the direction of maximum resolution along the long axis of the vehicle, where the increased resolution is desirable.
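The epipolar argument above can be illustrated with a toy example: a horizontal edge (such as a bumper line or trailer shadow) is well localized along a vertical epipolar line, the search direction of a vertical baseline, but carries no signal at all along a horizontal one. This synthetic check is purely illustrative.

```python
import numpy as np

# A synthetic horizontal edge: zeros above row 10, ones from row 10 down.
img = np.zeros((20, 20))
img[10:] = 1.0

col = img[:, 5]   # intensity profile along a vertical epipolar line
row = img[12, :]  # intensity profile along a horizontal epipolar line

# Along the vertical line the edge appears at exactly one position, so a
# vertical-baseline matcher can localize it unambiguously...
print(np.count_nonzero(np.diff(col)))  # 1
# ...while along the horizontal line there is no gradient to match at all.
print(np.count_nonzero(np.diff(row)))  # 0
```

A horizontal-baseline pair searching along `row` would find nothing to lock onto, which is the failure mode the vertical arrangement avoids for horizontally featured obstacles.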
- The captured field of view (represented by dashed-line trapezoids 610, 612) is illustrated in FIG. 6, showing a top view of the exemplary tractor-trailer 100.
- Each stereo pair 210 and 212 is mounted approximately one to two meters from the ground/road surface 120, and each stereo pair 210, 212 monitors a respective area 610, 612 approximately 6 m wide (WM) and 24 m long (LM) on each side of the tractor-trailer 100.
- The viewed area 610, 612 covers up to approximately six meters, or two adjacent lanes 122, along each side of the tractor-trailer 100, and approximately 2-5 meters behind the back end 630 of the trailer 102.
- The area of primary importance for the exemplary embodiment is one-half the width WM of each viewing area 610, 612.
- One-half the width WM effectively covers one lane 122 of traffic/obstacles adjacent to the truck 100, which, if monitored, can help inform the driver of the presence of obstacles, such as other vehicles, in the adjacent lane prior to changing lanes.
- A large portion, but possibly not all, of the viewed area 610, 612 is typically within the field of view of the driver's mirrors. The invention recognizes that this area 610, 612 is a principal area to be monitored for the driver of an elongated vehicle.
- The prior art describes non-stereo-based vision methods to capture portions of this area 610, 612, such as that described in Safe Robot Driving in Cluttered Environments, C. Thorpe, J. D. Carlson, D. Duggins, J. Gowdy, R. MacLachlan, C. Mertz, A. Suppe, and C. Wang, Proceedings of the 11th International Symposium of Robotics Research, October 2003.
- The prior systems described to capture portions of this area 610, 612 are monocular systems. The invention recognizes that monocular systems are not as well suited as stereo to the task of viewing obstacles in the area 610, 612 around vehicles.
- Stereo vision fundamentally sees obstacles (features that stick up), whereas other techniques see them only indirectly (by interpreting color or motion).
- Other techniques can, therefore, be fooled by things that are not obstacles but are moving correctly, such as the shadows of vehicles moving in parallel.
- Such monocular systems can also be fooled by things that merely look like obstacles, such as oddly colored pavement or shadows.
- Such monocular systems can also miss large classes of objects. For instance, many monocular systems that use optical flow methods or motion stereo see only moving obstacles, and ignore fixed obstacles such as traffic barriers or stopped cars.
- FIG. 7 shows, by way of background, an idealized panoramic field of view (dashed circle 710) that is typical for cars 720, in accordance with Matusyk.
- For comparison, one can simply superimpose the panoramic field of view of Matusyk on the tractor-trailer cab 110.
- In contrast, a stereo pair can monitor a large distance backwards, one or two lanes wide, next to and behind any vehicle, including an elongated vehicle such as the exemplary tractor trailer 100.
- the cameras ( 320 , 322 , 510 , 520 , etc.) are oriented at an angle A ( FIG. 2 ) relative to a line 250 parallel to the ground plane 120 , so that their axes (approximated by line 260 ) look downward. In this manner, obstacles alongside and behind the vehicle will be located in the field of view of the cameras.
- the surface normal to the plane of the cameras within the stereo pairs 210 and 212 points downward and backwards as shown generally in FIG. 2 , wherein the cameras housed within each head are angled down just enough to view objects approximately 2 meters above the ground at 24 meters distance from each camera.
- the angle A of the cameras may be modified to capture other objects without departing from the scope of the invention.
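- By way of non-limiting illustration only, the downward camera angle A described above follows from simple geometry. The sketch below (Python, used purely for exposition; the function name and the 2.5-meter mounting height are illustrative assumptions, not parameters of any described embodiment) computes the tilt needed to aim the optical axis at a point 2 meters above the ground at 24 meters distance:

```python
import math

def camera_tilt_deg(camera_height_m, target_height_m=2.0, distance_m=24.0):
    """Downward tilt (degrees below horizontal) so the optical axis
    passes through a point target_height_m above the ground at
    distance_m from the camera. Values here are illustrative only."""
    drop = camera_height_m - target_height_m  # vertical drop to the aim point
    return math.degrees(math.atan2(drop, distance_m))

# e.g. a hypothetical mirror-mounted head ~2.5 m above the road
angle = camera_tilt_deg(2.5)
```

For a head mounted roughly 2.5 meters above the road, for example, the required tilt works out to only about 1.2 degrees below horizontal.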
- the cameras can be located at other places or other positions on the tractor with a stereo pair for one or both sides, and wherein the stereo pair for each side has a wider or narrower baseline. It should also be apparent that a stereo pair can be mounted on only one side of the vehicle without departing from the scope of the invention.
- FIG. 8 shows a first illustrative system configuration 800 of one stereo pair, such as stereo pair 210 as shown in FIGS. 2 and 3 , in which the systems monitoring the top and bottom cameras are tightly integrated.
- a frame grabber 810 receives video data output from both the top camera ( 820 ) and bottom camera ( 822 ), which is processed on a processing system 830 .
- the processing system 830 can be any acceptable vision processor that discriminates based upon (for example) contrast differences and/or blob analysis between the object and its surroundings, and resolves the locational differences (in each camera field of view) between matching features in the two images. The differences are used to compute the range of the object in a manner described further below.
- the system 800 then outputs the results for obstacles on the one side, or back of the vehicle, to one or more output devices 840 .
- These output devices can include a virtual mirror with a video display 270 , shown generally on the driver's ( 280 ) dashboard in FIG. 2 .
- the display can be substituted with or augmented with other warnings, such as audible alarms, beepers and/or horns (including a recorded/synthesized spoken alert), as well as visible alerts including flashing lights, meters or other display icons that indicate the side and (optionally) the range of the vehicle/obstacle.
- the output 840 can be integrated with an automated steering, acceleration and braking system.
- a second illustrative system configuration 900 is shown in FIG. 9 .
- independent systems monitor top and bottom camera outputs ( 820 , 822 ) separately.
- Two separate frame grabbers 910 , 912 receive respective inputs from each of the top and bottom cameras.
- a separate processor 930 and 932 processes the respective output from each frame grabber 910 and 912 .
- the processors 930 , 932 merge their results (via branch connection 934 ) and one or both processors produce a final detection and range result that is provided to the output device 940 .
- Each system 800 or 900 generally outputs its results independently for its respective side of the vehicle.
- both stereo pairs can share one output device (such as device screen 270 ), and/or share one common processing system.
- the processing system can be mounted in the camera housing(s) or can be located in a cab-mounted box, including within the display 270 itself. Data and power are provided to and from each camera via cabling (or wirelessly) as indicated by the dashed connection lines 290 in FIGS. 2 and 5 .
- While the configurations of FIGS. 8 and 9 respectively use one or more frame grabbers, the system can function without a frame grabber.
- the images from each camera need only be digitized in some manner and input into the one or more processors. Once digitized, the images can also be stored and later retrieved from memory (included in the processor), and can be used for quality assurance, later driver training, or games, for example.
- FIG. 10 illustrates a generalized block diagram of an obstacle-detection system or apparatus 1000 of the type in which the invention is practiced.
- the system 1000 includes a stereo image acquisition device 1010 , such as two conventional cameras (CCD, CMOS, etc.) that each generate an image of a field of view.
- the stereo image acquisition device 1010 transmits image data to a three-dimensional (3D) processor 1020 either in a stream or upon request by the processor.
- the 3D processor can be any processing device or module capable of performing at least the minimum processing steps described hereinafter (in FIG. 11 ) to detect obstacles.
- the processor 1020 communicates with an output device 1030 , such as a virtual mirror and/or warning system (for example, dash-mounted console display 270 ).
- a personal computer (PC), laptop computer, personal data assistant (PDA), dedicated processor, or any number or combination of processing devices can serve as a 3D processor, either alone, or in conjunction with one another, according to the present invention.
- the 3D processor can be partitioned locally, or otherwise, in more than one manner without departing from the scope of the invention.
- various parameters of the system are set up at the place of manufacture.
- Such factory setup involves calibration and the computation of the intrinsic parameters for the cameras, and the relative orientation between the cameras.
- Calibration involves the solution of several sub-problems, as discussed hereinafter, each of which has several solutions that are well-understood by persons having ordinary skill in the art. Further, rectification coefficients, described hereinafter, are computed to enable runtime image correction.
- Stereo measurements can be made in a coordinate system that is different from the coordinate systems of either camera.
- the scene or world coordinates correspond to the points in a viewed scene.
- Camera coordinates (top and bottom) correspond to the viewer-centered representation of scene points.
- Undistorted image coordinates correspond to scene points projected onto the image plane.
- Distorted image coordinates correspond to points having undergone lens distortion.
- Pixel coordinates correspond to the grid of image samples in the image array.
- one camera is designated to be a “reference camera” to which the stereo coordinate system is tied.
- An interior orientation process is performed to determine the internal geometry of a camera.
- These parameters, also called the intrinsic parameters, include the following: (a) effective focal length, also called the camera constant; (b) location of the principal point, also called the image center; (c) radial distortion coefficients; and (d) horizontal scale factor, also called the aspect ratio.
- the cameras used in the illustrative embodiment have fixed-focus lenses that cannot be modified; therefore these parameters can be computed and preset at the factory.
- a relative orientation process is also performed during factory setup to determine the relative position and orientation between two cameras from projections of calibration points in the scene. Again, the cameras are mechanically fixtured such that they stay in alignment, and hence, these parameters can also be preset at the factory.
- Rectification is the process of resampling stereo images so that epipolar lines correspond to image rows. More particularly an epipolar line on one stereo image corresponding to a given point in another stereo image is the perspective projection on the first stereo image of the three-dimensional ray that is the inverse perspective projection of the given point from the other stereo image (Robert M. Haralick & Linda G. Shapiro, Computer and Robot Vision , Vol. II, p. 598 (1993)). If the two images are coplanar and the horizontal axes are collinear (no rotation about the optical axis), then the image rows are epipolar lines, and stereo correspondences can be found along corresponding rows. These images, referred to as normal image pairs, provide computational advantages because the rectification of normal image pairs need only be performed one time.
- the illustrative method for rectifying the images is independent of the representation used for the given pose of the two cameras. It relies on the principle that any perspective projection is a projective projection. Image planes corresponding to the two cameras are replaced by image planes with the desired geometry (normal image pair), while keeping the geometry of the rays spanned by the points and the projection centers intact. This results in a planar projective transformation. These coefficients can also be computed at the factory. Given the parameters computed in interior orientation, relative orientation and rectification, the camera images can be corrected for distortion and misalignment either in software or hardware. The resulting corrected images have the geometry of a normal image pair: square pixels, aligned optical planes, aligned axes (rows), and a pinhole camera model.
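- The planar projective transformation underlying rectification can be illustrated with a small sketch (Python, for exposition only; H is a hypothetical 3×3 transform, not a calibrated value from any embodiment). Each pixel of the corrected image is obtained by mapping its coordinates through such a transform:

```python
def apply_homography(H, x, y):
    """Map a pixel (x, y) through a 3x3 planar projective transform H
    (row-major nested lists), as used to resample each image into the
    normal-image-pair geometry. H here is a placeholder; real
    coefficients come from the factory rectification step."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# the identity transform leaves pixels unchanged
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

Applying the identity transform returns each pixel coordinate unchanged, which is a useful sanity check when validating computed rectification coefficients.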
- Ground plane calibration can be performed in a variety of ways, such as direct measurement of the height and orientation of the head above the ground, through user selection of a set of features on the ground that can be seen by both stereo cameras; or through taking a stereo depth image of a flat area with sufficient horizontal features and fitting a plane to the results.
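- The last of these options, fitting a plane to stereo depth results, can be sketched as a least-squares fit of z = a·x + b·y + c to measured 3D points (Python, purely illustrative; the function name, the normal-equations solver, and the point format are assumptions, not the described calibration procedure):

```python
def fit_ground_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c to 3D points
    (x, y, z); one simple, illustrative form of ground-plane
    calibration from stereo depth data."""
    # accumulate the normal equations  M * (a, b, c) = v
    sxx = sxy = sx = syy = sy = n = sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; sx += x
        syy += y * y; sy += y; n += 1
        sxz += x * z; syz += y * z; sz += z
    M = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    v = [sxz, syz, sz]

    def det3(m):  # determinant of a 3x3 matrix
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(M)
    coeffs = []
    for i in range(3):  # Cramer's rule for the 3x3 system
        Mi = [row[:] for row in M]
        for r in range(3):
            Mi[r][i] = v[r]
        coeffs.append(det3(Mi) / d)
    return tuple(coeffs)  # (a, b, c)
```

Once (a, b, c) are known, the height of any triangulated point above the ground plane follows by subtracting the plane's z at that point's (x, y).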
- regions of the images which are expected to contain the vehicle itself are set up manually. This involves capturing the image from the reference camera (camera that the stereo coordinate system is tied to), rectifying it, displaying it, and then using a graphics overlay tool to specify the areas expected to contain the vehicle.
- the invention automatically sets up the 6-meter zone adjacent to the vehicle, and extending backwards as the zone in which to monitor for objects. Filtering is performed to exclusively process features inside this zone.
- automatic setup can be performed by laying out fiducial markings or tape on the ground, or by auto-detection of the vehicle profile in a controlled, flat environment.
- other embodiments of the invention may modify the vehicle and active zones depending on the turn-rate of the vehicle, as the size and shape of the zones for an articulated vehicle such as a tractor-trailer may change significantly in the course of tight turns.
- While there are several processes that may be employed to perform stereo vision according to the present invention, one illustrative embodiment of a process 1100 is outlined below with reference to FIG. 11 .
- the images (Top (A) and Bottom (B)) are, respectively, input to the system processor(s) (steps 1110 A and 1110 B), acquired (steps 1112 A, 1112 B), and smoothed (steps 1114 A and 1114 B), using a Gaussian or a parabolic low-pass filter at a certain sigma(s).
- Stereo systems have been shown to be more effective when not processing raw images. Instead, the exemplary embodiment employs a Max−Min filter process (steps 1116 A, 1116 B) that enhances the edges, where the enhanced edges are embodied in the output second derivative image.
- the Max−Min filter takes the following steps:
- the smoothed Max−Min filter process has several advantages over normal high-pass or edge-detection filters used to preprocess for stereo matching. For example, it is a band pass operator, as the Gaussian filter first removes the high frequency image elements, and then it performs the Max−Min operation, which behaves as a high-pass filter so that low frequencies are eliminated. Therefore, it is not as sensitive to noise as some of the other candidate high-pass filters suggested in the art. In addition, it does not dislocate boundaries of the object as the final operation is a Max−Min.
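- One plausible 1-D form of a Max−Min edge enhancer is sketched below (Python, for exposition only; the exact steps of the filter in the illustrative embodiment are not reproduced here, and the max(window) + min(window) − 2·center formulation is an assumption). On a linear ramp the output is near zero, while across a step edge it swings negative then positive, second-derivative-like, without dislocating the boundary:

```python
def max_min_filter(row, half=1):
    """Illustrative 1-D edge enhancer: max(window) + min(window)
    - 2*center over a sliding window of width 2*half + 1. Border
    samples are left at zero in this sketch."""
    n = len(row)
    out = [0] * n
    for i in range(half, n - half):
        w = row[i - half:i + half + 1]
        out[i] = max(w) + min(w) - 2 * row[i]
    return out
```

Running this on a smoothed scanline yields a signed response that changes sign at each intensity step, which is the property the matching stage exploits.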
- a matching process, also called a correspondence procedure (step 1120 ), receives the second derivative images and uses this result to match small constant-sized windows of the filtered reference image (e.g. the top image) to corresponding windows in the other filtered image (e.g. the bottom image).
- the windows are matched using the sums of absolute differences (SAD) approach, where the score for a match between two windows is the sum of the absolute difference between the corresponding elements of the windows.
- the scoring of a match is called the Strength of Match (SOM).
- the matching process 1120 consists of the following sub-steps: (a) at each iteration, the matches for which the SOM is maximum for both of the features forming them are chosen as correct; then, because of the uniqueness constraint, all other matches associated with these two features are eliminated from further consideration. This allows further matches to be selected as correct provided that they now have the highest SOM for both constituent tokens.
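- A minimal sketch of SAD window matching along a rectified row follows (Python, for exposition; the window size, search range, and lowest-score selection rule are simplifying assumptions, and the iterative SOM selection with the uniqueness constraint described above is not reproduced):

```python
def sad(win_a, win_b):
    """Sum of absolute differences between two equal-sized windows
    (lists of pixel values); a lower score is a better match."""
    return sum(abs(a - b) for a, b in zip(win_a, win_b))

def best_disparity(ref_row, other_row, x, win=3, max_disp=16):
    """Slide a window taken from the reference row at x along the
    corresponding row of the other image and return the disparity
    (pixel shift) with the lowest SAD score."""
    ref_win = ref_row[x:x + win]
    best_d, best_score = 0, float("inf")
    for d in range(min(max_disp, len(other_row) - win - x) + 1):
        score = sad(ref_win, other_row[x + d:x + d + win])
        if score < best_score:
            best_d, best_score = d, score
    return best_d
```

For a feature that appears two pixels further along in the other row, the search returns a disparity of 2, which is the value recorded in the disparity image.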
- the matcher produces a disparity image (step 1130 ), where every match is recorded with the resulting best disparity.
- a disparity image is an image where each point is either zero (0), representing no match, or the distance value (in pixels) between where a small window around that point in the reference image was found in the other image. Smaller disparities mean a greater distance from the camera for a ray emerging from the reference image and intersecting an object, and larger disparities mean a closer object.
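- For a rectified normal image pair, disparity converts to range by the standard pinhole relation Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity. The sketch below uses illustrative focal-length and baseline values only, not calibrated parameters of any embodiment:

```python
def disparity_to_range(disparity_px, focal_px, baseline_m):
    """Range along the optical axis for a rectified normal image
    pair: Z = f * B / d. A zero disparity (no match) maps to
    infinity. Values passed in are illustrative placeholders."""
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px

# a larger disparity means a closer object (assumed f and B)
near = disparity_to_range(20, focal_px=800, baseline_m=0.12)  # about 4.8 m
far = disparity_to_range(4, focal_px=800, baseline_m=0.12)    # about 24 m
```

This inverse relationship is why the disparity criteria image described below can stand in for a height-above-ground threshold: anything closer than the criteria plane returns a larger disparity.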
- the procedure searches for points/features which are located at a certain distance above the ground plane 120 .
- a number of procedures can be employed to accomplish this goal.
- the procedure desirably employs the disparity image.
- the procedure generates a disparity criteria image (input box 1150 ).
- This criteria image 1150 represents the disparity value that would result from perfectly imaging a plane at the criteria distance above the ground plane. Any potential obstacle with a height greater than the criteria distance will return disparities greater than (i.e. closer than) the corresponding pixels of the disparity criteria image.
- the procedure 1100 generates a feature image by marking all pixels for which the disparity from the generated disparity image (from step 1130 ) is greater than the corresponding pixel in the disparity criteria image.
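- This marking step can be sketched as a per-pixel comparison of the measured disparity image against the disparity criteria image (Python, illustrative only; representing images as nested lists and treating zero as "no match" are assumptions of the sketch):

```python
def feature_image(disparity, criteria):
    """Mark pixels whose measured disparity exceeds the corresponding
    pixel of the disparity criteria image, i.e. points closer than
    (hence higher than) the criteria plane above the ground."""
    return [[1 if d > c and d > 0 else 0
             for d, c in zip(drow, crow)]
            for drow, crow in zip(disparity, criteria)]
```

Pixels with no match (zero disparity) are never marked, so only matched features above the criteria height survive into the feature image.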
- in alternate embodiments, the disparity image and/or the disparity criteria image may be optional, or may not be necessary.
- the procedure 1100 first filters the spurious matches in the feature image (step 1160 ) before it aggregates the matches (step 1170 ).
- the order of aggregating and filtering can be altered without departing from the scope of the invention.
- the illustrative procedure 1100 filters most of the mismatches by using image erosion, a conventional morphological image processing technique.
- the procedure passes a 3×3 window over the feature image, and if there are any zero elements in a window, the element at the center of the window is set to zero. This step eliminates any feature regions smaller than 3×3, which empirically eliminates the vast majority of spurious features from the resulting filtered feature image while retaining most real features.
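- The 3×3 erosion just described can be sketched as follows (Python, for exposition; border pixels are simply zeroed in this sketch, a simplifying assumption):

```python
def erode(feature):
    """3x3 morphological erosion of a binary feature image: the
    center of each 3x3 window is kept only if every element of the
    window is nonzero, removing regions smaller than 3x3."""
    h, w = len(feature), len(feature[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [feature[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = feature[y][x] if all(window) else 0
    return out
```

A solid 3×3 block of features survives only at its center pixel, while any isolated spurious match disappears entirely.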
- the elements of the filtered feature image are then clustered together (aggregation step 1170 ) using standard region segmentation approaches so that feature pixels that are contiguous in the image with similar disparity (i.e. contiguous in 3D space) are conglomerated together into objects. These objects are then reported to the output device of the system (step 1180 ). In the illustrative collision-avoidance system this is accomplished through outputting CAN (Controller Area Network) messages on a CAN bus.
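- The clustering step can be sketched as a flood-fill segmentation of contiguous feature pixels with similar disparity (Python, illustrative only; 4-connectivity and the max_diff similarity threshold are assumptions standing in for the standard region-segmentation approaches mentioned above):

```python
def cluster_features(feature, disparity, max_diff=1):
    """Group contiguous feature pixels whose disparities differ by at
    most max_diff into objects, via 4-connected flood fill. Returns a
    list of objects, each a list of (y, x) pixel coordinates."""
    h, w = len(feature), len(feature[0])
    seen = [[False] * w for _ in range(h)]
    objects = []
    for y in range(h):
        for x in range(w):
            if feature[y][x] and not seen[y][x]:
                blob, stack = [], [(y, x)]
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    blob.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and feature[ny][nx] and not seen[ny][nx]
                                and abs(disparity[ny][nx]
                                        - disparity[cy][cx]) <= max_diff):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                objects.append(blob)
    return objects
```

Each resulting pixel group corresponds to one reported object; its mean disparity can then be converted to range for the output message.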
- FIG. 12 details a basic mirror assembly 1200 , which includes a housing frame 1210 and a mirror glass 1220 .
- the mirror glass is modified with transparent (potentially unsilvered) ports 1226 , 1228 to allow for see-through viewing of two respective camera lenses 1230 , 1232 , which are part of two cameras within a stereo head 1240 (shown in phantom).
- the data and power for the head 1240 is routed along a cable 1250 that can be part of an overall wiring harness for the mirror (e.g.
- the head can be gimbaled, or otherwise movably mounted within the housing to maintain a predetermined orientation relative to the vehicle side, despite the adjustment position of the mirror.
- the head 1240 can include a powered adjustment unit that maintains a desired angle as the mirror is adjusted.
- the ports are large enough to allow a range of angular adjustment for the head versus the mirror glass without occluding the view of the camera lenses 1230 , 1232 .
- the head can be mounted as part of a mirror assembly, but outside the mirror glass.
- the cab 1310 of an exemplary tractor trailer has been modified so that the upper aerodynamic fairing (found as an original or add-on feature in most modern trucks) 1320 includes cowlings 1330 , 1332 that define a generally aerodynamic shape and that house one or more rear-facing stereo vision cameras in accordance with an embodiment of this invention.
- the cowlings 1330 , 1332 each include two cameras (for example cameras 1340 , 1342 in cowling 1332 ).
- each cowling maintains only one camera and another is provided on a mirror, fender or other convenient, sufficiently outboard location. Aerodynamic fairings can be manufactured to accept cameras and camera heads in accordance with this invention.
- While ground plane calibration in the illustrative embodiments described herein is performed for each particular vehicle mounting class, persons having ordinary skill in the art should appreciate that ground plane calibration could also be performed in the factory or at alternate locations without departing from the spirit and scope of the invention.
- While edge enhancement is performed in the illustrative embodiments described herein by performing a Max−Min filter operation, persons having skill in the art should appreciate that any number of edge-processing techniques known in the art can be used to accomplish this step without departing from the spirit and scope of the present invention.
- edge detection, correspondence matching, rectification, and filtering described hereinbefore can be combined and effected as hardware implementations, software implementations or a combination thereof.
- While the exemplary embodiment often refers to elongated vehicles, it should be clear to those skilled in the art that obstacles, such as fast-approaching vehicles, falling-back vehicles, or lightweight vehicles such as cars, can also be detected using the teachings herein.
- While the stereo camera baseline described herein is aligned substantially vertically or substantially horizontally, it should be clear that, with some drawbacks known in the art, the baseline can be arranged at any angle therebetween.
- the term “stereo pair” can be used to describe two or more cameras that can be coordinated into a single object detection system using the techniques described herein. Where more than two cameras are employed on a given side, the techniques for discriminating objects and the associated range thereof can be modified to allow incorporation of data from the third (or greater) camera(s).
- While the invention has been shown and described with respect to exemplary embodiments thereof, various other changes, omissions, and additions in form and detail may be made therein without departing from the spirit and scope of the invention.
- the above-described system and method for object detection and collision avoidance described herein provides a novel and effective way to identify moving objects and vehicles located behind the cab of an elongated, and possibly tandem, vehicle.
- the system and method can employ relatively inexpensive cameras, in a stereo relationship, on a low-profile mounting, to perform reliable detection with good range discrimination.
- the field of detection is sufficiently behind and aside the rear area to assure an adequate safety zone in most instances.
- this system and method allows all equipment to be maintained on the cab of a tandem vehicle, rather than the interchangeable and more-prone-to-damage cargo section and/or trailer.
- the data connection between the camera sensors and main controller can be wireless in alternate embodiments, allowing cameras to operate using onboard battery power or a simple power connection from the trailer and/or cab.
- the processes described herein can be implemented using electronic or computer hardware, software consisting of a computer-readable medium of program instructions, or a combination of hardware and software. Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of this invention.
Abstract
Description
- This application claims the benefit of U.S. Provisional Application Ser. No. 60/869,681, filed Dec. 12, 2006, entitled STEREO VISION SYSTEM FOR SIDE VISION DETECTION OF OBSTACLES FOR VEHICLES, the entire disclosure of which is herein incorporated by reference.
- This invention was made with U.S. government support under U.S. Department of Transportation National Highway Traffic Safety Administration Cooperative Agreement Numbers DTNH22-05-H-01232 and DTFH61-01-X-00053. The government has certain rights in this invention.
- This invention relates to machine vision systems, and more particularly to vehicle-borne methods and apparatuses for detecting obstacles and potential colliding objects.
- To improve safety of our roads a number of safety systems for automobiles have been developed, which entail a variety of differing approaches to the challenge. Many safety systems are concerned with the avoidance of collisions between vehicles and obstacles, pedestrians or other vehicles. Some of these approaches contemplate the use of camera-based vision systems. Notably, camera systems directed forwardly have been employed in an attempt to create self-guiding automobiles and other vehicles. A compendium of various approaches which may employ vision systems is described in Vehicle Surround Capture: Survey of Techniques and a Novel Omni Video Based Approach for Dynamic Panoramic Surround Maps, 2006, some of which should not be considered prior art as not predating applicant's invention of the concepts herein.
- Trucks and tractor trailers are responsible for transporting a substantial portion of the goods used in commerce in the US and other countries. They are present on virtually every road or highway. Unlike their smaller counterparts, automobiles and motorized cycles, the size and weight of a truck makes it more difficult to maneuver and stop, and its driver's ability to view obstacles and vehicles (particularly those located behind and aside) may be much more limited. Moreover, when a truck collides with another vehicle or pedestrian, the other smaller vehicle or pedestrian may not survive. Thus, safety systems that warn of the potential for such collisions are highly desirable.
- Some trucks are equipped with localized radar units, or other types of sensors, in an attempt to provide warning prior to lane change or turning in an effort to avoid a catastrophic collision. These systems tend to be short range and, in part due to their expense, are mounted only on the cab. Thus, objects/vehicles moving along the side of the rear-mounted truck trailer may not be sensed in time, if at all. Also, stationary objects are not effectively sensed by radar.
- Camera-based vision systems offer a potential solution. However, most camera systems tend to view a wide field and a single camera may have trouble discriminating range and movement of an object, in the same way that closing one eye tends to reduce a person's depth perception. While some vehicle guidance systems (mounted front-facing) have employed so-called “stereo vision,” which emulates an animal's natural horizontal placement of eyes so as to provide range discrimination, this approach is not as effective for dealing with objects that tend to have more vertical than horizontal features, such as pedestrians, sign posts and motorcycles. Such objects tend to get lost in the horizontal spread of a conventional stereo vision arrangement.
- Accordingly, it is desirable to provide a safety system, employing a camera-based vision system that allows the entire unit to be mounted on the cab so that trailers and cargo structures are removable from the truck without need of changing over any vision system components. The system should allow for the use of conventional camera components with high reliability and durability, should be relatively unobtrusive and provide good discrimination of vertically featured objects along the rearward sides of the truck, behind the cab.
- This invention overcomes disadvantages of the prior art by providing a system and method for object detection and collision avoidance for objects and vehicles located behind the cab or front section of an elongated, and possibly tandem, vehicle. Through the use of narrow-baseline stereo vision that can be vertically oriented relative to the ground/road surface, the system and method can employ relatively inexpensive cameras, in a stereo relationship, on a low-profile mounting, to perform reliable detection with good range discrimination. The field of detection is sufficiently behind and aside the rear area to assure an adequate safety zone in most instances. Moreover, this system and method allows all equipment to be maintained on the cab of a tandem vehicle, rather than the interchangeable and more-prone-to-damage cargo section and/or trailer. One or more cameras can be mounted on, or within, the mirror on each side, on aerodynamic fairings or other exposed locations of the vehicle. Image signals received from each camera can be conditioned before they are matched and compared for disparities viewed above the ground surface, and according to predetermined disparity criteria.
- The pair of stereo cameras can be mounted on at least one side of a vehicle looking at a downward angle and back towards the rear of the vehicle. Optionally, another pair of stereo cameras can be mounted on the other side of the vehicle to provide coverage of both sides of the vehicle simultaneously. The invention is calibrated to provide heights above the ground plane for any point in a field of view, such as the road next to and extending behind the vehicle. Therefore, when any object enters the field of view, it generates interest points called “features”, the heights of which are measured relative to the ground plane. These points are then clustered in image and 3D space to provide “objects”. The positions of these objects are then reported to one or more other systems, such as, for example, a “virtual mirror”, which could directly make drivers more aware of obstacles in the lanes alongside of, and extending behind, their vehicle, or a warning system, which alerts the driver before making a lane change that would cause an accident.
- In an illustrative embodiment of the invention, a system and method for detecting objects and vehicles with respect to a host vehicle includes providing a pair of cameras mounted on a host vehicle, and arranged with respect to each other in a narrow-baseline orientation. A processor is provided, which detects an object or vehicle and that derives an approximate range based upon an input image from each of the pair of cameras. In addition, an output device is provided, which reports detection of the object or vehicle within a predetermined range of the host vehicle. Each of the pair of cameras is mounted so as to orient the narrow baseline in a substantially vertical direction normal to a ground surface beneath the host vehicle, and each of the cameras can include a field of view having an increased resolution of pixels oriented along the vertical direction. In an illustrative embodiment, each of the pair of cameras can be directed rearwardly from a front section of the host vehicle so as to view an area along a side of the host vehicle and beyond a back end of the host vehicle. Moreover, at least one of the pair of cameras can be mounted on a rear view mirror on a side of the host vehicle. In general, pairs of cameras can be mounted on both sides of the vehicle in accordance with the teachings of this invention to provide full coverage.
- In an illustrative embodiment, each of the pair of cameras can be, furthermore, mounted in a stereo head housing on the rear view mirror, and/or at least one of the pair of cameras can be mounted within a housing of the rear view mirror as an integrated mirror unit as an option. Another one of the pair of cameras can be mounted at a location on the host vehicle remote from the rear view mirror in various embodiments, thereby providing further vertical and/or horizontal separation between each of the cameras. As a further option, at least one of the pair of cameras is mounted on an aerodynamic fairing of the host vehicle. Such an aerodynamic fairing can include a cowling that houses the at least one of the pair of cameras so as to allow the at least one of the pair of cameras to view the area along a side of the host vehicle and beyond a back end of the host vehicle.
- In an illustrative embodiment, the processor can include an image rectification process, an image smoothing process, and a max−min filter process constructed and arranged to condition image data from each of the pair of cameras, and a matching process that matches the conditioned image data to derive a disparity image from which objects and vehicles above a ground surface beneath the host vehicle are detected in accordance with parameters of a disparity criteria image. The processor can further comprise a spurious edge filtering process and an edge aggregation process that act upon detected images of objects and vehicles so as to generate a report of detected objects that is provided to the output device.
- The host vehicle can be an elongated and/or tandem vehicle, such as a tractor trailer where the front section carrying the cameras comprises a cab thereof.
- The invention description below refers to the accompanying drawings, of which:
-
FIG. 1 is a side perspective view of an illustrative embodiment of a tractor-trailer-cab-mounted obstacle and vehicle detection system and a typical type of detected vehicle; -
FIG. 2 is a partial perspective view of the exemplary tractor trailer of FIG. 1 showing the obstacle detection layout according to an illustrative embodiment of the invention in which two stereo cameras are mounted together on the mirrors of the cab; -
FIG. 3 is a schematic plan view of a narrow-baseline stereo camera head for use in an illustrative embodiment of the invention; -
FIG. 4 is a diagram of a screen display depicting the relative views of a vehicle derived from the camera head arrangement of FIG. 3 ; -
FIG. 5 is a partial perspective view of an exemplary tractor trailer showing the obstacle detection layout according to another illustrative embodiment of the invention in which individual cameras are mounted on each mirror and corresponding individual cameras are mounted on fender mirror posts so as to provide a stereo viewing layout; -
FIG. 6 is a top view of one field-of-view that is imaged with respect to an exemplary tractor trailer according to an illustrative embodiment of the invention; -
FIG. 7 is a schematic layout of a different field of view that is imaged with respect to an exemplary automobile, areas of which field may be covered by prior art systems and methods; -
FIGS. 8 and 9 are each schematic block diagrams of two alternate configurations of system components according to illustrative embodiments of the present invention. -
FIG. 10 is a flow diagram of the obstacle-detection process employed in conjunction with the illustrative embodiments of the present invention; -
FIG. 11 is a schematic block diagram of an obstacle detection apparatus according to an illustrative embodiment of the present invention; -
FIG. 12 is a frontal view of a vehicle mirror face, used for example, on a tractor trailer showing an integrated stereo camera head according to an illustrative embodiment of the invention; and -
FIG. 13 is a partial perspective view of an exemplary tractor trailer including an aerodynamic-fairing-mounted stereo camera assembly according to an alternate embodiment of the present invention. -
FIG. 1 shows an obstacle detection and collision-avoidance system mounted on an exemplary tractor trailer 100. The tractor trailer (also part of a class of large cargo vehicles termed generally herein as a “truck”) 100 includes a box trailer 102 and a cab 110. The cab 110 houses the driver and provides the motive power for the rig. The driver sits relatively high in the cab 110 above the road surface 120, which consists of a plurality of highway lanes 122 separated by dividing lines 124. The cab windshield 130 affords the driver good visibility of the road ahead at long, medium and somewhat close distances. Likewise, the side windows 132 afford the driver a good view of obstacles and vehicles that are substantially aside the cab. Because the rearward view from the cab is occluded by the tall trailer 102, the driver relies upon large mirror assemblies mounted on each door (the driver side door 144 being shown in FIG. 1 and the passenger door 146 shown in FIG. 2). These mirrors give a good view of objects aside the trailer, such as the depicted motorcycle and rider 150. However, while diligent monitoring of mirrors is important, it is desirable to provide a more-automated and full-time warning of the presence of such vehicles or objects, as any collision of the truck 100 with a smaller vehicle may prove fatal to the occupant. Thus, with further reference to FIG. 2, a rear-facing stereo vision system is mounted on the cab with corresponding rear-directed cameras that capture views (dashed lines in FIG. 1) of the depicted vehicle 150. - More particularly, and with reference to
FIG. 2, which shows a portion of the tractor-trailer truck 100 of FIG. 1, the illustrative embodiment includes two pairs of stereo cameras mounted on the cab 110. The mirrors are secured via lower support brackets to each door of the tractor-trailer truck 100, thereby affording the driver a view of the rear sides of the rig. Each stereo camera pair 210, 212 is mounted to a respective mirror bracket 220 so as to face rearwardly at an appropriate angle to image the region adjacent to, and behind, the cab, thereby acting, in essence, as automated mirrors within a predetermined viewing band as described further below. A front/plan view of one of the stereo camera pair assemblies 210 is shown in FIG. 3. The depicted stereo pair assembly 210 is fixtured within a stereo housing or “head” 310, which can take any form and can include an aerodynamic fairing along its front-facing side. The housing encloses two cameras 320, 322 (shown schematically in phantom), with lenses whose optical axes are separated by a narrow distance 340. Note that a camera “stereo pair” herein is defined as two discrete cameras that can be positioned at various spacings and/or distances from one another, two cameras included within a head, or fixtured separately, and oriented at any angle to each other. The dual-camera configuration shown in FIG. 3 is commonly referred to as a narrow-baseline stereo head. The narrow-baseline stereo head offers the advantage that it can be attached to the truck cab 110 at one place, and optionally in a vertical arrangement as shown in FIG. 3. It should be apparent to one skilled in the art that the individual cameras can instead be fixtured separately to the subject vehicle, in which case no unitary stereo head 310 need be employed. - In an exemplary system, the
baseline distance 340 between the optical centers of the cameras is approximately 12 cm; a baseline of less than approximately 20 cm is typical. A display 410 of the images derived from the stereo pair 210 is shown in FIG. 4. The display 410 includes a pair of sub-windows, each showing the view of a respective one of the narrow-baseline cameras. In the exemplary sub-windows, a trailing automobile is imaged by each camera from a slightly different perspective. - In an alternate arrangement (in accordance with the above-described term “stereo pair”), two cameras can be positioned physically farther apart. Such a wider-spaced arrangement is shown in
FIG. 5, which depicts it in conjunction with the above-described, exemplary truck 100 and cab 110. This illustrative embodiment provides a layout of a wide-baseline stereo camera system having two separate cameras 510, 520 on each side of the cab. One camera of each pair is mounted on one of the mirrors, while the other camera of the pair is mounted on an upright post rising from the respective front fender, so that the two cameras together provide a stereo viewing layout. - It is recognized by the invention that for a two-part vehicle, consisting of the
tractor 110 and the trailer 102, it may be desirable to mount the cameras on the necessary component, that is, the tractor 110. This is because the trailer 102 is typically interchangeable, and subject to rough handling, long-term (possibly insecure) storage, and damage. Therefore, the expense of deploying a system according to the illustrative embodiment may be limited by mounting sensors on just the tractor 110, while still allowing the system to monitor the whole area adjacent to the trailer. - In this illustrative embodiment, the cameras (320, 322 or 510, 520, etc.) of each stereo pair are positioned and aligned substantially vertically with respect to the
ground surface plane 120, so that there is defined a top camera and a bottom camera. The illustrative vertical arrangement offers advantages when used to detect some of the common obstacles presented to vehicles. More specifically, the invention recognizes that using “narrow-baseline” stereo in the vertical arrangement may be desirable when compared with the typical horizontal arrangement for detecting other vehicles and other obstacles typically encountered by moving vehicles. Most of the distinctive and easily detectable (via contrast differences, etc.) features, such as the tops, bottoms, and internal structure of cars, trucks and cycles, extend in a largely horizontal direction, which a vertically arranged stereo system can usually detect better than a horizontally arranged stereo system. This better detection of horizontal features results because, among other reasons, the features are perpendicular to the epipolar line, where an epipolar line is known to those skilled in the art and described in further detail below. Additionally, a vertically arranged stereo pair is generally easier to mount on the side of a commercial or other vehicle, and presents a lower profile when mounted on the side of the vehicle. In many commercial vehicles, such as trucks, the width is sizable, and projections beyond that width must be limited to prevent overhang into the next lane or the curb/shoulder of the road. Both of the preceding advantages are obtained regardless of how the cameras in the stereo pair are rotated within a head (e.g. 210) or separately (e.g. FIG. 5). It is mainly desired that the two stereo cameras be positioned vertically with respect to each other and the ground. - A third advantage of the above-described vertical arrangement only applies when the orientation of the cameras with respect to the
ground 120 is vertical, so that horizontal lines in the world become vertical lines in the image. Most camera systems have better resolution in the horizontal than the vertical direction, especially if only one field of an interlaced CCD camera system is used; by orienting the cameras vertically, the system can orient the direction of maximum resolution along the long axis of the vehicle, where the increased resolution is desirable. - In the exemplary embodiment, the captured field of view (represented by dashed-
line trapezoids 610, 612) is illustrated in FIG. 6, showing a top view of the exemplary tractor-trailer 100. In this embodiment each stereo pair 210, 212 is directed rearwardly and downward toward the road surface 120, and each stereo pair images a respective area 610, 612 along a side of the tractor-trailer 100. The viewed areas 610, 612 each cover at least one of the adjacent lanes 122 along each side of the tractor-trailer 100, and extend approximately 2-5 meters behind the back end 630 of the trailer 102. - The area of primary importance for the exemplary embodiment is one-half the width WM of each
viewing area 610, 612, corresponding to the lane 122 of traffic/obstacles adjacent to the truck 100, which, if monitored, can help inform the driver of the presence of obstacles, such as other vehicles, in the adjacent lane prior to changing lanes. A large portion, but possibly not all, of the viewed area 610, 612 overlaps this area of primary importance.
area area area - A discussion of the use of rear-looking stereo is found in Stereo Panoramic Vision for Monitoring Vehicle Blind-spots, Matusyk, Leanne and Zelinsky Alexander, 2004 IEEE Intelligent Vehicles Symposium. Matusyk teaches the use of sensors to obtain a panoramic field of view to monitor the rear of the vehicle, and proposes (although does not show) then extension of the system to monitor all around the vehicle.
FIG. 7 shows, by way of background, an idealized version of the panoramic field of view (dashed circle 710) that is typical for cars 720 in accordance with Matusyk. Using the teachings of Matusyk, one can simply superimpose the panoramic field of view of Matusyk on the tractor trailer cab 110 of FIG. 6 to derive the resulting, somewhat panoramic viewing field (circle 650). This circle is generally broken at the rear by the trailer. In any case, it should be clear to those of ordinary skill that the method of Matusyk fails to view most of the areas 610, 612.
- Hence, by using a narrow field of view, a stereo pair according to an illustrative embodiment can monitor a large distance backwards, one or two lanes wide next to and behind any vehicle, including an elongated vehicle such as the
exemplary tractor trailer 100. In illustrative embodiments, the cameras (320, 322, 510, 520, etc.) are oriented at an angle A (FIG. 2) relative to a line 250 parallel to the ground plane 120, so that their axes (approximated by line 260) look downward. In this manner, obstacles alongside and behind the vehicle will be located in the field of view of the cameras. In an exemplary system, the surface normal to the plane of the cameras within the stereo pairs 210 and 212 points downward and backwards as shown generally in FIG. 2, wherein the cameras housed within each head are angled down just enough to view objects approximately 2 meters above the ground at 24 meters distance from each camera. The angle A of the cameras may be modified to capture other objects without departing from the scope of the invention.
- It should be apparent to those skilled in the art that the cameras can be located at other places or other positions on the tractor with a stereo pair for one or both sides, and wherein the stereo pair for each side has a wider or smaller baseline. It should also be apparent that a stereo pair can be mounted on only one side of the vehicle without departing from the scope of the invention.
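The downward tilt implied by the design point stated above (viewing a point 2 meters above the ground at 24 meters range) can be sketched with simple trigonometry. The camera mounting height used below is an assumed, illustrative value; it is not specified in the text:

```python
import math

def tilt_angle_deg(camera_height_m, target_height_m, target_range_m):
    # Downward tilt (degrees) for the optical axis to pass through a
    # point target_height_m above the ground at target_range_m away.
    drop = camera_height_m - target_height_m
    return math.degrees(math.atan2(drop, target_range_m))

# Assumed mirror-mount height of ~2.5 m (hypothetical); the stated
# design point is objects ~2 m above the ground at 24 m distance.
angle_a = tilt_angle_deg(2.5, 2.0, 24.0)   # roughly 1.2 degrees
```

The shallow resulting angle illustrates why only a slight downward cant of each head suffices to keep distant obstacles in view.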
- At least two possible system configurations can be used to implement the present invention.
FIG. 8 shows a first illustrative system configuration 800 for one stereo pair, such as stereo pair 210 as shown in FIGS. 2 and 3, in which the systems monitoring the top and bottom cameras are tightly integrated. A frame grabber 810 receives video data output from both the top camera (820) and bottom camera (822), which is processed on a processing system 830. The processing system 830 can be any acceptable vision processor that discriminates based upon (for example) contrast differences and/or blob analysis between the object and its surroundings, and resolves the locational differences (in each camera field of view) between matching features in the two images. The differences are used to compute the range of the object in a manner described further below. The system 800 then outputs the results for obstacles on the one side, or back, of the vehicle to one or more output devices 840. These output devices can include a virtual mirror with a video display 270, shown generally on the driver's (280) dashboard in FIG. 2. The display can be substituted with or augmented with other warnings, such as audible alarms, beepers and/or horns (including a recorded/synthesized spoken alert), as well as visible alerts including flashing lights, meters or other display icons that indicate the side and (optionally) the range of the vehicle/obstacle. In more advanced systems the output 840 can be integrated with an automated steering, acceleration and braking system. - A second
illustrative system configuration 900 is shown in FIG. 9. In this embodiment, independent systems monitor the top and bottom camera outputs (820, 822) separately. Two separate frame grabbers each receive the video data of a respective camera and pass it to a separate processor; the results from the frame grabber and processor chains are then combined for output by the system.
- In systems having a stereo pair for both the right and left sides of the vehicle, both stereo pairs can share one output device (such as display screen 270), and/or share one common processing system. The processing system can be mounted in the camera housing(s) or can be located in a cab-mounted box, including within the
display 270 itself. Data and power are provided to and from each camera via cabling (or wirelessly) as indicated by the dashed connection lines 290 in FIGS. 2 and 5.
- Additionally, other possible system configurations can be employed in alternate embodiments. For example, although in the
systems of FIGS. 8 and 9 one or more frame grabbers are used, the system can function without a frame grabber. The images from each camera need only be digitized in some manner and input into the one or more processors. Once digitized, the images can also be stored and later retrieved from memory (included in the processor), and can be used for quality assurance, later driver training, or games, for example. -
FIG. 10 illustrates a generalized block diagram of an obstacle-detection system or apparatus 1000 of the type in which the invention is practiced. In summary, the system 1000 includes a stereo image acquisition device 1010, such as two conventional cameras (CCD, CMOS, etc.) that each generate an image of a field of view. The stereo image acquisition device 1010 transmits image data to a three-dimensional (3D) processor 1020, either in a stream or upon request by the processor. The 3D processor can be any processing device or module capable of performing at least the minimum processing steps described hereinafter (in FIG. 11) to detect obstacles. Once obstacles are detected, the processor 1020 communicates with an output device 1030, such as a virtual mirror and/or warning system (for example, dash-mounted console display 270).
- Note that a personal computer (PC), laptop computer, a personal data assistant (PDA), dedicated processor, or any number or combination of processing devices are capable of serving as a 3D processor, either alone, or in conjunction with one another, according to the present invention. Further, those skilled in the art will appreciate that the 3D processor can be partitioned locally, or otherwise, in more than one manner without departing from the scope of the invention.
- In the illustrative embodiment of the present invention, various parameters of the system (enabling it to detect objects and discriminate the range thereof) are set up at the place of manufacture. Such factory setup involves calibration and the computation of the intrinsic parameters for the cameras, and the relative orientation between the cameras. Calibration involves the solution of several sub-problems, as discussed hereinafter, each of which has several solutions that are well-understood by persons having ordinary skill in the art. Further, rectification coefficients, described hereinafter, are computed to enable runtime image correction.
- Stereo measurements can be made in a coordinate system that is different from the coordinate systems of either camera. For example, the scene or world coordinates correspond to the points in a viewed scene. Camera coordinates (top and bottom) correspond to the viewer-centered representation of scene points. Undistorted image coordinates correspond to scene points projected onto the image plane. Distorted image coordinates correspond to points having undergone lens distortion. Pixel coordinates correspond to the grid of image samples in the image array. In the illustrative embodiment, one camera is designated to be a “reference camera” to which the stereo coordinate system is tied. An interior orientation process is performed to determine the internal geometry of a camera. These parameters, also called the intrinsic parameters, include the following: (a) effective focal length, also called the camera constant; (b) location of the principal point, also called the image center; (c) radial distortion coefficients; and (d) horizontal scale factor, also called the aspect ratio. The cameras used in the illustrative embodiment have fixed-focus lenses that cannot be modified; therefore these parameters can be computed and preset at the factory.
- A relative orientation process is also performed during factory setup to determine the relative position and orientation between two cameras from projections of calibration points in the scene. Again, the cameras are mechanically fixtured such that they stay in alignment, and hence, these parameters can also be preset at the factory.
- A rectification process, closely associated with the relative orientation, is also performed during setup. Rectification is the process of resampling stereo images so that epipolar lines correspond to image rows. More particularly, an epipolar line on one stereo image corresponding to a given point in another stereo image is the perspective projection on the first stereo image of the three-dimensional ray that is the inverse perspective projection of the given point from the other stereo image (Robert M. Haralick & Linda G. Shapiro, Computer and Robot Vision, Vol. II, p. 598 (1993)). If the two images are coplanar and the horizontal axes are collinear (no rotation about the optical axis), then the image rows are epipolar lines, and stereo correspondences can be found along corresponding rows. These images, referred to as normal image pairs, provide computational advantages because the rectification of normal image pairs need only be performed one time.
- The illustrative method for rectifying the images is independent of the representation used for the given pose of the two cameras. It relies on the principle that any perspective projection is a projective projection. Image planes corresponding to the two cameras are replaced by image planes with the desired geometry (normal image pair), while keeping the geometry of the rays spanned by the points and the projection centers intact. This results in a planar projective transformation. These coefficients can also be computed at the factory. Given the parameters computed in interior orientation, relative orientation and rectification, the camera images can be corrected for distortion and misalignment either in software or hardware. The resulting corrected images have the geometry of a normal image pair: square pixels, aligned optical planes, aligned axes (rows), and a pinhole camera model.
- The relationship of a stereo pair to the ground plane (road surface 120) is established before operation. Ground plane calibration can be performed in a variety of ways, such as direct measurement of the height and orientation of the head above the ground; through user selection of a set of features on the ground that can be seen by both stereo cameras; or through taking a stereo depth image of a flat area with sufficient horizontal features and fitting a plane to the results.
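The last calibration option listed — taking a stereo depth image of a flat area and fitting a plane to the results — can be sketched as a least-squares fit. The function names and the z-up coordinate convention below are illustrative assumptions, not details given in the text:

```python
import numpy as np

def fit_ground_plane(points):
    # Least-squares fit of z = a*x + b*y + c to an Nx3 array of 3D
    # ground points (camera coordinates); returns (a, b, c).
    pts = np.asarray(points, dtype=np.float64)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coef, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coef

def height_above_plane(point, coef):
    # Signed offset of a 3D point from the fitted plane along z.
    a, b, c = coef
    x, y, z = point
    return z - (a * x + b * y + c)
```

At runtime, stereo features whose offset from the fitted plane exceeds a criteria distance become obstacle candidates.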
- In the illustrative system, regions of the images which are expected to contain the vehicle itself are set up manually. This involves capturing the image from the reference camera (the camera to which the stereo coordinate system is tied), rectifying it, displaying it, and then using a graphics overlay tool to specify the areas expected to contain the vehicle. The invention automatically sets up the 6-meter zone adjacent to the vehicle, extending backwards, as the zone in which to monitor for objects. Filtering is performed to exclusively process features inside this zone. In alternate embodiments of the invention, automatic setup can be performed by laying out fiducial markings or tape on the ground, or by auto-detection of the vehicle profile in a controlled, flat environment. In addition, other embodiments of the invention may modify the vehicle and active zones depending on the turn-rate of the vehicle, as the size and shape of the zones for an articulated vehicle such as a tractor-trailer may change significantly in the course of tight turns.
- While there are several processes that may be employed to perform stereo vision according to the present invention, one illustrative embodiment of a
process 1100 is outlined below with reference to FIG. 11. The images (Top (A) and Bottom (B)) are, respectively, input to the system processor(s), rectified, and then smoothed (steps 1114A and 1114B) using a Gaussian or a parabolic low-pass filter at a certain sigma(s). Stereo systems have been shown to be more effective when not processing raw images. Instead, the exemplary procedure applies a Max−Min filter process (steps 1116A, 1116B) that enhances the edges, where the enhanced edges are embodied in the outputted second-derivative image. The Max−Min filter takes the following steps: -
- 1. Find the maximum (max) of a neighborhood of size 2*n+1;
- 2. Find the minimum (min) of a neighborhood of size 2*n+1;
- 3. Compute Contrast = (max − min);
- 4. Compute Midpoint = (max + min)/2;
- 5. Let C = the value of the center pixel of the neighborhood; and
- 6. Generate a 2nd derivative image for matching purposes, where for each pixel, Value = 128 + C − Midpoint.
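The steps above can be sketched in NumPy as follows. The Gaussian pre-smoothing is assumed to have been applied to the input already, and the helper names are illustrative:

```python
import numpy as np

def local_extrema(img, n):
    # Local max and min over a (2*n+1) x (2*n+1) neighborhood,
    # with edge-replicating padding at the borders.
    p = np.pad(img, n, mode="edge")
    h, w = img.shape
    shifts = [p[dy:dy + h, dx:dx + w]
              for dy in range(2 * n + 1) for dx in range(2 * n + 1)]
    stack = np.stack(shifts)
    return stack.max(axis=0), stack.min(axis=0)

def max_min_filter(img, n=1):
    # Value = 128 + C - (max + min)/2, where C is the (already
    # smoothed) center pixel of each neighborhood.
    img = img.astype(np.float32)
    mx, mn = local_extrema(img, n)       # steps 1-2
    contrast = mx - mn                   # step 3 (available for gating)
    midpoint = (mx + mn) / 2.0           # step 4
    return 128.0 + img - midpoint        # steps 5-6
```

On a uniform region the output sits at 128; values above and below 128 mark the bright and dark sides of edges, respectively.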
- The smoothed Max−Min filter process has several advantages over normal high-pass or edge-detection filters used to preprocess for stereo matching. For example, it is a band-pass operator: the Gaussian filter first removes the high-frequency image elements, and the subsequent Max−Min operation behaves as a high-pass filter, so that low frequencies are eliminated. Therefore, it is not as sensitive to noise as some of the other candidate high-pass filters suggested in the art. In addition, it does not dislocate the boundaries of the object, as the final operation is a Max−Min.
- A matching process, also called a correspondence procedure (step 1120), receives the second derivative images and uses this result to match small constant-sized windows of the filtered reference image (e.g. the top image) to corresponding windows in the other filtered image (e.g. the bottom image). The windows are matched using the sums of absolute differences (SAD) approach, where the score for a match between two windows is the sum of the absolute differences between the corresponding elements of the windows. The scoring of a match is called the Strength of Match (SOM). The initial set of possible matches for each feature is constrained using the epipolar constraint.
- Next, an iterative winner-take-all procedure that enforces the uniqueness of a match is applied. The
matching process 1120 consists of the following sub-steps: at each iteration, the matches for which the SOM is maximum for both of the features forming them are chosen as correct; then, because of the uniqueness constraint, all other matches associated with these two features are eliminated from further consideration. This allows further matches to be selected as correct, provided that they now have the highest SOM for both constituent tokens. The matcher produces a disparity image (step 1130), where every match is recorded with the resulting best disparity.
- A disparity image is an image where each point is either zero (0), representing no match, or the distance value (in pixels) between where a small window around that point in the reference image was found in the other image. Smaller disparities mean a greater distance from the camera for a ray emerging from the reference image and intersecting an object, and larger disparities mean a closer object.
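A much-simplified sketch of SAD matching and the resulting disparity image is below. It omits the iterative winner-take-all uniqueness step described above, simply keeping the lowest-SAD shift per pixel; the search direction and the focal-length value in the range helper are illustrative assumptions:

```python
import numpy as np

def sad_disparity(ref, other, win=3, max_disp=8):
    # For each pixel, compare a win x win window of the rectified
    # reference image against windows shifted 0..max_disp pixels along
    # the corresponding (epipolar) row of the other image, and record
    # the shift with the smallest sum of absolute differences.
    h, w = ref.shape
    r = win // 2
    ref = ref.astype(np.float32)
    other = other.astype(np.float32)
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = ref[y - r:y + r + 1, x - r:x + r + 1]
            best_sad, best_d = np.inf, 0
            for d in range(0, min(max_disp, x - r) + 1):
                cand = other[y - r:y + r + 1, x - d - r:x - d + r + 1]
                sad = np.abs(patch - cand).sum()
                if sad < best_sad:
                    best_sad, best_d = sad, d
            disp[y, x] = best_d
    return disp

def disparity_to_range_m(d_pixels, focal_px, baseline_m):
    # Standard triangulation: larger disparity means a closer object.
    return focal_px * baseline_m / d_pixels
```

With the approximately 12 cm baseline mentioned earlier and an assumed focal length of 400 pixels, a 2-pixel disparity corresponds to a 24-meter range.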
- Next, the procedure searches for points/features which are located at a certain distance above the
ground plane 120. A number of procedures can be employed to accomplish this goal. However, where a camera may possibly have rolled, the procedure desirably employs the disparity image. Accordingly, at startup, when the procedure 1100 obtains the ground plane description, the procedure generates a disparity criteria image (input box 1150). This criteria image 1150 represents the disparity value that would result from perfectly imaging a plane at the criteria distance above the ground plane. Any potential obstacle with a height greater than the criteria distance will return disparities greater than (i.e. closer than) the corresponding pixels of the disparity criteria image. Thus, the procedure 1100 generates a feature image by marking all pixels for which the disparity from the generated disparity image (from step 1130) is greater than the corresponding pixel in the disparity criteria image.
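The comparison against the disparity criteria image reduces to a per-pixel threshold; a minimal sketch, with zero treated as "no match" as defined above:

```python
import numpy as np

def feature_image(disparity, criteria):
    # Mark pixels whose measured disparity exceeds the criteria image,
    # i.e. points closer than a virtual plane lying the criteria
    # distance above the ground; 0 in the disparity image = no match.
    return (disparity > 0) & (disparity > criteria)
```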
- More particularly, the feature image will have occasional small defects due to inevitable stereo mismatches. Real features tend to result in patches at least the size of the stereo matching window, while spurious matches tend to be isolated to smaller areas. In this embodiment the
procedure 1100 first filters the spurious matches in the feature image (step 1160) before it aggregates the matches (step 1170). However, it should be clear that the order of aggregating and filtering can be altered without departing from the scope of the invention. Thus, the illustrative procedure 1100 filters most of the mismatches by using image erosion, a conventional morphological image processing technique. In the erosion step, the procedure passes a 3×3 window over the feature image, and if any element in a window is zero, the element at the center of the window is set to zero. This step eliminates any feature regions smaller than 3×3, which empirically eliminates the vast majority of spurious features from the resulting filtered feature image while retaining most real features.
- The elements of the filtered feature image are then clustered together (aggregation step 1170) using standard region segmentation approaches, so that feature pixels that are contiguous in the image with similar disparity (i.e. contiguous in 3D space) are conglomerated together into objects. These objects are then reported to the output device of the system (step 1180). In the illustrative collision-avoidance system this is accomplished through outputting CAN (Controller Area Network) messages on a CAN bus.
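The erosion and clustering steps described above can be sketched together: the 3×3 erosion zeroes any center whose neighborhood contains a zero, and a flood-fill clustering joins contiguous feature pixels whose disparities are similar. The disparity tolerance used here is an illustrative assumption:

```python
import numpy as np
from collections import deque

def erode3x3(feature):
    # Binary erosion with a 3x3 window: the center survives only if
    # all nine neighborhood elements are set. Borders are zero-padded,
    # so regions smaller than 3x3 (spurious matches) are eliminated.
    h, w = feature.shape
    p = np.pad(feature.astype(bool), 1)
    out = np.ones((h, w), dtype=bool)
    for dy in range(3):
        for dx in range(3):
            out &= p[dy:dy + h, dx:dx + w]
    return out

def aggregate(feature, disparity, max_ddisp=1):
    # 4-connected flood fill that only joins neighbors whose disparity
    # differs by at most max_ddisp (i.e. contiguous in 3D space).
    # Returns a label image where 0 means "no object".
    h, w = feature.shape
    labels = np.zeros((h, w), dtype=np.int32)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if feature[sy, sx] and labels[sy, sx] == 0:
                next_label += 1
                labels[sy, sx] = next_label
                q = deque([(sy, sx)])
                while q:
                    y, x = q.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x),
                                   (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and feature[ny, nx]
                                and labels[ny, nx] == 0
                                and abs(int(disparity[ny, nx])
                                        - int(disparity[y, x])) <= max_ddisp):
                            labels[ny, nx] = next_label
                            q.append((ny, nx))
    return labels
```

Each nonzero label is one detected object whose pixels (and disparities, hence ranges) can be summarized into the report sent to the output device.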
- Although various calibration methods are described herein in terms of illustrative embodiments of the invention, persons having ordinary skill in the art should appreciate that any number of calibration methods can be used without departing from the spirit and scope of the invention. See, for example, R. Y. Tsai, A Versatile Camera Calibration Technique for High-accuracy 3D Machine Vision Metrology Using Off-the-shelf TV Cameras and Lenses, Robotics and Automation, Vol. 3, No. 4, pp. 323-344, and Z. Zhang, Flexible New Technique for Camera Calibration, TR-98-71, MICROSOFT Research, MICROSOFT CORPORATION, pp 1-22 (Mar. 25, 1999).
- According to an alternate embodiment as shown in FIG. 12, one or both of the stereo vision cameras for each side of the object detection system can be integrated directly into a mirror. In part due to the relatively small size of available camera sensors, most large mirrors can accommodate one or more cameras within their housing.
FIG. 12 details a basic mirror assembly 1200, which includes a housing frame 1210 and a mirror glass 1220. The mirror glass is modified with transparent (potentially unsilvered) ports that overlie the respective camera lenses of an integrated stereo head 1240. Image data from the head 1240 is routed along a cable 1250 that can be part of an overall wiring harness for the mirror (e.g. for providing power adjustment, defrost heat and the like). Since the mirror is separately adjustable to suit the driver's needs, the head can be gimbaled, or otherwise movably mounted within the housing, to maintain a predetermined orientation relative to the vehicle side despite the adjustment position of the mirror. The head 1240 can include a powered adjustment unit that maintains a desired angle as the mirror is adjusted. The ports are large enough to allow a range of angular adjustment for the head versus the mirror glass without occluding the view of the camera lenses. - In another alternate embodiment, shown in
FIG. 13, the cab 1310 of an exemplary tractor trailer has been modified so that the upper aerodynamic fairing (found as an original or add-on feature in most modern trucks) 1320 includes cowlings on each of its opposing sides, each of which houses, for example, a pair of rearwardly directed stereo cameras.
- Although the illustrative embodiment described herein is set up in the factory using factory setup procedures, persons having ordinary skill in the art should appreciate that any of the described setup steps can also be performed in the field without departing from the scope of the invention. Also, although an interior orientation process is described for determining the internal geometry of cameras in terms of the camera constant, the image center, radial distortion coefficients and aspect ratio, those of ordinary skill in the art should appreciate that additional intrinsic parameters may be added or some of these parameters ignored in alternative embodiments within the scope of the present invention.
- In addition, while ground plane calibration in the illustrative embodiments described herein is performed for each particular vehicle mounting class, persons having ordinary skill in the art should appreciate that ground plane calibration could also be performed in the factory or at alternate locations without departing from the spirit and scope of the invention. Additionally, although edge enhancement is performed in the illustrative embodiments described herein by performing a Max−Min filter operation, persons having skill in the art should appreciate that any number of edge-processing techniques known in the art can be used to accomplish this step without departing from the spirit and scope of the present invention.
- Moreover, although the dense matching step of an illustrative embodiment is described herein, wherein pixels are matched in an edge-enhanced image using a SAD (Sum of Absolute Differences) strength of match, followed by implementing a uniqueness constraint, persons having ordinary skill in the art should appreciate that various alternative matching processes can be substituted without departing from the spirit and scope of the present invention. Those skilled in the art will appreciate that the method and apparatuses described herein can also be used to detect obstacles around other moving objects, such as boats and trains, for example.
- Those skilled in the art will further appreciate that some, or all, of the steps of edge detection, correspondence matching, rectification, and filtering described hereinbefore can be combined and effected as hardware implementations, software implementations or a combination thereof. Moreover, although the exemplary embodiment often refers to elongated vehicles, it should be clear to those skilled in the art that obstacles, such as fast-approaching vehicles, falling-back vehicles, or lightweight vehicles, such as cars, can also be detected using the teachings herein. Also, while the arrangement of the stereo camera baseline described herein is aligned as substantially vertical or substantially horizontal, it should be clear that, with some drawbacks known in the art, the baseline can be arranged at any angle therebetween.
- Additionally, although the monitoring of each side and back of the vehicle is described for two cameras, those skilled in the art will realize that more than two cameras in an arrangement can also be used in accordance with the teachings herein. To this end, the term “stereo pair” can be used to describe two or more cameras that can be coordinated into a single object detection system using the techniques described herein. Where more than two cameras are employed on a given side, the techniques for discriminating objects and the associated range thereof can be modified to allow incorporation of data from the third (or greater) camera(s). Furthermore, while the invention has been shown and described with respect to exemplary embodiments thereof, various other changes, omissions, and additions in the form of, and detail thereof, may be made therein without departing from the spirit and scope of the invention.
- Note that, by way of further background, the teachings of commonly assigned U.S. patent application Ser. No. 10/388,925, entitled STEREO DOOR SENSOR, and commonly assigned U.S. patent application Ser. No. 10/702,059, entitled METHOD AND SYSTEM FOR ENHANCED PORTAL SECURITY THROUGH STEREOSCOPY, are expressly incorporated herein by reference.
- In summary, the above-described system and method for object detection and collision avoidance provides a novel and effective way to identify moving objects and vehicles located behind the cab of an elongated, and possibly tandem, vehicle. Through the use of narrow-baseline stereo vision, the system and method can employ relatively inexpensive cameras, in a stereo relationship, on a low-profile mounting, to perform reliable detection with good range discrimination. The field of detection extends sufficiently behind and beside the rear area to assure an adequate safety zone in most instances. Moreover, this system and method allows all equipment to be maintained on the cab of a tandem vehicle, rather than on the interchangeable and more-prone-to-damage cargo section and/or trailer.
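The relationship between baseline and range discrimination mentioned above follows from standard rectified-stereo triangulation, Z = f·B/d. The sketch below uses purely illustrative numbers (not values from the patent) to show how the depth resolution per disparity step depends on the baseline B and focal length f:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulated range for a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def range_step(depth_m, focal_px, baseline_m, disp_step_px=1.0):
    """Approximate depth change per disparity quantization step:
    dZ ~= Z**2 / (f * B) * dd -- a narrower baseline coarsens the steps."""
    return depth_m ** 2 / (focal_px * baseline_m) * disp_step_px

# Hypothetical narrow-baseline rig: 800 px focal length, 0.25 m baseline.
z = depth_from_disparity(10.0, 800.0, 0.25)   # 20.0 m range at 10 px disparity
dz = range_step(z, 800.0, 0.25)               # 2.0 m per disparity pixel at that range
```

The quadratic growth of dZ with Z is why a narrow baseline still gives good discrimination in the near-to-mid safety zone while degrading at long range.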
- The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope of this invention. Each of the various embodiments described above may be combined with other described embodiments in order to provide multiple features. Furthermore, while the foregoing describes a number of separate embodiments of the apparatus and method of the present invention, what has been described herein is merely illustrative of the application of the principles of the present invention. For example, in alternate embodiments it is contemplated that one or more camera sensors (stereo or single lens) may be mounted on the trailer/cargo carrier of the vehicle. This mounting can be permanent or can be accomplished using detachable mountings that allow cameras to be detached from trailers or the cab when not in use. Likewise, the data connection between the camera sensors and main controller can be wireless in alternate embodiments, allowing cameras to operate using onboard battery power or a simple power connection from the trailer and/or cab. In addition, it is expressly contemplated that the processes described herein can be implemented using electronic or computer hardware, software consisting of a computer-readable medium of program instructions, or a combination of hardware and software. Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of this invention.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/954,513 US8199975B2 (en) | 2006-12-12 | 2007-12-12 | System and method for side vision detection of obstacles for vehicles |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US86968106P | 2006-12-12 | 2006-12-12 | |
US11/954,513 US8199975B2 (en) | 2006-12-12 | 2007-12-12 | System and method for side vision detection of obstacles for vehicles |
Publications (2)
Publication Number | Publication Date |
---|---|
US20080304705A1 true US20080304705A1 (en) | 2008-12-11 |
US8199975B2 US8199975B2 (en) | 2012-06-12 |
Family
ID=39530996
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/954,513 Expired - Fee Related US8199975B2 (en) | 2006-12-12 | 2007-12-12 | System and method for side vision detection of obstacles for vehicles |
Country Status (2)
Country | Link |
---|---|
US (1) | US8199975B2 (en) |
DE (1) | DE102007059735A1 (en) |
Cited By (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080278576A1 (en) * | 2007-05-10 | 2008-11-13 | Honda Motor Co., Ltd. | Object detection apparatus, object detection method and object detection program |
US20090021609A1 (en) * | 2007-07-16 | 2009-01-22 | Trw Automotive U.S. Llc | Method and apparatus for distortion correction and image enhancing of a vehicle rear viewing system |
US20090262108A1 (en) * | 2008-01-18 | 2009-10-22 | Sony Corporation | Streaming geometry for use in displaying and editing 3d imagery |
US20090271078A1 (en) * | 2008-04-29 | 2009-10-29 | Mike Dickinson | System and method for identifying a trailer being towed by a vehicle |
US20100061593A1 (en) * | 2008-09-05 | 2010-03-11 | Macdonald Willard S | Extrapolation system for solar access determination |
US20100194884A1 (en) * | 2009-02-02 | 2010-08-05 | Morgan Plaster | Driver observation and security system and method therefor |
US20110311102A1 (en) * | 2010-06-17 | 2011-12-22 | Mcdaniel Michael S | Machine control system utilizing stereo disparity density |
US20120194679A1 (en) * | 2011-01-31 | 2012-08-02 | Nehowig Kelly R | Multi-mode vehicle computing device supporting in-cab and stand-alone operation |
US20130002854A1 (en) * | 2010-09-17 | 2013-01-03 | Certusview Technologies, Llc | Marking methods, apparatus and systems including optical flow-based dead reckoning features |
US20130027196A1 (en) * | 2011-07-26 | 2013-01-31 | Harman International Industries, Incorporated | Obstacle detection system |
CN103177236A (en) * | 2011-12-22 | 2013-06-26 | 株式会社理光 | Method and device for detecting road regions and method and device for detecting separation lines |
US20130312043A1 (en) * | 2012-05-20 | 2013-11-21 | Transportation Security Enterprises, Inc. (Tse) | System and method for security data acquisition and aggregation on mobile platforms |
US20130307989A1 (en) * | 2012-05-20 | 2013-11-21 | Transportation Security Enterprises, Inc. (Tse) | System and method for real-time data capture and packet transmission using a layer 2 wireless mesh network |
US20130307980A1 (en) * | 2012-05-20 | 2013-11-21 | Transportation Security Enterprises, Inc. (Tse) | System and method for real time security data acquisition and integration from mobile platforms |
CN103538533A (en) * | 2013-10-09 | 2014-01-29 | 安徽省龙佳交通设备有限公司 | Left and right blind spot monitoring alarm system for back-off of semitrailer |
US20140063237A1 (en) * | 2012-09-03 | 2014-03-06 | Transportation Security Enterprises, Inc.(TSE), a Delaware corporation | System and method for anonymous object identifier generation and usage for tracking |
US20140119597A1 (en) * | 2012-10-31 | 2014-05-01 | Hyundai Motor Company | Apparatus and method for tracking the position of a peripheral vehicle |
US20140146152A1 (en) * | 2012-11-29 | 2014-05-29 | Timothy J. Frashure | Driver view adapter for forward looking camera |
CN103871042A (en) * | 2012-12-12 | 2014-06-18 | 株式会社理光 | Method and device for detecting continuous type object in parallax direction based on disparity map |
US20140205139A1 (en) * | 2013-01-18 | 2014-07-24 | Caterpillar Inc. | Object recognition system implementing image data transformation |
US20140347450A1 (en) * | 2011-11-30 | 2014-11-27 | Imagenext Co., Ltd. | Method and apparatus for creating 3d image of vehicle surroundings |
US20140376119A1 (en) * | 2013-06-25 | 2014-12-25 | Magna Mirrors Of America, Inc. | Rearview mirror assembly for vehicle |
US20150019043A1 (en) * | 2013-07-12 | 2015-01-15 | Jaybridge Robotics, Inc. | Computer-implemented method and system for controlling operation of an autonomous driverless vehicle in response to obstacle detection |
JP2015055919A (en) * | 2013-09-10 | 2015-03-23 | 日野自動車株式会社 | Collision avoidance support device |
US9082315B2 (en) | 2012-03-08 | 2015-07-14 | Industrial Technology Research Institute | Surrounding bird view monitoring image generation method and training method, automobile-side device, and training device thereof |
US20150319409A1 (en) * | 2014-04-30 | 2015-11-05 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and distance measuring apparatus using the same |
US20170088050A1 (en) * | 2015-09-24 | 2017-03-30 | Alpine Electronics, Inc. | Following vehicle detection and alarm device |
US9676336B2 (en) | 2013-06-25 | 2017-06-13 | Magna Mirrors Of America, Inc. | Exterior rearview mirror assembly for vehicle |
US9714037B2 (en) | 2014-08-18 | 2017-07-25 | Trimble Navigation Limited | Detection of driver behaviors using in-vehicle systems and methods |
US20170259753A1 (en) * | 2016-03-14 | 2017-09-14 | Uber Technologies, Inc. | Sidepod stereo camera system for an autonomous vehicle |
US20180183980A1 (en) * | 2014-04-28 | 2018-06-28 | The Boeing Company | Apparatus and method for monitoring performance characteristics of a component of a vehicle |
US20180232607A1 (en) * | 2016-01-25 | 2018-08-16 | Zhejiang Shenghui Lighting Co., Ltd | Method and device for target detection |
US10161746B2 (en) | 2014-08-18 | 2018-12-25 | Trimble Navigation Limited | Systems and methods for cargo management |
US10204159B2 (en) | 2015-08-21 | 2019-02-12 | Trimble Navigation Limited | On-demand system and method for retrieving video from a commercial vehicle |
US10300851B1 (en) * | 2018-10-04 | 2019-05-28 | StradVision, Inc. | Method for warning vehicle of risk of lane change and alarm device using the same |
WO2019133640A1 (en) * | 2017-12-29 | 2019-07-04 | PlusAI Corp | Method and system for multiple stereo based depth estimation and collision warning/avoidance utilizing the same |
US10412368B2 (en) | 2013-03-15 | 2019-09-10 | Uber Technologies, Inc. | Methods, systems, and apparatus for multi-sensory stereo vision for robotics |
CN110574357A (en) * | 2017-03-31 | 2019-12-13 | 索尼半导体解决方案公司 | Imaging control apparatus, method for controlling imaging control apparatus, and moving body |
US10686976B2 (en) | 2014-08-18 | 2020-06-16 | Trimble Inc. | System and method for modifying onboard event detection and/or image capture strategy using external source data |
US10857941B2 (en) * | 2019-01-28 | 2020-12-08 | Toyota Motor Engineering & Manufacturing North America, Inc. | E-mirror automatic position adjustment system |
KR20210036387A (en) * | 2018-08-06 | 2021-04-02 | 크노르-브렘제 시스테메 퓌어 누츠파조이게 게엠베하 | Camera monitoring system |
US10967862B2 (en) | 2017-11-07 | 2021-04-06 | Uatc, Llc | Road anomaly detection for autonomous vehicle |
US11108992B2 (en) | 2016-06-08 | 2021-08-31 | Sony Corporation | Imaging control device and method, and vehicle |
US11328822B2 (en) * | 2017-02-01 | 2022-05-10 | Conflu3Nce Ltd | Multi-purpose interactive cognitive platform |
US20220174867A1 (en) * | 2020-12-04 | 2022-06-09 | Scythe Robotics, Inc. | Autonomous lawn mower |
US11505123B2 (en) * | 2019-12-02 | 2022-11-22 | Magna Electronics Inc. | Vehicular camera monitoring system with stereographic display |
US11532232B2 (en) * | 2019-11-01 | 2022-12-20 | Lg Electronics Inc. | Vehicle having dangerous situation notification function and control method thereof |
EP4223593A1 (en) * | 2022-02-08 | 2023-08-09 | Continental Autonomous Mobility Germany GmbH | Rear-view stereo camera system |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ITBO20080506A1 (en) * | 2008-08-05 | 2010-02-06 | Spal Automotive Srl | VISUALIZATION SYSTEM FOR VEHICLES. |
DE102008045007A1 (en) * | 2008-08-29 | 2010-03-04 | Man Nutzfahrzeuge Aktiengesellschaft | A method for image-based monitoring of a vehicle environment, in particular an environment of a commercial vehicle |
US20120106778A1 (en) * | 2010-10-28 | 2012-05-03 | General Electric Company | System and method for monitoring location of persons and objects |
DE102011109459A1 (en) * | 2011-08-04 | 2013-02-07 | Man Truck & Bus Ag | Method for detecting objects on the side of a utility vehicle and utility vehicle with a detection system for carrying out the method |
DE102013002111B4 (en) * | 2013-02-08 | 2021-11-18 | Mekra Lang Gmbh & Co. Kg | Vision system for vehicles, in particular commercial vehicles |
FR3004681B1 (en) * | 2013-04-19 | 2016-11-04 | Valeo Vision | RETROVISION DEVICE FOR MOTORIZED TRANSPORT MEANS, AND VEHICLE COMPRISING SAID DEVICE |
DE102013018543A1 (en) | 2013-11-05 | 2015-05-07 | Mekra Lang Gmbh & Co. Kg | Driver assistance system for vehicles, in particular commercial vehicles |
JP5989701B2 (en) * | 2014-03-24 | 2016-09-07 | トヨタ自動車株式会社 | Boundary detection device and boundary detection method |
KR101593484B1 (en) | 2014-07-10 | 2016-02-15 | 경북대학교 산학협력단 | Image processing apparatus and method for detecting partially visible object approaching from side using equi-height peripheral mosaicking image, and system for assisting vehicle driving employing the same |
US20160026881A1 (en) * | 2014-07-22 | 2016-01-28 | Vislab S.R.L. | Lateral obstacle detection apparatus for a motor vehicle, motor vehicle comprising that apparatus and process for detecting lateral obstacles during the travel of a motor vehicle |
EP2995506A1 (en) * | 2014-09-10 | 2016-03-16 | Anthony Carl Hamilton | Information display module and fairing |
GB201416016D0 (en) * | 2014-09-10 | 2014-10-22 | Hamilton Anthony C | Information Display Module |
JP2018512593A (en) | 2015-04-10 | 2018-05-17 | ローベルト ボツシユ ゲゼルシヤフト ミツト ベシユレンクテル ハフツングRobert Bosch Gmbh | Object position measurement with in-vehicle camera using vehicle movement data |
US10373378B2 (en) | 2015-06-26 | 2019-08-06 | Paccar Inc | Augmented reality system for vehicle blind spot prevention |
EP3281825B1 (en) | 2016-08-10 | 2019-04-24 | Continental Automotive GmbH | Rear-view mirror unit |
US10621446B2 (en) * | 2016-12-22 | 2020-04-14 | Texas Instruments Incorporated | Handling perspective magnification in optical flow processing |
DE102017207438B4 (en) | 2017-05-03 | 2020-07-09 | Volkswagen Aktiengesellschaft | Method, device and its use for determining the articulation angle of a team |
CN116704473B (en) * | 2023-05-24 | 2024-03-08 | 禾多科技(北京)有限公司 | Obstacle information detection method, obstacle information detection device, electronic device, and computer-readable medium |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5424952A (en) * | 1993-03-26 | 1995-06-13 | Mitsubishi Denki Kabushiki Kaisha | Vehicle-surroundings monitoring apparatus |
US5530420A (en) * | 1993-12-27 | 1996-06-25 | Fuji Jukogyo Kabushiki Kaisha | Running guide apparatus for vehicle capable of keeping safety at passing through narrow path and the method thereof |
US5719954A (en) * | 1994-06-07 | 1998-02-17 | Matsushita Electric Industrial Co., Ltd. | Stereo matching method and disparity measuring method |
US5937079A (en) * | 1996-09-05 | 1999-08-10 | Daimler-Benz Ag | Method for stereo image object detection |
US6396397B1 (en) * | 1993-02-26 | 2002-05-28 | Donnelly Corporation | Vehicle imaging system with stereo imaging |
US20040003409A1 (en) * | 2002-06-27 | 2004-01-01 | International Business Machines Corporation | Rendering system and method for images having differing foveal area and peripheral view area resolutions |
US6737964B2 (en) * | 2001-11-05 | 2004-05-18 | Ford Global Technologies, Llc | Vehicle blind spot monitoring system |
US6963661B1 (en) * | 1999-09-09 | 2005-11-08 | Kabushiki Kaisha Toshiba | Obstacle detection system and method therefor |
US20060077543A1 (en) * | 2003-05-29 | 2006-04-13 | Takashi Miyoshi | Stereo camera system and stereo optical module |
US7046822B1 (en) * | 1999-06-11 | 2006-05-16 | Daimlerchrysler Ag | Method of detecting objects within a wide range of a road vehicle |
US7091837B2 (en) * | 2002-03-29 | 2006-08-15 | Kabushiki Kaisha Toshiba | Obstacle detecting apparatus and method |
US20060206243A1 (en) * | 2002-05-03 | 2006-09-14 | Donnelly Corporation, A Corporation Of The State Michigan | Object detection system for vehicle |
US20060239509A1 (en) * | 2005-04-26 | 2006-10-26 | Fuji Jukogyo Kabushiki Kaisha | Road line recognition apparatus |
US7251346B2 (en) * | 2002-11-19 | 2007-07-31 | Honda Motor Co., Ltd. | Moving object detection device, moving object detection method, and moving object detection program |
US20070177011A1 (en) * | 2004-03-05 | 2007-08-02 | Lewin Andrew C | Movement control system |
- 2007
- 2007-12-12 US US11/954,513 patent/US8199975B2/en not_active Expired - Fee Related
- 2007-12-12 DE DE102007059735A patent/DE102007059735A1/en not_active Withdrawn
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6396397B1 (en) * | 1993-02-26 | 2002-05-28 | Donnelly Corporation | Vehicle imaging system with stereo imaging |
US5424952A (en) * | 1993-03-26 | 1995-06-13 | Mitsubishi Denki Kabushiki Kaisha | Vehicle-surroundings monitoring apparatus |
US5530420A (en) * | 1993-12-27 | 1996-06-25 | Fuji Jukogyo Kabushiki Kaisha | Running guide apparatus for vehicle capable of keeping safety at passing through narrow path and the method thereof |
US5719954A (en) * | 1994-06-07 | 1998-02-17 | Matsushita Electric Industrial Co., Ltd. | Stereo matching method and disparity measuring method |
US5937079A (en) * | 1996-09-05 | 1999-08-10 | Daimler-Benz Ag | Method for stereo image object detection |
US7046822B1 (en) * | 1999-06-11 | 2006-05-16 | Daimlerchrysler Ag | Method of detecting objects within a wide range of a road vehicle |
US6963661B1 (en) * | 1999-09-09 | 2005-11-08 | Kabushiki Kaisha Toshiba | Obstacle detection system and method therefor |
US6737964B2 (en) * | 2001-11-05 | 2004-05-18 | Ford Global Technologies, Llc | Vehicle blind spot monitoring system |
US7209031B2 (en) * | 2002-03-29 | 2007-04-24 | Kabushiki Kaisha Toshiba | Obstacle detecting apparatus and method |
US7091837B2 (en) * | 2002-03-29 | 2006-08-15 | Kabushiki Kaisha Toshiba | Obstacle detecting apparatus and method |
US20060206243A1 (en) * | 2002-05-03 | 2006-09-14 | Donnelly Corporation, A Corporation Of The State Michigan | Object detection system for vehicle |
US20040003409A1 (en) * | 2002-06-27 | 2004-01-01 | International Business Machines Corporation | Rendering system and method for images having differing foveal area and peripheral view area resolutions |
US7251346B2 (en) * | 2002-11-19 | 2007-07-31 | Honda Motor Co., Ltd. | Moving object detection device, moving object detection method, and moving object detection program |
US20060077543A1 (en) * | 2003-05-29 | 2006-04-13 | Takashi Miyoshi | Stereo camera system and stereo optical module |
US20070177011A1 (en) * | 2004-03-05 | 2007-08-02 | Lewin Andrew C | Movement control system |
US20060239509A1 (en) * | 2005-04-26 | 2006-10-26 | Fuji Jukogyo Kabushiki Kaisha | Road line recognition apparatus |
Cited By (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8300887B2 (en) * | 2007-05-10 | 2012-10-30 | Honda Motor Co., Ltd. | Object detection apparatus, object detection method and object detection program |
US20080278576A1 (en) * | 2007-05-10 | 2008-11-13 | Honda Motor Co., Ltd. | Object detection apparatus, object detection method and object detection program |
US20090021609A1 (en) * | 2007-07-16 | 2009-01-22 | Trw Automotive U.S. Llc | Method and apparatus for distortion correction and image enhancing of a vehicle rear viewing system |
US8233045B2 (en) * | 2007-07-16 | 2012-07-31 | Trw Automotive U.S. Llc | Method and apparatus for distortion correction and image enhancing of a vehicle rear viewing system |
US20090262108A1 (en) * | 2008-01-18 | 2009-10-22 | Sony Corporation | Streaming geometry for use in displaying and editing 3d imagery |
US20090262184A1 (en) * | 2008-01-18 | 2009-10-22 | Sony Corporation | Method and apparatus for displaying and editing 3d imagery |
US20090289941A1 (en) * | 2008-01-18 | 2009-11-26 | Sony Corporation | Composite transition nodes for use in 3d data generation |
US8576228B2 (en) | 2008-01-18 | 2013-11-05 | Sony Corporation | Composite transition nodes for use in 3D data generation |
US8564644B2 (en) * | 2008-01-18 | 2013-10-22 | Sony Corporation | Method and apparatus for displaying and editing 3D imagery |
US8471844B2 (en) | 2008-01-18 | 2013-06-25 | Sony Corporation | Streaming geometry for use in displaying and editing 3D imagery |
US20090271078A1 (en) * | 2008-04-29 | 2009-10-29 | Mike Dickinson | System and method for identifying a trailer being towed by a vehicle |
US20100061593A1 (en) * | 2008-09-05 | 2010-03-11 | Macdonald Willard S | Extrapolation system for solar access determination |
US20100194884A1 (en) * | 2009-02-02 | 2010-08-05 | Morgan Plaster | Driver observation and security system and method therefor |
US20110311102A1 (en) * | 2010-06-17 | 2011-12-22 | Mcdaniel Michael S | Machine control system utilizing stereo disparity density |
US8320627B2 (en) * | 2010-06-17 | 2012-11-27 | Caterpillar Inc. | Machine control system utilizing stereo disparity density |
US20130002854A1 (en) * | 2010-09-17 | 2013-01-03 | Certusview Technologies, Llc | Marking methods, apparatus and systems including optical flow-based dead reckoning features |
US20120194679A1 (en) * | 2011-01-31 | 2012-08-02 | Nehowig Kelly R | Multi-mode vehicle computing device supporting in-cab and stand-alone operation |
US9704310B2 (en) * | 2011-01-31 | 2017-07-11 | Trimble Navigation Limited | Multi-mode vehicle computing device supporting in-cab and stand-alone operation |
US20130027196A1 (en) * | 2011-07-26 | 2013-01-31 | Harman International Industries, Incorporated | Obstacle detection system |
US8611598B2 (en) * | 2011-07-26 | 2013-12-17 | Harman International (China) Holdings Co., Ltd. | Vehicle obstacle detection system |
US20140347450A1 (en) * | 2011-11-30 | 2014-11-27 | Imagenext Co., Ltd. | Method and apparatus for creating 3d image of vehicle surroundings |
US9508189B2 (en) * | 2011-11-30 | 2016-11-29 | Kss-Imagenext Co., Ltd. | Method and apparatus for creating 3D image of vehicle surroundings |
CN103177236A (en) * | 2011-12-22 | 2013-06-26 | 株式会社理光 | Method and device for detecting road regions and method and device for detecting separation lines |
US9082315B2 (en) | 2012-03-08 | 2015-07-14 | Industrial Technology Research Institute | Surrounding bird view monitoring image generation method and training method, automobile-side device, and training device thereof |
US20130312043A1 (en) * | 2012-05-20 | 2013-11-21 | Transportation Security Enterprises, Inc. (Tse) | System and method for security data acquisition and aggregation on mobile platforms |
US20130307989A1 (en) * | 2012-05-20 | 2013-11-21 | Transportation Security Enterprises, Inc. (Tse) | System and method for real-time data capture and packet transmission using a layer 2 wireless mesh network |
US20130307980A1 (en) * | 2012-05-20 | 2013-11-21 | Transportation Security Enterprises, Inc. (Tse) | System and method for real time security data acquisition and integration from mobile platforms |
US20140063237A1 (en) * | 2012-09-03 | 2014-03-06 | Transportation Security Enterprises, Inc.(TSE), a Delaware corporation | System and method for anonymous object identifier generation and usage for tracking |
US9025819B2 (en) * | 2012-10-31 | 2015-05-05 | Hyundai Motor Company | Apparatus and method for tracking the position of a peripheral vehicle |
US20140119597A1 (en) * | 2012-10-31 | 2014-05-01 | Hyundai Motor Company | Apparatus and method for tracking the position of a peripheral vehicle |
CN105142981A (en) * | 2012-11-29 | 2015-12-09 | 奔德士商用车系统有限责任公司 | Driver view adapter for forward looking camera |
US9128354B2 (en) * | 2012-11-29 | 2015-09-08 | Bendix Commercial Vehicle Systems Llc | Driver view adapter for forward looking camera |
US20140146152A1 (en) * | 2012-11-29 | 2014-05-29 | Timothy J. Frashure | Driver view adapter for forward looking camera |
CN103871042A (en) * | 2012-12-12 | 2014-06-18 | 株式会社理光 | Method and device for detecting continuous type object in parallax direction based on disparity map |
US20140205139A1 (en) * | 2013-01-18 | 2014-07-24 | Caterpillar Inc. | Object recognition system implementing image data transformation |
US10412368B2 (en) | 2013-03-15 | 2019-09-10 | Uber Technologies, Inc. | Methods, systems, and apparatus for multi-sensory stereo vision for robotics |
US9487142B2 (en) * | 2013-06-25 | 2016-11-08 | Magna Mirrors Of America, Inc. | Rearview mirror assembly for vehicle |
US11851004B2 (en) | 2013-06-25 | 2023-12-26 | Magna Mirrors Of America, Inc. | Rearview mirror assembly for vehicle |
US20140376119A1 (en) * | 2013-06-25 | 2014-12-25 | Magna Mirrors Of America, Inc. | Rearview mirror assembly for vehicle |
US10053015B2 (en) | 2013-06-25 | 2018-08-21 | Magna Mirrors Of America, Inc. | Exterior rearview mirror assembly for vehicle |
US9676336B2 (en) | 2013-06-25 | 2017-06-13 | Magna Mirrors Of America, Inc. | Exterior rearview mirror assembly for vehicle |
US10543784B2 (en) | 2013-06-25 | 2020-01-28 | Magna Mirrors Of America, Inc. | Rearview mirror assembly for vehicle |
US10220788B2 (en) | 2013-06-25 | 2019-03-05 | Magna Mirrors Of America, Inc. | Exterior rearview mirror assembly for vehicle |
US9052714B2 (en) * | 2013-07-12 | 2015-06-09 | Jaybridge Robotics, Inc. | Computer-implemented method and system for controlling operation of an autonomous driverless vehicle in response to obstacle detection |
US20150019043A1 (en) * | 2013-07-12 | 2015-01-15 | Jaybridge Robotics, Inc. | Computer-implemented method and system for controlling operation of an autonomous driverless vehicle in response to obstacle detection |
JP2015055919A (en) * | 2013-09-10 | 2015-03-23 | 日野自動車株式会社 | Collision avoidance support device |
CN103538533A (en) * | 2013-10-09 | 2014-01-29 | 安徽省龙佳交通设备有限公司 | Left and right blind spot monitoring alarm system for back-off of semitrailer |
US10750059B2 (en) * | 2014-04-28 | 2020-08-18 | The Boeing Company | Apparatus and method for monitoring performance characteristics of a component of a vehicle |
US20180183980A1 (en) * | 2014-04-28 | 2018-06-28 | The Boeing Company | Apparatus and method for monitoring performance characteristics of a component of a vehicle |
US10757370B2 (en) * | 2014-04-30 | 2020-08-25 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and distance measuring apparatus using the same |
US11006081B2 (en) | 2014-04-30 | 2021-05-11 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and distance measuring apparatus using the same |
US9876992B2 (en) * | 2014-04-30 | 2018-01-23 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and distance measuring apparatus using the same |
US11533455B2 (en) | 2014-04-30 | 2022-12-20 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and distance measuring apparatus using the same |
US20150319409A1 (en) * | 2014-04-30 | 2015-11-05 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and distance measuring apparatus using the same |
US20180109763A1 (en) * | 2014-04-30 | 2018-04-19 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and distance measuring apparatus using the same |
US10161746B2 (en) | 2014-08-18 | 2018-12-25 | Trimble Navigation Limited | Systems and methods for cargo management |
US10686976B2 (en) | 2014-08-18 | 2020-06-16 | Trimble Inc. | System and method for modifying onboard event detection and/or image capture strategy using external source data |
US9714037B2 (en) | 2014-08-18 | 2017-07-25 | Trimble Navigation Limited | Detection of driver behaviors using in-vehicle systems and methods |
US10204159B2 (en) | 2015-08-21 | 2019-02-12 | Trimble Navigation Limited | On-demand system and method for retrieving video from a commercial vehicle |
US10589669B2 (en) * | 2015-09-24 | 2020-03-17 | Alpine Electronics, Inc. | Following vehicle detection and alarm device |
US20170088050A1 (en) * | 2015-09-24 | 2017-03-30 | Alpine Electronics, Inc. | Following vehicle detection and alarm device |
US10474935B2 (en) * | 2016-01-25 | 2019-11-12 | Zhejiang Shenghui Lighting Co., Ltd. | Method and device for target detection |
US20180232607A1 (en) * | 2016-01-25 | 2018-08-16 | Zhejiang Shenghui Lighting Co., Ltd | Method and device for target detection |
US10077007B2 (en) * | 2016-03-14 | 2018-09-18 | Uber Technologies, Inc. | Sidepod stereo camera system for an autonomous vehicle |
US20170259753A1 (en) * | 2016-03-14 | 2017-09-14 | Uber Technologies, Inc. | Sidepod stereo camera system for an autonomous vehicle |
US11108992B2 (en) | 2016-06-08 | 2021-08-31 | Sony Corporation | Imaging control device and method, and vehicle |
US11328822B2 (en) * | 2017-02-01 | 2022-05-10 | Conflu3Nce Ltd | Multi-purpose interactive cognitive platform |
US10864855B2 (en) * | 2017-03-31 | 2020-12-15 | Sony Semiconductor Solutions Corporation | Imaging control apparatus, method for controlling imaging control apparatus, and mobile body |
CN110574357A (en) * | 2017-03-31 | 2019-12-13 | 索尼半导体解决方案公司 | Imaging control apparatus, method for controlling imaging control apparatus, and moving body |
JP7196063B2 (en) | 2017-03-31 | 2022-12-26 | ソニーセミコンダクタソリューションズ株式会社 | IMAGING CONTROL DEVICE, CONTROL METHOD OF IMAGING CONTROL DEVICE, AND MOVING OBJECT |
JPWO2018180579A1 (en) * | 2017-03-31 | 2020-02-13 | ソニーセミコンダクタソリューションズ株式会社 | Imaging control device, control method of imaging control device, and moving object |
EP3606063A4 (en) * | 2017-03-31 | 2020-02-05 | Sony Semiconductor Solutions Corporation | Imaging control device, control method for imaging control device, and mobile object |
US11731627B2 (en) | 2017-11-07 | 2023-08-22 | Uatc, Llc | Road anomaly detection for autonomous vehicle |
US10967862B2 (en) | 2017-11-07 | 2021-04-06 | Uatc, Llc | Road anomaly detection for autonomous vehicle |
US11587248B2 (en) | 2017-12-29 | 2023-02-21 | Plusai, Inc. | Method and system for multiple stereo based depth estimation and collision warning/avoidance utilizing the same |
WO2019133640A1 (en) * | 2017-12-29 | 2019-07-04 | PlusAI Corp | Method and system for multiple stereo based depth estimation and collision warning/avoidance utilizing the same |
US10957064B2 (en) | 2017-12-29 | 2021-03-23 | PlusAI Corp | Method and system for multiple stereo based depth estimation and collision warning/avoidance utilizing the same |
KR102447073B1 (en) * | 2018-08-06 | 2022-09-23 | 크노르-브렘제 시스테메 퓌어 누츠파조이게 게엠베하 | camera monitoring system |
JP2021533666A (en) * | 2018-08-06 | 2021-12-02 | クノル−ブレムゼ ジステーメ フューア ヌッツファールツォイゲ ゲゼルシャフト ミット ベシュレンクテル ハフツングKnorr−Bremse Systeme fuer Nutzfahrzeuge GmbH | Camera surveillance system |
KR20210036387A (en) * | 2018-08-06 | 2021-04-02 | 크노르-브렘제 시스테메 퓌어 누츠파조이게 게엠베하 | Camera monitoring system |
JP7230178B2 (en) | 2018-08-06 | 2023-02-28 | クノル-ブレムゼ ジステーメ フューア ヌッツファールツォイゲ ゲゼルシャフト ミット ベシュレンクテル ハフツング | camera surveillance system |
US20210314497A1 (en) * | 2018-08-06 | 2021-10-07 | Knorr-Bremse Systeme Fuer Nutzfahrzeuge Gmbh | Camera Monitoring System |
US10300851B1 (en) * | 2018-10-04 | 2019-05-28 | StradVision, Inc. | Method for warning vehicle of risk of lane change and alarm device using the same |
US10857941B2 (en) * | 2019-01-28 | 2020-12-08 | Toyota Motor Engineering & Manufacturing North America, Inc. | E-mirror automatic position adjustment system |
US11532232B2 (en) * | 2019-11-01 | 2022-12-20 | Lg Electronics Inc. | Vehicle having dangerous situation notification function and control method thereof |
US11505123B2 (en) * | 2019-12-02 | 2022-11-22 | Magna Electronics Inc. | Vehicular camera monitoring system with stereographic display |
US20220174867A1 (en) * | 2020-12-04 | 2022-06-09 | Scythe Robotics, Inc. | Autonomous lawn mower |
EP4223593A1 (en) * | 2022-02-08 | 2023-08-09 | Continental Autonomous Mobility Germany GmbH | Rear-view stereo camera system |
Also Published As
Publication number | Publication date |
---|---|
DE102007059735A1 (en) | 2008-07-24 |
US8199975B2 (en) | 2012-06-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8199975B2 (en) | System and method for side vision detection of obstacles for vehicles | |
US11447070B2 (en) | Method for determining misalignment of a vehicular camera | |
US10899277B2 (en) | Vehicular vision system with reduced distortion display | |
US10324297B2 (en) | Heads up display system for vehicle | |
KR100936558B1 (en) | Perimeter monitoring apparatus and image display method for vehicle | |
US8885045B2 (en) | Device and method for monitoring vehicle surroundings | |
JP6819681B2 (en) | Imaging control devices and methods, and vehicles | |
US10183621B2 (en) | Vehicular image processing apparatus and vehicular image processing system | |
US8988493B2 (en) | Rear-view multi-functional camera system with driving corridor monitoring features | |
JP5132249B2 (en) | In-vehicle imaging device | |
EP1961613B1 (en) | Driving support method and driving support device | |
JP6819680B2 (en) | Imaging control devices and methods, and vehicles | |
CN108765496A | Multi-view vehicle surround-view driver assistance system and method | |
US20090022423A1 | Method for combining several images into a full image in the bird's eye view | |
US11875575B2 (en) | Vehicular trailering assist system with trailer collision angle detection | |
US20130286193A1 (en) | Vehicle vision system with object detection via top view superposition | |
US20110169957A1 (en) | Vehicle Image Processing Method | |
US20130021453A1 (en) | Autostereoscopic rear-view display system for vehicles | |
KR20120077309A (en) | Apparatus and method for displaying rear image of vehicle | |
US20190135169A1 (en) | Vehicle communication system using projected light | |
JP2004240480A | Driving support device | |
CN113320474A (en) | Automatic parking method and device based on panoramic image and human-computer interaction | |
US8213683B2 (en) | Driving support system with plural dimension processing units | |
JP2023528940A | Device for verifying the position or orientation of a sensor in an autonomous vehicle | |
CN103377372A | Method for dividing overlapping regions of a surround-view composite image and method for rendering a surround-view composite image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: COGNEX CORPORATION, MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:POMERLEAU, DEAN A.;GOWDY, JAY W.;TROUP, MATTHEW;AND OTHERS;SIGNING DATES FROM 20080213 TO 20080715;REEL/FRAME:021370/0964 |
|
REMI | Maintenance fee reminder mailed | ||
LAPS | Lapse for failure to pay maintenance fees | ||
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20160612 |