WO2012143036A1 - Online vehicle camera calibration based on continuity of features - Google Patents


Info

Publication number
WO2012143036A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image frames
online calibration
road features
cameras
Prior art date
Application number
PCT/EP2011/056107
Other languages
French (fr)
Inventor
Myles Friel
Derek Anthony Savage
Ciaran Hughes
Peter Bone
Original Assignee
Connaught Electronics Limited
Application Solutions (Electronics and Vision) Limited
Priority date
Filing date
Publication date
Application filed by Connaught Electronics Limited, Application Solutions (Electronics and Vision) Limited filed Critical Connaught Electronics Limited
Priority to PCT/EP2011/056107 priority Critical patent/WO2012143036A1/en
Publication of WO2012143036A1 publication Critical patent/WO2012143036A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30264Parking

Definitions

  • a method for online calibration of a vehicle video system is achieved by evaluating image frames of a camera containing longitudinal road features.
  • the method comprises the following steps. Adjacent portions of the road surface are captured in image frames by at least two cameras of the vehicle video system. Longitudinal road features are identified within the image frames. Identified longitudinal road features are selected within at least two image frames, respectively taken by two cameras, as matching a single line between the two image frames. The matching of the single line is analysed by determining an offset of the line between the two image frames. The determined offset of the line is then applied for a calibration of the camera.
  • the determined offset is applied as an error measure to be minimised when adjusting the rotation parameters used for the calibration of the cameras.
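The offset determination for a single matched line can be sketched as follows. This is an illustrative helper, not from the patent: the function name and the (slope, intercept) representation of a fitted line are assumptions.

```python
def line_offset(line_a, line_b):
    """Offset between the same road feature fitted in two camera views.

    line_a, line_b: (m, c) of y = m*x + c in the common ground-plane frame.
    Returns (slope difference, intercept difference); both are zero when the
    feature is continuous across the two transformed views.
    """
    m_a, c_a = line_a
    m_b, c_b = line_b
    return (m_a - m_b, c_a - c_b)

# A well-calibrated camera pair sees the marking as one continuous line:
print(line_offset((0.02, 1.5), (0.02, 1.5)))   # (0.0, 0.0)
# A miscalibrated camera shifts and rotates its view of the same marking:
print(line_offset((0.02, 1.5), (0.05, 1.9)))
```

The two returned components can be fed directly into the error measure that the calibration minimises.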
  • the two considered image frames comprise an overlapping area.
  • more than two image frames are considered captured respectively by more than two adjacent cameras of the vehicle video system, the image frames comprising some overlapping area between each other.
  • selected longitudinal road features corresponding to the matching line are stored over a period of time. This is used to determine a deviation of the points, applied as an error measure to be minimised when adjusting the rotation parameters used for the calibration of the camera.
  • longitudinal road features are selected corresponding to at least two matching lines.
  • the calibration is performed for a vehicle video system adapted for displaying a virtual bird view of the adjacent environment of the vehicle by transforming each image frame from the cameras to a top-down view to be merged into a single bird view.
  • Steering angle of the vehicle can be considered when selecting longitudinal road features as lines. This allows applying present online calibration method also when the vehicle is possibly not moving straight ahead.
  • a computer program product for processing data relating to online calibration of a vehicle video system is also described and claimed herein.
  • the computer program product comprises a computer usable medium having computer usable program code embodied therewith, the computer usable program code being configured to perform the above summarized methods.
  • an online calibration system for a vehicle video system is also described and claimed herein.
  • the online calibration system comprises a computer program product for processing data relating to an online calibration method and an image processing apparatus with a camera for taking image frames to be used by the online calibration method, so as to perform the above summarized methods.
  • FIG. 1 illustrates a side view and a top view of a vehicle with a 3d co-ordinate system
  • FIG. 2 illustrates a side view of a vehicle with cameras of the vehicle video system
  • FIG. 3 illustrates a schematic view of a vehicle video system for a bird view display
  • FIG. 4 illustrates a flow diagram according to the invention
  • FIG. 5 illustrates a bird view display according to the invention
  • FIG. 6 illustrates a corrected bird view display according to the invention.
  • aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system". Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted within the vehicle using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the vehicle's computer, partly on the vehicle's computer, as a stand-alone software package, partly on the vehicle's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the vehicle's computer through any type of wireless network, including a wireless local area network (WLAN), possibly but not necessarily through the Internet using an Internet Service Provider.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • The present invention is a means of calibrating a camera on a vehicle to determine the extrinsic camera rotation parameters relative to the vehicle co-ordinate system when the camera is positioned in an undetermined position on the vehicle body.
  • Calibration allows the vehicle manufacturer to provide a geometrically representative and more importantly a useful view to a vehicle user. Additionally, by extension, the height of the camera off the ground plane (XY-plane) can be determined.
  • Figure 1 shows a side view and a top view of a vehicle 1 with a defined three dimensional coordinate system x, y, z as used in the following. The x-axis is chosen along the longitudinal direction, the y-axis along the transverse direction and the z-axis along the vertical direction of the vehicle 1.
  • Figure 2 shows a side view of the vehicle 1 together with the x, y, z coordinate system and three cameras 21-23 from the vehicle video system.
  • One camera 21 is placed at the rear of the vehicle, another 22 at the side, usually at the wing mirror, and another 23 at the front.
  • a further camera is placed on the other side of the vehicle and is accordingly not visible in figure 2. All the cameras of the vehicle video system capture a significant portion of the road surface in the scene. Such portions of the road surface are stored in image frames from which longitudinal road features like markings or road edges are identified and selected.
  • Figure 3 shows a schematic view of the vehicle video system with four cameras as in fig. 2.
  • Four video streams are obtained from the four cameras C1 to C4 to be merged into a bird view.
  • Each frame from the cameras is transformed to a top down view (Tl to T4).
  • the transforms are functions of the camera rotation matrices (R1 to R4). Additionally, the transforms are functions of the position of each camera in the reference coordinate system and (if applicable) of the fisheye function that describes the intrinsic parameters of the camera.
  • the rotation matrices are 3x3 matrices that can typically be generated from three rotation angles.
  • the merge function M combines the four transformed frames into a single bird view (or top view) for display to the driver.
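Generating a 3x3 rotation matrix from three angles can be sketched as below. The Z-Y-X Euler convention used here is an assumption for illustration; the patent does not fix an angle convention.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """3x3 rotation matrix R = Rz(yaw) * Ry(pitch) * Rx(roll).

    roll, pitch, yaw are rotations (in radians) about the x, y and z axes
    of the vehicle coordinate system (see fig. 1).
    """
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    rx = [[1, 0, 0], [0, cr, -sr], [0, sr, cr]]     # rotation about x
    ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]     # rotation about y
    rz = [[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]     # rotation about z

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(rz, matmul(ry, rx))

# Zero angles give the identity; a 90-degree yaw maps the x-axis onto y.
print(rotation_matrix(0.0, 0.0, 0.0)[0])
```

Each camera's transform Tn then applies Rn (plus the camera position and, if applicable, the fisheye model) to map pixels onto the ground plane.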
  • FIG. 4 shows a flow diagram according to the present invention.
  • the online calibration method is started at some stage when the vehicle is driving. Such a starting event can be programmed in different ways, e.g. after a predefined time during driving, after travelling some predefined distance, or a combination of several parameters. Some initial parameters of the vehicle video system can be loaded first. Alternatively or in combination with that, some parameters from a previous calibration (R1 to R4 being the previous rotation matrix parameters for each camera) can be loaded to be considered for the newly started calibration procedure.
  • some points on the longitudinal road features like road markings (e.g. broken lines or edgelines) or road edges are extracted from each of the frames. Any adequate method can be used here to extract those longitudinal road features.
  • One possibility is to analyze several columns of the raw video within a predetermined region of interest of the frame at regular vertical intervals to detect light colored blobs. The blob detection itself can be performed by looking for a rising edge followed by a falling edge.
  • a blob is accepted only if its average luminance is significantly greater than the surrounding road and its width is within the road marking min and max constraints. Accordingly, this must be adapted if edges of the road are considered instead.
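The blob search above can be sketched as follows. The thresholds, the helper name, and the use of the column's mean luminance as a stand-in for the "surrounding road" are illustrative assumptions, not values from the patent.

```python
def find_marking_blobs(luminance, rise=40, min_w=3, max_w=12):
    """Scan one image column for light blobs (candidate road-marking points).

    luminance: list of pixel luminance values along the column.
    A blob starts at a rising luminance edge and ends at the matching falling
    edge; it is kept only when its width lies within the expected marking
    width range and its mean luminance clearly exceeds the road baseline.
    """
    baseline = sum(luminance) / len(luminance)   # crude road-luminance estimate
    blobs, start = [], None
    for i in range(1, len(luminance)):
        if start is None and luminance[i] - luminance[i - 1] > rise:
            start = i                             # rising edge opens a blob
        elif start is not None and luminance[i - 1] - luminance[i] > rise:
            width = i - start                     # falling edge closes it
            mean = sum(luminance[start:i]) / width
            if min_w <= width <= max_w and mean > baseline + rise:
                blobs.append((start, i))
            start = None
    return blobs

# A dark road with one bright marking of width 5 pixels:
column = [30] * 10 + [200] * 5 + [30] * 10
print(find_marking_blobs(column))   # [(10, 15)]
```

Running this over several columns at regular vertical intervals yields the set of points later fitted with a line.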
  • a set of points is extracted from each of the imaged road markings.
  • the longitudinal road features are extracted from each of the four cameras. If no valid road marking (or edge of the road) is present in the frames (e.g. no road marking is present that spans more than one camera view, or the road marking is not valid for any other reason), the frame is ignored.
  • the steering angle of the vehicle can be used to ensure that the vehicle is travelling along a straight direction when online calibration shall be applied.
  • a rejection criterion can be defined for outliers such that a longitudinal road feature detected at an angle greater than a given threshold could be rejected.
  • the threshold can be predefined, possibly according to some experience collected in advance.
  • the constraint that the tracks are parallel to the x-axis of the vehicle could be relaxed by using the steering information or by having a range of predefined expected steering curvatures. If initial estimates of the camera extrinsic parameters are known (e.g. from vehicle mechanical data), they can be used as starting points for the calibration.
  • the speed and steering information possibly available on the vehicular network can also be exploited.
  • the criterion for rejecting extracted points as outliers using the velocity of the vehicle can be based on the fact that when a vehicle is travelling faster, it is more likely to be travelling parallel to the longitudinal road features like the markings or the edge of the road. In contrast, when a vehicle is travelling slower (e.g. at junctions or while manoeuvring), it is less likely to be travelling parallel to such features, and extracted points are more likely to be outliers.
  • Road-marking color information can also be considered as a rejection criterion, possibly but not necessarily in combination with the vehicle velocity. For example, if a green road blob is detected in areas where only white or yellow/orange markings are expected, it is highly likely that this is an erroneous detection and it should be rejected as an outlier.
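The rejection criteria above can be combined into one predicate, sketched below. All thresholds and the colour list are illustrative assumptions; the patent only states that feature angle, vehicle speed and marking colour can serve as rejection criteria.

```python
def accept_feature(angle_deg, speed_kmh, color,
                   max_angle=10.0, min_speed=30.0,
                   expected_colors=("white", "yellow", "orange")):
    """Heuristic outlier rejection for one extracted longitudinal feature.

    angle_deg: angle of the feature relative to the vehicle x-axis.
    speed_kmh: current vehicle speed from the vehicular network.
    color: detected blob colour.
    """
    if abs(angle_deg) > max_angle:    # not roughly parallel to travel direction
        return False
    if speed_kmh < min_speed:         # slow vehicle: likely junction/manoeuvre
        return False
    if color not in expected_colors:  # e.g. a green blob is not a marking
        return False
    return True

print(accept_feature(2.0, 80.0, "white"))   # True
print(accept_feature(2.0, 80.0, "green"))   # False
```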
  • the extracted longitudinal road features are then stored.
  • the point locations on the longitudinal road markings extracted from each of the frames are simply stored. These can then be transformed later by some error function E using T1 to T4.
  • alternatively, each point on each longitudinal road feature is transformed as it is extracted, using T1 to T4.
  • the transformed points are stored for use with the error function E.
  • the equation parameters can then be stored for use by the error function E.
  • the line fitting can be carried out using any suitable method (e.g. Hough or least-squares).
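A minimal closed-form least-squares fit, one of the suitable methods named above (a Hough transform would be an alternative):

```python
def fit_line(points):
    """Least-squares fit of y = m*x + c to ground-plane points of one marking.

    points: list of (x, y) tuples. Returns (m, c).
    """
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    c = (sy - m * sx) / n                           # intercept
    return m, c

# Points lying on y = 0.5x + 1 recover slope 0.5 and intercept 1:
print(fit_line([(0, 1), (2, 2), (4, 3)]))   # (0.5, 1.0)
```

The resulting (m, c) parameters are what the error function E compares between overlapping camera views.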
  • a single line or pair of lines can have multiple (up to infinitely many) sets of rotation parameters that cause them to align. Therefore, to find the correct set of rotation parameters, the error minimization will need a larger set of extracted road marking lines.
  • a set of lines corresponding to road markings is gathered over a period of time. Once enough lines have been gathered, the error minimization can be applied to determine the optimal rotation parameters.
  • Several appropriate methods can be used to determine if enough data has been gathered.
  • the frames are divided into regions. Only when a line has been extracted from each of these regions (or from a given percentage of the regions) is the calibration procedure pursued. Such a method offers the best potential for accurate results to be returned.
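The region-coverage check can be sketched as follows; the number of regions, the default required fraction, and the function name are assumptions for illustration.

```python
def enough_coverage(line_regions, n_regions=6, required_fraction=1.0):
    """Decide whether enough line data has been gathered for minimization.

    line_regions: region index (0 .. n_regions-1) of each extracted line.
    Returns True when lines have been seen in at least required_fraction
    of the n_regions frame regions.
    """
    covered = set(r for r in line_regions if 0 <= r < n_regions)
    return len(covered) >= required_fraction * n_regions

print(enough_coverage([0, 1, 2, 3, 4, 5]))   # True: every region was hit
print(enough_coverage([0, 0, 1, 2]))         # False: regions 3-5 missing
```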
  • a new set of rotation matrices R1 to R4 is determined using the error measure E and used to update the rotation matrix parameters R1 to R4 for each of the four cameras of the vehicle video system.
  • Figure 5 shows a top down or bird view obtained by combining the image frames of the four cameras and performing some appropriate transformation, possibly taking into account distortions arising e.g. from the use of fisheye camera lenses.
  • In the middle, the location of the vehicle 50 is drawn in black, while at the bottom and the top are displayed the image frames 51, 53 gathered by the cameras at the back and at the front of the vehicle, respectively.
  • At the sides are displayed the image frames 52, 54 gathered by the respective left and right wing mirror cameras.
  • the four image frames 51 to 54 have some overlapping areas 55 to 58 between two adjacent image frames. Also shown are the identified longitudinal road features (in the present case two road markings, possibly but not necessarily edgelines) which in figure 5 do not match between the different image frames.
  • the two lines 51' and 51" identified within the image frame 51 taken by the rear camera (see figure 2) do not match the lines 52' and 54' identified within the image frames 52 and 54 of the respective wing mirror cameras in the corresponding overlapping areas 55 and 58.
  • A similar situation is shown for the lines 53' and 53", this time identified within the image frame 53 taken by the front camera, which also do not match the lines 52' and 54', now in the overlapping areas 56 and 57.
  • Figure 5 illustrates how the extracted lines do not align when transformed using T1 to T4 with an incorrect choice of the rotation parameters for the cameras.
  • This misalignment of extracted lines between image frames is used as a measure of error. Any suitable measure of misalignment of the extracted lines can be used.
  • the differences in the slope m and the y-intercept c are used as a measure of the misalignment after fitting a line to the transformed points.
  • once transformed, the longitudinal road features are parallel to the direction of travel of the vehicle (i.e. parallel to the x-axis, see fig. 1). Therefore, the parallelism of the lines to the vehicle x-axis can also be used as an additional error measure. Furthermore, given a set of line data extracted from the longitudinal road features over a period of time, the total error can be computed as the sum of the errors in alignment.
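The combined error measure can be sketched as below. The equal weighting of the continuity and parallelism terms, and the squared-error form, are assumptions, since the patent does not specify them.

```python
def alignment_error(matched_lines):
    """Total misalignment error over the gathered line data.

    matched_lines: list of ((m1, c1), (m2, c2)) pairs, each pair being the
    same longitudinal feature as fitted in two overlapping transformed views.
    Error terms: slope and intercept disagreement between the two views
    (continuity), plus deviation of each slope from zero (a correctly
    transformed feature is parallel to the vehicle x-axis).
    """
    total = 0.0
    for (m1, c1), (m2, c2) in matched_lines:
        total += (m1 - m2) ** 2 + (c1 - c2) ** 2   # continuity across views
        total += m1 ** 2 + m2 ** 2                  # parallelism to x-axis
    return total

# Perfectly aligned, axis-parallel lines give zero error:
print(alignment_error([((0.0, 1.5), (0.0, 1.5))]))   # 0.0
```

This is the scalar E that the rotation-parameter search drives towards zero.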
  • any offset between the lines 51', 52' and 53' as well as between 51", 54' and 53" can be detected and used for defining an online calibration of the cameras of the vehicle video system.
  • As offset, a difference (deviation) in the slope of the identified lines 51', 51", 52', 53', 53", 54' can be measured and used for the calibration.
  • An error function expressing such offset is defined and minimized using some appropriate algorithm like Nelder-Mead or Levenberg-Marquardt. For the present case, with three parameters for each camera, i.e. in total 12 parameters to take into account during the minimization procedure, Levenberg-Marquardt may offer the better convergence.
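The patent names Nelder-Mead and Levenberg-Marquardt as suitable minimizers; the tiny gradient-free coordinate search below is only a stand-in to illustrate the minimizer's role over the rotation parameters, not either of those algorithms.

```python
def minimise(error, params, step=0.1, iters=200):
    """Gradient-free coordinate descent on a calibration error function.

    error: callable mapping a parameter list to a scalar error.
    params: initial rotation-parameter guesses (e.g. 12 angles for 4 cameras).
    """
    params = list(params)
    best = error(params)
    for _ in range(iters):
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                e = error(trial)
                if e < best:                 # keep any move that lowers E
                    params, best, improved = trial, e, True
        if not improved:
            step *= 0.5                      # refine once no move helps
            if step < 1e-6:
                break
    return params, best

# Toy stand-in error with a known minimum at (0.3, -0.2):
err = lambda p: (p[0] - 0.3) ** 2 + (p[1] + 0.2) ** 2
params, best = minimise(err, [0.0, 0.0])
print(round(params[0], 2), round(params[1], 2))   # 0.3 -0.2
```

In practice a Levenberg-Marquardt implementation would replace this search, as the patent suggests for the 12-parameter case.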
  • Figure 6 shows a top down or bird view similar to figure 5 with the effect of the online calibration performed on the vehicle video system, i.e. with correct rotation parameters for the cameras of the vehicle video system.
  • the vehicle 60 is surrounded by four image frames 61 to 64 from the four cameras.
  • the two identified lines 69, 69', one at the left and the other at the right of the vehicle 60 and parallel to its longitudinal axis, now match between adjacent image frames.
  • one or more aspects of the present invention can be included in an article of manufacture (e.g., one or more computer program products) having, for instance, computer usable media.
  • the media has embodied therein, for instance, computer readable program code means for providing and facilitating the capabilities of the present invention.
  • the article of manufacture can be included as a part of a computer system or sold separately.
  • at least one program storage device readable by a machine, tangibly embodying at least one program of instructions executable by the machine to perform the capabilities of the present invention can be provided.

Abstract

The invention relates to a method for online calibration of a vehicle video system evaluated from image frames containing longitudinal road features. Adjacent portions of a road surface are captured by at least two cameras of the vehicle video system. Longitudinal road features are identified within the image frames. Identified longitudinal road features are selected within at least two image frames respectively taken by two cameras as matching a single line between the two image frames. An analysis of the matching of the single line is performed by determining an offset of the line between the two image frames. The consequently determined offset of the line is applied for a calibration of the cameras of the vehicle video system.

Description

ONLINE VEHICLE CAMERA CALIBRATION BASED ON
CONTINUITY OF FEATURES
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
[0001] This invention relates generally to an online calibration of a vehicle video system, and particularly to a method for online calibration of the vehicle video system evaluated from image frames of a camera. It is also related to a computer program product for processing data relating to the online calibration of the vehicle video system, the computer program product comprising a computer usable medium having a computer usable program code embodied therewith, the computer program code being configured to perform any of the method steps. The present invention is further related to an online calibration system for a vehicle video system for processing the computer implemented online calibration method.
DESCRIPTION OF BACKGROUND
[0002] It is known to mount image capture devices, such as, for example, digital or analogue video cameras, on a motor vehicle in order to produce a video image of an aspect of the environment exterior of the vehicle. For example, in order to assist in parking and manoeuvring a vehicle in confined spaces, it is known to mount such image capturing devices on respective opposite sides of the vehicle, for example, on side rear view mirror housings which extend sidewardly from the driver and front passenger doors of the vehicle. The image capture devices are mounted in the side rear view mirror housings with the field of view of the image capture devices directed downwardly towards the ground for capturing plan view images of the ground on respective opposite sides of the vehicle adjacent the vehicle. Typically, a visual display unit is located in the vehicle, either in or on the dashboard, or in a location corresponding to that of a conventional interiorly mounted rear view mirror. Nowadays, head-up displays are also used in vehicles. Even a projection onto the windscreen is now possible. When a driver is undertaking a parking manoeuvre or a manoeuvre in a confined space, a plan view image of the vehicle with the respective plan view images of the ground on respective opposite sides of the vehicle can be displayed on the visual display unit. The plan view display of the vehicle and the ground on respective opposite sides of the vehicle assists the driver in parking, and in particular, carrying out a parking manoeuvre for parking the vehicle in a parking space parallel to a kerb of a footpath or the like.
[0003] However, in order that the view images of the ground accurately reflect the positions of objects relative to the vehicle, which are captured in the images, it is essential that the view images of the ground juxtapositioned with the view image of the vehicle should accurately represent a top view of the ground adjacent the respective opposite sides of the vehicle exactly as would be seen when viewed from above. In other words, the edges of the respective view images of the ground which extend along the sides of the view image of the vehicle must correspond directly with the edge of the ground along the sides of the vehicle when viewed from a position above the vehicle. Otherwise, the positions of objects in the respective view images of the ground will not be accurately positioned relative to the vehicle. For example, if the edge of one of the view images of the ground adjacent the corresponding side of the view image of the vehicle corresponds with a portion of a view of the ground which is spaced apart from the side of the vehicle, then the positions of objects in the view image of the ground will appear closer to the vehicle in the image than they actually are. Conversely, if one of the image capture devices is mounted on a side mirror housing so that an image of a portion of the ground beneath a side of the vehicle is captured, the positions of objects captured in the view image will appear farther away from the vehicle than they actually are, with disastrous results, particularly if a driver is parking the vehicle parallel to a wall or bollards. Similar requirements apply also for front or rear placed image capture devices. Often, the most obvious effect of poor calibration happens when a merged image is presented to the user and one of the cameras is not correctly calibrated. In this case an object/feature on the ground can appear disjointed or elongated or disappear completely from the view presented to the user.
[0004] Accordingly, it is essential that the view images of the ground when displayed on the visual display screen juxtapositioned along with the plan view image of the vehicle must be representative of plan views of the ground on respective opposite sides of the vehicle exactly as would be seen from a top plan view of the vehicle and adjacent ground. In order to achieve such accuracy, the image capture devices would have to be precision mounted on the vehicle. In practice this is not possible. Accordingly, in order to achieve the appropriate degree of exactness and accuracy of the plan view images of the ground relative to the plan view image of the vehicle, it is necessary to calibrate the outputs of the image capture devices. Calibration values determined during calibration of the image capture devices are then used to correct subsequently captured image frames for offset of the image capture devices from ideal positions thereof, so that plan view images of the ground subsequently outputted for display with the plan view image of the vehicle are exact representations of the ground on respective opposite sides of the vehicle. Such calibration can be accurately carried out in a factory during production of the motor vehicle. The calibration may also use the absolute position/rotation of the camera.
Typically, the image capture devices are relatively accurately fitted in the side mirror housings of the motor vehicle, and by using suitable grid patterns on the ground, calibration can be effected. However, the environments in which motor vehicles must operate are generally relatively harsh, in that side mirror housings are vulnerable to impacts with other vehicles or stationary objects. While such impacts may not render the orientation of the side mirror housing unsuitable for producing an adequate rear view from a rear view mirror mounted therein, such impacts can and in general do result in the image capturing device mounted therein being knocked out of alignment, in other words, being offset from its ideal position. Additionally, where a vehicle is involved in a crash, or alternatively, where a side mirror housing requires replacement, re-calibration of the image capture device refitted in the new side mirror housing will be required. Such re-calibration, which typically would be carried out using a grid pattern on the ground, is unsatisfactory, since in general it is impossible to accurately align the vehicle with the grid pattern in order to adequately calibrate the image capture device, unless the calibration is carried out under factory conditions. The same applies for rear or front mounted image capture devices, even where those devices are placed in the interior of the vehicle, since harsh conditions apply there also.
[0005] WO2009/027090 describes a method and system for online calibration of a vehicle video system using vanishing points evaluated from frames of a camera image containing identified markings or edges on a road. US2010/0110194 describes a method for producing a time sequence of images of motor vehicle surroundings from a virtual elevated central perspective on a display, wherein several cameras are used to record time sequences of subimage data records from a plurality of real perspectives, differently offset and tilted with respect to the virtual perspective. Recurrently, a calibration is performed using a prescribed quality criterion for the overall image. The quality criterion is stipulated such that it reflects the quality of the overall image.
SUMMARY OF THE INVENTION
[0006] In view of the above, there is a need for a method and a calibration system for calibrating an output of an image capture device or camera mounted on a vehicle to compensate for offset of the camera from an ideal position, such method and calibration system being based on a simple use of available longitudinal road features without requiring a calibration of the camera to be carried out in a controlled environment.
[0007] According to a first aspect of the embodiment of the present invention, a method for online calibration of a vehicle video system is achieved by evaluating image frames of a camera containing longitudinal road features. The method comprises the steps of capturing, in image frames by at least two cameras of the vehicle video system, adjacent portions of the road surface. Longitudinal road features are identified within the image frames. Identified longitudinal road features are selected within at least two image frames respectively taken by two cameras as matching a single line between the two image frames. An analysis of the matching of the single line is performed by determining an offset of the line between the two image frames. The determined offset of the line is then applied for a calibration of the camera. [0008] Advantageously, the determined offset is applied as an error measure to be minimised when adjusting rotation parameters used for the calibration of the cameras. In an embodiment according to the invention, the two considered image frames comprise an overlapping area. In an alternative, more than two image frames, captured respectively by more than two adjacent cameras of the vehicle video system, are considered, the image frames comprising some overlapping area between each other.
[0009] In some embodiments according to the invention, selected longitudinal road features corresponding to the matching line are stored over a period of time. This is used for a determination of a deviation of the points, applied as an error measure to be minimised when adjusting rotation parameters used for the calibration of the camera.
[0010] Advantageously, longitudinal road features are selected corresponding to at least two matching lines. In some embodiments, the calibration is performed for a vehicle video system adapted for displaying a virtual bird view of the adjacent environment of the vehicle by transforming each image frame from the cameras to a top down view to be merged into a single bird view.
[0011] The steering angle of the vehicle can be considered when selecting longitudinal road features as lines. This allows applying the present online calibration method also when the vehicle is not moving straight ahead.
[0012] It is also possible to analyse the selected longitudinal road features so that they are rejected as outliers if they fulfil certain criteria. Such identified longitudinal road features can be rejected as outliers if they correspond to a line with an angle of curvature greater than a predefined value.
[0013] According to a second aspect of the embodiment, a computer program product for processing data relating to online calibration of a vehicle video system is also described and claimed herein. The computer program product comprises a computer usable medium having computer usable program code embodied therewith, the computer usable program code being configured to perform the above summarized methods. [0014] According to a third aspect of the embodiment, an online calibration system for a vehicle video system is also described and claimed herein. The online calibration system comprises a computer program product for processing data relating to an online calibration method and an image processing apparatus with a camera for taking image frames to be used by the online calibration method, so as to perform the above summarized methods.
[0015] Additional features and advantages are realized through the techniques of the present invention. Other embodiments and aspects of the invention are described in detail herein and are considered a part of the claimed invention. For a better understanding of the invention with advantages and features, refer to the description and to the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other objects, features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
[0017] FIG. 1 illustrates a side view and a top view of a vehicle with a 3D co-ordinate system;
[0018] FIG. 2 illustrates a side view of a vehicle with cameras of the vehicle video system;
[0019] FIG. 3 illustrates a schematic view of a vehicle video system for a bird view display;
[0020] FIG. 4 illustrates a flow diagram according to the invention;
[0021] FIG. 5 illustrates a bird view display according to the invention; [0022] FIG. 6 illustrates a corrected bird view display according to the invention.
[0023] The detailed description explains the preferred embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
DETAILED DESCRIPTION OF THE INVENTION
[0024] As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
[0025] Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. [0026] A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
[0027] Program code embodied on a computer readable medium may be transmitted within the vehicle using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
[0028] Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the vehicle's computer, partly on the vehicle's computer, as a stand-alone software package, partly on the vehicle's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the vehicle's computer through any type of wireless network, including a wireless local area network (WLAN), possibly but not necessarily through the Internet using an Internet Service Provider.
[0029] Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0030] These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
[0031] The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0032] The present invention provides a means of calibrating a camera on a vehicle to determine the extrinsic camera rotation parameters relative to the vehicle co-ordinate system when the camera is positioned in an undetermined position on the vehicle body.
Calibration allows the vehicle manufacturer to provide a geometrically representative and more importantly a useful view to a vehicle user. Additionally, by extension, the height of the camera off the ground plane (XY-plane) can be determined.
[0033] Turning now to the drawings in greater detail, Figure 1 shows a side view and a top view of a vehicle 1 with a defined three dimensional coordinate system x, y, z as used in the following. The x-axis is chosen along the longitudinal direction, the y-axis along the transverse direction and the z-axis along the vertical direction of the vehicle 1. [0034] Figure 2 shows a side view of the vehicle 1 together with the x, y, z coordinate system and three cameras 21-23 of the vehicle video system. One camera 21 is placed at the rear of the vehicle, another 22 at the side, usually at the wing mirror, and another 23 at the front. A further camera is placed on the other side of the vehicle and is accordingly not visible in figure 2. All the cameras of the vehicle video system capture a significant portion of the road surface in the scene. Such portions of road surface are stored in image frames from which longitudinal road features such as markings or road edges are identified to be selected.
[0035] Figure 3 shows a schematic view of the vehicle video system with four cameras as in fig. 2. Four video streams are obtained from the four cameras C1 to C4 to be merged into a bird view. Each frame from the cameras is transformed to a top down view (T1 to T4). The transforms are functions of the camera rotation matrices (R1 to R4). Additionally, the transforms are functions of the position of each camera in the reference coordinate system and (if applicable) the fisheye function that describes the intrinsic parameters of the camera (see, for example, http://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function, though any appropriate function can be used). However, these additional parameters are not determined by the online calibration, and can therefore be considered constant within the transforms T1 to T4. The rotation matrices (R1 to R4) are 3x3 matrices that can typically be generated from three rotation angles (see http://en.wikipedia.org/wiki/Euler_angles#Matrix_notation; the angles can follow the x-y-z, z-x-z or any other appropriate convention). The merge function M combines the four transformed frames into a single bird view (or top view) for display to the driver.
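As a concrete illustration of generating one of the rotation matrices R1 to R4 from three rotation angles, the following Python sketch composes the 3x3 matrix in x-y-z order; the specific angle convention and function name are assumptions for illustration, since the text leaves the convention open:

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Build a 3x3 rotation matrix from three Euler angles (radians),
    composed here as a rotation about x, then y, then z."""
    cx, sx = np.cos(roll), np.sin(roll)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cz, sz = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx  # apply x-rotation first, z-rotation last
```

Any such matrix is orthonormal with determinant 1, which is a useful sanity check when updating R1 to R4 after calibration.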
[0036] Figure 4 shows a flow diagram according to the present invention. The online calibration method is started at some stage while the vehicle is driving. Such a starting event can be programmed in different ways, e.g. after a predefined time of driving, after travelling some predefined distance, or based on a combination of several parameters. Some initial parameters of the vehicle video system can be loaded first. Alternatively or in combination, some parameters from a previous calibration (R1 to R4 being the previous rotation matrix parameters for each camera) can be loaded to be considered for the newly started calibration procedure.
[0037] Then, some points on the longitudinal road features such as road markings (e.g. broken lines or edge lines) or road edges are extracted from each of the frames. Any adequate method can be used here to extract those longitudinal road features. One possibility is to analyse several columns of the raw video within a predetermined region of interest of the frame at regular vertical intervals to detect light coloured blobs. The blob detection itself can be performed by looking for a rising edge followed by a falling edge. For a road marking, a blob is accepted only if its average luminance is significantly greater than that of the surrounding road and its width is within the road marking minimum and maximum constraints. Accordingly, this must be adapted if edges of the road are considered instead. A set of points is extracted from each of the imaged road markings. The longitudinal road features are extracted from each of the four cameras. If no valid road marking (or road edge) is present in the frames (e.g. no road marking spans more than one camera view, or the road marking is not valid for any other reason), the frame is ignored.
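The blob detection just described can be sketched as follows; the function name, thresholds and default width constraints are illustrative assumptions, not values from the text:

```python
import numpy as np

def detect_marking_blobs(column, road_level, min_width=3, max_width=20, contrast=40):
    """Scan one image column for light blobs: a rising edge followed by a
    falling edge, accepted only if the blob is significantly brighter than
    the surrounding road and its width is plausible for a road marking.
    Returns a list of (centre_row, width) tuples."""
    blobs = []
    inside, start = False, 0
    thresh = road_level + contrast
    for i, v in enumerate(column):
        if not inside and v >= thresh:      # rising edge
            inside, start = True, i
        elif inside and v < thresh:         # falling edge
            width = i - start
            if min_width <= width <= max_width:
                blob = column[start:i]
                # average luminance must clearly exceed the road level
                if blob.mean() > road_level + contrast:
                    blobs.append((start + width // 2, width))
            inside = False
    return blobs
```

Running this over several columns at regular vertical intervals yields the set of feature points per frame.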
[0038] Advantageously, the steering angle of the vehicle can be used to ensure that the vehicle is travelling along a straight direction when the online calibration is to be applied. Additionally, a rejection criterion can be defined for outliers such that a longitudinal road feature detected with an angle greater than a given threshold is rejected. The threshold can be predefined, possibly according to experience collected in advance. Alternatively, the constraint that the tracks are parallel to the x-axis of the vehicle could be relaxed by using the steering information or by having a range of predefined expected steering curvatures. If initial estimates of the camera extrinsic parameters are known (e.g. from vehicle mechanical data), they can be used as starting points for the calibration. Advantageously, the speed and steering information possibly available on the vehicular network (e.g. via CAN, LIN, wireless or other) can be used when transforming the extracted points to remove (relax) the necessity for the vehicle to be moving in a straight line and at constant speed. Furthermore, if the speed of the vehicle is considered, then the actual physical distance between the extracted points from the longitudinal road features can be computed; combined with a triangulation method, this would allow determining the height of the camera.
[0039] The criteria for rejecting extracted points as outliers using the velocity of the vehicle can be based on the fact that when a vehicle is travelling faster, it is more likely to be travelling parallel to the longitudinal road features such as the markings or the edge of the road. In contrast, when a vehicle is travelling slower (e.g. at junctions, roundabouts, etc.) it is likely that the road features captured by the camera are not actually parallel to the direction of the vehicle motion (i.e. parallel to the vehicle x-axis). Road-marking colour information can also be used as a rejection criterion, possibly but not necessarily in combination with the vehicle velocity. For example, if a green road blob is detected in areas where only white or yellow/orange markings are expected, it is highly likely that this is an erroneous detection and it should be rejected as an outlier.
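The speed-, steering- and angle-based rejection described above could be combined into a single predicate as follows; all threshold values are hypothetical placeholders chosen purely for illustration:

```python
def is_outlier(line_angle_deg, vehicle_speed_kmh, steering_angle_deg,
               max_angle_deg=5.0, min_speed_kmh=30.0, max_steer_deg=2.0):
    """Reject an extracted feature line as an outlier when the vehicle is
    slow (junctions, roundabouts), is turning, or the line deviates too
    far from the vehicle longitudinal axis (x-axis)."""
    if vehicle_speed_kmh < min_speed_kmh:   # slow: features likely not parallel
        return True
    if abs(steering_angle_deg) > max_steer_deg:  # vehicle not driving straight
        return True
    return abs(line_angle_deg) > max_angle_deg   # line too skewed vs. x-axis
```

In practice the thresholds would be tuned from data gathered in advance, as the text suggests.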
[0040] The extracted longitudinal road features are then stored. Several alternatives can be used for this purpose. For example, the point locations on the longitudinal road markings extracted from each of the frames are simply stored. These can then be transformed later by some error function E using T1 to T4. Alternatively, each point on each longitudinal road feature is transformed as it is extracted using T1 to T4. Then the transformed points are stored for use with the error function E. A third possibility involves transforming each point of each longitudinal road feature from each frame using T1 to T4 and then fitting the equation of a line (e.g. y = mx + c, where m is the slope of the line and c is the y-axis intercept) to the transformed points. The equation parameters can then be stored for use by the error function E. The line fitting can be carried out using any suitable method (e.g. Hough transform or least squares).
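The third storage option, fitting y = mx + c to the transformed points by least squares, can be sketched as below; the helper name and the use of np.polyfit are illustrative assumptions:

```python
import numpy as np

def fit_line(points):
    """Least-squares fit of y = m*x + c to a list of transformed
    (x, y) feature points; returns the parameters (m, c)."""
    xs, ys = np.asarray(points, dtype=float).T
    m, c = np.polyfit(xs, ys, 1)  # degree-1 polynomial fit
    return m, c
```

The stored (m, c) pairs per camera are then the inputs to the error function E.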
[0041] A single line or pair of lines can have multiple (up to infinitely many) sets of rotation parameters that cause them to align. Therefore, to find the correct set of rotation parameters, the error minimization needs a larger set of extracted road marking lines. Prior to performing the error minimization, a set of lines corresponding to road markings is gathered over a period of time. Once enough lines have been gathered, the error minimization can be applied to determine the optimal rotation parameters. Several appropriate methods can be used to determine whether enough data has been gathered. Advantageously, the frames are divided into regions, and the calibration procedure is pursued only when a line has been extracted from each of these regions (or from a given percentage of the regions). Such a method offers the best potential for accurate results. Finally, a new set of rotation matrices R1 to R4 is determined using the error measure E and used to update the rotation matrix parameters R1 to R4 for each of the four cameras of the vehicle video system.
[0042] Figure 5 shows a top down or bird view obtained by combining the image frames of the four cameras and performing an appropriate transformation, possibly taking into account distortions arising e.g. from the use of fisheye camera lenses. In the middle, the location of the vehicle 50 is drawn in black, while at the bottom and the top are displayed the image frames 51, 53 gathered by the cameras at the back and at the front of the vehicle respectively. At the left and at the right of the vehicle 50 are displayed the image frames 52, 54 gathered by the respective left and right wing mirror cameras.
Clearly visible from figure 5 is the fact that the four image frames 51 to 54 have some overlapping areas 55 to 58 between adjacent image frames. Also shown are the identified longitudinal road features (in the present case two road markings, possibly but not necessarily edge lines) which in figure 5 do not match between the different image frames. The two lines 51' and 51" identified within the image frame 51 taken by the rear camera (see figure 2) do not match the lines 52' and 54' identified within the image frames 52 and 54 of the respective wing mirror cameras in the corresponding overlapping areas 55 and 58. A similar situation is shown for the lines 53' and 53", this time identified within the image frame 53 taken by the front camera, which also do not match the lines 52' and 54', now in the overlapping areas 56 and 57.
[0043] Figure 5 illustrates how the extracted lines do not align when transformed using T1 to T4 with an incorrect choice of the rotation parameters for the cameras. This misalignment of extracted lines between image frames is used as a measure of error. Any suitable measure of misalignment of the extracted lines can be used. Advantageously, the differences in the slope m and the y-intercept c (see the equation above) are used as a measure of the misalignment after fitting a line to the transformed points. Additionally, if the vehicle is travelling in a straight line, the longitudinal road features (markings or edges) are parallel to the direction of travel of the vehicle (i.e. parallel to the x-axis, see fig. 1) once transformed. Therefore, the parallelism of the lines to the vehicle x-axis can also be used as an additional error measure. Furthermore, given a set of line data extracted from the longitudinal road features over a period of time, the total error can be computed as the sum of the errors in alignment.
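Combining the matching error (differences in slope and intercept between adjacent views) with the parallelism penalty, the total error might be sketched as follows; the function signature and the weighting of the parallelism term are assumptions for illustration:

```python
def alignment_error(lines_a, lines_b, parallel_weight=1.0):
    """Total misalignment between matched lines from two adjacent camera
    views. Each line is a (slope, intercept) pair. Sums the slope and
    intercept differences per matched pair, plus each line's own slope as
    a parallelism penalty (straight-ahead driving implies lines parallel
    to the vehicle x-axis, i.e. slope close to zero)."""
    err = 0.0
    for (ma, ca), (mb, cb) in zip(lines_a, lines_b):
        err += abs(ma - mb) + abs(ca - cb)             # matching error
        err += parallel_weight * (abs(ma) + abs(mb))   # parallelism error
    return err
```

Summed over all line pairs gathered over time, this plays the role of the error measure E.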
[0044] By applying the method according to the invention, any offset between the lines 51', 52' and 53' as well as between 51", 54' and 53" can be detected and used for defining an online calibration of the cameras of the vehicle video system. As offset, a difference (deviation) in the slope of the identified lines 51', 51", 52', 53', 53", 54' can be measured and used for the calibration. An error function expressing such offset is defined and minimized using an appropriate algorithm such as Nelder-Mead or Levenberg-Marquardt. For the present case, with three rotation parameters per camera, i.e. twelve parameters in total to take into account during the minimization procedure, Levenberg-Marquardt may offer the better convergence.
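To give a flavour of the minimization step, the sketch below recovers a single hypothetical yaw offset of one camera by a coarse-to-fine search over a mock error function; a real system would instead optimize all twelve rotation parameters jointly, e.g. with Nelder-Mead or Levenberg-Marquardt (the helper name and the 1.5 degree offset are assumptions for illustration):

```python
import numpy as np

def refine_angle(error_fn, lo, hi, iterations=6, samples=21):
    """Coarse-to-fine 1-D search minimising error_fn over one rotation
    angle, as a stand-in for the full multi-parameter optimisation."""
    for _ in range(iterations):
        grid = np.linspace(lo, hi, samples)
        best = grid[np.argmin([error_fn(a) for a in grid])]
        step = (hi - lo) / (samples - 1)
        lo, hi = best - step, best + step  # zoom in around the minimum
    return best

# Mock error: the line misalignment vanishes when the yaw estimate hits a
# hypothetical 1.5 degree mounting offset.
true_offset = np.radians(1.5)
err = lambda a: (a - true_offset) ** 2
estimate = refine_angle(err, np.radians(-10), np.radians(10))
```

With real data, error_fn would transform the stored feature lines by the candidate rotation matrices and return the total alignment error instead of this quadratic mock.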
[0045] Figure 6 shows a top down or bird view similar to figure 5, with the effect of the online calibration performed on the vehicle video system, i.e. with correct rotation parameters for the cameras of the vehicle video system. The vehicle 60 is surrounded by four image frames 61 to 64 from the four cameras. The two identified lines 69, 69', one at the left and the other at the right of the vehicle 60 and parallel to its longitudinal direction, now match well within the overlapping areas 65, 66, 67, 68.
[0046] The capabilities of the present invention can be implemented in software, firmware, hardware or some combination thereof.
[0047] As one example, one or more aspects of the present invention can be included in an article of manufacture (e.g., one or more computer program products) having, for instance, computer usable media. The media has embodied therein, for instance, computer readable program code means for providing and facilitating the capabilities of the present invention. The article of manufacture can be included as a part of a computer system or sold separately. Additionally, at least one program storage device readable by a machine, tangibly embodying at least one program of instructions executable by the machine to perform the capabilities of the present invention can be provided.
[0048] While the preferred embodiment to the invention has been described, it will be understood that those skilled in the art, both now and in the future, may make various improvements and enhancements which fall within the scope of the claims which follow. These claims should be construed to maintain the proper protection for the invention first described.

Claims

1. A method for online calibration of a vehicle video system evaluated from image frames of a camera containing longitudinal road features, the method comprising the following steps of:
• Capturing, in image frames (51, 52, 53, 54, 61, 62, 63, 64) by at least two cameras (21, 22, 23) of the vehicle video system, adjacent portions of the road surface;
• Identifying longitudinal road features (51', 51", 52', 53', 54') within the image frames;
The method being characterised by the further steps of
• Selecting identified longitudinal road features within at least two image frames respectively taken by two cameras as matching a single line between the two image frames;
• Analysing the matching of the single line by determining an offset of the line between the two image frames;
• Applying the determined offset of the line for a calibration of the camera.
2. The online calibration method according to claim 1 whereby the determined offset is applied as error measure to be minimised when adjusting rotation parameters used for the calibration of the cameras.
3. The online calibration method according to claim 1 whereby the two considered image frames comprise an overlapping area.
4. The online calibration method according to claim 1 whereby selected longitudinal road features corresponding to the matching line are stored over a period of time for a determination of a deviation of the points applied as error measure to be minimised when adjusting rotation parameters used for the calibration of the camera.
5. The online calibration method according to claim 1 whereby longitudinal road features corresponding to at least two matching lines are selected.
6. The online calibration method according to claim 1 whereby the calibration is performed for a vehicle video system adapted for displaying a virtual bird view of the adjacent environment of the vehicle by transforming each image frame from the cameras to a top down view to be merged into a single bird view.
7. The online calibration method according to claim 1 whereby steering angle of the vehicle is used when selecting longitudinal road features as lines.
8. The online calibration method according to claim 1 whereby the selected longitudinal road features are analysed to be rejected as outliers if fulfilling certain criteria.
9. The online calibration method according to claim 8 whereby rejecting the selected identified longitudinal road features if corresponding to a line with an angle curvature greater than a predefined value.
10. A computer program product for processing data relating to online calibration of a vehicle video system, the computer program product comprising a computer usable medium having computer usable program code embodied therewith, the computer usable program code being configured to perform the steps of any of the preceding claims 1 to 9.
11. An online calibration system for a vehicle video system, the online calibration system comprising a computer program product for processing data relating to an online calibration method and an image processing apparatus with a camera for taking image frames to be used by the online calibration method, so as to perform the steps of any of the preceding claims 1 to 10.
PCT/EP2011/056107 2011-04-18 2011-04-18 Online vehicle camera calibration based on continuity of features WO2012143036A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2011/056107 WO2012143036A1 (en) 2011-04-18 2011-04-18 Online vehicle camera calibration based on continuity of features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2011/056107 WO2012143036A1 (en) 2011-04-18 2011-04-18 Online vehicle camera calibration based on continuity of features

Publications (1)

Publication Number Publication Date
WO2012143036A1 true WO2012143036A1 (en) 2012-10-26

Family

ID=44626040

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/056107 WO2012143036A1 (en) 2011-04-18 2011-04-18 Online vehicle camera calibration based on continuity of features

Country Status (1)

Country Link
WO (1) WO2012143036A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070008091A1 (en) * 2005-06-09 2007-01-11 Hitachi, Ltd. Method and system of monitoring around a vehicle
WO2009027090A2 (en) 2007-08-31 2009-03-05 Valeo Schalter Und Sensoren Gmbh Method and system for online calibration of a video system
US20100110194A1 (en) 2008-10-24 2010-05-06 Euler Christian Method For Automatically Calibrating A Virtual Camera System

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Anderson Ribeiro et al.: "Automatic camera calibration for driver assistance systems", Proc. International Conference on Systems, Signals and Image Processing, 21 September 2006 (2006-09-21), XP055015132. Retrieved from the Internet: <URL:http://www.inf.ufrgs.br/~crjung/research_arquivos/paper_1123.pdf> [retrieved on 2011-12-19] *
Collado, J. M. et al.: "Self-calibration of an On-Board Stereo-vision System for Driver Assistance Systems", Intelligent Vehicles Symposium, 2006 IEEE, Meguro-ku, Japan, 13-15 June 2006, Piscataway, NJ, USA, IEEE, 13 June 2006 (2006-06-13), pages 156-162, XP010937006, ISBN: 978-4-901122-86-3, DOI: 10.1109/IVS.2006.1689621 *
Navab, N. et al.: "The critical sets of lines for camera displacement estimation: A mixed Euclidean-projective and constructive approach", Computer Vision, 1993, Proceedings, Fourth International Conference on, Berlin, Germany, 11-14 May 1993, Los Alamitos, CA, USA, IEEE Computer Society, 11 May 1993 (1993-05-11), pages 713-723, XP010128552, ISBN: 978-0-8186-3870-1, DOI: 10.1109/ICCV.1993.378143 *
Pflugfelder, R. et al.: "Online Auto-Calibration in Man-Made Worlds", Digital Image Computing: Techniques and Applications, 2005, DICTA '05, Proceedings, Queensland, Australia, 6-8 December 2005, Piscataway, NJ, USA, IEEE, 6 December 2005 (2005-12-06), page 75, XP010943811, ISBN: 978-0-7695-2467-2, DOI: 10.1109/DICTA.2005.63 *
Yan Jiang et al.: "Self-calibrated multiple-lane detection system", Position Location and Navigation Symposium (PLANS), 2010 IEEE/ION, IEEE, Piscataway, NJ, USA, 4 May 2010 (2010-05-04), pages 1052-1056, XP031707108, ISBN: 978-1-4244-5036-7 *

Cited By (96)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10509972B2 (en) 2004-12-23 2019-12-17 Magna Electronics Inc. Vehicular vision system
US9014904B2 (en) 2004-12-23 2015-04-21 Magna Electronics Inc. Driver assistance system for vehicle
US9193303B2 (en) 2004-12-23 2015-11-24 Magna Electronics Inc. Driver assistance system for vehicle
US9940528B2 (en) 2004-12-23 2018-04-10 Magna Electronics Inc. Driver assistance system for vehicle
US11308720B2 (en) 2004-12-23 2022-04-19 Magna Electronics Inc. Vehicular imaging system
US11328447B2 (en) 2007-08-17 2022-05-10 Magna Electronics Inc. Method of blockage determination and misalignment correction for vehicular vision system
US11908166B2 (en) 2007-08-17 2024-02-20 Magna Electronics Inc. Vehicular imaging system with misalignment correction of camera
US9972100B2 (en) 2007-08-17 2018-05-15 Magna Electronics Inc. Vehicular imaging system comprising an imaging device with a single image sensor and image processor for determining a totally blocked state or partially blocked state of the single image sensor as well as an automatic correction for misalignment of the imaging device
US10726578B2 (en) 2007-08-17 2020-07-28 Magna Electronics Inc. Vehicular imaging system with blockage determination and misalignment correction
US10868974B2 (en) 2010-12-01 2020-12-15 Magna Electronics Inc. Method for determining alignment of vehicular cameras
US11553140B2 (en) 2010-12-01 2023-01-10 Magna Electronics Inc. Vehicular vision system with multiple cameras
US10654423B2 (en) 2011-04-25 2020-05-19 Magna Electronics Inc. Method and system for dynamically ascertaining alignment of vehicular cameras
US10202077B2 (en) 2011-04-25 2019-02-12 Magna Electronics Inc. Method for dynamically calibrating vehicular cameras
US11554717B2 (en) 2011-04-25 2023-01-17 Magna Electronics Inc. Vehicular vision system that dynamically calibrates a vehicular camera
US10640041B2 (en) 2011-04-25 2020-05-05 Magna Electronics Inc. Method for dynamically calibrating vehicular cameras
US11007934B2 (en) 2011-04-25 2021-05-18 Magna Electronics Inc. Method for dynamically calibrating a vehicular camera
US10919458B2 (en) 2011-04-25 2021-02-16 Magna Electronics Inc. Method and system for calibrating vehicular cameras
US9357208B2 (en) 2011-04-25 2016-05-31 Magna Electronics Inc. Method and system for dynamically calibrating vehicular cameras
US9834153B2 (en) 2011-04-25 2017-12-05 Magna Electronics Inc. Method and system for dynamically calibrating vehicular cameras
US9491450B2 (en) 2011-08-01 2016-11-08 Magna Electronics Inc. Vehicle camera alignment system
US11877054B2 (en) 2011-09-21 2024-01-16 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US9491451B2 (en) 2011-11-15 2016-11-08 Magna Electronics Inc. Calibration system and method for vehicular surround vision system
US10264249B2 (en) 2011-11-15 2019-04-16 Magna Electronics Inc. Calibration system and method for vehicular surround vision system
US11634073B2 (en) 2011-11-28 2023-04-25 Magna Electronics Inc. Multi-camera vehicular vision system
US11142123B2 (en) 2011-11-28 2021-10-12 Magna Electronics Inc. Multi-camera vehicular vision system
US11305691B2 (en) 2011-11-28 2022-04-19 Magna Electronics Inc. Vehicular vision system
US10099614B2 (en) 2011-11-28 2018-10-16 Magna Electronics Inc. Vision system for vehicle
US10071687B2 (en) 2011-11-28 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US10640040B2 (en) 2011-11-28 2020-05-05 Magna Electronics Inc. Vision system for vehicle
US11787338B2 (en) 2011-11-28 2023-10-17 Magna Electronics Inc. Vehicular vision system
US11082678B2 (en) 2011-12-09 2021-08-03 Magna Electronics Inc. Vehicular vision system with customized display
US11689703B2 (en) 2011-12-09 2023-06-27 Magna Electronics Inc. Vehicular vision system with customized display
US10542244B2 (en) 2011-12-09 2020-01-21 Magna Electronics Inc. Vehicle vision system with customized display
US10129518B2 (en) 2011-12-09 2018-11-13 Magna Electronics Inc. Vehicle vision system with customized display
US9762880B2 (en) 2011-12-09 2017-09-12 Magna Electronics Inc. Vehicle vision system with customized display
US10904489B2 (en) 2012-10-05 2021-01-26 Magna Electronics Inc. Multi-camera calibration method for a vehicle moving along a vehicle assembly line
US9723272B2 (en) 2012-10-05 2017-08-01 Magna Electronics Inc. Multi-camera image stitching calibration system
US11265514B2 (en) 2012-10-05 2022-03-01 Magna Electronics Inc. Multi-camera calibration method for a vehicle moving along a vehicle assembly line
US10284818B2 (en) 2012-10-05 2019-05-07 Magna Electronics Inc. Multi-camera image stitching calibration system
US11192500B2 (en) 2013-02-27 2021-12-07 Magna Electronics Inc. Method for stitching image data captured by multiple vehicular cameras
US10780827B2 (en) 2013-02-27 2020-09-22 Magna Electronics Inc. Method for stitching images captured by multiple vehicular cameras
US11572015B2 (en) 2013-02-27 2023-02-07 Magna Electronics Inc. Multi-camera vehicular vision system with graphic overlay
US10179543B2 (en) 2013-02-27 2019-01-15 Magna Electronics Inc. Multi-camera dynamic top view vision system
US10486596B2 (en) 2013-02-27 2019-11-26 Magna Electronics Inc. Multi-camera dynamic top view vision system
US9688200B2 (en) 2013-03-04 2017-06-27 Magna Electronics Inc. Calibration system and method for multi-camera vision system
US11794647B2 (en) 2013-05-21 2023-10-24 Magna Electronics Inc. Vehicular vision system having a plurality of cameras
US11597319B2 (en) 2013-05-21 2023-03-07 Magna Electronics Inc. Targetless vehicular camera calibration system
US11109018B2 (en) 2013-05-21 2021-08-31 Magna Electronics Inc. Targetless vehicular camera misalignment correction method
US10780826B2 (en) 2013-05-21 2020-09-22 Magna Electronics Inc. Method for determining misalignment of a vehicular camera
US9701246B2 (en) 2013-05-21 2017-07-11 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US10567748B2 (en) 2013-05-21 2020-02-18 Magna Electronics Inc. Targetless vehicular camera calibration method
US10266115B2 (en) 2013-05-21 2019-04-23 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US11919449B2 (en) 2013-05-21 2024-03-05 Magna Electronics Inc. Targetless vehicular camera calibration system
US11447070B2 (en) 2013-05-21 2022-09-20 Magna Electronics Inc. Method for determining misalignment of a vehicular camera
US9979957B2 (en) 2013-05-21 2018-05-22 Magna Electronics Inc. Vehicle vision system with targetless camera calibration
US9205776B2 (en) 2013-05-21 2015-12-08 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US9563951B2 (en) 2013-05-21 2017-02-07 Magna Electronics Inc. Vehicle vision system with targetless camera calibration
EP2830305B1 (en) * 2013-07-24 2016-11-23 SMR Patents S.à.r.l. Display device for a motor vehicle and method for operating such a display device
EP3041228A4 (en) * 2013-08-30 2017-06-28 Clarion Co., Ltd. Camera calibration device, camera calibration system, and camera calibration method
EP2858035A1 (en) 2013-10-01 2015-04-08 Application Solutions (Electronics and Vision) Limited System, vehicle and method for online calibration of a camera on a vehicle
US9792683B2 (en) 2013-10-01 2017-10-17 Application Solutions (Electronics and Vision) Ltd. System, vehicle and method for online calibration of a camera on a vehicle
US20150092058A1 (en) * 2013-10-01 2015-04-02 Application Solutions (Electronics and Vision) Ltd. System, Vehicle and Method for Online Calibration of a Camera on a Vehicle
CN104517283B (en) * 2013-10-01 2019-04-23 System, vehicle and method for online calibration of a camera on a vehicle
JP2015072269A (en) * 2013-10-01 2015-04-16 アプリケーション・ソリューションズ・(エレクトロニクス・アンド・ヴィジョン)・リミテッド System, vehicle and method for online calibration of camera on vehicle
CN104517283A (en) * 2013-10-01 2015-04-15 应用解决方案(电子及视频)有限公司 System, vehicle and method for online calibration of a camera on a vehicle
CN106233711B (en) * 2014-09-30 2019-08-02 歌乐株式会社 Camera calibration apparatus and camera calibration system
US10594943B2 (en) 2014-09-30 2020-03-17 Clarion Co., Ltd. Camera calibration device and camera calibration system
CN106233711A (en) * 2014-09-30 2016-12-14 歌乐株式会社 Camera calibration apparatus and camera calibration system
US20170061622A1 (en) * 2014-09-30 2017-03-02 Clarion Co., Ltd. Camera calibration device and camera calibration system
CN107004277A (en) * 2014-12-04 2017-08-01 Online calibration of a motor vehicle camera system
US10163226B2 (en) 2014-12-04 2018-12-25 Connaught Electronics Ltd. Online calibration of a motor vehicle camera system
WO2016087298A1 (en) * 2014-12-04 2016-06-09 Connaught Electronics Ltd. Online calibration of a motor vehicle camera system
JP2017536613A (en) * 2014-12-04 2017-12-07 コノート、エレクトロニクス、リミテッドConnaught Electronics Ltd. Online calibration of automatic vehicle camera systems
EP3035676A1 (en) 2014-12-19 2016-06-22 Conti Temic microelectronic GmbH Surround view system and vehicle including a surround view system
WO2016096506A1 (en) 2014-12-19 2016-06-23 Conti Temic Microelectronic Gmbh Surround view system and vehicle including a surround view system
US9916660B2 (en) 2015-01-16 2018-03-13 Magna Electronics Inc. Vehicle vision system with calibration algorithm
US10235775B2 (en) 2015-01-16 2019-03-19 Magna Electronics Inc. Vehicle vision system with calibration algorithm
US10946799B2 (en) 2015-04-21 2021-03-16 Magna Electronics Inc. Vehicle vision system with overlay calibration
US11535154B2 (en) 2015-04-21 2022-12-27 Magna Electronics Inc. Method for calibrating a vehicular vision system
US10620000B2 (en) 2015-10-20 2020-04-14 Clarion Co., Ltd. Calibration apparatus, calibration method, and calibration program
US11340071B2 (en) 2016-02-10 2022-05-24 Clarion Co., Ltd. Calibration system and calibration apparatus
KR20170120010A (en) * 2016-04-20 2017-10-30 엘지이노텍 주식회사 Image acquiring device and methid of the same
US11151745B2 (en) * 2016-04-20 2021-10-19 Lg Innotek Co., Ltd. Image acquisition apparatus and method therefor
KR102597435B1 (en) 2016-04-20 2023-11-03 엘지이노텍 주식회사 Image acquiring device and methid of the same
DE102017000454A1 (en) 2017-01-19 2018-07-19 e.solutions GmbH Method, system and vehicle for calibrating image data of two cameras
DE102017000454B4 (en) 2017-01-19 2018-09-27 e.solutions GmbH Method and system for calibrating image data of two cameras
JP2019087858A (en) * 2017-11-06 2019-06-06 パナソニックIpマネジメント株式会社 Camera correction device, camera correction system, camera correction method, and program
JP2021513247A (en) * 2018-06-05 2021-05-20 上海商汤智能科技有限公司 Shanghai Sensetime Intelligent Technology Co., Ltd. In-vehicle camera self-calibration method, in-vehicle camera self-calibration device, electronic device and storage medium
JP7082671B2 (en) 2018-06-05 2022-06-08 上海商汤智能科技有限公司 In-vehicle camera self-calibration method, in-vehicle camera self-calibration device, electronic device and storage medium
CN110654382A (en) * 2018-06-28 2020-01-07 比亚迪股份有限公司 Calibration method and calibration system of lane yaw early warning system and vehicle
US11645783B2 (en) 2018-12-19 2023-05-09 Faurecia Clarion Electronics Co., Ltd. Calibration apparatus and calibration method
WO2020239457A1 (en) 2019-05-29 2020-12-03 Connaught Electronics Ltd. Image acquisition system
CN112304356A (en) * 2019-07-31 2021-02-02 大众汽车股份公司 Method and device for checking the calibration of an environmental sensor
US11645782B2 (en) 2019-07-31 2023-05-09 Volkswagen Aktiengesellschaft Method and device for checking a calibration of environment sensors
CN112819901A (en) * 2021-02-26 2021-05-18 中国人民解放军93114部队 Infrared camera self-calibration method based on image edge information
GB2605425A (en) * 2021-03-31 2022-10-05 Continental Automotive Gmbh Vehicular camera calibration system and method

Similar Documents

Publication Publication Date Title
WO2012143036A1 (en) Online vehicle camera calibration based on continuity of features
WO2012139660A1 (en) Online vehicle camera calibration based on road marking extractions
US10620000B2 (en) Calibration apparatus, calibration method, and calibration program
KR101245906B1 (en) Calibration indicator used for calibration of onboard camera, calibration method of onboard camera using calibration indicator, and program for calibration device of onboard camera using calibration indicator
US9569673B2 (en) Method and device for detecting a position of a vehicle on a lane
US20160379066A1 (en) Method and Camera System for Distance Determination of Objects from a Vehicle
KR102365501B1 (en) Method and apparatus for calibrating the extrinsic parameter of an image sensor
WO2012139636A1 (en) Online vehicle camera calibration based on road surface texture tracking and geometric properties
EP3032818B1 (en) Image processing device
CN101676686B (en) Method for calculating the position and orientation of a camera in a vehicle
US20130002861A1 (en) Camera distance measurement device
US20090179916A1 (en) Method and apparatus for calibrating a video display overlay
CN108886606B (en) Mounting angle detection device, mounting angle calibration device, and mounting angle detection method for in-vehicle camera
KR20150135697A (en) Apparatus and method for generation of camera parameter
CN110176038A (en) Calibrate the method and system of the camera of vehicle
US9892519B2 (en) Method for detecting an object in an environmental region of a motor vehicle, driver assistance system and motor vehicle
US10554951B2 (en) Method and apparatus for the autocalibration of a vehicle camera system
KR20150041334A (en) Image processing method of around view monitoring system
US20140241587A1 (en) Apparatus for estimating of vehicle movement using stereo matching
JP5539250B2 (en) Approaching object detection device and approaching object detection method
KR20150051388A (en) Stereo Camera Rectification System
CN109074480A (en) Method, computing device, driver assistance system and motor vehicle for detecting the rolling shutter effect in an image of an environmental region of a motor vehicle
KR20200118073A (en) System and method for dynamic three-dimensional calibration
KR102466305B1 (en) System and method for compensating avm tolerance
US10249056B2 (en) Vehicle position estimation system

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 11716502

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 11716502

Country of ref document: EP

Kind code of ref document: A1